Hive with MySQL as the metastore reports the error: Specified key was too long; max key length is 767 bytes
While setting up MySQL as the Hive metastore, I finished all of the installation and configuration work and entered the Hive CLI. Running show databases; worked fine, but running show tables; then failed with an error.
The key error message is:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
The full session output is:
hive> show databases;
OK
default
Time taken: 8.638 seconds
hive> show tables;
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.hive.metastore.api.MetaException javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767
bytes
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.Util.getInstance(Util.java:386)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4098)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4030)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2490)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2651)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2671)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2621)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:842)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:681)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:681)
at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:402)
at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:458)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2689)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:243)
at org.apache.hadoop.hive.metastore.ObjectStore.getTables(ObjectStore.java:781)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
at com.sun.proxy.$Proxy4.getTables(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_tables(HiveMetaStore.java:2327)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
at com.sun.proxy.$Proxy5.get_tables(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTables(HiveMetaStoreClient.java:817)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
at com.sun.proxy.$Proxy6.getTables(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:1009)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:983)
at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2215)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1336)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:935)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Solution:
The Hive metastore schema creates indexes on VARCHAR columns. With a utf8 default character set, MySQL reserves up to 3 bytes per character, so an indexed VARCHAR(255) column exceeds InnoDB's 767-byte index key limit. Change the character set of the metastore database we created to latin1, for example:
alter database hive_test character set latin1;
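Below is a minimal sketch of the full procedure, assuming the metastore database is named hive_test as in the example above; run it in the MySQL client before retrying show tables; in Hive.

-- Check the current default character set of the metastore database
SELECT DEFAULT_CHARACTER_SET_NAME
  FROM information_schema.SCHEMATA
 WHERE SCHEMA_NAME = 'hive_test';

-- Switch the database default to latin1 (1 byte per character), so the indexed
-- VARCHAR(255) columns created by DataNucleus stay within the 767-byte key limit
ALTER DATABASE hive_test CHARACTER SET latin1;

Note that ALTER DATABASE only changes the default for tables created afterwards; if DataNucleus has already created some metastore tables with utf8, you may also need to drop and recreate the database (or convert those tables to latin1) before the metastore schema can finish initializing.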