Hive with MySQL as the metastore: error "Specified key was too long; max key length is 767 bytes"
While setting up Hive with MySQL as the metastore, all of the installation and configuration work had been done and the hive CLI started normally: show databases; ran fine, but show tables; failed with an error.
The key error message is:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
The full console output is:
hive> show databases;
OK
default
Time taken: 8.638 seconds
hive> show tables;
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.hive.metastore.api.MetaException javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.Util.getInstance(Util.java:386)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4098)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4030)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2490)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2651)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2671)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2621)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:842)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:681)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:681)
at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:402)
at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:458)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2689)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:243)
at org.apache.hadoop.hive.metastore.ObjectStore.getTables(ObjectStore.java:781)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
at com.sun.proxy.$Proxy4.getTables(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_tables(HiveMetaStore.java:2327)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
at com.sun.proxy.$Proxy5.get_tables(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTables(HiveMetaStoreClient.java:817)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
at com.sun.proxy.$Proxy6.getTables(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:1009)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:983)
at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2215)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1336)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:935)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Solution:
Change the character set of the Hive metastore database we created to a single-byte encoding. When the metastore database uses utf8, MySQL reserves up to 3 bytes per character, so the indexed VARCHAR columns that DataNucleus creates exceed InnoDB's 767-byte index key limit; switching the database to latin1 keeps every index key within that limit. For example:
alter database hive_test character set latin1;
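If the metastore schema has not been populated yet, it is simpler to create the database with latin1 up front and then verify the setting. A minimal sketch, assuming the same database name hive_test used above (adjust the name to your environment):

-- Create the metastore database with a single-byte character set so that
-- the indexed VARCHAR columns DataNucleus creates stay under 767 bytes.
CREATE DATABASE hive_test DEFAULT CHARACTER SET latin1;

-- Verify the character set actually in effect for the database.
SELECT default_character_set_name
  FROM information_schema.SCHEMATA
 WHERE schema_name = 'hive_test';

After the character set is changed, re-run show tables; in the hive CLI; the metastore tables should now be created without the key-length error.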