Hive data import fails: MySQL JDBC driver version too low
Failed with exception Unable to alter table. javax.jdo.JDODataStoreException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:996)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:928)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
at com.sun.proxy.$Proxy2.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterTable(HiveAlterHandler.java:122)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_core(HiveMetaStore.java:3414)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_cascade(HiveMetaStore.java:3386)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy4.alter_table_with_cascade(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$alter_table_with_cascade.getResult(ThriftHiveMetastore.java:9464)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$alter_table_with_cascade.getResult(ThriftHiveMetastore.java:9448)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1
at sun.reflect.GeneratedConstructorAccessor31.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
at com.mysql.jdbc.Util.getInstance(Util.java:381)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1030)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3491)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3423)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1936)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2060)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2536)
at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1463)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1365)
at com.mysql.jdbc.ConnectionImpl.getTransactionIsolation(ConnectionImpl.java:3117)
at com.jolbox.bonecp.ConnectionHandle.getTransactionIsolation(ConnectionHandle.java:825)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:444)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:378)
at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:328)
at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:94)
at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:430)
at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:396)
at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:621)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:266)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:996)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:928)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
at com.sun.proxy.$Proxy2.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterTable(HiveAlterHandler.java:122)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_core(HiveMetaStore.java:3414)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_cascade(HiveMetaStore.java:3386)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy4.alter_table_with_cascade(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$alter_table_with_cascade.getResult(ThriftHiveMetastore.java:9464)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$alter_table_with_cascade.getResult(ThriftHiveMetastore.java:9448)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
This exception is not a real failure of the statement itself; it is caused by the version of the MySQL JDBC driver in use, and the Hive command has in fact already finished executing. The old Connector/J sends the statement SET OPTION SQL_SELECT_LIMIT=DEFAULT, but MySQL removed the SET OPTION syntax in 5.6, so a newer server rejects it with exactly the syntax error shown above. If you do not want to see this exception, replace the MySQL driver with a version that matches your MySQL server.
In this case the server runs MySQL 5.7 while the driver was still from the 5.1.15 series; switching to a newer driver resolves it.
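If you want to see the root cause for yourself, you can run the rejected statement directly with the mysql client. This is only a quick check, not part of the fix, and the exact error text may differ by server version:

mysql -u root -p -e "SET OPTION SQL_SELECT_LIMIT=DEFAULT;"  # rejected on MySQL 5.6+ (SET OPTION syntax was removed)
mysql -u root -p -e "SET SQL_SELECT_LIMIT=DEFAULT;"         # the plain SET form is still accepted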
# Check the MySQL server version
mysql -u root -p  # the server version is printed in the banner after you log in
# Check the version of mysql-connector-java-x.x.jar in Hive's lib directory
ls /xxx/lib  # download a newer driver from the official site; 5.1.20 or later is fine
https://dev.mysql.com/downloads/file/?id=412737  # after downloading, move the new jar into the lib directory and delete the old one
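Putting the steps together, a minimal sketch of the swap (the /usr/local/hive path and the 5.1.46 jar name below are assumptions; adjust them to your own installation):

cd /usr/local/hive/lib                      # hypothetical Hive home; use your $HIVE_HOME/lib
mv mysql-connector-java-5.1.15.jar /tmp/    # move the old driver out of the classpath, keep it as a backup
cp ~/mysql-connector-java-5.1.46.jar .      # the newer Connector/J jar downloaded from dev.mysql.com
# restart the metastore service (or the Hive session) so the new driver is loaded
hive --service metastore &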