1. Problem Description

(1) Example of the problem:

[Hadoop@master TestDir]$ sqoop import --connect jdbc:mysql://master:3306/source?useSSL=false --username Hive --password ****** --table customer --hive-import --hive-table rds.customer --hive-overwrite
2021-11-05 20:13:59,638 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2021-11-05 20:13:59,664 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2021-11-05 20:13:59,665 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
2021-11-05 20:13:59,665 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
2021-11-05 20:13:59,743 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2021-11-05 20:13:59,748 INFO tool.CodeGenTool: Beginning code generation
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
2021-11-05 20:14:00,022 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2021-11-05 20:14:00,369 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2021-11-05 20:14:00,377 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/grid/Hadoop/hadoop-3.3.1
......(output omitted here)
2021-11-05 20:15:51,825 INFO mapreduce.ImportJobBase: Transferred 807 bytes in 102.2706 seconds (7.8908 bytes/sec)
2021-11-05 20:15:51,828 INFO mapreduce.ImportJobBase: Retrieved 14 records.
2021-11-05 20:15:51,828 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table customer
2021-11-05 20:15:51,847 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2021-11-05 20:15:52,005 INFO hive.HiveImport: Loading uploaded data into Hive
2021-11-05 20:15:52,009 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
2021-11-05 20:15:52,010 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
... 12 more

(2) Summary of the problem:

The core of the problem is the highlighted portion of the example above, namely:

2021-11-05 20:15:52,009 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
2021-11-05 20:15:52,010 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf

As the output shows, there are two error messages.

2. Problem Analysis

(1) References:

Reference 1: https://www.cnblogs.com/drl-blogs/p/11086865.html

Reference 2: https://blog.csdn.net/lianghecai52171314/article/details/104454332

(2) Analysis and troubleshooting based on the references and the error output:

Error 1, "ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.", suggests that the Hive environment is misconfigured and that the HIVE_CONF_DIR environment variable, and in particular Hive's jar libraries, cannot be found. A misconfigured HIVE_CONF_DIR can indeed produce this error. However, given that Hive itself worked normally on my cluster, and working backwards from the eventual fix, a missing HIVE_CONF_DIR was not the root cause here. I verified this myself: setting HIVE_CONF_DIR to various values did not make the error go away.
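As a quick sanity check for error 1, the Hive-related environment variables can be inspected from the shell. This is a minimal sketch; the paths in the comments are examples only (the install layout varies by cluster) and are not taken from the article's environment.

```shell
# Print the Hive-related environment variables as the user running Sqoop sees them.
echo "HIVE_HOME=${HIVE_HOME:-<unset>}"
echo "HIVE_CONF_DIR=${HIVE_CONF_DIR:-<unset>}"

# If HIVE_CONF_DIR is unset, it is commonly exported in ~/.bashrc, e.g.
# (adjust the path to your own Hive install):
#   export HIVE_HOME=/opt/apache-hive-3.1.2-bin
#   export HIVE_CONF_DIR=$HIVE_HOME/conf
```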

Error 2, "ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf", is the root cause of the failure. Following Reference 2, I tracked it down and found the correct fix.

The cause of the error is:

Sqoop's classpath is missing hive-common-*.*.*.jar. Copying hive-common-3.1.2.jar from Hive's lib directory into Sqoop's lib directory fixes the problem.

3. Solution

Approach 1: ruling out the wrong cause

(1) Troubleshooting: check whether hive-common-*.*.*.jar under the Hive installation's lib directory contains the class HiveConf.class.

[Hadoop@master lib]$ ls hive-common*
hive-common-3.1.2.jar
[Hadoop@master lib]$ jar tf hive-common-3.1.2.jar | grep HiveConf.class
org/apache/hadoop/hive/conf/HiveConf.class
The output above confirms that hive-common-*.jar is present in the Hive installation's lib directory and contains HiveConf.class.

(2) Environment configuration: add the Hive installation's lib directory to Hadoop's HADOOP_CLASSPATH.

This did not fix the problem in my case, but it could plausibly cause this error in other setups (not verified by me).
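A sketch of that HADOOP_CLASSPATH approach (untested for this particular failure, as noted above). $HIVE_HOME is assumed to point at the Hive install root; /opt/hive below is only a placeholder default.

```shell
# Append Hive's jar directory to the classpath Hadoop passes to child JVMs.
# ${HADOOP_CLASSPATH:+...} keeps any existing value and avoids a leading colon.
export HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+$HADOOP_CLASSPATH:}${HIVE_HOME:-/opt/hive}/lib/*"
echo "HADOOP_CLASSPATH=$HADOOP_CLASSPATH"
```

To make this persistent it would typically go in the shell profile or in hadoop-env.sh rather than an interactive session.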

Approach 2: the actual fix

(1) Copy hive-common-3.1.2.jar

Copy hive-common-3.1.2.jar from Hive's lib directory into Sqoop's lib directory.
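The copy itself is a single command. The sketch below assumes HIVE_HOME and SQOOP_HOME point at the Hive and Sqoop install roots (the exact paths vary by cluster; /opt/hive and /opt/sqoop are placeholder defaults), and checks that the jar exists before copying:

```shell
# Copy hive-common from Hive's lib into Sqoop's lib so Sqoop can load HiveConf.
JAR="${HIVE_HOME:-/opt/hive}/lib/hive-common-3.1.2.jar"
if [ -f "$JAR" ]; then
    cp "$JAR" "${SQOOP_HOME:-/opt/sqoop}/lib/"
    echo "copied $(basename "$JAR") into Sqoop's lib"
else
    echo "not found: $JAR (check HIVE_HOME)" >&2
fi
```

No restart is needed: Sqoop rebuilds its classpath from its lib directory on each invocation, so the next sqoop import picks up the jar.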

(2) Re-test: the last line of the output below confirms the problem is solved, and that Sqoop imported the MySQL table into Hive with a full overwrite.

[Hadoop@master TestDir]$ sqoop import --connect jdbc:mysql://master:3306/source?useSSL=false --username Hive --password ******* --table customer --hive-import --hive-table rds.customer --hive-overwrite
2021-11-06 00:41:17,492 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2021-11-06 00:41:17,516 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2021-11-06 00:41:17,516 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
2021-11-06 00:41:17,516 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
2021-11-06 00:41:17,579 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2021-11-06 00:41:17,584 INFO tool.CodeGenTool: Beginning code generation
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
2021-11-06 00:41:17,831 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2021-11-06 00:41:17,874 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2021-11-06 00:41:17,880 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/Hadoop/Hadoop/hadoop-3.3.1
......(output omitted here)
2021-11-06 00:42:20,755 INFO hive.HiveImport: OK
2021-11-06 00:42:20,756 INFO hive.HiveImport: 2021-11-06 00:42:20,755 INFO [40671dec-9d11-4d68-b514-9df6685af945 main] ql.Driver (SessionState.java:printInfo(1227)) - OK
2021-11-06 00:42:20,756 INFO hive.HiveImport: 2021-11-06 00:42:20,755 INFO [40671dec-9d11-4d68-b514-9df6685af945 main] lockmgr.DbTxnManager (DbTxnManager.java:stopHeartbeat(846)) - Stopped heartbeat for query: grid_20211106004218_3df662fa-e868-477a-a8db-eb11ce9534fe
2021-11-06 00:42:20,854 INFO hive.HiveImport: Time taken: 2.147 seconds
2021-11-06 00:42:20,855 INFO hive.HiveImport: 2021-11-06 00:42:20,854 INFO [40671dec-9d11-4d68-b514-9df6685af945 main] CliDriver (SessionState.java:printInfo(1227)) - Time taken: 2.147 seconds
2021-11-06 00:42:20,855 INFO hive.HiveImport: 2021-11-06 00:42:20,855 INFO [40671dec-9d11-4d68-b514-9df6685af945 main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 40671dec-9d11-4d68-b514-9df6685af945
2021-11-06 00:42:20,856 INFO hive.HiveImport: 2021-11-06 00:42:20,855 INFO [40671dec-9d11-4d68-b514-9df6685af945 main] session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to main
2021-11-06 00:42:20,856 INFO hive.HiveImport: 2021-11-06 00:42:20,856 INFO [main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 40671dec-9d11-4d68-b514-9df6685af945
2021-11-06 00:42:20,907 INFO hive.HiveImport: 2021-11-06 00:42:20,905 INFO [main] session.SessionState (SessionState.java:dropPathAndUnregisterDeleteOnExit(885)) - Deleted directory: /user/hive/tmp/grid/40671dec-9d11-4d68-b514-9df6685af945 on fs with scheme hdfs
2021-11-06 00:42:20,907 INFO hive.HiveImport: 2021-11-06 00:42:20,906 INFO [main] session.SessionState (SessionState.java:dropPathAndUnregisterDeleteOnExit(885)) - Deleted directory: /tmp/hive/Local/40671dec-9d11-4d68-b514-9df6685af945 on fs with scheme file
2021-11-06 00:42:20,907 INFO hive.HiveImport: 2021-11-06 00:42:20,906 INFO [main] metastore.HiveMetaStoreClient (HiveMetaStoreClient.java:close(600)) - Closed a connection to metastore, current connections: 1
2021-11-06 00:42:21,038 INFO hive.HiveImport: Hive import complete.
