Sqoop Issues
1. Permission problem
INFO mapreduce.Job: Job job_1555064328415_0328 failed with state FAILED due to: Job setup failed : org.apache.hadoop.security.AccessControlException: Permission denied: user=yarn, access=WRITE, inode="/user/yarn":hdfs:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java)
This error appeared when Oozie invoked Sqoop: the launched MapReduce job runs as user yarn, but /user/yarn is owned by hdfs:supergroup with mode drwxr-xr-x, so yarn cannot write to it. Fix: change the owner of the directory:
sudo -u hdfs hadoop fs -chown -R yarn /user/yarn
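Before re-running the workflow it is worth verifying the fix. A minimal sketch (the group name yarn is an assumption; the mkdir step is only needed if the home directory does not exist yet):

# Check current ownership of home directories
hadoop fs -ls /user
# If /user/yarn does not exist, create it as the hdfs superuser first
sudo -u hdfs hadoop fs -mkdir -p /user/yarn
sudo -u hdfs hadoop fs -chown -R yarn:yarn /user/yarn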
2. Missing MySQL driver jar
INFO hive.metastore: Closed a connection to metastore, current connections:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.0.-.cdh6.0.0.p0./jars/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.0.-.cdh6.0.0.p0./jars/log4j-slf4j-impl-2.8..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
INFO sqoop.Sqoop: Running Sqoop version: 1.4.-cdh6.0.0
WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
INFO conf.HiveConf: Found configuration file file:/etc/hive/conf.cloudera.hive/hive-site.xml
INFO hive.metastore: Trying to connect to metastore with URI thrift://hadoop1:9083
INFO hive.metastore: Opened a connection to metastore, current connections:
INFO hive.metastore: Connected to metastore.
INFO hive.metastore: Trying to connect to metastore with URI thrift://hadoop1:9083
INFO hive.metastore: Opened a connection to metastore, current connections:
INFO hive.metastore: Connected to metastore.
WARN tool.BaseSqoopTool: Output field/record delimiter options are not useful in HCatalog jobs for most of the output types except text based formats is text. It is better to use --hive-import in those cases. For non text formats,
INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
INFO tool.CodeGenTool: Beginning code generation
ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
    at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java)
    at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java)
    at org.apache.sqoop.Sqoop.run(Sqoop.java)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java)
    at org.apache.sqoop.Sqoop.main(Sqoop.java)
INFO hive.metastore: Closed a connection to metastore, current connections:
When Sqoop ran, some node was apparently missing the MySQL JDBC driver jar. Put the driver (mysql-connector-java.jar) into /usr/share/java/ on every node.
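A short sketch of distributing the jar, assuming passwordless SSH; only hadoop1 appears in the log above, the other host names are hypothetical:

# Copy the connector jar to each node (adjust the host list to your cluster)
for host in hadoop1 hadoop2 hadoop3; do
  scp mysql-connector-java.jar ${host}:/usr/share/java/
done

Sqoop on CDH also loads jars from its own lib directory, so on the node that launches the job this is an alternative (the path assumes a standard CDH parcel layout):

cp mysql-connector-java.jar /opt/cloudera/parcels/CDH/lib/sqoop/lib/

Once the jar is in place, a re-run should get past code generation. Per the warning in the log, -P (prompt for the password) is safer than putting --password on the command line; the host, database, and table below are placeholders:

sqoop import --connect jdbc:mysql://hadoop1:3306/testdb --username root -P --table example_table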