1. Permission problem

// :: INFO mapreduce.Job: Job job_1555064328415_0328 failed with state FAILED due to: Job setup failed : org.apache.hadoop.security.AccessControlException: Permission denied: user=yarn, access=WRITE, inode="/user/yarn":hdfs:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:)

This error appears when Oozie invokes Sqoop: the job runs as user yarn, but /user/yarn is owned by hdfs. Fix: change the owner of the directory:

sudo -u hdfs hadoop fs -chown -R yarn /user/yarn
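To confirm the change took effect, list /user as the HDFS superuser (a quick check assuming hdfs is the superuser, as in a default CDH install):

# Expect the owner of /user/yarn to now be yarn
sudo -u hdfs hadoop fs -ls /user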

2. Missing MySQL driver JAR

// :: INFO hive.metastore: Closed a connection to metastore, current connections:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.0.-.cdh6.0.0.p0./jars/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.0.-.cdh6.0.0.p0./jars/log4j-slf4j-impl-2.8..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
// :: INFO sqoop.Sqoop: Running Sqoop version: 1.4.-cdh6.0.0
// :: WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
// :: INFO conf.HiveConf: Found configuration file file:/etc/hive/conf.cloudera.hive/hive-site.xml
// :: INFO hive.metastore: Trying to connect to metastore with URI thrift://hadoop1:9083
// :: INFO hive.metastore: Opened a connection to metastore, current connections:
// :: INFO hive.metastore: Connected to metastore.
// :: INFO hive.metastore: Trying to connect to metastore with URI thrift://hadoop1:9083
// :: INFO hive.metastore: Opened a connection to metastore, current connections:
// :: INFO hive.metastore: Connected to metastore.
// :: WARN tool.BaseSqoopTool: Output field/record delimiter options are not useful in HCatalog jobs for most of the output types except text based formats is text. It is better to use --hive-import in those cases. For non text formats,
// :: INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
// :: INFO tool.CodeGenTool: Beginning code generation
// :: ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:)
at org.apache.sqoop.Sqoop.run(Sqoop.java:)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:)
at org.apache.sqoop.Sqoop.main(Sqoop.java:)
// :: INFO hive.metastore: Closed a connection to metastore, current connections:

When Sqoop fails like this, some node is probably missing the MySQL JDBC driver JAR. Place the driver (mysql-connector-java.jar) under /usr/share/java/ on every node that runs Sqoop.
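A minimal sketch of installing the driver on one node (the connector filename/version below is a placeholder, and /usr/share/java/ is where CDH's bigtop-utils looks for it; verify the path on your distribution), then a sanity check against the database (host and credentials are hypothetical):

# Copy the MySQL JDBC driver where CDH components look for it
sudo cp mysql-connector-java-5.1.46-bin.jar /usr/share/java/mysql-connector-java.jar
# Sanity check: this should list databases instead of throwing "Could not load db driver class"
sqoop list-databases --connect jdbc:mysql://hadoop1:3306/ --username root -P

Repeat the copy on every node that launches Sqoop tasks, then re-run the Oozie workflow.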
