Sqoop export error: java.io.FileNotFoundException: File does not exist

1. Error Details

2019-10-17 20:04:49,080 INFO [IPC Server handler 20 on 45158] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1567429685851_474405_m_000001_0: Error: java.lang.RuntimeException: 
java.io.FileNotFoundException: File does not exist: /user/hive/warehouse/bgda_hw_stg.db/rs_isdbirthremind_onelife_bak/51424d15ec50cdca-216a29380000000b_1863633016_data.0.
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:66)
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:56)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:2157)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:2127)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:2040)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:583)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getBlockLocations(AuthorizationProviderProxyClientProtocol.java:94)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:377)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2278)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2274)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2272)
at org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:165)
at org.apache.sqoop.mapreduce.CombineFileRecordReader.nextKeyValue(CombineFileRecordReader.java:71)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:562)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.FileNotFoundException: File does not exist: /user/hive/warehouse/bgda_hw_stg.db/rs_isdbirthremind_onelife_bak/51424d15ec50cdca-216a29380000000b_1863633016_data.0.
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:66)
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:56)

  

2. Solution

The table name was too long. The fix was to rename the table rs_isdbirthremind_onelife_bak to rs_isdbirremd_onelife_bak (i.e. shorten the table name).

Run the export again and it completes normally.
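A minimal sketch of the rename step, assuming the table is a managed Hive table in the bgda_hw_stg database (names taken from this post; adjust to your environment):

# Hypothetical rename via the Hive CLI; for a managed table this also moves the
# warehouse directory under /user/hive/warehouse/bgda_hw_stg.db/ to the new name.
hive -e "ALTER TABLE bgda_hw_stg.rs_isdbirthremind_onelife_bak RENAME TO bgda_hw_stg.rs_isdbirremd_onelife_bak;"

After the rename, point --export-dir at the new directory, as the export command in section 3 below already does.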

3. Common Sqoop Export Errors

Date field type mismatch:

The value is stored in Hive as a datetime type, but the corresponding column in the target MySQL database is defined as date.

Fix: make the two sides consistent (date on both, or datetime on both).
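If the Hive side is kept as-is and the MySQL side is adjusted instead, the change is a single ALTER TABLE. A hedged sketch, reusing the host, database, table and column names from the export command below (they may differ in your schema):

# Hypothetical: change the MySQL column to DATETIME so it matches the Hive data.
mysql -h 10.11.22.33 -u root -p -D report \
  -e "ALTER TABLE rs_isd_birth_remind_onelife MODIFY birthTime DATETIME;"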

Column length too small:

A column in the target MySQL database is defined as decimal with a maximum length of 5, but the largest value in the result data is 999999; it cannot be stored, so the export fails.

Fix: increase the column length in the target MySQL database.
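For example, DECIMAL(5,0) holds at most 99999, so a value of 999999 needs at least DECIMAL(6,0). A hedged sketch of widening such a column (the column name amount is a placeholder; connection details reused from the export command below):

# Hypothetical: widen a DECIMAL(5,0) column so values like 999999 fit.
mysql -h 10.11.22.33 -u root -p -D report \
  -e "ALTER TABLE rs_isd_birth_remind_onelife MODIFY amount DECIMAL(10,0);"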

Wrong column mapping:

--columns refers to the columns of the table in the target MySQL database, not to the Hive column names, so the names passed to --columns must match the MySQL table columns exactly! For example:
sqoop export \
--connect jdbc:mysql://10.11.22.33:3306/report \
--username root \
--password 1234 \
--table rs_isd_birth_remind_onelife \
--export-dir /user/hive/warehouse/bgda_hw_stg.db/rs_isdbirremd_onelife_bak \
--columns t00salesno,contactsId,isdname,birthTime,needBirthRemind,createUser,createTime,updateUser,updateTime \
--fields-terminated-by '\001' \
--lines-terminated-by '\n' \
--input-null-string '\\N' \
--input-null-non-string '\\N'
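
Before running the export it can help to list the columns on the MySQL side and check them against the --columns list, for example (connection details reused from the command above):

# Check the exact column names of the target MySQL table so that
# the names passed to --columns can be matched against them.
mysql -h 10.11.22.33 -u root -p -D report \
  -e "DESCRIBE rs_isd_birth_remind_onelife;"

The order of --columns matters as well: Sqoop maps the fields of each record under --export-dir to the listed columns positionally, since the data files carry no header.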

  

Special characters in the extracted data:

My current approach is to replace the special character with the regexp_replace function: regexp_replace(table.content, '@ ', 'a-')

Here the special character is @.
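
A minimal sketch of previewing the replacement in Hive before rebuilding the staging table that feeds the export; the source table bgda_hw_stg.source_table and the content column are placeholders, only the regexp_replace call comes from the text above:

# Hypothetical preview: replace the special character before exporting
# (bgda_hw_stg.source_table and content are placeholder names).
hive -e "SELECT regexp_replace(content, '@ ', 'a-') AS content_clean
FROM bgda_hw_stg.source_table
LIMIT 10;"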
