The exception is as follows:

java.io.IOException: Failed on local exception: java.nio.channels.ClosedByInterruptException; Host Details : local host is: "hadoop008/192.168.28.77"; destination host is: "hadoop004":8020;

at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
at org.apache.hadoop.ipc.Client.call(Client.java:1415)
at org.apache.hadoop.ipc.Client.call(Client.java:1364)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy17.setReplication(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setReplication(ClientNamenodeProtocolTranslatorPB.java:322)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy18.setReplication(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.setReplication(DFSClient.java:1768)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:465)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:461)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.setReplication(DistributedFileSystem.java:461)
at org.apache.hadoop.mapreduce.JobSubmitter.copyRemoteFiles(JobSubmitter.java:142)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:214)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:388)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:481)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:564)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:559)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:559)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:550)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1554)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1321)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1139)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:962)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:957)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:145)
at org.apache.hive.service.cli.operation.SQLOperation.access$000(SQLOperation.java:69)
at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:502)
at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:213)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.nio.channels.ClosedByInterruptException
at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:681)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
at org.apache.hadoop.ipc.Client.call(Client.java:1382)
... 54 more

Solution:

A ClosedByInterruptException means the thread making the RPC to the NameNode was interrupted while the connection was still being set up; this usually happens when the client that submitted the Hive task hits its own call timeout and interrupts the worker thread. Set an appropriate RPC timeout for HiveOptions.callTimeout so the call has time to complete.
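For reference, here is a minimal sketch of raising that timeout, assuming the records are written to Hive through the storm-hive HiveBolt (the case discussed in the StackOverflow link below). The metastore URI, database, table, and column names are placeholders, and the exact package names and builder methods vary between Storm versions, so verify them against the HiveOptions class you actually use:

import org.apache.storm.hive.bolt.HiveBolt;
import org.apache.storm.hive.bolt.mapper.DelimitedRecordHiveMapper;
import org.apache.storm.hive.common.HiveOptions;
import org.apache.storm.tuple.Fields;

public class HiveBoltTimeoutExample {

    public static HiveBolt buildHiveBolt() {
        // Map tuple fields to Hive table columns (column names are placeholders).
        DelimitedRecordHiveMapper mapper = new DelimitedRecordHiveMapper()
                .withColumnFields(new Fields("id", "name"));

        // callTimeout is in milliseconds. Raising it gives slow metastore/NameNode
        // RPCs time to finish instead of the calling thread interrupting them
        // mid-connect, which is what surfaces as ClosedByInterruptException above.
        HiveOptions hiveOptions = new HiveOptions(
                "thrift://metastore-host:9083",   // metastore URI, placeholder
                "default", "my_table", mapper)
                .withTxnsPerBatch(10)
                .withBatchSize(1000)
                .withCallTimeout(30000);

        return new HiveBolt(hiveOptions);
    }
}

If the writer is a Flume Hive sink instead of Storm, the corresponding knob should be the sink's callTimeout property in the agent configuration.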

References:

https://issues.apache.org/jira/browse/FLUME-1748

http://stackoverflow.com/questions/32186212/hive-is-failing-to-write-data-in-hdfs
