After setting up Hadoop 2.7.x myself, I found that the HDFS upload command bin/hadoop fs -put did not work; it failed with the following error:

[hadoop@master bin]$ ./hadoop fs -put ../logs/123 /wc/mytemp
16/07/27 01:29:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.NoRouteToHostException: No route to host
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
        at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1537)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1313)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
16/07/27 01:29:26 INFO hdfs.DFSClient: Abandoning BP-555863411-172.16.95.100-1469590594354:blk_1073741825_1001
16/07/27 01:29:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[172.16.95.101:50010,DS-ee00e1f8-5143-4f06-9ef8-b0f862fce649,DISK]
16/07/27 01:29:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.NoRouteToHostException: No route to host
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
        at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1537)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1313)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
16/07/27 01:29:26 INFO hdfs.DFSClient: Abandoning BP-555863411-172.16.95.100-1469590594354:blk_1073741826_1002
16/07/27 01:29:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[172.16.95.102:50010,DS-eea51eda-0a07-4583-9eee-acd7fc645859,DISK]
16/07/27 01:29:26 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /wc/mytemp/123._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
        at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1547)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:724)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

        at org.apache.hadoop.ipc.Client.call(Client.java:1475)
        at org.apache.hadoop.ipc.Client.call(Client.java:1412)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:418)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy10.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1459)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1255)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
put: File /wc/mytemp/123._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
[hadoop@master bin]$ service firewall
The service command supports only basic LSB actions (start, stop, restart, try-restart, reload, force-reload, status). For other actions, please try to use systemctl.
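
A side note on that last command: on CentOS 7 there is no service named firewall; the firewall service is firewalld and it is managed through systemctl. A minimal sketch of how to check whether it is running and whether a DataNode's data-transfer port is reachable (addresses taken from the log above; telnet or nc may need to be installed first):

# on each node, check whether firewalld is active (CentOS 7)
systemctl status firewalld.service

# from the master, test whether a DataNode port (default dfs.datanode.address, 50010) is reachable
telnet 172.16.95.101 50010        # or: nc -vz 172.16.95.101 50010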

Searching online turned up several solutions, none of them consistent with each other. The one that seems worth referencing is the post "hadoop常见错误解决办法" (common Hadoop errors and their fixes); in its item 14, the author gives the following solution:

I have resolved the issue:
What I did:
1) '/etc/init.d/iptables stop' --> stopped the firewall
2) Set SELINUX=disabled in the '/etc/selinux/config' file --> disabled SELinux
It worked for me after these two changes

A few things worth adding (a combined command sketch follows this list):

  (1) On CentOS 7 the firewall has changed: the service is firewalld, so stopping it becomes service firewalld stop (equivalently, systemctl stop firewalld.service).

  (2) It is not only the master whose firewall must be stopped; the firewall on every slave node has to be stopped as well.

  (3) On every host, check /etc/selinux/config and set SELINUX=disabled (note the value is "disabled", not "disable"); the change takes effect after a reboot, or immediately for the current session with setenforce 0.

  (4) After stopping the firewall, it is also recommended to run systemctl disable firewalld.service so that firewalld does not start again at boot.
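
Putting these points together, the whole fix boils down to roughly the following (a sketch, assuming CentOS 7 and root privileges on the master and on every slave; adapt to your own environment):

# run on the master and on every slave node, as root
systemctl stop firewalld.service        # stop the firewall now
systemctl disable firewalld.service     # keep firewalld from starting at boot

# set SELINUX=disabled in /etc/selinux/config (takes effect after a reboot)
sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
setenforce 0                            # permissive mode for the current session

# then, back on the master as the hadoop user, retry the upload and verify
./hadoop fs -put ../logs/123 /wc/mytemp
./hadoop fs -ls /wc/mytemp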

After that, the put succeeded. [Screenshot of the successful run omitted.]

Another blogger offers a different solution: hadoop报错:hdfs.DFSClient: Exception in createBlockOutputStream. You can give it a try. Put simply, beginners tend to run into this because re-formatting the NameNode leaves the metadata stored on the slaves out of sync with the NameNode, so the DataNodes fail to start; the proposed fix is to delete all the data directories on the master and the slaves and then re-format the NameNode. (Whether it is my own limited experience or simply a different problem, my preliminary verdict is that this method is a trap.)
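
For completeness, if you do want to try that re-format route, the usual sequence is roughly the one below. This is only a sketch: the storage paths are examples and must be replaced with the dfs.namenode.name.dir / dfs.datanode.data.dir values from your own hdfs-site.xml, and re-formatting wipes everything stored in HDFS.

# on the master: stop HDFS first
$HADOOP_HOME/sbin/stop-dfs.sh

# on the master and on every slave: clear the old storage directories
# (example paths -- use the ones configured in your hdfs-site.xml)
rm -rf /home/hadoop/dfs/name/*
rm -rf /home/hadoop/dfs/data/*

# on the master: re-format the NameNode and start HDFS again
$HADOOP_HOME/bin/hdfs namenode -format
$HADOOP_HOME/sbin/start-dfs.sh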
