SSH and the rest were already set up, but running one of the examples from the book threw a Connection Refused exception, as follows:

12/04/09 01:00:54 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s).
12/04/09 01:00:56 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 1 time(s).
12/04/09 01:00:57 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 2 time(s).
12/04/09 01:00:58 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 3 time(s).
12/04/09 01:00:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 4 time(s).
12/04/09 01:01:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 5 time(s).
12/04/09 01:01:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 6 time(s).
12/04/09 01:01:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 7 time(s).
12/04/09 01:01:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 8 time(s).
12/04/09 01:01:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 9 time(s).
Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:8020 failed on connection exception: java.net.ConnectException: Connection refused
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
    at org.apache.hadoop.ipc.Client.call(Client.java:1071)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at $Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at com.lan.hadoop.FileSystemCat.main(FileSystemCat.java:15)

.............................
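For context, the failing program is the FileSystemCat example from the book (the stack trace ends in com.lan.hadoop.FileSystemCat.main). A minimal sketch is below; the hard-coded hdfs://localhost:8020 URI and the file path are my assumptions about what line 15 was doing, the rest follows the standard example. The RPC connection to the NameNode is opened inside FileSystem.get(), which is why the refusal surfaces there.

package com.lan.hadoop;

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Minimal sketch of the book's FileSystemCat example.
// The URI below (host, port 8020, file path) is a guess based on the stack trace.
public class FileSystemCat {
    public static void main(String[] args) throws Exception {
        String uri = "hdfs://localhost:8020/user/test/sample.txt"; // hypothetical path
        Configuration conf = new Configuration();
        // FileSystem.get() creates the DFSClient and the RPC proxy to the NameNode,
        // so a wrong port fails right here with Connection refused.
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}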

I remember running into this exception before; back then it was because the NameNode had not been started. Today, though, hadoop commands like ls and mkdir all work fine, and jps shows a namenode process running, yet telnet localhost 8020 fails and I could not figure out why. Finally Dong asked me whether the NameNode's port was really 8020, and it still didn't click. Only after checking core-site.xml did I see that I had set fs.default.name to hdfs://localhost:9000, so the port was indeed wrong. Changing the port to 9000 in the code fixed it.
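In other words, the port in the client URI has to match the NameNode address configured in fs.default.name. A simplified core-site.xml reflecting my setup would look roughly like this (only the relevant property shown):

<?xml version="1.0"?>
<configuration>
  <property>
    <!-- NameNode RPC address; client URIs must use this host and port -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

With that configuration, the client URI becomes hdfs://localhost:9000/... instead of the default 8020. A cleaner alternative would be to put core-site.xml on the client classpath and call FileSystem.get(conf) with a scheme-less path, so the NameNode address lives only in the configuration file.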
