No long preamble — straight to the useful part!

Problem details

Troubleshooting

spark@master:~/app/hadoop$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [master]
master: starting namenode, logging to /home/spark/app/hadoop-2.7./logs/hadoop-spark-namenode-master.out
slave1: starting datanode, logging to /home/spark/app/hadoop-2.7./logs/hadoop-spark-datanode-slave1.out
slave2: starting datanode, logging to /home/spark/app/hadoop-2.7./logs/hadoop-spark-datanode-slave2.out
Starting secondary namenodes [master]
master: starting secondarynamenode, logging to /home/spark/app/hadoop-2.7./logs/hadoop-spark-secondarynamenode-master.out
starting yarn daemons
starting resourcemanager, logging to /home/spark/app/hadoop-2.7./logs/yarn-spark-resourcemanager-master.out
slave2: starting nodemanager, logging to /home/spark/app/hadoop-2.7./logs/yarn-spark-nodemanager-slave2.out
slave1: starting nodemanager, logging to /home/spark/app/hadoop-2.7./logs/yarn-spark-nodemanager-slave1.out
spark@master:~/app/hadoop$ jps
SecondaryNameNode
NameNode
ResourceManager
sun.tools.jps.Jps
spark@master:~/app/hadoop$
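
jps on master looks healthy (NameNode, SecondaryNameNode and ResourceManager are all up), so the real question is whether the DataNodes on the slaves ever started. Before digging into log files, a quick check is to run jps on each slave and tail its DataNode log, roughly like this (a sketch only; the host names and the hadoop-2.7.3 install path follow the cluster layout above, and jps must be on the PATH of a non-interactive shell):

spark@master:~/app/hadoop$ ssh slave1 jps          # no DataNode line means it died right after start
spark@master:~/app/hadoop$ ssh slave2 jps
spark@master:~/app/hadoop$ ssh slave1 'tail -n 100 ~/app/hadoop-2.7.3/logs/hadoop-spark-datanode-slave1.log'
spark@master:~/app/hadoop$ bin/hdfs dfsadmin -report   # the live datanode count should match the number of slaves

If no DataNode shows up on the slaves, the next place to look is the DataNode log on the slave itself, which is what the following section does.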

Solution

spark@slave1:~/app/hadoop-2.7./logs$ cat  hadoop-spark-datanode-slave1.log
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = slave1/192.168.80.146
STARTUP_MSG: args = []
STARTUP_MSG: version = 2.7.3
STARTUP_MSG: classpath = /home/spark/app/hadoop-2.7.3/etc/hadoop:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-net-3.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-digester-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/guava-11.0.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/gson-2.2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/servlet-api-2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/hadoop-auth-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jsp-api-2.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/httpclient-4.2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-collections-3.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/xmlenc-0.52.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/curator-client-2.7.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jsch-0.1.42.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/httpcore-4.2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-cli-1.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-lang-2.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/activation-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/xz-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/paranamer-2.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/curator-framework-2.7.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/junit-4.11.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jersey-json-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-math3-3.1.1.jar:/home/spark/app/hadoop-2
.7.3/share/hadoop/common/lib/avro-1.7.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jetty-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-codec-1.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jettison-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/hadoop-annotations-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jsr305-3.0.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/hadoop-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/hadoop-common-2.7.3-tests.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/hadoop-nfs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/hadoop-hdfs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/hadoop-hdfs-2.7.3-tests.jar:/hom
e/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/guava-11.0.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/servlet-api-2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/activation-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/xz-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-json-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/javax.inject-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-client-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jetty-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-codec-1.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jettison-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/guice-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-client-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/home/spark/app/hadoop-2.7.3/sh
are/hadoop/yarn/hadoop-yarn-api-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-registry-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/junit-4.11.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r baa91f7c6bc9cb92be5982de4719c1c8af91ccff; compiled by 'root' on 2016-08-18T01:41Z
STARTUP_MSG: java = 1.8.0_60
************************************************************/
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
-- ::, INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
-- ::, INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at second(s).
-- ::, INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is slave1
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory =
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is bytes/s
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is
-- ::, INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
-- ::, INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
-- ::, INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
-- ::, INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
-- ::, INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
-- ::, INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
-- ::, INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
-- ::, INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port
-- ::, INFO org.mortbay.log: jetty-6.1.
-- ::, INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = spark
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
-- ::, INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
-- ::, INFO org.apache.hadoop.ipc.Server: Starting Socket Reader # for port
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
-- ::, INFO org.mortbay.log: Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:
-- ::, INFO org.apache.hadoop.ipc.Server: Stopping server on
-- ::, INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping DataNode metrics system...
-- ::, INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system stopped.
-- ::, INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system shutdown complete.
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Shutdown complete.
-- ::, FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
at org.apache.hadoop.hdfs.DFSUtil.getNNServiceRpcAddressesForCluster(DFSUtil.java:)
at org.apache.hadoop.hdfs.server.datanode.BlockPoolManager.refreshNamenodes(BlockPoolManager.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:)
-- ::, INFO org.apache.hadoop.util.ExitUtil: Exiting with status
-- ::, INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at slave1/192.168.80.146
************************************************************/

The FATAL line tells the whole story: a DataNode has to learn the NameNode's RPC address from dfs.namenode.servicerpc-address, dfs.namenode.rpc-address, or, failing those, from fs.defaultFS in core-site.xml; when none of them resolves on the DataNode side, it aborts with exactly this IOException. So the fix is to make sure core-site.xml (the same copy on every node, slaves included) declares fs.defaultFS pointing at the NameNode:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value></value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop/hadoop-2.6./tmp</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
  </property>
</configuration>
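
One detail that is easy to miss: the corrected core-site.xml must be present on every node, not just on master, because each DataNode reads its own copy. A minimal sketch of pushing it out and restarting (paths, host names and the start/stop scripts assume the same layout as above):

spark@master:~/app/hadoop$ scp etc/hadoop/core-site.xml slave1:~/app/hadoop/etc/hadoop/
spark@master:~/app/hadoop$ scp etc/hadoop/core-site.xml slave2:~/app/hadoop/etc/hadoop/
spark@master:~/app/hadoop$ sbin/stop-yarn.sh && sbin/stop-dfs.sh
spark@master:~/app/hadoop$ sbin/start-dfs.sh && sbin/start-yarn.sh
spark@master:~/app/hadoop$ ssh slave1 jps          # DataNode and NodeManager should now be listed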

Success!
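
As a side note, the exception also names dfs.namenode.rpc-address: if you prefer not to rely on fs.defaultFS alone, the NameNode RPC address can be pinned explicitly in hdfs-site.xml. A sketch, assuming the same master:9000 endpoint used above:

<property>
  <name>dfs.namenode.rpc-address</name>
  <value>master:9000</value>
</property>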
