The error message is as follows:

15/11/03 16:48:15 INFO spark.SparkContext: Running Spark version 1.4.1
15/11/03 16:48:15 WARN spark.SparkConf: In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
15/11/03 16:48:15 WARN spark.SparkConf:
SPARK_JAVA_OPTS was detected (set to '-verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/xujingwen/ocdc/spark-1.4.1-bin-hadoop2.6/1103164805.hprof').
This is deprecated in Spark 1.0+. Please instead use:
- ./spark-submit with conf/spark-defaults.conf to set defaults for an application
- ./spark-submit with --driver-java-options to set -X options for a driver
- spark.executor.extraJavaOptions to set -X options for executors
- SPARK_DAEMON_JAVA_OPTS to set java options for standalone daemons (master or worker)
15/11/03 16:48:15 WARN spark.SparkConf: Setting 'spark.executor.extraJavaOptions' to '-verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/xujingwen/ocdc/spark-1.4.1-bin-hadoop2.6/1103164805.hprof' as a work-around.
15/11/03 16:48:15 WARN spark.SparkConf: Setting 'spark.driver.extraJavaOptions' to '-verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/xujingwen/ocdc/spark-1.4.1-bin-hadoop2.6/1103164805.hprof' as a work-around.
15/11/03 16:48:15 WARN spark.SparkConf:
SPARK_CLASSPATH was detected (set to ':ls $SPARK_HOME/lib/*.jar').
This is deprecated in Spark 1.0+. Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath
15/11/03 16:48:15 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to ':ls $SPARK_HOME/lib/*.jar' as a work-around.
15/11/03 16:48:15 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)

Checking the configuration in spark-env.sh and spark-defaults.conf shows that a classpath is set in both files (a quick check of what spark-env.sh actually exports is shown after the two files):

//spark-defaults.conf
# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.
# Example:
# spark.master spark://master:7077
# spark.eventLog.enabled true
# spark.eventLog.dir hdfs://namenode:8021/directory
# spark.serializer org.apache.spark.serializer.KryoSerializer
# spark.driver.memory 5g
# spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
#
#
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.local.dir /home/xujingwen/data/pseudo-dist/spark/local,/home/xujingwen/data/pseudo-dist/spark/local
spark.io.compression.codec snappy
spark.speculation false
spark.yarn.executor.memoryOverhead 512
#spark.storage.memoryFraction 0.4
spark.eventLog.enabled true
spark.eventLog.dir hdfs://cdh5cluster/eventLog
spark.eventLog.compress true
spark.driver.extraClassPath /home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/mysql-connector-java-5.1.30-bin.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-core-3.2.10.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-rdbms-3.2.9.jar
//spark-env.sh
# Generic options for the daemons used in the standalone deploy mode
# - SPARK_CONF_DIR Alternate conf dir. (Default: ${SPARK_HOME}/conf)
# - SPARK_LOG_DIR Where log files are stored. (Default: ${SPARK_HOME}/logs)
# - SPARK_PID_DIR Where the pid file is stored. (Default: /tmp)
# - SPARK_IDENT_STRING A string representing this instance of spark. (Default: $USER)
# - SPARK_NICENESS The scheduling priority for daemons. (Default: 0)
MASTER=yarn-client
SPARK_HOME=/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4
SCALA_HOME=/home/xujingwen/ocdc/scala
JAVA_HOME=/home/xujingwen/ocdc/jdk1.7.0_21
HADOOP_HOME=/home/xujingwen/ocdc/hadoop-2.6.0-cdh5.4.4
export SPARK_MASTER_IP=192.168.0.4
HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
SPARK_EXECUTOR_INSTANCES=50
SPARK_EXECUTOR_CORES=2
SPARK_EXECUTOR_MEMORY=4G
SPARK_DRIVER_MEMORY=3G
SPARK_YARN_APP_NAME="Spark-1.1.0"
#export SPARK_YARN_QUEUE="default"
SPARK_SUBMIT_LIBRARY_PATH=$SPARK_LIBRARY_PATH:$HADOOP_HOME/lib/native
SPARK_JAVA_OPTS="-verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps $SPARK_JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/xujingwen/ocdc/spark-1.4.1-bin-hadoop2.6/`date +%m%d%H%M%S`.hprof"
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.retainedApplications=1000 -Dspark.history.retainedApplications=1000 -Dspark.history.fs.logDirectory=hdfs://cdh5cluster/eventLog"
#export SPARK_CLASSPATH=$SPARK_CLASSPATH:/home/xujingwen/ocdc/apache-hive-1.2.1-bin/lib/mysql-connector-java-5.1.30-bin.jar
for libjar in 'ls $SPARK_HOME/lib/*.jar'
do
SPARK_CLASSPATH=$SPARK_CLASSPATH:$libjar
done
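
A quick way to confirm what spark-env.sh ends up exporting is to source it by hand and print the variable. This is a sketch that assumes a bash shell and that the file lives in the conf directory of the SPARK_HOME shown above:

# Source spark-env.sh manually and inspect the resulting variable (bash assumed).
source /home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/conf/spark-env.sh
echo "$SPARK_CLASSPATH"

With the quoting shown above, this prints ':ls $SPARK_HOME/lib/*.jar', which is exactly the value the SPARK_CLASSPATH warning reports.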

In Spark 1.4 and later, the classpath should be configured in one place, spark-defaults.conf, as shown below.
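
A minimal sketch of the consolidated setup, reusing the same jar paths as in the files above; spark.executor.extraClassPath is included on the assumption that the executors need the same jars:

//spark-env.sh
# Drop the deprecated SPARK_CLASSPATH loop (and any SPARK_CLASSPATH export):
#for libjar in 'ls $SPARK_HOME/lib/*.jar'
#do
#SPARK_CLASSPATH=$SPARK_CLASSPATH:$libjar
#done

//spark-defaults.conf
spark.driver.extraClassPath /home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/mysql-connector-java-5.1.30-bin.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-core-3.2.10.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-rdbms-3.2.9.jar
spark.executor.extraClassPath /home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/mysql-connector-java-5.1.30-bin.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-core-3.2.10.jar:/home/xujingwen/ocdc/spark-1.4.1-bin-2.6.0-cdh5.4.4/lib/datanucleus-rdbms-3.2.9.jar

Once SPARK_CLASSPATH is no longer set, the "Found both spark.driver.extraClassPath and SPARK_CLASSPATH" check in SparkConf.validateSettings should no longer trip when spark-shell starts.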
