Fixing the Spark Shell startup errors <console>:14: error: not found: value spark / import spark.implicits._ / import spark.sql (illustrated walkthrough)
No long preamble; straight to the useful part!
Recently I moved on to a newer Spark release: from the spark-1.6.1 I had long been using to spark-2.2.0-bin-hadoop2.6.tgz.
Earlier post:
Installing Spark in Spark on YARN mode (spark-1.6.1-bin-hadoop2.6.tgz + hadoop-2.6.0.tar.gz) (master, slave1, and slave2) (recommended)
Here I am testing spark-2.2.0-bin-hadoop2.6.tgz + hadoop-2.6.0.tar.gz on a single node.
I won't repeat the single-node hadoop-2.6.0 configuration files here.
Instead I'll focus on Spark on YARN, which is the mode I'm using.
spark-defaults.conf
Kept at its defaults, unmodified.
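The defaults are fine for this walkthrough. Purely as a hedged example: if you later want the history server to see finished applications, you might add something like the lines below. The HDFS URL and port are assumptions for this host, not part of the original setup, and the target directory must already exist in HDFS:

spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://sparksinglenode:9000/spark-logs
spark.history.fs.logDirectory    hdfs://sparksinglenode:9000/spark-logs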
spark-env.sh

export JAVA_HOME=/home/spark/app/jdk1.8.0_60
export SCALA_HOME=/home/spark/app/scala-2.10.
export HADOOP_HOME=/home/spark/app/hadoop-2.6.0
export HADOOP_CONF_DIR=/home/spark/app/hadoop-2.6.0/etc/hadoop   # how spark-shell locates the YARN/HDFS client configs
export SPARK_MASTER_IP=192.168.80.218                            # standalone master address; ignored under YARN
export SPARK_WORKER_MEMORY=1G                                    # memory per standalone worker
slaves (one worker hostname per line)
sparksinglenode
Problem details
The Hadoop daemons were already running.
Then I executed:

[spark@sparksinglenode spark-2.2.0-bin-hadoop2.6]$ bin/spark-shell

at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply$mcZ$sp(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:)
... more
Caused by: org.apache.hadoop.ipc.RemoteException: Cannot create directory /tmp/hive/spark/1b6e6e4f-7e08-4d49--4e722bab607a. Name node is in safe mode.
The reported blocks needs additional blocks to reach the threshold 0.9990 of total blocks .
The number of live datanodes has reached the minimum number . Safe mode will be turned off automatically once the thresholds have been reached.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:)
at org.apache.hadoop.ipc.Server$Handler$.run(Server.java:)
at org.apache.hadoop.ipc.Server$Handler$.run(Server.java:)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:)

at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem$.doCall(DistributedFileSystem.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem$.doCall(DistributedFileSystem.java:)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:)
... more
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
The root cause is right there in the stack trace: the HDFS NameNode is still in safe mode, so Spark's Hive support cannot create its scratch directory under /tmp/hive. Because SparkSession initialization fails, the REPL never binds the spark value, and the subsequent import spark.implicits._ and import spark.sql lines report not found: value spark.
Solution

[spark@sparksinglenode ~]$ jps
SecondaryNameNode
Jps
NameNode
ResourceManager
NodeManager
DataNode
[spark@sparksinglenode ~]$ hdfs dfsadmin -safemode leave
// :: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Safe mode is OFF
[spark@sparksinglenode ~]$
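Before forcing safe mode off, a few related commands are worth knowing. These are standard hdfs dfsadmin/fsck invocations; right after a cluster start, the NameNode usually leaves safe mode on its own once enough block reports have arrived:

[spark@sparksinglenode ~]$ hdfs dfsadmin -safemode get    # report the current safe mode state
[spark@sparksinglenode ~]$ hdfs dfsadmin -safemode wait   # block until safe mode ends on its own
[spark@sparksinglenode ~]$ hdfs fsck /                    # look for missing or corrupt blocks if safe mode persists

If fsck reports missing blocks, forcing safe mode off only hides the symptom; the underlying DataNode problem still needs attention.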
Running spark-shell again, it now starts cleanly:

[spark@sparksinglenode spark-2.2.0-bin-hadoop2.6]$ bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
// :: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
// :: WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/spark/app/spark-2.2.0-bin-hadoop2.6/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/spark/app/spark/jars/datanucleus-api-jdo-3.2.6.jar."
// :: WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/spark/app/spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/spark/app/spark-2.2.0-bin-hadoop2.6/jars/datanucleus-rdbms-3.2.9.jar."
// :: WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/spark/app/spark-2.2.0-bin-hadoop2.6/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/spark/app/spark/jars/datanucleus-core-3.2.10.jar."
// :: WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.80.218:4040
Spark context available as 'sc' (master = local[*], app id = local-).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
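With the session up, spark is bound and the imports from the earlier error now resolve. A quick sanity check using only standard Spark 2.x API:

scala> import spark.implicits._
import spark.implicits._

scala> spark.range(5).count()
res0: Long = 5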
Alternatively, run it against YARN:
[spark@sparksinglenode spark-2.2.0-bin-hadoop2.6]$ bin/spark-shell --master yarn-client
Note that --master only accepts one of Spark's fixed master URLs. The yarn-client value still works in Spark 2.2 but is deprecated; the modern equivalent is sketched below.
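As a sketch of the non-deprecated Spark 2.x invocation (the flags are standard spark-shell options, and HADOOP_CONF_DIR from spark-env.sh above tells Spark where the cluster is):

[spark@sparksinglenode spark-2.2.0-bin-hadoop2.6]$ bin/spark-shell --master yarn --deploy-mode client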