While integrating Spark with HBase, I ran into the following problem:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$.apply(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$.apply(SparkSession.scala:)
at scala.Option.getOrElse(Option.scala:)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$.apply(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$.apply(SparkSession.scala:)
at scala.collection.mutable.HashMap$$anonfun$foreach$.apply(HashMap.scala:)
at scala.collection.mutable.HashMap$$anonfun$foreach$.apply(HashMap.scala:)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:)
... elided
Caused by: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/htrace/Trace when creating Hive client using classpath:...... Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply$mcZ$sp(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:)
... more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.NoClassDefFoundError: org/htrace/Trace
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:)
at java.lang.reflect.Constructor.newInstance(Constructor.java:)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:)
... more
Caused by: java.lang.NoClassDefFoundError: org/htrace/Trace
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:)
at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:)
at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem$.doCall(DistributedFileSystem.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem$.doCall(DistributedFileSystem.java:)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:)
... more
Caused by: java.lang.ClassNotFoundException: org.htrace.Trace
at java.net.URLClassLoader.findClass(URLClassLoader.java:)
at java.lang.ClassLoader.loadClass(ClassLoader.java:)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:)
at java.lang.ClassLoader.loadClass(ClassLoader.java:)
... more
<console>:: error: not found: value spark
import spark.implicits._
^
<console>:: error: not found: value spark
import spark.sql
^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.
      /_/

Using Scala version 2.11. (Java HotSpot(TM) -Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.

scala>

I had already copied all of the required supporting jars into the jars directory of the Spark installation on every node, yet starting bin/spark-shell kept producing the error above. Most of the fixes you find online suggest copying the missing jars from hbase/lib (hbase*.jar, metrics*.jar, htrace*.jar, and so on) into spark/jars, but that is exactly the preparation work I had already done, so the advice gets us nowhere; the copy step I had performed looks roughly like the sketch below.
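For reference, a minimal sketch of that preparation step, assuming HBase and Spark are installed under $HBASE_HOME and $SPARK_HOME (both paths are placeholders; adjust them to your environment):

# Hypothetical install paths -- adjust to your environment
# Copy the HBase client jars that the Hive client ends up needing
cp $HBASE_HOME/lib/hbase*.jar   $SPARK_HOME/jars/
cp $HBASE_HOME/lib/metrics*.jar $SPARK_HOME/jars/
cp $HBASE_HOME/lib/htrace*.jar  $SPARK_HOME/jars/
# Repeat the same copy (scp/rsync) on every node in the cluster

Even with these jars in place the error persisted, which points to a classpath problem rather than a missing file.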

Looking at the root cause, the error occurs because at startup Spark (more precisely, the isolated Hive client loader visible in the stack trace) fails to load a jar it needs (here htrace-core-3.0.4.jar) even though it sits in our jars directory. Simply dropping the jar into that directory therefore does not solve the problem; what we need to do is point Spark at an explicit CLASSPATH, so that when it cannot find a required jar on its own it has somewhere else to look. A quick way to confirm what HBase itself puts on its classpath is sketched below.
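As a quick sanity check (a sketch that only assumes the hbase launcher is on your PATH), you can print HBase's runtime classpath and verify that an htrace jar actually appears in it:

# Print HBase's runtime classpath, one entry per line, and filter for htrace
hbase classpath | tr ':' '\n' | grep -i htrace

If the jar shows up here but Spark still cannot see it, the fix is to hand that classpath to Spark explicitly, as described next.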

Solution:

Add the following to the spark-env.sh configuration file under spark/conf (note that the values are command substitutions capturing the output of the hadoop classpath and hbase classpath commands, not literal strings):

export HBASE_CLASSPATH=$(hbase classpath)
export HADOOP_CLASSPATH=$(hadoop classpath)
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_CLASSPATH
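Apply the same change on every node and restart spark-shell for it to take effect. If you prefer not to edit spark-env.sh, a related one-off approach (a sketch using standard spark-shell options and the same command substitution; not verified against every deployment) is to pass the HBase classpath when launching the shell:

# One-off alternative: supply the HBase classpath at launch time
bin/spark-shell \
  --driver-class-path "$(hbase classpath)" \
  --conf spark.executor.extraClassPath="$(hbase classpath)"

Either way, once the htrace jar is visible the Hive session state should initialize normally and the spark value becomes available again in the REPL.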


That wraps up the main content of this post. It reflects my own learning process, and I hope it offers some guidance. If it helped, I would appreciate your support; if it did not, please bear with me, and do point out any mistakes. Follow the blog if you want to catch updates as soon as they are posted, thanks! Reposting is welcome, but please clearly credit the original article; the author reserves the right of final interpretation.
