Error: Exception in thread "main" java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem
Error output:
Exception in thread "main" java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
at java.lang.Class.getDeclaredFields(Class.java:1916)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:72)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:1560)
at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:185)
at org.apache.flink.streaming.api.datastream.DataStream.addSink(DataStream.java:1191)
at KafkaHadoop.main(KafkaHadoop.java:46)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
Background:
I was building a Kafka + Flink + Hadoop pipeline to read data from Kafka and write it into HDFS. This error kept coming up even though the code itself looked fine, and it blocked me for quite a while.
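For reference, the job was shaped roughly like the sketch below (a minimal example, not my exact code; the Kafka address, topic name, and HDFS path are placeholders, and the sketch assumes a BucketingSink from flink-connector-filesystem). The failure surfaces at addSink because Flink's ClosureCleaner reflects over the sink object's declared fields, and BucketingSink declares fields typed org.apache.hadoop.fs.FileSystem, so that class must be loadable at that moment:

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class KafkaHadoop {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source (bootstrap servers and topic are placeholders)
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "flink-hdfs-demo");
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer011<String>("test", new SimpleStringSchema(), props));

        // HDFS sink; BucketingSink has fields of type org.apache.hadoop.fs.FileSystem,
        // so the ClosureCleaner needs Hadoop classes on the classpath at this point
        BucketingSink<String> sink = new BucketingSink<>("hdfs://localhost:9000/flink/output");
        stream.addSink(sink); // <-- the NoClassDefFoundError is thrown from this call

        env.execute("kafka-to-hdfs");
    }
}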
Resolution:
Step 1: I suspected a dependency conflict, so I commented out the Hadoop-related dependencies in the pom one by one, but the error was still there.
Step 2: I then suspected that the jar containing the class was simply missing. Searching MVNREPOSITORY for dependencies related to org.apache.hadoop.fs.FileSystem turned up hadoop-hdfs, flink-connector-filesystem_2.11, and flink-hadoop-fs, but adding them made no difference (a quick way to confirm whether the class is actually on the runtime classpath is shown in the sketch after these steps).
Step 3: So I searched Baidu for org.apache.hadoop.fs.FileSystem and followed a few of the result pages (the screenshots originally shown here are omitted). I could not make full sense of them, but they suggested the class ships in one of a handful of jars, so I went back to MVNREPOSITORY and searched for hadoop-core:
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.1.0</version>
</dependency>
Adding this dependency resolved the error.
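As a side note for step 2: when juggling dependencies like this, a quick throwaway check (my own addition, not part of the job) makes it obvious whether org.apache.hadoop.fs.FileSystem is actually visible on the runtime classpath, and which jar it comes from:

public class FsClasspathCheck {
    public static void main(String[] args) {
        try {
            // Resolves only if some jar on the classpath contains the class
            Class<?> fs = Class.forName("org.apache.hadoop.fs.FileSystem");
            java.security.CodeSource src = fs.getProtectionDomain().getCodeSource();
            System.out.println("Found in: " + (src != null ? src.getLocation() : "<unknown location>"));
        } catch (ClassNotFoundException e) {
            System.out.println("org.apache.hadoop.fs.FileSystem is NOT on the classpath");
        }
    }
}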
Full pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>yk.bigdate</groupId>
<artifactId>flinkHDFS</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.6.1</flink.version>
<slf4j.version>1.7.7</slf4j.version>
<log4j.version>1.2.17</log4j.version>
</properties>
<dependencies>
<!--******************* flink *******************-->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- <dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>${flink.version}</version>
</dependency>-->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.11_2.11</artifactId>
<version>${flink.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-filesystem_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.1.0</version>
</dependency>
<!--******************* kafka *******************-->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>1.1.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<!-- package the job into a jar with dependencies -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>com.allen.capturewebdata.Main</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
</project>
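To double-check that the Hadoop client classes pulled in by hadoop-core really work end to end, a small smoke test like the following can be run from the same project (a sketch only; the namenode address and path are placeholders, and hadoop-core 1.x uses the fs.default.name key):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsSmokeTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder namenode address; replace with your cluster's value
        conf.set("fs.default.name", "hdfs://localhost:9000");

        FileSystem fs = FileSystem.get(conf); // the class the job originally failed to load
        Path out = new Path("/flink/output");
        System.out.println(out + " exists: " + fs.exists(out));
        fs.close();
    }
}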