IDEA underlines the following plugins in red and reports errors:

maven-scala-plugin
maven-shade-plugin

Checking the local Maven repository shows that one of the plugins has no jar at all and the other's jar cannot be unpacked. After deleting the corrupt jar from the repository and running Reimport, the download succeeds and IDEA no longer reports the error:
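As a quick sanity check before deleting anything, each suspect artifact can be tested as a zip archive. This is only a sketch, assuming a Unix-like shell (e.g., Git Bash on Windows) and the repository path that appears later in the debug output; the plugin coordinates are taken from the error message further below:

# Test whether a plugin jar in the local repository is a valid zip archive
REPO=/d/Development/MavenRepository    # assumed Git Bash mapping of D:\Development\MavenRepository
unzip -t "$REPO/org/scala-tools/maven-scala-plugin/2.15.2/maven-scala-plugin-2.15.2.jar" \
  && echo "jar looks intact" \
  || echo "jar is missing or corrupt - delete its directory and Reimport"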

In IDEA's Maven panel, enable the option to skip tests, in preparation for packaging:
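Toggling this in the Maven panel corresponds to the standard Surefire property on the command line, roughly:

# Command-line equivalent of skipping tests while packaging (tests are compiled but not run)
mvn package -DskipTests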

Package the Spark program:

Packaging fails with the following errors:

Failed to execute goal org.scala-tools:maven-scala-plugin:2.15.2:compile

wrap: org.apache.commons.exec.ExecuteException

error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.

Re-run Maven using the -X switch to enable full debug logging.

Judging from related blog posts (see the references at the end), the cause is most likely another corrupt jar in the local Maven repository, but the error message is too vague to pinpoint it. To make Maven print more detail, change the Output level from Info to Debug:

Run Clean first, then Compile:
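Outside of IDEA, the equivalent command with full debug logging (the -X switch suggested in the error message above) would look roughly like this; redirecting to a file makes the very long output searchable:

# Clean and compile with full debug output, then filter for fatal errors
mvn -X clean compile > build-debug.log 2>&1
grep FATAL build-debug.log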

The build fails again; filtering the debug output leaves the following errors:

[FATAL] Non-parseable POM D:\Development\MavenRepository\org\apache\hadoop\hadoop-mapreduce-client-core\2.6.0-cdh5.14.2\hadoop-mapreduce-client-core-2.6.0-cdh5.14.2.pom: end tag name </head> must be the same as start tag <link> from line 21 (position: TEXT seen ...<![endif]-->\r\n</head>... @66:8)  @ line 66, column 8

[FATAL] Non-parseable POM D:\Development\MavenRepository\org\apache\phoenix\phoenix-core\4.14.0-cdh5.14.2\phoenix-core-4.14.0-cdh5.14.2.pom: end tag name </head> must be the same as start tag <link> from line 21 (position: TEXT seen ...<![endif]-->\r\n</head>... @66:8)  @ line 66, column 8

[FATAL] Non-parseable POM D:\Development\MavenRepository\com\lmax\disruptor\3.3.8\disruptor-3.3.8.pom: end tag name </head> must be the same as start tag <link> from line 21 (position: TEXT seen ...<![endif]-->\r\n</head>... @66:8)  @ line 66, column 8

… (further [FATAL] entries of the same kind omitted)

The [FATAL] messages suggest that these .pom files contain an HTML page rather than a real POM. Checking the corresponding directories confirms that the jars are broken as well and cannot be unpacked, so the bad artifacts are deleted from the local repository and Reimport is run to download them again:
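To locate every contaminated artifact in one pass rather than one at a time, the local repository can be scanned for POMs that are really HTML and for jars that fail a zip test. A minimal sketch, again assuming a Unix-like shell and the same repository path:

# Find .pom files that are actually HTML error pages
REPO=/d/Development/MavenRepository    # assumed mapping of D:\Development\MavenRepository
grep -rl --include='*.pom' '<html' "$REPO"
# Find .jar files that are not valid zip archives
find "$REPO" -name '*.jar' -exec sh -c 'unzip -tqq "$1" >/dev/null 2>&1 || echo "$1"' _ {} \;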

IDEA still shows some red underlines, but the underlined jars have now been imported and can be unpacked. Running Compile succeeds, and Package then succeeds as well, returning the following:

[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing D:\Development\asset\statistics-master-190725\target\statistics-1.0-SNAPSHOT.jar with D:\Development\asset\statistics-master-190725\target\statistics-1.0-SNAPSHOT-shaded.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  02:49 min
[INFO] Finished at: 2019-07-25T16:41:04+08:00
[INFO] ------------------------------------------------------------------------

Upload the jar to the server:
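A typical way to do this, assuming the build machine has an scp client available and using the host and target directory from the spark-submit call below:

# Hypothetical upload from the Windows build machine to node2
scp "D:\Development\asset\statistics-master-190725\target\statistics-1.0-SNAPSHOT.jar" root@node2:/home/microservices/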

Run the packaged Spark program:

[root@node2 ~]# spark-submit  --master yarn-cluster --driver-memory 4g --num-executors  --executor-memory 2g --executor-cores   --class statistics.CostAsset   --conf spark.driver.extraClassPath=/opt/cloudera/parcels/APACHE_PHOENIX--cdh5./lib/phoenix/lib/*  --conf spark.executor.extraClassPath=/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.14.2.p0.3/lib/phoenix/lib/* /home/microservices/statistics-1.0-SNAPSHOT.jar total model
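Note that the values of --num-executors and --executor-cores (and part of the driver-side Phoenix parcel path) were lost in the capture above and are left as-is. A sketch of the fully specified form, with placeholder values that are assumptions rather than the values actually used:

# Placeholder values only; adjust for the actual cluster
NUM_EXECUTORS=3        # assumed: one executor per NodeManager
EXECUTOR_CORES=2       # assumed value
PHOENIX_LIB=/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.14.2.p0.3/lib/phoenix/lib
spark-submit \
  --master yarn-cluster \
  --driver-memory 4g \
  --num-executors "$NUM_EXECUTORS" \
  --executor-memory 2g \
  --executor-cores "$EXECUTOR_CORES" \
  --class statistics.CostAsset \
  --conf spark.driver.extraClassPath="$PHOENIX_LIB/*" \
  --conf spark.executor.extraClassPath="$PHOENIX_LIB/*" \
  /home/microservices/statistics-1.0-SNAPSHOT.jar total model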

After running it, the following output is returned:

[root@node2 ~]# spark-submit  --master yarn-cluster --driver-memory 4g --num-executors  --executor-memory 2g --executor-cores   --class statistics.CostAsset   --conf spark.driver.extraClassPath=/opt/cloudera/parcels/APACHE_PHOENIX--cdh5./lib/phoenix/lib/*  --conf spark.executor.extraClassPath=/opt/cloudera/parcels/APACHE_PHOENIX-4.14.0-cdh5.14.2.p0.3/lib/phoenix/lib/* /home/microservices/statistics-1.0-SNAPSHOT.jar total model
19/07/25 16:54:52 INFO client.RMProxy: Connecting to ResourceManager at node1/10.200.101.131:8032
19/07/25 16:54:52 INFO yarn.Client: Requesting a new application from cluster with 3 NodeManagers
19/07/25 16:54:52 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (40874 MB per container)
19/07/25 16:54:52 INFO yarn.Client: Will allocate AM container, with 4505 MB memory including 409 MB overhead
19/07/25 16:54:52 INFO yarn.Client: Setting up container launch context for our AM
19/07/25 16:54:52 INFO yarn.Client: Setting up the launch environment for our AM container
19/07/25 16:54:52 INFO yarn.Client: Preparing resources for our AM container
19/07/25 16:54:53 INFO yarn.Client: Uploading resource file:/home/microservices/statistics-1.0-SNAPSHOT.jar -> hdfs://node1:8020/user/root/.sparkStaging/application_1563417834812_0018/statistics-1.0-SNAPSHOT.jar
19/07/25 16:54:54 INFO yarn.Client: Uploading resource file:/tmp/spark-2573c8b3-f471-452f-85b1-d3582877290e/__spark_conf__4623511860207833838.zip -> hdfs://node1:8020/user/root/.sparkStaging/application_1563417834812_0018/__spark_conf__4623511860207833838.zip
19/07/25 16:54:54 INFO spark.SecurityManager: Changing view acls to: root
19/07/25 16:54:54 INFO spark.SecurityManager: Changing modify acls to: root
19/07/25 16:54:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
19/07/25 16:54:54 INFO yarn.Client: Submitting application 18 to ResourceManager
19/07/25 16:54:54 INFO impl.YarnClientImpl: Submitted application application_1563417834812_0018
19/07/25 16:54:55 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:54:55 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: root.users.root
     start time: 1564044894496
     final status: UNDEFINED
     tracking URL: http://node1:8088/proxy/application_1563417834812_0018/
     user: root
19/07/25 16:54:56 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:54:57 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:54:58 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:54:59 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:55:00 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:00 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 10.200.101.133
     ApplicationMaster RPC port: 0
     queue: root.users.root
     start time: 1564044894496
     final status: UNDEFINED
     tracking URL: http://node1:8088/proxy/application_1563417834812_0018/
     user: root
19/07/25 16:55:01 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:02 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:03 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:04 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:05 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:06 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:07 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:08 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:09 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:10 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:55:10 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: root.users.root
     start time: 1564044894496
     final status: UNDEFINED
     tracking URL: http://node1:8088/proxy/application_1563417834812_0018/
     user: root
19/07/25 16:55:11 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:55:12 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:55:13 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:55:14 INFO yarn.Client: Application report for application_1563417834812_0018 (state: ACCEPTED)
19/07/25 16:55:15 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:15 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 10.200.101.135
     ApplicationMaster RPC port: 0
     queue: root.users.root
     start time: 1564044894496
     final status: UNDEFINED
     tracking URL: http://node1:8088/proxy/application_1563417834812_0018/
     user: root
19/07/25 16:55:16 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:17 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:18 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:19 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:20 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:21 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:22 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:23 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:24 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:25 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:26 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:27 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:28 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:29 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:30 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:31 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:32 INFO yarn.Client: Application report for application_1563417834812_0018 (state: RUNNING)
19/07/25 16:55:33 INFO yarn.Client: Application report for application_1563417834812_0018 (state: FINISHED)
19/07/25 16:55:33 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 10.200.101.135
     ApplicationMaster RPC port: 0
     queue: root.users.root
     start time: 1564044894496
     final status: SUCCEEDED
     tracking URL: http://node1:8088/proxy/application_1563417834812_0018/
     user: root
19/07/25 16:55:33 INFO util.ShutdownHookManager: Shutdown hook called
19/07/25 16:55:33 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-2573c8b3-f471-452f-85b1-d3582877290e

Analyze the job through the web UI (the tracking URL shown in the output above):

References:

https://www.cnblogs.com/nurseryboy/p/6155925.html

https://www.oschina.net/question/1422726_2263380?sort=time
