1. How does Spark execute a program?

First, look at Spark's deployment diagram. The node types are:

1. master node: runs a long-lived master process that manages all worker nodes.
2. worker node: runs a long-lived worker process that manages the executors on that node and communicates with the master node.

driver: the official definition is "The process running the main() function of the application and creating the SparkContext." In other words, the driver is the application program that the user writes.
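To make the driver concept concrete, below is a minimal sketch of such an application: it runs main(), creates the SparkContext, and submits a trivial job. The object name HelloSpark, the local[*] master URL, and the summing job are illustrative assumptions, not from the original post.

import org.apache.spark.{SparkConf, SparkContext}

// A minimal driver: the process that runs main() and creates the SparkContext.
object HelloSpark {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs everything in-process; point it at a cluster URL
    // such as spark://master:7077 to use the master/worker deployment above.
    val conf = new SparkConf().setAppName("HelloSpark").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // A trivial job: parallelize a small range and sum it on the executors.
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"sum = $sum")

    sc.stop() // release resources when the driver finishes
  }
}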
Configure the application as follows to resolve the error:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("helloSpark")
  .setMaster("spark://master:7077")
  .set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)
sc.addJar("/home/spark/IdeaProjects/FirstApp/
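This likely addresses the common case where tasks fail on the executors with a ClassNotFoundException: sc.addJar ships the application's jar to the cluster so that executors running on the worker nodes can load the application's classes. When the application is launched with spark-submit instead of from an IDE, the application jar is distributed automatically.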