In Spark application development, it is easy to run into an error like the following:

```
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(C…
```
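Before getting to the Hadoop-specific case, here is a minimal sketch of how this error typically arises: a lambda passed to an RDD operation captures a driver-side object that does not implement java.io.Serializable, so Spark's ClosureCleaner refuses to ship the closure to the executors. The `Counter` class and `NotSerializableDemo` object below are hypothetical names used only for illustration.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// A driver-side helper that does NOT implement java.io.Serializable.
class Counter {
  var n = 0
}

object NotSerializableDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local"))
    val counter = new Counter // lives only on the driver

    // The lambda captures `counter`, so Spark has to serialize the whole closure
    // to send it to executors; ClosureCleaner throws "Task not serializable".
    sc.parallelize(1 to 10).map(x => x + counter.n).collect()

    sc.stop()
  }
}
```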
There is relatively little written online about Spark's serialization, and I have not yet had time to read through that part of the code myself. While programming today I ran into a case where a Hadoop Writable implementation could not be serialized in Spark. My code is as follows:

```scala
object EntryApp extends App {
  val conf = new SparkConf().setAppName("cgbdata").setMaster("local")
  val sc = new SparkContext(conf)
  val hadoopConfig = new…
```
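The snippet above is cut off, so the exact trigger is not visible here, but the underlying issue is that Hadoop Writable types (Text, LongWritable, and so on) do not implement java.io.Serializable. A minimal sketch of the common pattern and a common workaround is below; the input path and the `WritableDemo` name are assumptions for illustration, not the original code.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

object WritableDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("writable-demo").setMaster("local"))

    // newAPIHadoopFile yields an RDD of Hadoop Writables (LongWritable, Text).
    // Writable does not extend java.io.Serializable, so any step that has to
    // serialize these records (collect, shuffle, caching with the default Java
    // serializer) fails with a serialization error.
    val raw = sc.newAPIHadoopFile[LongWritable, Text, TextInputFormat]("hdfs:///tmp/input")

    // Common workaround: convert the Writables to plain JVM types right away,
    // before any shuffle or collect. This also sidesteps the fact that Hadoop
    // record readers reuse the same Writable instances across records.
    val lines = raw.map { case (offset, text) => (offset.get(), text.toString) }
    lines.take(5).foreach(println)

    sc.stop()
  }
}
```

Another option, if the Writables really need to cross a serialization boundary, is to switch to Kryo and register a serializer for the Writable classes, but converting to plain types early is usually the simpler fix.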