Spark error: Caused by: java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to [Lscala.collection.immutable.Map;
Background
I wrote a UDF that checks whether two columns are equal. The two columns share the same schema, but the structure is fairly complex (this is the df.printSchema() output):
|-- list: array (nullable = true)
| |-- element: map (containsNull = true)
| | |-- key: string
| | |-- value: array (valueContainsNull = true)
| | | |-- element: struct (containsNull = true)
| | | | |-- Date: integer (nullable = true)
| | | | |-- Name: string (nullable = true)
|-- list2: array (nullable = true)
| |-- element: map (containsNull = true)
| | |-- key: string
| | |-- value: array (valueContainsNull = true)
| | | |-- element: struct (containsNull = true)
| | | | |-- Date: integer (nullable = true)
| | | | |-- Name: string (nullable = true)
That is, an Array contains Maps, and each Map value contains another Array, so equality has to be checked level by level. The UDF I wrote:
import org.apache.spark.sql.functions.udf

// Matches the element struct in the schema above: struct<Date:int,Name:string>
case class AppList(Date: Int, Name: String)

def isMapEqual(map1: Map[String, Array[AppList]], map2: Map[String, Array[AppList]]): Boolean = {
  try {
    if (map1.size != map2.size) {
      return false
    } else {
      for (x <- map1.keys) {
        // NB: != on Arrays compares references, not contents
        if (map1(x) != map2(x)) {
          return false
        }
      }
      return true
    }
  } catch {
    case e: Exception => false
  }
}

def isListEqual(list1: Array[Map[String, Array[AppList]]], list2: Array[Map[String, Array[AppList]]]): Boolean = {
  try {
    if (list1.length != list2.length) {
      return false
    } else if (list1.length == 0 || list2.length == 0) {
      return false
    } else {
      // Only the first pair of maps is compared
      return isMapEqual(list1(0), list2(0))
    }
  } catch {
    case e: Exception => false
  }
}

val isColumnEqual = udf((list1: Array[Map[String, Array[AppList]]], list2: Array[Map[String, Array[AppList]]]) => {
  isListEqual(list1, list2)
})
I then pasted this into spark-shell and ran:
val dat = df.withColumn("equal", isColumnEqual($"list1", $"list2"))
dat.show()
which produced the following error:
Caused by: org.apache.spark.SparkException: Failed to execute user defined function($anonfun$1: (array<map<string,array<struct<Date:int,Name:string>>>>, array<map<string,array<struct<Date:int,Name:string>>>>) => boolean)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to [Lscala.collection.immutable.Map;
at $anonfun$1.apply(<console>:42)
... 16 more
Solution
The so-called solution was, of course, to Google it…
A StackOverflow answer (linked in the references below) said that changing Array to Seq would fix it. Somewhat sheepishly, I tried it, and sure enough it worked.
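For concreteness, here is a minimal sketch of the corrected UDF, assuming the only required change is replacing Array with Seq in the parameter types (I have also tightened the logic to compare every element of the lists rather than only the first, which appears to be what was intended):

import org.apache.spark.sql.functions.udf

case class AppList(Date: Int, Name: String)

// Seq equality is structural, so == compares contents element by element.
def isMapEqual(map1: Map[String, Seq[AppList]], map2: Map[String, Seq[AppList]]): Boolean =
  map1.size == map2.size && map1.keys.forall(k => map2.get(k).contains(map1(k)))

def isListEqual(list1: Seq[Map[String, Seq[AppList]]], list2: Seq[Map[String, Seq[AppList]]]): Boolean =
  list1.length == list2.length && list1.nonEmpty &&
    list1.zip(list2).forall { case (m1, m2) => isMapEqual(m1, m2) }

val isColumnEqual = udf((list1: Seq[Map[String, Seq[AppList]]], list2: Seq[Map[String, Seq[AppList]]]) =>
  isListEqual(list1, list2))

One caveat: at runtime Spark typically passes struct values to a Scala UDF as Row objects rather than AppList instances; since this sketch only tests equality (and the type parameters are erased), the comparison still behaves correctly.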
Why it happens
The answer in the second reference explains:
So it looks like the ArrayType on Dataframe "idDF" is really a WrappedArray and not an Array - So the function call to "filterMapKeysWithSet" failed as it expected an Array but got a WrappedArray/ Seq instead (which doesn't implicitly convert to Array in Scala 2.8 and above).
In other words, the value Spark hands to the UDF is not a native Scala Array but a wrapped one, a WrappedArray, which is a Seq. (If anything here is wrong, please point it out; I've hardly written any Scala, so I'm a bit out of my depth.)
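You can see the wrapping directly in a Scala REPL (Scala 2.12 and earlier; in 2.13 the wrapper is scala.collection.immutable.ArraySeq instead):

// Assigning an Array where a Seq is expected triggers the implicit
// wrapArray conversion, so the value is a WrappedArray, not a JVM array.
val wrapped: Seq[Int] = Array(1, 2, 3)
println(wrapped.getClass.getName) // scala.collection.mutable.WrappedArray$ofInt

// The reverse direction is a hard cast and fails at runtime --
// the same kind of ClassCastException as in the stack trace above:
// wrapped.asInstanceOf[Array[Int]]

This is why declaring the UDF parameters as Seq works: the WrappedArray Spark passes in is a Seq, so Seq is what the UDF should ask for.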
References
- https://stackoverflow.com/questions/40199507/scala-collection-mutable-wrappedarrayofref-cannot-be-cast-to-integer
- https://stackoverflow.com/questions/40764957/spark-java-lang-classcastexception-scala-collection-mutable-wrappedarrayofref