Caused by: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
When connecting to a remote Spark cluster from Eclipse (Java) for running and debugging, this exception appears with both Hadoop and Hive. Packaging the application into a jar and running it on the cluster works fine, and switching the master to local also makes the problem go away.
import org.apache.spark.sql.SparkSession;

// With master set to "local", driver and executors share one JVM and classpath,
// so the ClassCastException does not occur; the remote master URL is kept for reference.
String appName = "Java Spark Hive Example";
String master = "local"; // "spark://master.spark.redblue-ai.com:7077";
String metastore = "thrift://metastore.hive.redblue-ai.com:9083";
String warehouse = "hdfs://namenode.hadoop.redblue-ai.com:8020/user/hive/warehouse";
SparkSession spark = SparkSession.builder().appName(appName).master(master)
        .config("hive.metastore.uris", metastore)     // Hive metastore endpoint
        .config("spark.sql.warehouse.dir", warehouse) // Hive warehouse location on HDFS
        .enableHiveSupport().getOrCreate();
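
Note that running with a local master sidesteps the issue rather than fixing it: with a remote master, executors deserialize tasks against their own classpath, and this ClassCastException typically shows up when the executor classpath does not match the driver's (for example, the application's own classes were never shipped to the cluster, or the Spark version on the IDE classpath differs from the cluster's). If you need to keep the remote master while debugging from the IDE, one common workaround is to ship the packaged application jar to the executors explicitly via the spark.jars configuration. A minimal sketch; the jar path below is a placeholder, and the project has to be packaged (e.g. mvn package) before each run:

import org.apache.spark.sql.SparkSession;

// Sketch: keep the remote standalone master, but distribute the application jar
// so executors deserialize task closures against the same class definitions.
SparkSession spark = SparkSession.builder()
        .appName("Java Spark Hive Example")
        .master("spark://master.spark.redblue-ai.com:7077")
        .config("spark.jars", "/path/to/your-app.jar") // placeholder: path to your packaged jar
        .config("hive.metastore.uris", "thrift://metastore.hive.redblue-ai.com:9083")
        .config("spark.sql.warehouse.dir", "hdfs://namenode.hadoop.redblue-ai.com:8020/user/hive/warehouse")
        .enableHiveSupport()
        .getOrCreate();

This mirrors what spark-submit does automatically when you submit the packaged jar, which is why the cluster run works while the IDE run fails.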