spark java lambda — java.lang.ClassCastException when using a lambda expression in a Spark job on a remote server...

I am trying to build a web API for my Apache Spark job using the sparkjava.com framework. My code is:

@Override
public void init() {
    get("/hello", (req, res) -> {
        String sourcePath = "hdfs://spark:54310/input/*";
        SparkConf conf = new SparkConf().setAppName("LineCount");
        conf.setJars(new String[] { "/home/sam/resin-4.0.42/webapps/test.war" });
        File configFile = new File("config.properties");
        String sparkURI = "spark://hamrah:7077";
        conf.setMaster(sparkURI);
        conf.set("spark.driver.allowMultipleContexts", "true");
        JavaSparkContext sc = new JavaSparkContext(conf);
        @SuppressWarnings("resource")
        JavaRDD log = sc.textFile(sourcePath);
        JavaRDD lines = log.filter(x -> {
            return true;
        });
        return lines.count();
    });
}

If I remove the lambda expression, or run the job from a plain jar instead of from the web service (that is, from a servlet), it runs without any error. But using a lambda expression inside the servlet causes the following exception:

15/01/28 10:36:33 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, hamrah): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2089)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1999)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
    at org.apache.spark.scheduler.Task.run(Task.scala:56)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

PS: I have tried jersey and javaspark in combination with jetty, tomcat, and resin, and all of them led me to the same result.
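To make the exception easier to reason about, here is a minimal, self-contained sketch of the serialization mechanism involved. `SerFunc` and `LambdaSerDemo` are hypothetical names standing in for Spark's `org.apache.spark.api.java.function.Function` (which also extends `Serializable`) and for the servlet class capturing the lambda; this is not the questioner's code, just a plain-JVM illustration under those assumptions:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LambdaSerDemo {

    // Stand-in for Spark's Function interface, which extends Serializable;
    // only lambdas targeting a Serializable interface can be serialized.
    interface SerFunc<T, R> extends Serializable {
        R apply(T t);
    }

    static int roundTrip(String input) throws Exception {
        SerFunc<String, Integer> f = s -> s.length();

        // On serialization the JVM replaces the lambda object with a
        // java.lang.invoke.SerializedLambda via a synthetic writeReplace().
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(f);
        }

        // On deserialization, SerializedLambda.readResolve() calls back into
        // the capturing class ($deserializeLambda$) to rebuild the lambda.
        // Here the same classloader sees LambdaSerDemo, so this succeeds.
        // If that callback cannot run on a Spark executor (e.g. the class
        // bytes visible there differ from those in the shipped .war), the
        // substitution never happens and a raw SerializedLambda gets
        // assigned to the Function-typed field -- the ClassCastException
        // in the stack trace above.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            SerFunc<String, Integer> g = (SerFunc<String, Integer>) ois.readObject();
            return g.apply(input);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("hello")); // prints 5
    }
}
```

A commonly suggested workaround for this class of failure is to replace the lambda with a named static nested class implementing `org.apache.spark.api.java.function.Function`, and to make sure the exact class files the servlet container serves are the same ones shipped to the executors via `conf.setJars(...)`, so the capturing class resolves identically on both sides.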
