Running this job over just twenty-odd records crashes the executor, both locally and on a standalone server. Switching to a different machine makes no difference.
8/10/31 22:18:26 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker for task 353,5,main]
java.lang.StackOverflowError
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1707)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:479)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
Process finished with exit code 50
I later changed the code, but the error persisted. With only a handful of records, running out of memory makes no sense. Note, though, that the log shows a StackOverflowError, not an OutOfMemoryError: it means a thread's call stack was exhausted (here, deep inside ObjectInputStream while deserializing a scala List), which can happen with tiny data if the serialized object graph is very deep, for example from a long RDD lineage in an iterative job.
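To see why small data does not rule this out, here is a standalone sketch (my own illustration, not the original job): default Java serialization walks an object graph recursively, so a long chain of tiny nodes overflows the thread stack while barely touching the heap. The class and names below are hypothetical.

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class DeepChainDemo {
    // A tiny serializable node; a long chain of these forms a deep object graph.
    static class Node implements Serializable {
        Node next;
    }

    public static void main(String[] args) throws Exception {
        // Build a chain of one million nodes -- only a few MB of heap.
        Node head = new Node();
        Node cur = head;
        for (int i = 0; i < 1_000_000; i++) {
            cur.next = new Node();
            cur = cur.next;
        }

        // ObjectOutputStream recurses once per referenced object, so the
        // write exhausts the thread stack long before memory runs out.
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(head);
            System.out.println("serialized without error");
        } catch (StackOverflowError e) {
            System.out.println("StackOverflowError while serializing a deep object graph");
        }
    }
}
```

The same effect hits Spark when an RDD's lineage (or a deeply nested immutable List) is serialized and deserialized between driver and executors, which is why the failure scales with graph depth rather than record count.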
Eventually I found a temporary workaround: delete the application and run it again from scratch.