Pitfalls hit by a newbie who doesn't know Java but still has to run Spark jobs
An ongoing record of the problems I run into while submitting jobs with spark-submit.
Continuously updated.
Driver node runs out of memory
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00007f626e5d4000, 262144, 0) failed; error='Cannot allocate memory' (errno=12)
The driver host does not have enough memory to start the application. Either place the driver on a machine with sufficient free memory, or reduce driver-memory.
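A minimal sketch of the second workaround, assuming a YARN cluster; the main class, jar name, and memory size below are placeholders, not from the original log:

```shell
# Hypothetical submit command. If the driver host cannot allocate the
# requested heap, lower --driver-memory until the JVM can start, or
# submit from a machine with more free RAM.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --driver-memory 2g \
  --class com.example.MyApp \
  my-app.jar
```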
Executor heap set too large
There is insufficient memory for the Java Runtime Environment to continue.
The executor memory is set larger than the worker node can actually allocate; reduce it appropriately.
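A sketch of shrinking the per-executor heap; the sizes and class name are illustrative placeholders, assuming you can afford more, smaller executors instead of a few huge ones:

```shell
# Hypothetical example: ask for less memory per executor so each JVM heap
# fits on the node; compensate with more executors if total memory matters.
spark-submit \
  --master yarn \
  --executor-memory 4g \
  --num-executors 10 \
  --class com.example.MyApp \
  my-app.jar
```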
ERROR LiveListenerBus: Dropping SparkListenerEvent because no remaining room in event queue. This likely means one of the SparkListeners is too slow and cannot keep up with the rate at which tasks are being started by the scheduler.
The number of events in the listener bus queue has exceeded spark.scheduler.listenerbus.eventqueue.size, so the newest events are dropped. Because those events are lost, listener state (such as the web UI) can no longer be updated, which produces the message above.
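One mitigation is to enlarge that queue. A hedged sketch, assuming the property can be passed via --conf (note that in newer Spark versions this setting is, to my knowledge, named spark.scheduler.listenerbus.eventqueue.capacity instead; 100000 below is an arbitrary example value):

```shell
# Raise the listener-bus event queue size so bursts of task events are not
# dropped. A larger queue costs extra driver memory.
spark-submit \
  --conf spark.scheduler.listenerbus.eventqueue.size=100000 \
  --class com.example.MyApp \
  my-app.jar
```

If events are still dropped after raising the limit, the real fix is to find the slow SparkListener the error message points at.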