Hit an error while running a job today:
Job failed with java.lang.ArrayIndexOutOfBoundsException: 1
FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Spark job failed during runtime. Please check stacktrace for the root cause.
Huh?
An array index out of bounds???
That surely wasn't the real cause, so I searched around. One suggestion was insufficient memory, so I increased the executor memory:
set spark.executor.memory=6g;
And then.........
it failed again.
So I went to the YARN logs and found this error line:
Unexpected exception from MapJoinOperator : null
Aha!! This error is clearly coming from the map join.
So I just turned it off:
set hive.auto.convert.join=false;
And then it worked........ though I never quite figured out what was actually going on................
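A plausible explanation (my assumption; the logs above don't confirm it): with hive.auto.convert.join=true, Hive converts a join to a map join when it estimates one side is small enough to fit in memory, and each task builds an in-memory hash table of that "small" table. If the table statistics are stale or missing, a large table can be misjudged as small, and the MapJoinOperator can then fail at runtime, which is why falling back to a regular shuffle join made the job pass. Instead of disabling map joins globally, a gentler sketch using standard Hive settings (the threshold value shown is the documented default; the table name is hypothetical):

```sql
-- Keep automatic map-join conversion on...
set hive.auto.convert.join=true;

-- ...but control the size threshold (in bytes) under which Hive will
-- broadcast a table as a map join. The documented default is 10000000
-- (~10 MB); lower it if estimates are unreliable.
set hive.auto.convert.join.noconditionaltask.size=10000000;

-- Refresh statistics so the optimizer's size estimates are accurate
-- (hypothetical table name):
ANALYZE TABLE my_table COMPUTE STATISTICS;
```

This way other queries that genuinely benefit from map joins keep working, and only oversized tables fall back to shuffle joins.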