Spark Error Notes
- 1. java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
- 2. Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=syq, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
- 3. Problem: IDEA's built-in decompiler shows only method names, with every method body replaced by /* compiled code */. This is usually caused by conflicting decompiler plugins.
- 4. Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
1. java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
After installing CDH, starting spark-shell fails with:
java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
The request totals 1024 MB of executor memory plus 384 MB of overhead, i.e. 1408 MB, which exceeds the cluster's 1024 MB container cap.
Solution:
1. Increase the two parameters from the error message to 2048 MB (a sketch of the change follows this list):
# maximum memory allocation for a single container
yarn.scheduler.maximum-allocation-mb
# total memory available to containers on each NodeManager
yarn.nodemanager.resource.memory-mb
2. Restart the YARN service.
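A minimal sketch of the change, assuming you edit yarn-site.xml directly (on CDH these values are normally set through Cloudera Manager's YARN configuration instead):

```xml
<!-- yarn-site.xml: raise both limits to the 2048 MB used above -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>2048</value>
</property>
```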
Start spark-shell again and you will run into a different error.
2. Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=syq, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
The error says user syq has no write access to /user in HDFS, so switch to the hdfs user to run spark-shell; this time it starts without any errors:
sudo -u hdfs spark-shell
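An alternative, if you would rather keep running Spark as your own user, is to create that user's home directory in HDFS as the hdfs superuser (a sketch, not from the original note; syq is the username taken from the error message):

```sh
# Create the user's HDFS home directory and hand ownership to that user
sudo -u hdfs hdfs dfs -mkdir -p /user/syq
sudo -u hdfs hdfs dfs -chown syq:syq /user/syq
```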
3. Problem: IDEA's built-in decompiler shows only method names, with every method body replaced by /* compiled code */. This is usually caused by conflicting decompiler plugins.
Solution: go to File -> Settings -> Plugins, disable 'Java Decompiler Intellij Plugin', and restart IDEA.
4. Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
The cluster has too few free resources for the executors the job requests: either add memory to the workers or shrink the request.
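A minimal sketch of shrinking the request (the values are assumptions; --executor-memory and --executor-cores are standard spark-shell/spark-submit flags):

```sh
# Ask for smaller executors so the scheduler can actually place them
spark-shell \
  --executor-memory 512m \
  --executor-cores 1
```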