Driver URL resolution problem when starting a job in Spark standalone mode on a host with virtual-machine network adapters
Reference blog: http://bit1129.iteye.com/blog/2179543
After the job starts, the following warning is printed:
17/12/22 10:01:21 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Locating the problem:
Check the executor's stdout and stderr in the worker UI. stderr shows:
Caused by: org.apache.spark.rpc.RpcTimeoutException: Cannot receive any reply from 192.168.56.1:61266 in 120 seconds. This timeout is controlled by spark.rpc.askTimeout
This IP was never used anywhere in the setup, but since the job was launched without any explicit parameters, the cause was most likely an environment-configuration issue. Checking the host's IP addresses showed that 192.168.56.1 is the IP of a virtual machine network adapter.
Next, check the worker's own log, which is
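A common remedy for this class of problem is to tell Spark explicitly which address the driver should bind to and advertise, so executors do not try to reach it through the virtual machine adapter. The sketch below is not from the original post; `<host-lan-ip>` is a placeholder for the machine's real LAN address and `spark://master:7077` for the actual master URL:

```shell
# Force Spark to bind to the physical NIC instead of letting it pick
# the VM adapter's address (192.168.56.1).
export SPARK_LOCAL_IP=<host-lan-ip>

# spark.driver.host controls the address executors use to call back
# to the driver.
spark-submit \
  --master spark://master:7077 \
  --conf spark.driver.host=<host-lan-ip> \
  your_job.jar
```

`SPARK_LOCAL_IP` and `spark.driver.host` are standard Spark settings; alternatively, `SPARK_LOCAL_IP` can be set once in `conf/spark-env.sh` so every submission picks it up.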