The example above is redundant: --jars already distributes a dependency to both the driver and the executors, while --driver-library-path was added here for the command-line-argument-parsing jar (scopt) that only the driver needs; the executors do not need it. (Strictly speaking, --driver-library-path sets the driver's native library path; to add jars to the driver classpath, --driver-class-path is the intended flag.)
If the executors also need the dependency, set spark.executor.extraClassPath to augment the executor classpath. Note that this is a configuration property passed via --conf (there is no standalone --spark.executor.extraClassPath flag).
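As a sketch of the non-redundant form (jar and class names taken from the example below; the classpath value after `=` is an assumption that the jar shipped by --jars lands in each executor's working directory, so the bare file name resolves there):

```shell
# Sketch: --jars ships the jar to driver and executors;
# --driver-class-path puts it on the driver's classpath;
# spark.executor.extraClassPath (via --conf) puts it on the executors'.
spark-submit --class EntropyWeights \
  --jars scopt_2.10-3.5.0.jar \
  --driver-class-path scopt_2.10-3.5.0.jar \
  --conf spark.executor.extraClassPath=scopt_2.10-3.5.0.jar \
  EntropyWeights.jar
```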
spark-submit --class EntropyWeights \
  --jars scopt_2.10-3.5.0.jar \
  --driver-library-path scopt_2.10-3.5.0.jar \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  /home/mart_mobile/fdy/EntropyWeights.jar \
  -t app.app_celebrity_properties_4rank_da \
  -d author_id1,author_pin \
  --colweights commission=1.0,pv=1.0,uv=1.0,upvote_num=1.0,comment_num=1.0,share_num=1.0,enter_detail_pv=1.0,enter_detail_uv=1.0,ordnum_inby_5thevent=1.0,ordsum_inby_5thevent=1.0,ordnum_in_direct=1.0,ordsum_in_direct=1.0,ordnum_in_indirect=1.0,ordsum_in_indirect=1.0,detail_ratio=1.0,import_ratio=1.0,fans_num=1.0,rank=1.0,open_rate=1.0, \
  -o app.app_celebrity_rank_da
On the new cluster, an org.apache.commons.math jar dependency came up (this shows how to specify multiple jar dependencies, and the separators: --jars takes a comma-separated list, --driver-library-path a colon-separated one):

spark-submit --class EntropyWeights \
  --jars ./scopt_2.10-3.5.0.jar,./commons-math-2.1.jar \
  --driver-library-path ./scopt_2.10-3.5.0.jar:./commons-math-2.1.jar \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  ./EntropyWeights.jar \
  -i app.app_celebrity_properties_4rank_da \
  -d author_id1,author_pin \
  --colweights commission=1.0,pv=1.0,uv=1.0,upvote_num=1.0,comment_num=1.0,share_num=1.0,enter_detail_pv=1.0,enter_detail_uv=1.0,ordnum_inby_5thevent=1.0,ordsum_inby_5thevent=1.0,ordnum_in_direct=1.0,ordsum_in_direct=1.0,ordnum_in_indirect=1.0,ordsum_in_indirect=1.0,detail_ratio=1.0,import_ratio=1.0,fans_num=1.0,rank=1.0,open_rate=1.0, \
  -o app.app_celebrity_rank_da -s 5.0 -t norm
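The two separator conventions are easy to mix up, so a small bash sketch (jar names taken from the command above) that builds both lists from one array can help keep them consistent:

```shell
# The jars needed by the job (same list as in the spark-submit above).
JARS=(./scopt_2.10-3.5.0.jar ./commons-math-2.1.jar)

# --jars wants a comma-separated list; join with ','.
JARS_CSV=$(IFS=,; echo "${JARS[*]}")

# Classpath-style options (--driver-library-path here) want ':'.
JARS_CP=$(IFS=:; echo "${JARS[*]}")

echo "$JARS_CSV"   # ./scopt_2.10-3.5.0.jar,./commons-math-2.1.jar
echo "$JARS_CP"    # ./scopt_2.10-3.5.0.jar:./commons-math-2.1.jar
```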