Please credit the source when reposting: http://blog.csdn.net/u012842205/article/details/53160171
Note: setting up and configuring the environment listed below is out of scope for this article; we go straight to the point. This post walks through writing a simple Spark job that sums the lengths of its input files, then submitting it to the Spark platform with the spark-submit script.
1 Environment
OS: Ubuntu 16.04 x86_64
Spark: Apache Spark 2.0.0-bin-hadoop2.7
Hadoop: Apache Hadoop 2.7.1
JDK: Oracle JDK 1.8.0_101
Scala: 2.11.8
IDE: Scala IDE 4.4.1
One clarification up front. I have verified this myself: as long as all the required jar files are on the classpath, a Spark job whose master is local (i.e., run locally rather than on a cluster) does not actually need Hadoop or Spark to be installed. The packages needed by this article's example job are listed below (obtained via the Maven dependency plugin). However, without a Spark installation there is no $SPARK_HOME environment variable, which can cause errors. In my case the job failed with a port-binding exception; setting the spark.driver.host property to localhost in the program fixed it. Part of the error output looked like this:
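The spark.driver.host workaround can be applied when building the SparkConf. A minimal sketch in Scala, assuming Spark 2.0 jars are on the classpath; the app name "FileLengthSum" is illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("FileLengthSum")           // illustrative name for this example job
  .setMaster("local")                     // run locally; no cluster or $SPARK_HOME needed
  .set("spark.driver.host", "localhost")  // avoids the 'sparkDriver' bind failure below

val sc = new SparkContext(conf)
```

The same property can also be passed on the command line via `spark-submit --conf spark.driver.host=localhost` instead of hard-coding it.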
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
(the same WARN line repeats many times as Spark retries the bind)