OS: CentOS 6.4; Spark version: 0.8.1
1. Spark official website
It hosts the software downloads, documentation, and video tutorials. Official site: click here
2. Installing Spark
For the CentOS installation procedure, see reference [1].
After installation, running the examples produced the following problem:
1)WARN cluster.ClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
ERROR client.Client$ClientActor: All masters are unresponsive! Giving up.
Fix: edit spark-env.sh (vi spark-env.sh) and add export SPARK_MASTER_IP=192.168.178.92, where the IP is the address of the master node.
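The fix above can be sketched as follows; the IP address is this cluster's master and is only an example, so substitute your own:

```shell
# conf/spark-env.sh -- sourced by every Spark daemon at startup.
# SPARK_MASTER_IP tells workers and clients which address the master
# binds to; without it the master may bind to a hostname that the
# workers cannot resolve, producing the "All masters are unresponsive"
# error above.
export SPARK_MASTER_IP=192.168.178.92   # replace with your master node's IP
```

After editing, restart the master and workers so the new setting takes effect.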
At this point Spark is usable, and step 3 can be skipped for now. Spark is running in standalone cluster mode, using its own built-in cluster manager.
3. To run Spark on Mesos, with Mesos managing the cluster, Mesos must also be installed.
Note that when installing on CentOS, the dependency packages listed on the official getting-started page (http://mesos.apache.org/gettingstarted/) use Ubuntu package names; the CentOS package names are different! At first I thought my yum was broken!! For the CentOS installation, see references [3], [4], and [5].
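As a rough sketch, the Ubuntu dependency names can be translated to their usual CentOS equivalents like this. The package names below are assumptions based on common Ubuntu-to-CentOS mappings, not taken from the Mesos page itself, so verify them with yum search and the references before relying on them:

```shell
# Approximate CentOS equivalents of the Ubuntu build dependencies
# (names are assumptions; check with `yum search <name>`):
#   build-essential   -> gcc gcc-c++ make
#   python-dev        -> python-devel
#   libcurl4-*-dev    -> libcurl-devel
#   openjdk-*-jdk     -> java-1.6.0-openjdk-devel
sudo yum install -y gcc gcc-c++ make \
    python-devel libcurl-devel zlib-devel \
    java-1.6.0-openjdk-devel
```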
Problems encountered:
1)WARNING: 'automake-1.13' is missing on your system.
Fix: download and install automake-1.13. Its installation in turn requires a newer autoconf, so download and install that too. The system probably already ships an older autoconf, which must be uninstalled first, otherwise the new installation will not take effect. For removing and installing autoconf, see reference [2].
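The uninstall-then-build sequence above can be sketched as follows. The autoconf version (2.69) and the GNU mirror URLs are examples of my choosing, not prescribed by the source:

```shell
# Remove the stock autoconf first, otherwise automake's configure
# keeps picking up the old version:
sudo yum remove -y autoconf

# Build a newer autoconf from source (automake-1.13 needs a recent one):
wget http://ftp.gnu.org/gnu/autoconf/autoconf-2.69.tar.gz
tar xzf autoconf-2.69.tar.gz
cd autoconf-2.69 && ./configure && make && sudo make install && cd ..

# Then build automake-1.13 itself:
wget http://ftp.gnu.org/gnu/automake/automake-1.13.tar.gz
tar xzf automake-1.13.tar.gz
cd automake-1.13 && ./configure && make && sudo make install && cd ..
```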
2) Libtool library used but `LIBTOOL' is undefined.
Fix: the installed libtool is too old, or was not installed properly. Uninstall the old one and reinstall.
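A minimal sketch of the reinstall, assuming a source build from the GNU mirror; the 2.4.2 version number is an example, not specified in the source:

```shell
# Replace the old libtool with a source build:
sudo yum remove -y libtool
wget http://ftp.gnu.org/gnu/libtool/libtool-2.4.2.tar.gz
tar xzf libtool-2.4.2.tar.gz
cd libtool-2.4.2 && ./configure && make && sudo make install && cd ..
# Re-run Mesos's bootstrap/configure afterwards so the new libtool is found.
```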
3)Could not link test program to Python. Maybe the main Python library has been installed in some non-standard library path. If so, pass it to configure, via the LDFLAGS environment variable. Example: ./configure LDFLAGS="-L/usr/non-standard-path/python/lib"
Fix: this is actually not a missing Python or a library-path problem; it means python-dev is missing or was not installed properly. Reinstall python-dev.
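On CentOS the package is named python-devel rather than Ubuntu's python-dev (a naming assumption consistent with the yum naming note in step 3):

```shell
# python-devel provides Python.h and the libpython shared library
# that configure's link test needs:
sudo yum install -y python-devel
# If it is already installed but broken, reinstall it:
sudo yum reinstall -y python-devel
```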
4) libjvm.so: cannot open shared object file: No such file or directory
Fix: adjust the LD_LIBRARY_PATH environment variable, i.e. export LD_LIBRARY_PATH=/usr/lib/jvm/java-1.6.0/jre/lib/amd64:/usr/lib/jvm/java-1.6.0/jre/lib/amd64/server; the exact paths depend on your own JDK installation.
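If you are unsure where your JDK keeps libjvm.so, a quick way to find it before setting the variable (the /usr/lib/jvm search root is an assumption that matches the paths above):

```shell
# Locate libjvm.so (usually in the jre/lib/<arch>/server directory):
find /usr/lib/jvm -name 'libjvm.so' 2>/dev/null

# Then export the containing directories; add this line to ~/.bashrc
# to make it persist across shells:
export LD_LIBRARY_PATH=/usr/lib/jvm/java-1.6.0/jre/lib/amd64:/usr/lib/jvm/java-1.6.0/jre/lib/amd64/server
```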
References:
[1] Spark installation: http://www.yanjiuyanjiu.com/blog/20131017/
[2] Removing and installing autoconf: http://www.cnblogs.com/sunss/archive/2011/07/20/2111697.html
[3] Mesos installation (1): http://stackoverflow.com/questions/19080597/setting-up-mesos-on-centos
[4] Mesos installation (2): http://www.it165.net/admin/html/201301/650.html
[5] Mesos installation (3): http://www.cnblogs.com/jasonkoo/articles/2834727.html