Pairing Spark 1.1 with Hadoop 2.5.1

<!-- lang: shell -->
    [root@localhost java]# cd spark
    [root@localhost spark]# git branch
    * branch-1.1

    [root@localhost spark]# ls
    assembly     conf  dist    examples  lib_managed           mllib    python     sbt                                 sql        tox.ini
    bagel        core  docker  external  lib_managed_bak       NOTICE   README.md  scalastyle-config.xml               streaming  yarn
    bin          data  docs    extras    LICENSE               pom.xml  repl       scalastyle-output.xml               target
    CHANGES.txt  dev   ec2     graphx    make-distribution.sh  project  sbin       spark-1.1.2-SNAPSHOT-bin-2.5.1.tgz  tools

    [root@localhost spark]# ls assembly/target/
    classes                         scala-2.10
    maven-shared-archive-resources  test-classes
    [root@localhost spark]# ls assembly/target/scala-2.10/
    spark-assembly-1.1.2-SNAPSHOT-hadoop2.5.1.jar
    [root@localhost spark]# bin/spark-shell
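
With the Hadoop 2.5.1 assembly in place, the shell starts a local SparkContext. A quick sanity check (a minimal sketch; this job is illustrative and not part of the original session):

<!-- lang:scala -->
    // run inside bin/spark-shell; `sc` is the SparkContext the shell provides
    val rdd = sc.parallelize(1 to 1000)
    println(rdd.reduce(_ + _))  // expect 500500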

The assembly above was built with Maven: raise Maven's memory limits and compile against Hadoop 2.5.1 via the hadoop-2.4 profile (which covers Hadoop 2.4 and later):

<!-- lang: shell -->
    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.1 -Phive -X -DskipTests clean package
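
After the build, the Hadoop client bundled into the assembly can be confirmed from spark-shell. A minimal sketch using Hadoop's VersionInfo API (it should report 2.5.1 if the -Dhadoop.version flag took effect):

<!-- lang:scala -->
    // run inside bin/spark-shell: print the Hadoop version the assembly was built against
    import org.apache.hadoop.util.VersionInfo
    println(VersionInfo.getVersion)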


To package a distribution, edit the Maven command in make-distribution.sh and remove the clean step so the artifacts just built are not wiped:

<!-- lang: shell -->
    #BUILD_COMMAND="mvn clean package -DskipTests $@"
    BUILD_COMMAND="mvn package -DskipTests $@"

Download: http://pan.baidu.com/s/1i3qt6Ix (extraction password: x242)

A related note on Scala/Java compatibility: either update to a newer Scala version (2.10.3+) or downgrade Java to Java 6/7. As the output shows, Scala 2.9.2 was released long before Java 8 was introduced (Copyright 2002-2011, LAMP/EPFL), so the two don't work well together.
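
To see which Scala and Java versions the shell is actually running, the standard runtime properties can be printed (a minimal sketch; the values in the comments are only examples):

<!-- lang:scala -->
    // run inside bin/spark-shell: report the Scala and JVM versions in use
    println(scala.util.Properties.versionString)  // e.g. "version 2.10.4"
    println(System.getProperty("java.version"))   // e.g. "1.7.0_71"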

Reposted from: https://my.oschina.net/innovation/blog/346171
