Spark-Shell: how to define the JAR loading order

I am running spark-shell locally and defining the classpath to point at some third-party JARs:

$ spark-shell --driver-class-path /Myproject/LIB/*

Within the shell, I typed:

scala> import com.google.common.collect.Lists
<console>:19: error: object collect is not a member of package com.google.common
       import com.google.common.collect.Lists
              ^

I suppose Spark loads /usr/local/spark-1.4.0-bin-hadoop2.6/lib/spark-assembly-1.4.0-hadoop2.6.0.jar first, and that jar doesn't contain the com.google.common.collect package.

/Myproject/LIB/ contains google-collections-1.0.jar, which does provide com.google.common.collect. However, this jar seems to be ignored.
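One way to sanity-check this is to print the driver JVM's classpath from inside the REPL; System.getProperty and java.io.File.pathSeparator are standard Java APIs, so this should work in any spark-shell session:

scala> System.getProperty("java.class.path").split(java.io.File.pathSeparator).foreach(println)

If google-collections-1.0.jar does not appear in the listing, the jar never made it onto the driver's classpath in the first place.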

Question: how do I tell spark-shell to load the JARs given in --driver-class-path before those in spark-1.4.0-bin-hadoop2.6/lib/?

ANSWER: combining hints from Sathish's and Holden's comments

Use --jars instead of --driver-class-path. Every jar file must be listed explicitly, comma-delimited with no spaces (as per spark-shell --help):

$ spark-shell --jars $(echo ./Myproject/LIB/*.jar | tr ' ' ',')
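To illustrate what the command substitution produces, suppose /Myproject/LIB/ contains google-collections-1.0.jar plus one more jar (the second name below is hypothetical). The shell expands the glob into a space-separated list, and tr ' ' ',' rewrites the spaces as commas:

$ echo /Myproject/LIB/*.jar | tr ' ' ','
/Myproject/LIB/google-collections-1.0.jar,/Myproject/LIB/some-other-lib.jar

Note that this trick breaks if any jar path itself contains a space.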

SOLUTION

The driver class path flag needs to be comma-separated. So, based on "Setting multiple jars in java classpath", we can try:

$ spark-shell --driver-class-path $(echo ./Myproject/LIB/*.jar | tr ' ' ',')
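Assuming the jars were picked up, the import that failed in the question should now resolve. A minimal check in the REPL (Lists.newArrayList is part of google-collections; the exact res numbering will vary):

scala> import com.google.common.collect.Lists
import com.google.common.collect.Lists

scala> Lists.newArrayList("a", "b")
res0: java.util.ArrayList[String] = [a, b]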
