I am setting up a Java Spark application and am following the Datastax documentation on getting started with the Java API. I've added
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.1.1</version>
</dependency>
...
and (a dse.jar previously installed into my local Maven repository):
<dependency>
  <groupId>com.datastax</groupId>
  <artifactId>dse</artifactId>
  <version>version number</version>
</dependency>
The next step in the guide is to do:
SparkConf conf = DseSparkConfHelper.enrichSparkConf(new SparkConf())
    .setAppName("My application");
DseSparkContext sc = new DseSparkContext(conf);
However, the class SparkConf can't be resolved. Should it be? Am I missing some additional Maven dependency? If so, which one?
Solution
The class is org.apache.spark.SparkConf, which lives in the spark-core_<scala version> artifact.
So the dependencies in your pom.xml might look like this:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.5.0-M2</version>
</dependency>
<dependency>
  <groupId>com.datastax</groupId>
  <artifactId>dse</artifactId>
  <version>version number</version>
</dependency>
The spark-core JAR is also located in:
dse_install/resources/spark/lib/spark_core_2.10-version.jar (tarball)
or:
/usr/share/dse/spark/lib/spark_core_2.10-version.jar (package installs)
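If you want to build against the exact JAR your DSE installation ships (rather than pulling spark-core from Maven Central), you can install that bundled JAR into your local Maven repository, the same way the dse.jar was installed. A sketch, where the file path and version are placeholders you must adjust to match your install:

```shell
# Install the DSE-bundled spark-core JAR into the local Maven repository.
# Replace the path and -Dversion with the actual values from your DSE install.
mvn install:install-file \
  -Dfile=/usr/share/dse/spark/lib/spark_core_2.10-version.jar \
  -DgroupId=org.apache.spark \
  -DartifactId=spark-core_2.10 \
  -Dversion=version \
  -Dpackaging=jar
```

After that, reference it in the pom.xml with the same groupId, artifactId, and version you passed to install-file.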