<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-assembly_2.10 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-assembly_2.10</artifactId>
    <version>1.6.0-cdh5.7.0</version>
</dependency>
The Maven dependency is shown above.
However, the class org.apache.spark.sql.api.java.JavaSQLContext cannot be found in this package.
The reason is as follows:
Prior to Spark 1.3 there were separate Java-compatible classes (JavaSQLContext and JavaSchemaRDD) that mirrored the Scala API. In Spark 1.3 the Java API and Scala API were unified. Users of either language should use SQLContext and DataFrame. In general these classes try to use types that are usable from both languages (i.e. Array instead of language-specific collections). In some cases where no common type exists (e.g., for passing in closures or Maps), function overloading is used instead.
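For example, on Spark 1.6 (the version in the dependency above) a Java program uses SQLContext and DataFrame directly, with no Java-specific wrapper classes. A minimal sketch, assuming a local master and a placeholder JSON file (the path and column name are illustrative, not from the original post):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class UnifiedApiExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("unified-api").setMaster("local[*]");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // SQLContext replaces the removed JavaSQLContext; it accepts a
        // JavaSparkContext directly.
        SQLContext sqlContext = new SQLContext(jsc);

        // DataFrame replaces the removed JavaSchemaRDD.
        // "examples/people.json" is a placeholder input file.
        DataFrame df = sqlContext.read().json("examples/people.json");
        df.printSchema();
        df.filter(df.col("age").gt(21)).show();

        jsc.stop();
    }
}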
Additionally, the Java-specific types API has been removed. Users of both Scala and Java should use the classes present in org.apache.spark.sql.types to describe schema programmatically.
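A short sketch of describing a schema programmatically from Java with the shared org.apache.spark.sql.types classes; the field names and sample rows are illustrative assumptions:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class ProgrammaticSchemaExample {
    public static void main(String[] args) {
        JavaSparkContext jsc = new JavaSparkContext("local[*]", "programmatic-schema");
        SQLContext sqlContext = new SQLContext(jsc);

        // Build the schema with the shared types API instead of the
        // removed Java-specific types. Field names are hypothetical.
        StructType schema = DataTypes.createStructType(Arrays.asList(
            DataTypes.createStructField("name", DataTypes.StringType, false),
            DataTypes.createStructField("age", DataTypes.IntegerType, true)));

        // Sample rows, built with RowFactory.
        List<Row> rows = Arrays.asList(
            RowFactory.create("Alice", 30),
            RowFactory.create("Bob", null));
        JavaRDD<Row> rdd = jsc.parallelize(rows);

        // Apply the schema to the RDD of Rows to get a DataFrame.
        DataFrame df = sqlContext.createDataFrame(rdd, schema);
        df.show();

        jsc.stop();
    }
}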
Reference: https://stackoverflow.com/questions/31648248/which-jar-contains-org-apache-spark-sql-api-java-javasqlcontext