Steps
- Copy the hive-site.xml and hdfs-site.xml files into the project's resources folder, and add the spark-hive_2.11 dependency in Maven together with the driver needed to reach the metastore database, such as mysql-connector-java:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.47</version>
</dependency>
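For reference, the hive-site.xml placed in the resources folder normally tells Spark how to reach the Hive metastore database. The sketch below is only an illustrative example; the JDBC URL, user name, and password are placeholders and not values from the original post:

<configuration>
    <!-- JDBC connection to the metastore database (placeholder host and db name) -->
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://metastore-host:3306/hive?useSSL=false</value>
    </property>
    <!-- MySQL driver class, matching the mysql-connector-java dependency above -->
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <!-- placeholder credentials -->
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
    </property>
</configuration>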
- In the code, call enableHiveSupport() when building the SparkSession:
val spark = SparkSession.builder().master("local[*]").appName("Demo").enableHiveSupport().getOrCreate()
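Putting the two steps together, a minimal sketch looks like the following; the database and table name demo_db.demo_table are hypothetical and only stand in for an existing Hive table:

import org.apache.spark.sql.SparkSession

object HiveDemo {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() makes the SparkSession use the Hive metastore
    // described by hive-site.xml on the classpath (the resources folder)
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("Demo")
      .enableHiveSupport()
      .getOrCreate()

    // Query an existing Hive table; the table name is a placeholder
    spark.sql("SELECT * FROM demo_db.demo_table").show()

    spark.stop()
  }
}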
Possible issues
- The following exception is thrown:
javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
Cause: the hdfs-site.xml file is missing from the resources folder.
- Hive tables cannot be found when querying.
Cause: the spark-hive_2.11 dependency was not added in Maven. One way to confirm the session is actually Hive-enabled is shown below.
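The check below reuses the spark session from the sketch above; it is an assumption of this write-up rather than something from the original post. Without spark-hive on the classpath, Spark falls back to the in-memory catalog and metastore tables will not appear:

// Expect "hive" when Hive support is enabled; "in-memory" otherwise
println(spark.conf.get("spark.sql.catalogImplementation", "in-memory"))
// List the tables Spark can see in the default database
spark.catalog.listTables("default").show()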