Task description:
Use Spark code locally to operate on Hive data in a Hadoop cluster I set up myself.
Environment: the self-built cluster runs Hive 3.1.2; the local Windows machine has Scala 2.12.15.
Java code:
import org.apache.spark.sql.SparkSession;

public class sparkHive {
    public static void main(String[] args) {
        SparkSession sparkSession = SparkSession.builder()
                .master("local")
                .appName("spark_hive")
                .config("hive.metastore.uris", "thrift://hadoop2:9083")
                // .config("spark.sql.warehouse.dir", "hdfs://mycluster/user/hive/warehouse")
                .enableHiveSupport()
                .getOrCreate();

        sparkSession.sql("create table wqg.spark_hive_20220107_04 (name string)");
        sparkSession.sql("insert into wqg.spark_hive_20220107_04 values('tom')");

        sparkSession.stop();
    }
}
Problem 1:
The stack trace points at:
org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:883)
My dependencies at this point were Spark 3.x.
Initial Maven dependencies:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.12</artifactId>
        <version>3.0.0</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
Almost every search result says you need to add the spark-hive dependency, but even with it added I still got the same error saying the class could not be found, even though the class does exist. I eventually got past it by downgrading the version.
(Worth noting: the new dependency block below also drops <scope>provided</scope> from spark-hive. Provided-scope dependencies are not put on the runtime classpath when the program is launched from the IDE, so that scope alone could explain the missing Hive classes.)
The Maven dependencies were changed to:
<properties>
    <scala.version>2.12</scala.version>
    <spark.version>2.4.4</spark.version>
</properties>

<dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
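If the "class not found" error reappears, a quick way to see whether the spark-hive classes are actually on the runtime classpath is a small probe like the one below (my own sketch; HiveSessionStateBuilder and HiveConf are, as far as I know, among the classes enableHiveSupport() checks for):

public class HiveClasspathCheck {
    public static void main(String[] args) {
        // Classes that Spark's Hive support needs at runtime. If either one is
        // reported as missing here, enableHiveSupport() cannot work no matter
        // what the pom.xml says (e.g. because of a provided scope).
        String[] required = {
                "org.apache.spark.sql.hive.HiveSessionStateBuilder",
                "org.apache.hadoop.hive.conf.HiveConf"
        };
        for (String className : required) {
            try {
                Class.forName(className);
                System.out.println("present : " + className);
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING : " + className);
            }
        }
    }
}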
Problem 2: HDFS permissions
Error message:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=Administrator, access=WRITE, inode="/user/hive/warehouse/xxx.db/
Cause: the client identifies itself with the local Windows user (Administrator), which has no write permission on the warehouse directory in HDFS.
Solution:
Set an environment variable in the IDEA run configuration:
HADOOP_USER_NAME=root
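An alternative to the IDEA environment variable (my own sketch, not part of the original setup) is to set the user as a JVM system property before the SparkSession is created; as far as I know Hadoop's UserGroupInformation also picks HADOOP_USER_NAME up from system properties:

import org.apache.spark.sql.SparkSession;

public class sparkHiveAsRoot { // hypothetical class name
    public static void main(String[] args) {
        // Must be set before any Hadoop/Spark class resolves the current user,
        // otherwise the Windows login (Administrator) is used for HDFS writes.
        System.setProperty("HADOOP_USER_NAME", "root");

        SparkSession sparkSession = SparkSession.builder()
                .master("local")
                .appName("spark_hive")
                .config("hive.metastore.uris", "thrift://hadoop2:9083")
                .enableHiveSupport()
                .getOrCreate();

        sparkSession.sql("insert into wqg.spark_hive_20220107_04 values('tom')");
        sparkSession.stop();
    }
}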
Problem 3: my cluster's Hive uses the tez execution engine
Error message:
java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
Solution: comment out the tez settings in the config file. The tez jars are not on the local client's classpath, and Spark SQL only needs the metastore (it runs queries with its own engine), so the Hive configuration seen by the client does not have to point at tez; a sketch of the change follows.
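For reference, assuming the engine is selected in a hive-site.xml on the client's classpath (hive.execution.engine is the standard property name; which file carries it in your setup is an assumption on my part), the commented-out entry can look like this:

<!-- hive-site.xml as seen by the local Spark client -->
<!-- commented out so the client no longer tries to load the tez classes
<property>
    <name>hive.execution.engine</name>
    <value>tez</value>
</property>
-->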
Result:
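To confirm the write from code, one more query can go before sparkSession.stop() in the program above (a small sketch of mine):

        // prints the contents of the new table; the 'tom' row inserted above should appear
        sparkSession.sql("select * from wqg.spark_hive_20220107_04").show();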