Connecting a local Spark application to a remote Hive

1. Download the three files core-site.xml, hdfs-site.xml, and hive-site.xml from the server and place them in the project's resources directory.
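The setting that matters most for a remote connection is the metastore address in hive-site.xml. A minimal fragment might look like the following (the hostname is illustrative and 9083 is only the default metastore port; use the values from your own cluster):

```xml
<!-- hive-site.xml: points Spark at the remote Hive metastore service -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://your-hive-host:9083</value>
</property>
```

If this property is missing or wrong, Spark will silently fall back to a local embedded metastore and you will only see a `default` database.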

2. Add the Maven dependencies. Note that the version numbers must match those on the cluster (and the Scala suffix, `_2.11` here, must match your Scala version).

<dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-mllib_2.11</artifactId>
      <version>2.2.0</version>
</dependency>
<dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.11</artifactId>
      <version>2.2.0</version>
</dependency>
<dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.39</version>
</dependency>

3. Test code

import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Build a local SparkSession with Hive support enabled; it picks up
// hive-site.xml (and the Hadoop configs) from the classpath resources.
SparkSession sparkSession = SparkSession.builder()
        .appName("app_1")
        .master("local[*]")
        .enableHiveSupport()
        .getOrCreate();
sparkSession.sparkContext().setLogLevel("ERROR");

// Wrap the underlying SparkContext if the RDD API is needed
SparkContext sparkContext = sparkSession.sparkContext();
JavaSparkContext javaSparkContext = new JavaSparkContext(sparkContext);

// List databases to verify the metastore connection, then query a table
sparkSession.catalog().listDatabases().show(50, false);
Dataset<Row> sql = sparkSession.sql("select * from test02.deliver");
sql.show();
sql.printSchema();
