The Hive warehouse directory can be browsed through the HDFS NameNode web UI (port 50070 in Hadoop 2.x):
http://master:50070/explorer.html#/user/hive/warehouse/
The Maven pom.xml for the project is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.wt</groupId>
    <artifactId>sql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
    </dependencies>
</project>
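One optional refinement, which is my own suggestion rather than part of the original setup: when the job is submitted to a cluster with spark-submit, the Spark jars are already on the executor classpath, so each of the dependencies above can be given `provided` scope to keep the assembled jar small. For example:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.1.0</version>
    <!-- provided: spark-submit supplies this jar at runtime,
         so it is used for compilation but not packaged -->
    <scope>provided</scope>
</dependency>
```

Leave the default (compile) scope if the program is run directly from the IDE, since in that case nothing else supplies the Spark classes.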
The Spark SQL driver code is shown below:
import org.apache.spark.sql.SparkSession

object sparksql_context {
  def main(args: Array[String]): Unit = {
    val warehouseLocation = "hdfs://192.168.100.21:9000/user/hive/warehouse/"
    // TODO 1: create the SparkSession
    val spark: SparkSession = SparkSession
      .builder()
      .appName("sparksql_context")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport() // requires hive-site.xml (or a reachable metastore) on the classpath
      .getOrCreate()
    val te = spark.sql("select * from wt_access_log limit 10")
    te.show()
    // spark.sparkContext.setLogLevel("warn") // set the log output level
    // import spark.implicits._
    // import spark.sql
    //
    // // TODO 2: run SQL statements
    // sql("show databases").show()
    // sql("show tables").show()
    // spark.stop()
  }
}
The execution output follows, along with the pitfalls encountered along the way and how each was resolved: