import org.apache.spark.SparkConf
import org.apache.spark.sql.{DataFrame, SparkSession}

object Tang {
  def main(args: Array[String]): Unit = {
    // 1. Create the SparkConf
    val conf = new SparkConf().setMaster("local[1]").setAppName("Tang")
    // 2. Create the SparkSession
    val spark = SparkSession.builder()
      .config(conf)
      // 118.25.122.125 / 122.51.241.109
      //.config("hive.metastore.uris", "thrift://118.25.122.125:9083")
      .config("spark.sql.warehouse.dir", "hdfs://122.51.241.109:9000/opt/bigdata2.7/hive_remote/warehouse")
      .enableHiveSupport()
      .getOrCreate()
    // 3. Read from Hive
    // spark.sql("CREATE TABLE IF NOT EXISTS monitor_camera_info2(monitor_id string, camera_id string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'")
    import spark.implicits._
    val resultDF: DataFrame = spark.sql("select * from traffic.monitor_camera_info")
    // 4. Print the result
    resultDF.show(10)
    // 5. Close the session
    spark.close()
  }
}
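Note that `enableHiveSupport()` requires the `spark-hive` module on the classpath. A minimal sketch of the sbt dependencies (the Spark version below is a placeholder assumption; match your cluster's version):

```scala
// build.sbt -- the version "2.4.8" is an assumption, adjust to your cluster
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "2.4.8",
  "org.apache.spark" %% "spark-hive" % "2.4.8"
)
```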
The Spark/Hive integration tested successfully on the cluster, but in local testing the Hive metastore settings did not take effect: by default Spark falls back to a local Hive metastore.
Resulting error:

Solution:
Place the hive-site.xml file into the resources directory, so that it is on the classpath at runtime.
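A minimal sketch of what that hive-site.xml could contain; the metastore host/port here are taken from the commented-out line in the code above, so adjust them to your own environment:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Point Spark at the remote Hive metastore instead of the local default -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://118.25.122.125:9083</value>
  </property>
</configuration>
```

With this file on the classpath, the `.config("hive.metastore.uris", ...)` call in the code is no longer needed.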
