1. Scenario:
When Spark reads an HBase table through Phoenix, it must first establish a connection to ZooKeeper.
2. Code:
val zkUrl = "192.168.100.39,192.168.100.40,192.168.100.41:2181"
val formatStr = "org.apache.phoenix.spark"
val oms_orderinfoDF = spark.read
  .format(formatStr)
  .options(Map("table" -> "oms_orderinfo", "zkUrl" -> zkUrl))
  .load()
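Note the `zkUrl` format above: it is the HBase-style ZooKeeper quorum string, a comma-separated host list with the client port appended once at the end. As a minimal sketch, a small helper (hypothetical, not part of Phoenix) can build this string from a host list, which avoids typos when the quorum changes:

```scala
// Hypothetical helper: build an HBase/Phoenix ZooKeeper quorum URL
// from a list of hosts and a client port (default 2181).
// Format: "host1,host2,host3:port" — port appears once, after the last host.
def zkQuorumUrl(hosts: Seq[String], port: Int = 2181): String =
  hosts.mkString(",") + ":" + port

val zkUrl = zkQuorumUrl(Seq("192.168.100.39", "192.168.100.40", "192.168.100.41"))
println(zkUrl) // 192.168.100.39,192.168.100.40,192.168.100.41:2181
```

The resulting string can be passed directly as the `zkUrl` option in the `spark.read` call above.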
3. Inspecting the Spark job log:
17/10/24 03:25:25 INFO zookeeper.ClientCnxn: Opening socket connection to server hadoop40/192.168.100.40:2181. Will not attempt to authenticate using SASL (unknown error)
17/10/24 03:25:25 INFO zookeeper.ClientCnxn: Socket connection established, initiating session, client: /192.168.100.48:35952, server: hadoop40/192.168.100.40:2181
17/10/24 03:25:25 WARN zookeeper.ClientCnxn: Session 0x0 for server hadoop40/192.168.100.40:2181, unexpected error, c