1. Copy the following jars into ${spark_home}/jars (before Spark 2.0: ${spark_home}/lib):
hbase-protocol-1.2.0-cdh5.10.2.jar
hbase-client-1.2.0-cdh5.10.2.jar
hbase-common-1.2.0-cdh5.10.2.jar
hbase-server-1.2.0-cdh5.10.2.jar
hive-hbase-handler-1.1.0-cdh5.10.2.jar
metrics-core-2.2.0.jar
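The copy step above can be sketched as a small shell helper. This is a sketch, assuming a CDH parcel layout; the function name `copy_hbase_jars` and the paths in the example call are illustrative, and the exact jar versions on your cluster may differ:

```shell
# copy_hbase_jars SRC_LIB DEST
# Copies the HBase client jars Spark needs from SRC_LIB into DEST.
# Note: hive-hbase-handler-*.jar lives under the Hive lib directory
# (e.g. .../CDH/lib/hive/lib) and must be copied separately.
copy_hbase_jars() {
  src="$1"
  dest="$2"
  for jar in hbase-protocol hbase-client hbase-common hbase-server metrics-core; do
    cp "$src"/${jar}-*.jar "$dest"/
  done
}

# Example call (hypothetical CDH 5.10.2 parcel paths):
# copy_hbase_jars /opt/cloudera/parcels/CDH/lib/hbase/lib "$SPARK_HOME"/jars
```

Using a glob per jar name avoids hard-coding the CDH version suffix in the script.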
2. Copy HBase's configuration file hbase-site.xml into the ${spark_home}/conf directory.
3. If you hit the runtime error Caused by: java.lang.ClassNotFoundException: org.apache.htrace.Trace
Fix: cp /opt/cloudera/parcels/CDH/lib/hbase/lib/htrace-core.jar ${spark_home}/jars
4. If you hit Caused by: java.io.IOException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
Fix: cp /opt/cloudera/parcels/CDH/lib/hbase/lib/metrics-core-2.2.0.jar ${spark_home}/jars
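With the jars and hbase-site.xml in place, a quick smoke test is to query the HBase-backed Hive table from the Spark SQL CLI. The table name below is hypothetical; substitute a Hive table that was created with the HBase storage handler:

```shell
# "my_hbase_table" is a hypothetical Hive table mapped to HBase via
# STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'.
# If this runs without ClassNotFoundException/NoClassDefFoundError,
# the jar setup from steps 1-4 is working.
"$SPARK_HOME"/bin/spark-sql -e "SELECT * FROM my_hbase_table LIMIT 10"
```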