I initially followed Lin Ziyu's article, http://dblab.xmu.edu.cn/blog/1307-2/ , but found that the sbt 0.13 version it uses is no longer supported.
I therefore fell back on this article, https://www.cnblogs.com/hank-yan/p/8686281.html , and installed sbt 1.1.1.
The installation itself is straightforward; for accessing HBase from Spark, the following points need attention.
1. Each project needs its own build file, i.e. xxx.sbt. Mine looks like this:
[root@k8s-1 user01]# cat /usr/local/sbt/test/simple.sbt
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.4.3",
"org.apache.spark" %% "spark-sql" % "2.4.3",
"org.apache.spark" %% "spark-hive" % "2.4.3",
"org.apache.spark" %% "spark-streaming" % "2.4.3",
"org.apache.hbase" % "hbase-client" % "2.1.0",
"org.apache.hbase" % "hbase-common" % "2.1.0",
"org.apache.hbase" % "hbase-server" % "2.1.0",
"org.apache.hbase" % "hbase-protocol" % "2.1.0",
"org.apache.hbase" % "hbase-mapreduce" % "2.1.0"
)
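With these dependencies in place, a minimal Spark read of an HBase table can be sketched as follows (assumptions: a table named "student" exists, and hbase-site.xml or explicit ZooKeeper settings are reachable from the driver; all names here are illustrative, not from the original setup):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object HBaseReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HBaseReadSketch"))

    // Standard HBase configuration; picks up hbase-site.xml if it is on the classpath
    val conf = HBaseConfiguration.create()
    conf.set(TableInputFormat.INPUT_TABLE, "student") // illustrative table name

    // Read the table as an RDD of (row key, Result) pairs
    val rdd = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[ImmutableBytesWritable], classOf[Result])

    println(s"Row count: ${rdd.count()}")
    sc.stop()
  }
}
```

Note that in HBase 2.x, TableInputFormat lives in the hbase-mapreduce module, which is why that dependency appears in the sbt file above.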
2. Configure the spark-env.sh file:
[root@k8s-1 user01]# cat /opt/spark-2.4.3-bin-hadoop2.7/conf/spark-env.sh|grep -v "#"
export SPARK_MASTER_IP=k8s-1
export SPARK_WORKER_MEMORY=8g
export JAVA_HOME=/usr/java/jdk1.8.0_171
export SCALA_HOME=/opt/scala-2.11.8
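With the master configured as above, packaging and submitting the job can be sketched like this (the jar path follows sbt's default naming for name "Simple Project" and version 1.0; the main class name is illustrative):

```shell
cd /usr/local/sbt/test
/usr/local/sbt/sbt package

# Submit to the standalone master defined by SPARK_MASTER_IP (default port 7077)
/opt/spark-2.4.3-bin-hadoop2.7/bin/spark-submit \
  --master spark://k8s-1:7077 \
  --class "HBaseReadSketch" \
  target/scala-2.11/simple-project_2.11-1.0.jar
```

Because the HBase client jars are declared in the sbt file, they must also be visible at runtime, e.g. by building an assembly jar or passing them via spark-submit's --jars option.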