Configuring Spark with IntelliJ IDEA and sbt

Download sbt from the official site: https://www.scala-sbt.org/download.html. I used the MSI installer; the defaults are fine apart from changing the install folder.

The default sbt repositories are usually unreachable, so after installing Scala, IntelliJ IDEA (with the Scala plugin), and sbt, you need to configure both the sbt folder and the IDEA settings.

On my machine sbt is installed at D:/Client/Spark/sbt; the paths below are the parts you need to adapt to your own setup.

After installing sbt, go to the D:/Client/Spark/sbt/conf folder and create/edit the file repo.properties:

[repositories]
  local
  aliyun: http://maven.aliyun.com/nexus/content/groups/public/
  typesafe: http://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly
  sonatype-oss-releases
  maven-central
  sonatype-oss-snapshots

Then edit the file sbtconfig.txt to:

# Set the java args to high
-Xmx512M
-XX:MaxPermSize=256m
-XX:ReservedCodeCacheSize=128m
# Set the extra SBT options
-Dsbt.log.format=true
-Dsbt.boot.directory=D:/Client/Spark/sbt/data/boot/
-Dsbt.global.base=D:/Client/Spark/sbt/data/.sbt
-Dsbt.ivy.home=D:/Client/Spark/sbt/data/.ivy2
-Dsbt.repository.config=D:/Client/Spark/sbt/conf/repo.properties
-Dsbt.repository.secure=false
 
Next, edit sbt/sbt.boot.properties inside D:/Client/Spark/sbt/bin/sbt-launch.jar. You can open the jar directly with an archive tool such as WinRAR, edit the file, and let the tool save it back into the archive — open the jar in place rather than extracting it, otherwise you won't be able to repackage it as a valid jar. Change the contents to:
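If you prefer not to rely on an archive GUI, the JDK's zip file system can patch the entry inside the jar in place. This is just a sketch under assumptions: the object name `PatchBootProperties` and the locally edited copy `sbt.boot.properties` (in the working directory) are placeholders, and the jar path is the one used throughout this post.

```scala
import java.nio.file.{FileSystems, Files, Paths, StandardCopyOption}

// Overwrite sbt/sbt.boot.properties inside sbt-launch.jar without extracting it,
// by mounting the jar as a zip file system (Java 7+).
object PatchBootProperties {
  def main(args: Array[String]): Unit = {
    val jar = Paths.get("D:/Client/Spark/sbt/bin/sbt-launch.jar")
    val fs  = FileSystems.newFileSystem(jar, null: ClassLoader)
    try {
      // Replace the entry with an edited copy kept next to this program.
      Files.copy(
        Paths.get("sbt.boot.properties"),
        fs.getPath("/sbt/sbt.boot.properties"),
        StandardCopyOption.REPLACE_EXISTING)
    } finally fs.close() // closing the zip file system writes the jar back
  }
}
```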

[scala]
  version: ${sbt.scala.version-2.10.5}

[app]
  org: ${sbt.organization-org.scala-sbt}
  name: sbt
  version: ${sbt.version-1.3.3}
  class: ${sbt.main.class-sbt.xMain}
  components: xsbti,extra
  cross-versioned: ${sbt.cross.versioned-false}
  resources: ${sbt.extraClasspath-}

[repositories]
  local
  spring: http://conjars.org/repo/
  cloudera: https://repository.cloudera.com/artifactory/cloudera-repos/
  aliyun: http://maven.aliyun.com/nexus/content/groups/public/
  maven-central
  sbt-maven-releases: https://repo.scala-sbt.org/scalasbt/maven-releases/, bootOnly
  sbt-maven-snapshots: https://repo.scala-sbt.org/scalasbt/maven-snapshots/, bootOnly
  typesafe-ivy-releases: https://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly
  sbt-ivy-snapshots: https://repo.scala-sbt.org/scalasbt/ivy-snapshots/, [organization]/[module]/[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly

[boot]
  directory: ${sbt.boot.directory-${sbt.global.base-${user.home}/.sbt}/boot/}
  lock: ${sbt.boot.lock-true}

[ivy]
  ivy-home: D:/Client/Spark/sbt/data/.ivy2
  checksums: ${sbt.checksums-sha1,md5}
  override-build-repos: ${sbt.override.build.repos-false}
  repository-config: ${sbt.repository.config-${sbt.global.base-${user.home}/.sbt}/repositories}


After configuring, run sbt once from cmd to initialize. Initialization is quite slow, but if you watch D:/Client/Spark/sbt/data/.ivy2 you can see files appearing — it hasn't hung, it really is just slow.

Copy the contents of sbtconfig.txt into IDEA under Settings -> Build Tools -> sbt -> VM parameters, set the Launcher to .../sbt/bin/sbt-launch.jar (so it launches from the installed sbt), and tick "use sbt shell for build and import" in Settings.

When adding the assembly plugin in IDEA, make sure its version matches your sbt version.
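For example, a minimal project/plugins.sbt might look like the following — the version shown is an assumption; check the sbt-assembly compatibility notes for your sbt release before copying it:

```scala
// project/plugins.sbt — the sbt-assembly version must be one built
// for your sbt series (version below is an example, not prescriptive).
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
```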

Dependencies are declared in a file named build.sbt in the project root; once the project is opened, IDEA detects the file automatically and downloads the dependencies:

name := "iptv1"

version := "1.0"

scalaVersion := "2.10.5"
scalacOptions += "-deprecation"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0-cdh5.7.2" excludeAll ExclusionRule(organization = "javax.servlet")
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.7.2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.7.2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.7.2" excludeAll ExclusionRule(organization = "javax.servlet")
libraryDependencies += "org.apache.hbase" % "hbase-protocol" % "1.2.0-cdh5.7.2"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.38"
libraryDependencies += "com.yammer.metrics" % "metrics-core" % "2.2.0"


//libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "2.6.0-mr1-cdh5.7.2"
// libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.7.2"
// libraryDependencies += "org.apache.hbase" % "hbase-mapreduce" % "1.2.0-cdh5.7.2"

// libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
// libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2"
//----------------------------CDH 5.13.1----------------------------//
//name := "demo_sbt2"

//version := "0.1"

//scalaVersion := "2.11.8"

//libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0.cloudera2"
//libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.3.0.cloudera2"
//libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.3.0.cloudera2"
//libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.13.1"
//libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.13.1"

//--------------------------------------------------------------------//
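As a quick smoke test that the dependencies above resolved, a minimal word count against the Spark 1.6 RDD API might look like this. The object name, input path, and `local[*]` master are illustrative assumptions, not part of the original project:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal RDD word count using the spark-core_2.10 1.6.x API from build.sbt.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc   = new SparkContext(conf)
    val counts = sc.textFile("input.txt")      // placeholder input file
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach { case (w, n) => println(s"$w: $n") }
    sc.stop()
  }
}
```

Run it from the sbt shell with `run`, or package it with sbt-assembly and submit the jar with spark-submit.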
