# Extract and rename
[root@master home]# tar -zxvf scala-2.10.4.tgz -C /app/
[root@master app]# mv scala-2.10.4/ scala
# Configure the Scala environment variables
vi /etc/profile
export SCALA_HOME=/app/scala
export PATH=$PATH:$SCALA_HOME/bin
# Reload the profile so the new variables take effect
[root@master app]# source /etc/profile
# Run scala; reaching the shell prompt confirms a successful installation
[root@master app]# scala
Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79).
Type in expressions to have them evaluated.
Type :help for more information.
scala>
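With the REPL open, a few expressions beyond `:help` confirm that evaluation actually works. A minimal sketch (the values are illustrative, not from the transcript):

```scala
// Simple arithmetic and collection operations typed at the scala> prompt
val doubled = List(1, 2, 3).map(_ * 2)   // List(2, 4, 6)
println(doubled.sum)                      // prints 6
// scala.util.Properties reports the running Scala version
println(util.Properties.versionNumberString)
```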
# Start the cluster from the sbin directory
[root@master sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /app/spark/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
master: starting org.apache.spark.deploy.worker.Worker, logging to /app/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
# Check the running Java processes with jps
[root@master sbin]# jps
1987 ResourceManager
1670 DataNode
3036 Master
1835 SecondaryNameNode
3123 Worker
1541 NameNode
3172 Jps
2094 NodeManager
# Launch the interactive shell from the bin directory
[root@master bin]# ./spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
21/03/13 20:17:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/03/13 20:17:19 WARN spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://192.168.5.128:4040
Spark context available as 'sc' (master = local[*], app id = local-1615637838708).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.0.0
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
scala> print("hello hadoop")
hello hadoop
scala>
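Since spark-shell predefines `sc`, a slightly richer smoke test than `print` is to run a small parallel computation through the SparkContext. A minimal sketch, assuming the shell started as in the transcript above (the numbers are illustrative):

```scala
// sc is the SparkContext that spark-shell creates automatically
val rdd = sc.parallelize(1 to 100)
// Sum the numbers through Spark; 1 + 2 + ... + 100 = 5050
val total = rdd.reduce(_ + _)
println(total)   // prints 5050
```

If this returns 5050, the shell is wired to a working Spark runtime, not just a bare Scala REPL.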