
Installing Spark


The version installed here is spark-1.6.1-bin-hadoop2.6.tgz; this release requires JDK 1.7 or later.


Installing Spark also requires Scala 2.11 support; I installed scala-2.11.8.tgz.





tg@master:/software$ tar -zxvf scala-2.11.8.tgz 


tg@master:/software/scala-2.11.8$ ls
bin  doc  lib  man




Add the environment variables:
tg@master:/$ sudo gedit /etc/profile


Append:
export SCALA_HOME=/software/scala-2.11.8
export PATH=$SCALA_HOME/bin:$PATH






tg@master:/$ source /etc/profile
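Before launching the REPL, it is worth a quick check that the new PATH is in effect; scala -version prints a one-line banner (the output below is what Scala 2.11.8 typically prints):

tg@master:/$ scala -version
Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL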
Start the Scala REPL:
tg@master:/$ scala
Welcome to Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_80).
Type in expressions for evaluation. Or try :help.




scala> 9*9
res0: Int = 81
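The REPL also accepts definitions, which makes a slightly stronger smoke test than a bare expression:

scala> def square(x: Int) = x * x
square: (x: Int)Int

scala> square(9)
res1: Int = 81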












Installing Spark
----------------


tg@master:~$ cp ~/Desktop/spark-1.6.1-bin-hadoop2.6.tgz  /software/
tg@master:~$ cd /software/
tg@master:/software$ ls
apache-hive-2.0.0-bin         jdk-7u80-linux-x64.tar.gz
apache-hive-2.0.0-bin.tar.gz  scala-2.11.8
hadoop-2.6.4                  scala-2.11.8.tgz
hadoop-2.6.4.tar.gz           spark-1.6.1-bin-hadoop2.6.tgz
hbase-1.2.1                   zookeeper-3.4.8
hbase-1.2.1-bin.tar.gz        zookeeper-3.4.8.tar.gz
jdk1.7.0_80
tg@master:/software$ tar -zxvf spark-1.6.1-bin-hadoop2.6.tgz 


Add the environment variables:
sudo gedit /etc/profile
export SPARK_HOME=/software/spark-1.6.1-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH


source  /etc/profile
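As with Scala, confirm the new PATH before going further; spark-submit --version prints the version banner and exits (output trimmed here):

tg@master:/$ spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/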


Edit spark-env.sh:
tg@master:~$ cd /software/spark-1.6.1-bin-hadoop2.6/conf/
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ ls
docker.properties.template  metrics.properties.template   spark-env.sh.template
fairscheduler.xml.template  slaves.template
log4j.properties.template   spark-defaults.conf.template
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ cp spark-env.sh.template  spark-env.sh
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ sudo gedit spark-env.sh






Append:


export SCALA_HOME=/software/scala-2.11.8
export JAVA_HOME=/software/jdk1.7.0_80


export SPARK_MASTER_IP=192.168.52.140
export SPARK_WORKER_MEMORY=512m
export MASTER=spark://192.168.52.140:7077
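A word on these settings: SPARK_MASTER_IP fixes the address the standalone master binds to, SPARK_WORKER_MEMORY caps the memory each worker can hand out to executors (512 MB here), and MASTER points spark-shell at the standalone cluster; 7077 is the standalone master's default RPC port.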








Edit slaves:
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ cp slaves.template slaves
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ sudo gedit slaves




master
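For this single-node setup the file lists only master, so the lone worker runs on the same machine as the master. On a multi-node cluster you would list one worker hostname per line, for example (hypothetical hostnames):

slave1
slave2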




Start the cluster:


tg@master:/software/spark-1.6.1-bin-hadoop2.6$ sbin/start-all.sh 
starting org.apache.spark.deploy.master.Master, logging to /software/spark-1.6.1-bin-hadoop2.6/logs/spark-tg-org.apache.spark.deploy.master.Master-1-master.out
master: starting org.apache.spark.deploy.worker.Worker, logging to /software/spark-1.6.1-bin-hadoop2.6/logs/spark-tg-org.apache.spark.deploy.worker.Worker-1-master.out
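Once both daemons are up, the master's web UI (port 8080 by default) should show one registered worker; with the spark-env.sh settings above that is http://192.168.52.140:8080. The matching sbin/stop-all.sh script shuts everything down again.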


Check with jps: Worker and Master now appear alongside the existing Hadoop/HBase/ZooKeeper processes.


tg@master:/software/hbase-1.2.1/conf$ jps
4400 HRegionServer
3033 DataNode
5794 Jps
4793 Main
3467 ResourceManager
5652 SparkSubmit
5478 Master
3591 NodeManager
3240 SecondaryNameNode
3910 QuorumPeerMain
2911 NameNode
5567 Worker
4246 HMaster
tg@master:/software/hbase-1.2.1/conf$ 






tg@master:/software/spark-1.6.1-bin-hadoop2.6$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/


Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_80)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/05/31 02:17:47 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:17:49 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:18:02 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/05/31 02:18:03 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/05/31 02:18:10 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:18:11 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:18:19 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/05/31 02:18:19 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.


scala> 
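Note that the shell reports Scala 2.10.5: the pre-built spark-1.6.1-bin-hadoop2.6 package bundles its own Scala 2.10 runtime, independent of the Scala 2.11.8 installed above. A one-line job then exercises the whole stack (sc is the SparkContext the shell already created):

scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050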



View the cluster status and running jobs in the web UI (screenshots from the original post are omitted).