Hadoop + HBase + Spark pseudo-distributed integrated deployment (Mac)

(Linux and Windows users can use this as a reference)

First, change the hostname (recommended):

sudo scutil --set HostName localhost


Download and install Hadoop:

brew install hadoop

Go to the Hadoop configuration directory

cd /usr/local/Cellar/hadoop/2.7.3/libexec/etc/hadoop

Edit core-site.xml

<configuration>
		<property>
				<name>hadoop.tmp.dir</name>
				<value>file:/usr/local/Cellar/hadoop/2.7.3/libexec/tmp</value>
		</property>
		<property>
				<name>fs.defaultFS</name>
				<value>hdfs://localhost:8020</value>
		</property>
</configuration>

Edit hdfs-site.xml

<configuration>
		<property>
				<name>dfs.replication</name>
				<value>1</value>
		</property>
		<property>
				<name>dfs.namenode.name.dir</name>
				<value>file:/usr/local/Cellar/hadoop/2.7.3/libexec/tmp/dfs/name</value>
		</property>
		<property>
				<name>dfs.datanode.data.dir</name>
				<value>file:/usr/local/Cellar/hadoop/2.7.3/libexec/tmp/dfs/data</value>
		</property>
</configuration>


Add the following to /etc/profile:

#Hadoop environment configs
export HADOOP_HOME=/usr/local/Cellar/hadoop/2.7.3/libexec
export PATH=$PATH:${HADOOP_HOME}/bin

Format HDFS

cd /usr/local/Cellar/hadoop/2.7.3/bin
./hdfs namenode -format

Start DFS

cd /usr/local/Cellar/hadoop/2.7.3/sbin
./start-dfs.sh 

If it started successfully, run

jps

and you should see something like

1206 DataNode
1114 NameNode
1323 SecondaryNameNode
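The jps check above can be scripted. The sketch below (daemon names taken from the expected output above; the function name is my own) greps the jps output for the three HDFS daemons:

```shell
# Sketch: verify the three HDFS daemons from the jps output above are running.
check_hdfs_daemons() {
  local procs="$1"
  local d
  for d in NameNode DataNode SecondaryNameNode; do
    # -w matches whole words, so "NameNode" does not match "SecondaryNameNode"
    if ! printf '%s\n' "$procs" | grep -qw "$d"; then
      echo "missing: $d"
      return 1
    fi
  done
  echo "all HDFS daemons running"
}

# Run it against the live JVM list (jps ships with the JDK):
if command -v jps >/dev/null; then
  check_hdfs_daemons "$(jps)" || echo "start-dfs.sh may not have finished yet"
fi
```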


Download and install HBase

brew install hbase


Go to the HBase configuration directory

cd /usr/local/Cellar/hbase/1.2.2/libexec/conf

Edit hbase-env.sh

export HBASE_CLASSPATH=/usr/local/Cellar/hadoop/2.7.3/libexec/etc/hadoop
export HBASE_MANAGES_ZK=true
export HBASE_HOME=/usr/local/Cellar/hbase/1.2.2/libexec
export HBASE_LOG_DIR=${HBASE_HOME}/logs
export HBASE_REGIONSERVERS=${HBASE_HOME}/conf/regionservers


Edit hbase-site.xml

<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:8020/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
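Note that hbase.rootdir must use the same host and port as fs.defaultFS in core-site.xml (hdfs://localhost:8020 here), or HBase will not find HDFS. A quick consistency check is sketched below, using the Homebrew paths from this post; `extract_value` is a hypothetical helper that crudely parses the `<name>`/`<value>` pairs as they are laid out in the configs above, not a real XML parser:

```shell
# Sketch: check that hbase.rootdir agrees with fs.defaultFS.
# extract_value prints the <value> on the line after a given <name>
# (assumes name and value sit on separate lines, as in the configs above).
extract_value() {
  awk -v name="<name>$2</name>" '
    index($0, name) { found = 1; next }
    found { gsub(/.*<value>|<\/value>.*/, ""); print; exit }' "$1"
}

HADOOP_CONF=/usr/local/Cellar/hadoop/2.7.3/libexec/etc/hadoop/core-site.xml
HBASE_CONF=/usr/local/Cellar/hbase/1.2.2/libexec/conf/hbase-site.xml

if [ -f "$HADOOP_CONF" ] && [ -f "$HBASE_CONF" ]; then
  fs=$(extract_value "$HADOOP_CONF" fs.defaultFS)    # expect hdfs://localhost:8020
  root=$(extract_value "$HBASE_CONF" hbase.rootdir)  # expect hdfs://localhost:8020/hbase
  case "$root" in
    "$fs"/*) echo "hbase.rootdir matches fs.defaultFS" ;;
    *)       echo "mismatch: $root vs $fs" ;;
  esac
fi
```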

Add the following to the regionservers file:

localhost


Add the following to /etc/profile:

#HBase environment configs
export HBASE_HOME=/usr/local/Cellar/hbase/1.2.2/libexec
export PATH=$PATH:${HBASE_HOME}/bin



Start HBase

cd /usr/local/Cellar/hbase/1.2.2/bin
./start-hbase.sh

If it succeeded, run

jps

and you should see something like

30465 HRegionServer
30354 HMaster
1605 HQuorumPeer
1206 DataNode
30534 Jps
1114 NameNode
1323 SecondaryNameNode
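Once HMaster is up, you can smoke-test HBase by piping a few commands into the HBase shell. A sketch, where the table name `smoke` and column family `cf` are arbitrary examples of my own:

```shell
# Sketch: generate a short HBase shell script that creates a table,
# writes one cell, scans it, then cleans up.
make_hbase_smoke_script() {
  cat <<'EOF'
create 'smoke', 'cf'
put 'smoke', 'row1', 'cf:greeting', 'hello'
scan 'smoke'
disable 'smoke'
drop 'smoke'
EOF
}

# Run it against the live cluster (hbase is on PATH via /etc/profile above):
#   make_hbase_smoke_script | hbase shell
```

The scan step should show row1 with cf:greeting=hello if the RegionServer is healthy.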

Download and install Spark

brew install apache-spark


Go to the Spark configuration directory

cd /usr/local/Cellar/apache-spark/1.6.0/libexec/conf

Edit spark-env.sh (create it from the template first)

cp spark-env.sh.template spark-env.sh

export SPARK_HOME=/usr/local/Cellar/apache-spark/1.6.0/libexec
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home


Add the following to /etc/profile:

#Spark environment configs
export SPARK_HOME=/usr/local/Cellar/apache-spark/1.6.0/libexec
export PATH=$PATH:${SPARK_HOME}/bin

Start spark-shell

cd /usr/local/Cellar/apache-spark/1.6.0/bin
./spark-shell
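To confirm Spark works end to end, you can pipe a tiny Scala word count into spark-shell non-interactively. A sketch (the helper name is my own; `sc` is the SparkContext that spark-shell creates for you):

```shell
# Sketch: generate a one-off Scala snippet for spark-shell that counts
# word occurrences in a small in-memory collection.
make_spark_smoke_script() {
  cat <<'EOF'
val counts = sc.parallelize(Seq("a", "b", "a")).map(w => (w, 1)).reduceByKey(_ + _)
println(counts.collect.sortBy(_._1).mkString(","))
EOF
}

# Run it with spark-shell on the PATH (set via /etc/profile above):
#   make_spark_smoke_script | spark-shell
# It should print (a,2),(b,1) somewhere in the shell's startup output.
```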


