# Big Data Applications: HBase Installation and Deployment
## Node Planning
| Node   | Master | ZooKeeper | RegionServer |
|--------|--------|-----------|--------------|
| lsyk01 | yes    | yes       | yes          |
| lsyk02 | backup | yes       | yes          |
| lsyk03 | no     | yes       | yes          |
| lsyk04 | no     | no        | yes          |
## Extract and Install
```shell
tar -zxvf hbase-2.4.12-bin.tar.gz -C /opt
cd /opt/hbase-2.4.12/conf
vi hbase-env.sh
# Add the following lines:
export JAVA_HOME=/usr/java/jdk1.8.0_333
export HBASE_DISABLE_HADOOP_CLASSPATH_LOOKUP="true"
export HBASE_CLASSPATH=/opt/hadoop-3.3.3/etc/hadoop/
export HBASE_MANAGES_ZK=false    # use the external ZooKeeper ensemble, not the bundled one
```
```shell
vi hbase-site.xml
```

Modify/add the following properties inside `<configuration>`:

```xml
<property>
  <name>hbase.cluster.distributed</name>
  <value>true</value>
</property>
<property>
  <name>hbase.rootdir</name>
  <value>hdfs://lsyk01:9000/hbase</value>
</property>
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>lsyk01,lsyk02,lsyk03</value>
</property>
<property>
  <name>hbase.zookeeper.property.dataDir</name>
  <value>/tmp/zookeeper/data</value>
</property>
<property>
  <name>hbase.unsafe.stream.capability.enforce</name>
  <value>false</value>
</property>
```
```shell
vi regionservers
# Replace the contents with the following (one RegionServer host per line):
lsyk02
lsyk03
lsyk04
```
```shell
vi backup-masters
# Configure the backup master node; add:
lsyk02
```
```shell
vi /etc/profile
# Add the following:
export HBASE_HOME=/opt/hbase-2.4.12
export PATH=$PATH:$HBASE_HOME/bin
# Make it take effect in the current shell
source /etc/profile

# Symlink the Hadoop client configs into HBase's conf directory
ln -s $HADOOP_HOME/etc/hadoop/core-site.xml $HBASE_HOME/conf/core-site.xml
ln -s $HADOOP_HOME/etc/hadoop/hdfs-site.xml $HBASE_HOME/conf/hdfs-site.xml
```
```shell
# Sync the profile and the HBase installation to the other nodes
scp /etc/profile lsyk02:/etc/
scp /etc/profile lsyk03:/etc/
scp /etc/profile lsyk04:/etc/
scp /etc/profile lsyk05:/etc/
scp -r /opt/hbase-2.4.12/ lsyk02:/opt/
scp -r /opt/hbase-2.4.12/ lsyk03:/opt/
scp -r /opt/hbase-2.4.12/ lsyk04:/opt/

# Recreate the symlinks on lsyk02, lsyk03 and lsyk04
# (scp -r dereferences symlinks into plain file copies, so they must be recreated on each node)
ln -s $HADOOP_HOME/etc/hadoop/core-site.xml $HBASE_HOME/conf/core-site.xml
ln -s $HADOOP_HOME/etc/hadoop/hdfs-site.xml $HBASE_HOME/conf/hdfs-site.xml
```
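The per-host `scp` commands above can also be generated with a loop. A minimal sketch, assuming the host names from the node plan; it only builds and prints the commands so they can be reviewed first:

```shell
# Sketch: build the per-host distribution commands, then print them for review.
# To execute for real, run each printed line (or pipe the output to sh).
cmds=""
for host in lsyk02 lsyk03 lsyk04; do
  cmds="${cmds}scp -r /opt/hbase-2.4.12/ ${host}:/opt/
"
done
printf '%s' "$cmds"
```

The same loop pattern works for recreating the symlinks via `ssh`.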
## Startup

```shell
# Start Hadoop
xxx starthadoop
# Start ZooKeeper
xxx startzk
# Start HBase
xxx starthbase
```
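The `xxx` commands above are site-specific wrapper scripts. With the stock start scripts, the equivalent sequence would look roughly like this (a sketch, assuming default Hadoop, ZooKeeper and HBase installs):

```
# Start HDFS on the NameNode host (hbase.rootdir lives on this HDFS)
$HADOOP_HOME/sbin/start-dfs.sh
# Start ZooKeeper on each quorum node (lsyk01-lsyk03)
zkServer.sh start
# Start HBase from lsyk01: launches the HMaster, backup master and RegionServers
$HBASE_HOME/bin/start-hbase.sh
```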
## Check Processes
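Per the layout above, `jps` should show `HMaster` on lsyk01, a second (backup) `HMaster` on lsyk02, `HRegionServer` on each host listed in `regionservers`, and `QuorumPeerMain` on the ZooKeeper quorum nodes. A sketch that builds the per-node check commands; it only prints them, drop the `printf` wrapper to run them over ssh:

```shell
# Sketch: list every node's Java daemons with jps over ssh.
hosts="lsyk01 lsyk02 lsyk03 lsyk04"
for host in $hosts; do
  printf 'ssh %s jps\n' "$host"
done
```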
## Test HBase

- Start the HBase shell:

```shell
hbase shell
```
This reports a warning:

```text
[root@lsyk01 logs]# hbase shell
2022-06-18 00:16:17,801 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
```

The fix is the same as for Spark: point `LD_LIBRARY_PATH` at Hadoop's native libraries in `hbase-env.sh`, then redistribute it.

```shell
# Add the following to hbase-env.sh:
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
# Distribute
scp $HBASE_HOME/conf/hbase-env.sh lsyk02:$HBASE_HOME/conf/
scp $HBASE_HOME/conf/hbase-env.sh lsyk03:$HBASE_HOME/conf/
scp $HBASE_HOME/conf/hbase-env.sh lsyk04:$HBASE_HOME/conf/
# Start the client again
hbase shell
```
- Create a table and insert data:

```shell
hbase:002:0> create 'lsyk', 'cf'
hbase:002:0> put 'lsyk', 'row1', 'cf:a', 'value1'
hbase:002:0> put 'lsyk', 'row2', 'cf:b', 'value2'
hbase:002:0> put 'lsyk', 'row3', 'cf:c', 'value3'
hbase:002:0> put 'lsyk', 'row4', 'cf:d', 'value4'
hbase:002:0> put 'lsyk', 'row5', 'cf:e', 'value5'
hbase:002:0> scan 'lsyk'
```
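To round off the smoke test, the inserted rows can be read back and the table dropped again. These are standard HBase shell commands, shown as a sketch with prompts omitted:

```
# Read a single row back
get 'lsyk', 'row1'
# Count the rows (expect 5 after the puts above)
count 'lsyk'
# Clean up: a table must be disabled before it can be dropped
disable 'lsyk'
drop 'lsyk'
```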