Installing Hadoop, Maven, HBase, Hive and Sqoop on Ubuntu 14.04.4
Prerequisite: Hadoop is already installed.
Reference: http://blog.csdn.net/xanxus46/article/details/45133977
I. Maven
1. Install the JDK
2. Download:
http://maven.apache.org/download.cgi
wget http://mirrors.cnnic.cn/apache/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
3. Unpack:
tar -xzf apache-maven-3.3.9-bin.tar.gz
4. Configure environment variables
vi ~/.bashrc
export MAVEN_HOME=/usr/local/maven/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$PATH
Apply the changes:
source ~/.bashrc
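The two export lines above can be made idempotent, so re-running the setup never piles up duplicate entries in ~/.bashrc. A minimal sketch; the `add_export` helper is my own, and it demos on a temp file rather than touching the real profile:

```shell
# Append a line to a shell profile only if it is not already present,
# so repeated runs do not duplicate MAVEN_HOME/PATH entries.
add_export() {
  local line="$1" file="$2"
  grep -qxF "$line" "$file" 2>/dev/null || echo "$line" >> "$file"
}

# Demo on a temp file; point "profile" at "$HOME/.bashrc" for real use.
profile=$(mktemp)
add_export 'export MAVEN_HOME=/usr/local/maven/apache-maven-3.3.9' "$profile"
add_export 'export PATH=$MAVEN_HOME/bin:$PATH' "$profile"
add_export 'export MAVEN_HOME=/usr/local/maven/apache-maven-3.3.9' "$profile"  # no-op
```

`grep -qxF` matches the whole line literally, so the append is skipped the second time.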
5. Verify
$ mvn --version
Output:
root@spark:/usr/local/maven/apache-maven-3.3.9# mvn --version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/maven/apache-maven-3.3.9
Java version: 1.8.0_65, vendor: Oracle Corporation
Java home: /usr/lib/java/jdk1.8.0_65/jre
Default locale: en_HK, platform encoding: UTF-8
OS name: "linux", version: "3.19.0-58-generic", arch: "amd64", family: "unix"
root@spark:/usr/local/maven/apache-maven-3.3.9#
Reference: http://www.linuxidc.com/Linux/2015-03/114619.htm
II. HBase
1. Download:
http://mirrors.hust.edu.cn/apache/hbase/stable/
http://mirrors.hust.edu.cn/apache/hbase/stable/hbase-1.1.5-bin.tar.gz
2. Unpack:
HBase can be installed in three modes: standalone, pseudo-distributed, and fully distributed; only the fully distributed mode is covered here. The prerequisite is that the Hadoop cluster and ZooKeeper are already installed and running correctly.
Step 1: download the package, unpack it to a suitable location, and give ownership to the hadoop user (the account that runs Hadoop, e.g. root).
The version downloaded here is hbase-1.1.5, the Hadoop cluster runs 2.6, and it is unpacked under /usr/local:
tar -zxvf hbase-1.1.5-bin.tar.gz
mkdir /usr/local/hbase
mv hbase-1.1.5 /usr/local/hbase
cd /usr/local
chmod -R 775 hbase
chown -R root: hbase
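The unpack-and-permission steps above can be collected into one small function. A sketch; the function name is my own, and for a real install it must run as root so the final chown succeeds:

```shell
# Unpack a release tarball into a target directory and hand it to the
# given owner; mirrors the manual steps used for hbase-1.1.5 above.
install_tarball() {
  local tarball="$1" target="$2" owner="$3"
  mkdir -p "$target"
  tar -xzf "$tarball" -C "$target"
  chmod -R 775 "$target"
  chown -R "$owner" "$target"
}

# Real use (as root):
#   install_tarball hbase-1.1.5-bin.tar.gz /usr/local/hbase root
```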
3. Environment variables
$ vi ~/.bashrc
export HBASE_HOME=/usr/local/hbase/hbase-1.1.5
export PATH=$HBASE_HOME/bin:$PATH
source ~/.bashrc
4. Configuration files
4.1 JDK [HBase picks up the default JDK, so this step is optional]
vim /usr/local/hbase/hbase-1.1.5/conf/hbase-env.sh
Set JAVA_HOME to the JDK install directory, here /usr/lib/java/jdk1.8.0_65
4.2 hbase-site.xml
Edit /usr/local/hbase/hbase-1.1.5/conf/hbase-site.xml:
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://spark:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
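The same hbase-site.xml can be generated from a here-document so the namenode address lives in a single variable. A sketch: it writes to a temp file here (point it at $HBASE_HOME/conf/hbase-site.xml for real use), and hdfs://spark:9000 is this guide's namenode address:

```shell
# Generate the minimal hbase-site.xml shown above; keep NAMENODE in sync
# with fs.defaultFS in Hadoop's core-site.xml.
NAMENODE="hdfs://spark:9000"
SITE=$(mktemp)   # real target: $HBASE_HOME/conf/hbase-site.xml
cat > "$SITE" <<EOF
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>${NAMENODE}/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
EOF
```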
5. Verify
Start Hadoop first:
sbin/start-dfs.sh
sbin/start-yarn.sh
$ hbase shell
Output:
root@spark:/usr/local/hbase/hbase-1.1.5/bin# hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hbase/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.6.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.1.5, r239b80456118175b340b2e562a5568b5c744252e, Sun May 8 20:29:26 PDT 2016
hbase(main):001:0>
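Reaching the prompt only proves the shell starts; a quick write/read round trip confirms the cluster actually serves requests. A sketch of a smoke test (table name t1 and column family cf are arbitrary choices of mine); the commands are saved to a file so they can be replayed non-interactively:

```shell
# Write a create/put/scan/drop sequence for the HBase shell to a file;
# replay it with:  hbase shell "$SMOKE"
SMOKE=$(mktemp)
cat > "$SMOKE" <<'EOF'
create 't1', 'cf'
put 't1', 'row1', 'cf:a', 'value1'
scan 't1'
disable 't1'
drop 't1'
exit
EOF
```

The scan should show row1 with cf:a=value1; the disable/drop at the end cleans up the test table.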
Reference: http://blog.csdn.net/xanxus46/article/details/45133977
III. Hive
1. Download: http://apache.fayea.com/hive/stable/
http://apache.fayea.com/hive/stable/apache-hive-1.2.1-bin.tar.gz
2. Unpack:
tar xvzf apache-hive-1.2.1-bin.tar.gz
3. Environment variables
root@spark:/home/alex/xdowns# vi ~/.bashrc
export HIVE_HOME=/usr/local/hive/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin
root@spark:/home/alex/xdowns# source ~/.bashrc
4. Edit the configuration files
First copy hive-env.sh.template and hive-default.xml.template to hive-env.sh and hive-site.xml respectively.
Edit /usr/local/hive/apache-hive-1.2.1-bin/conf/hive-env.sh as follows:
export HADOOP_HEAPSIZE=1024
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.2
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/usr/local/hive/apache-hive-1.2.1-bin/conf
# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=/usr/local/hive/apache-hive-1.2.1-bin/lib
Edit /usr/local/hive/apache-hive-1.2.1-bin/conf/hive-site.xml as follows:
<property>
<name>hive.metastore.warehouse.dir</name>
<value>hdfs://spark:9000/user/hive/warehouse</value>
</property>
<property>
<name>hive.querylog.location</name>
<value>/usr/hadoop/hive/log</value>
</property>
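Hive fails in confusing ways when the directories referenced by hive-env.sh point nowhere, so a pre-flight check before the first launch is cheap insurance. A sketch; the `check_dir` helper is my own, and the listed paths are the ones this guide uses:

```shell
# Print "ok" for each configured directory, and fail loudly for any
# directory that is missing.
check_dir() {
  if [ -d "$1" ]; then
    echo "ok: $1"
  else
    echo "MISSING: $1" >&2
    return 1
  fi
}

status=0
for d in /usr/local/hadoop/hadoop-2.6.2 \
         /usr/local/hive/apache-hive-1.2.1-bin/conf \
         /usr/local/hive/apache-hive-1.2.1-bin/lib; do
  check_dir "$d" || status=1
done
# "$status" is non-zero if anything was missing; exit "$status" in a script
```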
2016.05.15
Test environment for this article:
Hadoop 2.6.2, Ubuntu 14.04.4 amd64, JDK 1.8
Versions installed:
Maven 3.3.9, HBase 1.1.5, Hive 1.2.1, Sqoop2 (1.99.6) and Sqoop1 (1.4.6)
This article draws on several other posts; original links are included throughout.
This installation guide supports basic Hadoop log analysis; for a detailed tutorial, see:
http://www.cnblogs.com/edisonchou/p/4449082.html
HBase cluster installation reference: http://blog.sina.com.cn/s/blog_6145ed810102vtws.html