Sqoop Installation Steps
(1) Upload the installation package and extract it to the target directory
Since Hive was previously installed only on the hadoop3 machine, Sqoop is installed on hadoop3 as well.
[hadoop@hadoop3 ~]$ tar -zxvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C apps/
The tarball extracts to a directory named sqoop-1.4.6.bin__hadoop-2.0.4-alpha; rename it to sqoop-1.4.6 so that the paths used in the steps below stay consistent:
[hadoop@hadoop3 ~]$ mv apps/sqoop-1.4.6.bin__hadoop-2.0.4-alpha apps/sqoop-1.4.6
(2) Go into the conf directory, find sqoop-env-template.sh, and rename it to sqoop-env.sh
[hadoop@hadoop3 apps]$ cd sqoop-1.4.6/conf/
[hadoop@hadoop3 conf]$ ls
oraoop-site-template.xml sqoop-env-template.sh sqoop-site.xml
sqoop-env-template.cmd sqoop-site-template.xml
[hadoop@hadoop3 conf]$ mv sqoop-env-template.sh sqoop-env.sh
(3) Edit sqoop-env.sh
[hadoop@hadoop3 conf]$ vi sqoop-env.sh
#Set path to where bin/hadoop is available
export HADOOP_COMMON_HOME=/opt/app/hadoop-2.8.5/
#Set path to where hadoop-*-core.jar is available
export HADOOP_MAPRED_HOME=/opt/app/hadoop-2.8.5/share/hadoop/mapreduce
#set the path to where bin/hbase is available
#export HBASE_HOME=
#Set the path to where bin/hive is available
export HIVE_HOME=/opt/app/hive-2.1.0/
#Set the path for where zookeper config dir is
#export ZOOCFGDIR=
Why does sqoop-env.sh ask you to configure HADOOP_COMMON_HOME and HADOOP_MAPRED_HOME separately?
In a stock Apache Hadoop installation, the four major components (Common, HDFS, YARN, MapReduce) all live under a single HADOOP_HOME.
In distributions such as CDH (Cloudera) and HDP (Hortonworks), however, each component is optional: when installing Hadoop you can choose to install only HDFS or only YARN,
so HDFS and MapReduce may end up installed in different locations. Sqoop therefore lets you point to each of them independently.
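As a quick sanity check (a minimal sketch, assuming the /opt/app paths configured above), you can confirm that bin/hadoop and the MapReduce jars really live where the two variables point:
[hadoop@hadoop3 conf]$ ls /opt/app/hadoop-2.8.5/bin/hadoop
[hadoop@hadoop3 conf]$ ls /opt/app/hadoop-2.8.5/share/hadoop/mapreduce/hadoop-mapreduce-client-core-*.jar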
(4) Copy the MySQL JDBC driver jar into the sqoop-1.4.6/lib directory (you can take it from Hive's lib directory)
[hadoop@hadoop3 ~]$ cp mysql-connector-java-5.1.40-bin.jar apps/sqoop-1.4.6/lib/
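If the jar is not already in your home directory, you can copy it straight out of Hive's lib directory instead (assuming the HIVE_HOME path configured above and the same driver version):
[hadoop@hadoop3 ~]$ cp /opt/app/hive-2.1.0/lib/mysql-connector-java-5.1.40-bin.jar apps/sqoop-1.4.6/lib/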
(5) Configure the system environment variables (note that writing to /etc/profile requires root privileges; alternatively, add the same two lines to ~/.bashrc)
[hadoop@hadoop3 ~]$ vi /etc/profile
export SQOOP_HOME=/home/hadoop/apps/sqoop-1.4.6
export PATH=$PATH:$SQOOP_HOME/bin
[hadoop@hadoop3 ~]$ source /etc/profile
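To confirm that the variables took effect (a quick check, not part of the original steps), echo the new variable and make sure the sqoop binary is now on the PATH:
[hadoop@hadoop3 ~]$ echo $SQOOP_HOME
/home/hadoop/apps/sqoop-1.4.6
[hadoop@hadoop3 ~]$ which sqoop
/home/hadoop/apps/sqoop-1.4.6/bin/sqoop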
(6) Run sqoop-version to verify that Sqoop was installed successfully
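The output should look roughly like the sketch below (commit id and build date elided). Sqoop 1.4.6 also prints startup warnings about optional components (HBase, HCatalog, Accumulo, ZooKeeper) whose homes are unset; since only Hadoop and Hive were configured in sqoop-env.sh, those warnings can be ignored:
[hadoop@hadoop3 ~]$ sqoop-version
...
Sqoop 1.4.6
git commit id ...
Compiled by ... on ...
As a further smoke test that also exercises the MySQL driver added in step (4), you can list the databases over JDBC (the hostname and username here are placeholders; substitute your own MySQL server and account, and enter the password when prompted):
[hadoop@hadoop3 ~]$ sqoop list-databases --connect jdbc:mysql://hadoop3:3306/ --username root -P
If this prints the databases on your MySQL server, both Sqoop and the JDBC driver are working.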