Data Migration Tool -- Sqoop Installation and Deployment

Download link: http://archive.apache.org/dist/sqoop/1.4.6/

After downloading, upload the archive to the master node of the cluster.
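
A minimal sketch of the upload, assuming the archive sits in the current directory on your local machine and the master node is reachable as master (the hostname, user, and target directory /usr/hdk/ are assumptions; adjust to your cluster):

# Copy the Sqoop archive to the master node
scp sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz root@master:/usr/hdk/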

Prerequisites: JDK 1.8 and Hadoop 2.7.
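
To confirm the prerequisites on the master node, a quick check (assuming java and hadoop are already on the PATH):

# Should report a 1.8.x JDK
java -version
# Should report Hadoop 2.7.x
hadoop version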

Step 1: Extract and rename.

Extract: tar -zxvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz

Rename: mv sqoop-1.4.6.bin__hadoop-2.0.4-alpha sqoop

Step 2: Modify the configuration file

Enter the conf directory under the Sqoop installation directory.

Rename the configuration file:

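A minimal sketch of this step, assuming Sqoop was extracted and renamed under /usr/hdk/sqoop (the parent directory is an assumption based on the other /usr/hdk paths) and that the shipped template is named sqoop-env-template.sh, the standard name in the 1.4.x distribution:

cd /usr/hdk/sqoop/conf
# Copy the shipped template to the file name Sqoop actually reads
cp sqoop-env-template.sh sqoop-env.sh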

Edit the configuration file:

The Hadoop and Hadoop MapReduce environment variables must be configured. The ZooKeeper entries can optionally point to the ZooKeeper we installed earlier, and the Hive and HBase entries must be configured when those tools are used.


# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
 
# included in all the hadoop scripts with source command
# should not be executable directly
# also should not be passed any arguments, since we need original $*
 
# Set Hadoop-specific environment variables here.
 
#Set path to where bin/hadoop is available
# Hadoop environment information (required)
export HADOOP_COMMON_HOME=/usr/hdk/hadoop
 
#Set path to where hadoop-*-core.jar is available
# Directory where Hadoop MapReduce is located (required)
export HADOOP_MAPRED_HOME=/usr/hdk/hadoop/tmp/mapred
 
#set the path to where bin/hbase is available
export HBASE_HOME=/usr/hdk/hbase
 
#Set the path to where bin/hive is available
export HIVE_HOME=/usr/hdk/hive
 
#Set the path for where zookeeper config dir is
export ZOOKEEPER_HOME=/usr/hdk/zookeeper
export ZOOCFGDIR=/usr/hdk/zookeeper

Add the JDBC driver

Whichever database you plan to extract data from, place that database's JDBC driver jar under <Sqoop installation path>/lib/, as shown in the sketch below.
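
A sketch for MySQL, assuming the driver jar has already been downloaded to the current directory and Sqoop lives at /usr/hdk/sqoop (the jar file name and version are assumptions; use whichever connector jar matches your MySQL server):

# Put the MySQL JDBC driver on Sqoop's classpath
cp mysql-connector-java-5.1.46.jar /usr/hdk/sqoop/lib/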

Then go into the bin directory and run the sqoop help command. Some warning messages will appear (they can be ignored), followed by the help guide.
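
A minimal sketch, again assuming Sqoop is installed at /usr/hdk/sqoop:

cd /usr/hdk/sqoop/bin
# Warnings (for example about HCatalog or Accumulo not being set) can be ignored
./sqoop help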

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

This indicates that the configuration succeeded.

Then test whether Sqoop can connect to the database:

sqoop list-databases --connect jdbc:mysql://<ip address>:3306/ --username <username> --password <password>

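For instance, a hypothetical invocation (the IP address, username, and password below are placeholder values, not taken from this setup); if the connection works, the names of the databases on that MySQL server are printed:

sqoop list-databases \
  --connect jdbc:mysql://192.168.1.100:3306/ \
  --username root \
  --password 123456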
