When migrating a large volume of data, especially across different database products, it is usually best to rely on the databases' own built-in tools. This article walks through a MySQL-to-Oracle migration using those tools.
Step 1: export the data from MySQL to flat files, using a shell script and a SQL script. The key statement is SELECT ... FROM ... INTO OUTFILE ...
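One practical wrinkle with INTO OUTFILE is MySQL's secure_file_priv setting: the server will only write outfiles under that directory (an empty value means no restriction; NULL disables file export entirely). A minimal pre-flight sketch, with a hypothetical helper check_outfile_dir that is not part of the original scripts:

```shell
#!/bin/sh
# Hypothetical pre-flight check: decide whether INTO OUTFILE may write to
# the intended directory, given the value of @@secure_file_priv, e.g. from:
#   mysql -uposp -p... -N -e 'SELECT @@secure_file_priv'
check_outfile_dir() {
    # $1 = value of @@secure_file_priv, $2 = intended output directory
    case "$1" in
        NULL) echo "file export disabled"; return 1 ;;
        "")   echo "no restriction" ;;
        *)
            case "$2" in
                "$1"*) echo "allowed under $1" ;;
                *)     echo "must write under $1"; return 1 ;;
            esac ;;
    esac
}

check_outfile_dir "" /var/lib/mysql/tmpdir   # prints "no restriction"
```

If the check reports a restriction, either export under the allowed directory or adjust secure_file_priv in my.cnf before running the script below.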
MySqlExport.sh
#!/bin/sh
echo "Exporting data from the database, please wait......"
mkdir /var/lib/mysql/tmpdir
if [ $? -eq 0 ]; then
    chown mysql:mysql /var/lib/mysql/tmpdir
    if [ $? -eq 0 ]; then
        mysql -uposp -pOAA7O9fHZ4rP jwposp < /var/lib/mysql/MySqlExport.sql
        if [ $? -eq 0 ]; then
            echo "Export succeeded ^-^, the data files are in /var/lib/mysql/tmpdir!"
        else
            echo "Export failed!"
            exit 1
        fi
    else
        echo "Failed to chown /var/lib/mysql/tmpdir to the mysql user!"
        exit 1
    fi
else
    echo "Failed to create directory /var/lib/mysql/tmpdir!"
    exit 1
fi
MySqlExport.sql
/* batch record table */
SELECT
IFNULL(job,''),
IFNULL(job_name,''),
IFNULL(work_date,''),
IFNULL(addi_data1,''),
IFNULL(addi_data2,''),
IFNULL(addi_data3,''),
IFNULL(addi_data4,''),
IFNULL(addi_data5,''),
IFNULL(STATUS,''),
IFNULL(exec_date,''),
IFNULL(exec_time,'')
FROM T_BATCH_REC INTO OUTFILE '/var/lib/mysql/tmpdir/BatchRec.txt'
CHARACTER SET gbk
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
/* batch record detail table */
SELECT
IFNULL(down_date,''),
IFNULL(merid,''),
IFNULL(STATUS,''),
IFNULL(down_type,'')
FROM T_BATCH_REC_DETAIL INTO OUTFILE '/var/lib/mysql/tmpdir/BatchRecDetail.txt'
CHARACTER SET gbk
FIELDS TERMINATED BY ','   -- fields separated by commas
LINES TERMINATED BY '\n';  -- Unix (Linux) line endings
The data files BatchRec.txt and BatchRecDetail.txt are now generated.
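Before moving the files to the Oracle host, a quick sanity check is useful: each line of an outfile corresponds to one table row, so the line count should match SELECT COUNT(*) on the source table. A small sketch (count_lines is a hypothetical helper; the paths follow the export script above):

```shell
#!/bin/sh
# Report the line count of each exported data file, or flag it as
# missing/empty; compare the counts against SELECT COUNT(*) per table.
count_lines() {
    # $1 = exported data file
    if [ -s "$1" ]; then
        echo "$1: $(wc -l < "$1") line(s)"
    else
        echo "$1: missing or empty"
    fi
}

count_lines /var/lib/mysql/tmpdir/BatchRec.txt
count_lines /var/lib/mysql/tmpdir/BatchRecDetail.txt
```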
Step 2: load the data files into Oracle, using a shell script and SQL*Loader control (.ctl) files. Note that the control files below read from /home/moveData/tmpdir, so copy the exported files from the MySQL host to that directory first. The key command is sqlldr.
OracleImport.sh
#!/bin/sh
# batch record table
sqlldr userid=jw/jw123@orcl control=/home/moveData/BatchRec.ctl bad=/home/moveData/BatchRec.bad log=/home/moveData/BatchRec.log errors=0
if [ $? -eq 0 ]; then
    echo "Batch record table loaded successfully!"
else
    echo "Batch record table load failed!"
    exit 1
fi
# batch record detail table
sqlldr userid=jw/jw123@orcl control=/home/moveData/BatchRecDetail.ctl bad=/home/moveData/BatchRecDetail.bad log=/home/moveData/BatchRecDetail.log errors=0
if [ $? -eq 0 ]; then
    echo "Batch record detail table loaded successfully!"
else
    echo "Batch record detail table load failed!"
    exit 1
fi
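The exit code alone does not tell the whole story: with errors=0 any bad record aborts the load, and rejected rows are written to the .bad files named in the script above. A sketch of a post-load check (check_load is a hypothetical helper, not part of the original script):

```shell
#!/bin/sh
# Flag any sqlldr .bad file that is non-empty, i.e. contains rejected rows.
check_load() {
    # $1 = .bad file produced by sqlldr
    if [ -s "$1" ]; then
        echo "$1: rejected rows present"
        return 1
    fi
    echo "$1: no rejected rows"
}

check_load /home/moveData/BatchRec.bad
check_load /home/moveData/BatchRecDetail.bad
```

The .log files are also worth a scan; they record how many rows were read, loaded, and rejected for each table.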
BatchRec.ctl
load data
infile '/home/moveData/tmpdir/BatchRec.txt'
append into table BATCH_REC
fields terminated by ','
trailing nullcols
(
job,
job_name,
work_date,
addi_data1,
addi_data2,
addi_data3,
addi_data4,
addi_data5,
status,
exec_date,
exec_time
)
BatchRecDetail.ctl
load data
infile '/home/moveData/tmpdir/BatchRecDetail.txt'
append into table BATCH_REC_DETAIL
fields terminated by ','
trailing nullcols
(
down_date,
merid,
status CHAR,
down_type CHAR
)
The data load is now complete!
Finally, run the same SQL queries against both databases to verify that the data was imported correctly; this verification step is essential!
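A simple form of that verification is comparing per-table row counts. The sketch below shows only the comparison logic (compare_counts is a hypothetical helper and the literal counts are illustrative); in a real run the counts would come from mysql and sqlplus queries like those in the comments:

```shell
#!/bin/sh
# Compare a table's row count in MySQL vs Oracle. Real counts would be
# fetched with the connections used above, e.g.:
#   mysql -uposp -p... -N -e 'SELECT COUNT(*) FROM T_BATCH_REC' jwposp
#   echo 'SELECT COUNT(*) FROM BATCH_REC;' | sqlplus -S jw/jw123@orcl
compare_counts() {
    # $1 = table name, $2 = MySQL count, $3 = Oracle count
    if [ "$2" -eq "$3" ]; then
        echo "$1: OK ($2 rows)"
    else
        echo "$1: MISMATCH (MySQL=$2, Oracle=$3)"
        return 1
    fi
}

compare_counts T_BATCH_REC 100 100   # prints "T_BATCH_REC: OK (100 rows)"
```

Row counts only catch missing or duplicated rows; spot-checking a few records per table (especially dates and multi-byte text, given the gbk export) is a worthwhile second pass.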