DataX Related Issues

Transferring data with DataX

Edit the job configuration:

{
    "core":{
        "transport":{
            "channel":{
                "speed":{
                    "channel":2,
                    "record":-1,
                    "byte":-1,
                    "batchSize":2048
                }
            }
        }
    },
    "job":{
        "setting":{
            "speed":{
                "channel":5
            }
        },
        "content":[
            {
                "reader":{
                    "name":"oraclereader",
                    "parameter":{
                        "username":"jflhnew",
                        "password":"gj_jflhnew_12#$",
                        "connection":[
                            {
                                "querySql":[
                                    "<your query SQL>"
                                ],
                                "jdbcUrl":[
                                    "jdbc:oracle:thin:@192.123.76.134:1521:bjsrbj1"
                                ]
                            }
                        ]
                    }
                },
                "writer":{
                    "name":"hdfswriter",
                    "parameter":{
                        "defaultFS":"hdfs://hacluster",
                        "fileType":"text",
                        "path":"/warehouse/origin/${out}/${tab}/",
                        "fileName":"${tab}",
                        "column":[
                            {"name":"<column name>","type":"<column type>"},
                            {"name":"<column name>","type":"<column type>"}
                        ],
                        "writeMode":"append",
                        "fieldDelimiter": "\t",
                        "haveKerberos":"true",
                        "kerberosKeytabFilePath":"/mnt/datax/kerberos/user.keytab",
                        "kerberosPrincipal":"test_java2@HADOOP.COM",
                        "hadoopConfig":{
                            "dfs.nameservices":"hacluster",
                            "dfs.ha.namenodes.hacluster":"10,9",
                            "dfs.namenode.rpc-address.hacluster.10":"192.123.78.5:25000",
                            "dfs.namenode.rpc-address.hacluster.9":"192.123.78.4:25000",
                            "dfs.client.failover.proxy.provider.hacluster":"org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
                            "hadoop.security.authentication":"Kerberos",
                            "hadoop.rpc.protection":"privacy"
                        }
                    }
                }
            }
        ]
    }
}
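The `${out}` and `${tab}` tokens in `path` and `fileName` are job variables; `datax.py` can usually fill them at submit time via `-p "-Dout=... -Dtab=..."`, or you can render the JSON yourself before submitting. A minimal sketch of the latter, using Python's `string.Template` (the values `jflhnew` and `t_order` below are illustrative, not from the original job):

```python
# Minimal sketch: fill the ${out}/${tab} placeholders in a DataX job JSON
# before handing the file to datax.py. The template here is a trimmed-down
# stand-in for the full job config shown above.
import json
from string import Template

job_template = '{"path": "/warehouse/origin/${out}/${tab}/", "fileName": "${tab}"}'

def render_job(template_text, out, tab):
    # safe_substitute fills ${out}/${tab} but leaves any other ${...}
    # tokens untouched instead of raising KeyError.
    return Template(template_text).safe_substitute(out=out, tab=tab)

rendered = render_job(job_template, out="jflhnew", tab="t_order")
cfg = json.loads(rendered)          # still valid JSON after substitution
print(cfg["path"])                  # /warehouse/origin/jflhnew/t_order/
```

Write the rendered text to a temporary `.json` file and pass that path to `datax.py`.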

First, source the Hadoop client environment and obtain a Kerberos ticket:

[root@host-192-125-30-10 ~]# source /opt/hadoopclient/bigdata_env
[root@host-192-125-30-10 ~]# kinit -kt /mnt/test_java2_keytab/user.keytab test_java2

Run the job JSON.

In the background:

[root@host-192-125-30-10 ~]# nohup python /mnt/datax/bin/datax.py /mnt/guanyu/<filename>.json >log.log 2>&1 &

In the foreground:

[root@host-192-125-30-10 ~]# python /mnt/datax/bin/datax.py /mnt/guanyu/<filename>.json
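When the job runs in the background, success or failure has to be read from `log.log`. A rough post-run check, assuming the job summary contains DataX's default failure counter line (label "读写失败总数"; adjust the pattern if your DataX version logs a different summary format):

```python
# Rough sketch: scan a DataX run log for the failed-record counter in the
# final job summary. The "读写失败总数" label is assumed from DataX's default
# summary output and may differ across versions.
import re

def failed_records(log_text):
    # Returns the failed-record count, or None if no summary line is found
    # (e.g. the job crashed before printing its summary).
    m = re.search(r"读写失败总数\s*:\s*(\d+)", log_text)
    return int(m.group(1)) if m else None

sample = (
    "读出记录总数                   :                 1024\n"
    "读写失败总数                   :                    0\n"
)
print(failed_records(sample))  # 0
```

A `None` result is worth treating as a failure too, since it means the job never reached its summary.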