Pick a test table in Hive.
Create an identical table in the test MySQL database.
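A sketch of the target table's DDL, assuming the column names used by the job below; the types are assumptions (the job reads every field as a string), so adjust them to match the Hive table's actual schema:

```sql
-- Hypothetical DDL for the MySQL target table; column names match the
-- writer's "column" list in the job JSON, types are placeholder guesses.
CREATE TABLE hive2mysqltest (
    id                      VARCHAR(64),
    name                    VARCHAR(255),
    age                     VARCHAR(16),
    conent                  VARCHAR(1024),
    city                    VARCHAR(255),
    transform_bd_sp_key     VARCHAR(255),
    transform_bd_sp_key_pk  VARCHAR(255),
    transform_bd_sp_time    VARCHAR(64)
);
```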
Write the conversion JSON:
{
    "job": {
        "setting": {
            "speed": {
                "channel": 3
            }
        },
        "content": [
            {
                "reader": {
                    "name": "hdfsreader",
                    "parameter": {
                        "path": "/metastore/table/spider_test/region_test23/*",
                        "defaultFS": "hdfs://ip:port",
                        "column": [
                            { "index": 0, "type": "string" },
                            { "index": 1, "type": "string" },
                            { "index": 2, "type": "string" },
                            { "index": 3, "type": "string" },
                            { "index": 4, "type": "string" },
                            { "index": 5, "type": "string" },
                            { "index": 6, "type": "string" },
                            { "index": 7, "type": "string" }
                        ],
                        "fileType": "text",
                        "encoding": "UTF-8",
                        // field delimiter (Ctrl-A, Hive's default for text files)
                        "fieldDelimiter": "\u0001"
                    }
                },
                "writer": {
                    "name": "mysqlwriter",
                    "parameter": {
                        // write mode: insert, update, replace, etc.
                        "writeMode": "insert",
                        "username": "username",
                        "password": "password",
                        "column": [
                            "id",
                            "name",
                            "age",
                            "conent",
                            "city",
                            "transform_bd_sp_key",
                            "transform_bd_sp_key_pk",
                            "transform_bd_sp_time"
                        ],
                        "session": [
                            "set session sql_mode='ANSI'"
                        ],
                        // SQL to run before the job; postSql can likewise be set for SQL to run afterwards
                        "preSql": [
                            "delete from hive2mysqltest"
                        ],
                        "connection": [
                            {
                                "jdbcUrl": "jdbc:mysql://192.168.2.52:3306/test?useUnicode=true&characterEncoding=gbk",
                                "table": [
                                    "hive2mysqltest"
                                ]
                            }
                        ]
                    }
                }
            }
        ]
    }
}
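Note that hdfsreader addresses fields by position (`index`) while mysqlwriter addresses them by name, so the order of the two column lists must line up one-to-one. A minimal Python sketch of how one \u0001-delimited Hive text row maps onto the MySQL columns (the sample data is made up):

```python
# Column order must match the writer's "column" list in the job JSON.
MYSQL_COLUMNS = [
    "id", "name", "age", "conent", "city",
    "transform_bd_sp_key", "transform_bd_sp_key_pk", "transform_bd_sp_time",
]

def parse_hive_text_row(line: str) -> dict:
    """Split one Hive text-format row on \u0001 (Ctrl-A, Hive's default
    field delimiter) and map the fields onto the MySQL column names."""
    fields = line.rstrip("\n").split("\u0001")
    return dict(zip(MYSQL_COLUMNS, fields))

# Made-up sample row with 8 Ctrl-A separated fields.
row = parse_hive_text_row(
    "1\u0001alice\u000130\u0001hello\u0001beijing\u0001k\u0001pk\u00012024-01-01"
)
print(row["name"], row["city"])  # -> alice beijing
```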
For the full JSON format, refer to the DataX documentation on GitHub. Note that the // annotations above are explanatory only and must be removed before running, since the job file has to be valid JSON.
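With the job saved to a file, it is launched through DataX's datax.py script; the install path and file name here are assumptions for this sketch:

```shell
# DATAX_HOME and the job file name are placeholders.
python $DATAX_HOME/bin/datax.py ./hive2mysql.json
```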
Run the test:
java.lang.NoSuchMethodError: org.apache.hadoop.tracing.SpanReceiverHost.get(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)Lorg/apache/hadoop/tracing/SpanReceiverHost;
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:634)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.Fi…