1. First, pull the uReplicator code from git, build it, and then start it.
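The pull-and-build step can be sketched as below. The repo URL comes from the official wiki linked at the end; git, Maven, and network access are assumed.

```shell
# Hedged sketch: fetch and build uReplicator. The build drops the start
# scripts under uReplicator-Distribution/target/uReplicator-Distribution-pkg/bin.
build_ureplicator() {
    git clone https://github.com/uber/uReplicator.git &&
    cd uReplicator &&
    mvn clean package -DskipTests    # skip tests for a faster first build
}
# build_ureplicator   # run where git, mvn and network access are available
```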
2. I have two sets of ZooKeeper and Kafka clusters:
Source Kafka: 172.30.3.120
Target Kafka: 172.30.3.210
a. Start step 1: start the controller
# -zookeeper points at the ZooKeeper that stores uReplicator's own state
/home/datacanvas/huml/uReplicator/uReplicator-Distribution/target/uReplicator-Distribution-pkg/bin/start-controller.sh \
  -mode haha \
  -enableAutoWhitelist false \
  -port 9000 \
  -refreshTimeInSeconds 10 \
  -srcKafkaZkPath 172.30.3.120:2181 \
  -zookeeper 172.30.3.120:2181 \
  -destKafkaZkPath 172.30.3.210:2181 \
  -helixClusterName uReplicatorDev01
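Because the controller runs with -enableAutoWhitelist false, topics must be whitelisted by hand through the controller's REST API on port 9000. The endpoint shape below follows the uReplicator wiki; "testTopic" and the partition count are placeholders.

```shell
# Hypothetical helper wrapping the controller's /topics REST endpoint
# (endpoint shape per the uReplicator wiki; verify against your version).
whitelist_topic() {
    local topic="$1" partitions="$2"
    curl -X POST -d "{\"topic\":\"${topic}\", \"numPartitions\":\"${partitions}\"}" \
        "http://localhost:9000/topics"
}
# whitelist_topic testTopic 4        # run once the controller is up
# curl http://localhost:9000/topics  # list currently whitelisted topics
```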
b. Start the worker
The worker's config files need to be edited first.
1> Edit helix.properties
zkServer=172.30.3.120:2181 // same ZooKeeper as the controller's -zookeeper flag
instanceId=testHelixMirrorMaker01
helixClusterName=uReplicatorDev01 // must match the controller's -helixClusterName
enableAutowhitelist=false
port=9000
srcKafkaZkPath=172.30.3.120:2181
destKafkaZkPath=172.30.3.210:2181
2> Edit consumer.properties
zookeeper.connect=172.30.3.120:2181 // the source cluster's ZooKeeper
zookeeper.connection.timeout.ms=30000
zookeeper.session.timeout.ms=30000
# consumer group id
group.id=kloak-mirrormaker-test
consumer.id=kloakmms01-sjc1
partition.assignment.strategy=roundrobin
socket.receive.buffer.bytes=1048576
fetch.message.max.bytes=8388608
queued.max.message.chunks=5
auto.offset.reset=smallest
3> Edit producer.properties
// target Kafka cluster, ip:port
bootstrap.servers=172.30.3.210:9092
client.id=kloak-mirrormaker-test
producer.type=async
compression.type=none
key.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
Start the worker:
/home/datacanvas/huml/uReplicator/uReplicator-Distribution/target/uReplicator-Distribution-pkg/bin/start-worker.sh \
  --consumer.config /home/datacanvas/huml/uReplicator/config/consumer.properties \
  --producer.config /home/datacanvas/huml/uReplicator/config/producer.properties \
  --helix.config /home/datacanvas/huml/uReplicator/config/helix.properties
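Once the controller and worker are both up, the mirror path can be smoke-tested with the stock Kafka console tools. This assumes they are on PATH, that both clusters' brokers listen on port 9092, and that "testTopic" exists and has been whitelisted on the controller.

```shell
# Hedged smoke test: write one message to the source cluster, then read it
# back from the target cluster via the stock Kafka console tools.
smoke_test_mirror() {
    echo "hello from source" | kafka-console-producer.sh \
        --broker-list 172.30.3.120:9092 --topic testTopic
    kafka-console-consumer.sh --bootstrap-server 172.30.3.210:9092 \
        --topic testTopic --from-beginning --max-messages 1
}
# smoke_test_mirror   # the produced message should appear on the target side
```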
Official documentation example:
https://github.com/uber/uReplicator/wiki/uReplicator-User-Guide