Writing Kafka Data to HBase
This post shows how to consume messages from a Kafka topic in Java and write them into HBase.
1. Preparation
First, create the target namespace and table in the HBase shell:
create_namespace 'events_db'
create 'events_db:user_friend','uf'
2.源数据的处理
我们通过flume将源数据写入kafka的topic当中,然后这个topic就已经具有所有需要处理的数据了,然后通过以下代码对数据进行处理,分为3个模块:
1.write模块 ,消费kafka源数据,并调用handler中的ICusTopo对消费的数据进行处理,并写入hbase表中。
The IWriter interface:
public interface IWriter {
    // Returns the number of records written to HBase
    int write(ConsumerRecords<String, String> records);
}
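The IHandler interface itself is not shown in this post. Judging from the call `handler.ICustopo(records)` below, its job is to map each consumed Kafka record to an HBase Put for the `events_db:user_friend` table. As a rough, library-free sketch of that mapping (the class name, the CSV layout "userId,friendId", and the row-key scheme are all assumptions, not the original code):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Library-free sketch standing in for IHandler.ICustopo(records).
// Assumes each Kafka message value is CSV of the form "userId,friendId".
public class UserFriendHandlerSketch {

    // Stand-in for an HBase Put: (rowKey, [userId, friendId] cell values).
    public static Map.Entry<String, String[]> toPut(String value) {
        String[] parts = value.split(",");
        String userId = parts[0];
        String friendId = parts[1];
        // Hash the concatenated ids so row keys spread across regions.
        String rowKey = Integer.toString((userId + friendId).hashCode());
        // In the real handler these cells would go into the 'uf' column family.
        return new SimpleEntry<>(rowKey, new String[]{userId, friendId});
    }

    public static List<Map.Entry<String, String[]>> handle(List<String> values) {
        List<Map.Entry<String, String[]>> puts = new ArrayList<>();
        for (String v : values) {
            puts.add(toPut(v));
        }
        return puts;
    }
}
```

In the real code, `toPut` would build an `org.apache.hadoop.hbase.client.Put` with `addColumn(Bytes.toBytes("uf"), ...)` instead of a map entry.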
The KafkaWriter2 implementation class:
import java.io.IOException;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.kafka.clients.consumer.ConsumerRecords;

public class KafkaWriter2 implements IWriter {
    private IHandler handler;
    private Connection con = null;
    private Table table = null;

    public KafkaWriter2(IHandler handler, String tableName) {
        this.handler = handler;
        final Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "192.168.126.166");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        // The standard property name is "hbase.rootdir"
        conf.set("hbase.rootdir", "hdfs://192.168.126.166:9000/hbase");
        try {
            con = ConnectionFactory.createConnection(conf);
            table = con.getTable(TableName.valueOf(tableName));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public int write(ConsumerRecords<String, String> records) {
        try {
            // Transform the consumed records into Puts and batch-write them
            List<Put> datas = handler.ICustopo(records);
            table.put(datas);
            return datas.size();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return 0;
    }
}
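The post does not show the module that drives KafkaWriter2, but the overall loop is: poll a batch of records from the Kafka consumer, hand the batch to write(), and repeat. A simplified, library-free sketch of that contract (the WorkerSketch name, the in-memory queue standing in for the topic, and the batch size are all illustrative, not the original code):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Library-free sketch of the consume -> handle -> write loop.
// A real worker would call KafkaConsumer.poll() and pass ConsumerRecords to IWriter.
public class WorkerSketch {

    interface SimpleWriter {
        int write(List<String> records); // returns the number of rows written
    }

    // Drain the queue in batches, mimicking repeated consumer.poll(timeout) calls.
    public static int run(Queue<String> topic, SimpleWriter writer, int batchSize) {
        int total = 0;
        while (!topic.isEmpty()) {
            List<String> batch = new ArrayList<>();
            while (batch.size() < batchSize && !topic.isEmpty()) {
                batch.add(topic.poll());
            }
            // In the real loop, an offset commit would follow a successful write.
            total += writer.write(batch);
        }
        return total;
    }

    public static void main(String[] args) {
        Queue<String> topic = new ArrayDeque<>(List.of("1,2", "3,4", "5,6"));
        int written = run(topic, batch -> batch.size(), 2);
        System.out.println(written); // 3
    }
}
```

With the real classes, the lambda would be replaced by a KafkaWriter2 instance and the queue by a subscribed KafkaConsumer.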