Producing and Consuming Kafka Messages with the Java API

In my article "Kafka集群搭建" (Kafka cluster setup), I showed how to set up a single-node Kafka service. Building on that, we can use a Java program to produce and consume messages against it.

1. Create the Maven project

First, create a Maven project in IntelliJ IDEA and add the following dependencies and plugin to pom.xml:

    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.10.1.1</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.10 -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.10</artifactId>
            <version>0.10.1.1</version>
        </dependency>
    </dependencies>

    <!-- Plugins -->
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                </configuration>
            </plugin>
        </plugins>
    </build>

2. Create the Producer

import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
import kafka.serializer.StringEncoder;
import kafka.javaapi.producer.Producer;

import java.util.Properties;
import java.util.concurrent.TimeUnit;

/**
 * Created by zhanghuayan on 2017/1/22.
 */
public class ProducerTest extends Thread {
    private String topic;

    public ProducerTest(String topic) {
        super();
        this.topic = topic;
    }

    // Send one message per second, putting an incrementing counter in the payload
    public void run() {
        Producer<Integer, String> producer = createProducer();
        int i = 0;
        while (true) {
            producer.send(new KeyedMessage<Integer, String>(topic, "message: " + i++));
            try {
                TimeUnit.SECONDS.sleep(1);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    private Producer<Integer, String> createProducer() {
        Properties properties = new Properties();
        properties.put("zookeeper.connect", "192.168.180.66:2181");//声明zk
        properties.put("serializer.class", StringEncoder.class.getName());
        properties.put("metadata.broker.list", "192.168.180.66:9092");// 声明kafka broker
        return new Producer<Integer, String>(new ProducerConfig(properties));
    }

    public static void main(String[] args){
        new ProducerTest("test").start();
    }
}

Run the program; the Producer now writes a message to Kafka every second.
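For comparison, the kafka-clients dependency already declared in pom.xml also provides the newer producer API, which talks to the broker directly without ZooKeeper. The sketch below is only an illustration using the same broker address and topic; the class name NewApiProducerTest and the fixed message count are illustrative assumptions:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class NewApiProducerTest {
    public static void main(String[] args) throws InterruptedException {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.180.66:9092"); // connect to the broker directly, no ZooKeeper
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
        for (int i = 0; i < 10; i++) {
            // ProducerRecord(topic, value): send a string payload to the "test" topic
            producer.send(new ProducerRecord<String, String>("test", "message: " + i));
            Thread.sleep(1000);
        }
        producer.close(); // flush pending messages and release resources
    }
}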

3. Create the Consumer
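
The consumer below is a minimal sketch based on the old high-level consumer API (kafka.consumer.*) that ships with the kafka_2.10 dependency; the ZooKeeper address and topic mirror the producer above, and the group id group1 is an assumed value:

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

public class ConsumerTest extends Thread {
    private String topic;

    public ConsumerTest(String topic) {
        super();
        this.topic = topic;
    }

    public void run() {
        ConsumerConnector consumer = createConsumer();

        // Ask for a single stream (one consuming thread) for the topic
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put(topic, 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> streams = consumer.createMessageStreams(topicCountMap);

        // Block on the stream iterator and print every message that arrives
        ConsumerIterator<byte[], byte[]> it = streams.get(topic).get(0).iterator();
        while (it.hasNext()) {
            System.out.println("Received: " + new String(it.next().message()));
        }
    }

    private ConsumerConnector createConsumer() {
        Properties properties = new Properties();
        properties.put("zookeeper.connect", "192.168.180.66:2181"); // ZooKeeper address
        properties.put("group.id", "group1");                       // consumer group id (assumed)
        return Consumer.createJavaConsumerConnector(new ConsumerConfig(properties));
    }

    public static void main(String[] args) {
        new ConsumerTest("test").start();
    }
}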


Run the program; the Consumer continuously reads messages from Kafka, and the console prints output like the following:

Received: message: 0
Received: message: 1
Received: message: 2
Received: message: 3
Received: message: 4
Received: message: 5
Received: message: 6
Received: message: 7
Received: message: 8
Received: message: 9
Received: message: 10
Received: message: 11
Received: message: 12
Received: message: 13
Received: message: 14