Prerequisites
You will need a Linux machine on which to install Docker and run Kafka.
You will need Visual Studio with C# to build the Kafka client applications.
Install Docker
Kafka Quick Install
Create a working directory
mkdir Kafka
cd Kafka
vi docker-compose.yml
Write the docker-compose.yml file
version: "2"
services:
  kafkaserver:
    image: "spotify/kafka:latest"
    container_name: kafka
    hostname: kafkaserver
    restart: always
    networks:
      - kafkanet
    ports:
      - 2181:2181
      - 9092:9092
    environment:
      ADVERTISED_HOST: kafkaserver
      ADVERTISED_PORT: 9092
  kafka_manager:
    image: "mzagar/kafka-manager-docker:1.3.3.4"
    container_name: kafkamanager
    restart: always
    networks:
      - kafkanet
    ports:
      - 9000:9000
    links:
      - kafkaserver
    environment:
      ZK_HOSTS: "kafkaserver:2181"
networks:
  kafkanet:
    driver: bridge
Run Docker Compose
docker-compose up
docker-compose ps
Set up the local hosts file
Add the following entry to your hosts file (/etc/hosts on Linux, C:\Windows\System32\drivers\etc\hosts on Windows); note the format is IP address first, then hostname:
127.0.0.1 kafkaserver
Wait for Kafka Manager to finish starting (about 3 minutes).
Kafka Quick Configuration
Open the Kafka management UI
http://localhost:9000
Create a cluster
Cluster -> Add Cluster -> enter your cluster name
- Cluster Zookeeper Hosts: "kafkaserver:2181"
- Kafka Version: "0.10.0.1"
- (Optional) enable size tracking (check Enable JMX Polling)
Create a topic
Topic -> Create -> set your topic name
Kafka application scenarios:
Systems that generate business or master data at high volume typically play the Producer role, writing to Kafka topics at high throughput.
Downstream processing applications, especially those constrained by write bottlenecks such as databases, are collectively the consumers: they throttle the flow out of Kafka and consume topics through distributed, microservice-style subscriptions that can scale automatically.
Kafka Client Code (C#)
Create two console projects in Visual Studio:
Kafka.Producer - publishes messages to a topic
Kafka.Consumer - subscribes to the topic and consumes its messages
Add the NuGet package reference (for example via `dotnet add package Confluent.Kafka`):
Confluent.Kafka
Add a shared message converter class. Note that the producer calls SetValueSerializer and the consumer calls SetValueDeserializer, so the class must implement both ISerializer<object> and IDeserializer<object>:
using System;
using System.Text;
using Confluent.Kafka;
using Newtonsoft.Json;

public class KafkaConverter : ISerializer<object>, IDeserializer<object>
{
    // Used by the producer: strings are sent as-is, everything else as JSON.
    public byte[] Serialize(object data, SerializationContext context)
    {
        if (data == null) return null;
        var json = data as string ?? JsonConvert.SerializeObject(data);
        return Encoding.UTF8.GetBytes(json);
    }

    // Used by the consumer: try to parse the payload as JSON,
    // fall back to the raw string if it is not valid JSON.
    public object Deserialize(ReadOnlySpan<byte> data, bool isNull, SerializationContext context)
    {
        if (isNull) return null;
        var json = Encoding.UTF8.GetString(data.ToArray());
        try
        {
            return JsonConvert.DeserializeObject(json);
        }
        catch (JsonException)
        {
            return json;
        }
    }
}
Kafka.Producer (Program.cs)
static void Main(string[] args)
{
    try
    {
        ProducerConfig config = new ProducerConfig();
        config.BootstrapServers = "XXX.XXX.XX.XXX:9092";
        var builder = new ProducerBuilder<string, object>(config);
        builder.SetValueSerializer(new KafkaConverter());// set the value serializer
        var producer = builder.Build();
        producer.Produce("Kafka_test_topic", new Message<string, object>() { Key = "Test", Value = "hello world" });
        producer.Flush(TimeSpan.FromSeconds(10));// Produce is asynchronous; flush so the message is actually delivered before exiting
        Console.ReadKey();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);
    }
}
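Produce is fire-and-forget; if you want per-message delivery confirmation, the same client also exposes ProduceAsync, which completes once the broker acknowledges the write. A minimal sketch, assuming the same broker address, topic, and KafkaConverter as above, and an async Main (C# 7.1+):

```csharp
using System;
using Confluent.Kafka;

var config = new ProducerConfig { BootstrapServers = "XXX.XXX.XX.XXX:9092" };
using var producer = new ProducerBuilder<string, object>(config)
    .SetValueSerializer(new KafkaConverter())
    .Build();

// The returned Task completes when the broker acknowledges the write,
// carrying the partition and offset the message was assigned.
var result = await producer.ProduceAsync(
    "Kafka_test_topic",
    new Message<string, object> { Key = "Test", Value = "hello world" });
Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");
```

ProduceAsync is convenient for confirming individual writes; for high-throughput batch publishing, the fire-and-forget Produce plus a final Flush (as in the producer above) keeps latency lower.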
Kafka.Consumer (Program.cs)
static void Main(string[] args)
{
    ConsumerConfig config = new ConsumerConfig();
    config.BootstrapServers = "XXX.XXX.XX.XXX:9092";
    config.GroupId = "group.1";
    config.AutoOffsetReset = AutoOffsetReset.Earliest;
    config.EnableAutoCommit = false;
    var builder = new ConsumerBuilder<string, object>(config);
    builder.SetValueDeserializer(new KafkaConverter());// set the value deserializer
    var consumer = builder.Build();
    consumer.Subscribe("Kafka_test_topic");// subscribe by topic name (the same topic the producer writes to)
    //consumer.Assign(new TopicPartition("Kafka_test_topic", new Partition(1)));// use Assign instead to consume from a specific partition
    while (true)
    {
        var result = consumer.Consume();
        Console.WriteLine($"receive message:{result.Message.Value}");
        consumer.Commit(result);// manual commit; not needed if EnableAutoCommit = true above
    }
}
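The consume loop above never exits, and killing the process without cleanup delays the group rebalance. A common pattern is to cancel on Ctrl+C via a CancellationToken and call Close() so the consumer leaves the group promptly. A sketch, assuming the same consumer built above:

```csharp
using System;
using System.Threading;
using Confluent.Kafka;

var cts = new CancellationTokenSource();
// Trap Ctrl+C: cancel the token instead of killing the process outright.
Console.CancelKeyPress += (_, e) => { e.Cancel = true; cts.Cancel(); };

try
{
    while (true)
    {
        // Consume(token) throws OperationCanceledException once cancelled.
        var result = consumer.Consume(cts.Token);
        Console.WriteLine($"receive message:{result.Message.Value}");
        consumer.Commit(result);
    }
}
catch (OperationCanceledException)
{
    // Ctrl+C pressed; fall through to cleanup.
}
finally
{
    // Leaves the consumer group cleanly so partitions are reassigned quickly.
    consumer.Close();
}
```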
Make sure you understand Kafka's messaging model and consumption concepts.
Recommended reading: 《深入理解Kafka:核心设计与实践原理》 (Understanding Kafka in Depth: Core Design and Practice).