Integrating Kafka with Spring to Send Messages

Date: 2023-03-08 19:12:45

Prerequisites

1. Set up a Kafka + ZooKeeper environment.

2. Create the topic in advance from the command line; here we create a topic named my-topic (the exact command is sketched below).
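A minimal sketch of the topic-creation command, assuming a Kafka version that still manages topics through the --zookeeper flag (adjust the ZooKeeper address, partition count, and replication factor to your environment):

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic my-topic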

Integration Steps

1. Configure the Producer

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="
           http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Producer properties -->
    <bean id="producerProperties" class="java.util.HashMap">
        <constructor-arg>
            <map>
                <!-- Kafka broker list -->
                <entry key="bootstrap.servers" value="192.168.1.88:9001,192.168.1.88:9002"/>
                <!-- Consumer group id (a consumer-side setting; the producer ignores it) -->
                <entry key="group.id" value="group1"/>
                <entry key="acks" value="all"/>
                <entry key="retries" value="10"/>
                <entry key="batch.size" value="16384"/>
                <entry key="linger.ms" value="1"/>
                <entry key="buffer.memory" value="33554432"/>
                <entry key="key.serializer" value="org.apache.kafka.common.serialization.StringSerializer"/>
                <entry key="value.serializer" value="com.redxun.jms.ObjectSerializer"/>
            </map>
        </constructor-arg>
    </bean>

    <!-- ProducerFactory used to create the KafkaTemplate -->
    <bean id="producerFactory" class="org.springframework.kafka.core.DefaultKafkaProducerFactory">
        <constructor-arg>
            <ref bean="producerProperties"/>
        </constructor-arg>
    </bean>

    <!-- KafkaTemplate bean; inject this bean wherever needed and call its send methods -->
    <bean id="kafkaTemplate" class="org.springframework.kafka.core.KafkaTemplate">
        <constructor-arg ref="producerFactory"/>
        <constructor-arg name="autoFlush" value="true"/>
        <property name="defaultTopic" value="my-topic"/>
    </bean>
</beans>
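Note: the autoFlush=true constructor argument makes the KafkaTemplate flush the producer after every send, which keeps the example simple at the cost of throughput. The group.id entry is only meaningful on the consumer side and has no effect on producer behavior.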

2. Configure the Consumer

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:int="http://www.springframework.org/schema/integration"
       xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
       xmlns:task="http://www.springframework.org/schema/task"
       xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka.xsd
           http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
           http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">

    <!-- Consumer properties -->
    <bean id="consumerProperties" class="java.util.HashMap">
        <constructor-arg>
            <map>
                <!-- Kafka broker list -->
                <entry key="bootstrap.servers" value="192.168.1.88:9001,192.168.1.88:9002"/>
                <!-- Consumer group id -->
                <entry key="group.id" value="group1"/>
                <entry key="enable.auto.commit" value="true"/>
                <entry key="auto.commit.interval.ms" value="1000"/>
                <entry key="session.timeout.ms" value="30000"/>
                <entry key="key.deserializer" value="org.apache.kafka.common.serialization.StringDeserializer"/>
                <entry key="value.deserializer" value="com.redxun.jms.ObjectDeSerializer"/>
            </map>
        </constructor-arg>
    </bean>

    <!-- ConsumerFactory bean -->
    <bean id="consumerFactory" class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
        <constructor-arg>
            <ref bean="consumerProperties"/>
        </constructor-arg>
    </bean>

    <!-- The class that actually consumes the messages -->
    <bean id="messageListenerConsumerService" class="com.redxun.jms.KafkaConsumerListener"/>

    <!-- Listener container configuration -->
    <bean id="containerProperties" class="org.springframework.kafka.listener.config.ContainerProperties">
        <!-- Important: the topic to consume -->
        <constructor-arg value="my-topic"/>
        <property name="messageListener" ref="messageListenerConsumerService"/>
    </bean>

    <!-- The message listener container; it polls the broker and dispatches records to the listener -->
    <bean id="messageListenerContainer" class="org.springframework.kafka.listener.KafkaMessageListenerContainer" init-method="doStart">
        <constructor-arg ref="consumerFactory"/>
        <constructor-arg ref="containerProperties"/>
    </bean>
</beans>
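Note: with enable.auto.commit=true, offsets are committed automatically every auto.commit.interval.ms (one second here), so a record that throws an exception in the listener can still be marked as consumed; switch to manual offset management if at-least-once processing matters. The org.springframework.kafka.listener.config.ContainerProperties class name matches older spring-kafka releases; in newer releases the class lives in org.springframework.kafka.listener.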

3. Message Serialization and Deserialization

Since we may send objects rather than just strings, the payload has to be serialized on the producer side and deserialized on the consumer side; the serializer and deserializer classes are registered in the configuration files above.

Serializer code

package com.redxun.jms;

import java.util.Map;

import org.apache.kafka.common.serialization.Serializer;

import com.redxun.core.util.FileUtil;

public class ObjectSerializer implements Serializer<Object> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public byte[] serialize(String topic, Object data) {
        try {
            // Convert the object to a byte array using the project's FileUtil helper
            return FileUtil.objToBytes(data);
        } catch (Exception e) {
            return null;
        }
    }

    @Override
    public void close() {
    }
}

Deserializer code

package com.redxun.jms;

import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;

import com.redxun.core.util.FileUtil;

public class ObjectDeSerializer implements Deserializer<Object> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public Object deserialize(String topic, byte[] data) {
        try {
            // Rebuild the original object from the byte array
            return FileUtil.bytesToObject(data);
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }

    @Override
    public void close() {
    }
}
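FileUtil is an internal utility class of the author's project and is not shown in the post. A minimal sketch of what objToBytes and bytesToObject could look like, assuming they are thin wrappers around standard Java object serialization (payload classes would then have to implement java.io.Serializable):

package com.redxun.core.util;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

// Hypothetical implementation; the real FileUtil in the project may differ.
public class FileUtil {

    // Serialize an object into a byte array using Java serialization.
    public static byte[] objToBytes(Object obj) throws Exception {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
            oos.flush();
            return bos.toByteArray();
        }
    }

    // Rebuild the object from the byte array produced by objToBytes.
    public static Object bytesToObject(byte[] bytes) throws Exception {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }
}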

4. Sending a Message

OsUser user = new OsUser();
user.setUserId("00001");
user.setFullname("zyg");
kafkaTemplate.sendDefault("demo", user);
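For context, a sketch of a complete sender class, assuming the kafkaTemplate bean defined in the producer configuration is injected (the class name UserMessageSender is hypothetical):

package com.redxun.jms;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

import com.redxun.sys.org.entity.OsUser;

// Hypothetical wrapper class; only the injected kafkaTemplate bean and the sendDefault call come from the post.
@Service
public class UserMessageSender {

    @Autowired
    private KafkaTemplate<String, Object> kafkaTemplate;

    public void send() {
        OsUser user = new OsUser();
        user.setUserId("00001");
        user.setFullname("zyg");
        // sendDefault publishes to the defaultTopic configured on the template ("my-topic");
        // "demo" is the record key.
        kafkaTemplate.sendDefault("demo", user);
    }
}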

5. Receiving Messages

package com.redxun.jms;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.listener.MessageListener;

import com.redxun.sys.org.entity.OsUser;

public class KafkaConsumerListener implements MessageListener<String, Object> {

    @Override
    public void onMessage(ConsumerRecord<String, Object> record) {
        if (record.value() instanceof OsUser) {
            OsUser user = (OsUser) record.value();
            System.out.println(user.getFullname());
        }
    }
}
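Because the configured value deserializer rebuilds the original object, the listener can cast record.value() directly. Assuming FileUtil is based on standard Java serialization, OsUser (and any other payload class) must implement java.io.Serializable and be present on both the producer and consumer classpath.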

6. Notes

In the Kafka broker configuration file (server.properties), the following must be set so that clients can reach the broker:

host.name=<broker IP address>

port=<broker port>
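For example, with the first broker address used in the configuration above (a sketch for older Kafka broker versions; newer brokers configure this through listeners/advertised.listeners instead):

host.name=192.168.1.88
port=9001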
