Using Kafka and Elasticsearch in a Java Program

 

1. XML configuration file (Kafka producer)

<bean id="kafkaProducerProperites" class="java.util.HashMap">
        <constructor-arg>
            <map>
                <entry key="bootstrap.servers" value="${kafka.host}" />
                <entry key="group.id" value="${kafka.group.id}"/>
                <entry key="auto.offset.reset" value="latest"/>
                <entry key="enable.auto.commit" value="true"/>
                <entry key="auto.commit.interval" value="100"/>
                <entry key="retries" value="3"/>
                <entry key="batch.size" value="16384"/>
                <entry key="linger.ms" value="1"/>
                <entry key="key.serializer" value="org.apache.kafka.common.serialization.StringSerializer" />
                <entry key="value.serializer" value="org.apache.kafka.common.serialization.StringSerializer" />
            </map>

        </constructor-arg>
    </bean>

    <bean id="kafkaProducerFactory" class="org.springframework.kafka.core.DefaultKafkaProducerFactory">
        <constructor-arg ref="kafkaProducerProperties"/>
    </bean>

    <bean id="kafkaTemplate" class="org.springframework.kafka.core.KafkaTemplate">
        <constructor-arg ref="kafkaProducerFactory" />
        <constructor-arg name="autoFlush" value="true"/>
        <property name="defaultTopic" value="defaultTopic"/>
    </bean>

When using Kafka, inject the KafkaTemplate and call send:

kafkaTemplate.send(SaleGoodsConstants.TOPIC_SALE_GOODS_NEW, str);

SaleGoodsConstants.TOPIC_SALE_GOODS_NEW is the name of the topic the message is sent to.

str is the message payload. Note that if there is a large amount of data, it should be split into several smaller messages here; a sketch of that is shown below.
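A minimal sketch of such a producer, assuming a hypothetical SaleGoodsProducer service. The envelope fields (eventType, delAll, data) mirror what the consumer in step 3 reads; the chunk size of 500 is only an illustration.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.annotation.Resource;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
import com.alibaba.fastjson.JSONObject;

@Service
public class SaleGoodsProducer {

	// assumed chunk size; keep each message well below Kafka's maximum message size
	private static final int CHUNK_SIZE = 500;

	@Resource
	private KafkaTemplate<String, String> kafkaTemplate;

	/** Splits a large goods list into smaller messages and sends one envelope per chunk. */
	public void sendSaleGoods(List<SaleGoodsElastic> goods, boolean delAll) {
		for (int from = 0; from < goods.size(); from += CHUNK_SIZE) {
			List<SaleGoodsElastic> chunk = goods.subList(from, Math.min(from + CHUNK_SIZE, goods.size()));
			Map<String, Object> envelope = new HashMap<String, Object>();
			envelope.put("eventType", "INSERT");                  // "INSERT" or "DELETE", as read by the consumer
			envelope.put("delAll", delAll && from == 0);          // rebuild the index only with the first chunk
			envelope.put("data", JSONObject.toJSONString(chunk)); // JSON array string, parsed back with parseArray
			kafkaTemplate.send(SaleGoodsConstants.TOPIC_SALE_GOODS_NEW, JSONObject.toJSONString(envelope));
		}
	}
}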

2. Create the Kafka listener class

@Component
public class SaleGoodsConsumer {
	private static Logger logger = LoggerFactory.getLogger(SaleGoodsConsumer.class);

	@Resource
	private IMq2EsService mq2EsService;


	@KafkaListener(topics = { TopicConstants.TOPIC_SALE_GOODS_NEW })
	public void listenTableSaleGoodsNew(ConsumerRecord<?, ?> cr) {
		String mqJson = "" + cr.value();
		logger.debug("Receive [{}]:{}", cr.topic(), cr);
		try {
			mq2EsService.consumeSaleGoodsElastic(mqJson);
		} catch (Exception e) {
			logger.warn("listenTableSaleGoodsNew process is in error:" + e, e);
		}
	}
}
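Note that @KafkaListener only works when consumer-side beans are registered as well (and annotation-driven listener processing is enabled, for example with @EnableKafka on a @Configuration class). A minimal XML sketch, reusing the ${kafka.*} placeholders from step 1; bean names and values here are illustrative:

<bean id="kafkaConsumerProperties" class="java.util.HashMap">
    <constructor-arg>
        <map>
            <entry key="bootstrap.servers" value="${kafka.host}"/>
            <entry key="group.id" value="${kafka.group.id}"/>
            <entry key="enable.auto.commit" value="true"/>
            <entry key="auto.commit.interval.ms" value="100"/>
            <entry key="auto.offset.reset" value="latest"/>
            <entry key="key.deserializer" value="org.apache.kafka.common.serialization.StringDeserializer"/>
            <entry key="value.deserializer" value="org.apache.kafka.common.serialization.StringDeserializer"/>
        </map>
    </constructor-arg>
</bean>

<bean id="kafkaConsumerFactory" class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
    <constructor-arg ref="kafkaConsumerProperties"/>
</bean>

<!-- kafkaListenerContainerFactory is the default bean name that @KafkaListener looks for -->
<bean id="kafkaListenerContainerFactory" class="org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory">
    <property name="consumerFactory" ref="kafkaConsumerFactory"/>
</bean>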

3. Process the Kafka message, using the ElasticsearchTemplate class

@Override
	public void consumeSaleGoodsElastic(String mqJson){
		logger.debug("consumeSaleGoodsElastic begin - consuming goods details from Kafka and submitting them to ES: " + System.currentTimeMillis());
		List<SaleGoodsElastic> saleGoodsElastics = new ArrayList<SaleGoodsElastic>();
		Map<String, Object> map = CommonUtils.jsonToObject(Map.class, mqJson);
		String data = "" + map.get("data");
		String eventType = "" + map.get("eventType");
		Boolean bool = (Boolean) map.get("delAll");
		if ("INSERT".equalsIgnoreCase(eventType)) {
			saleGoodsElastics = JSONObject.parseArray(data, SaleGoodsElastic.class);
			List<IndexQuery> indexs = new ArrayList<IndexQuery>();
			for (SaleGoodsElastic product : saleGoodsElastics) {
				IndexQuery indexquery = new IndexQuery();
				indexquery.setId(product.getSaleGoodsId());
				indexquery.setObject(product);
				indexs.add(indexquery);
			}
			// delAll=true means rebuild: drop the whole index before re-creating it
			if (Boolean.TRUE.equals(bool)) {
				elasticsearchTemplate.deleteIndex(SaleGoodsElastic.class);
			}
			elasticsearchTemplate.createIndex(SaleGoodsElastic.class);
			elasticsearchTemplate.putMapping(SaleGoodsElastic.class);
			elasticsearchTemplate.bulkIndex(indexs);
			elasticsearchTemplate.refresh(SaleGoodsElastic.class);
			logger.debug("consumeSaleGoodsElastic end - goods details consumed from Kafka and submitted to ES: " + System.currentTimeMillis());
		} else if ("DELETE".equalsIgnoreCase(eventType)) {
			// for DELETE events the payload is a list of document ids
			List<String> ids = JSONObject.parseArray(data, String.class);
			for (int i = 0; i < ids.size(); i++) {
				elasticsearchTemplate.delete(SaleGoodsElastic.class, ids.get(i));
			}
		}
	}
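The calls to createIndex, putMapping and bulkIndex above rely on the Spring Data Elasticsearch annotations on SaleGoodsElastic, which the article does not show. A hedged sketch of what the entity could look like; the index and type names are assumptions, while saleGoodsId, appId, price and stock all appear elsewhere in this article:

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;

@Document(indexName = "sale_goods", type = "saleGoods") // index/type names are assumed
public class SaleGoodsElastic {

	@Id
	private String saleGoodsId; // used as the ES document id via indexquery.setId(...)

	private String appId;       // filtered with termQuery("appId", appId) in the search code
	private Double price;       // sortable field
	private Integer stock;      // sortable field

	public String getSaleGoodsId() { return saleGoodsId; }
	public void setSaleGoodsId(String saleGoodsId) { this.saleGoodsId = saleGoodsId; }
	// getters and setters for the remaining fields omitted
}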

4. Searching ES: paged queries

		Integer pageNumber1 = 0;
		Integer pageSize1 = 1000;
		Pageable pages = new PageRequest(pageNumber1, pageSize1); // PageRequest is zero-based
		// exact-match filter
		QueryBuilder queryShopType = QueryBuilders.termQuery("appId", appId);
		// full-text search
		QueryBuilder querycontent = QueryBuilders.queryStringQuery(key);
		// combine both conditions (must = logical AND)
		QueryBuilder querys = QueryBuilders.boolQuery().must(queryShopType).must(querycontent);
		//SearchQuery searchQuery1 = new NativeSearchQueryBuilder().withQuery(QueryBuilders.queryStringQuery(key)).withPageable(pages).build(); // variant without the appId filter
		SearchQuery searchQuery1 = new NativeSearchQueryBuilder().withQuery(querys).withPageable(pages).build();
		List<SaleGoodsElastic> shops = elasticsearchTemplate.queryForList(searchQuery1, SaleGoodsElastic.class);
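queryForList only returns the documents of the requested page. If the total hit count is needed in the same call, queryForPage returns both; a small sketch using the same searchQuery1 as above:

		Page<SaleGoodsElastic> resultPage = elasticsearchTemplate.queryForPage(searchQuery1, SaleGoodsElastic.class);
		long totalHits = resultPage.getTotalElements();               // total number of matching documents
		List<SaleGoodsElastic> pageContent = resultPage.getContent(); // the documents of the requested page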

Counting how many documents in ES match a query

		QueryBuilder appIdKey = QueryBuilders.termQuery("appId", appId);
		QueryBuilder queryBuilder = QueryBuilders.queryStringQuery(key);
		QueryBuilder query = QueryBuilders.boolQuery().must(queryBuilder).must(appIdKey);
		SearchQuery searchQuery = new NativeSearchQueryBuilder().withQuery(query).build();
		int count = (int) elasticsearchTemplate.count(searchQuery, SaleGoodsElastic.class);
		if (count == 0) {
			return null;
		}
		int pageNo = 0, pageSize = 20; // defaults
		if (page != null && page.getPageNo() >= 1) {
			pageNo = page.getPageNo() - 1; // PageRequest is zero-based
		}
		if (page != null && page.getPageSize() > 0) {
			pageSize = page.getPageSize();
		}
		searchQuery.setPageable(new PageRequest(pageNo, pageSize));
		if (page != null) {
			page.setTotal(count);
		}
		// Sorting: releaseTime is the default; price = sort by price, stock = sort by sales volume, rebate = sort by rebate
		String sortFieldValue = null;
		Direction direction = Direction.ASC;

		if (sortFiled != null) {
			String[] fields = {"price", "stock"};
			List<String> listFields = Arrays.asList(fields);
			if (listFields.contains(sortFiled)) {
				sortFieldValue = sortFiled;
			}
		}

		if (ascOrDesc != null && ascOrDesc.equalsIgnoreCase("DESC")) {
			direction = Direction.DESC;
		}
		if (StringUtils.isNotBlank(sortFieldValue)) {
			searchQuery.addSort(new Sort(direction, sortFieldValue));
		}

		Iterable<SaleGoodsElastic> iter = elasticsearchTemplate.queryForPage(searchQuery, SaleGoodsElastic.class);
		if (iter == null) {
			return null;
		}
		List<String> list = new ArrayList<String>();
		for (SaleGoodsElastic model : iter) {
			list.add(model.getSaleGoodsId());
		}
		return list;
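The fragment above is the body of a search method whose signature the article does not show. A hedged guess at the enclosing method, based only on the variables it uses (Page here is assumed to be a custom paging object with getPageNo/getPageSize/setTotal, not Spring's):

	public List<String> searchSaleGoodsIds(String appId, String key, Page page, String sortFiled, String ascOrDesc) {
		// ... body as shown above ...
	}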

 
