Integrating Elasticsearch with Spring

There are currently four common client connection options in the Elasticsearch Java API landscape:

  • TransportClient (not recommended): Elasticsearch's original native API. TransportClient supports the 2.x and 5.x versions; it was deprecated in Elasticsearch 7.0 and completely removed in 8.0.
  • RestClient: officially recommended by Elasticsearch.
  • Jest (not recommended): a Java HTTP REST client for Elasticsearch developed by the Java community.
  • Spring Data Elasticsearch: integrates with the Spring ecosystem and can be wired into Spring-based web applications, but it is prone to version conflicts with Spring Boot and Spring Data and often lags behind Elasticsearch releases; for example, Spring Boot 2.3.1.RELEASE (current at the time of writing) supports Elasticsearch 7.6.2.

In terms of usage, Spring Data's mission is to provide a unified programming interface for all kinds of data access, whether a relational database (such as MySQL), a non-relational store (such as Redis), or an index store like Elasticsearch, in order to simplify application code and improve development efficiency. In other words, Spring Data tries to abstract access to any data store behind similar interfaces. As a consequence, Spring Data Elasticsearch handles basic queries well, but it struggles with complex ones (fuzzy, wildcard, match and aggregation queries, for example), and in those cases we still have to fall back to native queries.

So we will focus on the REST clients. The Java REST Client comes in two flavors, Low Level and High Level:

  • Java Low Level REST Client: with this client you have to assemble the HTTP request body into JSON by hand, and the JSON returned in the HTTP response also has to be mapped to objects manually, so it is fairly low-level to use.
  • Java High Level REST Client: built on top of the Low Level client, it provides an API that removes the manual data-format conversion required by the Low Level client.

The official Elasticsearch site already provides a very detailed API reference; see
https://www.elastic.co/guide/en/elasticsearch/client/java-rest/current/index.html

Using the Java Low Level REST Client

Add the dependency in pom.xml, keeping its version consistent with your Elasticsearch version:

<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-client</artifactId>
    <version>7.7.0</version>
</dependency>

Demo code:

package com.morris.es.low;

import org.apache.http.HttpEntity;
import org.apache.http.HttpHost;
import org.apache.http.entity.ContentType;
import org.apache.http.nio.entity.NStringEntity;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

import java.io.IOException;

public class LowDemo {

    public static void main(String[] args) throws IOException {

        RestClient restClient = RestClient.builder(
                new HttpHost("10.0.44.5", 9200, "http")
        ).build();

        String indexName = "low_test";
        String docId = "1";

        createIndex(restClient, indexName);

        addDoc(restClient, indexName, docId);

        queryDoc(restClient, indexName, docId);

        restClient.close();
    }

    private static void createIndex(RestClient restClient, String indexName) throws IOException {
        // PUT /<index> creates the index with default settings and mappings
        Request request = new Request("PUT", "/" + indexName);
        Response response = restClient.performRequest(request);
        System.out.println(response);
    }

    private static void addDoc(RestClient restClient, String indexName, String docId) throws IOException {
        // the request body has to be assembled as a JSON string by hand
        String jsonString = "{" +
                "\"msg\":\"Java Low Level REST Client\"" +
                "}";
        HttpEntity entity = new NStringEntity(jsonString, ContentType.APPLICATION_JSON);
        Request request = new Request("PUT", "/" + indexName + "/_doc/" + docId);
        request.setEntity(entity);
        Response response = restClient.performRequest(request);
        System.out.println(response);
    }

    private static void queryDoc(RestClient restClient, String indexName, String docId) throws IOException {
        Request request = new Request("GET", "/" + indexName + "/_doc/" + docId);
        Response response = restClient.performRequest(request);
        // the response body also has to be read and parsed manually
        System.out.println(EntityUtils.toString(response.getEntity()));
    }

}
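
Besides the synchronous performRequest calls used above, the low-level client also offers performRequestAsync, which takes an org.elasticsearch.client.ResponseListener callback. A minimal sketch, reusing the restClient, indexName and docId variables from the demo (the client must not be closed before the callback has fired):

// Asynchronous variant of queryDoc: the listener is invoked on a client thread
// once the response (or an error) arrives.
Request request = new Request("GET", "/" + indexName + "/_doc/" + docId);
restClient.performRequestAsync(request, new ResponseListener() {
    @Override
    public void onSuccess(Response response) {
        try {
            System.out.println(EntityUtils.toString(response.getEntity()));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onFailure(Exception exception) {
        exception.printStackTrace();
    }
});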

Java High Level REST Client

Add the dependency in pom.xml, keeping its version consistent with your Elasticsearch version:

<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-high-level-client</artifactId>
    <version>7.4.0</version>
</dependency>

Obtaining a connection:

RestClientBuilder restClientBuilder =
        RestClient.builder(
                new HttpHost("10.0.73.146", 9200, "http")
        );
RestHighLevelClient restHighLevelClient = new RestHighLevelClient(restClientBuilder);
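
The high-level client wraps a pool of HTTP connections, so it should be closed once it is no longer needed; the builder also accepts several nodes. A minimal sketch (the second host below is a placeholder, not part of the original setup):

// Several nodes can be registered with the builder; requests are load-balanced across them.
RestHighLevelClient client = new RestHighLevelClient(
        RestClient.builder(
                new HttpHost("10.0.73.146", 9200, "http"),
                new HttpHost("10.0.73.147", 9200, "http"))); // placeholder second node

// ... use the client ...

// Release the underlying connections when done (in a Spring application the client is
// usually exposed as a bean with destroyMethod = "close" so the container handles this).
client.close();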

Index management

Creating an index

DSL:

PUT high_test
{
  "mappings": {
    "properties": {
      "age": {
        "type": "integer"
      },
      "content": {
        "type": "text"
      },
      "firstName": {
        "type": "keyword"
      },
      "secondName": {
        "type": "keyword"
      }
    }
  }
}

The code is as follows:

/* build the mappings body, equivalent to the "properties" object in the DSL above */
XContentBuilder xContentBuilder = XContentFactory.jsonBuilder()
        .startObject()
        .field("properties").startObject()
        .field("firstName").startObject()
        .field("type", "keyword")
        .endObject()
        .field("secondName").startObject()
        .field("type", "keyword")
        .endObject()
        .field("age").startObject()
        .field("type", "integer")
        .endObject()
        .field("content").startObject()
        .field("type", "text")
        .endObject()
        .endObject()
        .endObject();

CreateIndexRequest createIndexRequest = new CreateIndexRequest(indexName);
createIndexRequest.mapping(xContentBuilder);

CreateIndexResponse createIndexResponse
        = restHighLevelClient.indices().create(createIndexRequest,
        RequestOptions.DEFAULT);
System.out.println(createIndexResponse.index());
System.out.println(createIndexResponse.isAcknowledged());
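
The mapping can also be supplied as a raw JSON string instead of an XContentBuilder, which keeps the Java code closer to the DSL shown above. A minimal sketch of that variant:

// Equivalent to the XContentBuilder version: the mapping is passed as JSON text.
CreateIndexRequest createIndexRequest = new CreateIndexRequest(indexName);
createIndexRequest.mapping(
        "{\n" +
        "  \"properties\": {\n" +
        "    \"age\":        { \"type\": \"integer\" },\n" +
        "    \"content\":    { \"type\": \"text\" },\n" +
        "    \"firstName\":  { \"type\": \"keyword\" },\n" +
        "    \"secondName\": { \"type\": \"keyword\" }\n" +
        "  }\n" +
        "}", XContentType.JSON);
CreateIndexResponse createIndexResponse =
        restHighLevelClient.indices().create(createIndexRequest, RequestOptions.DEFAULT);
System.out.println(createIndexResponse.isAcknowledged());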

Checking whether an index exists

DSL:

HEAD high_test

The code is as follows:

GetIndexRequest getIndexRequest = new GetIndexRequest(indexName);
boolean exists = restHighLevelClient.indices().exists(getIndexRequest, RequestOptions.DEFAULT);
System.out.println(exists);

Deleting an index

DSL:

DELETE high_test

The code is as follows:

DeleteIndexRequest deleteIndexRequest = new DeleteIndexRequest(indexName);
AcknowledgedResponse acknowledgedResponse = restHighLevelClient.indices().delete(deleteIndexRequest, RequestOptions.DEFAULT);
System.out.println(acknowledgedResponse.isAcknowledged());

Document management

Creating a document

DSL:

PUT high_test/_doc/777
{
  "age": 18,
  "content": "hello world",
  "firstName": "morris",
  "id": 666,
  "secondName": "chen"
}

The code is as follows:

User user = new User();
user.setId(666L);
user.setFirstName("morris");
user.setSecondName("chen");
user.setAge(18);
user.setContent("hello world");

IndexRequest indexRequest = new IndexRequest(indexName);
indexRequest.id(String.valueOf(user.getId()));
indexRequest.source(JSON.toJSONString(user), XContentType.JSON);
IndexResponse indexResponse =
        restHighLevelClient.index(indexRequest, RequestOptions.DEFAULT);
if (indexResponse != null) {
    String id = indexResponse.getId();
    String index = indexResponse.getIndex();
    System.out.println(id);
    System.out.println(index);
    if (indexResponse.getResult() == DocWriteResponse.Result.CREATED) {
        System.out.println("Document created");
    } else if (indexResponse.getResult() == DocWriteResponse.Result.UPDATED) {
        System.out.println("Existing document overwritten");
    }
}
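
When many documents have to be written, indexing them one at a time as above is inefficient. The high-level client also offers a BulkRequest that batches several index operations into a single call; a minimal sketch (the document ids and field values here are illustrative):

// Group several index operations into one round trip.
BulkRequest bulkRequest = new BulkRequest();
bulkRequest.add(new IndexRequest(indexName).id("667")
        .source(XContentType.JSON, "firstName", "alice", "age", 20));
bulkRequest.add(new IndexRequest(indexName).id("668")
        .source(XContentType.JSON, "firstName", "bob", "age", 22));
BulkResponse bulkResponse = restHighLevelClient.bulk(bulkRequest, RequestOptions.DEFAULT);
if (bulkResponse.hasFailures()) {
    System.out.println(bulkResponse.buildFailureMessage());
}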

Retrieving a document

DSL:

GET high_test/_doc/666

The code is as follows:

GetRequest getRequest = new GetRequest(indexName, "666");
GetResponse getResponse = restHighLevelClient.get(getRequest, RequestOptions.DEFAULT);
System.out.println(getResponse.getSource());

Updating a document

DSL:

POST high_test/_update/666
{
  "doc": {
    "name": "morris131"
  },
  "doc_as_upsert": true
}

The code is as follows:

XContentBuilder xContentBuilder = XContentFactory.jsonBuilder();
xContentBuilder.startObject();
{
    xContentBuilder.field("name", "morris131");
}
xContentBuilder.endObject();
UpdateRequest request =
        new UpdateRequest(indexName, "666").doc(xContentBuilder);
request.docAsUpsert(true);
request.fetchSource(true); /* include the current document source in the response */
UpdateResponse updateResponse =
        restHighLevelClient.update(request, RequestOptions.DEFAULT);
GetResult getResult = updateResponse.getGetResult();
if (getResult.isExists()) {
    String sourceAsString = getResult.sourceAsString();
    System.out.println(sourceAsString);
} else {
    System.out.println("更新失败");
}
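
Besides a partial document, an update can also be driven by a script. A minimal sketch of a scripted update (the Painless script and its parameter are illustrative):

// Scripted update: increments the age field by a parameterized amount.
Map<String, Object> params = Collections.singletonMap("increment", 1);
Script script = new Script(ScriptType.INLINE, "painless",
        "ctx._source.age += params.increment", params);
UpdateRequest scriptedUpdate = new UpdateRequest(indexName, "666").script(script);
UpdateResponse scriptedResponse =
        restHighLevelClient.update(scriptedUpdate, RequestOptions.DEFAULT);
System.out.println(scriptedResponse.getResult());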

Deleting a document

DSL:

DELETE high_test/_doc/777

The code is as follows:

DeleteRequest deleteRequest = new DeleteRequest(indexName, "666");
DeleteResponse deleteResponse =
        restHighLevelClient.delete(deleteRequest, RequestOptions.DEFAULT);
if (deleteResponse.getResult() == DocWriteResponse.Result.NOT_FOUND) {
    System.out.println("Document not found");
} else {
    System.out.println("Document deleted");
}

Basic search

DSL:

POST kibana_sample_data_flights/_search
{
  "query": {
    "match_all": {}
  },
  "_source": {
    "includes": [
      "Origin*",
      "*Weather"
    ]
  },
  "sort": {
    "DistanceKilometers": "asc",
    "FlightNum": "desc"
  },
  "from": 0,
  "size": 5
}

The code is as follows:

SearchRequest searchRequest = new SearchRequest();
searchRequest.indices(indexName);
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.from(0);
searchSourceBuilder.size(5);
searchSourceBuilder.query(QueryBuilders.matchAllQuery());
String[] includeFields = new String[]{"Origin*", "*Weather"};
searchSourceBuilder.fetchSource(includeFields, null);
searchSourceBuilder.sort(new FieldSortBuilder("DistanceKilometers").order(SortOrder.ASC));
searchSourceBuilder.sort(new FieldSortBuilder("FlightNum").order(SortOrder.DESC));
searchRequest.source(searchSourceBuilder);
SearchResponse search = restHighLevelClient.search(searchRequest, RequestOptions.DEFAULT);
SearchHits hits = search.getHits();
for (SearchHit hit : hits) {
    String src = hit.getSourceAsString();
    System.out.println(src);
}
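
Each hit also exposes its id, score and a parsed source map, and the response carries the total hit count. A short sketch continuing from the response above (OriginWeather is one of the fields kept by the source filter):

// Total number of matching documents (capped at 10,000 by default unless trackTotalHits is enabled).
System.out.println(hits.getTotalHits().value);
for (SearchHit hit : hits) {
    System.out.println(hit.getId());                   // document _id
    System.out.println(hit.getScore());                // relevance score
    Map<String, Object> source = hit.getSourceAsMap(); // _source as a Map
    System.out.println(source.get("OriginWeather"));
}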

Aggregation search

DSL:

POST kibana_sample_data_flights/_search?filter_path=aggregations
{
  "query": {
    "term": {
      "OriginCountry": "CN"
    }
  },
  "aggs": {
    "month_price_histogram": {
      "date_histogram": {
        "field": "timestamp",
        "fixed_interval": "30d"
      },
      "aggs": {
        "avg_delay": {
          "avg": {
            "field": "FlightDelayMin"
          }
        }
      }
    }
  }
}

The code is as follows:

SearchRequest searchRequest = new SearchRequest();
searchRequest.indices(indexName);

/* query part */
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.query(
        QueryBuilders.termQuery("OriginCountry", "CN"));

/* aggregation part: 30-day date histogram with a nested average */
DateHistogramAggregationBuilder date_price_histogram
        = AggregationBuilders.dateHistogram("month_price_histogram");
date_price_histogram.field("timestamp")
        .fixedInterval(DateHistogramInterval.days(30));
date_price_histogram.subAggregation(
        AggregationBuilders.avg("avg_delay").field("FlightDelayMin")
);
searchSourceBuilder.aggregation(date_price_histogram);
searchRequest.source(searchSourceBuilder);

JSONArray jsonArray = new JSONArray();

SearchResponse searchResponse = restHighLevelClient.search(searchRequest, RequestOptions.DEFAULT);

Aggregations aggregations = searchResponse.getAggregations();
for (Aggregation aggregation : aggregations) {
    String aggString = JSON.toJSONString(aggregation);
    jsonArray.add(JSON.parseObject(aggString));
    List<? extends Histogram.Bucket> buckets
            = ((Histogram) aggregation).getBuckets();
    for (Histogram.Bucket bucket : buckets) {
        System.out.println("--------------------------------------");
        System.out.println(bucket.getKeyAsString());
        System.out.println(bucket.getDocCount());
        ParsedAvg parsedAvg
                = (ParsedAvg) bucket.getAggregations().getAsMap().get("avg_delay");
        System.out.println(parsedAvg.getValueAsString());
    }
}
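
Instead of iterating over all aggregations and casting, a single aggregation can be fetched by name with a typed get, which reads more directly. A sketch against the same response:

// Fetch the histogram under the name it was registered with.
Histogram histogram = searchResponse.getAggregations().get("month_price_histogram");
for (Histogram.Bucket bucket : histogram.getBuckets()) {
    Avg avgDelay = bucket.getAggregations().get("avg_delay");
    System.out.println(bucket.getKeyAsString() + " -> " + avgDelay.getValue());
}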

Spring Data Elasticsearch

Add the dependencies:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
    <version>2.6.1</version>
</dependency>

<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-high-level-client</artifactId>
    <version>7.15.2</version>
</dependency>

Specify the Elasticsearch host in the configuration file:

spring.elasticsearch.uris=http://10.0.73.146:9200
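
The search code below maps each hit onto a Log entity class that is not shown in the snippet. A minimal sketch of such an entity, assuming only a few of the fields in the kibana_sample_data_logs index (agent, timestamp, message) and using the @Document, @Id and @Field annotations from Spring Data Elasticsearch:

// A hypothetical entity for the kibana_sample_data_logs index; only the fields
// touched here are mapped, everything else in _source is ignored.
@Document(indexName = "kibana_sample_data_logs")
public class Log {

    @Id
    private String id;

    @Field(type = FieldType.Text)
    private String agent;

    // kept as a plain String in this sketch to avoid date-conversion concerns
    private String timestamp;

    @Field(type = FieldType.Text)
    private String message;

    // getters, setters and toString omitted
}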

The search code is as follows:

@Resource
private ElasticsearchOperations elasticsearchOperations;

@GetMapping("search")
public String search() {
    Criteria criteria = new Criteria("agent").matches("firefox");
    Query query = new CriteriaQuery(criteria);
    query.setPageable(Pageable.ofSize(5).withPage(1)); // page index is zero-based, so this requests the second page of 5 hits
    query.addSort(Sort.by("timestamp").descending());
    IndexCoordinates indexCoordinates = IndexCoordinates.of("kibana_sample_data_logs");
    SearchHits<Log> searchHits = elasticsearchOperations.search(query, Log.class, indexCoordinates);
    for (SearchHit<Log> searchHit : searchHits.getSearchHits()) {
        System.out.println(searchHit.getContent());
    }
    return "OK";
}
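
For the more complex queries mentioned at the beginning (wildcard, aggregation and so on), Spring Data Elasticsearch can fall back to a native query built with the Elasticsearch QueryBuilders API while still mapping hits onto the entity. A minimal sketch, assuming the same Log entity and index (the wildcard pattern is illustrative):

// Falls back to the Elasticsearch QueryBuilders API while keeping Spring Data's entity mapping.
NativeSearchQuery nativeQuery = new NativeSearchQueryBuilder()
        .withQuery(QueryBuilders.wildcardQuery("agent", "*firefox*"))
        .withPageable(PageRequest.of(0, 5))
        .build();
SearchHits<Log> nativeHits = elasticsearchOperations.search(
        nativeQuery, Log.class, IndexCoordinates.of("kibana_sample_data_logs"));
nativeHits.forEach(hit -> System.out.println(hit.getContent()));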