@Resource and @Autowired
@Resource serves the same purpose as @Autowired; the difference is that @Autowired injects byType, while @Resource injects byName by default. @Resource has two important attributes, name and type: Spring resolves the name attribute to the bean's name and the type attribute to the bean's type. So specifying name selects the byName injection strategy, and specifying type selects byType. If neither name nor type is specified, the byName strategy is applied via reflection.
@Resource wiring order
1. If both name and type are specified, the unique matching bean is found in the Spring context and wired; if none is found, an exception is thrown.
2. If only name is specified, the bean whose name (id) matches is found in the context and wired; if none is found, an exception is thrown.
3. If only type is specified, the unique bean of the matching type is found in the context and wired; if none is found, or more than one is found, an exception is thrown.
4. If neither name nor type is specified, wiring is attempted byName; if that finds no match, it falls back to matching against the field's raw type, and wires the bean if a type match is found.
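The byName-then-byType fallback described above can be sketched with a toy container in plain Java. This is purely an illustration of the lookup order, not Spring's actual implementation; the container map, bean names, and exception messages are all made up:

```java
import java.util.HashMap;
import java.util.Map;

// Toy "container" mimicking @Resource's default lookup: byName first, then byType.
public class ResourceLookupDemo {
    static Map<String, Object> beansByName = new HashMap<>();

    static Object resolve(String name, Class<?> type) {
        // Step 1: byName — the field name (or name attribute) is matched against bean ids
        if (beansByName.containsKey(name)) {
            return beansByName.get(name);
        }
        // Step 2: fall back to byType — require exactly one bean of the requested type
        Object match = null;
        for (Object bean : beansByName.values()) {
            if (type.isInstance(bean)) {
                if (match != null) {
                    throw new IllegalStateException("multiple beans of type " + type);
                }
                match = bean;
            }
        }
        if (match == null) {
            throw new IllegalStateException("no bean found for " + name + " / " + type);
        }
        return match;
    }

    public static void main(String[] args) {
        beansByName.put("baseDao", "I am baseDao");
        beansByName.put("other", 42);
        System.out.println(resolve("baseDao", String.class));      // byName hit
        System.out.println(resolve("missingName", Integer.class)); // byType fallback
    }
}
```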
Differences between @Resource and @Autowired
1. Both @Autowired and @Resource can be used to wire beans, and both can be placed on a field or on a setter method.
2. @Autowired (a Spring annotation) wires by type by default and requires the dependency to exist. To allow a null value, set its required attribute to false, e.g. @Autowired(required = false). To wire by name instead, combine it with @Qualifier:
@Autowired
@Qualifier("baseDao")
private BaseDao baseDao;
3. @Resource (a Java EE / JSR-250 annotation) wires by name by default. The name can be specified via the name attribute; if it is omitted, the field name is used when the annotation sits on a field, and the property name when it sits on a setter. Only when no bean matches the name does it fall back to wiring by type. Note that once the name attribute is specified, wiring is done strictly by name.
@Resource(name = "baseDao")
private BaseDao baseDao;
Note: the Elasticsearch version used here is 6.2.3.
Elasticsearch configuration:
spring.elasticsearch.host = 100.98.45.221
spring.elasticsearch.port = 9300
elasticsearch.cluster.name = my-application
package com.itheima.springBoot.springBootelasticsearch.config;
import com.itheima.springBoot.springBootelasticsearch.utils.EmptyUtils;
import org.elasticsearch.action.bulk.BackoffPolicy;
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.TransportAddress;
import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.transport.client.PreBuiltTransportClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import javax.annotation.PostConstruct;
import java.net.InetAddress;
import java.net.UnknownHostException;
/**
* For writing data into ES we previously used the REST approach; after some testing we found that at the
* tens-of-millions scale it loses anywhere from ten to a few hundred records. In scenarios where the data
* must be accurate, the REST approach cannot guarantee correct results, so we switched to Elasticsearch's
* BulkProcessor for ingestion. The two approaches in fact use different ES clients: REST uses RestClient,
* based on HTTP, while BulkProcessor here uses TransportClient, based on TCP.
*/
@Configuration
public class ElasticsearchConfiguration {
private static final Logger logger = LoggerFactory.getLogger(ElasticsearchConfiguration.class);
/**
* Elasticsearch host
*/
@Value("${spring.elasticsearch.host}")
private String host;
/**
* Elasticsearch port
*/
@Value("${spring.elasticsearch.port}")
private Integer port;
/**
* cluster name
*/
@Value("${elasticsearch.cluster.name}")
private String clusterName;
private TransportClient transportClient;
@Bean
public TransportClient transportClient() {
Settings settings = Settings.EMPTY;
if (!EmptyUtils.isEmpty(clusterName)) {
// Assign the built settings; the original discarded the result of build()
settings = Settings.builder().put("cluster.name", clusterName).put("client.transport.sniff", true).build();
}
try {
transportClient = new PreBuiltTransportClient(settings).addTransportAddress(new TransportAddress(InetAddress.getByName(host), port));
logger.info("Elasticsearch client created successfully");
} catch (UnknownHostException e) {
// Log the failure with its stack trace; the original logged "success" even on failure
logger.error("Failed to create the Elasticsearch client", e);
}
return transportClient;
}
@Bean
public BulkProcessor bulkProcessor() {
Settings settings = Settings.EMPTY;
if (!EmptyUtils.isEmpty(clusterName)) {
settings = Settings.builder()
.put("cluster.name", clusterName)
.build();
}
TransportClient transportClient = null;
BulkProcessor build = null;
try {
// 1:Add your Elasticsearch client
transportClient = new PreBuiltTransportClient(settings).addTransportAddress(new TransportAddress(InetAddress.getByName(host), port));
build = BulkProcessor.builder(transportClient, new BulkProcessor.Listener() {
// 2:This method is called just before bulk is executed.
// You can for example see the numberOfActions with request.numberOfActions()
@Override
public void beforeBulk(long executionId, BulkRequest request) {
logger.info("numberOfActions:" + request.numberOfActions());
}
// 3:This method is called after bulk execution.
// You can for example check if there was some failing requests with response.hasFailures()
@Override
public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
logger.info("responseFailures:" + response.hasFailures());
}
// 4:This method is called when the bulk failed and raised a Throwable
@Override
public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
logger.error("Throwable Exception:" + request.numberOfActions(), failure);
}
})
.setBulkActions(10000) //5: We want to execute the bulk every 10 000 requests
.setBulkSize(new ByteSizeValue(5, ByteSizeUnit.MB)) // 6:We want to flush the bulk every 5mb
.setFlushInterval(TimeValue.timeValueSeconds(5)) // 7:We want to flush the bulk every 5 seconds whatever the number of requests
.setConcurrentRequests(1) // 8:Set the number of concurrent requests. A value of 0 means that only a single request will be allowed to be executed. A value of 1 means 1 concurrent request is allowed to be executed while accumulating new bulk requests.
.setBackoffPolicy(BackoffPolicy.exponentialBackoff(TimeValue.timeValueMillis(100), 3)).build();
// 9: Set a custom backoff policy that starts by waiting 100 ms, then grows exponentially and retries up to three times.
// A retry is attempted whenever one or more bulk item requests fail with an EsRejectedExecutionException,
// which indicates that too few compute resources were available to process the request. To disable backoff, pass BackoffPolicy.noBackoff().
} catch (UnknownHostException e) {
e.printStackTrace();
}
return build;
}
//A @PostConstruct method runs when the server loads the servlet, and is executed by the server only once.
// @PostConstruct executes after the constructor and before the init() method;
// a @PreDestroy method executes after the destroy() method.
@PostConstruct
void init() {
System.setProperty("es.set.netty.runtime.available.processors", "false");
}
}
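The builder above uses BackoffPolicy.exponentialBackoff(TimeValue.timeValueMillis(100), 3): wait 100 ms before the first retry, then back off exponentially across up to three retries. As a rough illustration of the shape of such a delay schedule (assumption: a simple doubling scheme; Elasticsearch's exact delay curve may differ):

```java
public class BackoffSketch {
    // Illustrative helper: wait time before the n-th retry (0-based)
    // for an exponential policy with the given initial delay in ms.
    static long delayMillis(long initialMillis, int retry) {
        return initialMillis << retry; // 100, 200, 400, ...
    }

    public static void main(String[] args) {
        int maxRetries = 3;
        for (int retry = 0; retry < maxRetries; retry++) {
            System.out.println("retry " + (retry + 1) + " after " + delayMillis(100, retry) + " ms");
        }
    }
}
```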
Model layer:
package com.itheima.springBoot.springBootelasticsearch.Model;
import lombok.Data;
@Data
public class Es {
private String index;
private String type;
public Es(String index, String type) {
this.index = index;
this.type = type;
}
}
package com.itheima.springBoot.springBootelasticsearch.Model;
import com.alibaba.fastjson.annotation.JSONField;
import lombok.Data;
import java.util.Date;
@Data
public class Order {
private long id;
@JSONField(name = "store_id")
private int storeId; // store ID
@JSONField(name = "store_name")
private String storeName; // store name
@JSONField(name = "category_id")
private int categoryId; // category ID
@JSONField(name = "category_code")
private String categoryCode; // category code
@JSONField(name = "product_code")
private String productCode; // article number
private int quantity; // units sold
private double amount; // sales amount
@JSONField(name = "pay_date")
private Date payDate; // payment date
public Order() {
}
public Order(long id, int storeId, String storeName,
int categoryId, String categoryCode,
String productCode, int quantity,
double amount, Date payDate) {
this.id = id;
this.storeId = storeId;
this.storeName = storeName;
this.categoryId = categoryId;
this.categoryCode = categoryCode;
this.productCode = productCode;
this.quantity = quantity;
this.amount = amount;
this.payDate = payDate;
}
}
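The @JSONField annotations above map the Java camelCase field names to the snake_case keys stored in ES. The mapping is a plain name conversion, which a small helper can illustrate (this helper is only for illustration; fastjson performs the renaming itself based on the annotations):

```java
public class SnakeCaseDemo {
    // Convert camelCase to snake_case, as the @JSONField names above do by hand.
    static String toSnakeCase(String camel) {
        StringBuilder sb = new StringBuilder();
        for (char c : camel.toCharArray()) {
            if (Character.isUpperCase(c)) {
                sb.append('_').append(Character.toLowerCase(c));
            } else {
                sb.append(c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toSnakeCase("storeId")); // store_id
        System.out.println(toSnakeCase("payDate")); // pay_date
    }
}
```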
package com.itheima.springBoot.springBootelasticsearch.Model;
import lombok.Data;
@Data
public class Student {
private String name;
private String age;
private String address;
}
DAO layer:
package com.itheima.springBoot.springBootelasticsearch.Dao;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.itheima.springBoot.springBootelasticsearch.Model.Student;
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.common.xcontent.XContentType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
@Repository
public class StudentInsertDao {
private static final Logger logger = LoggerFactory.getLogger(StudentInsertDao.class);
@Autowired
private BulkProcessor bulkProcessor;
private ObjectMapper objectMapper = new ObjectMapper();
public void insert(Student student) {
String type = student.getAge();
String id = student.getName() + student.getAddress() + student.getAge();
try {
byte[] json = objectMapper.writeValueAsBytes(student);
bulkProcessor.add(new IndexRequest("students", type, id).source(json, XContentType.JSON));
} catch (Exception e) {
logger.error("bulkProcessor add failed", e);
}
}
}
Service layer:
package com.itheima.springBoot.springBootelasticsearch.service;
public interface BulkProcessorService {
void insertById(String index, String type, String id, String jsonStr);
void updateById(String index, String type, String id, String jsonStr);
void deleteById(String index, String type, String id);
}
package com.itheima.springBoot.springBootelasticsearch.service;
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.delete.DeleteRequest;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.update.UpdateRequest;
import org.elasticsearch.common.xcontent.XContentType;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
@Service
public class BulkProcessorServiceImpl implements BulkProcessorService {
@Autowired
private BulkProcessor bulkProcessor;
@Override
public void insertById(String index, String type, String id, String jsonStr) {
bulkProcessor.add(new IndexRequest(index,type,id).source(jsonStr, XContentType.JSON));
}
@Override
public void updateById(String index, String type, String id, String jsonStr) {
bulkProcessor.add(new UpdateRequest(index,type,id).doc(jsonStr,XContentType.JSON));
}
@Override
public void deleteById(String index, String type, String id) {
bulkProcessor.add(new DeleteRequest(index,type,id));
}
}
package com.itheima.springBoot.springBootelasticsearch.service;
public interface ElasticSearchService {
void insertById(String index, String type, String id, String jsonStr);
void updateById(String index, String type, String id, String jsonStr);
void deleteById(String index, String type, String id);
}
package com.itheima.springBoot.springBootelasticsearch.service;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.xcontent.XContentType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
@Service
public class ElasticSearchServiceImpl implements ElasticSearchService {
private static final Logger logger = LoggerFactory.getLogger(ElasticSearchServiceImpl.class);
@Resource
private TransportClient transportClient;
@Override
public void insertById(String index, String type, String id, String jsonStr) {
transportClient.prepareIndex(index,type,id).setSource(jsonStr, XContentType.JSON).get();
}
@Override
public void updateById(String index, String type, String id, String jsonStr) {
transportClient.prepareUpdate(index,type,id).setDoc(jsonStr,XContentType.JSON).get();
}
@Override
public void deleteById(String index, String type, String id) {
transportClient.prepareDelete(index,type,id).get();
}
}
package com.itheima.springBoot.springBootelasticsearch.service;
import com.itheima.springBoot.springBootelasticsearch.Model.Es;
import java.util.List;
import java.util.Map;
public interface QueryService {
List<Map<String, Object>> queryListFromES(Es es, int storeId, String storeName, String startDate, String endDate);
}
package com.itheima.springBoot.springBootelasticsearch.service;
import com.itheima.springBoot.springBootelasticsearch.Model.Es;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.script.Script;
import org.elasticsearch.script.ScriptType;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.BucketOrder;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
import org.elasticsearch.search.aggregations.metrics.sum.Sum;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorBuilders;
import org.springframework.stereotype.Service;
import javax.annotation.Resource;
import java.util.*;
@Service
public class QueryServiceImpl implements QueryService{
@Resource
private TransportClient transportClient; // inject the ES client
@Override
public List<Map<String, Object>> queryListFromES(Es es, int storeId, String storeName, String startDate, String endDate) {
List<Map<String, Object>> list = new ArrayList<>();
Map<String, Object> map = Collections.emptyMap();
//Define the filter script up front (summed quantity above the threshold, here 0), similar to SQL's HAVING clause
Script script = new Script(ScriptType.INLINE, "painless","params._value0 > 0",map);
long beginTime = System.currentTimeMillis();
//Set the query conditions; omit any that are not needed. String-typed fields need the .keyword suffix.
SearchResponse sr = transportClient.prepareSearch(es.getIndex()).setTypes(es.getType())
.setQuery(QueryBuilders.boolQuery()
.must(QueryBuilders.termQuery("store_id", storeId))
.must(QueryBuilders.termQuery("store_name.keyword", storeName))
.must(QueryBuilders.rangeQuery("pay_date.keyword").gte(startDate).lte(endDate)))
.addAggregation(
AggregationBuilders.terms("by_product_code").field("product_code.keyword").size(2000)
.subAggregation(AggregationBuilders.sum("quantity").field("quantity")) //sum of sales quantity per bucket
.subAggregation(AggregationBuilders.sum("amount").field("amount")) //sum of paid amount per bucket; add further sums here as needed
.subAggregation(PipelineAggregatorBuilders.bucketSelector("sales_bucket_filter",script,"quantity"))//keep only buckets above the threshold
.order(BucketOrder.aggregation("amount", false)))//sort buckets by amount, descending
.execute().actionGet();
//Fetch and iterate the terms aggregation grouped by product code
Terms terms = sr.getAggregations().get("by_product_code");
System.out.println(terms.getBuckets().size());
for(Terms.Bucket entry: terms.getBuckets()){
Map<String, Object> objectMap = new HashMap<String,Object>();
System.out.println("------------------");
System.out.println("[ " + entry.getKey() + " ] order count: " + entry.getDocCount() );
Sum sum0 = entry.getAggregations().get("quantity"); //summed sales quantity
Sum sum1 = entry.getAggregations().get("amount"); //summed paid amount
objectMap.put("product_code", entry.getKey());
objectMap.put("quantity",sum0.getValue());
objectMap.put("amount",sum1.getValue());
list.add(objectMap);
}
long endTime = System.currentTimeMillis();
System.out.println("Query took " + ( endTime - beginTime ) + " ms");
return list;
}
}
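The terms/sum/bucketSelector pipeline in queryListFromES is essentially a GROUP BY with HAVING and ORDER BY. The same computation can be sketched over in-memory rows in plain Java (the Row record and sample data here are made up for illustration, not part of the ES API):

```java
import java.util.*;
import java.util.stream.*;

public class GroupByDemo {
    // (productCode, quantity, amount) rows, like the documents in the index
    record Row(String productCode, int quantity, double amount) {}

    // Group by product code, sum quantity and amount, keep groups whose
    // summed quantity > 0, and order by summed amount descending.
    static List<Map<String, Object>> aggregate(List<Row> rows) {
        Map<String, double[]> sums = new LinkedHashMap<>(); // [quantity, amount]
        for (Row r : rows) {
            double[] acc = sums.computeIfAbsent(r.productCode(), k -> new double[2]);
            acc[0] += r.quantity();
            acc[1] += r.amount();
        }
        return sums.entrySet().stream()
                .filter(e -> e.getValue()[0] > 0)                                   // bucketSelector
                .sorted((a, b) -> Double.compare(b.getValue()[1], a.getValue()[1])) // order by amount desc
                .map(e -> {
                    Map<String, Object> m = new HashMap<>();
                    m.put("product_code", e.getKey());
                    m.put("quantity", e.getValue()[0]);
                    m.put("amount", e.getValue()[1]);
                    return m;
                })
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row("product_1", 2, 210.0),
                new Row("product_1", 3, 205.0),
                new Row("product_2", 1, 500.0));
        System.out.println(aggregate(rows));
    }
}
```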
Utils classes:
package com.itheima.springBoot.springBootelasticsearch.utils;
import java.util.List;
import java.util.Map;
public class EmptyUtils {
public static boolean isEmpty(Object s) {
if (s == null) {
return true;
}
if ((s instanceof String) && (((String) s).trim().length() == 0)) {
return true;
}
if (s instanceof Map) {
return ((Map<?, ?>) s).isEmpty();
}
if (s instanceof List) {
return ((List<?>) s).isEmpty();
}
if (s instanceof Object[]) {
return (((Object[]) s).length == 0);
}
return false;
}
}
Test classes:
package com.itheima.springBoot.springBootelasticsearch;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.serializer.SerializerFeature;
import com.itheima.springBoot.springBootelasticsearch.Model.Order;
import com.itheima.springBoot.springBootelasticsearch.service.BulkProcessorService;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import java.util.Date;
import java.util.Random;
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest
public class BulkProcessorServiceTest {
@Autowired
private BulkProcessorService bulkProcessorService;
private Order order;
@Test
public void testInsertById() {
Random random = new Random();
for (int i = 0; i < 1200; i++) {
Order order = new Order();
int j = random.nextInt(20) % 20 + 1;
order.setId(i);
order.setStoreId(j);
order.setStoreName("旗舰店" + j);
order.setCategoryId(j);
order.setCategoryCode("shirt_" + j);
order.setProductCode("product_" + i);
order.setQuantity(random.nextInt(20) % 20 + 1);
order.setAmount(200 + (random.nextInt(20) % 20 + 1));
order.setPayDate(new Date());
String jsonStr = JSON.toJSONString(order, SerializerFeature.WriteDateUseDateFormat);
bulkProcessorService.insertById("search_index",
"search_index", i + "", jsonStr);
}
}
/**
* Set the sales quantity of all orders to 0
*/
@Test
public void testUpdateById() {
for (int i = 0; i < 1200; i++) {
Order order = new Order();
order.setId(i);
order.setQuantity(0);
String jsonString = JSON.toJSONString(order, SerializerFeature.WriteDateUseDateFormat);
bulkProcessorService.updateById("search_index","search_index",i + "",jsonString);
}
}
@Test
public void testDeleteById(){
for (int i = 0; i < 1200; i++) {
bulkProcessorService.deleteById("search_index","search_index",i + "");
}
}
}
package com.itheima.springBoot.springBootelasticsearch;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.serializer.SerializerFeature;
import com.itheima.springBoot.springBootelasticsearch.Model.Es;
import com.itheima.springBoot.springBootelasticsearch.Model.Order;
import com.itheima.springBoot.springBootelasticsearch.service.QueryService;
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.xcontent.XContentType;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import java.util.Date;
import java.util.List;
import java.util.Map;
import java.util.Random;
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest
public class QueryServiceTest {
@Autowired
BulkProcessor bulkProcessor;
@Autowired
TransportClient transportClient;
@Autowired
QueryService queryService;
@Before
public void setUp() {
Random random = new Random();
for (int i=0;i<1200;i++){
Order order = new Order();
int j = random.nextInt(20) % 20 + 1;
order.setId(i);
order.setStoreId(j);
order.setStoreName("旗舰店" + j);
order.setCategoryId(j);
order.setCategoryCode("shirt_" + j);
order.setProductCode("product_" + i);
order.setQuantity(random.nextInt(20) % 20 + 1);
order.setAmount(200 + (random.nextInt(20) % 20 + 1));
order.setPayDate(new Date());
String jsonStr = JSON.toJSONString(order, SerializerFeature.WriteDateUseDateFormat);
bulkProcessor.add(new IndexRequest("search_index", "search_index", i + "").source(jsonStr, XContentType.JSON));
}
}
@Test
public void testInsertById(){
Es es = new Es("search_index","search_index");
List<Map<String, Object>> list = queryService.queryListFromES(es, 13,"旗舰店"+13, "2019-03-01", "2019-03-31");
System.out.println(JSON.toJSONString(list));
}
}
package com.itheima.springBoot.springBootelasticsearch;
import com.itheima.springBoot.springBootelasticsearch.Dao.StudentInsertDao;
import com.itheima.springBoot.springBootelasticsearch.Model.Student;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest
public class StudentInsertDaoTest {
@Autowired
private StudentInsertDao insertDao;
@Test
public void insert() throws Exception {
Student student = new Student();
student.setAge("12");
student.setAddress("SH");
student.setName("Jack");
insertDao.insert(student);
}
}