SpringCloud: Distributed High-Concurrency Seckill (Flash Sale)

This article walks through building an Alibaba Spring Cloud project in IDEA: creating the SpringCloud-demo parent project; adding the SpringCloud-provider, SpringCloud-consumer, and SpringCloud-gateway modules; and wiring up front-end requests, OpenFeign calls, and inter-service communication. It also covers integrating Kafka as a message queue and adding a rate-limiting strategy to the seckill flow to keep the system stable.

Setting up Alibaba Spring Cloud in IDEA

The project structure is as follows:

SpringCloud-demo is the parent project
SpringCloud-consumer is the consumer module (the front controller, where the seckill starts); it also integrates OpenFeign and hosts the seckill business logic
SpringCloud-gateway is the gateway
SpringCloud-provider is the producer module (exposes the DB-facing APIs) and handles order creation

(There is also a shared module, SpringCloud-common.)

Nacos

Download and install Nacos, then start it in standalone mode with bin/startup.cmd

http://ip:8848/nacos

Both the username and the password are nacos.

The creation of the SpringCloud-demo parent project is omitted.

Creating SpringCloud-provider

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.fan</groupId>
        <artifactId>SpringCloud-demo</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>SpringCloud-provider</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>SpringCloud-provider</name>
    <description>SpringCloud-provider</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>com.alibaba.cloud</groupId>
            <artifactId>spring-cloud-starter-alibaba-nacos-discovery</artifactId>
        </dependency>

        <!-- MyBatis integration for Spring Boot 2.x -->
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>1.3.2</version>
        </dependency>



        <!-- MySQL JDBC driver -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <scope>runtime</scope>
        </dependency>
        <!-- <version>5.1.39</version> -->

        <!-- Druid connection pool -->
        <!-- https://mvnrepository.com/artifact/com.alibaba/druid -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>druid</artifactId>
            <version>1.1.8</version>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-redis</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-cache</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-aop</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.xuxueli/xxl-job-core -->
        <dependency>
            <groupId>com.xuxueli</groupId>
            <artifactId>xxl-job-core</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.58.Final</version>
        </dependency>
        <!-- end of xxl-job -->


        <!--fastjson-->
        <!-- https://mvnrepository.com/artifact/com.alibaba/fastjson -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>2.0.14</version>
        </dependency>



    </dependencies>



</project>

application.yml

server:
  port: 8008
spring:
  application:
    name: provider-server  # service name
  cloud:
    nacos:
      discovery:
        server-addr: 127.0.0.1:8848  # Nacos registry address


  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    url: jdbc:mysql://localhost:3306/seckill?useUnicode=true&characterEncoding=utf-8&useSSL=false&serverTimezone=UTC
    driver-class-name: com.mysql.cj.jdbc.Driver
    username: root
    password: 

controller

/**
 * @ClassName -> UserController
 * @Description API endpoints called via OpenFeign
 * @Author fan
 * @Date 2022/10/14 16:33, Friday
 * @Version 1.0
 */
@RestController
@Slf4j
public class UserController {

    @Autowired
    UserService userService;

    // SimpleDateFormat is not thread-safe; DateTimeFormatter is safe to share across requests
    static final DateTimeFormatter sdf = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");


    @GetMapping(value = "/getListById")
    public List<User> getListById(@RequestParam("id") Integer id) {
        log.info("current time --> " + LocalDateTime.now().format(sdf));
        List<User> users = userService.selectUserById(id);
        log.info("[query result] users={}", users);
        return users;
    }

}

The service layer is omitted.

Creating SpringCloud-consumer (the entry point for front-end requests)

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.fan</groupId>
        <artifactId>SpringCloud-demo</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <groupId>com.example</groupId>
    <artifactId>SpringCloud-consumer</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>SpringCloud-consumer</name>
    <description>SpringCloud-consumer</description>

    <properties>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>com.alibaba.cloud</groupId>
            <artifactId>spring-cloud-starter-alibaba-nacos-discovery</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-openfeign</artifactId>
            <version>2.2.5.RELEASE</version>
        </dependency>
        <!-- Sentinel integration -->
        <dependency>
            <groupId>com.alibaba.cloud</groupId>
            <artifactId>spring-cloud-starter-alibaba-sentinel</artifactId>
            <exclusions> <!-- exclude this dependency so controller responses are JSON instead of XML -->
                <exclusion>
                    <groupId>com.fasterxml.jackson.dataformat</groupId>
                    <artifactId>jackson-dataformat-xml</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

    </dependencies>



</project>

application.yml

server:
  port: 8009
spring:
  application:
    name: consumer-server  # service name
  cloud:
    nacos:
      discovery:
        server-addr: 127.0.0.1:8848  # Nacos registry address
# enable Sentinel support for Feign (circuit breaking / fallback)
feign:
  sentinel:
    enabled: true

OpenFeign client

/**
 * @ClassName -> ConsumerOpenFeignService
 * @Description name: the spring.application.name of the provider in its yml file;
 *              FeignFallBackService: the fallback returned to the page when the provider is down
 * @Author fan
 * @Date 2022/10/14 17:16, Friday
 * @Version 1.0
 */
@FeignClient(name = "provider-server", fallbackFactory = FeignFallBackService.class)
public interface ConsumerOpenFeignService {

    // Path and method name must match the provider exactly, or the call cannot be made
    @GetMapping("/index")
    String index();

    @GetMapping(value = "/getListById")
    List<User> getListById(@RequestParam("id") Integer id);

}

Fallback implementation

@Component
@Slf4j
public class FeignFallBackService implements FallbackFactory<ConsumerOpenFeignService> {
    @Override
    public ConsumerOpenFeignService create(Throwable throwable) {
        return new ConsumerOpenFeignService() {
            @Override
            public String index() {
                return "The producer SpringCloud-provider service has been degraded and is unavailable";
            }

            @Override
            public List<User> getListById(Integer id) {
                return null;
            }
        };
    }
}

The controller:

/**
 * @ClassName -> UserWebController
 * @Description front-end entry point
 * @Author fan
 * @Date 2022/10/14 16:38, Friday
 * @Version 1.0
 */
@RestController
@RequestMapping("/user")
@Slf4j
public class UserWebController {

    @Autowired
    private ConsumerOpenFeignService consumerFeign;


    @GetMapping("/getListById")
    public List<User> getListById2(Integer id) {
        log.info("Consumer_openFeign: id=" + id);
        return consumerFeign.getListById(id);
    }

}

Add the following annotations to the main application class:

@SpringBootApplication
@EnableDiscoveryClient
@EnableFeignClients

Creating SpringCloud-gateway

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <artifactId>SpringCloud-gateway</artifactId>

    <dependencies>
        <!-- gateway -->
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-gateway</artifactId>
            <version>2.2.1.RELEASE</version>
        </dependency>
        <!-- Nacos registry -->
        <dependency>
            <groupId>com.alibaba.cloud</groupId>
            <artifactId>spring-cloud-starter-alibaba-nacos-discovery</artifactId>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.alibaba.cloud</groupId>
                <artifactId>spring-cloud-alibaba-dependencies</artifactId>
                <version>2.2.1.RELEASE</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
</project>

application.yml

server:
  port: 8010
spring:
  application:
    name: gateway-server
  cloud:
    nacos:
      discovery:
        server-addr: 127.0.0.1:8848
    gateway:
      routes:                        # route array: each route forwards matching requests to a microservice
        - id: provider-route         # unique identifier for this route; any value works
          uri: lb://provider-server  # lb = resolve the service by name from Nacos, with load balancing
          predicates:                # conditions a request must satisfy to use this route
            - Path=/provider/**      # forward only when the request path matches this pattern
          filters:                   # filters can modify the request while it is forwarded
            - StripPrefix=1          # strip one leading path segment before forwarding

        - id: consumer-route
          uri: lb://consumer-server
          predicates:
            - Path=/consumer/**
          filters:
            - StripPrefix=1

The main application class is omitted.

With that, a basic microservice setup is complete.
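
With StripPrefix=1, a gateway request such as /consumer/user/getListById is forwarded to consumer-server as /user/getListById. A minimal sketch of that path transformation in plain Java (a hypothetical helper for illustration, not the actual gateway filter):

```java
import java.util.Arrays;

public class StripPrefixDemo {

    // Mimics what the StripPrefix filter does: drop the first n path segments.
    static String stripPrefix(String path, int n) {
        String[] segments = path.split("/");
        // split on "/" yields a leading empty segment for absolute paths, so skip n + 1 entries
        String stripped = String.join("/",
                Arrays.asList(segments).subList(Math.min(n + 1, segments.length), segments.length));
        return "/" + stripped;
    }

    public static void main(String[] args) {
        System.out.println(stripPrefix("/consumer/user/getListById", 1)); // /user/getListById
        System.out.println(stripPrefix("/provider/getListById", 1));      // /getListById
    }
}
```

The gateway therefore exposes the two services under the /consumer and /provider prefixes, while the services themselves see their original paths.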

Start provider, consumer, and gateway.

Refresh the Nacos service list to confirm all three services are registered.

Test result

A front-end request reaches the front controller, which calls the OpenFeign interface; the OpenFeign interface calls the provider, and the provider exposes the DB-facing API.

整合Kafka

Download Kafka (version 3.2.3 here); there is no need to install ZooKeeper separately, since Kafka ships with one.

https://kafka.apache.org/downloads

After extracting, edit the following files:

zookeeper.properties

dataDir=D:\kafka_2.12-3.2.3\zookeeper-logs

In server.properties, set:

log.dirs=D:\kafka_2.12-3.2.3\kafka-logs

Open a cmd prompt in D:\kafka_2.12-3.2.3

  • Start ZooKeeper:

    bin\windows\zookeeper-server-start.bat config\zookeeper.properties
  • Start Kafka:

    bin\windows\kafka-server-start.bat .\config\server.properties

Add to pom.xml:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

application.yml

# (this block sits under the top-level spring: key)
kafka:
  bootstrap-servers: localhost:9092
  listener:
    ack-mode: manual_immediate
  # default consumer group
  consumer:
    group-id: myContainer
    # disable auto-commit of offsets
    enable-auto-commit: false
    # where to start when there is no initial offset, or the current offset no longer
    # exists on the server; options are latest, earliest, none
    auto-offset-reset: latest
    # key/value deserializers
    key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    auto-commit-interval: 5000
  producer:
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
    value-serializer: org.apache.kafka.common.serialization.StringSerializer
    batch-size: 65536      # batch size in bytes
    buffer-memory: 524288  # producer buffer memory
    retries: 5             # retry count
    acks: -1               # wait for all replicas to acknowledge: safest against message loss, highest latency

# -------------------- end of Kafka config --------------------

KafkaProducers

@Service
@Slf4j
public class KafkaProducers {

    public static final String topic = "myCloudTopic";
    @Autowired
    KafkaTemplate<String, String> kafkaTemplate;

    public void send(String msg) {
        try {
            kafkaTemplate.send(topic, msg).addCallback(new ListenableFutureCallback<SendResult<String, String>>() {
                @Override
                public void onFailure(Throwable throwable) {
                    log.error("There was an error sending the message.", throwable);
                }

                @Override
                public void onSuccess(SendResult<String, String> result) {
                    log.info("My message was sent successfully.");
                }
            });
        } catch (Exception e) {
            log.error("send failed", e);
        }
    }


}

KafkaConsumer 

@Component
@Slf4j
public class KafkaConsumer {

    @KafkaListener(topics = "myCloudTopic", groupId = "myContainer")
    public void runStart(ConsumerRecord<?, ?> record) {
        ObjectMapper objectMapper = new ObjectMapper();
        Optional<?> kafkaMessage = Optional.ofNullable(record.value());
        List<User> objects = new ArrayList<>();
        try {
            if (kafkaMessage.isPresent()) {
                String message = (String) kafkaMessage.get();
                log.info("[runStart()-->] message: {}", message);
                List<User> userList = objectMapper.readValue(message, new TypeReference<List<User>>() {});
                if (!CollectionUtils.isEmpty(userList)) {
                    for (User bean : userList) {
                        User detail = new User();
                        detail.setName(bean.getName());
                        objects.add(detail);
                    }
                    if (!objects.isEmpty()) { // persist to the database
                        //myService.saveBatch(objects);
                        log.info("current userList={}", userList);
                    }
                }
            }
        } catch (Exception e) {
            log.error("failed to consume message", e);
        }
    }



}

Controller

    @RequestMapping(value = "/kafka/run")
    public void run() {
        log.info("----------begin----------");
        // Query data; mocked here instead of reading from a real interface
        List<User> userList = new ArrayList<>();
        User user = new User();
        user.setId(2);
        user.setDateTime(new Date());
        user.setName("王八");
        user.setSex("男");
        user.setAge(20);
        user.setCreateDate(new Date());
        userList.add(user);
        if (!userList.isEmpty()) {
            try {
                ObjectMapper objectMapper = new ObjectMapper();
                String msg = objectMapper.writeValueAsString(userList);
                log.info("[run()] msg={}", msg);
                kafkaProducers.send(msg);
            } catch (Exception e) {
                log.error("failed to serialize message", e);
            }
        }
        log.info("----------end----------");
    }

Result screenshot

Seckill flow: the consumer module

SeckillController

@RestController
@RequestMapping("/seckill")
@Slf4j
public class SeckillController {

    @Resource
    KafkaTemplate<String, String> kafkaTemplate;

    @Resource
    DefaultRedisScript<Long> defaultRedisScript;

    @Autowired
    ConsumerOpenFeignService consumerOpenFeignService;

    @Resource
    RedisTemplate redisTemplate;

    // Note: SimpleDateFormat is not thread-safe; in production, guard it or use DateTimeFormatter
    private static SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");


    /***
     * @description: load the seckill products into Redis
     * @param: seckillDate
     * @return: {@link List<Product>}
     * @author fan
     * @date: 2022/10/18 10:37
     */
    @GetMapping(value = "/queryAll")
    public List<Product> queryAll(@RequestParam("seckillDate") String seckillDate) {
        try {
            ObjectMapper objectMapper = new ObjectMapper();
            List<Product> productList;
            String key = "userList_plan_all";
            boolean hasKey = Boolean.TRUE.equals(redisTemplate.hasKey(key)); // is the list already cached in Redis?
            if (hasKey) {
                productList = redisTemplate.opsForList().range(key, 0, -1);
                log.info("list.size() from Redis --> " + productList.size());
            } else {
                productList = consumerOpenFeignService.queryAll();
                redisTemplate.opsForList().leftPushAll(key, productList); // cache product info for list display
                redisTemplate.expire(key, 1_000 * 60 * 60, TimeUnit.MILLISECONDS); // expire after one hour
                log.info("list.size() from MySQL --> " + productList.size());
            }
            if (productList == null) {
                return null;
            }
            for (Product product : productList) {
                Long productId = product.getId();
                redisTemplate.opsForValue().set("product_" + productId, objectMapper.writeValueAsString(product));
                // One user may buy only one unit: track each product's buyers in a Set
                redisTemplate.opsForSet().add("product_buyers_" + productId, "");
                // Pre-load stock as a list: one element per sellable unit
                for (int i = 0; i < product.getStock(); i++) {
                    redisTemplate.opsForList().leftPush("product_stock_key_" + productId, String.valueOf(i));
                }
                log.info("[queryAll()] product: " + objectMapper.writeValueAsString(product));
            }
            // Cache the product list for the given seckill date, for list display
            redisTemplate.opsForValue().set("seckill_plan_" + seckillDate, objectMapper.writeValueAsString(productList));
            return productList;
        } catch (Exception e) {
            log.error("queryAll failed", e);
        }
        return null;
    }

    /***
     * @description: run the product seckill, with rate limiting
     * @param: userId
     * @param: productId
     * @return: {@link String}
     * @author fan
     * @date: 2022/10/18 10:41
     */
    @RequestMapping(value = "/seckillProduct")
    @ResponseBody
    @MyAcessLimter(count = 100, timeout = 1)
    public String seckillProduct(String userId, String productId) throws JsonProcessingException {
        log.info("seckill start time --> " + sdf.format(new Date()));
        List<String> keys = Lists.newArrayList("product_stock_key_" + productId, "product_buyers_" + productId, userId);
        Long code = (Long) redisTemplate.execute(defaultRedisScript, keys, "");
        if (code == -1) {
            return "out of stock";
        } else if (code == 2) {
            return "duplicate seckill not allowed";
        } else if (code == 1) {
            // The whole seckill runs in the cache; afterwards the order is queued and synced to the database
            ObjectMapper objectMapper = new ObjectMapper();
            String productJson = (String) redisTemplate.opsForValue().get("product_" + productId);
            Product product = objectMapper.readValue(productJson, Product.class);
            Order order = new Order();
            order.setProductId(Long.parseLong(productId));
            order.setUserId(Long.parseLong(userId));
            String id = userId + "_" + productId;
            order.setId(id);
            order.setOrderName("Flash purchase: " + product.getName());
            order.setProductName(product.getName());
            // status=1 in MessageObject marks the record for the scheduled job (XxlJob) to process
            MessageObject messageObject = new MessageObject(order, product);
            // Store the outgoing messageObject; for this demo it goes into a Redis hash
            String message = objectMapper.writeValueAsString(messageObject);
            redisTemplate.opsForHash().put("messageObject_", productId + "_" + userId, message);
            kafkaTemplate.send("seckill_order", JSONUtil.objToString(order));
            return "success";
        }
        return "error";
    }


}
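
queryAll() pre-loads each product's stock into a Redis list with one element per sellable unit, so every successful LPOP corresponds to exactly one unit and the count can never go negative, unlike a plain DECR on a counter. The idea, sketched with a Deque standing in for the Redis list (hypothetical names, not the real Redis calls):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StockTokenDemo {
    final Deque<String> stock = new ArrayDeque<>(); // stands in for product_stock_key_{id}

    // Equivalent of the leftPush loop in queryAll(): one token per unit of stock.
    void preload(int units) {
        for (int i = 0; i < units; i++) {
            stock.push(String.valueOf(i));
        }
    }

    // Equivalent of the LPOP in the seckill script: null means sold out.
    String tryTake() {
        return stock.poll();
    }

    public static void main(String[] args) {
        StockTokenDemo demo = new StockTokenDemo();
        demo.preload(2);
        System.out.println(demo.tryTake() != null); // true  (unit sold)
        System.out.println(demo.tryTake() != null); // true  (unit sold)
        System.out.println(demo.tryTake() != null); // false (sold out)
    }
}
```

Once the tokens are exhausted, every further attempt fails cleanly, which is exactly the oversell protection the seckill needs.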

Rate limiting

@Component
@Aspect
@Slf4j
public class MyAcessLimiterAspect {

    @Resource
    private RedisTemplate<String, Object> redisTemplate;

    @Resource(name = "ipLimitLua")
    private DefaultRedisScript<Boolean> ipLimitLua;

    @Resource
    StringRedisTemplate stringRedisTemplate;

    // Pointcut: any method annotated with @MyAcessLimter
    @Pointcut("@annotation(com.example.springcloudconsumer.consumer.limit.MyAcessLimter)")
    public void myLimiterPonicut() {
    }

    @Before("myLimiterPonicut()")
    public void limiter(JoinPoint joinPoint) {
        log.info("rate limiter triggered at " + LocalDate.now());
        // Use the method signature to identify the limited method
        MethodSignature methodSignature = (MethodSignature) joinPoint.getSignature();
        Method method = methodSignature.getMethod();
        String classname = method.getDeclaringClass().getName();
        String packageName = method.getDeclaringClass().getPackage().getName();
        log.info("method:{}, classname:{}, packageName:{}", method, classname, packageName);
        // Read the rate-limit parameters from the annotation
        MyAcessLimter annotation = method.getAnnotation(MyAcessLimter.class);
        String methodNameKey = method.getName();
        log.info("annotated method name: {}", methodNameKey);
        // Get the current HTTP request to extract the caller's IP
        ServletRequestAttributes requestAttributes = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
        HttpServletRequest request = requestAttributes.getRequest();
        HttpServletResponse response = requestAttributes.getResponse();
        String userIp = MyIPUtils.getIpAddr(request);
        log.info("caller IP: {}", userIp);
        Integer count = annotation.count();
        Integer timeout = annotation.timeout();

        String redisKey = userIp; // rate-limit per caller IP
        log.info("current key --> " + redisKey);
        // Run the Lua script: ARGV[1] = allowed count, ARGV[2] = window length in seconds
        Boolean allowed = stringRedisTemplate.execute(ipLimitLua, Lists.newArrayList(redisKey), count.toString(), timeout.toString());
        if (Boolean.FALSE.equals(allowed)) {
            // Over the limit: write an error page (alternatively, throw and let a global handler respond)
            response.setCharacterEncoding("UTF-8");
            response.setContentType("text/html;charset=UTF-8");
            try (PrintWriter writer = response.getWriter()) {
                writer.print("<h1>Too many requests, please try again later</h1>");
            } catch (Exception ex) {
                throw new RuntimeException("Too many requests, please try again later");
            }
        }
    }
}

@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface MyAcessLimter {

    /**
     * Unique key for the limiter
     * @author fan
     * @date 2022/5/7 1:55
     * @return java.lang.String
    */
    String key() default "";

    /**
     * Maximum number of requests allowed per timeout window
     * @author fan
     * @date 2022/5/7 1:54
     * @return int
    */
    int count() default 5;

    /**
     * Window length; the unit defaults to seconds
     * @author fan
     * @date 2022/5/7 1:54
     * @return int
    */
    int timeout() default 10;

    /**
     * Interval between accesses
     * @author fan
     * @date 2022/5/7 1:54
     * @return int
    */
    int waits() default 20;
}

@Configuration
public class MyLuaConfiguration {

    /** Rate limiting:
     * load the IP rate-limit Lua script into a DefaultRedisScript
     * @author fan
     * @date 2022/5/7 9:35
     * @return org.springframework.data.redis.core.script.DefaultRedisScript<java.lang.Boolean>
    */
    @Bean(name = "ipLimitLua")
    public DefaultRedisScript<Boolean> ipLimitLua() {
        DefaultRedisScript<Boolean> defaultRedisScript = new DefaultRedisScript<>();
        defaultRedisScript.setScriptSource(new ResourceScriptSource(new ClassPathResource("myLimit_ip.lua")));
        defaultRedisScript.setResultType(Boolean.class);
        return defaultRedisScript;
    }

    /**
     * Load the seckill Lua script into a DefaultRedisScript
     * @return
     */
    @Bean
    public DefaultRedisScript<Long> seckillLimiterLuaScript2() {
        DefaultRedisScript<Long> defaultRedisScript = new DefaultRedisScript<Long>();
        defaultRedisScript.setResultType(Long.class);
        defaultRedisScript.setScriptSource(new ResourceScriptSource(new ClassPathResource("buyone.lua")));
        return defaultRedisScript;
    }

    /**
     * Unlock script
     * @return
     */
    @Bean(name = "unLock")
    public DefaultRedisScript<Long> unLockLuaScript() {
        DefaultRedisScript<Long> defaultRedisScript = new DefaultRedisScript<Long>();
        defaultRedisScript.setResultType(Long.class);
        defaultRedisScript.setScriptSource(new ResourceScriptSource(new ClassPathResource("unLock.lua")));
        return defaultRedisScript;
    }

}

End of rate limiting

The full OpenFeign client:

@FeignClient(name = "provider-server",fallbackFactory = FeignFallBackService.class)
public interface ConsumerOpenFeignService {

    // Path and method name must match the provider exactly, or the call cannot be made
    @GetMapping("/index")
    String index();

    @GetMapping(value = "/getListById")
    List<User> getListById(@RequestParam("id") Integer id);

    @PostMapping(value = "/oc/order/insert")
    Integer insert(@RequestBody Order order);

    @PostMapping(value = "/oc/order/selectOneOrder")
    List<Order> selectOneOrder(@RequestParam("id") String id);

    @PostMapping(value = "/pc/product/updateProduct")
    Integer updateProduct(@RequestParam("id") Long id);

    @PostMapping(value = "/pc/product/selectOne")
    Product selectOne(@RequestParam("id") Long id);

    @PostMapping(value = "/pc/product/queryAll")
    List<Product> queryAll();


    @PostMapping(value = "/pc/product/findOneOrder")
    Order findOneOrder(@RequestParam("id") String id);

}
The fallback implementation:

@Component
@Slf4j
public class FeignFallBackService implements FallbackFactory<ConsumerOpenFeignService> {
    @Override
    public ConsumerOpenFeignService create(Throwable throwable) {
        return new ConsumerOpenFeignService() {
            @Override
            public String index() {
                return "The producer SpringCloud-provider service has been degraded and is unavailable";
            }

            @Override
            public List<User> getListById(Integer id) {
                return null;
            }

            @Override
            public Integer insert(Order order) {
                return null;
            }

            @Override
            public List<Order> selectOneOrder(String id) {
                return null;
            }

            @Override
            public Integer updateProduct(Long id) {
                return null;
            }

            @Override
            public Product selectOne(Long id) {
                return null;
            }

            @Override
            public List<Product> queryAll() {
                return null;
            }

            @Override
            public Order findOneOrder(String id) {
                return null;
            }
        };
    }
}
The full consumer application.yml:

server:
  port: 8009

spring:
  application:
    name: consumer-server  # service name
  cloud:
    nacos:
      discovery:
        server-addr: 127.0.0.1:8848  # Nacos registry address

  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    url: jdbc:mysql://localhost:3306/seckill?useUnicode=true&characterEncoding=utf-8&useSSL=false&serverTimezone=UTC
    driver-class-name: com.mysql.cj.jdbc.Driver
    username: root
    password: 

  ## Redis configuration
  redis:
    ## database index (default 0)
    database: 0
    ## server address
    host: 127.0.0.1
    ## server port
    port: 6379
    ## password (empty by default)
    password:
    connectTimeout: 3000
    timeout: 3000
    connectionPoolSize: 50
    connectionMinimumIdleSize: 50
    jedis:
      pool:
        ## maximum connections in the pool (negative = no limit)
        max-active: 8
        ## maximum blocking wait time (negative = no limit)
        max-wait: -1
        ## maximum idle connections in the pool
        max-idle: 8
        ## minimum idle connections in the pool
        min-idle: 0

  kafka:
    bootstrap-servers: localhost:9092 #rl-hadoop4:9092,rl-hadoop5:9092,rl-hadoop6:9092
    listener:
      ack-mode: manual_immediate
    # default consumer group
    consumer:
      group-id: myContainer
      # disable auto-commit of offsets
      enable-auto-commit: false
      # where to start when there is no initial offset, or the current offset no longer
      # exists on the server; options are latest, earliest, none
      auto-offset-reset: latest
      # key/value deserializers
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      auto-commit-interval: 5000
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
      batch-size: 65536      # batch size in bytes
      buffer-memory: 524288  # producer buffer memory
      retries: 5             # retry count
      acks: -1               # wait for all replicas to acknowledge: safest against message loss, highest latency
    # -------------------- end of Kafka config --------------------

# enable Sentinel support for Feign (circuit breaking / fallback)
feign:
  sentinel:
    enabled: true

# SQL statement logging for the DAO layer
logging:
  level:
    com:
      acong:
        dao: debug
  file: d:/logs/redis.log

buyone.lua

-- stock list key for the product
local product_stock_key = KEYS[1]
-- set of users who have already bought the product
local buyersKey = KEYS[2]
-- user id
local uid = KEYS[3]

-- check whether the user has already bought (SADD returns 1 only for a new member)
local result = redis.call("sadd", buyersKey, uid)
if (tonumber(result) == 1)
then
    -- not bought yet: try to take one unit of stock
    local stock = redis.call("lpop", product_stock_key)
    -- in Lua everything except nil and false is truthy (including 0)
    if (stock)
    then
        -- stock available
        return 1
    else
        -- sold out: roll back the buyer-set entry so the user is not marked as a buyer
        redis.call("srem", buyersKey, uid)
        return -1
    end
else
    -- already bought
    return 2
end
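
Redis runs the whole script atomically, so no other client can interleave between the SADD and the LPOP. The decision logic, restated in plain Java with a Set and a Deque standing in for the Redis structures (return codes match the script; note the buyer-set rollback on sell-out, without which a user would be marked as a buyer while getting nothing):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.Set;

public class BuyOneDemo {
    final Set<String> buyers = new HashSet<>();     // stands in for product_buyers_{id}
    final Deque<String> stock = new ArrayDeque<>(); // stands in for product_stock_key_{id}

    // Mirrors buyone.lua: 1 = bought, -1 = sold out, 2 = already bought.
    int buyOne(String uid) {
        if (!buyers.add(uid)) {
            return 2;               // SADD returned 0: duplicate purchase
        }
        if (stock.poll() != null) {
            return 1;               // LPOP succeeded: one unit secured
        }
        buyers.remove(uid);         // roll back the membership on sell-out
        return -1;
    }

    public static void main(String[] args) {
        BuyOneDemo demo = new BuyOneDemo();
        demo.stock.push("0");
        System.out.println(demo.buyOne("u1")); // 1
        System.out.println(demo.buyOne("u1")); // 2
        System.out.println(demo.buyOne("u2")); // -1
    }
}
```

In the real system this whole method would have to be synchronized; the Lua script gets that guarantee for free from Redis's single-threaded command execution.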

myLimit_ip.lua

-- Fixed-window request counter per key, e.g. per caller IP for one interface
-- KEYS[1] = the caller's IP (or an interface URL)
-- ARGV[1] = maximum allowed requests in the window
-- ARGV[2] = window length (expiry) in seconds
local count = redis.call('incr', KEYS[1]);
if count == 1 then
    -- first request in the window: start the expiry clock
    redis.call("expire", KEYS[1], ARGV[2])
end
-- reject once the count inside the window exceeds the limit
if count > tonumber(ARGV[1]) then
    return false
end

return true
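
This is a fixed-window counter: the first request for a key creates the counter and starts its expiry; every request increments it; once the count exceeds the limit inside the window, requests are rejected until the key expires. The same logic in plain Java, with an in-memory map standing in for Redis and the window expressed in milliseconds (a sketch under those assumptions, not the production path):

```java
import java.util.HashMap;
import java.util.Map;

public class FixedWindowLimiter {
    static class Window { long expiresAt; int count; }

    final Map<String, Window> windows = new HashMap<>();
    final int limit;          // maps to @MyAcessLimter(count = ...)
    final long windowMillis;  // maps to @MyAcessLimter(timeout = ...), in ms for this sketch

    FixedWindowLimiter(int limit, long windowMillis) {
        this.limit = limit;
        this.windowMillis = windowMillis;
    }

    // Mirrors the Lua script: increment, start the expiry on the first hit, reject above the limit.
    boolean allow(String key, long nowMillis) {
        Window w = windows.get(key);
        if (w == null || nowMillis >= w.expiresAt) { // key absent or expired: start a new window
            w = new Window();
            w.expiresAt = nowMillis + windowMillis;
            windows.put(key, w);
        }
        w.count++;
        return w.count <= limit;
    }

    public static void main(String[] args) {
        FixedWindowLimiter limiter = new FixedWindowLimiter(2, 1000);
        System.out.println(limiter.allow("127.0.0.1", 0));    // true
        System.out.println(limiter.allow("127.0.0.1", 10));   // true
        System.out.println(limiter.allow("127.0.0.1", 20));   // false (over the limit)
        System.out.println(limiter.allow("127.0.0.1", 1500)); // true  (new window)
    }
}
```

A known weakness of fixed windows is the boundary burst: up to 2x the limit can pass across two adjacent windows, which is usually acceptable for a coarse per-IP guard like this one.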

unLock.lua: atomically check that the lock is our own before deleting it

-- Compare the stored lock value with our token. KEYS[1] is the lock key (e.g. lock_order),
-- ARGV[1] is the token this client wrote when acquiring the lock (e.g. 123uD).
if redis.call("get", KEYS[1]) == ARGV[1]
then
    return redis.call("del", KEYS[1])   -- our lock: delete it, returning the number of keys deleted
else
    return 0                            -- someone else's lock: leave it and return 0
end
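
The point of doing this in a script is atomicity: a plain GET followed by DEL from Java leaves a window in which the lock can expire and be re-acquired by another client, whose lock would then be wrongly deleted. The script's compare-and-delete, restated over a plain map (a hypothetical sketch, not the Redis call):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class UnlockDemo {
    final Map<String, String> store = new ConcurrentHashMap<>(); // stands in for Redis

    // Mirrors unLock.lua: delete only if the stored value is our own lock token.
    long unlock(String key, String token) {
        // In Redis the whole script runs atomically; here synchronized plays that role.
        synchronized (store) {
            if (token.equals(store.get(key))) {
                store.remove(key);
                return 1; // number of keys deleted
            }
            return 0;     // not our lock
        }
    }

    public static void main(String[] args) {
        UnlockDemo demo = new UnlockDemo();
        demo.store.put("lock_order", "123uD");
        System.out.println(demo.unlock("lock_order", "wrong")); // 0
        System.out.println(demo.unlock("lock_order", "123uD")); // 1
    }
}
```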

The provider module

@RestController
@RequestMapping(value = "/oc")
@Slf4j
public class OrderController {

    @Autowired
    OrderService orderService;

    @RequestMapping(value = "/order/insert", method = RequestMethod.POST, produces = { "application/json;charset=UTF-8" })
    public Integer insert(@RequestBody Order order){
        Integer i = orderService.insert(order);
        return i;
    }

    @RequestMapping(value = "/order/selectOneOrder", method = RequestMethod.POST, produces = { "application/json;charset=UTF-8" })
    public List<Order> selectOneOrder(@RequestParam("id") String id){
        List<Order> orderList = orderService.selectOneOrder(id);
        return orderList;
    }

}

@RestController
@RequestMapping(value = "/pc")
@Slf4j
public class ProductController {

    @Autowired
    ProductService productService;

    @Autowired
    OrderService orderService;

    @RequestMapping(value = "/product/updateProduct", method = RequestMethod.POST, produces = { "application/json;charset=UTF-8" })
    public Integer updateProduct(@RequestBody Product product){
        Integer i = productService.updateProduct(product.getId());
        return i;
    }

    @RequestMapping(value = "/product/selectOne", method = RequestMethod.POST, produces = { "application/json;charset=UTF-8" })
    public Product selectOne(@RequestParam("id") Long id){
        Product product = productService.selectById(id);
        return product;
    }

    @RequestMapping(value = "/product/queryAll", method = RequestMethod.POST, produces = { "application/json;charset=UTF-8" })
    public List<Product> queryAll(){
        List<Product> productList = productService.queryAll();
        return productList;
    }

    @RequestMapping(value = "/product/findOneOrder", method = RequestMethod.POST, produces = { "application/json;charset=UTF-8" })
    public Order findOneOrder(@RequestParam("id") String id){
        List<Order> orderList = orderService.selectOneOrder(id);
        return orderList.get(0);
    }

}

The consumer

@Component
@Slf4j
public class OrderSeckillConsumer {

    @Resource
    RedisTemplate<String, Object> redisTemplate;

    @Autowired
    ProductService productService;

    @Autowired
    OrderService orderService;

    @Resource(name = "unLock")
    DefaultRedisScript<Long> defaultRedisScript;

    private static final long TIME_OUT = 1000 * 5;

    private static SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    public void unlock(String key, String value) {
        try {
            // Naive check-then-delete: not atomic, the lock can expire and be
            // re-acquired by another client between the GET and the DELETE.
            long ttl = redisTemplate.opsForValue().getOperations().getExpire(key);
            String currentValue = (String) redisTemplate.opsForValue().get(key);
            if (!StringUtils.isEmpty(currentValue) && currentValue.equals(value) && ttl > 5) {
                redisTemplate.opsForValue().getOperations().delete(key);
            }
            // Recommended: the unLock.lua script compares and deletes atomically
            List<String> keys = Lists.newArrayList(key);
            Long code = (Long) redisTemplate.execute(defaultRedisScript, keys, value);
        } catch (Exception e) {
            log.error("unlock failed", e);
        }
    }


    @KafkaListener(topics = "seckill_order",groupId = "myContainer")
    public void run(ConsumerRecord<?, ?> records, Acknowledgment ack, @Header(KafkaHeaders.RECEIVED_TOPIC) String topic) throws JsonProcessingException {
        AtomicBoolean isError = new AtomicBoolean(false);
        String key = null;
        String lockValue = UUID.randomUUID().toString();
        Optional<?> kafkaMessage = Optional.ofNullable(records.value());
        try {
            if (kafkaMessage.isPresent()) {
                String message = (String) kafkaMessage.get();
                log.info("[run()-->]message: {}", message);
                ObjectMapper objectMapper = new ObjectMapper();
                Order order = objectMapper.readValue(message, Order.class);
                if (order != null) { // notify the user of success first, then persist the order
                    key = order.getId();
                    Object idRedis = redisTemplate.opsForValue().get(key);
                    // Guard against duplicate consumption (idempotency): store a key in Redis with a TTL
                    Boolean b = redisTemplate.opsForValue().setIfAbsent(key, lockValue, TIME_OUT, TimeUnit.MILLISECONDS);
                    if (Boolean.TRUE.equals(b) && !key.equals(idRedis)) {
                        log.info("current time: " + sdf.format(new Date()) + ", key=" + key);
                        Long productId = order.getProductId();
                        int p = productService.updateProduct(productId); // transactional: decrement stock
                        if (p > 0) {
                            int i = orderService.insert(order); // transactional: create the order
                            if (i > 0) {
                                // log writing omitted
                                String productJson = (String) redisTemplate.opsForValue().get("product_" + productId);
                                Product product = objectMapper.readValue(productJson, Product.class);
                                // status "1" in MessageObject marks data the scheduled task still has to process
                                MessageObject messageObject = new MessageObject(order, product);
                                messageObject.setStatus("0");
                                String msg = objectMapper.writeValueAsString(messageObject);
                                // write the updated status back to the cache ("table")
                                redisTemplate.opsForHash().put("messageObject_", productId + "_" + order.getUserId(), msg);
                                ack.acknowledge(); // manual commit
                            }
                        }
                    }
                }
            }
        } catch (Exception e) {
            isError.set(true);
            log.error(e.getMessage(), e);
        } finally {
            unlock(key, lockValue);
            log.info("seckill consumer finished at " + sdf.format(new Date()));
        }

    }



}
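The duplicate-consumption guard in `run()` hinges on `setIfAbsent` (Redis `SET NX` with a TTL): the first delivery of an order id wins, and redeliveries within the timeout are skipped. The pattern in isolation, with `putIfAbsent` standing in for Redis (names in this sketch are illustrative, not from the project):

```java
import java.util.concurrent.ConcurrentHashMap;

// Idempotent-consumer sketch: process each message key at most once.
class IdempotentConsumer {
    private final ConcurrentHashMap<String, String> seen = new ConcurrentHashMap<>();
    int processed = 0;

    void onMessage(String orderId, String token) {
        // First writer wins, like setIfAbsent(key, lockValue, TIME_OUT, ...)
        if (seen.putIfAbsent(orderId, token) == null) {
            processed++; // stock update + order insert would happen here
        }                // else: duplicate delivery, skip
    }
}
```

In the real consumer the entry also expires (the TTL), which is why the scheduled task additionally checks the order table before re-inserting.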

Scheduled task -- code compensation (distributed transaction)

@Component
@Slf4j
public class MyKafkaTask {

    private static SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    @Resource
    private KafkaTemplate<String, String> kafkaTemplate;

    @Autowired
    OrderService orderService;

    @Autowired
    ProductService productService;

    @Autowired
    private RedisTemplate redisTemplate;

    private static final long TIME_OUT = 1000 * 5;



    /***
     * @description: runs the compensation pass
     * @author fan
     * @date: 2022/9/23 17:16
     */
    @Scheduled(cron = "0/30 * * * * ?") // every 30 seconds
    //@Scheduled(initialDelay = 2000, fixedRate = 60000)
    public void executeTask() throws JsonProcessingException {
        List<String> keySet = redisTemplate.opsForHash().values("messageObject_");
        if (!CollectionUtils.isEmpty(keySet)){
            ObjectMapper objectMapper = new ObjectMapper();
            for (String jonsonObjStr : keySet) {
                MessageObject messageObject = objectMapper.readValue(jonsonObjStr , MessageObject.class);
                String status = messageObject.getStatus();
                if (!"1".equals(status) && !StringUtils.isEmpty(status)) { // only status "1" (pending) or empty needs compensation
                    continue;
                }
                String id = messageObject.getId();
                Long userId = messageObject.getUserId();
                Long productId = messageObject.getProductId();
                String productName = messageObject.getProductName();
                String orderName = messageObject.getOrderName();
                Order order = new Order();
                order.setProductId(productId);
                order.setUserId(userId);
                order.setId(id);
                order.setOrderName(orderName);
                order.setProductName(productName);
                List<Order> existing = orderService.selectOneOrder(id);
                if (!CollectionUtils.isEmpty(existing)) { // already persisted: skip to stay idempotent (a Redis key with a TTL also works, if tied to the same message)
                    log.info("[order already exists] date={}", sdf.format(new Date()));
                    continue;
                }
                Integer i = orderService.insert(order);
                Product product =  new Product();
                String name = messageObject.getName();
                Integer stock = messageObject.getStock();
                Date creatTime = messageObject.getCreatTime();
                Date startTime = messageObject.getStartTime();
                Date endTime = messageObject.getEndTime();
                product.setId(productId);
                product.setStock(stock);
                product.setName(name);
                product.setStartTime(startTime);
                product.setEndTime(endTime);
                product.setCreatTime(creatTime);
                int p = productService.updateProduct(productId);
                if (p > 0 && i > 0) { // write the updated status back to Redis (the "table")
                    messageObject.setStatus("0");
                    String message = objectMapper.writeValueAsString(messageObject);
                    redisTemplate.opsForHash().put("messageObject_", productId + "_" + userId, message);
                    kafkaTemplate.send("seckill_order", objectMapper.writeValueAsString(order)); // re-publish to the queue
                }
            }
        }
    }

}
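Stripped of Spring, Kafka, and Redis, the compensation loop above boils down to: scan pending records, skip anything already persisted, retry the write, then flip the status flag. A minimal in-memory sketch of that control flow (class and field names are illustrative, not from the project):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Compensation-pass sketch: status "1" = pending, "0" = done.
class CompensationTask {
    final Map<String, String> statusById = new HashMap<>(); // stands in for the Redis hash
    final Set<String> persistedOrders = new HashSet<>();    // stands in for the order table
    int retried = 0;

    void runOnce() {
        for (Map.Entry<String, String> e : statusById.entrySet()) {
            if (!"1".equals(e.getValue())) continue;     // only pending entries
            if (persistedOrders.contains(e.getKey())) {  // idempotency: already written
                e.setValue("0");
                continue;
            }
            persistedOrders.add(e.getKey());             // retry the insert
            retried++;
            e.setValue("0");                             // write the status back
        }
    }
}
```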

The MessageObject entity

@Data
public class MessageObject implements Serializable {

    private String id;
    private Long userId;
    private Long productId;
    private String productName;
    private String orderName;

    //private long productId;
    private String name;
    private Integer stock;
    private Date creatTime;
    private Date startTime;

    private Date endTime;

    private String status;

    public MessageObject() {}


    public MessageObject(Order order , Product product) {
        this.id = order.getId();
        this.userId = order.getUserId();
        this.productId = order.getProductId();
        this.productName = order.getProductName();
        this.orderName = order.getOrderName();
        this.name = product.getName();
        this.stock = product.getStock();
        this.creatTime = product.getCreatTime();
        this.startTime = product.getStartTime();
        this.endTime = product.getEndTime();
        this.status = "1";
    }
}

Service

public interface OrderService {

    Integer insert(Order order);

    List<Order> selectOneOrder(String id);

    Integer saveOrder(Order order);


}

@Service
public class OrderServiceImpl implements OrderService {
    @Qualifier("orderMapper")
    @Autowired
    private OrderMapper orderMapper;

    @Transactional
    @Override
    public Integer insert(Order order) {
        return orderMapper.insertOrder(order);
    }

    @Override
    public List<Order> selectOneOrder(String id) {
        return orderMapper.selectOneOrder(id);
    }

    @Transactional(propagation = Propagation.REQUIRED, rollbackFor = {Exception.class})
    @Override
    public Integer saveOrder(Order order) {
        Integer count = 0;
        List<Order> list = orderMapper.selectOneOrder(order.getId());
        if (CollectionUtils.isEmpty(list)) {
            int i = orderMapper.insertOrder(order);
            if (i > 0) {
                count++;
            }
        }else {
            int i = orderMapper.updateOrder(order);
            if (i > 0) {
                count++;
            }
        }
        return count;
    }


}
public interface ProductService {

    Product selectById(long id);

    int updateProduct(long id);

    List<Product> queryAll();

}
@Service("productService")
public class ProductServiceImpl implements ProductService {

    @Qualifier("productMapper")
    @Autowired
    private ProductMapper productMapper;

    @Override
    public Product selectById(long id) {
        return productMapper.selectById(id);
    }

    @Transactional(propagation = Propagation.REQUIRED)
    @Override
    public int updateProduct(long id) {
        return productMapper.updateProduct(id);
    }

    @Override
    public List<Product> queryAll() {
        return productMapper.queryAll();
    }
}

DAO

@Repository("orderMapper")
@Mapper
public interface OrderMapper {

  
    int insertOrder(Order order);

    List<Order> selectOneOrder(String id);

 
    int updateOrder(Order order);
}
@Repository("productMapper")
@Mapper
public interface ProductMapper {
    
    Product selectById(long id);

    
    int updateProduct(long id);


    List<Product> queryAll();
}

The mapper XML files are omitted.
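The one statement worth calling out in that XML is the stock update: in this pattern it guards against overselling by decrementing only while stock is positive, which is why `updateProduct` returning 0 is treated as "sold out". A hypothetical sketch (table and column names assumed from the entity classes, not taken from the project):

```xml
<update id="updateProduct">
    UPDATE product
    SET stock = stock - 1
    WHERE id = #{id} AND stock &gt; 0
</update>
```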

application.yml

server:
  port: 8008
spring:
  application:
    name: provider-server  # service name
  cloud:
    nacos:
      discovery:
        server-addr: 127.0.0.1:8848  # Nacos service registry address

  ## Redis configuration
  redis:
    ## Redis database index (default 0)
    database: 0
    ## Redis server address
    host: 127.0.0.1
    ## Redis server port
    port: 6379
    ## Redis password (empty by default)
    password:
    connectTimeout: 3000
    timeout: 3000
    connectionPoolSize: 50
    connectionMinimumIdleSize: 50
    jedis:
      pool:
        ## max connections in the pool (negative = no limit)
        max-active: 8
        ## max blocking wait time (negative = no limit)
        max-wait: -1
        ## max idle connections in the pool
        max-idle: 8
        ## min idle connections in the pool
        min-idle: 0


  kafka:
    bootstrap-servers: localhost:9092
    listener:
      ack-mode: manual_immediate
    # default consumer group
    consumer:
      group-id: myContainer
      # disable auto-commit; we acknowledge manually
      enable-auto-commit: false
      # Where to start when Kafka has no initial offset, or the offset no longer
      # exists on the server. Options: latest, earliest, none
      auto-offset-reset: latest
      # key/value serialization and deserialization
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      auto-commit-interval: 5000
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
      batch-size: 65536 # batch size
      buffer-memory: 524288 # buffer capacity
      retries: 5 # retry count
      acks: -1 # wait for all replicas to persist the message: safest against loss, but highest latency

  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    url: jdbc:mysql://localhost:3306/seckill?useUnicode=true&characterEncoding=utf-8&useSSL=false&serverTimezone=UTC
    driver-class-name: com.mysql.cj.jdbc.Driver
    username: root
    password: 

Create SpringCloud-common for shared classes

Entity classes

@Data
@NoArgsConstructor
@AllArgsConstructor
public class Order implements Serializable {
    private String id;
    private Long userId;
    private Long productId;
	private String productName;
	private String orderName;
}
@Data
@NoArgsConstructor
@AllArgsConstructor
public class Product implements Serializable {
    private Long id;
    private String name;
    private Integer stock;
	private Date creatTime;
	private Date startTime;

	private Date endTime;
}
@Component
public class MyIPUtils {

    public static String getIpAddr(HttpServletRequest request) {
        if (request == null) {
            return "unknown";
        }
        String ip = request.getHeader("x-forwarded-for");
        if (ip == null || ip.length() == 0 || "unknown".equalsIgnoreCase(ip)) {
            ip = request.getHeader("Proxy-Client-IP");
        }
        if (ip == null || ip.length() == 0 || "unknown".equalsIgnoreCase(ip)) {
            ip = request.getHeader("WL-Proxy-Client-IP");
        }
        if (ip == null || ip.length() == 0 || "unknown".equalsIgnoreCase(ip)) {
            ip = request.getHeader("X-Real-IP");
        }

        if (ip == null || ip.length() == 0 || "unknown".equalsIgnoreCase(ip)) {
            ip = request.getRemoteAddr();
        }
        return "0:0:0:0:0:0:0:1".equals(ip) ? "127.0.0.1" : ip;
    }
}
public class JSONUtil {

    private static ObjectMapper objectMapper = new ObjectMapper();

    static {
        // include all non-default fields when serializing
        objectMapper.setSerializationInclusion(Inclusion.NON_DEFAULT);

        // don't write dates as timestamps
        objectMapper.configure(SerializationConfig.Feature.WRITE_DATES_AS_TIMESTAMPS, false);

        // ignore errors when serializing empty beans
        objectMapper.configure(SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS, false);

        // uniform date format: yyyy-MM-dd HH:mm:ss
        objectMapper.setDateFormat(new SimpleDateFormat(DateTimeUtil.STANDARD_FORMAT));

        // ignore JSON properties that have no matching field on the Java object
        objectMapper.configure(DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    }

    /**
     * Convert an Object to a JSON string
     * @param obj
     * @param <T>
     * @return
     */
    public static <T> String objToString(T obj){
        if (obj == null){
            return null;
        }
        try {
            return obj instanceof String ? (String) obj : objectMapper.writeValueAsString(obj);
        } catch (Exception e) {
            System.out.println("Parse object to String error");
            e.printStackTrace();
            return null;
        }
    }

    public static <T> List<T> jsonToList(String s, Class<T[]> cls) {
        T[] arr = new Gson().fromJson(s, cls);
        return Arrays.asList(arr);
    }//List<User> list = jsonToList(result,User[].class);

    public static <T> List<T> json2List(String json, Class<T> cls) {
        List<T> list = new ArrayList<T>();
        JsonArray array = new JsonParser().parse(json).getAsJsonArray();
        for(final JsonElement elem : array){
            list.add(new Gson().fromJson(elem, cls));
        }
        return list;
    }//List<User> list = jsonToList(result,User.class);

    public static String mapToJson(Map<String, String> map) {
        // Fastjson serializes a Map directly; an empty map yields "{}"
        return JSON.toJSONString(map);
    }


    /**
     * Convert an Object to a pretty-printed JSON string
     * @param obj
     * @param <T>
     * @return
     */
    public static <T> String objToStringPretty(T obj){
        if (obj == null){
            return null;
        }
        try {
            return obj instanceof String ? (String) obj : objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(obj);
        } catch (Exception e) {
            System.out.println("Parse object to String error");
            e.printStackTrace();
            return null;
        }
    }

    /**
     * Convert a JSON string to an object
     * @param str the JSON string
     * @param clazz the target class
     * @param <T>
     * @return
     */
    public static <T> T stringToObj(String str,Class<T> clazz){
        if (StringUtils.isEmpty(str) || clazz == null){
            return null;
        }
        try {
            return clazz.equals(String.class)? (T) str :objectMapper.readValue(str,clazz);
        } catch (IOException e) {
            System.out.println("Parse String to Object error");
            e.printStackTrace();
            return null;
        }
    }

    /**
     * Convert a JSON string to an object
     * @param str the JSON string
     * @param typeReference type reference of the target
     * @param <T>
     * @return
     */
    public static <T> T stringToObj(String str, TypeReference<T> typeReference){
        if (StringUtils.isEmpty(str) || typeReference == null){
            return null;
        }
        try {
            return (T)(typeReference.getType().equals(String.class)? str :objectMapper.readValue(str,typeReference));
        } catch (IOException e) {
            System.out.println("Parse String to Object error");
            e.printStackTrace();
            return null;
        }
    }

    /**
     * Convert a JSON string to a collection of objects
     * @param str the JSON string
     * @param collectionClass the collection class
     * @param elementClasses the element type class(es)
     * @param <T>
     * @return
     */
    public static <T> T stringToObj(String str,Class<?> collectionClass,Class<?>... elementClasses){
        JavaType javaType = objectMapper.getTypeFactory().constructParametricType(collectionClass,elementClasses);
        try {
            return objectMapper.readValue(str,javaType);
        } catch (IOException e) {
            System.out.println("Parse String to Object error");
            e.printStackTrace();
            return null;
        }
    }
}

Start the four modules

Testing

Start the seckill

Check the order table

Check the product table
