Study notes, based on the 尚硅谷 (shangguigu) course.
Setting Up Log Collection
1.1 Upload the log-generator jar
Upload the file to one machine, under /opt/applog.
Local resource path: F:\归档资料\实时数仓项目\1、日志生成文件
Online resource path: to be added....
1.2 Create the Maven project
Create the gmall-parent Maven project.
Add the gmall-logger submodule (Spring Boot):
Note: select the Web dependency as well when creating the module; otherwise the @RestController annotation cannot be used.
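If the module was created without it, the same thing can be fixed afterwards by adding the standard Spring Boot web starter to the submodule's pom (version is managed by the Spring Boot parent):

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
```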
@RestController
public class LoggerController {

    @RequestMapping("/applog")
    public String createLog(@RequestBody String log) {
        System.out.println(log);
        return log;
    }
}
On Linux, edit the mock generator's application.properties (vim application.properties):
mock.date=2020-08-22
mock.url=http://192.168.66.1:8080/applog
java -jar gmall2020-mock-log-2020-05-10.jar
Start the Spring Boot project first, then run the jar above; the Spring Boot console will print the incoming log lines.
1.3 Writing logs to disk
Create logback.xml under src/main/resources:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <property name="LOG_HOME" value="/opt/applog/gmall2020" />

    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%msg%n</pattern>
        </encoder>
    </appender>

    <appender name="rollingFile" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_HOME}/app.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${LOG_HOME}/app.%d{yyyy-MM-dd}.log</fileNamePattern>
        </rollingPolicy>
        <encoder>
            <pattern>%msg%n</pattern>
        </encoder>
    </appender>

    <!-- Log output from this one class separately. additivity="false" keeps the
         events from also propagating to the root logger, whose console appender
         would otherwise print every line a second time. -->
    <logger name="com.learning.gmall.gmalllogger.controller.LoggerController"
            level="INFO" additivity="false">
        <appender-ref ref="rollingFile" />
        <appender-ref ref="console" />
    </logger>

    <root level="error">
        <appender-ref ref="console" />
    </root>
</configuration>
@RestController
@Slf4j
public class LoggerController {

    @RequestMapping("/applog")
    public String createLog(@RequestBody String logString) {
        // Lombok's @Slf4j generates the "log" field, so no manual LoggerFactory call is needed.
        // Log the raw JSON line only: app.log is consumed downstream, so no "info:" prefix.
        log.info(logString);
        return logString;
    }
}
1.4 Send the data to Kafka
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>fastjson</artifactId>
    <version>1.2.56</version>
</dependency>
#============== kafka ===================
# Kafka broker addresses; multiple brokers may be listed
spring.kafka.bootstrap-servers=cdh01:9092,cdh02:9092,cdh03:9092
# Serializers for the message key and value
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
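The spring.kafka.* keys above are just Spring Boot's way of filling in the native Kafka producer configuration. A minimal sketch of the equivalent plain-Java properties (broker list and serializer class names copied from the config above):

```java
import java.util.Properties;

public class ProducerConfigSketch {

    static Properties producerProps() {
        Properties props = new Properties();
        // Same broker list as spring.kafka.bootstrap-servers
        props.put("bootstrap.servers", "cdh01:9092,cdh02:9092,cdh03:9092");
        // Same classes as spring.kafka.producer.key-serializer / value-serializer
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("bootstrap.servers"));
    }
}
```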
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.*;

@RestController
@Slf4j
public class LoggerController {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @RequestMapping("/applog")
    public String createLog(@RequestBody String logString) {
        // Log the raw JSON line; app.log is consumed downstream, so no prefix.
        log.info(logString);
        JSONObject jsonObject = JSON.parseObject(logString);
        // Startup logs carry a "start" field; everything else goes to the event topic.
        if (jsonObject.getString("start") != null && jsonObject.getString("start").length() > 0) {
            kafkaTemplate.send("GMALL_STARTUP", logString);
        } else {
            kafkaTemplate.send("GMALL_EVENT", logString);
        }
        return logString;
    }
}
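The routing condition can be exercised without Spring, Kafka, or fastjson. A dependency-free sketch, where a plain string check stands in for fastjson's parse (good enough for illustration only, not for production parsing):

```java
public class TopicRouter {
    static final String STARTUP_TOPIC = "GMALL_STARTUP";
    static final String EVENT_TOPIC = "GMALL_EVENT";

    // Hypothetical helper: picks the Kafka topic for a raw log line.
    // Startup logs carry a top-level "start" field; everything else is an event log.
    public static String chooseTopic(String logString) {
        return logString.contains("\"start\"") ? STARTUP_TOPIC : EVENT_TOPIC;
    }

    public static void main(String[] args) {
        System.out.println(chooseTopic("{\"common\":{},\"start\":{\"entry\":\"icon\"}}")); // GMALL_STARTUP
        System.out.println(chooseTopic("{\"common\":{},\"page\":{\"page_id\":\"home\"}}")); // GMALL_EVENT
    }
}
```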
Start the Spring Boot application.
Start a console consumer:
kafka-console-consumer --bootstrap-server cdh01:9092,cdh02:9092,cdh03:9092 --topic GMALL_EVENT
Start the log generator:
java -jar gmall2020-mock-log-2020-05-10.jar
Result: the Kafka consumer receives the data.
1.5 Package the project and deploy to Linux
Run Maven package to build the jar.
Pre-built package on the local machine: F:\归档资料\实时数仓项目\2、gmall-logger
Change the address in the mock generator's application.properties:
mock.url=http://cdh01:8080/applog
Start the log-generator jar, a Kafka console consumer, and gmall-logger-0.0.1-SNAPSHOT, then check that the consumer receives data.
1.6 Install Nginx
Install Nginx on cdh01:
yum -y install openssl openssl-devel pcre pcre-devel zlib zlib-devel gcc gcc-c++
mkdir /opt/module
tar -zxvf nginx-1.12.2.tar.gz
cd nginx-1.12.2/
./configure --prefix=/opt/module/nginx
make && make install
cd /opt/module/nginx/sbin
./nginx
Stop:
./nginx -s stop
Reload configuration:
./nginx -s reload
Edit nginx.conf (the upstream block goes inside the http block; the location block is inside the default server block):
upstream logserver {
    server cdh01:8080 weight=1;
    server cdh02:8080 weight=1;
    server cdh03:8080 weight=1;
}

location / {
    #root   html;
    #index  index.html index.htm;
    proxy_pass http://logserver;
    proxy_connect_timeout 10;
}
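With equal weight=1 entries, nginx's default load balancing reduces to plain round-robin. A minimal sketch of that selection policy (server names taken from the upstream block above; this is an illustration, not nginx's actual implementation):

```java
// Minimal round-robin selector: with three equal-weight upstream servers,
// successive requests cycle 1 -> 2 -> 3 -> 1 -> 2 -> 3 ...
public class RoundRobin {
    private final String[] servers;
    private int next = 0;

    public RoundRobin(String... servers) {
        this.servers = servers;
    }

    public String pick() {
        String chosen = servers[next];
        next = (next + 1) % servers.length;
        return chosen;
    }

    public static void main(String[] args) {
        RoundRobin rr = new RoundRobin("cdh01:8080", "cdh02:8080", "cdh03:8080");
        for (int i = 0; i < 6; i++) {
            System.out.println(rr.pick());
        }
    }
}
```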
./nginx -s reload
ps -ef | grep nginx
Copy the jar directory uploaded from IDEA to cdh02 and cdh03:
scp -r gmall/ root@cdh02:`pwd`
Then run on all three machines:
java -jar /opt/gmall/gmall-logger-0.0.1-SNAPSHOT.jar >/dev/null 2>&1 &
jps
On cdh01, change the mock generator's startup parameter:
mock.url=http://cdh01/applog
Start the log-generator jar on cdh01:
cd /opt/applog && java -jar gmall2020-mock-log-2020-05-10.jar
Consume the data from Kafka:
kafka-console-consumer --bootstrap-server cdh01:9092,cdh02:9092,cdh03:9092 --topic GMALL_EVENT
Check that app.log on all three machines is receiving data:
/opt/applog/gmall2020/app.log
With the configuration above, Nginx is working: incoming log requests are load-balanced across the three machines.