ELK in Practice (Searching Spring Boot Log Output)

Posted: 2023-03-09 14:21:23

Requirements

  1. Centralize the logs of distributed / clustered systems so they can be searched quickly
  2. Set up ELK and wire it into Spring Boot's log output

Setting up ELK

  1. This builds on the docker-compose.yml from my earlier Elasticsearch setup post, with the additions and changes below
  2. Create a new docker-compose.yml with the following content
version: '2'
services:
  elasticsearch-central:
    image: elasticsearch:5.6.4
    container_name: es1
    volumes:
      - /root/mydocker/docker-es/es1.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - /root/mydocker/docker-es/data1:/usr/share/elasticsearch/data
    restart: always
    environment:
      - ES_CLUSTERNAME=elasticsearch
      - "ES_JAVA_OPTS=-Xmx50m -Xms50m"
    command: elasticsearch
    ports:
      - "9200:9200"
      - "9300:9300"
  elasticsearch-data:
    image: elasticsearch:5.6.4
    container_name: es2
    volumes:
      - /root/mydocker/docker-es/es2.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - /root/mydocker/docker-es/data2:/usr/share/elasticsearch/data
    restart: always
    environment:
      - ES_CLUSTERNAME=elasticsearch
      - "ES_JAVA_OPTS=-Xmx50m -Xms50m"
    command: elasticsearch
    ports:
      - "9201:9200"
      - "9301:9300"
    links:
      - elasticsearch-central:elasticsearch
  elasticsearch-head:
    image: mobz/elasticsearch-head:5
    container_name: head
    restart: always
    volumes:
      - /root/mydocker/docker-es/head-conf/Gruntfile.js:/usr/src/app/Gruntfile.js
      - /root/mydocker/docker-es/head-conf/app.js:/usr/src/app/_site/app.js
    ports:
      - "9100:9100"
    links:
      - elasticsearch-central:elasticsearch
  kibana:
    image: kibana
    container_name: kibana
    restart: always
    environment:
      - ELASTICSEARCH_URL=http://ip:9200
    links:
      - elasticsearch-central:elasticsearch
    ports:
      - "5601:5601"
  logstash:
    image: docker.elastic.co/logstash/logstash:5.5.1
    command: logstash -f /etc/logstash/conf.d/logstash.conf
    volumes:
      - $PWD/logstash/conf.d:/etc/logstash/conf.d
      - $PWD/log:/tmp
    container_name: logstash551
    hostname: logstash
    restart: always
    depends_on:
      - elasticsearch-central
    ports:
      - "7001-7005:7001-7005"
      - "4567:4567"
  3. In the same directory, create a logstash/conf.d directory for the volume mount and add a logstash.conf file with the following content
input {
  tcp {
    port => 4567
    codec => json_lines
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["172.16.147.200:9200","172.16.147.200:9201"]
    index => "%{[appname]}"
  }
}
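Once the containers are running (next step), the pipeline can be smoke-tested before touching Spring Boot by pushing a single hand-written event into the tcp input. This is only a sketch: the class name is made up, and it assumes Logstash listens on the same 172.16.147.200 host as Elasticsearch, on the 4567 port configured above.

import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LogstashSmokeTest {
    public static void main(String[] args) throws Exception {
        // One JSON object per line, which is what the json_lines codec expects.
        // The appname field becomes the Elasticsearch index name (index => "%{[appname]}").
        String event = "{\"appname\":\"carer\",\"level\":\"INFO\",\"message\":\"smoke test\"}\n";
        try (Socket socket = new Socket("172.16.147.200", 4567);
             Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
            out.write(event);
            out.flush();
        }
    }
}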
  4. Start all of the containers with docker-compose up -d
  5. Open ip:5601 to get to the Kibana management UI and create the index pattern

    (screenshot: creating the index pattern in Kibana)
  6. Create the index pattern for the applog index, which is what Kibana will use to query the logs

    (screenshot: the applog index pattern in Kibana)
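Before (or after) creating the index pattern, it is easy to confirm that the index really exists in Elasticsearch; _cat/indices lists every index. A throwaway check along these lines works (class name made up, host taken from the Logstash output config above):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ListIndices {
    public static void main(String[] args) throws Exception {
        // _cat/indices?v prints one line per index; the appname-based index
        // should appear once Logstash has received at least one event.
        URL url = new URL("http://172.16.147.200:9200/_cat/indices?v");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        } finally {
            conn.disconnect();
        }
    }
}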

Integrating with Spring Boot

  1. Add the following dependency to the Spring Boot project's pom.xml
<!-- logstash -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.11</version>
</dependency>
  2. Add the following to logback.xml
<appender name="logstash"
          class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>127.0.0.1:4567</destination>
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>INFO</level>
    </filter>
    <!-- an encoder is required; several implementations are available -->
    <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder">
        <customFields>{"appname":"carer"}</customFields>
    </encoder>
    <connectionStrategy>
        <roundRobin>
            <connectionTTL>5 minutes</connectionTTL>
        </roundRobin>
    </connectionStrategy>
</appender>
<!-- dev profile -->
<springProfile name="dev">
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>${PATTERN}</pattern>
        </encoder>
    </appender>
    <logger name="com.zhiyis" level="debug"/>
    <root level="info">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="logstash"/>
    </root>
</springProfile>
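Note that the snippet above is not a complete file: PATTERN, for instance, is referenced by the CONSOLE appender but never defined. A minimal skeleton of how it might sit inside logback-spring.xml (the property value and the file name are assumptions on my part):

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- Assumed console pattern; the original snippet only references ${PATTERN} -->
    <property name="PATTERN"
              value="%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n"/>

    <!-- the logstash appender from the snippet above goes here -->

    <!-- the <springProfile name="dev"> block from the snippet above goes here -->
</configuration>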
  • The part that matters is the upper half (the logstash appender); the lower half just adds an appender-ref to it on the root logger of the dev profile
  • The one thing you must change is 127.0.0.1:4567: put your Logstash address there, and get the port right. I had exposed port 4567 on the Docker container but copied 4560 from an example found online; even after changing the Logstash config to 4560 it still would not connect, until I realized that the only port my container actually exposes is 4567
  • Start the Spring Boot project, call a few test endpoints (a throwaway controller such as the sketch after this list will do), then go to Kibana and look at the logs

    (screenshot: application logs showing up in Kibana)
  • More advanced usage is left for later: the filter column on the left makes searches more precise, and there are also statistics and charts still to explore
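As mentioned in the step above, a throwaway endpoint is enough to generate some log traffic. This is only an illustration; the class name and path are made up:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical test controller: every request writes an INFO line that the
// logstash appender ships to Logstash and on into Elasticsearch.
@RestController
public class LogDemoController {

    private static final Logger log = LoggerFactory.getLogger(LogDemoController.class);

    @GetMapping("/log-test")
    public String logTest(@RequestParam(defaultValue = "hello") String msg) {
        log.info("log-test endpoint called with msg={}", msg);
        return "logged: " + msg;
    }
}

After calling it a few times, the entries should show up in Kibana's Discover view under the appname index.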

References

https://blog.****.net/guduyishuai/article/details/79228306

https://www.cnblogs.com/zhyg/p/6994314.html