A Session of Log Collection, Retrieval and Analysis Using the ELK Stack

Date: 2021-04-11 08:12:32

Motivation

Locating problems during operations often requires tracking down and pinpointing log entries. The introduction of distributed and elastic computing has made log location and analysis increasingly complex.

The main goal of this exercise is to evaluate an existing, mature solution for log collection, retrieval, and analysis: Kafka + ELK.

Progress

  • 20160324 init

  • 20160329
    Built a playground of Logstash, Elasticsearch, and Kibana, and processed Log4j and Logback log files with it; this already satisfies the business requirements.
    The evaluation of Kafka is folded into the message-processing framework study and is not recorded here.
    Also, because the logs are restricted for compliance reasons, the Kibana query and statistics screens are not shown.

Outline

  • 0 References
  • 1 Logstash
  • 2 Elasticsearch
  • 3 Kibana4
  • Reference Materials

0 References

0.1 Log Management for Spring Boot Applications with Logstash, Elasticsearch and Kibana

multiline

grok

Code: /home/zhoujiagen/workspace/github/elk-example

0.2 log4j Input plugin

0.3 multiline Codec plugin

0.4 grok Filter Plugin

Built-in patterns: https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns
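Grok patterns are, at bottom, named regular expressions. As an illustration only (this is not how Logstash evaluates grok internally, and these regexes are simplified versions of the real definitions in logstash-patterns-core), the TIMESTAMP_ISO8601 and LOGLEVEL captures used in the configurations below can be approximated with Python named groups:

```python
import re

# Rough, simplified equivalents of two built-in grok patterns.
TIMESTAMP_ISO8601 = r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}(?:\.\d+)?"
LOGLEVEL = r"TRACE|DEBUG|INFO|WARN|ERROR|FATAL"

# Mirrors the Log4j ConversionPattern used later in this note:
# %d{yyyy-MM-dd HH:mm:ss.SSS} [%p] [%t] %l => %m
line_re = re.compile(
    rf"^(?P<timestamp>{TIMESTAMP_ISO8601}) "
    rf"\[(?P<loglevel>{LOGLEVEL})\] "
    r"\[(?P<threadname>[^\]]+)\] "
    r"(?P<location>\S+) => (?P<logmessage>.*)$"
)

sample = "2016-03-29 10:15:42.123 [INFO] [main] com.example.App.run(App.java:42) => started"
m = line_re.match(sample)
print(m.group("loglevel"), m.group("logmessage"))  # prints: INFO started
```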

0.5 kafka Input plugin

0.6 date Filter plugin

1 Logstash

bin/logstash -f config/log4j.conf

Log4j 1.x configuration

#log4j.rootLogger=INFO, console
log4j.rootLogger=INFO, console, logstash

### Console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} [%p] [%t] %l => %m%n

### SocketAppender
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.Port=4560
log4j.appender.logstash.RemoteHost=localhost
log4j.appender.logstash.ReconnectionDelay=60000
log4j.appender.logstash.LocationInfo=true

log4j.conf

input {
  log4j {
    type => "log4j-logstash"
    port => 4560
  }
}

filter {
  multiline {
    pattern => "^(%{TIMESTAMP_ISO8601})"
    negate => true
    what => "previous"
  }
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{LOGLEVEL:loglevel}\] \[%{WORD:threadname}\] %{JAVACLASS:class}\.%{WORD:method}\(%{JAVAFILE:file}\:%{NUMBER:line}\) => %{GREEDYDATA:logmessage}" ]
  }
}

output{
  elasticsearch { hosts => "localhost:9200" }
  stdout { codec => rubydebug }
}
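The multiline filter above merges continuation lines (e.g. Java stack traces) into the preceding event: with negate => true and what => "previous", any line that does not start with an ISO8601 timestamp is appended to the previous line. A minimal Python sketch of that stitching rule (an illustration of the idea, not Logstash's implementation):

```python
import re

# A line beginning with an ISO8601-style timestamp starts a new event.
STARTS_WITH_TS = re.compile(r"^\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}")

def stitch(lines):
    """Merge lines that do not begin with a timestamp into the previous event."""
    events = []
    for line in lines:
        if STARTS_WITH_TS.match(line) or not events:
            events.append(line)            # a new event starts here
        else:
            events[-1] += "\n" + line      # continuation (stack trace, etc.)
    return events

raw = [
    "2016-03-29 10:15:42.123 [ERROR] [main] App.run(App.java:42) => boom",
    "java.lang.RuntimeException: boom",
    "\tat com.example.App.run(App.java:40)",
    "2016-03-29 10:15:43.000 [INFO] [main] App.run(App.java:50) => recovered",
]
events = stitch(raw)
print(len(events))  # prints: 2
```

The four raw lines collapse into two events: the error together with its stack trace, and the following info line.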

Logstash configuration for Logback log files

#########################################################
### Input
#########################################################
input{
  stdin{}
  file{
    path => ["/home/zhoujiagen/filecenter/logs/app*.log"]
  }
}

#########################################################
### Filter
###
### grok patterns can be tested at: http://grokdebug.herokuapp.com/
#########################################################
filter {
  multiline {
    pattern => "^(%{TIMESTAMP_ISO8601})"
    negate => true
    what => "previous"
  }
  grok {
    # Do multiline matching with (?m), as the multiline filter above may add newlines to the log messages.
    match => [ "message", "(?m)^%{TIMESTAMP_ISO8601:logtime} \[%{PROG:threadname}\] %{LOGLEVEL:loglevel} %{SPACE} %{JAVACLASS:classname}\:%{NUMBER:codeline} - %{GREEDYDATA:logmessage}" ]
  }
}

#########################################################
### Output
#########################################################
output{
  elasticsearch { hosts => "localhost:9200" }
  stdout{ codec=>rubydebug }
}
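The (?m) prefix in the grok pattern matters because, after the multiline filter runs, the message field may contain embedded newlines, and the trailing GREEDYDATA must be able to swallow them. In grok's Ruby/Oniguruma regexes, (?m) means "dot matches newline"; the Python analogue is (?s) (DOTALL), not Python's own (?m). A sketch of the difference, using a simplified version of the Logback pattern above:

```python
import re

# An event after the multiline filter: the stack trace is part of the message.
merged = ("2016-03-29 10:15:42.123 [main] ERROR  com.example.App:42 - boom\n"
          "java.lang.RuntimeException: boom")

pattern = (r"^(?P<logtime>\S+ \S+) \[(?P<threadname>[^\]]+)\] "
           r"(?P<loglevel>\w+)\s+(?P<classname>[\w.]+):(?P<codeline>\d+) - "
           r"(?P<logmessage>.*)")

# Plain match: `.` stops at the newline, so the stack trace is dropped.
head = re.match(pattern, merged)

# With dot-matches-newline -- what (?m) means in grok's Oniguruma regexes --
# the whole multi-line payload is captured.
full = re.match("(?s)" + pattern, merged)

print(repr(head.group("logmessage")))  # prints: 'boom'
```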

2 Elasticsearch

# instance 1
~/devtools/elasticsearch-2.2.1$ bin/elasticsearch

# instance 2
~/devtools/elasticsearch-2.2.1$ bin/elasticsearch
# or
~/devtools/elasticsearch-2.2.1_instance2$ bin/elasticsearch
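Once Logstash is writing to Elasticsearch, events land in daily logstash-YYYY.MM.DD indices and can be queried over HTTP, e.g. by POSTing a search body to http://localhost:9200/logstash-*/_search. A hedged sketch of such a body (the field names loglevel and logmessage follow the grok captures above and are assumptions about your pipeline; adjust to your setup):

```python
import json

# Search body: ERROR-level events mentioning "Exception" in the last hour.
# Uses a bool query with a range filter, available since Elasticsearch 2.0.
query = {
    "query": {
        "bool": {
            "must": [
                {"match": {"loglevel": "ERROR"}},
                {"match": {"logmessage": "Exception"}},
            ],
            "filter": [
                {"range": {"@timestamp": {"gte": "now-1h"}}},
            ],
        }
    },
    "size": 20,
}
body = json.dumps(query)
```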

3 Kibana4

bin/kibana

access via: http://localhost:5601/

Reference Materials

-1 Official ELK documentation

elastic Docs

Logstash Reference 2.2

Elasticsearch Reference 2.2

Kibana Reference 4.4

0 ELK introduction

ELKstack 中文指南

1 ELK installation

How To Install Elasticsearch, Logstash, and Kibana (ELK Stack) on Ubuntu 14.04

2 ELK usage

Centralized logging with an ELK stack (Elasticsearch-Logstash-Kibana) on Ubuntu

Log Management for Spring Boot Applications with Logstash, Elasticsearch and Kibana

Centralizing IBM Bluemix application logs with the ELK Stack