Parsing Apache error logs for unique errors

Date: 2022-06-01 01:43:09

I have some unruly apache error logs that I would like to parse through and get unique errors.


[Fri Sep 21 06:54:24 2012] [error] [client xxx.xxx.xxx.xxx ] PHP Fatal error: <error message>, referrer: <url>


I think I just want to chop each line at the "PHP Fatal" section, discarding the first half and running the second half through uniq. My goal is to identify all the errors, but there are too many lines to look through manually, due to the many duplicate errors.
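That idea can be expressed directly as a pipeline. A minimal sketch, assuming the log lines look like the sample above (the file name `error.log` and the error text are placeholders):

```shell
# Hypothetical sample lines in the format shown above
printf '%s\n' \
  '[Fri Sep 21 06:54:24 2012] [error] [client 1.2.3.4] PHP Fatal error: Oops in /var/www/a.php' \
  '[Fri Sep 21 06:55:00 2012] [error] [client 5.6.7.8] PHP Fatal error: Oops in /var/www/a.php' \
  > error.log

# -o prints only the matched part of each line (the "second half");
# sort -u sorts and deduplicates in one step
grep -o 'PHP Fatal.*' error.log | sort -u
# prints the single unique error: PHP Fatal error: Oops in /var/www/a.php
```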


What is the best way to accomplish this?


3 Answers

#1


2  

Try grep -o '\[error\].*$' file | sort | uniq


This will show only the text that matches the regex (rather than the whole line containing the match).


Then sort puts similar entries next to each other, so that uniq can ensure there are no duplicates.
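To see why the sort matters: uniq only collapses *adjacent* duplicate lines, so unsorted input leaves repeats behind. A tiny demonstration:

```shell
# uniq alone misses the non-adjacent duplicate 'a'
printf 'a\nb\na\n' | uniq          # prints: a, b, a (3 lines)

# sorting first makes duplicates adjacent, so uniq removes them
printf 'a\nb\na\n' | sort | uniq   # prints: a, b (2 lines)
```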


If you want to remove the client bit before sorting/uniq'ing, use grep -o '\[error\].*$' file | sed 's/\[client[^]]*\]//' | sort | uniq (the [^]]* matches up to the first closing bracket; a non-greedy .*\? is not supported in sed's basic regular expressions).
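On a sample line (the IP and error text are made up), the `[^]]*` variant strips just the client field and preserves the rest of the message:

```shell
line='[error] [client 1.2.3.4] PHP Fatal error: Oops'

# [^]]* stops at the first closing bracket, so only [client ...] is removed
echo "$line" | sed 's/\[client[^]]*\] //'
# prints: [error] PHP Fatal error: Oops
```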


#2


1  

To analyze /var/log/apache2/error.log use


sed 's^\[.*\]^^g' /var/log/apache2/error.log | sort | uniq -c | sort -n

(Note the sort before uniq -c: uniq only counts adjacent duplicates, so without it repeated errors scattered through the log would be counted separately.)

This will

  1. cut the date at the beginning of each line like:

     [28-Aug-2012 11:20:24 UTC] PHP Notice: Undefined index: test in /var/www/... on line ...

  2. count unique lines
  3. sort them by occurrence
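The steps above can be tried on a couple of hypothetical sample lines (the leading count from `uniq -c` shows how often each message occurred):

```shell
# Made-up sample lines in the [date] message format described above
printf '%s\n' \
  '[28-Aug-2012 11:20:24 UTC] PHP Notice: Undefined index: test' \
  '[28-Aug-2012 11:21:00 UTC] PHP Notice: Undefined index: test' \
  '[28-Aug-2012 11:22:00 UTC] PHP Warning: Division by zero' \
  > error.log

# 1. strip the bracketed timestamp  2. count identical messages
# 3. sort numerically so the most frequent errors come last
sed 's/\[.*\]//g' error.log | sort | uniq -c | sort -n
# the last line is the most frequent message, counted twice:
#       2  PHP Notice: Undefined index: test
```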

source: strictcoder.blogspot.de



If you create new logs, you could configure php beforehand:


Set ignore_repeated_errors = On in php.ini, or add ini_set('ignore_repeated_errors', 1); to your PHP scripts (the directive name uses underscores).


This will stop PHP from logging the same error more than once, i.e. repeated messages caused by the same line in the same script.


source: php error log, how to remove the duplicates/find unique errors


(but this doesn't help with analyzing existing logs)


#3


0  

With sed:


sed -r 's/(.*)(PHP Fatal error)/\2/' logfile | sort -u
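Applied to a made-up sample line, this drops everything before the match in one substitution, and sort -u then deduplicates (note that -r is GNU sed's extended-regex flag; BSD/macOS sed uses -E):

```shell
line='[Fri Sep 21 06:54:24 2012] [error] [client 1.2.3.4] PHP Fatal error: Oops'

# \1 captures the prefix, \2 the marker; replacing both with \2 keeps
# "PHP Fatal error" plus everything after it
echo "$line" | sed -r 's/(.*)(PHP Fatal error)/\2/'
# prints: PHP Fatal error: Oops
```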
