When processing logs, besides access logs you also need to handle runtime logs, which are mostly written by the application itself, for example via log4j. The biggest difference between runtime logs and access logs is that runtime logs are multi-line: several consecutive lines together make up a single entry.

Add the following to the filter section:

filter {
  multiline { }
}
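For reference, a filled-in version of this filter might look like the sketch below. This assumes the logstash-filter-multiline plugin is available; newer Logstash releases ship only the multiline codec used in the full example further down, so treat this as an illustration of the options rather than the exact configuration used in this article:

filter {
  multiline {
    pattern => "^\["    # a line starting with "[" begins a new event
    negate => true      # lines that do NOT match the pattern...
    what => "previous"  # ...are appended to the previous event
  }
}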

Once the log can be processed as multi-line events, splitting them into fields becomes easy (see the grok sketch after the sample output at the end of this section).

Option attributes:

For the multiline plugin, three settings are particularly important: negate, pattern, and what.

negate: type boolean, defaults to false.

pattern: required, no default value, type string; the regular expression to match against.

what: required, no default value; must be either previous (merge with the preceding lines) or next (merge with the following lines).
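To see how the three options work together: when negate is false (the default), lines that match pattern are merged in the direction given by what; when negate is true, it is the lines that do not match that are merged. A common illustration (a sketch, not taken from this article's example) is joining indented continuation lines, such as Java stack traces, onto the previous line:

codec => multiline {
  pattern => "^\s"    # a line starting with whitespace...
  what => "previous"  # ...belongs to the previous line (negate keeps its default, false)
}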

Let's look at the following example:

# cat logstash_multiline_shipper.conf
input {
  file {
    path => "/apps/logstash/conf/test/c.out"
    type => "runtimelog"
    codec => multiline {
      pattern => "^\["        # a line starting with "[" begins a new event
      negate => true          # merge lines that do NOT match the pattern...
      what => "previous"      # ...into the previous event
    }
    start_position => "beginning"                         # read the file from the start on first run
    sincedb_path => "/apps/logstash/logs/sincedb-access"  # where the read offset is recorded
    ignore_older =>                                       # skip files last modified more than this many seconds ago
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

Note: the pattern matches lines beginning with "["; any line that does not match is assumed to belong to the previous line.
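For quick experiments, the same codec settings can also be attached to a stdin input instead of a file (a minimal sketch; paste a few log lines into the terminal and the merged events are printed by the rubydebug codec):

input {
  stdin {
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
  }
}
output {
  stdout { codec => rubydebug }
}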

The test data is as follows:

[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS asc
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS desc
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS asc
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.

Start Logstash:

# ./../bin/logstash -f logstash_multiline_shipper.conf
Sending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties
[--09T15::,][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>, "pipeline.batch.size"=>, "pipeline.batch.delay"=>, "pipeline.max_inflight"=>}
[--09T15::,][INFO ][logstash.pipeline ] Pipeline main started
[--09T15::,][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>}

After appending the test data to the monitored log file, check the output:

# ./../bin/logstash -f logstash_multiline_shipper.conf
Sending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties
[--09T15::,][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>, "pipeline.batch.size"=>, "pipeline.batch.delay"=>, "pipeline.max_inflight"=>}
[--09T15::,][INFO ][logstash.pipeline ] Pipeline main started
[--09T15::,][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => --09T07::.403Z,
      "@version" => "",
          "host" => "ofs1",
       "message" => "# ./../bin/logstash -f logstash_multiline_shipper.conf \nSending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties",
          "type" => "runtimelog",
          "tags" => [
        [] "multiline"
    ]
}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => --09T07::.409Z,
      "@version" => "",
          "host" => "ofs1",
       "message" => "[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {\"id\"=>\"main\", \"pipeline.workers\"=>4, \"pipeline.batch.size\"=>125, \"pipeline.batch.delay\"=>5, \"pipeline.max_inflight\"=>500}",
          "type" => "runtimelog",
          "tags" => []
}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => --09T07::.410Z,
      "@version" => "",
          "host" => "ofs1",
       "message" => "[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started",
          "type" => "runtimelog",
          "tags" => []
}
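With consecutive lines merged into a single event, splitting the message into fields is straightforward, for example with a grok filter. The sketch below uses a hypothetical pattern for log lines of the form "[timestamp LEVEL] class:- text"; the exact timestamp layout is elided in the sample data above, so the pattern will need adjusting to the real log4j layout:

filter {
  grok {
    # hypothetical pattern; (?m) lets GREEDYDATA capture across the merged lines
    match => {
      "message" => "(?m)\[%{DATA:timestamp} %{LOGLEVEL:level}\] %{JAVACLASS:class}:- %{GREEDYDATA:msg}"
    }
  }
}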
