Logstash: the multiline Plugin for Matching Multi-line Logs
In this article
- Test Data
- Configuration Options
- Parsing Runtime Logs as Multi-line Events
- Parsing Multi-line Logs into Fields
When processing logs, besides access logs you also have to handle runtime logs, which are mostly written by applications, for example through log4j. The biggest difference between runtime logs and access logs is that runtime logs are multi-line: several consecutive lines together form a single entry.
This article explains how to use the multiline codec to process runtime logs.
Once the lines are grouped into multi-line events, splitting them into fields becomes easy.
Migrated to: http://www.bdata-cap.com/newsinfo/1712113.html
Test Data
[16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[16-04-12 03:40:02 DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS asc
[16-04-12 03:40:03 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[16-04-12 03:40:04 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[16-04-12 03:40:05 DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS desc
[16-04-12 03:40:06 DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS asc
[16-04-12 03:40:07 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
The test data spans 7 seconds (it is fake data, of course). As you can see, the entries at seconds 02, 05 and 06 are multi-line entries containing an SQL statement; the others are single-line.
Configuration Options
For the multiline codec plugin, three settings matter most: negate, pattern, and what. A consolidated sketch follows the list below.
negate
- Type: boolean
- Default: false
- Negates the regexp pattern, i.e. the multiline rule is applied to lines that do not match it.
pattern
- Required setting
- Type: string
- No default value
- The regular expression to match.
what
- Required setting
- Value: previous or next
- No default value
- If the pattern matches, does the event belong to the next or the previous event?
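Putting the three settings together, a minimal codec block might look like the sketch below; the pattern shown here is simply the one used in the concrete examples that follow:
codec => multiline {
    # lines that start with "[" are treated as the beginning of a new event
    pattern => "^\["
    # negate => true: act on lines that do NOT match the pattern above
    negate => true
    # such non-matching lines are appended to the previous event
    what => "previous"
}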
Parsing Runtime Logs as Multi-line Events
Example 1: suppose the configuration file looks like this,
input {
    file {
        path => "/usr/local/elk/logstash/logs/c.out"
        type => "runtimelog"
        codec => multiline {
            pattern => "^\["
            negate => true
            what => "previous"
        }
        start_position => "beginning"
        sincedb_path => "/usr/local/elk/logstash/sincedb-access"
        ignore_older => 0
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
Explanation: lines that start with "[" match the pattern; any line that does not match is treated as part of the previous line.
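To reproduce the result, Logstash can be started with this pipeline definition; the config file name below is an assumed example, and the command is run from the Logstash installation directory:
# "runtimelog.conf" is only an example name; point -f at wherever the config above was saved
bin/logstash -f runtimelog.conf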
The parsed result is shown below; 6 JSON events are produced:
{
"@timestamp" => "2016-06-01T04:37:43.147Z",
"message" => "[16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:37:43.152Z",
"message" => "[16-04-12 03:40:02 DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:37:43.152Z",
"message" => "[16-04-12 03:40:03 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:37:43.155Z",
"message" => "[16-04-12 03:40:04 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:37:43.157Z",
"message" => "[16-04-12 03:40:05 DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS desc",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:37:43.159Z",
"message" => "[16-04-12 03:40:06 DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
Note that the last log entry in the file is not emitted right away; it is only flushed once another entry is appended after it.
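If waiting for the next entry is not acceptable, newer versions of the logstash-codec-multiline plugin provide an auto_flush_interval setting that flushes a pending event after a period of inactivity. A sketch, assuming your installed codec version supports this option:
codec => multiline {
    pattern => "^\["
    negate => true
    what => "previous"
    # assumption: requires a logstash-codec-multiline version that supports auto_flush_interval;
    # flush the buffered event if no new line arrives within 5 seconds
    auto_flush_interval => 5
}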
Example 2: if the configuration file is changed to,
input {
    file {
        path => "/usr/local/elk/logstash/logs/c.out"
        type => "runtimelog"
        codec => multiline {
            pattern => "^\["
            negate => true
            what => "next"
        }
        start_position => "beginning"
        sincedb_path => "/usr/local/elk/logstash/sincedb-access"
        ignore_older => 0
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
This time 7 JSON events are produced. With what => "next", a line that does not match the pattern is attached to the event that follows it, so the SQL continuation lines get glued to the next log entry instead of the one they belong to:
{
"@timestamp" => "2016-06-01T04:40:43.232Z",
"message" => "[16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:40:43.237Z",
"message" => "[16-04-12 03:40:02 DEBUG] impl.JdbcEntityInserter:- from product_category product_category",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:40:43.238Z",
"message" => "where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc\n[16-04-12 03:40:03 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:40:43.239Z",
"message" => "[16-04-12 03:40:04 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:40:43.244Z",
"message" => "[16-04-12 03:40:05 DEBUG] impl.JdbcEntityInserter:- from product_category product_category",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:40:43.245Z",
"message" => "where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS desc\n[16-04-12 03:40:06 DEBUG] impl.JdbcEntityInserter:- from product_category product_category",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T04:40:43.249Z",
"message" => "where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc\n[16-04-12 03:40:07 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
Example 3: if the codec is changed to,
codec => multiline {
    pattern => "^\["
    negate => false
    what => "previous"
}
Then the result is as follows. With negate => false, the lines that do match "^\[" are appended to the previous event, so header lines get merged into the wrong entries and the grouping is scrambled:
{
"@timestamp" => "2016-06-01T05:38:50.853Z",
"message" => "[16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.\n[16-04-12 03:40:02 DEBUG] impl.JdbcEntityInserter:- from product_category product_category",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T05:38:50.856Z",
"message" => "where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T05:38:50.858Z",
"message" => "order by product_category.ORDERS asc\n[16-04-12 03:40:03 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.\n[16-04-12 03:40:04 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.\n[16-04-12 03:40:05 DEBUG] impl.JdbcEntityInserter:- from product_category product_category",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T05:38:50.860Z",
"message" => "where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T05:38:50.861Z",
"message" => "order by product_category.ORDERS desc\n[16-04-12 03:40:06 DEBUG] impl.JdbcEntityInserter:- from product_category product_category",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
{
"@timestamp" => "2016-06-01T05:38:50.863Z",
"message" => "where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog"
}
Parsing Multi-line Logs into Fields
The configuration file is as follows:
input {
    file {
        path => "/usr/local/elk/logstash/logs/c.out"
        type => "runtimelog"
        codec => multiline {
            pattern => "^\["
            negate => true
            what => "previous"
        }
        start_position => "beginning"
        sincedb_path => "/usr/local/elk/logstash/sincedb-access"
        ignore_older => 0
    }
}
filter {
    grok {
        match => ["message", "\[%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}\] %{GREEDYDATA:msg}"]
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
The parsed result:
{
"@timestamp" => "2016-06-01T06:33:26.426Z",
"message" => "[16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog",
"timestamp" => "16-04-12 03:40:01",
"level" => "DEBUG",
"msg" => "model.MappingNode:- ['/store/shopclass'] matched over."
}
{
"@timestamp" => "2016-06-01T06:33:26.485Z",
"message" => "[16-04-12 03:40:02 DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog",
"timestamp" => "16-04-12 03:40:02",
"level" => "DEBUG",
"msg" => "impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc"
}
{
"@timestamp" => "2016-06-01T06:33:26.491Z",
"message" => "[16-04-12 03:40:03 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog",
"timestamp" => "16-04-12 03:40:03",
"level" => "DEBUG",
"msg" => "model.MappingNode:- ['/store/shopclass'] matched over."
}
{
"@timestamp" => "2016-06-01T06:33:26.492Z",
"message" => "[16-04-12 03:40:04 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.",
"@version" => "1",
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog",
"timestamp" => "16-04-12 03:40:04",
"level" => "DEBUG",
"msg" => "model.MappingNode:- ['/store/shopclass'] matched over."
}
{
"@timestamp" => "2016-06-01T06:33:26.494Z",
"message" => "[16-04-12 03:40:05 DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS desc",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog",
"timestamp" => "16-04-12 03:40:05",
"level" => "DEBUG",
"msg" => "impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS desc"
}
{
"@timestamp" => "2016-06-01T06:33:26.495Z",
"message" => "[16-04-12 03:40:06 DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc",
"@version" => "1",
"tags" => [
[0] "multiline"
],
"path" => "/usr/local/elk/logstash/logs/c.out",
"host" => "vcyber",
"type" => "runtimelog",
"timestamp" => "16-04-12 03:40:06",
"level" => "DEBUG",
"msg" => "impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null\norder by product_category.ORDERS asc"
}
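As a follow-up, if @timestamp should reflect the time parsed from the log line rather than the ingestion time, a date filter can be appended after grok. This is only a sketch, assuming the log always uses the two-digit-year layout "16-04-12 03:40:01":
filter {
    grok {
        match => ["message", "\[%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}\] %{GREEDYDATA:msg}"]
    }
    date {
        # assumed layout: two-digit year, e.g. "16-04-12 03:40:01"
        match => ["timestamp", "yy-MM-dd HH:mm:ss"]
    }
}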