Reposted from: https://iwringer.wordpress.com/2013/08/07/understanding-complex-event-processing-cep-operators-with-wso2-cep-siddhi/

The CEP model has many sensors. A sensor can be a physical sensor (e.g. a temperature sensor), an agent, or any system that supports instrumentation. Each sensor sends events to the CEP engine, and each event has several name-value properties.

We call the events coming from the same sensor a “stream” and give it a name. When an interesting event occurs, the sensor sends it to the stream.

To use a stream, you first need to define it.

 define stream PizzaOrders (id string, price float, distance double, ts long, custid string)

The CEP engine listens to one or more streams, and we can write queries telling it to look for certain conditions. To write queries, you can use the following constructs.

  1. Filters
  2. Windows
  3. Joins
  4. Patterns and Sequences
  5. Event tables
  6. Partitions

Let us see what we can do with each construct.

Filters
A filter checks a condition on the properties of an event. The condition can use =, >, <, etc., and you can create complex queries by combining multiple conditions with and, or, not, etc.

The following query detects pizza orders that are small and placed too far from the store.

 from PizzaOrders[price < 20 and distance > 1km]
insert into NBNOrders id, price, distance
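
As noted above, conditions can also be combined with or and not. The following sketch along the same lines (the thresholds and the OutlierOrders output stream are made up for illustration) flags implausibly cheap or expensive orders:

 from PizzaOrders[price < 1.0 or price > 100.0]
insert into OutlierOrders id, price, custid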

Windows

An event stream can have an infinite number of events. Windows are a way to select a subset of those events for further processing. You can select events in many ways: events that arrived within a time period, the last N events, etc.

The output of a window is a set of events. You can use it for further processing (e.g. joining event streams) or to calculate aggregate functions like sum and average.

The output can be triggered either when all the events in the window have been collected or whenever a new event is added. We call the first type batch windows and the second sliding windows.

For example, a window can collect all the pizza orders placed within the last hour and emit their average value.

 from PizzaOrders#window.time(1h) insert into HourlyOrderStats avg(price) as avgPrice
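
The query above uses a sliding time window, so the average is recomputed as events enter and leave the window. To emit exactly one average per hour instead, a batch window can be used; a minimal sketch, assuming a timeBatch window type and keeping the shorthand time notation used above:

 from PizzaOrders#window.timeBatch(1h) insert into HourlyOrderStats avg(price) as avgPrice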

Joins
The join operator joins two event streams. The idea is to match events coming from the two streams and create a new event stream.

For example, you can use the join operator to join the PizzaDelivery stream and the PizzaOrder stream and calculate the time taken to deliver each order.

 from PizzaOrder#window.time(1h) as o join PizzaDelivery as d
on o.id == d.id
insert into DeliveryTime o.id as id, d.ts - o.ts as ts

At least one side of the join must have a window. In the example above, we use a one-hour window on PizzaOrder (because delivery always happens after the order): the join stores the events arriving on PizzaOrder for one hour and matches them against delivery events. If both sides have windows, the join stores events from each stream and matches them against events arriving on the other stream.
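
For instance, a two-window variant of the query above (a sketch that simply adds a one-hour window on the delivery side as well) keeps the last hour of events from both streams and matches them in both directions:

 from PizzaOrder#window.time(1h) as o join PizzaDelivery#window.time(1h) as d
on o.id == d.id
insert into DeliveryTime o.id as id, d.ts - o.ts as ts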

Patterns and Sequences 

Patterns and sequences let us match conditions that happen over time.

For example, we can use patterns to identify returning customers with the following query. Here -> denotes a “followed by” relationship.

 from every a1 = PizzaOrders
-> a2 = PizzaOrders[custid == a1.custid]
insert into ReturningCustomers
a1.custid as custid, a2.ts as ts

Patterns match even when there are other events in between the two matching conditions. Sequences are similar, but the given sequence of conditions must exactly match the events that happened. For example, the following is the same query implemented using a sequence. Note that the second line is there to skip over any non-matching events.

 from every a1 = PizzaOrders,
PizzaOrders[custid != a1.custid]*,
a2 = PizzaOrders[custid == a1.custid]
insert into ReturningCustomers
a1.custid as custid, a2.ts as ts

Here, instead of the -> relationship, we use a regular-expression-like notation to define a sequence of conditions.

Partitions (available in upcoming 3.0 release)

Siddhi normally evaluates a query against all the events in the event streams used by that query. Partitions let us split events into several groups, based on some condition, before evaluating the queries.

For example, let’s say we need to find the time spent until each pizza leaves the shop and until it is delivered. We can first partition the pizza orders by order ID and then evaluate the query. This simplifies the query to a great extent.

define partition orderPartition by PizzaOrders.id, PizzaDone.oid, PizzaDelivered.oid
from PizzaOrders as o -> PizzaDone as p -> PizzaDelivered as d
insert into OrderTimes (p.ts - o.ts) as time2Prepare, (d.ts - p.ts) as time2Delivery
partition by orderPartition

We do this for several reasons.

  1. Evaluating events separately within several partitions can be faster than matching them all together, because events only need to be matched against other events within the same partition.
  2. Sometimes it makes queries easier to design. For example, in the above query, partitioning lets us write the query without worrying about other orders that overlap with the same order.
  3. Partitions let the CEP runtime distribute evaluation across multiple machines, which helps when scaling queries.

Event Tables (available in upcoming 3.0 release)

Event tables let you remember some events and use them later. You can define a table just like a stream.
 define table LatePizzaOrdersTable (orderid string, custid string, ts long, price float);
 

Then you can add events to it, delete events from it, and join those events in the table against incoming events.

For example, let’s say we want to store all late deliveries, and if a late delivery happens to the same customer twice, we want to give them a free pizza.

 from LatePizzaDeliveries insert into LatePizzaOrdersTable;

Then we can join the events in the event table with incoming events as follows.

from LatePizzaDeliveries as l join LatePizzaOrdersTable as t
on l.custid == t.custid and l.ts != t.ts
insert into FreePizzaOrders
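
The queries above cover adding events to the table and joining against it. Deleting uses a similar construct; the following is a rough sketch, assuming the delete syntax of later Siddhi releases and a hypothetical RedeemedFreePizzas stream that fires once a customer has received their free pizza:

 from RedeemedFreePizzas delete LatePizzaOrdersTable on LatePizzaOrdersTable.custid == custid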

You can also do the same using an event stream. However, event tables can be persisted to disk, which makes them very useful for long-running use cases. For example, if we did the above with an event stream, the stored values would be lost when the server restarts, whereas the values in an event table are preserved on disk.

Update September 2017: You can try out the above queries with WSO2 Stream Processor, which is freely available under the Apache License 2.0.

Update January 2018: You can find a detailed discussion of these operators in Stream Processing 101: From SQL to Streaming SQL in 10 Minutes.
