Hadoop in Action: Flume Source regex_filter (Part 13)
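This installment demonstrates Flume's regex_filter interceptor. A NetCat source listens on localhost:44444, the i5 interceptor keeps only events whose body matches the configured regular expression, and surviving events flow through a memory channel to a logger sink. The agent configuration (a1) is as follows: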
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444
a1.sources.r1.interceptors = i5
#a1.sources.r1.interceptors.i4.type = timestamp
#a1.sources.r1.interceptors.i2.type = host
#a1.sources.r1.interceptors.i3.type = static
#a1.sources.r1.interceptors.i3.key = datacenter
#a1.sources.r1.interceptors.i3.value = NEW_YORK
#a1.sources.r1.interceptors.i1.type=regex_extractor
#a1.sources.r1.interceptors.i1.regex = (\\d):(\\d):(\\d)
#a1.sources.r1.interceptors.i1.serializers = s1 s2 s3
#a1.sources.r1.interceptors.i1.serializers.s1.name = one
#a1.sources.r1.interceptors.i1.serializers.s2.name = two
#a1.sources.r1.interceptors.i1.serializers.s3.name = three
# regex_filter keeps only events whose body matches the regex below;
# setting a1.sources.r1.interceptors.i5.excludeEvents = true would instead drop matching events
a1.sources.r1.interceptors.i5.type = regex_filter
a1.sources.r1.interceptors.i5.regex = (\\w):(\\w):(\\w)

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
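
To try the filter, one way (a sketch; the config file name and paths are assumptions, not from the original post) is to save the configuration as conf/regex_filter.conf under the Flume home directory and start agent a1 in the foreground:

bin/flume-ng agent --conf conf --conf-file conf/regex_filter.conf --name a1 -Dflume.root.logger=INFO,console

Then, from another terminal, send test lines to the NetCat source:

telnet localhost 44444
a:b:c
hello world

A line such as a:b:c contains a \w:\w:\w pattern, so the event should pass the filter and appear in the logger sink's console output; a line such as hello world has no match and should be dropped by the interceptor.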