The following is all taken from the official documentation:

1. The version used here is Flume 1.8.

Check the version: bin/flume-ng version

2. Configuration and startup

https://flume.apache.org/FlumeUserGuide.html#configuration

Defining the flow

# list the sources, sinks and channels for the agent
<Agent>.sources = <Source>
<Agent>.sinks = <Sink>
<Agent>.channels = <Channel1> <Channel2>

# set channel for source
<Agent>.sources.<Source>.channels = <Channel1> <Channel2> ...

# set channel for sink
<Agent>.sinks.<Sink>.channel = <Channel1>

For example:

# list the sources, sinks and channels for the agent
agent_foo.sources = avro-appserver-src-1
agent_foo.sinks = hdfs-sink-1
agent_foo.channels = mem-channel-1

# set channel for source
agent_foo.sources.avro-appserver-src-1.channels = mem-channel-1

# set channel for sink
agent_foo.sinks.hdfs-sink-1.channel = mem-channel-1
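To see how all of these pieces fit together end to end, here is the single-node example from the user guide's quick start (agent a1 with a netcat source, a memory channel, and a logger sink); the file name example.conf and the port are just the guide's conventions:

# example.conf: single-node configuration (netcat source -> memory channel -> logger sink)
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# describe the sink
a1.sinks.k1.type = logger

# use a channel that buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1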

Using environment variables in configuration files

Flume has the ability to substitute environment variables in the configuration. For example:

a1.sources = r1
a1.sources.r1.type = netcat
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = ${NC_PORT}
a1.sources.r1.channels = c1

NB: this currently works for values only, not for keys (i.e. only on the right-hand side of the = sign in the config lines).

This can be enabled via Java system properties on agent invocation by setting propertiesImplementation = org.apache.flume.node.EnvVarResolverProperties.

For example:

$ NC_PORT=44444 bin/flume-ng agent --conf conf --conf-file example.conf --name a1 -Dflume.root.logger=INFO,console -DpropertiesImplementation=org.apache.flume.node.EnvVarResolverProperties

Note the above is just an example, environment variables can be configured in other ways, including being set in conf/flume-env.sh.
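As one hedged sketch of the flume-env.sh route (JAVA_OPTS is the standard hook in conf/flume-env.sh; the port value simply mirrors the example above):

# conf/flume-env.sh (sketch; values are illustrative only)
export NC_PORT=44444
export JAVA_OPTS="$JAVA_OPTS -DpropertiesImplementation=org.apache.flume.node.EnvVarResolverProperties"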

Configuring individual components

# properties for sources
<Agent>.sources.<Source>.<someProperty> = <someValue>

# properties for channels
<Agent>.channels.<Channel>.<someProperty> = <someValue>

# properties for sinks
<Agent>.sinks.<Sink>.<someProperty> = <someValue>
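As a concrete, hedged illustration (component names reuse the earlier agent_foo example; the bind address and HDFS path are placeholders, not from the source), filled-in component properties look like this:

# properties of the avro source
agent_foo.sources.avro-appserver-src-1.type = avro
agent_foo.sources.avro-appserver-src-1.bind = localhost
agent_foo.sources.avro-appserver-src-1.port = 10000

# properties of the memory channel
agent_foo.channels.mem-channel-1.type = memory
agent_foo.channels.mem-channel-1.capacity = 1000
agent_foo.channels.mem-channel-1.transactionCapacity = 100

# properties of the HDFS sink
agent_foo.sinks.hdfs-sink-1.type = hdfs
agent_foo.sinks.hdfs-sink-1.hdfs.path = hdfs://namenode/flume/webdata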

Adding multiple flows in an agent


# list the sources, sinks and channels for the agent
<Agent>.sources = <Source1> <Source2>
<Agent>.sinks = <Sink1> <Sink2>
<Agent>.channels = <Channel1> <Channel2>


# list the sources, sinks and channels in the agent
agent_foo.sources = avro-AppSrv-source1 exec-tail-source2
agent_foo.sinks = hdfs-Cluster1-sink1 avro-forward-sink2
agent_foo.channels = mem-channel-1 file-channel-2

# flow #1 configuration
agent_foo.sources.avro-AppSrv-source1.channels = mem-channel-1
agent_foo.sinks.hdfs-Cluster1-sink1.channel = mem-channel-1

# flow #2 configuration
agent_foo.sources.exec-tail-source2.channels = file-channel-2
agent_foo.sinks.avro-forward-sink2.channel = file-channel-2
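The snippet above only wires the topology; each component still needs its own properties. A minimal hedged sketch for flow #2 (the tailed file path is purely illustrative, and the hostname/port mirror the multi-agent example below):

# flow #2 component properties (sketch)
agent_foo.sources.exec-tail-source2.type = exec
agent_foo.sources.exec-tail-source2.command = tail -F /var/log/app/app.log

agent_foo.channels.file-channel-2.type = file

agent_foo.sinks.avro-forward-sink2.type = avro
agent_foo.sinks.avro-forward-sink2.hostname = 10.1.1.100
agent_foo.sinks.avro-forward-sink2.port = 10000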

Configuring a multi-agent flow


Weblog agent config:


# list sources, sinks and channels in the agent
agent_foo.sources = avro-AppSrv-source
agent_foo.sinks = avro-forward-sink
agent_foo.channels = file-channel

# define the flow
agent_foo.sources.avro-AppSrv-source.channels = file-channel
agent_foo.sinks.avro-forward-sink.channel = file-channel

# avro sink properties
agent_foo.sinks.avro-forward-sink.type = avro
agent_foo.sinks.avro-forward-sink.hostname = 10.1.1.100
agent_foo.sinks.avro-forward-sink.port = 10000

# configure other pieces
#...

HDFS agent config:


# list sources, sinks and channels in the agent
agent_foo.sources = avro-collection-source
agent_foo.sinks = hdfs-sink
agent_foo.channels = mem-channel

# define the flow
agent_foo.sources.avro-collection-source.channels = mem-channel
agent_foo.sinks.hdfs-sink.channel = mem-channel

# avro source properties
agent_foo.sources.avro-collection-source.type = avro
agent_foo.sources.avro-collection-source.bind = 10.1.1.100
agent_foo.sources.avro-collection-source.port = 10000

# configure other pieces
#...
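Both agents leave their #... sections open; as one hedged illustration (the HDFS path and roll settings are assumptions, not from the guide), the collector's remaining pieces could be filled in roughly like this:

# hdfs-sink properties (sketch; path and roll settings are illustrative)
agent_foo.sinks.hdfs-sink.type = hdfs
agent_foo.sinks.hdfs-sink.hdfs.path = hdfs://namenode/flume/weblogs/%Y-%m-%d
agent_foo.sinks.hdfs-sink.hdfs.fileType = DataStream
agent_foo.sinks.hdfs-sink.hdfs.rollInterval = 300
agent_foo.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true

# the memory channel also needs a type
agent_foo.channels.mem-channel.type = memory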

Fan out flow


# List the sources, sinks and channels for the agent
<Agent>.sources = <Source1>
<Agent>.sinks = <Sink1> <Sink2>
<Agent>.channels = <Channel1> <Channel2>

# set list of channels for source (separated by space)
<Agent>.sources.<Source1>.channels = <Channel1> <Channel2>

# set channel for sinks
<Agent>.sinks.<Sink1>.channel = <Channel1>
<Agent>.sinks.<Sink2>.channel = <Channel2>

<Agent>.sources.<Source1>.selector.type = replicating

# Mapping for multiplexing selector
<Agent>.sources.<Source1>.selector.type = multiplexing
<Agent>.sources.<Source1>.selector.header = <someHeader>
<Agent>.sources.<Source1>.selector.mapping.<Value1> = <Channel1>
<Agent>.sources.<Source1>.selector.mapping.<Value2> = <Channel1> <Channel2>
<Agent>.sources.<Source1>.selector.mapping.<Value3> = <Channel2>
#...

<Agent>.sources.<Source1>.selector.default = <Channel2>
# list the sources, sinks and channels in the agent
agent_foo.sources = avro-AppSrv-source1
agent_foo.sinks = hdfs-Cluster1-sink1 avro-forward-sink2
agent_foo.channels = mem-channel-1 file-channel-2

# set channels for source
agent_foo.sources.avro-AppSrv-source1.channels = mem-channel-1 file-channel-2

# set channel for sinks
agent_foo.sinks.hdfs-Cluster1-sink1.channel = mem-channel-1
agent_foo.sinks.avro-forward-sink2.channel = file-channel-2

# channel selector configuration
agent_foo.sources.avro-AppSrv-source1.selector.type = multiplexing
agent_foo.sources.avro-AppSrv-source1.selector.header = State
agent_foo.sources.avro-AppSrv-source1.selector.mapping.CA = mem-channel-1
agent_foo.sources.avro-AppSrv-source1.selector.mapping.AZ = file-channel-2
agent_foo.sources.avro-AppSrv-source1.selector.mapping.NY = mem-channel-1 file-channel-2
agent_foo.sources.avro-AppSrv-source1.selector.default = mem-channel-1
The selector can also mark channels as optional:

# channel selector configuration (with optional channels)
agent_foo.sources.avro-AppSrv-source1.selector.type = multiplexing
agent_foo.sources.avro-AppSrv-source1.selector.header = State
agent_foo.sources.avro-AppSrv-source1.selector.mapping.CA = mem-channel-1
agent_foo.sources.avro-AppSrv-source1.selector.mapping.AZ = file-channel-2
agent_foo.sources.avro-AppSrv-source1.selector.mapping.NY = mem-channel-1 file-channel-2
agent_foo.sources.avro-AppSrv-source1.selector.optional.CA = mem-channel-1 file-channel-2
agent_foo.sources.avro-AppSrv-source1.selector.mapping.AZ = file-channel-2
agent_foo.sources.avro-AppSrv-source1.selector.default = mem-channel-1
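The examples above all use the multiplexing selector. For the replicating case (also the default when no selector is configured), every event is written to every channel listed for the source; a hedged sketch reusing the same names:

# replicating fan-out: each event goes to both channels
agent_foo.sources.avro-AppSrv-source1.selector.type = replicating
agent_foo.sources.avro-AppSrv-source1.channels = mem-channel-1 file-channel-2
agent_foo.sinks.hdfs-Cluster1-sink1.channel = mem-channel-1
agent_foo.sinks.avro-forward-sink2.channel = file-channel-2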
 

Network streams


Flume supports the following mechanisms to read data from popular log stream types, such as:


  1. Avro
  2. Thrift
  3. Syslog
  4. Netcat
  5. exec, e.g.:

     a1.sources.r1.type = exec
     a1.sources.r1.command = tail -f /opt/xfs/logs/tomcat/xfs-cs/logs/xfs_cs_41

  6. seq
  7. spool (spooling directory): not explored here yet; a hedged sketch follows this list
  8. http, e.g.:

     a1.sources.r1.type = org.apache.flume.source.http.HTTPSource
     a1.sources.r1.port = 5142

  9. tcp / syslog, e.g.:

     a1.sources.r1.type = syslogtcp
     a1.sources.r1.port = 5140
     # send a test log: echo "hello idoall.org syslog" | nc localhost 5140
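Since the spooling directory source is flagged above as not yet explored, here is a hedged sketch of its typical configuration (the directory path is an assumption; spoolDir is the key property, and completed files are renamed or optionally deleted after ingestion):

# spooling directory source (sketch; /var/log/flume-spool is illustrative)
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /var/log/flume-spool
a1.sources.r1.fileHeader = true
a1.sources.r1.channels = c1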


 
bin/flume-ng agent --conf conf --conf-file example.conf --name a1 -Dflume.root.logger=INFO,console   # --name is the agent name defined in the configuration file
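A quick way to verify the agent is receiving events, assuming the netcat source on port 44444 from example.conf above (the logger sink should then print the event to the console):

$ echo "hello flume" | nc localhost 44444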
