[Repost] Welcome to the di-kafkameter wiki!
https://github.com/rollno748/di-kafkameter/wiki#producer-elements
Introduction
DI-Kafkameter is a JMeter plugin that allows you to test and measure the performance of Apache Kafka.
Components
DI-Kafkameter comprises two components:
- Producer Component
  - Kafka Producer Config
  - Kafka Producer Sampler
- Consumer Component
  - Kafka Consumer Config
  - Kafka Consumer Sampler
Producer Component (Publish a Message to a Topic)
To publish/send a message to a Kafka topic, you need to add the producer components to the test plan.
- The Kafka Producer Config holds the connection information, including security settings and other properties required to talk to the broker.
- The Kafka Producer Sampler sends messages to the topic over the connection established by the Config element.

Right click on Test Plan -> Add -> Config Element -> Kafka Producer Config
Provide a Variable name to export the connection object (this will be referenced in the Sampler element)
Provide the Kafka connection configs (a comma-separated list of brokers)
Provide a Client ID (make it unique, to identify where you are sending the message from)
Select the right security option to connect to the brokers (this depends entirely on how Kafka security is configured)
For JAAS security, you need to add the below key and value to the Additional Properties
Config key: sasl.jaas.config
Config value: org.apache.kafka.common.security.scram.ScramLoginModule required username="<USERNAME>" password="<PASSWORD>";
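If you template that JAAS line, getting the quoting and the trailing semicolon right is the usual stumbling block. A minimal sketch of assembling the value (the helper name and the example credentials are hypothetical, not part of the plugin):

```python
# Minimal sketch: assemble the sasl.jaas.config value for the Additional
# Properties field. The helper name and credentials here are hypothetical.
def scram_jaas_config(username: str, password: str) -> str:
    """Build a SCRAM JAAS config line; note the quoted values and trailing ';'."""
    return (
        'org.apache.kafka.common.security.scram.ScramLoginModule required '
        f'username="{username}" password="{password}";'
    )

if __name__ == "__main__":
    print(scram_jaas_config("alice", "s3cret"))
```

In JMeter you would typically feed the username and password in via properties or variables rather than hard-coding them.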

Right click on Test Plan -> Add -> Sampler -> Kafka Producer Sampler
Use the same Variable name that was defined in the config element
Define the topic name to which you want to send the message (case sensitive)
Kafka Message - the actual message to be pushed to the topic
Partition String (Optional) - lets you post messages to a particular partition by providing the partition number
Message Headers (Optional) - adds headers to the messages being pushed (more than one header is supported)
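Conceptually, the sampler assembles a producer record from these fields before sending it. A minimal sketch of that shape (the function and field names are illustrative, not the plugin's internal API):

```python
# Illustrative only: models the fields the Kafka Producer Sampler collects.
# The record shape and names are made up for this sketch.
def build_producer_record(topic, message, partition=None, headers=None):
    """Assemble a record from the sampler fields; partition and headers are optional."""
    record = {"topic": topic, "value": message}
    if partition is not None:
        record["partition"] = int(partition)  # the Partition String is a number
    if headers:
        record["headers"] = list(headers)     # more than one header is supported
    return record

record = build_producer_record(
    "orders", "order-created", partition="2",
    headers=[("trace-id", "abc123"), ("source", "jmeter")],
)
```

Leaving partition unset lets the broker-side partitioner choose the partition, which is the usual case in load tests.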
Consumer Component (Read a Message from a Topic)
To consume/read a message from a Kafka topic, you need to add the consumer components to the test plan.
- The Kafka Consumer Config holds the connection information, including security settings and other properties required to talk to the broker.
- The Kafka Consumer Sampler reads messages from the topic over the connection established by the Config element.

Right click on Test Plan -> Add -> Config Element -> Kafka Consumer Config
Provide a Variable name to export the connection object (this will be referenced in the Sampler element)
Provide the Kafka connection configs (a comma-separated list of brokers)
Provide a Group ID (make it unique, to define the group your consumer belongs to)
Define the topic name from which you want to read messages (case sensitive)
No Of Messages to Poll - the number of messages to read within a request (defaults to 1)
Auto Commit - sets the offset as read once the message is consumed
Select the right security option to connect to the brokers (this depends entirely on how Kafka security is configured)
For JAAS security, you need to add the below key and value to the Additional Properties
Config key: sasl.jaas.config
Config value: org.apache.kafka.common.security.scram.ScramLoginModule required username="<USERNAME>" password="<PASSWORD>";

Right click on Test Plan -> Add -> Sampler -> Kafka Consumer Sampler
Use the same Variable name that was defined in the config element
Poll timeout - the polling timeout for the consumer to read from the topic (defaults to 100 ms)
Commit Type - defines the commit type (Sync/Async)
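Pulling the consumer options together, a minimal sketch of the defaults described above (the key names are illustrative for this sketch, not the plugin's internals):

```python
# Illustrative defaults for the consumer config/sampler fields described above.
# The key names are invented for this sketch, not taken from the plugin.
def consumer_settings(group_id, topic, **overrides):
    settings = {
        "group.id": group_id,
        "topic": topic,            # case sensitive
        "messages.to.poll": 1,     # No Of Messages to Poll defaults to 1
        "poll.timeout.ms": 100,    # Poll timeout defaults to 100 ms
        "auto.commit": True,       # marks the offset as read after consuming
        "commit.type": "sync",     # or "async"
    }
    settings.update(overrides)
    return settings

s = consumer_settings("perf-group", "orders", **{"poll.timeout.ms": 500})
```

Raising the poll timeout is the first knob to try if samplers return empty when the topic is sparsely populated.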
Producer Properties
Supported producer properties that can be added to the Additional Properties field.
| Property | Available Options | Default |
|---|---|---|
| acks | [0, 1, -1] | 1 |
| batch.size | positive integer | 16384 |
| bootstrap.servers | comma-separated host:port pairs | localhost:9092 |
| buffer.memory | positive long | 33554432 |
| client.id | string | "" |
| compression.type | [none, gzip, snappy, lz4, zstd] | none |
| connections.max.idle.ms | positive long | 540000 |
| delivery.timeout.ms | positive long | 120000 |
| enable.idempotence | [true, false] | false |
| interceptor.classes | fully-qualified class names | [] |
| key.serializer | fully-qualified class name | org.apache.kafka.common.serialization.StringSerializer |
| linger.ms | non-negative integer | 0 |
| max.block.ms | non-negative long | 60000 |
| max.in.flight.requests.per.connection | positive integer | 5 |
| max.request.size | positive integer | 1048576 |
| metadata.fetch.timeout.ms | positive long | 60000 |
| metadata.max.age.ms | positive long | 300000 |
| partitioner.class | fully-qualified class name | org.apache.kafka.clients.producer.internals.DefaultPartitioner |
| receive.buffer.bytes | positive integer | 32768 |
| reconnect.backoff.ms | non-negative long | 50 |
| request.timeout.ms | positive integer | 30000 |
| retries | non-negative integer | 0 |
| sasl.jaas.config | string | null |
| sasl.kerberos.kinit.cmd | string | /usr/bin/kinit |
| sasl.kerberos.min.time.before.relogin | positive long | 60000 |
| sasl.kerberos.service.name | string | null |
| sasl.mechanism | [GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512] | GSSAPI |
| security.protocol | [PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL] | PLAINTEXT |
| sender.flush.timeout.ms | non-negative long | 0 |
| send.buffer.bytes | positive integer | 131072 |
| value.serializer | fully-qualified class name | org.apache.kafka.common.serialization.StringSerializer |
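Anything you put in Additional Properties overrides the defaults in the table above. As a sketch of that overlay (only a few defaults copied from the table; the merge helper is hypothetical, not part of DI-Kafkameter):

```python
# Hypothetical helper: overlay Additional Properties on a handful of
# producer defaults taken from the table above. Not part of the plugin.
PRODUCER_DEFAULTS = {
    "acks": "1",
    "batch.size": 16384,
    "compression.type": "none",
    "linger.ms": 0,
    "retries": 0,
}

def effective_producer_props(additional=None):
    """Return table defaults with any Additional Properties applied on top."""
    props = dict(PRODUCER_DEFAULTS)
    props.update(additional or {})
    return props

props = effective_producer_props({"compression.type": "zstd", "acks": "-1"})
```

For throughput tests, compression.type and linger.ms are the usual properties worth tuning this way.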
Consumer Properties
Supported consumer properties that can be added to the Additional Properties field.
| Property | Available Options | Default |
|---|---|---|
| auto.commit.interval.ms | positive integer | 5000 |
| auto.offset.reset | [earliest, latest, none] | latest |
| bootstrap.servers | comma-separated host:port pairs | localhost:9092 |
| check.crcs | [true, false] | true |
| client.id | string | "" |
| connections.max.idle.ms | positive long | 540000 |
| enable.auto.commit | [true, false] | true |
| exclude.internal.topics | [true, false] | true |
| fetch.max.bytes | positive long | 52428800 |
| fetch.max.wait.ms | non-negative integer | 500 |
| fetch.min.bytes | non-negative integer | 1 |
| group.id | string | "" |
| heartbeat.interval.ms | positive integer | 3000 |
| interceptor.classes | fully-qualified class names | [] |
| isolation.level | [read_uncommitted, read_committed] | read_uncommitted |
| key.deserializer | fully-qualified class name | org.apache.kafka.common.serialization.StringDeserializer |
| max.partition.fetch.bytes | positive integer | 1048576 |
| max.poll.interval.ms | positive long | 300000 |
| max.poll.records | positive integer | 500 |
| metadata.max.age.ms | positive long | 300000 |
| metadata.fetch.timeout.ms | positive long | 60000 |
| receive.buffer.bytes | positive integer | 32768 |
| reconnect.backoff.ms | non-negative long | 50 |
| request.timeout.ms | positive integer | 30000 |
| retry.backoff.ms | non-negative long | 100 |
| sasl.jaas.config | string | null |
| sasl.kerberos.kinit.cmd | string | /usr/bin/kinit |
| sasl.kerberos.min.time.before.relogin | positive long | 60000 |
| sasl.kerberos.service.name | string | null |
| sasl.mechanism | [GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512] | GSSAPI |
| security.protocol | [PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL] | PLAINTEXT |
| send.buffer.bytes | positive integer | 131072 |
| session.timeout.ms | positive integer | 10000 |
| value.deserializer | fully-qualified class name | org.apache.kafka.common.serialization.StringDeserializer |