https://github.com/rollno748/di-kafkameter/wiki#producer-elements

Introduction

DI-Kafkameter is a JMeter plugin that allows you to test and measure the performance of Apache Kafka.

Components

DI-Kafkameter comprises two components:

  • Producer Component
    1. Kafka Producer Config
    2. Kafka Producer Sampler
  • Consumer Component
    1. Kafka Consumer Config
    2. Kafka Consumer Sampler

Producer Component (Publish a Message to a Topic)

To publish/send a message to a Kafka topic you need to add producer components to the test plan.

  • The Kafka Producer Config holds the connection information, including security and other properties required to talk to the broker.
  • The Kafka Producer Sampler sends messages to the topic using the connection established by the Config element.

Right-click on Test Plan -> Add -> Config Element -> Kafka Producer Config

Provide a Variable name to export the connection object (which will be used in the Sampler element)

Provide the Kafka connection configs (comma-separated list of brokers)

Provide a Client ID (make it unique, to identify where the message is sent from)

Select the right security to connect to the brokers (this depends entirely on how Kafka security is configured)

For JAAS security, add the key and value below to the Additional Properties:
Config key: sasl.jaas.config
Config value: org.apache.kafka.common.security.scram.ScramLoginModule required username="<USERNAME>" password="<PASSWORD>";
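
Under the hood, the Config element builds a standard Kafka producer from these fields. Below is a minimal sketch of the rough equivalent in plain Java, assuming hypothetical broker addresses, client ID, and SCRAM credentials; the exact properties the plugin sets may differ.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProducerConfigSketch {
        public static KafkaProducer<String, String> buildProducer() {
            Properties props = new Properties();
            // Broker list: comma-separated, as in the Config element (placeholder hosts)
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092,broker2:9092");
            // Unique Client ID identifying where the message is sent from
            props.put(ProducerConfig.CLIENT_ID_CONFIG, "jmeter-load-test");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // JAAS/SCRAM security, mirroring the Additional Properties shown above
            props.put("security.protocol", "SASL_PLAINTEXT");
            props.put("sasl.mechanism", "SCRAM-SHA-256");
            props.put("sasl.jaas.config",
                    "org.apache.kafka.common.security.scram.ScramLoginModule required "
                    + "username=\"<USERNAME>\" password=\"<PASSWORD>\";");
            return new KafkaProducer<>(props);
        }
    }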

Right-click on Test Plan -> Add -> Sampler -> Kafka Producer Sampler

Use the same Variable name that was defined in the Config element

Define the topic name to which you want to send the message (case-sensitive)

Kafka Message - The message body that needs to be pushed to the topic

Partition String (Optional) - Posts messages to a particular partition, specified by partition number

Message Headers (Optional) - Adds headers to the messages being pushed (supports more than one header)
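
Taken together, the sampler fields map onto a plain Kafka ProducerRecord roughly as sketched below; the topic name, partition number, message body, and header are hypothetical examples, not plugin defaults.

    import java.nio.charset.StandardCharsets;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class ProducerSamplerSketch {
        public static void send(KafkaProducer<String, String> producer) {
            // Topic (case-sensitive), optional partition number, no key, and the message body
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders.incoming", 0, null, "{\"orderId\": 42}");
            // Message headers: more than one can be added
            record.headers().add("source", "jmeter".getBytes(StandardCharsets.UTF_8));
            producer.send(record);
            producer.flush(); // ensure the message actually leaves the client
        }
    }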

Consumer Component (Read a Message from a Topic)

To consume/read a message from a Kafka topic you need to add consumer components to the test plan.

  • The Kafka Consumer Config holds the connection information, including security and other properties required to talk to the broker.
  • The Kafka Consumer Sampler reads messages from the topic using the connection established by the Config element.

Right-click on Test Plan -> Add -> Config Element -> Kafka Consumer Config

Provide a Variable name to export the connection object (which will be used in the Sampler element)

Provide the Kafka connection configs (comma-separated list of brokers)

Provide a Group ID (make it unique, to define the consumer group your consumer belongs to)

Define the topic name from which you want to read messages (case-sensitive)

No Of Messages to Poll - The number of messages to read within a request (defaults to 1)

Auto Commit - Sets the offset as read once the message is consumed

Select the right security to connect to the brokers (this depends entirely on how Kafka security is configured)

For JAAS security, add the key and value below to the Additional Properties:
Config key: sasl.jaas.config
Config value: org.apache.kafka.common.security.scram.ScramLoginModule required username="<USERNAME>" password="<PASSWORD>";
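
As on the producer side, these fields correspond to standard Kafka consumer properties. A minimal sketch in plain Java, assuming hypothetical broker, group, and topic names:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ConsumerConfigSketch {
        public static KafkaConsumer<String, String> buildConsumer() {
            Properties props = new Properties();
            // Broker list: comma-separated, as in the Config element (placeholder hosts)
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092,broker2:9092");
            // Unique Group ID defining the consumer group this consumer belongs to
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "jmeter-consumer-group");
            // Auto Commit: sets the offset as read once the message is consumed
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            // JAAS/SCRAM security, mirroring the Additional Properties shown above
            props.put("security.protocol", "SASL_PLAINTEXT");
            props.put("sasl.mechanism", "SCRAM-SHA-256");
            props.put("sasl.jaas.config",
                    "org.apache.kafka.common.security.scram.ScramLoginModule required "
                    + "username=\"<USERNAME>\" password=\"<PASSWORD>\";");
            KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
            // Topic name is case-sensitive
            consumer.subscribe(Collections.singletonList("orders.incoming"));
            return consumer;
        }
    }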

Right-click on Test Plan -> Add -> Sampler -> Kafka Consumer Sampler

Use the same Variable name that was defined in the Config element

Poll timeout - Sets the polling timeout for the consumer to read from the topic (defaults to 100 ms)

Commit Type - Defines the commit type (Sync/Async)
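
In plain Kafka client terms, a single sampler request behaves roughly like the poll-and-commit sketch below; the 100 ms timeout matches the documented default, and the final call switches between commitSync() and commitAsync() depending on the Commit Type selected.

    import java.time.Duration;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class ConsumerSamplerSketch {
        public static void pollOnce(KafkaConsumer<String, String> consumer) {
            // Poll timeout: how long to wait for records (defaults to 100 ms)
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
            }
            // Commit Type: Sync blocks until the offset is stored; commitAsync() returns immediately
            consumer.commitSync();
        }
    }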

Producer Properties

Supported producer properties that can be added to the Additional Properties field:

Property | Available Options | Default
--- | --- | ---
acks | [0, 1, -1] | 1
batch.size | positive integer | 16384
bootstrap.servers | comma-separated host:port pairs | localhost:9092
buffer.memory | positive long | 33554432
client.id | string | ""
compression.type | [none, gzip, snappy, lz4, zstd] | none
connections.max.idle.ms | positive long | 540000
delivery.timeout.ms | positive long | 120000
enable.idempotence | [true, false] | false
interceptor.classes | fully-qualified class names | []
key.serializer | fully-qualified class name | org.apache.kafka.common.serialization.StringSerializer
linger.ms | non-negative integer | 0
max.block.ms | non-negative long | 60000
max.in.flight.requests.per.connection | positive integer | 5
max.request.size | positive integer | 1048576
metadata.fetch.timeout.ms | positive long | 60000
metadata.max.age.ms | positive long | 300000
partitioner.class | fully-qualified class name | org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes | positive integer | 32768
reconnect.backoff.ms | non-negative long | 50
request.timeout.ms | positive integer | 30000
retries | non-negative integer | 0
sasl.jaas.config | string | null
sasl.kerberos.kinit.cmd | string | /usr/bin/kinit
sasl.kerberos.min.time.before.relogin | positive long | 60000
sasl.kerberos.service.name | string | null
sasl.mechanism | [GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512] | GSSAPI
security.protocol | [PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL] | PLAINTEXT
sender.flush.timeout.ms | non-negative long | 0
send.buffer.bytes | positive integer | 131072
value.serializer | fully-qualified class name | org.apache.kafka.common.serialization.StringSerializer

Consumer Properties

Supported consumer properties that can be added to the Additional Properties field:

Property | Available Options | Default
--- | --- | ---
auto.commit.interval.ms | positive integer | 5000
auto.offset.reset | [earliest, latest, none] | latest
bootstrap.servers | comma-separated host:port pairs | localhost:9092
check.crcs | [true, false] | true
client.id | string | ""
connections.max.idle.ms | positive long | 540000
enable.auto.commit | [true, false] | true
exclude.internal.topics | [true, false] | true
fetch.max.bytes | positive long | 52428800
fetch.max.wait.ms | non-negative integer | 500
fetch.min.bytes | non-negative integer | 1
group.id | string | ""
heartbeat.interval.ms | positive integer | 3000
interceptor.classes | fully-qualified class names | []
isolation.level | [read_uncommitted, read_committed] | read_uncommitted
key.deserializer | fully-qualified class name | org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes | positive integer | 1048576
max.poll.interval.ms | positive long | 300000
max.poll.records | positive integer | 500
metadata.max.age.ms | positive long | 300000
metadata.fetch.timeout.ms | positive long | 60000
receive.buffer.bytes | positive integer | 32768
reconnect.backoff.ms | non-negative long | 50
request.timeout.ms | positive integer | 30000
retry.backoff.ms | non-negative long | 100
sasl.jaas.config | string | null
sasl.kerberos.kinit.cmd | string | /usr/bin/kinit
sasl.kerberos.min.time.before.relogin | positive long | 60000
sasl.kerberos.service.name | string | null
sasl.mechanism | [GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512] | GSSAPI
security.protocol | [PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL] | PLAINTEXT
send.buffer.bytes | positive integer | 131072
session.timeout.ms | positive integer | 10000
value.deserializer | fully-qualified class name | org.apache.kafka.common.serialization.StringDeserializer
