confluent kafka connect remote debugging
1. Deep inside the kafka-connect startup
To begin with, let's take a look at how Kafka Connect starts.
1.1 start command
# background running mode
cd /home/lenmom/workspace/software/confluent-community-5.1.-2.11/ && ./bin/connect-distributed -daemon ./etc/schema-registry/connect-avro-distributed.properties

# or console running mode
cd /home/lenmom/workspace/software/confluent-community-5.1.-2.11/ && ./bin/connect-distributed ./etc/schema-registry/connect-avro-distributed.properties
We can see that the start command is connect-distributed, so let's take a look at the content of this script:
#!/bin/sh
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

if [ $# -lt 1 ];
then
  echo "USAGE: $0 [-daemon] connect-distributed.properties"
  exit 1
fi

base_dir=$(dirname $0)

###
### Classpath additions for Confluent Platform releases (LSB-style layout)
###
#cd -P deals with symlink from /bin to /usr/bin
java_base_dir=$( cd -P "$base_dir/../share/java" && pwd )

# confluent-common: required by kafka-serde-tools
# kafka-serde-tools (e.g. Avro serializer): bundled with confluent-schema-registry package
for library in "kafka" "confluent-common" "kafka-serde-tools" "monitoring-interceptors"; do
dir="$java_base_dir/$library"
if [ -d "$dir" ]; then
classpath_prefix="$CLASSPATH:"
if [ "x$CLASSPATH" = "x" ]; then
classpath_prefix=""
fi
CLASSPATH="$classpath_prefix$dir/*"
fi
done

if [ "x$KAFKA_LOG4J_OPTS" = "x" ]; then
LOG4J_CONFIG_NORMAL_INSTALL="/etc/kafka/connect-log4j.properties"
LOG4J_CONFIG_ZIP_INSTALL="$base_dir/../etc/kafka/connect-log4j.properties"
if [ -e "$LOG4J_CONFIG_NORMAL_INSTALL" ]; then # Normal install layout
KAFKA_LOG4J_OPTS="-Dlog4j.configuration=file:${LOG4J_CONFIG_NORMAL_INSTALL}"
elif [ -e "${LOG4J_CONFIG_ZIP_INSTALL}" ]; then # Simple zip file layout
KAFKA_LOG4J_OPTS="-Dlog4j.configuration=file:${LOG4J_CONFIG_ZIP_INSTALL}"
else # Fallback to normal default
KAFKA_LOG4J_OPTS="-Dlog4j.configuration=file:$base_dir/../config/connect-log4j.properties"
fi
fi
export KAFKA_LOG4J_OPTS

if [ "x$KAFKA_HEAP_OPTS" = "x" ]; then
export KAFKA_HEAP_OPTS="-Xms256M -Xmx2G"
fi

EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed'}

COMMAND=$1
case $COMMAND in
-daemon)
EXTRA_ARGS="-daemon "$EXTRA_ARGS
shift
;;
*)
;;
esac

export CLASSPATH
exec $(dirname $0)/kafka-run-class $EXTRA_ARGS org.apache.kafka.connect.cli.ConnectDistributed "$@"
We can see that to start the Kafka Connect process, this script hands off to another script, kafka-run-class, so let's go to kafka-run-class next.
1.2 kafka-run-class
# ... (the earlier part of the script is omitted) ...
# Launch mode
if [ "x$DAEMON_MODE" = "xtrue" ]; then
nohup $JAVA $KAFKA_HEAP_OPTS $KAFKA_JVM_PERFORMANCE_OPTS $KAFKA_GC_LOG_OPTS $KAFKA_JMX_OPTS $KAFKA_LOG4J_OPTS -cp $CLASSPATH $KAFKA_OPTS "$@" > "$CONSOLE_OUTPUT_FILE" 2>&1 < /dev/null &
else
exec $JAVA $KAFKA_HEAP_OPTS $KAFKA_JVM_PERFORMANCE_OPTS $KAFKA_GC_LOG_OPTS $KAFKA_JMX_OPTS $KAFKA_LOG4J_OPTS -cp $CLASSPATH $KAFKA_OPTS "$@"
fi
At the end of this script, the Connect process is launched by invoking the java command, and this is exactly where we can add the options needed for remote debugging.
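To see the full java command that eventually gets executed (and, later on, to confirm that the debug agent really was added), we can inspect the running worker process. A quick check, assuming the worker runs on the local machine:
# print the full command line of the Connect worker JVM
ps -ef | grep ConnectDistributed | grep -v grep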
2. copy kafka-run-class and rename the copy to kafka-connect-debugging
cp bin/kafka-run-class bin/kafka-connect-debugging
Modify the launch command in kafka-connect-debugging to add Java remote debugging support.
vim bin/kafka-connect-debugging
The modified launch command looks as follows:
# ... (the earlier part of the script is unchanged) ...
export JPDA_OPTS="-agentlib:jdwp=transport=dt_socket,address=8888,server=y,suspend=y"
#export JPDA_OPTS=""

# Launch mode
if [ "x$DAEMON_MODE" = "xtrue" ]; then
nohup $JAVA $JPDA_OPTS $KAFKA_HEAP_OPTS $KAFKA_JVM_PERFORMANCE_OPTS $KAFKA_GC_LOG_OPTS $KAFKA_JMX_OPTS $KAFKA_LOG4J_OPTS -cp $CLASSPATH $KAFKA_OPTS "$@" > "$CONSOLE_OUTPUT_FILE" 2>&1 < /dev/null &
else
exec $JAVA $JPDA_OPTS $KAFKA_HEAP_OPTS $KAFKA_JVM_PERFORMANCE_OPTS $KAFKA_GC_LOG_OPTS $KAFKA_JMX_OPTS $KAFKA_LOG4J_OPTS -cp $CLASSPATH $KAFKA_OPTS "$@"
fi
The added JPDA_OPTS line starts Kafka Connect as a debug server (server=y) listening on port 8888 and pauses the JVM (suspend=y) until a debugging client connects.
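For reference, here is the same option string with each jdwp sub-option commented (the values are the ones used above; note that on Java 9+ the agent binds to localhost only, so use address=*:8888 if the debugger runs on another machine):
# -agentlib:jdwp=...      load the JDWP debug agent into the JVM
#   transport=dt_socket   talk to the debugger over a TCP socket
#   address=8888          listen on port 8888
#   server=y              act as the debug server and wait for a client to attach
#   suspend=y             block the JVM before main() until a debugger attaches (suspend=n starts immediately)
export JPDA_OPTS="-agentlib:jdwp=transport=dt_socket,address=8888,server=y,suspend=y"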
If we don't want to run in debug mode, just uncomment the line
#export JPDA_OPTS=""
i.e. remove the # symbol from that line, so that JPDA_OPTS ends up empty (see the sketch just below).
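For clarity, this is what the two lines look like with debugging switched off; the later assignment overrides the earlier one, so the java command is launched without the debug agent:
export JPDA_OPTS="-agentlib:jdwp=transport=dt_socket,address=8888,server=y,suspend=y"
export JPDA_OPTS=""   # this second assignment wins, so JPDA_OPTS expands to nothing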
3. edit connect-distributed file
cd /home/lenmom/workspace/software/confluent-community-5.1.-2.11/
vim ./bin/connect-distributed
Replace the last line from
exec $(dirname $0)/kafka-run-class $EXTRA_ARGS org.apache.kafka.connect.cli.ConnectDistributed "$@"
to
exec $(dirname $0)/kafka-connect-debugging $EXTRA_ARGS org.apache.kafka.connect.cli.ConnectDistributed "$@"
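As a side note: since kafka-run-class already appends $KAFKA_OPTS to the java command (see the launch lines in section 1.2), a sketch of an alternative that avoids editing any script is to pass the debug agent through KAFKA_OPTS when starting the worker:
export KAFKA_OPTS="-agentlib:jdwp=transport=dt_socket,address=8888,server=y,suspend=y"
./bin/connect-distributed ./etc/schema-registry/connect-avro-distributed.properties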
4. debugging
4.1 start kafka-connect
lenmom@M1701:~/workspace/software/confluent-community-5.1.-2.11$ bin/connect-distributed ./etc/schema-registry/connect-avro-distributed.properties
Listening for transport dt_socket at address: 8888
We can see that the process is paused and listening on port 8888 until a debugging client attaches.
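Before attaching from the IDE, we can verify that the debug port is reachable from the client machine with jdb, which ships with the JDK (replace <connect-host> with the host running the worker):
# attach a command-line debugger just to confirm port 8888 is reachable, then quit
jdb -attach <connect-host>:8888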
4.2 attach to kafka-connect using IntelliJ IDEA
After setting up the debug configuration in IDEA (pointing at the worker host and port 8888), just start the debug session and the client attaches successfully. Below is a screenshot of my scenario.
(screenshot of the IDEA remote-debugging session)
Have fun!