Getting Started with Apache Flink and MapR Streams
Introduction
MapR Ecosystem Pack 2.0 (MEP) comes with some new features related to MapR Streams:

- Kafka REST Proxy for MapR Streams provides a RESTful interface to MapR Streams and Kafka clusters, making it possible to consume and produce messages and to perform administrative operations.
- Kafka Connect for MapR Streams is a utility for streaming data between MapR Streams or Apache Kafka and other storage systems.
MapR Ecosystem Packs (MEPs) are a way to deliver ecosystem upgrades decoupled from core upgrades, allowing you to upgrade your tooling independently of your Converged Data Platform. You can learn more about MEP 2.0 in this article.
In this blog post we describe how to use the REST Proxy to publish and consume messages to/from MapR Streams. The REST Proxy is a great addition to the MapR Converged Data Platform, allowing applications written in any programming language to use MapR Streams.
The Kafka REST Proxy provided with the MapR Streams tools can be used with MapR Streams (the default), but also in a hybrid mode with Apache Kafka. In this article we will focus on MapR Streams.
Prerequisites

- MapR Converged Data Platform 5.2 with MEP 2.0 (MapR Streams Tools)
- curl, wget, or any other HTTP/REST client tool

Create the MapR Streams and Topics
A stream is a collection of topics that you can manage as a group by:
- Setting security policies that apply to all topics in that stream
- Setting a default number of partitions for each new topic that is created in the stream
- Setting a time-to-live for messages in every topic in the stream
You can find more information about MapR Streams concepts in the documentation.

On your MapR cluster or Sandbox, run the following commands:
$ maprcli stream create -path /apps/iot-stream -produceperm p -consumeperm p -topicperm p
$ maprcli stream topic create -path /apps/iot-stream -topic sensor-json -partitions 3
$ maprcli stream topic create -path /apps/iot-stream -topic sensor-binary -partitions 3
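To verify that the stream and its topics were created, you can list the topics; a minimal check, assuming maprcli is available on the node:

$ maprcli stream topic list -path /apps/iot-stream

The output should list the sensor-json and sensor-binary topics.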
Start Kafka Console Producers and Consumers
Open two terminal windows and run the Kafka console consumer utility using the following commands:

Consumer for topic sensor-json
$ /opt/mapr/kafka/kafka-0.9.0/bin/kafka-console-consumer.sh --new-consumer --bootstrap-server this.will.be.ignored:9092 --topic /apps/iot-stream:sensor-json
Consumer for topic sensor-binary
$ /opt/mapr/kafka/kafka-0.9.0/bin/kafka-console-consumer.sh --new-consumer --bootstrap-server this.will.be.ignored:9092 --topic /apps/iot-stream:sensor-binary
These two terminal windows will allow you to see the messages posted to the different topics.
Using Kafka REST Proxy

Inspect Topic Metadata

The endpoint /topics/[topic_name] allows you to get some information about the topic. In MapR Streams, topics are part of a stream identified by a path; to address a topic through the REST API you have to use its full path and encode it in the URL. For example, /apps/iot-stream:sensor-json is encoded as %2Fapps%2Fiot-stream%3Asensor-json.
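If you do not want to build the encoded path by hand, you can generate it on the command line; a minimal sketch, assuming python3 is available on the node:

$ TOPIC=$(python3 -c 'import urllib.parse; print(urllib.parse.quote("/apps/iot-stream:sensor-json", safe=""))')
$ echo $TOPIC
%2Fapps%2Fiot-stream%3Asensor-json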
Run the following command to get information about the sensor-json topic:
$ curl -X GET http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json
Note: For simplicity, I am running the command from the node where the Kafka REST Proxy is running, so it is possible to use localhost.
You can pretty-print the JSON response by piping the output to a Python command such as:
$ curl -X GET http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json | python -m json.tool
Default Stream
As mentioned above, the stream path is part of the topic name you have to use in the command; however, it is possible to configure the MapR Kafka REST Proxy to use a default stream. For this, add the following property to the /opt/mapr/kafka-rest/kafka-rest-2.0.1/config/kafka-rest.properties file:
streams.default.stream=/apps/iot-stream
When you change the Kafka REST proxy configuration, you must restart the service using maprcli or MCS.
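For example, the restart can be done from the command line; a hedged sketch, assuming the service is registered as kafka-rest and that node1 is a placeholder for the node running the proxy:

$ maprcli node services -name kafka-rest -action restart -nodes node1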
The main reason to use the streams.default.stream property is to simplify the URLs used by the application. For example:

- with streams.default.stream you can use curl -X GET http://localhost:8082/topics/
- without this configuration, or if you want to use a specific stream, you must specify it in the URL: http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json
In this article, all the URLs contain the encoded stream name, so you can start using the Kafka REST Proxy without changing the configuration and also use it with different streams.
Publishing Messages
The Kafka REST Proxy for MapR Streams allows applications to publish messages to MapR Streams. Messages can be sent as JSON or binary content (base64 encoded).
To send a JSON message:

- the request should be an HTTP POST
- the Content-Type should be: application/vnd.kafka.json.v1+json
- the body:

{ "records": [ { "value": { "temp": 10, "speed": 40, "direction": "NW" } } ] }
The complete request is:
curl -X POST -H "Content-Type: application/vnd.kafka.json.v1+json" \
  --data '{"records":[{"value": {"temp" : 10 , "speed" : 40 , "direction" : "NW"} }]}' \
  http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json
You should see the message printed in the terminal window where the /apps/iot-stream:sensor-json consumer is running.
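The records array can also carry several messages in a single request; a hedged sketch, assuming the multi-record envelope behaves as in the standard Kafka REST Proxy JSON format:

curl -X POST -H "Content-Type: application/vnd.kafka.json.v1+json" \
  --data '{"records":[{"value": {"temp" : 12, "speed" : 42, "direction" : "N"}}, {"value": {"temp" : 8, "speed" : 38, "direction" : "NE"}}]}' \
  http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json

Both messages should appear in the sensor-json consumer window.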
To send a binary message:

- the request should be an HTTP POST
- the Content-Type should be: application/vnd.kafka.binary.v1+json
- the body:

{ "records": [ { "value": "SGVsbG8gV29ybGQ=" } ] }
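Putting this together, a complete binary request looks like the following; a minimal sketch, assuming the default REST Proxy port (8082) and the sensor-binary topic created earlier (SGVsbG8gV29ybGQ= is simply "Hello World" base64 encoded):

# base64-encode the payload and wrap it in the JSON envelope expected by the proxy
PAYLOAD=$(printf 'Hello World' | base64)
curl -X POST -H "Content-Type: application/vnd.kafka.binary.v1+json" \
  --data "{\"records\":[{\"value\":\"$PAYLOAD\"}]}" \
  http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-binary

The decoded payload should then be printed in the terminal window where the /apps/iot-stream:sensor-binary consumer is running.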