Introduction

MapR Ecosystem Pack (MEP) 2.0 ships with some new features related to MapR Streams:

- Kafka REST Proxy for MapR Streams provides a RESTful interface to MapR Streams and Kafka clusters, making it easy to consume and produce messages and to perform administrative operations.
- Kafka Connect for MapR Streams is a utility for streaming data between MapR Streams or Apache Kafka and other storage systems.

MapR Ecosystem Packs (MEPs) are a way to deliver ecosystem upgrades decoupled from core upgrades, allowing you to upgrade your tooling independently of your Converged Data Platform. You can learn more about MEP 2.0 in this article.

In this blog we describe how to use the REST Proxy to publish and consume messages to/from MapR Streams. The REST Proxy is a great addition to the MapR Converged Data Platform, allowing applications written in any programming language to use MapR Streams.

The Kafka REST Proxy provided with the MapR Streams tools can be used with MapR Streams (the default), but also in a hybrid mode with Apache Kafka. In this article we will focus on MapR Streams.

Prerequisites

- MapR Converged Data Platform 5.2 with MEP 2.0 (MapR Streams Tools)
- curl, wget, or any HTTP/REST client tool

Create the MapR Stream and Topics

A stream is a collection of topics that you can manage as a group by:

- Setting security policies that apply to all topics in that stream
- Setting a default number of partitions for each new topic that is created in the stream
- Setting a time-to-live for messages in every topic in the stream
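As a sketch of how these stream-level settings are exposed on the command line, the maprcli stream create/edit commands accept parameters for them; the flag names below (-ttl, -defaultpartitions) are taken from the MapR documentation, but verify them against your release:

$ maprcli stream edit -path /apps/iot-stream -ttl 604800           # messages expire after 7 days (ttl is in seconds)
$ maprcli stream edit -path /apps/iot-stream -defaultpartitions 3  # new topics get 3 partitions by default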

You can find more information about MapR Streams concepts in the documentation.

On your MapR cluster or sandbox, run the following commands:

$ maprcli stream create -path /apps/iot-stream -produceperm p -consumeperm p -topicperm p
$ maprcli stream topic create -path /apps/iot-stream -topic sensor-json -partitions 3
$ maprcli stream topic create -path /apps/iot-stream -topic sensor-binary -partitions 3
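To check that the stream and topics were created as expected, you can list the topics of the stream (assuming the standard maprcli stream topic list command available in MapR 5.2):

$ maprcli stream topic list -path /apps/iot-stream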

Start Kafka Console Consumers

Open two terminal windows and run the Kafka console consumer utility using the following commands:

Topic sensor-json

$ /opt/mapr/kafka/kafka-0.9.0/bin/kafka-console-consumer.sh --new-consumer --bootstrap-server this.will.be.ignored:9092 --topic /apps/iot-stream:sensor-json

Topic sensor-binary

$ /opt/mapr/kafka/kafka-0.9.0/bin/kafka-console-consumer.sh --new-consumer --bootstrap-server this.will.be.ignored:9092 --topic /apps/iot-stream:sensor-binary

These two terminal windows will allow you to see the messages posted on the different topics.

Using the Kafka REST Proxy

Inspect Topic Metadata

The endpoint /topics/[topic_name] allows you to get information about a topic. In MapR Streams, topics are part of a stream identified by a path; to reference a topic through the REST API you have to use the full path and encode it in the URL. For example, /apps/iot-stream:sensor-json is encoded as %2Fapps%2Fiot-stream%3Asensor-json.
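If you do not want to encode the path by hand, you can let a tool do the percent-encoding for you; for example, with the Python 2 interpreter present on most cluster nodes (a convenience sketch, not part of the REST Proxy itself; on Python 3 the equivalent is urllib.parse.quote):

$ python -c "import urllib; print urllib.quote('/apps/iot-stream:sensor-json', safe='')"
%2Fapps%2Fiot-stream%3Asensor-json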

Run the following command to get information about the sensor-json topic:

$ curl -X GET http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json

Note: For simplicity, I am running the command from the node where the Kafka REST Proxy is running, so it is possible to use localhost.

You can pretty-print the JSON output by piping it into a Python command such as:

$ curl -X GET http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json | python -m json.tool

Default Stream

As mentioned above, the stream path is part of the topic name you have to use in the command; however, it is possible to configure the Kafka REST Proxy for MapR Streams to use a default stream. To do this, add the following property to the /opt/mapr/kafka-rest/kafka-rest-2.0.1/config/kafka-rest.properties file:

streams.default.stream=/apps/iot-stream

When you change the Kafka REST proxy configuration, you must restart the service using maprcli or MCS.
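For example, from the command line the restart could look like the following (a sketch assuming the service is registered under the name kafka-rest, which you can confirm with maprcli service list):

$ maprcli node services -name kafka-rest -action restart -nodes $(hostname)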

The main reason to use the streams.default.stream property is to simplify the URLs used by the application. For example:

- with streams.default.stream set, you can use curl -X GET http://localhost:8082/topics/
- without this configuration, or if you want to use a specific stream, you must specify it in the URL: http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json

In this article, all the URLs contain the encoded stream name, so that you can start using the Kafka REST Proxy without changing the configuration, and also use it with different streams.

Publishing Messages

The Kafka REST Proxy for MapR Streams allows applications to publish messages to MapR Streams. Messages can be sent as JSON or binary content (base64 encoded).

To send a JSON message:

- the request should be an HTTP POST
- the Content-Type should be: application/vnd.kafka.json.v1+json
- the body:

{ "records": [ { "value": { "temp": 10, "speed": 40, "direction": "NW" } } ] }

The complete request is:

curl -X POST -H "Content-Type: application/vnd.kafka.json.v1+json" \
  --data '{"records":[{"value": {"temp" : 10 , "speed" : 40 , "direction" : "NW"} }]}' \
  http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-json

You should see the message printed in the terminal window where the /apps/iot-stream:sensor-json consumer is running.

To send a binary message:

- the request should be an HTTP POST
- the Content-Type should be: application/vnd.kafka.binary.v1+json
- the body:

{ "records": [ { "value": "SGVsbG8gV29ybGQ=" } ] }
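Following the same pattern as the JSON example, the complete request would be (a sketch assuming the sensor-binary topic created earlier; SGVsbG8gV29ybGQ= is the base64 encoding of "Hello World"):

curl -X POST -H "Content-Type: application/vnd.kafka.binary.v1+json" \
  --data '{"records":[{"value":"SGVsbG8gV29ybGQ="}]}' \
  http://localhost:8082/topics/%2Fapps%2Fiot-stream%3Asensor-binary

You should then see the decoded message printed in the terminal window where the /apps/iot-stream:sensor-binary consumer is running.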
