Compiling Flink from Source (Windows Environment)
Preface
I recently started tinkering with Flink; before diving into the source code, building it is the first step.
Build Environment
Windows 7, Java, Maven

Build Steps
Follow the official documentation at https://ci.apache.org/projects/flink/flink-docs-release-1.6/start/building.html; the relevant parts are reproduced below:
Building Flink from Source
This page covers how to build Flink 1.6.1 from sources.
Build Flink
In order to build Flink you need the source code. Either download the source of a release or clone the git repository.
In addition you need Maven 3 and a JDK (Java Development Kit). Flink requires at least Java 8 to build.
NOTE: Maven 3.3.x can build Flink, but will not properly shade away certain dependencies. Maven 3.2.5 creates the libraries properly. To build unit tests use Java 8u51 or above to prevent failures in unit tests that use the PowerMock runner.
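Before starting the build it is worth confirming which Java and Maven versions are actually on the PATH; the following are standard commands and serve only as a quick sanity check:
java -version
mvn -version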
To clone from git, enter:
git clone https://github.com/apache/flink
The simplest way of building Flink is by running:
mvn clean install -DskipTests
This instructs Maven (mvn) to first remove all existing builds (clean) and then create a new Flink binary (install).
To speed up the build you can skip tests, QA plugins, and JavaDocs:
mvn clean install -DskipTests -Dfast
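On a Windows machine the build can occasionally fail with out-of-memory errors. A common workaround (a general Maven setting, not something from the Flink docs) is to raise Maven's heap via the MAVEN_OPTS environment variable before running the build, for example:
set MAVEN_OPTS=-Xmx2g
mvn clean install -DskipTests -Dfast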
The default build adds a Flink-specific JAR for Hadoop 2, to allow using Flink with HDFS and YARN.
Dependency Shading
Flink shades away some of the libraries it uses, in order to avoid version clashes with user programs that use different versions of these libraries. Among the shaded libraries are Google Guava, Asm, Apache Curator, Apache HTTP Components, Netty, and others.
The dependency shading mechanism was recently changed in Maven and requires users to build Flink slightly differently, depending on their Maven version:
Maven 3.0.x, 3.1.x, and 3.2.x: It is sufficient to call mvn clean install -DskipTests in the root directory of the Flink code base.
Maven 3.3.x: The build has to be done in two steps: first in the base directory, then in the distribution project:
mvn clean install -DskipTests
cd flink-dist
mvn clean install
Note: To check your Maven version, run mvn --version.
Hadoop Versions
Info: Most users do not need to do this manually. The download page contains binary packages for common Hadoop versions.
Flink has dependencies on HDFS and YARN, both of which come from Apache Hadoop. There exist many different versions of Hadoop (from both the upstream project and the different Hadoop distributions). If you are using the wrong combination of versions, exceptions can occur.
Hadoop is only supported from version 2.4.0 upwards. You can also specify a specific Hadoop version to build against:
mvn clean install -DskipTests -Dhadoop.version=2.6.1
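To see which Hadoop version the build uses by default, a generic Maven trick is to evaluate the hadoop.version property defined in the root pom (help:evaluate is a standard Maven plugin goal; the value printed depends on the Flink version being built):
mvn help:evaluate -Dexpression=hadoop.version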
Vendor-specific Versions (specifying a Hadoop distribution vendor)
To build Flink against a vendor specific Hadoop version, issue the following command:
mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.6.1-cdh5.0.0
The -Pvendor-repos activates a Maven build profile that includes the repositories of popular Hadoop vendors such as Cloudera, Hortonworks, or MapR.
The official docs use a CDH vendor version as the example; here is one for an HDP vendor version:
mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.7.3.2.6.1.114-2
The detailed version strings can be looked up at http://repo.hortonworks.com/content/repositories/releases/org/apache/hadoop/hadoop-common/
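After a vendor-specific build finishes, a rough sanity check is to list the lib directory of the build output (the path comes from the build-result section below) and look for the shaded Hadoop jar, which in the 1.6 line is typically named something like flink-shaded-hadoop2-uber-*.jar; exact names may vary:
dir flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0\lib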
Scala Versions
Info: Users that purely use the Java APIs and libraries can ignore this section.
Flink has APIs, libraries, and runtime modules written in Scala. Users of the Scala API and libraries may have to match the Scala version of Flink with the Scala version of their projects (because Scala is not strictly backwards compatible).
Flink 1.6 currently builds only with Scala version 2.11.
We are working on supporting Scala 2.12, but certain breaking changes in Scala 2.12 make this a more involved effort. Please check out this JIRA issue for updates.
Encrypted File Systems
If your home directory is encrypted you might encounter a java.io.IOException: File name too long exception. Some encrypted file systems, like encfs used by Ubuntu, do not allow long filenames, which is the cause of this error.
The workaround is to add:
<args>
  <arg>-Xmax-classfile-name</arg>
  <arg>128</arg>
</args>
in the compiler configuration of the pom.xml file of the module causing the error. For example, if the error appears in the flink-yarn module, the above code should be added under the <configuration> tag of scala-maven-plugin. See this issue for more information.
Build Result


The build output is located under flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0
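As an optional smoke test of the build output, one can try the Windows batch scripts that ship with this Flink version (assuming the .bat launchers present in the 1.6 distribution) and then open the web UI at http://localhost:8081:
cd flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0
bin\start-cluster.bat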
