Hadoop 2.x Source Compilation
I. Basic Environment Setup
1. Prerequisites
hadoop-2.5.0-src.tar.gz
apache-maven-3.0.5-bin.tar.gz
jdk-7u67-linux-x64.tar.gz
protobuf-2.5.0.tar.gz
Access to an external network is required.
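The tarballs above imply a handful of command-line tools that must be on PATH before the build starts. Below is a minimal pre-flight sketch; `check_tool` is a hypothetical helper (not part of any Hadoop tooling), and the tool list is an assumption drawn from the steps that follow.

```shell
# Sketch: report whether each build prerequisite is on PATH.
# check_tool is a hypothetical helper, not part of any Hadoop tooling.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "ok: $1"
    else
        echo "missing: $1"
    fi
}

# java/mvn come from step 2, cmake/gcc/g++ from step 3, protoc from step 4.
for t in java mvn protoc cmake gcc g++; do
    check_tool "$t"
done
```

Running this before `mvn package` saves discovering a missing compiler two hours into the build.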
2. Install jdk-7u67-linux-x64.tar.gz and apache-maven-3.0.5-bin.tar.gz
[liuwl@centos66-bigdata-hadoop ~]$ vi /etc/profile
#JAVA_HOME
export JAVA_HOME=/opt/modules/jdk1.7.0_67
export PATH=$PATH:$JAVA_HOME/bin
#MAVEN_HOME
export MAVEN_HOME=/opt/modules/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin
[liuwl@centos66-bigdata-hadoop ~]$ source /etc/profile
[liuwl@centos66-bigdata-hadoop ~]$ java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
[liuwl@centos66-bigdata-hadoop ~]$ echo $MAVEN_HOME
/opt/modules/apache-maven-3.0.5
[root@centos66-bigdata-hadoop ~]# mvn -v
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 05:51:28-0800)
Maven home: /opt/modules/apache-maven-3.0.5
Java version: 1.7.0_67, vendor: Oracle Corporation
Java home: /opt/modules/jdk1.7.0_67/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-504.el6.x86_64", arch: "amd64", family: "unix"
Note: it is best to include a pre-populated Maven local repository among the prepared files; otherwise the dependency downloads will take a very long time.
[root@centos66-bigdata-hadoop hadoop-2.5.0-src]# ls /root/.m2/repository/
ant biz commons-chain commons-el commons-validator junit sslext antlr bouncycastle commons-cli commons-httpclient dom4j log4j tomcat aopalliance bsh commons-codec commons-io doxia logkit xerces asm cglib commons-collections commons-lang io net xml-apis avalon-framework classworlds commons-configuration commons-logging javax org xmlenc backport-util-concurrent com commons-daemon commons-net jdiff oro xpp3 bcel commons-beanutils commons-digester commons-pool jline regexp
3. Install cmake, zlib-devel, openssl-devel, gcc, gcc-c++, and ncurses-devel via yum
[root@centos66-bigdata-hadoop ~]# yum -y install cmake
[root@centos66-bigdata-hadoop ~]# yum -y install zlib-devel
[root@centos66-bigdata-hadoop ~]# yum -y install openssl-devel
[root@centos66-bigdata-hadoop ~]# yum -y install gcc gcc-c++
[root@centos66-bigdata-hadoop hadoop-2.5.0-src]# yum -y install ncurses-devel
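The five yum invocations above can be collapsed into a single transaction. This sketch only echoes the command so it can be inspected first; run the echoed line as root on CentOS 6.

```shell
# Sketch: the packages from the five yum calls above, as one transaction.
# The command is echoed rather than executed here; run it as root.
pkgs="cmake zlib-devel openssl-devel gcc gcc-c++ ncurses-devel"
echo "yum -y install $pkgs"
```

A single transaction lets yum resolve the shared dependencies (e.g. between gcc and gcc-c++) once.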
4. Install protobuf-2.5.0.tar.gz (extract it, then change into the protobuf top-level directory)
[root@centos66-bigdata-hadoop protobuf-2.5.0]# mkdir -p /opt/modules/protobuf
[root@centos66-bigdata-hadoop protobuf-2.5.0]# ./configure --prefix=/opt/modules/protobuf
...
[root@centos66-bigdata-hadoop protobuf-2.5.0]# make
...
[root@centos66-bigdata-hadoop protobuf-2.5.0]# make install
...
[root@centos66-bigdata-hadoop protobuf-2.5.0]# vi /etc/profile
...
#PROTOBUF_HOME
export PROTOBUF_HOME=/opt/modules/protobuf
export PATH=$PATH:$PROTOBUF_HOME/bin
[root@centos66-bigdata-hadoop protobuf-2.5.0]# source /etc/profile
[root@centos66-bigdata-hadoop protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
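The configure/make/make install sequence above can be wrapped in one guarded function so a failed step stops the chain. This is a sketch only: `build_protobuf` is a hypothetical wrapper, it is defined but not executed here, and it assumes you are inside the extracted protobuf-2.5.0 directory.

```shell
# Sketch: the protobuf install steps above as one guarded function.
# PREFIX matches the --prefix used above; build_protobuf is hypothetical
# and is intentionally not called here.
PREFIX=/opt/modules/protobuf
build_protobuf() {
    ./configure --prefix="$PREFIX" \
        && make \
        && make install \
        && "$PREFIX/bin/protoc" --version
}
echo "protobuf prefix: $PREFIX"
```

Chaining with `&&` ensures `make install` never runs against a half-configured tree.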
5. Extract the Hadoop source tarball, change into its top-level directory, and build
[root@centos66-bigdata-hadoop protobuf-2.5.0]# cd ../../files/
[root@centos66-bigdata-hadoop files]# tar -zxf hadoop-2.5.0-src.tar.gz -C ../src/
[root@centos66-bigdata-hadoop files]# cd ../src/hadoop-2.5.0-src/
[root@centos66-bigdata-hadoop hadoop-2.5.0-src]# mvn package -DskipTests -Pdist,native
...
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/liuwl/opt/src/hadoop-2.5.0-src/hadoop-dist/target/hadoop-dist-2.5.0-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [8:22.179s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5:14.366s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1:50.627s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.795s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1:11.384s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1:55.962s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [10:21.736s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [4:01.790s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [35.829s]
[INFO] Apache Hadoop Common .............................. SUCCESS [12:51.374s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [29.567s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.220s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:04:44.352s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:40.397s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1:24.100s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [12.020s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.239s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.298s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:07.150s]
[INFO] hadoop-yarn-common ................................ SUCCESS [3:13.690s]
[INFO] hadoop-yarn-server ................................ SUCCESS [1.009s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [54.750s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [2:53.418s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [23.570s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [16.137s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:17.456s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [9.170s]
[INFO] hadoop-yarn-client ................................ SUCCESS [17.790s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.132s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [6.689s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.015s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.102s]
[INFO] hadoop-yarn-project ............................... SUCCESS [13.562s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.526s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:27.794s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:32.320s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [19.368s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [26.041s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [31.938s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [38.261s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [5.923s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.856s]
[INFO] hadoop-mapreduce .................................. SUCCESS [15.510s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [20.631s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.096s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [13.185s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [22.877s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [25.861s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.764s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [7.152s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [23.914s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [21.289s]
[INFO] Apache Hadoop Client .............................. SUCCESS [18.486s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.966s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [37.039s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [9.809s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.192s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [34.114s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:21:11.103s
[INFO] Finished at: Wed Sep 14 11:49:38 PDT 2016
[INFO] Final Memory: 86M/239M
[INFO] ------------------------------------------------------------------------
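After BUILD SUCCESS, the compiled distribution lands under hadoop-dist/target (the javadoc path in the log above shows the same root). The sketch below only prints the expected artifact locations; the directory layout is inferred from the log, and the `checknative` step assumes the build actually completed on your machine.

```shell
# Sketch: where the compiled distribution ends up after BUILD SUCCESS.
# Paths follow from the hadoop-dist lines in the build log above.
SRC_ROOT=hadoop-2.5.0-src
DIST="$SRC_ROOT/hadoop-dist/target/hadoop-2.5.0"
echo "distribution: $DIST"
echo "native libs:  $DIST/lib/native"
# To confirm the native libraries compiled in, run inside the distribution:
#   bin/hadoop checknative -a
```

`hadoop checknative -a` reports whether zlib, snappy, and openssl support made it into the native build.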