The Hadoop 2.x binaries published on the Apache website are 32-bit builds, so for a production environment you need to compile a 64-bit version yourself. The steps for compiling Hadoop 2.x are as follows:

Install the native libraries and tools the source build depends on
  yum install glibc-headers
  yum install gcc
  yum install gcc-c++
  yum install make
  yum install cmake
  yum install openssl-devel
  yum install ncurses-devel
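Before starting the long build, it can save time to confirm the toolchain actually installed; a missing tool otherwise tends to surface as a cryptic failure partway through the Maven run. The helper below is a small sketch, not a required step:

```shell
# Report whether each required build tool is resolvable on PATH.
require() {
  command -v "$1" >/dev/null 2>&1 && echo "ok: $1" || echo "missing: $1"
}

for t in gcc g++ make cmake; do
  require "$t"
done
```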

Install protobuf-2.5.0 (Hadoop's inter-node RPC is implemented on top of Google's Protocol Buffers)
  tar -zxvf /home/tools/protobuf-2.5.0.tar.gz -C /home/tools/
  cd /home/tools/protobuf-2.5.0 && ./configure && make && make check && make install
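After `make install`, it is worth confirming that the `protoc` on PATH is exactly 2.5.0, since the Hadoop 2.x build requires that specific version and a stray newer `protoc` is a common cause of build failures. The helper below is a sketch; in practice you would run `ldconfig` first (so the new shared library is found) and then call it as `check_protoc_version "$(protoc --version)"`:

```shell
# Compare a `protoc --version` string against the version Hadoop 2.x expects.
expected="2.5.0"
check_protoc_version() {
  # $1 is the output of `protoc --version`, e.g. "libprotoc 2.5.0"
  [ "${1#libprotoc }" = "$expected" ] && echo "ok" || echo "version mismatch"
}

check_protoc_version "libprotoc 2.5.0"
```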

Install apache-maven-3.0.5 (JDK 1.7 is installed here; Maven is required to build the Hadoop source)
  tar -zxvf /home/tools/apache-maven-3.0.5-bin.tar.gz -C /home/tools/
  vi /etc/profile
  export JAVA_HOME=/usr/local/java
  export M2_HOME=/home/tools/apache-maven-3.0.5
  export PATH=.:$M2_HOME/bin:$JAVA_HOME/bin:$PATH
  source /etc/profile
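The build only works once both `mvn` and `java` resolve through PATH, so after sourcing /etc/profile it is worth checking that the new entries actually took effect. The helper below is hypothetical, not part of the original steps:

```shell
# Check whether a directory appears in PATH, e.g. after sourcing /etc/profile.
path_has() {
  case ":$PATH:" in
    *":$1:"*) echo "yes" ;;
    *) echo "no" ;;
  esac
}

path_has "$M2_HOME/bin"
```

If this prints "no", the profile edit was not sourced into the current shell.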

Extract the hadoop-2.4.1-src.tar.gz source package and build it
  tar -zxvf /home/tools/hadoop-2.4.1-src.tar.gz -C /home/tools/
  cd /home/tools/hadoop-2.4.1-src
  mvn package -DskipTests -Pdist,native
  Log output like the following indicates a successful build:
        main:
             [exec]
             [exec] Current directory /home/tools/hadoop-2.4.1-src/hadoop-dist/target
             [exec]
             [exec] $ rm -rf hadoop-2.4.1
             [exec] $ mkdir hadoop-2.4.1
             [exec] $ cd hadoop-2.4.1
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-nfs/target/hadoop-nfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/include /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/include /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/share .
             [exec]
             [exec] Hadoop dist layout available at: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
             [exec]
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-dist ---
        [WARNING] JAR will be empty - no content was marked for inclusion!
        [INFO] Building jar: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1.jar
        [INFO]
        [INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-dist ---
        [INFO] No sources in project. Archive not created.
        [INFO]
        [INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-dist ---
        [INFO] No sources in project. Archive not created.
        [INFO]
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hadoop-dist ---
        [INFO]
        [INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-dist ---
        [INFO] Executing tasks

main:
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
        [INFO] Building jar: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1-javadoc.jar
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO]
        [INFO] Apache Hadoop Main ................................ SUCCESS [1.176s]
        [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.937s]
        [INFO] Apache Hadoop Annotations ......................... SUCCESS [3.426s]
        [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.302s]
        [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.582s]
        [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.132s]
        [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [2.916s]
        [INFO] Apache Hadoop Auth ................................ SUCCESS [3.873s]
        [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.328s]
        [INFO] Apache Hadoop Common .............................. SUCCESS [1:36.564s]
        [INFO] Apache Hadoop NFS ................................. SUCCESS [5.527s]
        [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.038s]
        [INFO] Apache Hadoop HDFS ................................ SUCCESS [2:44.338s]
        [INFO] Apache Hadoop HttpFS .............................. SUCCESS [21.785s]
        [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [25.123s]
        [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [3.578s]
        [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.046s]
        [INFO] hadoop-yarn ....................................... SUCCESS [0.039s]
        [INFO] hadoop-yarn-api ................................... SUCCESS [1:19.929s]
        [INFO] hadoop-yarn-common ................................ SUCCESS [1:30.724s]
        [INFO] hadoop-yarn-server ................................ SUCCESS [0.032s]
        [INFO] hadoop-yarn-server-common ......................... SUCCESS [8.375s]
        [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [52.226s]
        [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.878s]
        [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [12.762s]
        [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [12.406s]
        [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.483s]
        [INFO] hadoop-yarn-client ................................ SUCCESS [5.208s]
        [INFO] hadoop-yarn-applications .......................... SUCCESS [0.029s]
        [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.614s]
        [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.137s]
        [INFO] hadoop-yarn-site .................................. SUCCESS [0.037s]
        [INFO] hadoop-yarn-project ............................... SUCCESS [3.164s]
        [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.059s]
        [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [18.276s]
        [INFO] hadoop-mapreduce-client-common .................... SUCCESS [18.034s]
        [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [2.728s]
        [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [8.973s]
        [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.420s]
        [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [12.076s]
        [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.988s]
        [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [5.648s]
        [INFO] hadoop-mapreduce .................................. SUCCESS [2.431s]
        [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [9.437s]
        [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [20.544s]
        [INFO] Apache Hadoop Archives ............................ SUCCESS [2.163s]
        [INFO] Apache Hadoop Rumen ............................... SUCCESS [5.710s]
        [INFO] Apache Hadoop Gridmix ............................. SUCCESS [4.467s]
        [INFO] Apache Hadoop Data Join ........................... SUCCESS [2.770s]
        [INFO] Apache Hadoop Extras .............................. SUCCESS [3.014s]
        [INFO] Apache Hadoop Pipes ............................... SUCCESS [10.174s]
        [INFO] Apache Hadoop OpenStack support ................... SUCCESS [4.523s]
        [INFO] Apache Hadoop Client .............................. SUCCESS [3.611s]
        [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.136s]
        [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [9.834s]
        [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.285s]
        [INFO] Apache Hadoop Tools ............................... SUCCESS [0.025s]
        [INFO] Apache Hadoop Distribution ........................ SUCCESS [12.173s]
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD SUCCESS
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 13:01.056s
        [INFO] Finished at: Tues Jul 02 10:28:07 CST 2014
        [INFO] Final Memory: 165M/512M
        [INFO] ------------------------------------------------------------------------

Verify the build output
  cd /home/tools/hadoop-2.4.1-src/hadoop-dist/target
  ls
  antrun  dist-layout-stitching.sh  hadoop-2.4.1  hadoop-dist-2.4.1.jar  hadoop-dist-2.4.1-javadoc.jar  javadoc-bundle-options  maven-archiver  test-dir
  The hadoop-2.4.1 directory is the 64-bit Hadoop binary distribution, which is exactly what we need.
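To confirm the build really produced 64-bit native code, you can inspect one of the shared objects under lib/native; a successful 64-bit build shows up as an "ELF 64-bit" binary in `file` output. A small sketch (the Hadoop path follows this guide's directory layout):

```shell
# Classify a binary by inspecting `file` output; a 64-bit native build
# reports "ELF 64-bit LSB shared object" for the libraries in lib/native.
is_64bit() {
  case "$(file -bL "$1" 2>/dev/null)" in
    *"ELF 64-bit"*) echo "64-bit" ;;
    *) echo "not 64-bit" ;;
  esac
}

is_64bit /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0
```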
