The prebuilt hadoop-2.3.0.tar.gz binary package offered on the Hadoop website was compiled on a 32-bit system. Running it on a 64-bit system produces errors such as:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
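The cause is easy to confirm: the native library bundled in the official binary tarball is a 32-bit ELF object, which a 64-bit JVM refuses to load. If you have the stock package unpacked somewhere, a one-line check (the path is illustrative) shows it:

file hadoop-2.3.0/lib/native/libhadoop.so.1.0.0   # the official package reports "ELF 32-bit LSB shared object"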

In that case you need to build Hadoop 2.3.0 from source yourself. The binary package I built has been verified to install correctly and to run the WordCount example. You can download my build here:

http://pan.baidu.com/s/1eQrgsWa

The build steps are as follows.

1. Update the package lists

apt-get update

2. Install the packages the build needs. (Why these particular dependencies? Honestly, I'm not sure either.)

apt-get install -y openjdk-7-jdk libprotobuf-dev protobuf-compiler maven cmake build-essential pkg-config libssl-dev zlib1g-dev llvm-gcc automake autoconf make
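Before kicking off the build it is worth confirming the toolchain, since Hadoop 2.3.0 specifically expects protoc 2.5.0 (which happens to be the version Ubuntu 14.04 ships):

java -version      # should report a 1.7.x JDK
mvn -version       # any recent Maven 3.x is fine
protoc --version   # must print "libprotoc 2.5.0", otherwise the build aborts
cmake --version    # required for the native (JNI) parts enabled by -Pnative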

3. Download the Hadoop 2.3.0 source tarball

wget http://archive.apache.org/dist/hadoop/core/hadoop-2.3.0/hadoop-2.3.0-src.tar.gz
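Optionally, verify the download before unpacking. Apache normally publishes checksum files next to its release artifacts, so assuming an .mds file exists at the same location the check looks like this:

wget http://archive.apache.org/dist/hadoop/core/hadoop-2.3.0/hadoop-2.3.0-src.tar.gz.mds
md5sum hadoop-2.3.0-src.tar.gz   # compare against the MD5 line in the downloaded .mds file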

4. Extract the Hadoop 2.3.0 source tarball

tar -xzvf hadoop-2.3.0-src.tar.gz

5. Enter the Hadoop 2.3.0 source directory

cd hadoop-2.3.0-src

6. Build the Hadoop 2.3.0 sources

mvn package -Pdist,native -DskipTests -Dtar
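The build downloads a large number of Maven artifacts and takes a while. If it fails with an OutOfMemoryError (common on small VMs), one option is to give Maven more heap and re-run; the values below are only a suggestion:

export MAVEN_OPTS="-Xms256m -Xmx1024m"   # illustrative values, adjust to your machine
mvn package -Pdist,native -DskipTests -Dtar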

A successful build ends with a reactor summary like the following:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1:11.968s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [30.393s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [18.398s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.246s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [20.372s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [23.721s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1:41.836s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [22.303s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [7.052s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:29.466s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [11.604s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.073s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:30.230s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [17.976s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [19.927s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [3.304s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.032s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.033s]
[INFO] hadoop-yarn-api ................................... SUCCESS [36.284s]
[INFO] hadoop-yarn-common ................................ SUCCESS [33.912s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.213s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [8.193s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [41.181s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.768s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [13.923s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.904s]
[INFO] hadoop-yarn-client ................................ SUCCESS [4.363s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.120s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.262s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [1.615s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.086s]
[INFO] hadoop-yarn-project ............................... SUCCESS [2.703s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.132s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [18.951s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [14.320s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.330s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [9.664s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.678s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [9.263s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.549s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [5.748s]
[INFO] hadoop-mapreduce .................................. SUCCESS [2.880s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.080s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [14.648s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.602s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [5.706s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [3.649s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [2.483s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [2.678s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [6.359s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [5.088s]
[INFO] Apache Hadoop Client .............................. SUCCESS [4.534s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.433s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [7.757s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [4.099s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.428s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [18.045s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:59.240s
[INFO] Finished at: Thu Jan 15 18:51:59 JST 2015
[INFO] Final Memory: 168M/435M
[INFO] ------------------------------------------------------------------------

The freshly built binary package is located at:

hadoop-2.3.0-src/hadoop-dist/target/hadoop-2.3.0.tar.gz
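A quick way to check that the build really produced 64-bit native libraries is to unpack the new tarball somewhere and inspect the ELF class (run from inside hadoop-2.3.0-src; the /tmp destination is just an example):

tar -xzvf hadoop-dist/target/hadoop-2.3.0.tar.gz -C /tmp
file /tmp/hadoop-2.3.0/lib/native/libhadoop.so.1.0.0   # should report "ELF 64-bit LSB shared object, x86-64"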

PS: When installing Hadoop 2.3.0 from the self-built binary package, make sure the two lines below are removed from .bashrc and hadoop-env.sh (they are not present by default, but you may have added them while trying to work around the warning):

export HADOOP_COMMON_LIB_NATIVE_DIR="~/hadoop/lib/"

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=~/hadoop/lib/"
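After installing the self-built package, any simple Hadoop command is enough to confirm that the native code loader is now happy; with the default configuration the command below just lists the local filesystem (the ~/hadoop install path is the hypothetical one used in the lines above):

~/hadoop/bin/hadoop fs -ls /   # the "Unable to load native-hadoop library" warning should no longer appear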
