Compiling Hadoop 2 on 64-bit Linux
The Hadoop 2 packages released by Apache are built on 32-bit machines, so their bundled native libraries are 32-bit. Production environments usually run 64-bit Linux, which means Hadoop has to be rebuilt from source on a 64-bit machine.
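A quick way to confirm this on an existing installation is to inspect the bundled native library (a sketch; the path assumes the official hadoop-2.2.0 binary tarball has been extracted into ./hadoop-2.2.0):
# On the official release this typically reports "ELF 32-bit LSB shared object"
file hadoop-2.2.0/lib/native/libhadoop.so.1.0.0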
The build requirements are listed in BUILDING.txt inside hadoop-2.2.0-src:
Build instructions for Hadoop
----------------------------------------------------------------------------------
Requirements:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
----------------------------------------------------------------------------------
...
Steps:
1. Install JDK 1.6+ (verify with: java -version)
2. Install Maven 3.0 or later (verify with: mvn -version)
3. Install ProtocolBuffer 2.5.0 (verify with: protoc --version). Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
# Building and installing protobuf needs internet access so the build dependencies can be installed online with YUM.
# Without internet access this gets tedious: every dependency package has to be downloaded and installed by hand.
sudo yum install gcc
sudo yum install gcc-c++
sudo yum install make
# Extract protobuf
sudo tar -zxvf protobuf-2.5.0.tar.gz
# Change into protobuf-2.5.0
cd protobuf-2.5.0
# Configure, build, and install
sudo ./configure
sudo make
sudo make install
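After installation it helps to refresh the shared-library cache and confirm the version (a sketch; the expected output assumes protobuf was installed to the default /usr/local prefix):
# Refresh the dynamic linker cache so libprotobuf can be found
sudo ldconfig
# Should print: libprotoc 2.5.0
protoc --version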
4. Install CMake 2.6 or newer
sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel
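A quick check that the CMake pulled in by YUM meets the 2.6 minimum (the exact version reported depends on the distribution's repositories):
# Must report version 2.6 or newer for the native build
cmake --version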
5. Build hadoop-2.2.0
# Extract hadoop-2.2.0-src.tar.gz
tar -zxvf hadoop-2.2.0-src.tar.gz
# Change into hadoop-2.2.0-src
cd hadoop-2.2.0-src
# hadoop-common-project/hadoop-auth/pom.xml has a bug in this release and needs a small fix
vim hadoop-common-project/hadoop-auth/pom.xml
# Add the following inside the <dependencies> tag
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
# Build
mvn package -DskipTests -Pdist,native
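BUILDING.txt also documents a -Dtar switch; a variant like the one below (a sketch, not required for this walkthrough) additionally packages the distribution as a binary tarball under hadoop-dist/target:
# Same build, but also produce a .tar.gz of the distribution in hadoop-dist/target
mvn package -DskipTests -Pdist,native -Dtar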
The compiled hadoop-2.2.0 distribution is placed under hadoop-2.2.0-src/hadoop-dist/target. A successful run ends with a Reactor Summary like this:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [ 1.228 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [ 0.894 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [ 1.809 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [ 0.222 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [ 1.198 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [ 2.205 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [ 2.169 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [ 1.583 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [01:02 min]
[INFO] Apache Hadoop NFS ................................. SUCCESS [ 5.132 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [ 0.038 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [01:02 min]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [ 9.002 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [ 4.995 s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [ 2.647 s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [ 0.058 s]
[INFO] hadoop-yarn ....................................... SUCCESS [ 0.138 s]
[INFO] hadoop-yarn-api ................................... SUCCESS [ 31.854 s]
[INFO] hadoop-yarn-common ................................ SUCCESS [ 20.121 s]
[INFO] hadoop-yarn-server ................................ SUCCESS [ 0.105 s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [ 5.776 s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [ 10.490 s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [ 3.321 s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [ 8.311 s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [ 0.510 s]
[INFO] hadoop-yarn-client ................................ SUCCESS [ 3.929 s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [ 0.060 s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [ 1.720 s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [ 0.062 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [ 16.204 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [ 1.779 s]
[INFO] hadoop-yarn-site .................................. SUCCESS [ 0.111 s]
[INFO] hadoop-yarn-project ............................... SUCCESS [ 2.287 s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [ 11.855 s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [ 2.560 s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [ 6.985 s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [ 3.319 s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [ 4.021 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [ 1.508 s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [ 4.176 s]
[INFO] hadoop-mapreduce .................................. SUCCESS [ 2.367 s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [ 2.902 s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [ 5.365 s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [ 1.673 s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [ 4.095 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [ 2.962 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [ 2.089 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [ 2.190 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [ 5.887 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [ 1.149 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [ 0.028 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [ 6.514 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [ 2.199 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [ 0.121 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS (#### BUILD SUCCESS here means the compilation succeeded ####)
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:39 min
[INFO] Finished at: 2014-05-23T13:37:06+08:00
[INFO] Final Memory: 135M/300M
[INFO] ------------------------------------------------------------------------
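To confirm the rebuild actually produced 64-bit native libraries, inspect the new libhadoop (a sketch; the path assumes the default layout of the hadoop-dist target directory):
# Should now report "ELF 64-bit LSB shared object"
file hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0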