Building hadoop-2.5.1 on 64-bit Linux
Apache Hadoop ecosystem packages can be downloaded from: http://archive.apache.org/dist/
Software install directory: ~/app
jdk: jdk-7u45-linux-x64.rpm
hadoop: hadoop-2.5.1-src.tar.gz
maven: apache-maven-3.0.5-bin.zip
protobuf: protobuf-2.5.0.tar.gz
1. Download hadoop
wget http://archive.apache.org/dist/hadoop/core/stable/hadoop-2.5.1-src.tar.gz
tar -zxvf hadoop-2.5.1-src.tar.gz
The extracted hadoop root directory contains a BUILDING.txt file that lists the build requirements:
Requirements:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
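Before starting, a quick preflight check (a sketch; the tool names below are the usual binaries, not taken from BUILDING.txt itself) can report which of these requirements are already on the PATH:

```shell
# Report which of the BUILDING.txt build tools are already installed.
for tool in java mvn protoc cmake gcc g++ make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
  fi
done
```

Anything reported missing is installed in the steps below.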
2. Install the JDK
sudo yum install jdk-7u45-linux-x64.rpm
Locate the installed JDK:
which java
/usr/java/jdk1.7.0_45/bin/java
Add the JDK to the environment (~/.bash_profile):
export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=.:$JAVA_HOME/bin:$PATH
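Note that ~/.bash_profile is only read by login shells; a minimal sketch to apply the change to the current shell and confirm it took effect:

```shell
# Reload the profile (if present) so the new variables apply immediately.
if [ -f ~/.bash_profile ]; then . ~/.bash_profile; fi
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"
```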
Verify:
java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
3. Install Maven
wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.zip
unzip apache-maven-3.0.5-bin.zip
Add Maven to the environment (~/.bash_profile):
export MAVEN_HOME=/home/hadoop/app/apache-maven-3.0.5
export PATH=.:$MAVEN_HOME/bin:$PATH
Verify:
mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; ...)
Maven home: /home/hadoop/app/apache-maven-3.0.5
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"
4. Install protobuf
The official protobuf site may not be reachable, so download the protobuf tarball yourself. Building protobuf from source requires gcc, gcc-c++, and make:
sudo yum install gcc
sudo yum install gcc-c++
sudo yum install make
tar -zvxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc/
sudo make
sudo make install
Add protobuf to the environment (~/.bash_profile):
export PATH=.:/usr/local/protoc/bin:$PATH
Verify:
protoc --version
libprotoc 2.5.0
5. Install the remaining dependencies
sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel
6. Build the hadoop source
cd ~/app/hadoop-2.5.1-src
mvn package -DskipTests -Pdist,native
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS
[INFO] Apache Hadoop Project POM ......................... SUCCESS
[INFO] Apache Hadoop Annotations ......................... SUCCESS
[INFO] Apache Hadoop Assemblies .......................... SUCCESS
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS
[INFO] Apache Hadoop Auth ................................ SUCCESS
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS
[INFO] Apache Hadoop Common .............................. SUCCESS
[INFO] Apache Hadoop NFS ................................. SUCCESS
[INFO] Apache Hadoop Common Project ...................... SUCCESS
[INFO] Apache Hadoop HDFS ................................ SUCCESS
[INFO] Apache Hadoop HttpFS .............................. SUCCESS
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS
[INFO] hadoop-yarn ....................................... SUCCESS
[INFO] hadoop-yarn-api ................................... SUCCESS
[INFO] hadoop-yarn-common ................................ SUCCESS
[INFO] hadoop-yarn-server ................................ SUCCESS
[INFO] hadoop-yarn-server-common ......................... SUCCESS
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
[INFO] hadoop-yarn-server-tests .......................... SUCCESS
[INFO] hadoop-yarn-client ................................ SUCCESS
[INFO] hadoop-yarn-applications .......................... SUCCESS
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
[INFO] hadoop-yarn-site .................................. SUCCESS
[INFO] hadoop-yarn-project ............................... SUCCESS
[INFO] hadoop-mapreduce-client ........................... SUCCESS
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS
[INFO] hadoop-mapreduce-client-common .................... SUCCESS
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
[INFO] hadoop-mapreduce .................................. SUCCESS
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS
[INFO] Apache Hadoop Archives ............................ SUCCESS
[INFO] Apache Hadoop Rumen ............................... SUCCESS
[INFO] Apache Hadoop Gridmix ............................. SUCCESS
[INFO] Apache Hadoop Data Join ........................... SUCCESS
[INFO] Apache Hadoop Extras .............................. SUCCESS
[INFO] Apache Hadoop Pipes ............................... SUCCESS
[INFO] Apache Hadoop OpenStack support ................... SUCCESS
[INFO] Apache Hadoop Client .............................. SUCCESS
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS
[INFO] Apache Hadoop Tools ............................... SUCCESS
[INFO] Apache Hadoop Distribution ........................ SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Final Memory: 91M/324M
[INFO] ------------------------------------------------------------------------
The build output lands in hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1; to set up a hadoop environment later, deploy that hadoop-2.5.1 folder directly.
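To confirm the rebuild really produced 64-bit native code, inspect the library with `file` (a sketch assuming the default dist layout under the source tree; running `hadoop checknative -a` from the built distribution gives a fuller per-library report):

```shell
# Check the word size of the freshly built native library.
LIB=~/app/hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1/lib/native/libhadoop.so.1.0.0
if [ -f "$LIB" ]; then
  file "$LIB"   # a 64-bit build reports: ELF 64-bit LSB shared object, x86-64 ...
else
  echo "native library not found at $LIB"
fi
```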