Compiling hadoop-2.5.1 on 64-bit Linux
Download location for Apache Hadoop ecosystem packages: http://archive.apache.org/dist/
Software install directory: ~/app
jdk: jdk-7u45-linux-x64.rpm
hadoop: hadoop-2.5.1-src.tar.gz
maven: apache-maven-3.0.5-bin.zip
protobuf: protobuf-2.5.0.tar.gz
1. Download hadoop
wget http://archive.apache.org/dist/hadoop/core/stable/hadoop-2.5.1-src.tar.gz
tar -zxvf hadoop-2.5.1-src.tar.gz
The BUILDING.txt file in the root of the extracted source tree lists the build requirements:
Requirements:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
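The requirements above can be checked up front before committing to a long build. A minimal sketch (the `check` helper is hypothetical, not part of the Hadoop tooling; tool names are assumed to be on PATH once installed):

```shell
# Report whether each build prerequisite is visible on PATH.
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "OK: $1 found"
  else
    echo "MISSING: $1"
  fi
}

# Tools needed for a native build, per BUILDING.txt.
for tool in java mvn protoc cmake; do
  check "$tool"
done
```

Anything reported MISSING here corresponds to one of the installation steps below.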
2. Install the JDK
sudo yum install jdk-7u45-linux-x64.rpm
Locate the installation:
which java
/usr/java/jdk1.7.0_45/bin/java
Add the JDK to the environment (~/.bash_profile):
export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=.:$JAVA_HOME/bin:$PATH
Verify:
java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
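Instead of hard-coding the install path, JAVA_HOME can also be derived from whichever `java` the shell resolves. A sketch, assuming `readlink -f` is available (it is on CentOS and most Linux distributions):

```shell
# Resolve the real java binary behind any symlinks, then strip the trailing
# /bin/java to get the JDK root, e.g.
# /usr/java/jdk1.7.0_45/bin/java -> /usr/java/jdk1.7.0_45
if command -v java >/dev/null 2>&1; then
  java_bin=$(readlink -f "$(command -v java)")
  JAVA_HOME=${java_bin%/bin/java}
  echo "JAVA_HOME=$JAVA_HOME"
fi
```

This is handy when several JDKs are installed, since it always matches the `java` that `which java` reports.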
3. Install maven
wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.zip
unzip apache-maven-3.0.5-bin.zip
Add maven to the environment (~/.bash_profile):
export MAVEN_HOME=/home/hadoop/app/apache-maven-3.0.5
export PATH=.:$MAVEN_HOME/bin:$PATH
Verify:
mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; ...)
Maven home: /home/hadoop/app/apache-maven-3.0.5
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"
4. Install protobuf
The official protobuf site may be unreachable, so download the tarball from a mirror yourself. Building protobuf from source requires gcc, gcc-c++, and make:
sudo yum install gcc
sudo yum install gcc-c++
sudo yum install make
tar -zvxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc/
sudo make
sudo make install
Add protobuf to the environment (~/.bash_profile):
export PATH=.:/usr/local/protoc/bin:$PATH
Verify:
protoc --version
libprotoc 2.5.0
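Beyond checking the version string, protoc can be exercised end-to-end by compiling a throwaway .proto file. A sketch (the file and message names are made up for illustration; the proto2 syntax matches protobuf 2.5):

```shell
# Write a trivial proto2 message definition...
cat > /tmp/ping.proto <<'EOF'
message Ping {
  required string msg = 1;
}
EOF

# ...and compile it to Java sources. A working install prints "protoc OK".
if command -v protoc >/dev/null 2>&1; then
  protoc --proto_path=/tmp --java_out=/tmp /tmp/ping.proto && echo "protoc OK"
else
  echo "protoc not on PATH"
fi
```

This matters because the Hadoop build invokes protoc in exactly this way to generate its RPC stubs; if this fails, the build will too.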
5. Install the remaining dependencies
sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel
6. Build the hadoop source
cd ~/app/hadoop-2.5.1-src
mvn package -DskipTests -Pdist,native
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [.980s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [.575s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [.324s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [.318s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [.550s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [.548s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [.410s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [.503s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [.915s]
[INFO] Apache Hadoop Common .............................. SUCCESS [:.913s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [.324s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [.064s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [:.023s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [.389s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [.235s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [.493s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [.041s]
[INFO] hadoop-yarn ....................................... SUCCESS [.031s]
[INFO] hadoop-yarn-api ................................... SUCCESS [:.828s]
[INFO] hadoop-yarn-common ................................ SUCCESS [.542s]
[INFO] hadoop-yarn-server ................................ SUCCESS [.047s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [.953s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [.537s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [.270s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [.840s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [.877s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [.421s]
[INFO] hadoop-yarn-client ................................ SUCCESS [.406s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [.025s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [.208s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [.885s]
[INFO] hadoop-yarn-site .................................. SUCCESS [.058s]
[INFO] hadoop-yarn-project ............................... SUCCESS [.870s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [.065s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [.292s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [.197s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [.229s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [.322s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [.640s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [.154s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [.939s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [.088s]
[INFO] hadoop-mapreduce .................................. SUCCESS [.979s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [.615s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [.668s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [.014s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [.567s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [.398s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [.151s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [.251s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [.901s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [.722s]
[INFO] Apache Hadoop Client .............................. SUCCESS [.021s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [.095s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [.776s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [.768s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [.035s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [.571s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: :.071s
[INFO] Finished at: Sat Nov :: PST
[INFO] Final Memory: 91M/324M
[INFO] ------------------------------------------------------------------------
The build output lands in hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1; to set up a hadoop environment later, deploy that hadoop-2.5.1 directory directly.
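To confirm the native libraries actually came out 64-bit (the whole point of rebuilding), `file` can inspect the ELF header. A sketch, with the library path assumed to follow the dist layout above:

```shell
# Report whether a binary or shared library is a 64-bit ELF object.
is64bit() {
  if file -L "$1" 2>/dev/null | grep -q "64-bit"; then
    echo "64-bit"
  else
    echo "not 64-bit (or not found)"
  fi
}

# Check the freshly built native library (adjust the path to your tree).
is64bit ~/app/hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1/lib/native/libhadoop.so.1.0.0
```

Once the distribution is deployed, `hadoop checknative -a` gives a similar report from Hadoop's own point of view, listing which native libraries it can load.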