step 2. Download the source tarball: http://mirrors.hust.edu.cn/apache/hadoop/common/stable2/
wget http://mirrors.hust.edu.cn/apach ... op-2.4.1-src.tar.gz
step 3. Set up Maven as the build tool for Hadoop.
a. Download a pre-built Maven package:
wget http://mirror.bit.edu.cn/apache/ ... en-3.1.1-bin.tar.gz
tar -zxvf apache-maven-3.1.1-bin.tar.gz -C /opt/
b. Configure the Maven environment variables by appending the following to the end of /etc/profile:
export MAVEN_HOME=/opt/apache-maven-3.1.1
export PATH=$PATH:${MAVEN_HOME}/bin
c. Run the following command to make the changes take effect:
source /etc/profile
d. Verify the Maven installation:
mvn -version
e. Since Maven's overseas servers may be unreachable, configure a domestic (Chinese) mirror for Maven first.
In the Maven directory, edit conf/settings.xml and add the following inside <mirrors></mirrors> (be careful not to paste it inside a comment block):
<mirror>
<id>nexus-osc</id>
<mirrorOf>*</mirrorOf>
<name>Nexusosc</name>
<url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
In the same conf/settings.xml, add the following inside <profiles></profiles> (again, be careful not to paste it inside a comment block):
<profile>
<id>jdk-1.7</id>
<activation>
<jdk>1.7</jdk>
</activation>
<repositories>
<repository>
<id>nexus</id>
<name>local private nexus</name>
<url>http://maven.oschina.net/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>nexus</id>
<name>local private nexus</name>
<url>http://maven.oschina.net/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
</profile>
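A quick sanity check that the mirror entry actually landed in the file. This is a sketch: the here-doc below is sample data standing in for conf/settings.xml, so run the grep against the real file on your machine (`mvn help:effective-settings` gives a fuller view of the merged configuration):

```shell
# Count lines containing the mirror URL; expect at least 1.
# The here-doc is sample data; on a real box grep conf/settings.xml instead.
count=$(grep -c 'maven.oschina.net' <<'EOF'
<mirrors>
  <mirror>
    <id>nexus-osc</id>
    <mirrorOf>*</mirrorOf>
    <url>http://maven.oschina.net/content/groups/public/</url>
  </mirror>
</mirrors>
EOF
)
echo "mirror entries: $count"
```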
step 4. Building Hadoop 2.4.1 requires protoc 2.5.0, so download and install it as well.
Official site: https://code.google.com/p/protobuf/downloads/list
Baidu Pan mirror: http://pan.baidu.com/s/1pJlZubT
a. Before building protoc, install its build dependencies: gcc, gcc-c++ and make (skip any that are already installed):
yum install gcc
yum install gcc-c++
yum install make
b. Build and install protoc:
tar -xvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure --prefix=/opt/protoc/
make && make install
The --prefix argument above (/opt/protoc/) is a custom protoc install directory. Once the commands finish, add protoc to the PATH as well, again at the end of /etc/profile:
export PATH=/opt/protoc/bin:$PATH
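After running `source /etc/profile` again, `protoc --version` should report 2.5.0. A minimal sketch of checking this in a script follows; the sample string stands in for the real command output, since protoc may not be installed on the machine reading this:

```shell
# On a real box use:  ver=$(protoc --version)   # prints "libprotoc 2.5.0"
ver="libprotoc 2.5.0"
num=${ver#libprotoc }          # strip the "libprotoc " prefix
if [ "$num" = "2.5.0" ]; then
  echo "protoc OK"
else
  echo "wrong protoc version: $num" >&2
fi
```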
step 5. Don't start the Hadoop build just yet, or you will run into a string of errors; first install the cmake, openssl-devel and ncurses-devel dependencies:
yum install cmake
yum install openssl-devel
yum install ncurses-devel
step 6. Extract the Hadoop source and build it (make sure to cd into the extracted source directory before running mvn):
tar -zxvf hadoop-2.4.1-src.tar.gz -C /opt/
cd /opt/hadoop-2.4.1-src
mvn package -Pdist,native -DskipTests -Dtar
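If the build aborts with `java.lang.OutOfMemoryError`, a common workaround (an assumption here, not an error the original author reports hitting) is to enlarge the Maven JVM heap and re-run the mvn command:

```shell
# Raise Maven's heap before re-running mvn package; the values are a guess
# that suits a machine with ~2 GB of RAM -- tune them for yours.
export MAVEN_OPTS="-Xms256m -Xmx1024m"
```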
The build takes quite a while; be patient. On success, the tail of the output looks like this:
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [33.648s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [7.303s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [21.288s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [14.611s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [8.334s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [10.737s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [18.321s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [17.136s]
[INFO] Apache Hadoop Client .............................. SUCCESS [14.563s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.254s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [17.245s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [14.478s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.084s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [41.979s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:24.464s
[INFO] Finished at: Thu Aug 07 14:25:51 CST 2014
[INFO] Final Memory: 159M/814M
[INFO] ------------------------------------------------------------------------
The built distribution ends up in: hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
Enter the hadoop-2.4.1 directory and verify that the build succeeded:
[iyunv@cupcs-redhat6 bin]# cd /opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/bin
[iyunv@cupcs-redhat6 bin]# ./hadoop version
Hadoop 2.4.1
Subversion Unknown -r Unknown
Compiled by root on 2014-08-07T05:52Z
Compiled with protoc 2.5.0
From source with checksum bb7ac0a3c73dc131f4844b873c74b630
This command was run using
/opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/share/hadoop/common/hadoop-common-2.4.1.jar
[iyunv@cupcs-redhat6 bin]# cd ..
[iyunv@cupcs-redhat6 hadoop-2.4.1]# file lib/native/*
lib/native/libhadoop.a: current ar archive
lib/native/libhadooppipes.a: current ar archive
lib/native/libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1
(SYSV), dynamically linked, not stripped
lib/native/libhadooputils.a: current ar archive
lib/native/libhdfs.a: current ar archive
lib/native/libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
lib/native/libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1
(SYSV), dynamically linked, not stripped
[iyunv@cupcs-redhat6 hadoop-2.4.1]#
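The `file` output above confirms the native libraries are 64-bit ELF objects. If `file` is unavailable, the ELF class byte can be read directly from the header; here is a sketch, demonstrated on a fabricated header -- on a real build, point it at lib/native/libhadoop.so.1.0.0 instead:

```shell
# Byte 5 of an ELF header encodes the class: 1 = 32-bit, 2 = 64-bit.
# Fabricate a minimal 64-bit header for the demo (0x7f 'E' 'L' 'F' 0x02).
printf '\177ELF\002' > /tmp/elfhdr.bin
# -An: no offsets, -j4: skip 4 bytes, -N1: read 1 byte, -tu1: unsigned decimal
class=$(od -An -j4 -N1 -tu1 /tmp/elfhdr.bin | tr -d ' ')
if [ "$class" = "2" ]; then
  echo "64-bit ELF"
else
  echo "not a 64-bit ELF (class=$class)"
fi
```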