1. Install the JDK
Download JDK 7u55 and install it.
Both the JDK and the JRE are required; the JDK ships tools.jar, which the build depends on.
Install it under /java.
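As a quick sanity check (a sketch; it assumes the JDK really was installed under /java as above), confirm the version and that tools.jar is present:
# verify the JDK version and the presence of tools.jar
/java/bin/java -version
ls /java/lib/tools.jar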
2. Download the Hadoop 2.2.0 source code
wget http://apache.dataguru.cn/hadoop/common/stable/hadoop-2.2.0-src.tar.gz
Extract it:
tar zxvf hadoop-2.2.0-src.tar.gz -C /tmp
3. Install dependency packages
yum -y install lzo-devel zlib-devel gcc autoconf automake libtool gcc-c++ openssl-devel
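If you want to confirm the packages actually landed before moving on, a quick check (just a sketch, using the package names installed above):
# confirm the compiler and the development headers are in place
gcc --version
rpm -q lzo-devel zlib-devel openssl-devel autoconf automake libtool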
4. Install the build tools
Build and install Protobuf
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make install
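To confirm Protobuf built correctly (a sketch; the prefix matches the configure step above, and 2.5.0 is the version Hadoop 2.2.0 expects):
# should print: libprotoc 2.5.0
/usr/local/protobuf/bin/protoc --version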
Install Ant
tar -zxvf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /usr/local/ant
Install Maven
tar -zxvf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /usr/local/maven
Install Findbugs
tar -zxvf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /usr/local/findbugs
Build and install cmake
tar -zvxf cmake-2.8.8.tar.gz
cd cmake-2.8.8
./bootstrap
make
make install
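Once cmake is installed, a quick version check (a sketch; with no prefix given, cmake's bootstrap installs into /usr/local/bin by default):
# should report cmake version 2.8.8
cmake --version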
5. Configure environment variables
vim /etc/profile
#java
export JAVA_HOME=/java
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
#maven
export MAVEN_HOME=/usr/local/maven
export MAVEN_OPTS="-Xms256m -Xmx512m"
export CLASSPATH=.:$CLASSPATH:$MAVEN_HOME/lib
export PATH=$PATH:$MAVEN_HOME/bin
#protobuf
export PROTOBUF_HOME=/usr/local/protobuf
export CLASSPATH=.:$CLASSPATH:$PROTOBUF_HOME/lib
export PATH=$PATH:$PROTOBUF_HOME/bin
#ant
export ANT_HOME=/usr/local/ant
export CLASSPATH=.:$CLASSPATH:$ANT_HOME/lib
export PATH=$PATH:$ANT_HOME/bin
#findbugs
export FINDBUGS_HOME=/usr/local/findbugs
export CLASSPATH=.:$CLASSPATH:$FINDBUGS_HOME/lib
export PATH=$PATH:$FINDBUGS_HOME/bin
source /etc/profile
This makes the new variables take effect immediately.
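With the profile sourced, you can confirm the main tools resolve from PATH (a sketch; the exact version strings depend on what you installed):
# each command should print its version without errors
java -version
mvn -version
ant -version
protoc --version
cmake --version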
6. Fix a dependency bug
vim hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml
Add the following to the dependencies section:
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
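Before starting the full build, it does not hurt to confirm the edited POM is still well-formed XML (a sketch; xmllint ships with libxml2 and may need to be installed separately):
# exits silently if the XML parses, prints errors otherwise
xmllint --noout hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml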
7. Compile
cd hadoop-2.2.0-src
mvn clean package -Pdist,native -DskipTests -Dtar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [10.796s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [8.171s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [18.306s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [1.704s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [8.222s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [17.120s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [15.952s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [12.085s]
[INFO] Apache Hadoop Common .............................. SUCCESS [4:57.617s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [25.393s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.231s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:51.635s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:27.220s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [59.011s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [11.979s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.195s]
[INFO] hadoop-yarn ....................................... SUCCESS [1:41.292s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:53.028s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:47.889s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.712s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [38.517s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [53.352s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [13.733s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [49.935s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [3.230s]
[INFO] hadoop-yarn-client ................................ SUCCESS [23.036s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.690s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [7.623s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.581s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:26.644s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [8.783s]
[INFO] hadoop-yarn-site .................................. SUCCESS [1.217s]
[INFO] hadoop-yarn-project ............................... SUCCESS [30.587s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:19.185s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [17.693s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [41.929s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [18.209s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [24.663s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [7.631s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [22.663s]
[INFO] hadoop-mapreduce .................................. SUCCESS [10.093s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [19.489s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.046s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [7.621s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [20.543s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [15.156s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.968s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [9.504s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [15.708s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [5.261s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.268s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:15.418s]
[INFO] Apache Hadoop Client .............................. SUCCESS [29.025s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.735s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:15.365s
[INFO] Finished at: Fri May 16 16:15:37 CST 2014
[INFO] Final Memory: 101M/385M
[INFO] ------------------------------------------------------------------------
When the build finishes, a package named hadoop-2.2.0.tar.gz is produced under hadoop-2.2.0-src/hadoop-dist/target/; this is the final deployable Hadoop distribution.
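Since the main reason to build from source is usually the native libraries, it is worth checking that they were actually produced (a sketch; the paths assume the default layout of the generated distribution):
# the distribution should contain native libraries under lib/native
tar -tzf hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0.tar.gz | grep 'lib/native'
file hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0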
The build may fail partway through for a variety of reasons, most often because a dependency download from a remote repository did not succeed.
In that case simply rerun: mvn clean package -Pdist,native -DskipTests -Dtar
After a few attempts it usually completes.
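If you would rather not babysit the retries, a small wrapper loop can rerun the build automatically (a sketch; the limit of 5 attempts is an arbitrary choice):
# retry the build up to 5 times, stopping at the first success
for i in 1 2 3 4 5; do
    mvn clean package -Pdist,native -DskipTests -Dtar && break
done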