
Compiling the Hadoop 2.8.0 Source Locally: Summary and Troubleshooting (reposted from http://blog.csdn.net/young_kim1/article/details/50324345)

Published: 2024/9/27
This article, collected and organized by 生活随笔, summarizes compiling the Hadoop 2.8.0 source locally; it is shared here for reference.

Build Preparation

1. Download the required software

First download the Hadoop 2.8.0 source from the official site and extract it, then open BUILDING.txt in the extracted directory; the build process and the software it needs follow what that document describes.

(It can be downloaded with: wget http://mirrors.hust.edu.cn/apache/hadoop/common/hadoop-2.8.0/hadoop-2.8.0-src.tar.gz)

Requirements:

  • Unix System
  • JDK 1.7+
  • Maven 3.0 or later
  • Findbugs 1.3.9 (if running findbugs)
  • ProtocolBuffer 2.5.0
  • CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
  • Zlib devel (if compiling native code)
  • openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
  • Jansson C XML parsing library (if compiling libwebhdfs)
  • Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
  • Internet connection for first build (to fetch all Maven and Hadoop dependencies)

These are the tools required for the build.

They include:

  • JDK 1.7+
  • Maven 3.0 or later
  • Findbugs 1.3.9
  • ProtocolBuffer 2.5.0
  • CMake 2.6
  • zlib-devel
  • openssl-devel
  • According to other users' notes, autoconf, automake, gcc, and the like are also needed.
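Before installing anything, it can save time to check which of these tools are already present. A minimal sketch, not from the original post: the tool list and the `version_ge` helper are my additions, and `version_ge` relies on GNU `sort -V`.

```shell
#!/bin/sh
# Report which build prerequisites are already on the PATH.
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found:   $1"
  else
    echo "MISSING: $1"
  fi
}

# Compare dotted version strings: version_ge ACTUAL REQUIRED
# succeeds when ACTUAL >= REQUIRED (relies on GNU 'sort -V').
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

for tool in java mvn protoc cmake gcc autoconf automake; do
  check "$tool"
done
```

Anything reported as MISSING is covered by the yum and tarball installs below.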

2. Install the software

1> Install JDK 1.7 and configure its environment variables; this is covered in an earlier post, so it is not repeated here.

2> Install the various libraries

yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel

3> Install Maven

Extract it, then configure the environment variables; see the earlier post for the directory layout.

tar zxvf apache-maven-3.3.9-bin.tar.gz
vi /etc/profile

Append to the end of the profile file:

export MAVEN_HOME=/home/toto/software/apache-maven-3.3.9
export MAVEN_OPTS="-Xms256m -Xmx512m"   (optional)
export PATH=$PATH:$MAVEN_HOME/bin

Then go into /home/toto/software/apache-maven-3.3.9/conf and edit settings.xml to change where the local Maven repository is created:

<localRepository>/home/toto/software/repo</localRepository>

Save and reload the environment with source /etc/profile, then run mvn -version; output like the following means Maven is installed and configured correctly.
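The settings.xml change above, shown as a minimal fragment. This is only a sketch: a real settings.xml carries the Maven schema attributes on the <settings> element, which are omitted here.

```xml
<!-- conf/settings.xml: keep the local repository on a path with disk space -->
<settings>
  <localRepository>/home/toto/software/repo</localRepository>
</settings>
```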

[root@hadoop ~]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /home/toto/software/apache-maven-3.3.9
Java version: 1.8.0_73, vendor: Oracle Corporation
Java home: /usr/local/java/jdk1.8.0_73/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-573.el6.x86_64", arch: "amd64", family: "unix"
[root@hadoop ~]#

4> Install ProtocolBuffer

Extract it, then build and install:

tar zxvf protobuf-2.5.0.tar.gz
cd /home/toto/software/protobuf-2.5.0
./configure
make
make install

Run protoc --version; output like the following means it is installed and configured correctly.

[hadoop@Master ~]$ protoc --version
libprotoc 2.5.0
[hadoop@Master ~]$

5> Install Findbugs

Extract it, then configure the environment variables:

unzip findbugs-1.3.9.zip
vi /etc/profile

Append to the end of the profile file:

export FINDBUGS_HOME=/home/toto/software/findbugs-1.3.9
export PATH=$PATH:$FINDBUGS_HOME/bin

Save and reload the environment with source /etc/profile, then run findbugs -version; output like the following means it is installed and configured correctly.

[hadoop@Master ~]$ findbugs -version
1.3.9
[hadoop@Master ~]$

Start the Build

First make sure the machine can reach the Internet, and keep the network up throughout the build. Then change into the extracted hadoop-2.8.0 source directory and run:

cd /home/toto/software/hadoop-2.8.0-src
mvn package -Pdist,native -DskipTests -Dtar

Or alternatively:

cd /home/toto/software/hadoop-2.8.0-src
mvn package -Pdist,native,docs,src -DskipTests -Dtar

The former builds only the native code; the latter builds the native code plus the documentation and source packages, so the former is faster.

What follows is a long wait. When output like the following appears, the build has succeeded.

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  3.533 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.023 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.679 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.275 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.875 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.856 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  4.340 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.534 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  5.398 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:02 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.653 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 24.501 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.112 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [07:28 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 41.608 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 10.673 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  7.225 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.057 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.110 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [03:36 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 45.418 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.164 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 12.942 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 19.200 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.315 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  7.855 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 24.347 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  6.439 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  6.393 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [  3.445 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.075 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.304 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.026 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.155 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  7.255 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 11.871 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.254 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 26.029 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 25.002 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  3.792 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  7.797 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  5.143 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  6.771 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.837 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.513 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  6.842 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  4.355 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 14.910 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.844 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  6.931 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  3.937 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.499 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.268 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.739 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  5.793 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.444 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  4.258 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 47.689 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 19.524 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.305 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  5.581 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 25.708 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.281 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:53 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:49 min
[INFO] Finished at: 2015-12-11T20:29:45+08:00
[INFO] Final Memory: 110M/493M
[INFO] ------------------------------------------------------------------------

The built file is under ../hadoop-dist/target/hadoop-2.8.0.tar.gz.

To see the final build output:

cd /home/toto/software/hadoop-2.8.0-src/hadoop-dist/target
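To confirm that the native libraries in the freshly built tarball actually load, Hadoop ships a `checknative` command. A sketch, assuming the build above finished; unpacking to /tmp is an arbitrary choice of mine, not from the original post.

```shell
#!/bin/sh
# Unpack the freshly built distribution and ask it which native
# libraries (zlib, snappy, openssl, ...) it can load.
TARBALL=/home/toto/software/hadoop-2.8.0-src/hadoop-dist/target/hadoop-2.8.0.tar.gz
if [ -f "$TARBALL" ]; then
  tar zxf "$TARBALL" -C /tmp
  /tmp/hadoop-2.8.0/bin/hadoop checknative -a
else
  echo "tarball not found: $TARBALL (run the build first)"
fi
```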


Problems Encountered During the Build

Error 1

Connection to http://repo.maven.apache.org refused

This means the connection to the remote Maven repository was refused. Just re-run the build command and it will continue downloading the jars.

Error 2

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-nfs: Compilation failure: Compilation failure:
[ERROR] /home/hadoop/toolkits/hadoop-2.7.1-src/hadoop-common-project/hadoop-nfs/src/main/java/org/apache/hadoop/oncrpc/XDR.java:[23,30] package org.jboss.netty.buffer does not exist

This error is probably rare. I had switched to a third-party mirror because repo.maven.apache.org felt slow, and the resulting misconfiguration in Maven's settings file caused it. Going back to the default repository fixed it, slow as it is.

Error 3

[ERROR] around Ant part ...<exec dir="/opt/soft/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target" executable="sh" failonerror="true">... @ 10:123 in /opt/soft/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/antrun/build-main.xml
[ERROR] -> [Help 1]

This happens when the apache-tomcat-6.0.41.tar.gz archive, which is large, did not download completely. Go to .../hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz, delete it, and let the build download it again.
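The fix just described, as a small sketch; the source directory path is taken from the error message above, and `rm -f` is safe to re-run even if the archive is already gone.

```shell
#!/bin/sh
# Delete the truncated Tomcat archive so the next 'mvn package' run
# downloads a fresh copy.
cd /opt/soft/hadoop-2.7.1-src 2>/dev/null || true  # path from the error above
rm -f hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz
echo "removed stale Tomcat archive (if any); re-run the build"
```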

Reminders:
1. Sometimes downloading a particular package takes far too long because the connection to the site hangs; press Ctrl+C and re-run the build command.
2. If some file turns out to be missing, clean Maven first (with mvn clean) and then rebuild.
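Reminder 1 can be automated: wrap the build command in a small retry loop so a hung download only costs one attempt. A sketch, not from the original post; the cap of three attempts is an arbitrary choice.

```shell
#!/bin/sh
# Retry the Hadoop build a few times; transient repository hangs or
# refused connections often clear up on the next run.
build() {
  mvn package -Pdist,native -DskipTests -Dtar
}

for attempt in 1 2 3; do
  echo "build attempt $attempt"
  if build; then
    echo "build succeeded on attempt $attempt"
    break
  fi
done
```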

    ====================================

    寫在最后:

    經過了較長時間的折磨,終于成功,可能還是自己比較菜吧,有編譯不成功的歡迎留言探討。
