
Compiling the Hadoop source code on Win7 (64-bit)


Notes:

  When building the Hadoop source code on Windows, the build instructions file shipped with the source package (BUILDING.txt) describes the requirements as follows.

Prerequisites listed in the official Hadoop documentation:

* Windows System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer
* Windows SDK 7.1 or Visual Studio 2010 Professional [not used; I worked from a plain cmd prompt instead]
* Windows SDK 8.1 (if building CPU rate control for the container executor) [not used; Windows SDK 7 and Windows SDK 8 are two options, pick one]
* zlib headers (if building native code bindings for zlib) [not used]
* Internet connection for first build (to fetch all Maven and Hadoop dependencies) [I used nexus-2.14.1-01-bundle.zip instead]
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. [not used]

These tools must be present on your PATH.
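Before going further, it is easy to confirm from a cmd window that the tools above are actually reachable on PATH (JAVA_HOME should also point at the JDK so Maven can find it; FindBugs only matters if you intend to run the findbugs goals):

  rem Verify the build tools are visible on PATH
  java -version
  mvn -version
  protoc --version
  cmake --version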

1. Files I prepared:

  (1).apache-maven-3.0.4

  (2).protoc-2.5.0-win32.zip

  (3).findbugs-1.3.9.zip

  (4).cmake-3.8.0-rc2-win64-x64.zip

2. Add the following entries to the system Path variable:

  F:\Maven\apache-maven-3.0.4-bin\apache-maven-3.0.4\bin;
  E:\Linux\google-protobuf-2.5.0\protoc-2.5.0-win32;
  E:\Linux\findbugs-1.3.9\bin;
  E:\Linux\cmake-3.8.0-rc2-win64-x64\bin;
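For a quick test, the same entries can also be appended to PATH for the current cmd session only. This is just a sketch using the install locations above; the permanent setup goes through the system environment-variables dialog as described here:

  rem Extend PATH for the current cmd session only
  set PATH=%PATH%;F:\Maven\apache-maven-3.0.4-bin\apache-maven-3.0.4\bin
  set PATH=%PATH%;E:\Linux\google-protobuf-2.5.0\protoc-2.5.0-win32
  set PATH=%PATH%;E:\Linux\findbugs-1.3.9\bin
  set PATH=%PATH%;E:\Linux\cmake-3.8.0-rc2-win64-x64\bin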

3. First, open the hadoop-maven-plugins folder inside the Hadoop source tree

  [Mine: E:\Linux\hadoop-2.7.3-src\hadoop-maven-plugins] Type cmd in the Explorer address bar and press Enter to open a command prompt in that directory.
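Equivalently, from an already-open command prompt you can change into the directory with cd (the path is the one from my setup above):

  rem /d also switches the drive letter if necessary
  cd /d E:\Linux\hadoop-2.7.3-src\hadoop-maven-plugins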


4. Run mvn clean install to download the dependencies and build this module
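In the command prompt opened in step 3 this is a single command; the first run can take a while because Maven fetches the required plugins and dependencies (from the central repository, or from a local Nexus mirror such as the one mentioned above):

  rem Build and install hadoop-maven-plugins into the local Maven repository
  mvn clean install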

  

5. Go back up to the Hadoop source root directory and run:

  mvn eclipse:eclipse -DskipTests
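For reference, step 5 looks like this from the same command prompt (source root path as in my setup):

  rem Return to the source root and generate Eclipse project files for every module
  cd /d E:\Linux\hadoop-2.7.3-src
  mvn eclipse:eclipse -DskipTests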

The console output on success ends like this:

  

main:
[mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\target\test-dir
[mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\target\test\data
[INFO] Executed tasks
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-tools-dist ---
[INFO]
[INFO] <<< maven-eclipse-plugin:2.10:eclipse (default-cli) < generate-resources @ hadoop-tools-dist <<<
[INFO]
[INFO] --- maven-eclipse-plugin:2.10:eclipse (default-cli) @ hadoop-tools-dist ---
[INFO] Using Eclipse Workspace: null
[INFO] Adding default classpath container: org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7
[INFO] Wrote settings to E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\.settings\org.eclipse.jdt.core.prefs
[INFO] File E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist\.project already exists. Additional settings will be preserved, run mvn eclipse:clean if you want old settings to be removed.
[INFO] Wrote Eclipse project for "hadoop-tools-dist" to E:\Linux\hadoop-2.7.3-src\hadoop-tools\hadoop-tools-dist.
[INFO]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Tools 2.7.3
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] >>> maven-eclipse-plugin:2.10:eclipse (default-cli) > generate-resources @ hadoop-tools >>>
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-tools ---
[INFO] Executing tasks
main:
[mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-tools\target\test-dir
[INFO] Executed tasks
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-tools ---
[INFO]
[INFO] <<< maven-eclipse-plugin:2.10:eclipse (default-cli) < generate-resources @ hadoop-tools <<<
[INFO]
[INFO] --- maven-eclipse-plugin:2.10:eclipse (default-cli) @ hadoop-tools ---
[INFO] Not running eclipse plugin goal for pom project
[INFO] Using Eclipse Workspace: null
[INFO] Adding default classpath container: org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Distribution 2.7.3
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] >>> maven-eclipse-plugin:2.10:eclipse (default-cli) > generate-resources @ hadoop-dist >>>
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-dist ---
[INFO] Executing tasks
main:
[mkdir] Created dir: E:\Linux\hadoop-2.7.3-src\hadoop-dist\target\test-dir
[INFO] Executed tasks
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-dist ---
[INFO]
[INFO] <<< maven-eclipse-plugin:2.10:eclipse (default-cli) < generate-resources @ hadoop-dist <<<
[INFO]
[INFO] --- maven-eclipse-plugin:2.10:eclipse (default-cli) @ hadoop-dist ---
[INFO] Using Eclipse Workspace: null
[INFO] Adding default classpath container: org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7
[INFO] Wrote settings to E:\Linux\hadoop-2.7.3-src\hadoop-dist\.settings\org.eclipse.jdt.core.prefs
[INFO] File E:\Linux\hadoop-2.7.3-src\hadoop-dist\.project already exists. Additional settings will be preserved, run mvn eclipse:clean if you want old settings to be removed.
[INFO] Wrote Eclipse project for "hadoop-dist" to E:\Linux\hadoop-2.7.3-src\hadoop-dist.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 0.424 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 0.215 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 0.520 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 0.108 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 0.084 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.095 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 1.108 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 3.725 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 3.602 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 0.471 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 10.259 s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 13.015 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 10.464 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.070 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 5.865 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 32.714 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 5.733 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 1.789 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.088 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.066 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 0.710 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 8.698 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.065 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 5.843 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 2.519 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 0.636 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 0.958 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 3.172 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 13.857 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 2.377 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 2.371 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.070 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 0.777 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 0.478 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.069 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 1.384 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 0.477 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.227 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 5.169 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 7.680 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 3.952 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 1.673 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 5.814 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 1.755 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 0.515 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 4.946 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 0.253 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 1.095 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [05:56 min]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 3.804 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 0.891 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 7.510 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 0.351 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 0.483 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 0.567 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 0.067 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 1.997 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [01:30 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 5.079 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 3.223 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.524 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 7.012 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 7.261 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.067 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 0.272 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10:56 min
[INFO] Finished at: 2017-03-20T08:39:17+08:00
[INFO] Final Memory: 77M/247M
[INFO] ------------------------------------------------------------------------

The directory with the generated output:
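After this step each module directory should contain the Eclipse files written by the plugin (.project, .classpath and a .settings folder). A rough way to check from cmd, assuming the source path used above:

  rem List every generated Eclipse .project file under the source tree
  dir /s /b E:\Linux\hadoop-2.7.3-src\.project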

  


  

Reposted from: https://www.cnblogs.com/zhangxiaolin/p/6585982.html
