hadoop -- HDFS Shell Operations

Published: 2025/3/17

Contents

  • Basic Syntax
  • Command Usage
  • Common Commands
    • 1. Prerequisites
    • 2. Uploading
    • 3. Downloading
    • 4. Direct HDFS Operations

Basic Syntax

hadoop fs [options] / hdfs dfs [options] -- the two forms are equivalent for HDFS paths.

Command Usage

[xiaobai@hadoop102 ~]$ hadoop fs
Usage: hadoop fs [generic options]
	[-appendToFile <localsrc> ... <dst>]
	[-cat [-ignoreCrc] <src> ...]
	[-checksum <src> ...]
	[-chgrp [-R] GROUP PATH...]
	[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
	[-chown [-R] [OWNER][:[GROUP]] PATH...]
	[-copyFromLocal [-f] [-p] [-l] [-d] [-t <thread count>] <localsrc> ... <dst>]
	[-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
	[-count [-q] [-h] [-v] [-t [<storage type>]] [-u] [-x] [-e] <path> ...]
	[-cp [-f] [-p | -p[topax]] [-d] <src> ... <dst>]
	[-createSnapshot <snapshotDir> [<snapshotName>]]
	[-deleteSnapshot <snapshotDir> <snapshotName>]
	[-df [-h] [<path> ...]]
	[-du [-s] [-h] [-v] [-x] <path> ...]
	[-expunge [-immediate]]
	[-find <path> ... <expression> ...]
	[-get [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
	[-getfacl [-R] <path>]
	[-getfattr [-R] {-n name | -d} [-e en] <path>]
	[-getmerge [-nl] [-skip-empty-file] <src> <localdst>]
	[-head <file>]
	[-help [cmd ...]]
	[-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [-e] [<path> ...]]
	[-mkdir [-p] <path> ...]
	[-moveFromLocal [-f] [-p] [-l] [-d] <localsrc> ... <dst>]
	[-moveToLocal <src> <localdst>]
	[-mv <src> ... <dst>]
	[-put [-f] [-p] [-l] [-d] [-t <thread count>] <localsrc> ... <dst>]
	[-renameSnapshot <snapshotDir> <oldName> <newName>]
	[-rm [-f] [-r|-R] [-skipTrash] [-safely] <src> ...]
	[-rmdir [--ignore-fail-on-non-empty] <dir> ...]
	[-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
	[-setfattr {-n name [-v value] | -x name} <path>]
	[-setrep [-R] [-w] <rep> <path> ...]
	[-stat [format] <path> ...]
	[-tail [-f] [-s <sleep interval>] <file>]
	[-test -[defswrz] <path>]
	[-text [-ignoreCrc] <src> ...]
	[-touch [-a] [-m] [-t TIMESTAMP ] [-c] <path> ...]
	[-touchz <path> ...]
	[-truncate [-w] <length> <path> ...]
	[-usage [cmd ...]]

Generic options supported are:
	-conf <configuration file>           specify an application configuration file
	-D <property=value>                  define a value for a given property
	-fs <file:///|hdfs://namenode:port>  specify default filesystem URL to use, overrides 'fs.defaultFS' property from configurations
	-jt <local|resourcemanager:port>     specify a ResourceManager
	-files <file1,...>                   specify a comma-separated list of files to be copied to the map reduce cluster
	-libjars <jar1,...>                  specify a comma-separated list of jar files to be included in the classpath
	-archives <archive1,...>             specify a comma-separated list of archives to be unarchived on the compute machines

The general command line syntax is: command [genericOptions] [commandOptions]

Common Commands

1. Prerequisites

1). Start the Hadoop cluster:
Start HDFS on hadoop102 and YARN on hadoop103:

[xiaobai@hadoop102 hadoop-3.2.2]$ sbin/start-dfs.sh
[xiaobai@hadoop103 hadoop-3.2.2]$ sbin/start-yarn.sh

2). -help: print the usage and options for a command:

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -help rm

3). Create the /weather directory:

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -mkdir /weather

2. Uploading

1). -moveFromLocal: cut and paste a file from local to HDFS (the local copy is removed; HDFS keeps it);

[xiaobai@hadoop102 hadoop-3.2.2]$ vim monday.txt
Contents: monday is a rainy day
[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -moveFromLocal ./monday.txt /weather

2). -copyFromLocal: copy a file from the local filesystem to an HDFS path (both local and HDFS keep a copy);

[xiaobai@hadoop102 hadoop-3.2.2]$ vim Tuesday.txt
Contents: Tuesday is a sunny day
[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -copyFromLocal Tuesday.txt /weather

3). -put: same as -copyFromLocal; put is preferred in production (both local and HDFS keep a copy);

[xiaobai@hadoop102 hadoop-3.2.2]$ vim Wednesday.txt
Contents: Wednesday balabala
[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -put ./Wednesday.txt /weather

4). -appendToFile: append a local file to the end of an existing HDFS file (HDFS files are append-only; existing content cannot be modified in place);

[xiaobai@hadoop102 hadoop-3.2.2]$ vim myfeeling
Contents: I was not in a good mood that day.
[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -appendToFile ./myfeeling /weather/Wednesday.txt

3. Downloading

1). -copyToLocal: copy a file from HDFS to the local filesystem (both keep a copy);

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -copyToLocal /weather/monday.txt ./

2). -get: same as -copyToLocal; get is preferred in production (both keep a copy);

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -get /weather/monday.txt ./monday2.txt

4. Direct HDFS Operations

1). -ls: list directory information;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -ls /weather

2). -cat: display file contents;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -cat /weather/Wednesday.txt

3). -chgrp / -chmod / -chown: same usage as in the Linux filesystem; change a file's group, permissions, and owner;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -chmod 666 /weather/Tuesday.txt
[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -chown xiaobai:xiaobai /weather/monday.txt

4). -mkdir: create a directory;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -mkdir /sunny

5). -cp: copy between paths within HDFS;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -cp /weather/Wednesday.txt /sunny

6). -mv: move files between HDFS directories;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -mv /weather/monday.txt /sunny

7). -tail: show the last 1 KB of a file (useful because the end of a file holds the newest, often most important, data);

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -tail /sunny/Wednesday.txt

8). -rm: delete a file or directory;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -rm /sunny/monday.txt

9). -rm -r: recursively delete a directory and everything in it;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -rm -r /sunny

10). -du: report directory size statistics;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -du -s -h /weather
99  297  /weather
[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -du -h /weather
23  69   /weather/Tuesday.txt
54  162  /weather/Wednesday.txt
22  66   /weather/monday.txt

Tip: 99 is the raw file size, 297 is 99 x 3 replicas, and /weather is the directory being queried.
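As a quick sanity check on the arithmetic in that tip, the per-file raw sizes reported by -du can be summed and multiplied by the replication factor (a minimal shell sketch; the sizes are the ones shown above, and the factor of 3 is this cluster's default replication):

```shell
# Per-file raw sizes (bytes) from the `hadoop fs -du -h /weather` output above
sizes="23 54 22"

# Sum them: this should match the first column of `-du -s` (99)
total=0
for s in $sizes; do
  total=$((total + s))
done
echo "raw total: $total"

# The second column is the raw size times the replication factor
echo "with replicas: $((total * 3))"
```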

11). -setrep: set the replication factor of a file in HDFS;

[xiaobai@hadoop102 hadoop-3.2.2]$ hadoop fs -setrep 6 /weather/monday.txt
Replication 6 set: /weather/monday.txt


Tip: the replication factor set here is only recorded in the NameNode's metadata; that many replicas will not necessarily exist. The target is only reached once there are enough DataNodes; until then, extra replicas are created only when more DataNodes become available. Since this cluster has only 3 DataNodes, at most 3 replicas can be stored; the count will reach 6 only when the cluster grows to 6 or more nodes.
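That behavior can be sketched as taking the minimum of the requested factor and the number of live DataNodes (an illustrative shell sketch, not a Hadoop command; the numbers match the 3-DataNode cluster in this example):

```shell
# Requested replication vs. DataNodes actually available
requested=6
live_datanodes=3

# HDFS places at most one replica of a block per DataNode,
# so the achievable count is the smaller of the two values
effective=$(( requested < live_datanodes ? requested : live_datanodes ))
echo "effective replicas: $effective"
```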
