Example: Writing Data from Flume to HDFS


Goal:

Use Flume to poll a directory on the local file system and write its contents to HDFS.

Version Information:

hadoop-0.22.0

apache-flume-1.3.1

Flume Configuration:

Edit flume-env.sh under $FLUME_HOME/conf:

export JAVA_HOME=<your Java home>
export FLUME_CLASSPATH=<your Flume home>
export HADOOP_CLASSPATH=<your Hadoop home>
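For concreteness, a filled-in flume-env.sh might look like the following; the three paths are placeholder values and must be replaced with the actual locations of your JDK, Flume, and Hadoop installations:

# example values only, adjust to your environment
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export FLUME_CLASSPATH=/opt/apache-flume-1.3.1-bin
export HADOOP_CLASSPATH=/opt/hadoop-0.22.0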

Edit flume-conf.properties under $FLUME_HOME/conf:

# Configure the agent
agent.sources = spooldirSource
agent.channels = memoryChannel
agent.sinks = hdfsSink

# Configure the source
agent.sources.spooldirSource.type = spooldir
agent.sources.spooldirSource.spoolDir = /tmp/flume/
agent.sources.spooldirSource.channels = memoryChannel

# Configure the sink
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.hdfs.path = hdfs://masternode:9000/flume/events
agent.sinks.hdfsSink.hdfs.filePrefix = events-
agent.sinks.hdfsSink.channel = memoryChannel

# Configure the channel
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 100
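With only the properties above, the HDFS sink uses its defaults: it writes SequenceFiles and rolls a new file every 30 seconds, every 10 events, or every 1024 bytes, whichever comes first. If you would rather get plain-text output and fewer, larger files, the sink accepts additional hdfs.* properties; the lines below are an optional sketch with assumed values, not part of the original configuration:

# optional HDFS sink tuning (assumed values, not from the original article)
# write plain text instead of the default SequenceFile format
agent.sinks.hdfsSink.hdfs.fileType = DataStream
agent.sinks.hdfsSink.hdfs.writeFormat = Text
# roll a new file every 60 seconds; disable size- and count-based rolling
agent.sinks.hdfsSink.hdfs.rollInterval = 60
agent.sinks.hdfsSink.hdfs.rollSize = 0
agent.sinks.hdfsSink.hdfs.rollCount = 0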

Copy Hadoop Jars to Flume lib directory:

Copy hadoop-hdfs-0.22.0.jar and hadoop-common-0.22.0.jar to $FLUME_HOME/lib.
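The exact location of these jars depends on how the Hadoop 0.22.0 distribution was unpacked, so the source paths below are an assumption; adjust them (or locate the jars with find) before copying:

# assumes the jars sit directly under $HADOOP_HOME; adjust if your layout differs
cp $HADOOP_HOME/hadoop-common-0.22.0.jar $FLUME_HOME/lib/
cp $HADOOP_HOME/hadoop-hdfs-0.22.0.jar $FLUME_HOME/lib/
# if unsure where they live: find $HADOOP_HOME -name 'hadoop-*-0.22.0.jar'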

Start Flume Agent:

./bin/flume-ng agent -n agent -c conf -f conf/flume-conf.properties
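When testing for the first time, it can help to keep the agent in the foreground with console logging so problems such as a missing Hadoop jar or an unreachable NameNode show up immediately; flume.root.logger is a standard Flume logging property:

./bin/flume-ng agent -n agent -c conf -f conf/flume-conf.properties -Dflume.root.logger=INFO,console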

Write File:

echo "Hello World">>/tmp/flume/test

View Logs:

Check the agent's log files under $FLUME_HOME/logs.
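With the default log4j configuration shipped with Flume, the agent writes to flume.log in that directory, so tailing it is a convenient way to watch files being picked up and written out (the file name may differ if log4j.properties has been customized):

tail -f $FLUME_HOME/logs/flume.log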



Reposted from: https://my.oschina.net/xiangchen/blog/101111

