Setting up Flume on CentOS 6.9
I will skip the basic environment preparation here; see my earlier article, Setting up Hadoop on CentOS 6.9.
Configure the Flume environment variables in ~/.bash_profile:
export FLUME_HOME=/opt/apache-flume-1.7.0-bin
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$FLUME_HOME/bin:$HOME/bin
Then remember to run: source ~/.bash_profile
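As a quick sanity check (a minimal sketch; the install path assumes the tarball was unpacked to /opt as above), you can confirm the Flume bin directory actually landed on PATH:

```shell
# Re-create the PATH change from ~/.bash_profile and verify it took effect.
export FLUME_HOME=/opt/apache-flume-1.7.0-bin
export PATH=$PATH:$FLUME_HOME/bin
# grep -q exits 0 when the bin directory is present on PATH
echo "$PATH" | grep -q "$FLUME_HOME/bin" && echo "flume-ng is on PATH"
```

Once this prints the confirmation, `flume-ng version` should resolve, assuming the tarball really is installed at that path.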
根據(jù)需求,配置不同的 source/channel/sink,添加配置文件到 conf/中
Example 1: exec source, tailing a log file into HDFS (saved as conf/flume_exec_hdfs.conf, the file the start command below references):
logAgent.sources = logSource
logAgent.channels = fileChannel
logAgent.sinks = hdfsSink

logAgent.sources.logSource.type = exec
logAgent.sources.logSource.command = tail -F /aura/data/flume-search/logs
logAgent.sources.logSource.channels = fileChannel

logAgent.sinks.hdfsSink.type = hdfs
logAgent.sinks.hdfsSink.hdfs.path = hdfs://bigdata:9000/flume/record/%Y-%m-%d/%H%M
logAgent.sinks.hdfsSink.hdfs.rollCount = 10000
logAgent.sinks.hdfsSink.hdfs.rollSize = 0
logAgent.sinks.hdfsSink.hdfs.batchSize = 1000
logAgent.sinks.hdfsSink.hdfs.filePrefix = transaction_log
logAgent.sinks.hdfsSink.hdfs.rollInterval = 600
logAgent.sinks.hdfsSink.hdfs.roundUnit = minute
logAgent.sinks.hdfsSink.hdfs.fileType = DataStream
logAgent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
logAgent.sinks.hdfsSink.channel = fileChannel

logAgent.channels.fileChannel.type = memory
logAgent.channels.fileChannel.capacity = 1000
logAgent.channels.fileChannel.transactionCapacity = 100
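The exec source simply consumes whatever `tail -F` emits, so you can exercise it by appending lines to the tailed file. A minimal sketch, using a hypothetical /tmp path rather than /aura/data/flume-search/logs:

```shell
# Append two events to a demo log; `tail -F` in the exec source would
# pick these up as they arrive. /tmp/flume-demo.log is a stand-in path.
LOG=/tmp/flume-demo.log
echo "event one" >> "$LOG"
echo "event two" >> "$LOG"
# Show what the source would have just received
tail -n 2 "$LOG"
```

Keep in mind the exec source gives no delivery guarantees if the agent restarts mid-stream; the spooldir variant further down is the reliable alternative.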
Example 2: avro source, receiving events over the network:
logAgent.sources = logSource
logAgent.channels = fileChannel
logAgent.sinks = hdfsSink

logAgent.sources.logSource.type = avro
logAgent.sources.logSource.bind = 127.0.0.1
logAgent.sources.logSource.port = 44444
logAgent.sources.logSource.channels = fileChannel

logAgent.sinks.hdfsSink.type = hdfs
logAgent.sinks.hdfsSink.hdfs.path = hdfs://bigdata:9000/flume/record/%Y-%m-%d/%H%M
logAgent.sinks.hdfsSink.hdfs.rollCount = 10000
logAgent.sinks.hdfsSink.hdfs.rollSize = 0
logAgent.sinks.hdfsSink.hdfs.batchSize = 1000
logAgent.sinks.hdfsSink.hdfs.filePrefix = transaction_log
logAgent.sinks.hdfsSink.hdfs.rollInterval = 600
logAgent.sinks.hdfsSink.hdfs.roundUnit = minute
logAgent.sinks.hdfsSink.hdfs.fileType = DataStream
logAgent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
logAgent.sinks.hdfsSink.channel = fileChannel

logAgent.channels.fileChannel.type = memory
logAgent.channels.fileChannel.capacity = 1000
logAgent.channels.fileChannel.transactionCapacity = 100
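With the avro variant, events arrive over the network rather than from a local command, and Flume ships an avro-client mode for testing it. A sketch that assembles the invocation (the payload path is hypothetical; run the printed command once the agent is listening on 127.0.0.1:44444):

```shell
# Host and port must match logSource.bind and logSource.port above.
HOST=127.0.0.1
PORT=44444
PAYLOAD=/tmp/avro-test.txt          # hypothetical sample payload
echo "avro test event" > "$PAYLOAD"
# flume-ng avro-client sends each line of the -F file as one Flume event
echo "flume-ng avro-client -H $HOST -p $PORT -F $PAYLOAD"
```

After sending, the event should appear under the configured HDFS path once the sink rolls the file.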
Example 3: spooldir source, ingesting completed files from a directory:
logAgent.sources = logSource
logAgent.channels = fileChannel
logAgent.sinks = hdfsSink

logAgent.sources.logSource.type = spooldir
logAgent.sources.logSource.spoolDir = /aura/data/flume-search
logAgent.sources.logSource.channels = fileChannel

logAgent.sinks.hdfsSink.type = hdfs
logAgent.sinks.hdfsSink.hdfs.path = hdfs://bigdata:9000/flume/record/%Y-%m-%d/%H%M
logAgent.sinks.hdfsSink.hdfs.rollCount = 10000
logAgent.sinks.hdfsSink.hdfs.rollSize = 0
logAgent.sinks.hdfsSink.hdfs.batchSize = 1000
logAgent.sinks.hdfsSink.hdfs.filePrefix = transaction_log
logAgent.sinks.hdfsSink.hdfs.rollInterval = 600
logAgent.sinks.hdfsSink.hdfs.roundUnit = minute
logAgent.sinks.hdfsSink.hdfs.fileType = DataStream
logAgent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
logAgent.sinks.hdfsSink.channel = fileChannel

logAgent.channels.fileChannel.type = memory
logAgent.channels.fileChannel.capacity = 1000
logAgent.channels.fileChannel.transactionCapacity = 100

Start the agent, pointing -f at whichever configuration file you chose:
bin/flume-ng agent -n logAgent -c conf -f conf/flume_exec_hdfs.conf -Dflume.root.logger=INFO,console
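One spooldir caveat worth demonstrating: the source expects complete, immutable files, and will error out if a file changes after landing in the directory. The safe pattern is write-then-rename. A sketch using /tmp stand-ins for /aura/data/flume-search:

```shell
# Stand-in for the configured spoolDir
SPOOL=/tmp/spool-demo
mkdir -p "$SPOOL"
# Write the file OUTSIDE the spool dir first...
echo "batch of events" > /tmp/events.tmp
# ...then move it in atomically, so Flume never sees a half-written file
mv /tmp/events.tmp "$SPOOL/events-$(date +%s).log"
ls "$SPOOL"
```

Flume renames each file with a .COMPLETED suffix after ingesting it, so file names dropped into the spool directory must be unique.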
轉(zhuǎn)載于:https://my.oschina.net/zdtdtel/blog/1577446