

Log Collection System Architecture Design: Connecting Flume to Kafka (Flume + ZooKeeper + Kafka + PHP + MySQL)

Published: 2025/3/12


  • Part 1: Install the JDK
  • Part 2: Install Flume
  • Part 3: Install Kafka
  • 1. ZooKeeper
  • 2. Kafka
  • Part 4: Startup and test steps
  • Part 5: Directory layout
  • Part 6: Log collection system design diagram
  • Part 7: Recommended big-data reading

Part 1: Install the JDK (version 1.8.0_191)

1. Download:

https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

Unpack: tar -zxvf /jdk-8u191-linux-x64.tar.gz -C /home/ppgt/local/

2. Add the following to /etc/profile:

export JAVA_HOME=/home/ppgt/local/jdk1.8.0_191
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin

3. Verify:

java -version
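Before relying on java -version, it can help to confirm the profile edits actually took effect in the current shell. A minimal sketch, assuming the install path from step 1:

```shell
# Re-export the variables from /etc/profile for the current shell
# (assumes the JDK was unpacked to /home/ppgt/local/ as above)
export JAVA_HOME=/home/ppgt/local/jdk1.8.0_191
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin

# Check that PATH really picked up the JDK's bin directory
echo "JAVA_HOME=$JAVA_HOME"
case ":$PATH:" in
  *":$JAVA_HOME/bin:"*) echo "PATH contains JAVA_HOME/bin" ;;
  *) echo "PATH is missing JAVA_HOME/bin" ;;
esac
```

If the last line reports a missing entry, /etc/profile was edited but not re-sourced (`source /etc/profile`).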

Part 2: Install Flume (version 1.8.0)

1. Download:

wget http://mirrors.hust.edu.cn/apache/flume/1.8.0/apache-flume-1.8.0-bin.tar.gz

2. Unpack and install:

tar -zxvf apache-flume-1.8.0-bin.tar.gz -C /home/ppgt/local/

3. Verify the installation:

bin/flume-ng version

4. Edit the configuration under conf/:

cp flume-env.sh.template flume-env.sh
vi flume-env.sh

# set the Java path (the JDK location from Part 1):
export JAVA_HOME=/home/ppgt/local/jdk1.8.0_191

5. Add a Flume-to-Kafka configuration file: conf/flumetokafka.conf

# Flume-to-Kafka agent configuration

# Name the components of this agent
flume_kafka.sources = exec-sources
flume_kafka.sinks = kafka-sink
flume_kafka.channels = memory-channel

# Describe/configure the source
flume_kafka.sources.exec-sources.type = exec
flume_kafka.sources.exec-sources.command = tail -F /home/ppgt/tmpfile/testlogs/data.log

# Describe the sink
flume_kafka.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
flume_kafka.sinks.kafka-sink.topic = topiclogs01
flume_kafka.sinks.kafka-sink.brokerList = localhost:9092
flume_kafka.sinks.kafka-sink.requiredAcks = 1
flume_kafka.sinks.kafka-sink.batchSize = 20

# Use a channel which buffers events in memory
flume_kafka.channels.memory-channel.type = memory
flume_kafka.channels.memory-channel.capacity = 1000
flume_kafka.channels.memory-channel.transactionCapacity = 100

# Bind the source and sink to the channel
flume_kafka.sources.exec-sources.channels = memory-channel
flume_kafka.sinks.kafka-sink.channel = memory-channel
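A wiring mistake that is easy to make in this file is a channel-name mismatch between the source binding, the sink binding, and the agent's declared channel list; Flume will refuse to start the agent if they disagree. A quick shell sanity check (a sketch: it writes just the binding lines to a temp file rather than reading the real conf/flumetokafka.conf):

```shell
# Write the binding lines from the conf above to a temp stand-in file
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
flume_kafka.sources = exec-sources
flume_kafka.sinks = kafka-sink
flume_kafka.channels = memory-channel
flume_kafka.sources.exec-sources.channels = memory-channel
flume_kafka.sinks.kafka-sink.channel = memory-channel
EOF

# Extract the declared channel and the channels the source/sink point at
declared=$(sed -n 's/^flume_kafka\.channels = //p' "$CONF")
src_chan=$(sed -n 's/^flume_kafka\.sources\.exec-sources\.channels = //p' "$CONF")
sink_chan=$(sed -n 's/^flume_kafka\.sinks\.kafka-sink\.channel = //p' "$CONF")

# All three names must agree for the agent to start
[ "$src_chan" = "$declared" ] && [ "$sink_chan" = "$declared" ] \
  && echo "wiring ok" || echo "wiring mismatch"
rm -f "$CONF"
```

Note the asymmetry in the property names, which trips people up: sources use the plural `channels`, sinks use the singular `channel`.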

Part 3: Install Kafka

1. Install the ZooKeeper dependency (version 3.4.12)

1) Download:

wget http://mirror.bit.edu.cn/apache/zookeeper/stable/zookeeper-3.4.12.tar.gz

2) Unpack:

tar -zxvf zookeeper-3.4.12.tar.gz -C /home/ppgt/local/

3) Set the data directory in conf/zoo.cfg:

cp zoo_sample.cfg zoo.cfg

Edit the value: dataDir=/home/ppgt/kafka_zk_tmp/tmp/zookeeper
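The dataDir change can be scripted instead of edited by hand. A minimal sketch (the temp file stands in for conf/zoo.cfg, seeded with the stock sample's defaults):

```shell
# Create a stand-in for conf/zoo.cfg with the sample defaults
ZOOCFG=$(mktemp)
printf 'tickTime=2000\ndataDir=/tmp/zookeeper\nclientPort=2181\n' > "$ZOOCFG"

# Point dataDir at the directory used throughout this guide
sed -i 's|^dataDir=.*|dataDir=/home/ppgt/kafka_zk_tmp/tmp/zookeeper|' "$ZOOCFG"

# Show the rewritten line
grep '^dataDir=' "$ZOOCFG"
```

Make sure the target directory exists and is writable before starting ZooKeeper, or it will fail on startup.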

2. Install Kafka (version 0.9.0.0)

1) Download:

wget https://archive.apache.org/dist/kafka/0.9.0.0/kafka_2.11-0.9.0.0.tgz

2) Unpack:

tar -zxvf kafka_2.11-0.9.0.0.tgz -C /home/ppgt/local/

3) Edit the configuration file config/server.properties:

a) zookeeper.connect=localhost:2181 # ZooKeeper service address
b) host.name=localhost # host name
c) log.dirs=/home/ppgt/kafka_zk_tmp/tmp/kafka-logs # where Kafka stores its data
d) num.partitions=1 # number of partitions
e) listeners=PLAINTEXT://:9092 # port Kafka listens on
f) broker.id=0 # unique broker id
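These edits can also be applied in one scripted pass. A sketch, where a temp file seeded with a few stock defaults stands in for config/server.properties:

```shell
# Stand-in for config/server.properties with a few stock defaults
PROPS=$(mktemp)
cat > "$PROPS" <<'EOF'
broker.id=0
num.partitions=1
log.dirs=/tmp/kafka-logs
zookeeper.connect=localhost:2181
EOF

# Rewrite the data directory this guide changes
sed -i 's|^log.dirs=.*|log.dirs=/home/ppgt/kafka_zk_tmp/tmp/kafka-logs|' "$PROPS"

# host.name and listeners are appended here for the sketch
printf 'host.name=localhost\nlisteners=PLAINTEXT://:9092\n' >> "$PROPS"

# Show the values the guide cares about
grep -E '^(log\.dirs|host\.name|listeners)=' "$PROPS"
```

Note that the broker id property is spelled `broker.id` (with a dot); a misspelled key is silently ignored and the broker falls back to the default.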

Part 4: Startup and test steps

1. Start ZooKeeper:

bin/zkServer.sh start

2. Start the Flume agent:

bin/flume-ng agent --conf conf --conf-file conf/flumetokafka.conf --name flume_kafka -Dflume.root.logger=INFO,console

3. Start Kafka:

# start the Kafka broker
bin/kafka-server-start.sh /home/ppgt/local/kafka_2.11-0.9.0.0/config/server.properties

# create a topic
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic topiclogs01

# start a console consumer
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic topiclogs01 --from-beginning

# or run the PHP consumer instead
php /home/ppgt/www_test_ppgt_admin/syslogs_featrue_v1.0/script/cron/sysLogsConsumerRun.php

# produce a test log line
echo 'test content' >> /home/ppgt/tmpfile/testlogs/data.log
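The append-and-tail half of this end-to-end check can be simulated locally before the full pipeline is up. A sketch, where a temp file stands in for /home/ppgt/tmpfile/testlogs/data.log:

```shell
# Stand-in for the tailed log file
LOG=$(mktemp)

# Append test lines, as the echo in the step above does
echo 'test content 1' >> "$LOG"
echo 'test content 2' >> "$LOG"

# tail shows the newest line, which is what `tail -F` would hand to
# the Flume exec source as a new event
tail -n 1 "$LOG"    # prints: test content 2
```

If lines appended to the real data.log show up in the console consumer, the whole Flume-to-Kafka path is working.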

Part 5: Directory layout

1. /home/ppgt/local/

conf/zoo.cfg # ZooKeeper configuration
conf/flumetokafka.conf # Flume-to-Kafka configuration
config/server.properties # Kafka configuration
conf/flume-env.sh # Flume environment configuration
jdk1.8.0_191/ # JDK install directory
apache-flume-1.8.0-bin/ # Flume install directory
kafka_2.11-0.9.0.0/ # Kafka install directory
zookeeper-3.4.12/ # ZooKeeper install directory

2. /home/ppgt/kafka_zk_tmp/tmp/

zookeeper/ # ZooKeeper data directory
kafka-logs/ # Kafka data directory

Part 6: Log collection system design diagram

In outline: application log lines are tailed by the Flume exec source, buffered in a memory channel, and delivered by the Kafka sink to the topic topiclogs01; a PHP consumer reads from Kafka and writes the records into MySQL.

Part 7: Recommended big-data reading

  • Flume:
  • https://blog.csdn.net/mengfanzhundsc/article/details/81300310?from=singlemessage&isappinstalled=0
  • https://blog.csdn.net/caodaoxi/article/details/25903867
  • https://www.cnblogs.com/tonglin0325/p/8963395.html
  • http://www.wfuyu.com/technology/25331.html
  • https://blog.csdn.net/Team77/article/details/44154529a
  • https://blog.csdn.net/wuxintdrh/article/details/79478710
  • https://blog.csdn.net/l1028386804/article/details/79366155
  • https://blog.csdn.net/jy02268879/article/details/81024758
  • http://itindex.net/detail/57323-flume-%E7%9B%91%E6%8E%A7-%E6%97%A5%E6%9C%9F
  • https://github.com/ypenglyn/locktail/blob/master/locktail_rotate.sh
  • https://blog.csdn.net/maoyuanming0806/article/details/79391010
  • https://blog.csdn.net/u010316188/article/details/79905372
  • https://blog.csdn.net/u011254180/article/details/80000763
  • Installing Flume on Linux
  • Using the Flume taildir source to collect files and directories
  • Real-time multi-file collection with taildirSource
  • A monitoring solution for rotating logs
  • Using Flume: tailing a file and printing newly appended data to the console
  • Flume: reading log data and writing it to Kafka
  • The Flume SpoolDir source
  • Flume monitoring
  • Flume monitoring parameters explained
  • Flume monitoring metrics in detail
  • Big data series, Flume: several different sources
  • Kafka:
  • https://www.jianshu.com/p/a036405f989c
  • https://blog.csdn.net/nankiao/article/details/78553635
  • https://blog.csdn.net/weixin_38750084/article/details/82944759
  • http://www.cnblogs.com/jun1019/p/6656223.html
  • https://www.cnblogs.com/jun1019/p/6256514.html
  • http://www.tianshouzhi.com/api/tutorials/kafka/117
  • https://www.cnblogs.com/hei12138/p/7805475.html
  • An introduction to Kafka
  • Installing and using Kafka
  • Kafka's storage mechanism
  • Kafka configuration explained
  • The relationship between Kafka and ZooKeeper
  • The role of ZooKeeper in Kafka
