FileBeat + Pipeline: Parsing Logs into Elasticsearch (Hands-On)


Table of Contents

  • FileBeat + Pipeline: Parsing Logs into Elasticsearch (Hands-On)
    • Download
    • Goal
    • Sample Log Data
    • Simulating the Pipeline
    • Creating the Pipeline
      • Verifying the Pipeline Was Created
    • Creating the FileBeat Configuration File filebeat.yml
    • Creating Custom Fields: FileBeat fields.yml
    • Running FileBeat
    • FileBeat Startup Flags
    • Testing
  • Pipeline Configuration Details
    • 1. Setting the Index _id from Log Data
  • FileBeat Configuration Details
    • 1. Setting Shard and Replica Counts for the Elasticsearch Indices Filebeat Writes To
  • Troubleshooting
    • ERROR instance/beat.go:802 Exiting: error initializing processors:

FileBeat + Pipeline: Parsing Logs into Elasticsearch (Hands-On)

Download

https://www.elastic.co/cn/downloads/past-releases#filebeat

Goal

Use FileBeat to collect the logs, an Elasticsearch ingest pipeline to parse them, and write the result into Elasticsearch.

Sample Log Data

2021-07-01 20:07:25 [XNIO-1 task-2] INFO fileBeatLogData - 查詢用戶|4|com.internet.operator.controller..list()|GET|http://127.0.0.1:8080/list|127.0.0.1|jast110|9a2e232170744efda8c526d67f4f5405|userAcco909571P&installedLocation=&pageNum=10&pageSize=10&superQuery=1|{"code":200,"msg":"查詢成功","rows":[],"took":2,"total":1}|||0|||1625141245843||||||2021-07-01 20:07:25|142|91110108769392234H|測試111|X

Simulating the Pipeline

Note: processors run in order, so when the same field is written by both a set processor and a script processor, the one that runs later wins. In the pipeline below the script runs last, so its value takes precedence.

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "dissect": { "field": "message", "pattern": "%{@logTimestamp} [%{logTthread}] %{loglevel} fileBeatLogData - %{logdata}" } },
      { "split": { "field": "logdata", "separator": "\\|", "target_field": "logdata" } },
      { "set": { "field": "actionOrFunction", "value": "{{logdata.0}}" } },
      { "set": { "field": "businessType", "value": "{{logdata.1}}" } },
      { "set": { "field": "callMethod", "value": "{{logdata.2}}" } },
      { "set": { "field": "requestMethod", "value": "{{logdata.3}}" } },
      { "set": { "field": "callLink", "value": "{{logdata.4}}" } },
      { "set": { "field": "loginUserIp", "value": "{{logdata.5}}" } },
      { "set": { "field": "userName", "value": "{{logdata.6}}" } },
      { "set": { "field": "userId", "value": "{{logdata.7}}" } },
      { "set": { "field": "paramOrInputData", "value": "{{logdata.8}}" } },
      { "set": { "field": "resultOrOutputData", "value": "{{logdata.9}}" } },
      { "set": { "field": "exceptionInfo", "value": "{{logdata.10}}" } },
      { "set": { "field": "systemEnv", "value": "{{logdata.11}}" } },
      { "set": { "field": "status", "value": "{{logdata.12}}" } },
      { "set": { "field": "fullLinkId", "value": "{{logdata.13}}" } },
      { "set": { "field": "subFullLinkId", "value": "{{logdata.14}}" } },
      { "set": { "field": "currentTimeMillisecond", "value": "{{logdata.15}}" } },
      { "convert": { "field": "currentTimeMillisecond", "type": "long" } },
      { "set": { "field": "detail", "value": "{{logdata.16}}" } },
      { "set": { "field": "other", "value": "{{logdata.17}}" } },
      { "set": { "field": "errorData", "value": "{{logdata.18}}" } },
      { "set": { "field": "errorDataSource", "value": "{{logdata.19}}" } },
      { "set": { "field": "errorDataDetail", "value": "{{logdata.20}}" } },
      { "set": { "field": "logTime", "value": "{{logdata.21}}" } },
      { "set": { "field": "processTime", "value": "{{logdata.22}}" } },
      { "convert": { "field": "processTime", "type": "long" } },
      { "set": { "field": "orgCode", "value": "{{logdata.23}}" } },
      { "set": { "field": "orgName", "value": "{{logdata.24}}" } },
      { "set": { "field": "exceptionDetailInfo", "value": "{{logdata.25}}" } },
      { "set": { "field": "message", "value": "" } },
      { "set": { "field": "logdata", "value": "" } },
      { "script": { "lang": "painless", "source": """ ctx.insertTime = new Date(System.currentTimeMillis()+1000l*60*60*8); """ } }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2021-07-01 20:07:25 [XNIO-1 task-2] INFO fileBeatLogData - 查詢運營商寬帶用戶|4|com.bjga.internet.operator.controller.OperatorBroadbandController.list()|GET|http://127.0.0.1:8080/operator2/broadband/list|127.0.0.1|jast110|9a2e232170744efda8c526d67f4f5405|userAccount=%E5%8C%97%E4%BA%AC1%E5%B8%8256&installedPhone=639857&accountHolderName=%E4%B8%9C%E7%A5%A5%E6%9E%97&operatorCreditCode=91110108101909571P&installedLocation=&pageNum=10&pageSize=10&superQuery=1|{\"code\":200,\"msg\":\"查詢成功\",\"rows\":[],\"took\":2,\"total\":1}|||0|||1625141245843||||||2021-07-01 20:07:25|142|91110108769392234H|測試111|X"
      }
    }
  ]
}
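If the pipeline definition is valid, _simulate returns the transformed document so the parsing can be checked before the pipeline is created. An abridged, illustrative response for the sample document above (only some of the extracted fields are shown here; the insertTime and _ingest timestamp values will differ on each run) looks roughly like this:

{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_id": "_id",
        "_source": {
          "@logTimestamp": "2021-07-01 20:07:25",
          "logTthread": "XNIO-1 task-2",
          "loglevel": "INFO",
          "actionOrFunction": "查詢運營商寬帶用戶",
          "businessType": "4",
          "requestMethod": "GET",
          "userName": "jast110",
          "currentTimeMillisecond": 1625141245843,
          "processTime": 142,
          "orgCode": "91110108769392234H",
          "message": "",
          "logdata": "",
          "insertTime": "..."
        },
        "_ingest": { "timestamp": "..." }
      }
    }
  ]
}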

Creating the Pipeline

PUT _ingest/pipeline/logDataPipeline
{
  "description": "outer pipeline",
  "processors": [
    { "dissect": { "field": "message", "pattern": "%{@logTimestamp} [%{logTthread}] %{loglevel} fileBeatLogData - %{logdata}" } },
    { "split": { "field": "logdata", "separator": "\\|", "target_field": "logdata" } },
    { "set": { "field": "actionOrFunction", "value": "{{logdata.0}}" } },
    { "set": { "field": "businessType", "value": "{{logdata.1}}" } },
    { "set": { "field": "callMethod", "value": "{{logdata.2}}" } },
    { "set": { "field": "requestMethod", "value": "{{logdata.3}}" } },
    { "set": { "field": "callLink", "value": "{{logdata.4}}" } },
    { "set": { "field": "loginUserIp", "value": "{{logdata.5}}" } },
    { "set": { "field": "userName", "value": "{{logdata.6}}" } },
    { "set": { "field": "userId", "value": "{{logdata.7}}" } },
    { "set": { "field": "paramOrInputData", "value": "{{logdata.8}}" } },
    { "set": { "field": "resultOrOutputData", "value": "{{logdata.9}}" } },
    { "set": { "field": "exceptionInfo", "value": "{{logdata.10}}" } },
    { "set": { "field": "systemEnv", "value": "{{logdata.11}}" } },
    { "set": { "field": "status", "value": "{{logdata.12}}" } },
    { "set": { "field": "fullLinkId", "value": "{{logdata.13}}" } },
    { "set": { "field": "subFullLinkId", "value": "{{logdata.14}}" } },
    { "set": { "field": "currentTimeMillisecond", "value": "{{logdata.15}}" } },
    { "convert": { "field": "currentTimeMillisecond", "type": "long" } },
    { "set": { "field": "detail", "value": "{{logdata.16}}" } },
    { "set": { "field": "other", "value": "{{logdata.17}}" } },
    { "set": { "field": "errorData", "value": "{{logdata.18}}" } },
    { "set": { "field": "errorDataSource", "value": "{{logdata.19}}" } },
    { "set": { "field": "errorDataDetail", "value": "{{logdata.20}}" } },
    { "set": { "field": "logTime", "value": "{{logdata.21}}" } },
    { "set": { "field": "processTime", "value": "{{logdata.22}}" } },
    { "convert": { "field": "processTime", "type": "long" } },
    { "set": { "field": "orgCode", "value": "{{logdata.23}}" } },
    { "set": { "field": "orgName", "value": "{{logdata.24}}" } },
    { "set": { "field": "exceptionDetailInfo", "value": "{{logdata.25}}" } },
    { "set": { "field": "message", "value": "" } },
    { "set": { "field": "logdata", "value": "" } },
    { "script": { "lang": "painless", "source": """ ctx.insertTime = new Date(System.currentTimeMillis()+1000l*60*60*8); """ } }
  ]
}

Verifying the Pipeline Was Created

GET _ingest/pipeline/logDataPipeline?pretty
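If the pipeline exists, the response echoes back its definition (abridged here):

{
  "logDataPipeline": {
    "description": "outer pipeline",
    "processors": [
      { "dissect": { "field": "message", "pattern": "%{@logTimestamp} [%{logTthread}] %{loglevel} fileBeatLogData - %{logdata}" } },
      ...
    ]
  }
}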

Creating the FileBeat Configuration File filebeat.yml

Read files matching /var/log2/*.log and write them to Elasticsearch.

filebeat.inputs:
- type: log
  enabled: true
  # files to read
  paths:
    - /var/log2/*.log
  # tag used later to decide which index and pipeline to write to
  fields:
    type: logDataPipeline
    source: common
- type: log
  enabled: true
  paths:
    - /var/log/1.log
    - /var/log/2.log
  fields:
    source: exception
- type: log
  enabled: true
  paths:
    - /var/log/3.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

# ======================= Elasticsearch template setting =======================
setup.template.settings:
  # default number of primary shards per index
  index.number_of_shards: 1
  # default number of replicas per index
  index.number_of_replicas: 1
  #index.codec: best_compression
  #_source.enabled: false

# allow the index template to be generated automatically
setup.template.enabled: true
# overwrite the template if it already exists
setup.template.overwrite: true
# field definition file used when generating the index template
setup.template.fields: fields.yml
# name of the generated index template
setup.template.name: "logdata"
# index pattern matched by the generated template
setup.template.pattern: "logdata-*"
# careful here: with ILM enabled, "-*" is appended to the rollover alias automatically
#setup.ilm.enabled: auto
setup.ilm.rollover_alias: "park-ssm"
setup.ilm.pattern: "{now/d}"
# index pattern generated in Kibana, useful when searching logs
#setup.dashboards.index: myfilebeat-7.0.0-*
# filebeat defaults to auto, which gives indices a lifecycle of 50GB + 30 days;
# ILM is disabled here so the custom index names below take effect
setup.ilm.enabled: false

# =================================== Kibana ===================================
setup.kibana:

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["10.8.10.12:9200"]
  index: "logdata-%{+yyyy.MM.dd}"
  indices:
    - index: "logdata-%{[fields.source]}-%{+yyyy.MM.dd}"
      when.equals:
        fields:
          source: "common"
    - index: "logdata-%{[fields.source]}-%{+yyyy.MM.dd}"
      when.equals:
        fields:
          source: "exception"
  pipelines:
    - pipeline: logDataPipeline
      when.equals:
        fields.type: logDataPipeline

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

Creating Custom Fields: FileBeat fields.yml

# our custom fields
- key: rbt
  title: rbt
  description: rbt log data fields
  fields:
    - name: logdata
      type: keyword
    - name: actionOrFunction
      type: keyword
    - name: businessType
      type: keyword
    - name: callMethod
      type: keyword
    - name: requestMethod
      type: keyword
    - name: callLink
      type: keyword
    - name: loginUserIp
      type: keyword
    - name: userName
      type: keyword
    - name: userId
      type: keyword
    - name: paramOrInputData
      type: keyword
    - name: resultOrOutputData
      type: keyword
    - name: exceptionInfo
      type: keyword
    - name: systemEnv
      type: keyword
    - name: status
      type: long
    - name: fullLinkId
      type: keyword
    - name: subFullLinkId
      type: keyword
    - name: currentTimeMillisecond
      type: long
    - name: detail
      type: keyword
    - name: other
      type: keyword
    - name: errorData
      type: keyword
    - name: errorDataSource
      type: keyword
    - name: errorDataDetail
      type: keyword
    - name: logTime
      type: keyword
    - name: processTime
      type: long
    - name: orgCode
      type: keyword
    - name: orgName
      type: keyword
    - name: exceptionDetailInfo
      type: keyword
    - name: insertTime
      type: date

# FileBeat built-in fields
- key: ecs
  title: ECS
  description: ECS Fields.
  fields:
    - name: '@timestamp'
      level: core
      required: true
      type: date
      description: 'Date/time when the event originated.

        This is the date/time extracted from the event, typically representing when
        the event was generated by the source.

        If the event source has no original timestamp, this value is typically populated
        by the first time the event was received by the pipeline.

        Required field for all events.'
      example: '2016-05-23T08:05:34.853Z'

Running FileBeat

[root@test13 filebeat-7.9.3-linux-x86_64]# ls
data        fields.yml.bak  filebeat.reference.yml  filebeat.yml.bak  LICENSE.txt  modules.d   README.md
fields.yml  filebeat        filebeat.yml            kibana            module       NOTICE.txt  s.log
[root@test13 filebeat-7.9.3-linux-x86_64]# ./filebeat -e

FileBeat Startup Flags

-c      specify the configuration file
-d "*"  enable debug output, useful for seeing the exact cause when an error occurs
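For example, a typical foreground run combining these flags with -e (log to stderr) would look like the following; the configuration file name assumes the filebeat.yml in the current directory:

./filebeat -e -c filebeat.yml -d "*"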

Testing

Append a new line to /var/log2/test.log (for example with vim /var/log2/test.log):

2021-07-01 20:07:25 [XNIO-1 task-2] INFO fileBeatLogData - 查詢用戶|4|com.internet.operator.controller..list()|GET|http://127.0.0.1:8080/list|127.0.0.1|jast110|9a2e232170744efda8c526d67f4f5405|userAcco909571P&installedLocation=&pageNum=10&pageSize=10&superQuery=1|{"code":200,"msg":"查詢成功","rows":[],"took":2,"total":1}|||0|||1625141245843||||||2021-07-01 20:07:25|142|91110108769392234H|測試111|X

Querying Elasticsearch shows that the log entry has been ingested.
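One way to confirm this (a sketch; the concrete index name depends on the ingest date and the indices rules in filebeat.yml above) is to search the common index for a value from the test line, e.g. the userName field:

GET logdata-common-*/_search
{
  "query": {
    "match": { "userName": "jast110" }
  }
}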

Personal WeChat official account (big data learning and exchange): hadoopwiki

Pipeline Configuration Details

1. Setting the Index _id from Log Data

Every document carries metadata fields such as _id, _index, and _type, and these can be accessed directly from processors, as in the example below:

{"set": {"field": "_id","value": "{{logdata.6}}"} }

FileBeat Configuration Details

Note: on first startup, FileBeat creates the index template (_template) in Elasticsearch from the settings in the FileBeat configuration. Restarting the service later does not update that template, even if the configuration has changed. For example, the shard and replica counts below are written into the index template on first startup, and modifying them afterwards has no effect. To apply a changed configuration, delete the data directory under the filebeat installation directory.

1. Setting Shard and Replica Counts for the Elasticsearch Indices Filebeat Writes To

Modify the following parameters in filebeat.yml:

setup.template.settings:
  # default number of primary shards per index
  index.number_of_shards: 1
  # default number of replicas per index
  index.number_of_replicas: 1
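Following the note above, a sketch of re-applying changed template settings might look like this; deleting the legacy template (named after setup.template.name) is an optional extra step, and the host and paths are assumptions to adjust for your environment:

# stop filebeat, then remove its local data directory (registry/state), as the note above suggests
rm -rf ./data

# optionally delete the existing legacy index template so it is re-created with the new settings
curl -X DELETE "http://10.8.10.12:9200/_template/logdata"

# restart filebeat
./filebeat -e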

Troubleshooting

ERROR instance/beat.go:802 Exiting: error initializing processors:

The full error output is:

2022-01-20T14:39:22.441+0800    ERROR    instance/beat.go:802    Exiting: error initializing processors: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
Exiting: error initializing processors: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?

Solution
Comment out add_docker_metadata and add_kubernetes_metadata in filebeat.yml:

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  # - add_docker_metadata: ~
  # - add_kubernetes_metadata: ~
