

Hadoop in Practice: Fixing the Error When Flume Ships Logs to HDFS

Published on 生活随笔, 2024/9/27

When using Flume to store logs in HDFS, the following error appeared at startup:

2017-06-16 08:58:32,634 (conf-file-poller-0) [ERROR - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:146)] Failed to start agent because dependencies were not found in classpath. Error follows.

java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType

at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:235)

at org.apache.flume.conf.Configurables.configure(Configurables.java:41)

at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:411)

at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:102)

at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:141)

at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)

at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)

at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)

at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

at java.lang.Thread.run(Thread.java:748)

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType

at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

... 12 more



Frankly, this error message is not very helpful; it is hard to tell where the problem is. At first glance it looks like a Java classpath issue. An article I later found online ("Problems encountered installing Flume and their solutions") explained that the cause is missing Hadoop jar dependencies, which makes sense. The fix is simple: unpack a Hadoop binary distribution on the Flume host and export the Hadoop environment variables; Flume uses those variables to locate the dependency jars automatically.

Because my Flume instance is installed standalone rather than on a Hadoop node, Hadoop first has to be installed on the Flume host.

Download the Hadoop binary tarball from a domestic mirror or the official site.

$ tar xvf hadoop-2.8.0.tar.gz -C /usr/local/

$ ln -sv /usr/local/hadoop-2.8.0/ /usr/local/hadoop

$ useradd hadoop

$ passwd hadoop



The key step is exporting the Hadoop environment variables. Create the profile file /etc/profile.d/hadoop.sh and define variables like the following to set up the Hadoop runtime environment.

#!/bin/bash

#

export HADOOP_PREFIX="/usr/local/hadoop"

export PATH=$PATH:$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin

export HADOOP_COMMON_HOME=${HADOOP_PREFIX}

export HADOOP_HDFS_HOME=${HADOOP_PREFIX}

export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}

export HADOOP_YARN_HOME=${HADOOP_PREFIX}



$ source /etc/profile
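Before restarting the agent, it is worth confirming that the variables actually resolve in the current shell. The sketch below writes the same fragment to a scratch file (so it can be tried anywhere) and sources it; on a real host the file is /etc/profile.d/hadoop.sh, picked up by `source /etc/profile`.

```shell
# Sketch: write the profile fragment to a temp file and verify the
# variables resolve after sourcing it. Paths match the setup above.
profile=$(mktemp)
cat > "$profile" <<'EOF'
export HADOOP_PREFIX="/usr/local/hadoop"
export PATH=$PATH:$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin
export HADOOP_COMMON_HOME=${HADOOP_PREFIX}
export HADOOP_HDFS_HOME=${HADOOP_PREFIX}
export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}
export HADOOP_YARN_HOME=${HADOOP_PREFIX}
EOF
. "$profile"
echo "HADOOP_COMMON_HOME=$HADOOP_COMMON_HOME"
```

If the last line does not print /usr/local/hadoop, the profile was not sourced and flume-ng will hit the same NoClassDefFoundError again.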


Running flume-ng again should now work.
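For reference, an HDFS-sink agent of the kind that exercises this code path is configured roughly as follows. This is a minimal sketch, not the configuration from this article: the agent name, source command, and HDFS path are illustrative placeholders.

```properties
# Illustrative Flume agent; adjust names and paths to your environment.
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/messages
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.useLocalTimeStamp = true
```

It would be started with something like `flume-ng agent --conf conf --conf-file hdfs.conf --name a1`. The `hdfs` sink type is what pulls in the Hadoop classes (such as SequenceFile$CompressionType) that were missing above.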

Separately, once flume-ng was running normally, writing to HDFS failed with another error:

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/":hadoop:supergroup:drwxr-xr-x

This message is clear: the current user has no write permission, because flume-ng is being run as root rather than as the hadoop user. The fix is equally simple: switch to the hadoop user and run flume-ng from there.

$ chown hadoop.hadoop -R /usr/local/flume/

$ su - hadoop

$ flume-ng .....



Alternatively, configure HDFS to allow any user to write files, which is not permitted by default. Add the following property definition to the hdfs-site.xml configuration file:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>


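For context, that property goes inside the `<configuration>` element of hdfs-site.xml; a minimal sketch of the whole file is below. Note that in Hadoop 2.x the key was renamed `dfs.permissions.enabled` (the old `dfs.permissions` name still works as a deprecated alias), and disabling permission checks cluster-wide is a blunt instrument — granting the Flume user a writable directory, as above, is the safer option.

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Disables HDFS permission checking entirely; use with care. -->
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
</configuration>
```

The NameNode must be restarted for the change to take effect.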

That's all.

