
Exporting Hive data to MySQL


##################Environment###########################################

Software   Version                            Notes
Ubuntu     19.10
Sqoop      1.4.7
MySQL      8.0.20-0ubuntu0.19.10.1 (Ubuntu)
Hadoop     3.1.2
Hive       3.0.0                              Hive is involved only because HCAT_HOME needs to be set in .bashrc

##########################Dataset##################################

The dataset comes from [4].

After the MySQL data has been synced into the Hive internal table, the data in MySQL is cleared while keeping the table structure, so that exporting the data back from Hive can be verified.
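
A minimal way to clear the data while keeping the structure (assuming the table is the book table in sqoop_hbase used below) is a truncate:

mysql> TRUNCATE TABLE sqoop_hbase.book;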

##########################Approach 1###################################

The MySQL operations are as follows:

create database sqoop_hbase;

use sqoop_hbase;
CREATE TABLE book(id INT(4) PRIMARY KEY NOT NULL AUTO_INCREMENT,NAME VARCHAR(255) NOT NULL,price VARCHAR(255) NOT NULL);

Run the following command:

sqoop export --connect jdbc:mysql://Desktop:3306/sqoop_hbase --username appleyuchi --password appleyuchi --table book --export-dir /user/hive/warehouse/book

This ultimately hit Error 2 in the appendix, which I have not yet been able to resolve.
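
One possible cause of this kind of export failure (not verified here) is a field-delimiter mismatch: Hive's default text delimiter is \001 (Ctrl-A), while sqoop export assumes commas unless told otherwise. A sketch of the same command with the delimiter given explicitly, assuming the files under /user/hive/warehouse/book use the Hive default:

sqoop export --connect jdbc:mysql://Desktop:3306/sqoop_hbase --username appleyuchi --password appleyuchi --table book --export-dir /user/hive/warehouse/book --input-fields-terminated-by '\001'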

##########################Approach 2###################################

① Fetch the data files from the Hive internal table's storage path on HDFS to the local machine:

hdfs dfs -get /user/hive/warehouse/book/* ./

See [3] for where Hive internal tables are stored on HDFS.

② Load these data files into MySQL.

Before loading, MySQL needs to be configured as described in [2], since loading local data is disabled by default.
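
A minimal sketch of that setting (the full steps are in [2]): enable local_infile on the server side first:

mysql> SET GLOBAL local_infile = 1;

and reconnect the mysql client with --local-infile=1 so that the client side permits it as well. After that, the data files can be loaded: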

mysql> LOAD DATA LOCAL INFILE '/home/appleyuchi/桌面/數據導入/part-m-00000' INTO TABLE sqoop_hbase.book;

mysql> LOAD DATA LOCAL INFILE '/home/appleyuchi/桌面/數據導入/part-m-00001' INTO TABLE sqoop_hbase.book;

mysql> LOAD DATA LOCAL INFILE '/home/appleyuchi/桌面/數據導入/part-m-00002' INTO TABLE sqoop_hbase.book;
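
As a quick check that the rows actually arrived (a suggested verification, not part of the original steps):

mysql> SELECT COUNT(*) FROM sqoop_hbase.book;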

#--------------------------------------------------------Appendix: Error 1------------------------------------------------------------------------------------------

The resulting log is as follows:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/appleyuchi/bigdata/hadoop-3.1.2/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/appleyuchi/bigdata/apache-hive-3.0.0-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/appleyuchi/bigdata/hbase-2.2.4/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-05-29 22:03:31,127 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
Note: /tmp/sqoop-appleyuchi/compile/95fdfeaba705f570f20353a9b48f576b/book.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

The job then hangs with no error message at all and fails.

The actual error message is at:

http://desktop:8088/cluster/scheduler

Yes, you read that correctly: the error message for Sqoop operating on Hive only shows up in the YARN management UI.

Application is added to the scheduler and is not yet activated. Queue's AM resource limit exceeded. Details : AM Partition = <DEFAULT_PARTITION>; AM Resource Request = <memory:3000, vCores:1>; Queue Resource Limit for AM = <memory:7500, vCores:1>; User AM Resource Limit of the queue = <memory:7500, vCores:1>; Queue AM Resource Usage = <memory:6000, vCores:2>;

Modify yarn.scheduler.capacity.maximum-am-resource-percent in $HADOOP_HOME/etc/hadoop/capacity-scheduler.xml; I raised it to 0.8.
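
For reference, the corresponding property in capacity-scheduler.xml looks roughly like this (0.8 is the value used here; the default shipped with Hadoop is 0.1):

<property>
  <!-- maximum fraction of cluster resources that may be used to run application masters -->
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <value>0.8</value>
</property>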

Note: do not modify capacity-scheduler.xml on the namenode only; the file on the datanodes needs to be changed as well.

Then restart the HDFS cluster. Hive does not need to be restarted, because Hive sits on top of HDFS, so only the Hive installation on the namenode needs attention.

#--------------------------------------------------------Appendix: Error 2------------------------------------------------------------------------------------------

2020-05-29 23:46:22,486 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2020-05-29 23:46:22,496 ERROR mapreduce.ExportJobBase: Export job failed!
2020-05-29 23:46:22,496 ERROR tool.ExportTool: Error during export: Export job failed!
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
	at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

#---------------------------------------------------------------------------------------------------------------------------------------------------------

References:

[1] Importing Hive data into MySQL

[2] Fixing "ERROR 3948 (42000): Loading local data is disabled"

[3] Paths where the Hive data warehouse stores data on HDFS (continuously updated)

[4] Importing MySQL data into Hive with Sqoop
