
Handling HBase Import data-loading exceptions: RetriesExhaustedWithDetailsException


The CDH console flagged the problem (original screenshot not preserved).

Cause of the problem:

hbase org.apache.hadoop.hbase.mapreduce.Import -Dmapred.job.queue.name=etl crawl:wechat_biz /hbase/test4

While this Import command was executing, the volume of data written in a short window was too large, causing the write failures below.

18/09/11 09:44:27 INFO mapreduce.Job: Task Id : attempt_1536465059397_0003_m_000125_1, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 404 actions:
org.apache.hadoop.hbase.RegionTooBusyException: Over memstore limit=256.0M, regionName=2431744e123e49dee5f099876ebb8bff, server=testHostName,16020,1536467992250
    at org.apache.hadoop.hbase.regionserver.HRegion.checkResources(HRegion.java:4194)
    at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3815)
    at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3755)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:1027)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicBatchOp(RSRpcServices.java:959)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:922)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2666)
    at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
: 404 times, servers with issues: fwqzx011.zh,16020,1536467992250
    at org.apache.hadoop.hbase.client.BatchErrors.makeException(BatchErrors.java:54)
    at org.apache.hadoop.hbase.client.AsyncRequestFutureImpl.getErrors(AsyncRequestFutureImpl.java:1225)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.doFlush(BufferedMutatorImpl.java:309)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:203)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:179)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:143)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:93)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:584)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:539)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:522)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:505)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1685)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)


Cause: the region's memstore grew past its blocking limit. A flush is requested once a memstore reaches 128 MB, but if writes arrive faster than the flush can drain it and the memstore reaches 128 MB × n (where n is the block multiplier), the RegionServer throws RegionTooBusyException, rejects the write, and the client starts retrying. (The log above shows an effective limit of 256.0M, so this cluster was evidently not running pure defaults.) The two relevant parameters and their default values:

hbase.hregion.memstore.flush.size=128M
hbase.hregion.memstore.block.multiplier=4
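To make the mechanism concrete, here is a minimal Java sketch of the blocking check, loosely modeled on the HRegion.checkResources frame visible in the stack trace above. The class and constant names are illustrative assumptions, not the actual HBase source:

// Minimal sketch of the per-region memstore blocking check, loosely
// modeled on HRegion.checkResources (seen in the stack trace above).
// Names and structure are illustrative, not the real HBase code.
public class MemstoreBlockingSketch {

    static final long MB = 1024L * 1024L;

    // hbase.hregion.memstore.flush.size (default 128 MB):
    // reaching this size only *requests* an asynchronous flush.
    static final long FLUSH_SIZE = 128 * MB;

    // hbase.hregion.memstore.block.multiplier (default 4):
    // writes are hard-blocked at FLUSH_SIZE * BLOCK_MULTIPLIER.
    static final int BLOCK_MULTIPLIER = 4;

    static final long BLOCKING_LIMIT = FLUSH_SIZE * BLOCK_MULTIPLIER;

    static void checkResources(long memstoreSizeBytes) {
        if (memstoreSizeBytes > BLOCKING_LIMIT) {
            // In HBase this surfaces as RegionTooBusyException
            // ("Over memstore limit=...") and the client retries.
            throw new IllegalStateException(
                "Over memstore limit=" + (BLOCKING_LIMIT / MB) + "M");
        }
    }

    public static void main(String[] args) {
        checkResources(100 * MB); // below flush size: accepted
        checkResources(300 * MB); // flush requested, still accepted
        try {
            checkResources(600 * MB); // over 128 MB * 4 = 512 MB: rejected
        } catch (IllegalStateException e) {
            System.out.println("Write rejected: " + e.getMessage());
        }
    }
}

Because the flush at FLUSH_SIZE is asynchronous, a writer that outruns the flusher keeps growing the memstore until it crosses the blocking limit, which is exactly what a bulk Import tends to do.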

Adjust the two parameters so that the data import no longer hits the limit:

hbase.hregion.memstore.flush.size=512M
hbase.hregion.memstore.block.multiplier=8
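These are cluster-side RegionServer settings, typically applied on CDH through the HBase service configuration (for example an hbase-site.xml safety valve) followed by a RegionServer restart. A sketch of the equivalent hbase-site.xml entries (the flush size is given in bytes):

<!-- hbase-site.xml: raise the per-region flush size and blocking
     multiplier so a bulk Import does not hit the write block -->
<property>
  <name>hbase.hregion.memstore.flush.size</name>
  <value>536870912</value> <!-- 512 MB, up from the 128 MB default -->
</property>
<property>
  <name>hbase.hregion.memstore.block.multiplier</name>
  <value>8</value> <!-- blocking limit: 512 MB * 8 = 4 GB per region -->
</property>

Larger memstores consume more RegionServer heap, and the global memstore limit (hbase.regionserver.global.memstore.size) still caps the total across all regions, so check that the heap is sized for the new values.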

For more on the hbase.hregion.memstore.block.multiplier parameter, see:

https://blog.csdn.net/zhangshenghang/article/details/82745205
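A complementary, job-side option not covered in the original article: if the cluster settings cannot be changed, the write burst can be reduced by capping how many map tasks run concurrently. mapreduce.job.running.map.limit is a standard Hadoop 2.7+ job property; the limit of 10 here is an arbitrary example:

hbase org.apache.hadoop.hbase.mapreduce.Import \
  -Dmapred.job.queue.name=etl \
  -Dmapreduce.job.running.map.limit=10 \
  crawl:wechat_biz /hbase/test4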

Summary

A RetriesExhaustedWithDetailsException wrapping RegionTooBusyException during an HBase Import means the job is writing faster than the regions can flush their memstores. Raising hbase.hregion.memstore.flush.size and hbase.hregion.memstore.block.multiplier gives each region enough headroom to absorb the write burst, allowing the import to complete without errors.
