HBase Import Data Load Failure: RetriesExhaustedWithDetailsException
Alert shown in CDH (original screenshot not preserved).

Root cause:
Running the import:

hbase org.apache.hadoop.hbase.mapreduce.Import -Dmapred.job.queue.name=etl crawl:wechat_biz /hbase/test4

During the import, too much data was written in too short a time, causing the writes to fail:
18/09/11 09:44:27 INFO mapreduce.Job: Task Id : attempt_1536465059397_0003_m_000125_1, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 404 actions:
org.apache.hadoop.hbase.RegionTooBusyException: Over memstore limit=256.0M, regionName=2431744e123e49dee5f099876ebb8bff, server=testHostName,16020,1536467992250
	at org.apache.hadoop.hbase.regionserver.HRegion.checkResources(HRegion.java:4194)
	at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3815)
	at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3755)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:1027)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicBatchOp(RSRpcServices.java:959)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:922)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2666)
	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
: 404 times, servers with issues: fwqzx011.zh,16020,1536467992250
	at org.apache.hadoop.hbase.client.BatchErrors.makeException(BatchErrors.java:54)
	at org.apache.hadoop.hbase.client.AsyncRequestFutureImpl.getErrors(AsyncRequestFutureImpl.java:1225)
	at org.apache.hadoop.hbase.client.BufferedMutatorImpl.doFlush(BufferedMutatorImpl.java:309)
	at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:203)
	at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:179)
	at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:143)
	at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:93)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:584)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:539)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:522)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:505)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1685)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
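To see why the retries eventually exhaust, it helps to sketch the region server's flush-vs-block decision. The snippet below is only an illustration of the logic explained in this post, not HBase source code; the constants mirror the default configuration values, and the function name is made up for the example.

```python
# Illustrative sketch (not HBase code) of how a region server reacts
# to the current memstore size, per the two config parameters.

FLUSH_SIZE_MB = 128      # hbase.hregion.memstore.flush.size (default 128 MB)
BLOCK_MULTIPLIER = 4     # hbase.hregion.memstore.block.multiplier (default 4)

def region_action(memstore_mb: float) -> str:
    """Return what the region server does at a given memstore size (MB)."""
    blocking_limit_mb = FLUSH_SIZE_MB * BLOCK_MULTIPLIER  # 512 MB by default
    if memstore_mb >= blocking_limit_mb:
        # Writes are rejected with RegionTooBusyException; the client keeps
        # retrying until its retry budget is exhausted, then surfaces
        # RetriesExhaustedWithDetailsException to the Import job.
        return "block"
    if memstore_mb >= FLUSH_SIZE_MB:
        # A flush is triggered, but writes are still accepted; if the import
        # writes faster than the flush drains, the memstore keeps growing.
        return "flush"
    return "accept"

print(region_action(100))  # small memstore: writes accepted
print(region_action(200))  # past flush.size: flush triggered
print(region_action(512))  # past flush.size * multiplier: writes blocked
```

The fix below works by raising `blocking_limit_mb`, giving flushes more time to catch up with a burst of writes.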
Cause: the region's memstore exceeded n times its normal size. At that point the region server throws an exception, the write request is rejected, and the client begins retrying. When a memstore reaches 128 MB, a flush is triggered; if it grows to 128 MB × n before the flush can catch up, the region server rejects further writes by throwing the exception. The two relevant parameters have the following defaults:
hbase.hregion.memstore.flush.size=128M
hbase.hregion.memstore.block.multiplier=4

Adjust these parameters so that the data import no longer fails:
hbase.hregion.memstore.flush.size=512M
hbase.hregion.memstore.block.multiplier=8

For more on hbase.hregion.memstore.block.multiplier, see:
https://blog.csdn.net/zhangshenghang/article/details/82745205
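In hbase-site.xml, the adjusted values above would look roughly like the sketch below. The byte values are computed from the MB figures in this post (HBase expects flush.size in bytes); verify that the region server heap can absorb the larger memstores, and restart (or rolling-restart) the region servers after changing them.

```xml
<!-- hbase-site.xml: raise the per-region flush threshold and the blocking
     multiplier so a bulk Import is less likely to hit RegionTooBusyException. -->
<property>
  <name>hbase.hregion.memstore.flush.size</name>
  <value>536870912</value> <!-- 512 MB; default is 134217728 (128 MB) -->
</property>
<property>
  <name>hbase.hregion.memstore.block.multiplier</name>
  <value>8</value> <!-- default is 4 -->
</property>
```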
Summary
Raising hbase.hregion.memstore.flush.size and hbase.hregion.memstore.block.multiplier gives each region's memstore more headroom during a bulk Import, so writes are no longer rejected with RegionTooBusyException and the job no longer fails with RetriesExhaustedWithDetailsException.