Today I manually migrated a batch of HBase data and once again ran into this familiar face:
[code]
On-disk size without header provided is 327866, but block header contains 65584. Block offset: -1, data starts with: DATABLK* ....
[/code]

Suspecting a data-consistency problem, I flushed the table, and sure enough the error went away. Judging from the bottom of the trace, the check that fails is HFileBlock.validateOnDiskSizeWithoutHeader: the block size the reader was told to expect does not match the size recorded in the block's own header, i.e. the HFile's index and its data blocks disagree.
So for now I can only read the situation as: a problem caused by inconsistent data.

The exact steps, in the HBase shell:
[code]
flush 'confirmedfile';
major_compact 'confirmedfile';
[/code]
That is:
1. Flush all memstores of the confirmedfile table to HDFS, which produces additional store files.
2. Merge the store files with a major compaction.
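
For completeness, the same two steps can be driven from the Java Admin API. This is only a minimal sketch under assumptions the post does not state: an HBase 1.1.x client on the classpath, a reachable hbase-site.xml, and the hypothetical class name FlushAndCompact.
[code]
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class FlushAndCompact {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml from the classpath (assumption: it is reachable).
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {
            TableName table = TableName.valueOf("confirmedfile");
            // Step 1: flush every memstore of the table to new HFiles on HDFS.
            admin.flush(table);
            // Step 2: request a major compaction that rewrites and merges the
            // store files; the region servers run it asynchronously.
            admin.majorCompact(table);
        }
    }
}
[/code]
Like the shell command, majorCompact() only queues the compaction; allow the region servers some time before re-testing reads.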

The full exception from this incident is attached below:
[code]
2016-09-18 15:25:24.211 ERROR 7255 --- [http-nio-9001-exec-157] c.b.xxx.api.command.ApiFacadeCommand : onExecute(ApiCommandContext) sig:61ca0c3830d6024992c74ada0516ccbb

java.lang.RuntimeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Sun Sep 18 15:25:24 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60931: row '^@`�B^B�^D^@' on table 'confirmedfile' at region=confirmedfile,,1469976800253.cfaa0bc267a81ee3dd6e6ce9e0f9ea13., hostname=xxxx294,16020,1470015347493, seqNum=803411

at org.apache.hadoop.hbase.client.AbstractClientScanner$1.hasNext(AbstractClientScanner.java:97) ~[hbase-client-1.1.2.jar!/:1.1.2]
...
at sun.reflect.GeneratedMethodAccessor355.invoke(Unknown Source) ~[na:na]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_79]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_79]
...
at sun.reflect.GeneratedMethodAccessor149.invoke(Unknown Source) ~[na:na]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_79]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_79]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:222) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:814) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:737) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:969) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:871) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:845) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) [tomcat-embed-websocket-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:121) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.valves.RemoteIpValve.invoke(RemoteIpValve.java:676) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:521) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1096) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:674) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1500) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1456) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_79]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_79]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_79]
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Sun Sep 18 15:25:24 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60931: row '^@`�B^B�^D^@' on table 'confirmedfile' at region=confirmedfile,,1469976800253.cfaa0bc267a81ee3dd6e6ce9e0f9ea13., hostname=xxxx294,16020,1470015347493, seqNum=803411

at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:195) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:59) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:403) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:364) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.AbstractClientScanner$1.hasNext(AbstractClientScanner.java:94) ~[hbase-client-1.1.2.jar!/:1.1.2]
... 53 common frames omitted
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=60931: row '^@`�B^B�^D^@' on table 'confirmedfile' at region=confirmedfile,,1469976800253.cfaa0bc267a81ee3dd6e6ce9e0f9ea13., hostname=xxxx294,16020,1470015347493, seqNum=803411
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64) ~[hbase-client-1.1.2.jar!/:1.1.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_79]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_79]
... 1 common frames omitted
Caused by: java.io.IOException: java.io.IOException: Could not seekToPreviousRow StoreFileScanner[HFileScanner for reader reader=hdfs://ns06cluster/apps/hbase/data/data/default/confirmedfile/cfaa0bc267a81ee3dd6e6ce9e0f9ea13/bs/0ea88642120c416da4b284e170767273, compression=none, cacheConf=blockCache=LruBlockCache{blockCount=148008, currentSize=9773602864, freeSize=512848848, maxSize=10286451712, heapSize=9773602864, minSize=9772129280, minFactor=0.95, multiSize=4886064640, multiFactor=0.5, singleSize=2443032320, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false, cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false, prefetchOnOpen=false, firstKey=\x00_\xC6\xFF[8\x04\x00/bs:/1473646930474/DeleteFamily, lastKey=\x00`\xB9B\x01|\x04\x00/bs:/1474176630295/DeleteFamily, avgKeyLen=22, avgValueLen=0, entries=133949, length=4895314, cur=\x00`y:\x8D(\x04\x00/bs:/1474175903843/DeleteFamily/vlen=0/seqid=3165495] to key \x00`y:\x8B\xF8\x04\x00/bs:/1474175903796/DeleteFamily/vlen=0/seqid=3165493
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekToPreviousRow(StoreFileScanner.java:477)
at org.apache.hadoop.hbase.regionserver.ReversedKeyValueHeap.next(ReversedKeyValueHeap.java:136)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:629)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5587)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:5738)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5525)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2396)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32205)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: On-disk size without header provided is 327866, but block header contains 65584. Block offset: -1, data starts with: DATABLK*\x00\x01\x000\x00\x01\x00\x1C\x00\x00\x00\x00\x007\x0E\x91\x01\x00\[email protected]\x00\x00\x01\x00
at org.apache.hadoop.hbase.io.hfile.HFileBlock.validateOnDiskSizeWithoutHeader(HFileBlock.java:500)
at org.apache.hadoop.hbase.io.hfile.HFileBlock.access$700(HFileBlock.java:85)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1625)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1470)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:438)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekBefore(HFileReaderV2.java:674)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekBefore(HFileReaderV2.java:647)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekToPreviousRow(StoreFileScanner.java:441)
... 13 more

at sun.reflect.GeneratedConstructorAccessor217.newInstance(Unknown Source) ~[na:na]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_79]
at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_79]
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) ~[hadoop-common-2.7.1.jar!/:na]
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95) ~[hadoop-common-2.7.1.jar!/:na]
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:325) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:255) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126) ~[hbase-client-1.1.2.jar!/:1.1.2]
... 4 common frames omitted
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: java.io.IOException: Could not seekToPreviousRow StoreFileScanner[HFileScanner for reader reader=hdfs://ns06cluster/apps/hbase/data/data/default/confirmedfile/cfaa0bc267a81ee3dd6e6ce9e0f9ea13/bs/0ea88642120c416da4b284e170767273, compression=none, cacheConf=blockCache=LruBlockCache{blockCount=148008, currentSize=9773602864, freeSize=512848848, maxSize=10286451712, heapSize=9773602864, minSize=9772129280, minFactor=0.95, multiSize=4886064640, multiFactor=0.5, singleSize=2443032320, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false, cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false, prefetchOnOpen=false, firstKey=\x00_\xC6\xFF[8\x04\x00/bs:/1473646930474/DeleteFamily, lastKey=\x00`\xB9B\x01|\x04\x00/bs:/1474176630295/DeleteFamily, avgKeyLen=22, avgValueLen=0, entries=133949, length=4895314, cur=\x00`y:\x8D(\x04\x00/bs:/1474175903843/DeleteFamily/vlen=0/seqid=3165495] to key \x00`y:\x8B\xF8\x04\x00/bs:/1474175903796/DeleteFamily/vlen=0/seqid=3165493
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekToPreviousRow(StoreFileScanner.java:477)
at org.apache.hadoop.hbase.regionserver.ReversedKeyValueHeap.next(ReversedKeyValueHeap.java:136)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:629)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5587)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:5738)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5525)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2396)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32205)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: On-disk size without header provided is 327866, but block header contains 65584. Block offset: -1, data starts with: DATABLK*\x00\x01\x000\x00\x01\x00\x1C\x00\x00\x00\x00\x007\x0E\x91\x01\x00\[email protected]\x00\x00\x01\x00
at org.apache.hadoop.hbase.io.hfile.HFileBlock.validateOnDiskSizeWithoutHeader(HFileBlock.java:500)
at org.apache.hadoop.hbase.io.hfile.HFileBlock.access$700(HFileBlock.java:85)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1625)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1470)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:438)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekBefore(HFileReaderV2.java:674)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekBefore(HFileReaderV2.java:647)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekToPreviousRow(StoreFileScanner.java:441)
... 13 more

at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1248) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287) ~[hbase-client-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651) ~[hbase-protocol-1.1.2.jar!/:1.1.2]
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:213) ~[hbase-client-1.1.2.jar!/:1.1.2]
... 9 common frames omitted
[/code]
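
One detail worth noting in the trace: the failing server-side frames (ReversedKeyValueHeap.next, StoreFileScanner.seekToPreviousRow) show the read was a reversed scan, and the column family bs appears in the corrupt store file's path. A minimal sketch of the kind of client call that exercises exactly this path, assuming the same cluster configuration as in the earlier sketch (the loop body is illustrative only):
[code]
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class ReversedScanExample {
    public static void main(String[] args) throws Exception {
        Scan scan = new Scan();
        // A reversed scan walks rows backwards; on the server each step calls
        // StoreFileScanner.seekToPreviousRow(), where the bad block header
        // was detected in the trace above.
        scan.setReversed(true);
        scan.addFamily(Bytes.toBytes("bs")); // family name taken from the store file path
        try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = conn.getTable(TableName.valueOf("confirmedfile"));
             ResultScanner scanner = table.getScanner(scan)) {
            for (Result r : scanner) {
                System.out.println(Bytes.toStringBinary(r.getRow()));
            }
        }
    }
}
[/code]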

Tags: flush, hbase, major_compact, On-disk size without header provided is
