On the keepalive configuration of nginx and the Linux kernel

October 14, 2016

Today an iOS colleague reported a connection error:
Error Domain=NSURLErrorDomain Code=-1005 “The network connection was lost.”

Someone suggested tuning the server's keepalive parameters, so I had colleague Fan help check, and it did turn up something.
The current configuration:

nginx.conf

keepalive_timeout  65;

Kernel setting

# cat /proc/sys/net/ipv4/tcp_keepalive_time
30

The two values are inconsistent, which looks like a likely source of the problem.
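For what it's worth, the two knobs are not the same mechanism: nginx's keepalive_timeout is the HTTP keep-alive idle timeout, while tcp_keepalive_time controls when the kernel starts sending TCP keepalive probes on sockets that enable SO_KEEPALIVE. Still, if the intended fix is to raise the kernel value to nginx's 65s, a sketch:

```shell
# Check the current kernel value (seconds of idle time before the
# first TCP keepalive probe on sockets with SO_KEEPALIVE enabled):
cat /proc/sys/net/ipv4/tcp_keepalive_time

# Raise it to match nginx's keepalive_timeout of 65s
# (takes effect immediately, but does not survive a reboot):
sysctl -w net.ipv4.tcp_keepalive_time=65

# To persist across reboots, add this line to /etc/sysctl.conf:
#   net.ipv4.tcp_keepalive_time = 65
```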

Category: Work

A difference between JDK 1.8 and JDK 1.7: automatic awareness of the instance type

September 22, 2016

Ran into a problem today: the same jar worked in some environments and failed in others. Only after tracing it did I find this difference in JDK 1.8.
Simplified example code:

package test;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class JDK8 {
	private static final Map<String, String> map = new ConcurrentHashMap<String, String>();
	public static void main(String[] args) {
		// Compiled with JDK 1.8, this call resolves against the declared
		// type Map, whose putIfAbsent is a default method new in 1.8 --
		// even though ConcurrentHashMap has had its own putIfAbsent
		// since Java 5.
		map.putIfAbsent("name", "value");
	}
}
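My working theory (an assumption about the environments, not something the post confirms): the jar was compiled with JDK 1.8 but run on 1.7, so the bytecode calls Map.putIfAbsent, which does not exist on a 1.7 runtime, and the JVM throws NoSuchMethodError. A minimal sketch of one workaround is to declare the concrete type, so the call binds to ConcurrentHashMap.putIfAbsent instead (JDK8Fix is a hypothetical name):

```java
import java.util.concurrent.ConcurrentHashMap;

public class JDK8Fix {
	// Declaring the concrete type makes javac emit a call to
	// ConcurrentHashMap.putIfAbsent (available since Java 5) rather
	// than the Map.putIfAbsent default method added in JDK 1.8.
	private static final ConcurrentHashMap<String, String> map =
			new ConcurrentHashMap<String, String>();

	public static void main(String[] args) {
		map.putIfAbsent("name", "value");
		if (!"value".equals(map.get("name"))) {
			throw new AssertionError("putIfAbsent did not store the value");
		}
	}
}
```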


Category: Work

An HBase data-inconsistency problem: reproduction and resolution

September 18, 2016

Today I manually migrated a batch of HBase data and once again ran into this familiar face:

On-disk size without header provided is 327866, but block header contains 65584. Block offset: -1, data starts with: DATABLK* ....

Suspecting a data-inconsistency problem, I refreshed the data, and sure enough the error went away.
So for now, the only explanation I have is: a problem caused by inconsistent data.

The exact steps:

flush 'confirmedfile';
major_compact 'confirmedfile';

That is:
1. Flush all of table confirmedfile's memstores to HDFS, which produces a large number of storefiles.
2. Merge the storefiles.
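To script the two commands instead of typing them at the prompt, they can be piped into the HBase shell (assuming the hbase CLI is on the PATH; this just mirrors the interactive session above):

```shell
# Step 1: flush confirmedfile's memstores to HDFS, producing storefiles.
echo "flush 'confirmedfile'" | hbase shell

# Step 2: merge the resulting storefiles.
echo "major_compact 'confirmedfile'" | hbase shell
```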

Managing node environments with nvm

August 24, 2016

I had always installed node.js by manually downloading the tar.xz package from the official site, until one day I discovered a great tool: nvm.
Usage:

Pick a location to install nvm; I put it under ~/.git:

cd
mkdir .git
cd .git
git clone https://github.com/creationix/nvm.git

To have nvm.sh run automatically at login, edit ~/.bashrc:

vim ~/.bashrc

and add the following line:

source ~/.git/nvm/nvm.sh
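Once nvm.sh is sourced, a typical workflow looks like this (4.4.7 is just an example version, not one the post prescribes):

```shell
nvm ls-remote       # list node versions available to install
nvm install 4.4.7   # download and install one
nvm use 4.4.7       # switch the current shell to it
node -v             # confirm the active version
```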


Category: Work

A common problem with avatars

August 23, 2016

QA found a bug: a production user's avatar suddenly changed into another colleague's avatar.
After painful digging, the cause turned out to be embarrassingly basic: user ids in the test environment collided with production ids.

Problem description:
Avatar storage uses the same CDN and the same bucket, with the same naming rule (xxx_id), for both environments. As a result, a test-environment avatar URL collides with the production URL for the same id, and the test id and the production id belong to different people.

Fixes:
Option 1: reserve a large enough id range for the test environment to keep it disjoint from production.
Option 2: keep the CDN storage space separate from production, so the URLs for a test id and a production id differ.
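Option 2 can be sketched as namespacing the object key by environment. Everything below (the class name, the avatar_ prefix) is hypothetical, not the site's actual naming rule:

```java
public class AvatarKey {
	// Hypothetical sketch: prefix the CDN object key with the environment
	// so a test id and a production id can never map to the same URL path.
	static String avatarKey(String env, long userId) {
		return env + "/avatar_" + userId;
	}

	public static void main(String[] args) {
		System.out.println(avatarKey("test", 42)); // test/avatar_42
		System.out.println(avatarKey("prod", 42)); // prod/avatar_42
	}
}
```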

A basic mistake, but a fairly common one. Noting it down for future reference.

An HBase exception: an initial resolution

August 17, 2016

It looks like a storage problem. The exception stack first:

2016-08-16 05:44:30.265 ERROR 1419 --- [http-nio-9001-exec-283] -
java.lang.RuntimeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Tue Aug 16 05:44:30 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68076: row 'h�^A"^@^@^@^@^@e�ё ^D^@' on table 'fileindex' at region=fileindex,,1461811143505.c0d3d5ad7798ea1a5c6081fb35872735.,
 hostname=xxxx292,16020,1470015445443, seqNum=2444796
        at org.apache.hadoop.hbase.client.AbstractClientScanner$1.hasNext(AbstractClientScanner.java:97) ~[hbase-client-1.1.1.jar!/:1.1.1]
		....
        at sun.reflect.GeneratedMethodAccessor139.invoke(Unknown Source) ~[na:na]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_79]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_79]
        at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:222) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:814) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:737) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:969) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:871) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:845) [spring-webmvc-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) [tomcat-embed-websocket-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:121) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.4.RELEASE.jar!/:4.2.4.RELEASE]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.valves.RemoteIpValve.invoke(RemoteIpValve.java:676) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:521) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1096) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:674) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1500) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1456) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_79]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_79]
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.0.30.jar!/:8.0.30]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_79]
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Tue Aug 16 05:44:30 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68076: row 'h�^A"^@^@^@^@^@e�ё ^D^@' on table 'fileindex' at region=fileindex,,1461811143505.c0d3d5ad7798ea1a5c6081fb35872735., hostname=xxxx292,16020,1470015445443, seqNum=2444796
        at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:223) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:61) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:403) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:364) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.AbstractClientScanner$1.hasNext(AbstractClientScanner.java:94) ~[hbase-client-1.1.1.jar!/:1.1.1]
        ... 54 common frames omitted
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68076: row 'h�^A"^@^@^@^@^@e�ё ^D^@' on table 'fileindex' at region=fileindex,,1461811143505.c0d3d5ad7798ea1a5c6081fb35872735., hostname=xxxx292,16020,1470015445443, seqNum=2444796
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64) ~[hbase-client-1.1.1.jar!/:1.1.1]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_79]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_79]
        ... 1 common frames omitted
Caused by: java.io.IOException: java.io.IOException: Could not seekToPreviousRow StoreFileScanner[HFileScanner for reader reader=hdfs://ns06cluster/apps/hbase/data/data/default/fileindex/c0d3d5ad7798ea1a5c6081fb35872735/bs/1245e48bd690407893c35f0e0578f909, compression=none, cacheConf=blockCache=LruBlockCache{blockCount=147151, currentSize=9772114048, freeSize=514337664, maxSize=10286451712, heapSize=9772114048, minSize=9772129280, minFactor=0.95, multiSize=4886064640, multiFactor=0.5, singleSize=2443032320, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false, cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false, prefetchOnOpen=false, firstKey=\x00\x00\x00\x00\x00\x00\x00\x00\x00`\x06\xBA\xE4\x90\x04\x00/bs:fileKey/1464861184798/Put, lastKey=\xFF\xFD@"\x00\x00\x00\x00\x00d \x8B\xC4|\x04\x00/bs:fileKey/1469264431944/Put, avgKeyLen=38, avgValueLen=8, entries=690525, length=41267369, cur=h\x8D\x01"\x00\x00\x00\x00\x00e\xB5\xD1\x91 \x04\x01/bs:fileKey/1470964268832/Put/vlen=8/seqid=3148032] to key h\x8D\x01"\x00\x00\x00\x00\x00d\xDA\xCF\xCE<\x04\x13/bs:fileKey/1470045687342/Put/vlen=8/seqid=2554915
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekToPreviousRow(StoreFileScanner.java:477)
        at org.apache.hadoop.hbase.regionserver.ReversedKeyValueHeap.next(ReversedKeyValueHeap.java:136)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:596)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5587)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:5738)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5525)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2396)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32205)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: On-disk size without header provided is 196710, but block header contains 65569. Block offset: -1, data starts with: DATABLK*\x00\x01\x00!\x00\x01\x00\x0D\x00\x00\x00\x00\x00\xBF0\xCD\x01\x00\x00@\x00\x00\x01\x00
        at org.apache.hadoop.hbase.io.hfile.HFileBlock.validateOnDiskSizeWithoutHeader(HFileBlock.java:500)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock.access$700(HFileBlock.java:85)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1625)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1470)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:438)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekBefore(HFileReaderV2.java:674)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekBefore(HFileReaderV2.java:647)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekToPreviousRow(StoreFileScanner.java:441)
        ... 13 more


Category: Work