System connection limits

Raise the per-process open-file limit (check the current limits with `ulimit -a`):

```shell
ulimit -n 65535
```

Raise the maximum socket backlog and enable fast TIME_WAIT recycling (the system default is 0). Note that `sudo echo 10000 > /proc/...` does not work, because the redirection runs as the unprivileged user; wrap the write in a root shell instead:

```shell
sudo sh -c 'echo 10000 > /proc/sys/net/core/somaxconn'
sudo sh -c 'echo 1 > /proc/sys/net/ipv4/tcp_tw_recycle'
```

Disable SYN-flood protection (the system default is 1). It is strongly recommended to turn this back on once the load test is done:

```shell
sudo sh -c 'echo 0 > /proc/sys/net/ipv4/tcp_syncookies'
```

Nginx

With its default configuration, Nginx only accepts 1024 concurrent requests. To raise the limit, edit nginx.conf (usually at /etc/nginx/nginx.conf). In the global section, below `worker_processes 1;`, add `worker_rlimit_nofile 10240;` so that Nginx worker processes may open 10240 files. In the `events` block, change `worker_connections` from 1024 to 10000.

…
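The writes into /proc above only last until reboot. A minimal sketch of making the same values persistent via /etc/sysctl.conf; `sysctl -w` also sidesteps the sudo-redirection pitfall. Be aware that `tcp_tw_recycle` is known to break clients behind NAT and was removed entirely in Linux 4.12, so this applies to older kernels only:

```shell
# Apply immediately (equivalent to the /proc writes above)
sudo sysctl -w net.core.somaxconn=10000
sudo sysctl -w net.ipv4.tcp_tw_recycle=1

# Persist across reboots
sudo tee -a /etc/sysctl.conf <<'EOF'
net.core.somaxconn = 10000
net.ipv4.tcp_tw_recycle = 1
EOF

# Reload the file to verify it parses
sudo sysctl -p
```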
sbt configuration
Installation

Choose either of the following:

- Download the zip/tar package from the official site and unpack it
- `brew install sbt`

Environment variables

If you installed from the standalone package you need to set the environment variables yourself; the brew install configures them automatically.

Repository override

Create `~/.sbt/repositories` with the following content:

```
[repositories]
  local
  osc: http://maven.aliyun.com/nexus/content/groups/public/
  typesafe: http://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly
  sonatype-oss-releases
  maven-central
  sonatype-oss-snapshots
```

IDEA settings

- Point sbt-launch.jar at the install location
- Add the VM parameter `-Dsbt.override.build.repos=true`

…
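With the override flag set, sbt resolves only from the repositories file, so a quick way to confirm the mirror is actually in use is to watch the resolution log (this assumes a project directory and network access, and that the aliyun entry above is first in line):

```shell
# Resolve dependencies with only ~/.sbt/repositories in play;
# grep shows whether the aliyun mirror is being hit
sbt -Dsbt.override.build.repos=true update 2>&1 | grep aliyun
```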
Configuring a Spark development environment in IDEA via Scala
Steps

1. Add the Scala plugin to IDEA
2. Download Spark and check the Scala version in its jars directory, e.g. scala-compiler-2.11.8.jar, scala-library-2.11.8.jar
3. Download the matching Scala version
4. Add the Scala SDK in IDEA and import the Spark dependencies

Demo code

```scala
import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by wei on 17/3/19.
  */
object Demo extends App {
  val conf = new SparkConf().setAppName("hello").setMaster("local")
  val sc   = new SparkContext(conf)
  val text = sc.parallelize(Seq("a", "b", "c"))
  val c    = text.count()
  print(c)
}
```

Problems encountered

Project configuration

```
Error:scalac: Error: requirement failed: package compress
java.lang.IllegalArgumentException: requirement failed: package compress
    at scala.reflect.internal.Types$ModuleTypeRef.<init>(Types.scala:1879)
```

Fixed by correcting the project dependencies and SDK settings.

Missing dependency

```
Error:scalac: error while loading RDD, Missing dependency 'bad symbolic reference.
A signature in RDD.class refers to term annotation in package org.apache.spark which is not available.
```

spark-core_2.11-2.1.0.jar does contain the RDD class; after importing all the jars under the jars directory it ran normally.

These two problems stumped me for a long time, until [Error in running Spark-Scala example in intellij] mentioned the Scala version: Spark's dependencies already bundle Scala, but launching from IDEA still requires setting an SDK, and it only started normally after I downloaded a Scala version consistent with the one bundled with Spark.

…
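Step 2 above boils down to reading the version out of the jar file names. A small sketch of the string surgery involved, using a sample name from the text (in practice you would list `$SPARK_HOME/jars` and grep for `scala-library-`):

```shell
# Sample jar name as it appears in Spark's jars directory
jar="scala-library-2.11.8.jar"

# Strip the fixed prefix and the .jar suffix to get the Scala version
ver="${jar#scala-library-}"   # -> 2.11.8.jar
ver="${ver%.jar}"             # -> 2.11.8

echo "$ver"
```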
Day 1 - base software installation and configuration
Installation files

- CDH-5.3.10-1.cdh5.3.10.p0.19-el6.parcel
- cloudera-manager-el6-cm5.1.3_x86_64.tar.gz
- jdk-7-linux-x64.rpm

Install the JDK

```shell
rpm -qa | grep java                                  # check the existing Java version
yum -y remove                                        # uninstall it
rpm -ivh jdk-7-linux-x64.rpm
echo "JAVA_HOME=/usr/java/latest/" >> /etc/environment
```

Disable the firewall and SELinux

```shell
service iptables stop    # temporary
chkconfig iptables off   # takes effect after reboot
setenforce 0             # temporary
```

Edit /etc/selinux/config and set SELINUX=disabled (permanent after reboot).

Hosts (cdh1/2/3)

vi /etc/hosts:

```
192.168.0.114 cdh1
192.168.0.115 cdh2
192.168.0.116 cdh3
```

Passwordless login

On each node, generate a key and copy it to the other two. On cdh1:

```shell
ssh-keygen -t rsa
ssh-copy-id root@cdh2
ssh-copy-id root@cdh3
```

Repeat on cdh2 (targets cdh1 and cdh3) and on cdh3 (targets cdh1 and cdh2).

NTP

Install the ntp service on all nodes:

```shell
yum install ntp
chkconfig ntpd on
```

On cdh1:

```shell
ntpdate -u 202.120.2.101
vi /etc/ntp.conf
```

/etc/ntp.conf:

```
driftfile /var/lib/ntp/drift
restrict 127.0.0.1
restrict -6 ::1
restrict default nomodify notrap...
```

…
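The per-node key distribution above can be written once and run on every node. A sketch, assuming each node's hostname matches its /etc/hosts entry and that root SSH login is enabled:

```shell
# Generate a key non-interactively, then push it to every other node
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
for host in cdh1 cdh2 cdh3; do
  [ "$host" = "$(hostname)" ] && continue   # skip the local node
  ssh-copy-id "root@$host"
done
```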
Day 0 - virtual machine installation
Installation files

- CentOS-6.8-x86_64-minimal.iso
- VirtualBox-5.1.18-114002-OSX

VM installation

Configure the VM with 1 CPU, 2 GB of memory and a 3 GB disk, load the installation ISO, and choose bridged networking. After a minimal install the network interface is disabled by default.

```shell
vi /etc/sysconfig/network
vi /etc/hosts
```

Edit /etc/sysconfig/network-scripts/ifcfg-eth0:

```
DEVICE=eth0
HWADDR=08:00:27:D3:57:8E
TYPE=Ethernet
UUID=72f1df9c-a65f-4c51-8d11-00becf5a9030
ONBOOT=yes
NM_CONTROLLED=yes
BOOTPROTO=static
IPADDR=192.168.0.114
NETMASK=255.255.255.0
GATEWAY=192.168.0.1
```

After switching to a static address you need to add DNS: pick a fast server from your DNS list and write it into /etc/resolv.conf. Restart the network with `service network restart`. The remaining steps can be done over ssh.

…
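After the restart it is worth checking the gateway, DNS file, and name resolution in one pass before moving on to ssh. A quick sketch using the addresses from the config above:

```shell
service network restart
ping -c 1 192.168.0.1   # gateway from ifcfg-eth0 reachable?
cat /etc/resolv.conf    # nameserver entry written?
ping -c 1 www.baidu.com # name resolution working?
```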
Integrating Swagger into a web project
Add the dependencies

```xml
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.2.2</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.2.2</version>
</dependency>
```

Add the Spring configuration

```xml
<bean class="com.xx.uther.utils.SwaggerConfig"/>
<mvc:resources location="classpath:/META-INF/resources/" mapping="swagger-ui.html"/>
<mvc:resources location="classpath:/META-INF/resources/webjars/" mapping="/webjars/**"/>
```

The class built to configure Swagger:

```java
@EnableWebMvc
@EnableSwagger2
@Configuration
public class SwaggerConfig extends WebMvcConfigurationSupport {

    @Bean
    public Docket createRestApi() {
        return new Docket(DocumentationType.SWAGGER_2)
                .apiInfo(apiInfo())
                .select()
                .apis(RequestHandlerSelectors.basePackage("com.xx.uther.web.controller.api"))
                .paths(PathSelectors.any())
                .build();
    }

    private ApiInfo apiInfo() {
        return new ApiInfoBuilder()
                .title("RCM APIs")
                .contact("RCM项目组")
                .version("V2.0.0")
                .build();
    }
}
```

Annotate the Controller layer

Typical usage:

```java
@RestController
@RequestMapping(value = "/api", method = RequestMethod.GET)
@Api(description = "xxxx服务")
public class ServiceImplController {

    @Autowired
    RcmAdapter rcmAdapterImp;
    @Autowired
    Rcm4webAdapter rcm4webAdapter;

    @ApiOperation(value = "网认证方式", notes = "互联网认证方式")
    @RequestMapping(value = "/getAuthenticationFromRong360ByH5", method = RequestMethod.GET)
    public Object getAuthenticationFromRong360ByH5(@RequestParam String typeEnum,
                                                   @RequestParam String customerid)...
```

…
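Once the application is running, Springfox serves the generated spec and the UI at its default paths. A quick smoke test, assuming the app is reachable at localhost:8080 (host and port are assumptions, not from the text):

```shell
# Machine-readable spec produced by springfox-swagger2
curl -s http://localhost:8080/v2/api-docs | head -c 200; echo
# The interactive UI is served from the META-INF resources mapped above:
#   http://localhost:8080/swagger-ui.html
```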