【Java】JDBC Part 5.1: HikariCP Connection Pool Supplement
Hikari Connection Pool (HikariCP)
Official HikariCP documentation: https://github.com/brettwooldridge/HikariCP

Maven dependency

Java 8 is generally what's in use, so the HikariCP 3.x line is the one to pick.
Maven repository page:
https://mvnrepository.com/artifact/com.zaxxer/HikariCP/3.4.5
<dependency>
    <groupId>com.zaxxer</groupId>
    <artifactId>HikariCP</artifactId>
    <version>3.4.2</version>
</dependency>
Obtaining a connection with the official hard-coded configuration:
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.junit.Test;

import java.sql.Connection;

public class HikariTest {
    @Test
    public void hikariTest() throws Exception {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai");
        config.setUsername("root");
        config.setPassword("123456");
        // MySQL Connector/J properties for the prepared-statement cache
        config.addDataSourceProperty("cachePrepStmts", "true");
        config.addDataSourceProperty("prepStmtCacheSize", "300");
        config.addDataSourceProperty("prepStmtCacheSqlLimit", "2048");

        HikariDataSource dataSource = new HikariDataSource(config);
        Connection connection = dataSource.getConnection();
        System.out.println(connection);
        connection.close();
    }
}
Because no SLF4J binding (logging implementation) is on the classpath, SLF4J prints a "Failed to load class" message here; the connection itself still works.
Besides the basic connection settings, the hard-coded configuration above also tunes the pool's prepared-statement caching (these are MySQL Connector/J properties passed through addDataSourceProperty):
- cachePrepStmts = true enables the prepared-statement cache
- prepStmtCacheSize = 300 sets the number of statements cached per connection
- prepStmtCacheSqlLimit = 2048 sets the maximum length (in characters) of a SQL statement that will be cached
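To get rid of that SLF4J message and see Hikari's own log output, an SLF4J binding can be added. A minimal sketch, assuming slf4j-simple is acceptable for test code (the version below is only an illustrative choice, not taken from the original article):
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.7.30</version>
</dependency>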

Hikari also supports reading its configuration from a properties file.
But the official explanation of the driver class name is quite confusing; I tried for a long time without success.
Then I found this blog post: http://zetcode.com/articles/hikaricp/
The point is that for MySQL you don't need to configure a driver class name at all; supplying the JDBC URL is enough.
hikari.properties
jdbcUrl = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
dataSource.user = root
dataSource.password = 123456
# Enable the prepared-statement cache
dataSource.cachePrepStmts = true
# Number of prepared statements cached per connection
dataSource.prepStmtCacheSize = 256
# Maximum length (in characters) of a SQL statement that will be cached
dataSource.prepStmtCacheSqlLimit = 512
Connection test:
@Test
public void hikariTest2() throws Exception {
    final String configureFile = "src/main/resources/hikari.properties";
    HikariConfig configure = new HikariConfig(configureFile);
    HikariDataSource dataSource = new HikariDataSource(configure);
    Connection connection = dataSource.getConnection();
    System.out.println(connection);
    connection.close();
}

Wrapping it in a utility class:
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

public class JdbcHikariUtil {

    private static final DataSource dataSource =
            new HikariDataSource(new HikariConfig("src/main/resources/hikari.properties"));

    public static Connection getConnection() {
        try {
            return dataSource.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }
}
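A quick, hypothetical usage sketch of this utility class. Since the connection comes from a pool, close() only returns it to the pool instead of physically closing it, which makes try-with-resources a natural fit:
// Hypothetical usage of JdbcHikariUtil; close() hands the connection back to the pool.
try (Connection connection = JdbcHikariUtil.getConnection()) {
    System.out.println(connection);
} catch (SQLException sqlException) {
    sqlException.printStackTrace();
}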
Now let's build a single utility class that wraps plain JDBC together with all of the pools (C3P0, DBCP, Druid, Hikari).
First, the Maven dependencies:
<!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>8.0.19</version>
</dependency>

<!-- https://mvnrepository.com/artifact/junit/junit -->
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13</version>
    <scope>test</scope>
</dependency>

<!-- https://mvnrepository.com/artifact/com.mchange/c3p0 -->
<dependency>
    <groupId>com.mchange</groupId>
    <artifactId>c3p0</artifactId>
    <version>0.9.5.5</version>
</dependency>

<!-- https://mvnrepository.com/artifact/commons-dbutils/commons-dbutils -->
<dependency>
    <groupId>commons-dbutils</groupId>
    <artifactId>commons-dbutils</artifactId>
    <version>1.7</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-dbcp2 -->
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-dbcp2</artifactId>
    <version>2.7.0</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-pool2 -->
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-pool2</artifactId>
    <version>2.8.0</version>
</dependency>

<!-- https://mvnrepository.com/artifact/com.alibaba/druid -->
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>druid</artifactId>
    <version>1.1.22</version>
</dependency>

<dependency>
    <groupId>com.zaxxer</groupId>
    <artifactId>HikariCP</artifactId>
    <version>3.4.2</version>
</dependency>
Next, the configuration for each pool.
Plain JDBC: jdbc.properties
driverClass = com.mysql.cj.jdbc.Driver
url = jdbc:mysql://localhost:3306/jdbc_db?serverTimezone=Asia/Shanghai
user = root
password = 123456
C3P0: c3p0-config.xml
<?xml version="1.0" encoding="UTF-8" ?>
<c3p0-config>
    <!-- Custom named configuration -->
    <named-config name="c3p0_xml_config">
        <!-- The four basic connection settings -->
        <property name="driverClass">com.mysql.cj.jdbc.Driver</property>
        <!-- For localhost the host/port can be omitted: jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai -->
        <property name="jdbcUrl">jdbc:mysql://localhost:3306/jdbc_db?serverTimezone=Asia/Shanghai</property>
        <property name="user">root</property>
        <property name="password">123456</property>

        <!-- Pool management settings -->
        <!-- Number of connections acquired in one batch when the pool runs out -->
        <property name="acquireIncrement">5</property>
        <!-- Number of connections created when the pool starts -->
        <property name="initialPoolSize">10</property>
        <!-- Minimum number of connections kept in the pool -->
        <property name="minPoolSize">10</property>
        <!-- Maximum number of connections; the pool never grows beyond this -->
        <property name="maxPoolSize">100</property>
        <!-- Maximum number of PreparedStatements cached for the whole pool -->
        <property name="maxStatements">50</property>
        <!-- Maximum number of cached PreparedStatements per connection -->
        <property name="maxStatementsPerConnection">2</property>
    </named-config>
</c3p0-config>
DBCP: dbcp.properties
driverClassName = com.mysql.cj.jdbc.Driver
url = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
username = root
password = 123456
Druid: druid.properties (the keys are essentially the same as DBCP's, so the two files look alike)
driverClassName = com.mysql.cj.jdbc.Driver
url = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
username = root
password = 123456
Hikari: hikari.properties
jdbcUrl = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
dataSource.user = root
dataSource.password = 123456
# Enable the prepared-statement cache
dataSource.cachePrepStmts = true
# Number of prepared statements cached per connection
dataSource.prepStmtCacheSize = 256
# Maximum length (in characters) of a SQL statement that will be cached
dataSource.prepStmtCacheSqlLimit = 512
The complete utility class:
package cn.dai.util;

import com.alibaba.druid.pool.DruidDataSourceFactory;
import com.mchange.v2.c3p0.ComboPooledDataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.apache.commons.dbcp2.BasicDataSourceFactory;

import javax.sql.DataSource;
import java.io.InputStream;
import java.sql.*;
import java.util.Properties;

/**
 * @author ArkD42
 * @file Jdbc
 * @create 2020 - 04 - 24 - 22:04
 */
public class CompleteJdbcUtils {

    private CompleteJdbcUtils() {}

    //private static String driverClass;
    private static String url;
    private static String user;
    private static String password;

    private static DataSource dataSourceFromDBCP;
    private static DataSource dataSourceFromDruid;
    private static final DataSource dataSourceFromC3P0 =
            new ComboPooledDataSource("c3p0_xml_config");
    private static final DataSource dataSourceFromHikari =
            new HikariDataSource(new HikariConfig("src/main/resources/hikari.properties"));

    static {
        InputStream originalJdbcStream = CompleteJdbcUtils.class.getClassLoader().getResourceAsStream("jdbc.properties");
        Properties originalJdbcProperties = new Properties();

        InputStream dbcpStream = CompleteJdbcUtils.class.getClassLoader().getResourceAsStream("dbcp.properties");
        Properties dbcpProperties = new Properties();

        InputStream druidStream = CompleteJdbcUtils.class.getClassLoader().getResourceAsStream("druid.properties");
        Properties druidProperties = new Properties();

        try {
            // Plain JDBC settings (driver registration is automatic with JDBC 4+ drivers)
            originalJdbcProperties.load(originalJdbcStream);
            //driverClass = originalJdbcProperties.getProperty("driverClass");
            url = originalJdbcProperties.getProperty("url");
            user = originalJdbcProperties.getProperty("user");
            password = originalJdbcProperties.getProperty("password");
            //Class.forName(driverClass);

            // DBCP and Druid pools are built from their properties files
            dbcpProperties.load(dbcpStream);
            dataSourceFromDBCP = BasicDataSourceFactory.createDataSource(dbcpProperties);

            druidProperties.load(druidStream);
            dataSourceFromDruid = DruidDataSourceFactory.createDataSource(druidProperties);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Plain JDBC
    public static Connection getConnectionByOriginalJdbc() {
        try {
            return DriverManager.getConnection(url, user, password);
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // C3P0
    public static Connection getConnectionByC3P0() {
        try {
            return dataSourceFromC3P0.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // DBCP
    public static Connection getConnectionByDBCP() {
        try {
            return dataSourceFromDBCP.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // Druid
    public static Connection getConnectionByDruid() {
        try {
            return dataSourceFromDruid.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // Hikari
    public static Connection getConnectionByHikari() {
        try {
            return dataSourceFromHikari.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // Release resources
    public static void releaseResource(Connection connection, PreparedStatement preparedStatement, ResultSet resultSet) {
        try {
            if (resultSet != null) resultSet.close();
            if (preparedStatement != null) preparedStatement.close();
            if (connection != null) connection.close();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
    }
}
Test:
@Test
public void te5() throws SQLException {
    Connection connectionByOriginalJdbc = CompleteJdbcUtils.getConnectionByOriginalJdbc();
    Connection connectionByC3P0 = CompleteJdbcUtils.getConnectionByC3P0();
    Connection connectionByDBCP = CompleteJdbcUtils.getConnectionByDBCP();
    Connection connectionByDruid = CompleteJdbcUtils.getConnectionByDruid();
    Connection connectionByHikari = CompleteJdbcUtils.getConnectionByHikari();

    Connection[] connections = new Connection[]{
            connectionByOriginalJdbc,
            connectionByC3P0,
            connectionByDBCP,
            connectionByDruid,
            connectionByHikari
    };

    for (Connection connection : connections) {
        System.out.println(connection);
        connection.close();
    }
}
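One design note: for the pooled DataSources, connection.close() only returns the connection to its pool. When the application itself shuts down, the pools should be closed as well. A hedged sketch of a method that could be added to CompleteJdbcUtils (the method name shutdownPools is made up here, and it assumes the extra imports com.alibaba.druid.pool.DruidDataSource and org.apache.commons.dbcp2.BasicDataSource):
// Hypothetical addition to CompleteJdbcUtils: close every pool on application shutdown.
public static void shutdownPools() {
    ((HikariDataSource) dataSourceFromHikari).close();    // HikariDataSource implements Closeable
    ((ComboPooledDataSource) dataSourceFromC3P0).close();  // C3P0 pool
    ((DruidDataSource) dataSourceFromDruid).close();       // Druid pool
    try {
        ((BasicDataSource) dataSourceFromDBCP).close();    // DBCP: close() declares SQLException
    } catch (SQLException sqlException) {
        sqlException.printStackTrace();
    }
}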

Some common helper methods that round out the complete utility class (they also need java.lang.reflect.Field, java.text.SimpleDateFormat, java.text.ParseException, and the java.util collection imports):
- Insert/update/delete: update
- Query a single row: queryOne
- Query multiple rows: queryToList
- Helpers for injecting statement parameters
// Obtain a PreparedStatement for the given SQL
public static PreparedStatement getPreparedStatement(Connection connection, String sql) {
    try {
        return connection.prepareStatement(sql);
    } catch (SQLException sqlException) {
        sqlException.printStackTrace();
    }
    return null;
}

// Inject the arguments into the statement's placeholders
public static void argumentsInject(PreparedStatement preparedStatement, Object[] args) {
    for (int i = 0; i < args.length; i++) {
        try {
            preparedStatement.setObject(i + 1, args[i]);
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
    }
}

// Convert a date string such as "1987-09-01" into java.sql.Date
public static java.sql.Date parseToSqlDate(String patternTime) {
    SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyy-MM-dd");
    java.util.Date date = null;
    try {
        date = simpleDateFormat.parse(patternTime);
    } catch (ParseException e) {
        e.printStackTrace();
    }
    return new java.sql.Date(date.getTime());
}

// INSERT / UPDATE / DELETE
public static int update(Connection connection, String sql, Object[] args) {
    PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
    if (args != null) argumentsInject(preparedStatement, args);
    try {
        return preparedStatement.executeUpdate();
    } catch (SQLException sqlException) {
        sqlException.printStackTrace();
    }
    return 0;
}

// Query returning a list of POJOs (reflection-based)
public static <T> List<T> queryToList(Connection connection, Class<T> tClass, String sql, Object[] args) {
    PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
    if (args != null) argumentsInject(preparedStatement, args);
    ResultSet resultSet = null;
    try {
        resultSet = preparedStatement.executeQuery();
        ResultSetMetaData metaData = resultSet.getMetaData();
        int columnCount = metaData.getColumnCount();
        List<T> tList = new ArrayList<T>();
        while (resultSet.next()) {
            T t = tClass.newInstance();
            for (int i = 0; i < columnCount; i++) {
                Object columnValue = resultSet.getObject(i + 1);
                // The column label (alias) must match the POJO's field name
                String columnLabel = metaData.getColumnLabel(i + 1);
                Field field = tClass.getDeclaredField(columnLabel);
                field.setAccessible(true);
                field.set(t, columnValue);
            }
            tList.add(t);
        }
        return tList;
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        releaseResource(connection, preparedStatement, resultSet);
    }
    return null;
}

// Query returning a List of Maps (one Map per row)
public static List<Map<String, Object>> queryToList(Connection connection, String sql, Object[] args) {
    PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
    if (args != null) argumentsInject(preparedStatement, args);
    ResultSet resultSet = null;
    try {
        resultSet = preparedStatement.executeQuery();
        ResultSetMetaData metaData = resultSet.getMetaData();
        int columnCount = metaData.getColumnCount();
        List<Map<String, Object>> mapList = new ArrayList<Map<String, Object>>();
        while (resultSet.next()) {
            Map<String, Object> row = new HashMap<String, Object>();
            for (int i = 0; i < columnCount; i++) {
                String columnLabel = metaData.getColumnLabel(i + 1);
                Object columnValue = resultSet.getObject(i + 1);
                row.put(columnLabel, columnValue);
            }
            mapList.add(row);
        }
        return mapList;
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        releaseResource(connection, preparedStatement, resultSet);
    }
    return null;
}

// Query a single row into a POJO (reflection-based)
public static <T> T queryOne(Connection connection, Class<T> tClass, String sql, Object[] args) {
    PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
    argumentsInject(preparedStatement, args);
    ResultSet resultSet = null;
    try {
        resultSet = preparedStatement.executeQuery();
        ResultSetMetaData metaData = resultSet.getMetaData();
        int columnCount = metaData.getColumnCount();
        if (resultSet.next()) {
            T t = tClass.newInstance();
            for (int i = 0; i < columnCount; i++) {
                Object columnValue = resultSet.getObject(i + 1);
                String columnLabel = metaData.getColumnLabel(i + 1);
                Field field = tClass.getDeclaredField(columnLabel);
                field.setAccessible(true);
                field.set(t, columnValue);
            }
            return t;
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        releaseResource(connection, preparedStatement, resultSet);
    }
    return null;
}

// Query a single row into a Map
public static Map<String, Object> queryOne(Connection connection, String sql, Object[] args) {
    PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
    argumentsInject(preparedStatement, args);
    ResultSet resultSet = null;
    try {
        resultSet = preparedStatement.executeQuery();
        ResultSetMetaData metaData = resultSet.getMetaData();
        int columnCount = metaData.getColumnCount();
        if (resultSet.next()) {
            Map<String, Object> map = new HashMap<String, Object>();
            for (int i = 0; i < columnCount; i++) {
                Object columnValue = resultSet.getObject(i + 1);
                String columnLabel = metaData.getColumnLabel(i + 1);
                map.put(columnLabel, columnValue);
            }
            return map;
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        releaseResource(connection, preparedStatement, resultSet);
    }
    return null;
}

// Generic method for single scalar values, e.g. aggregate functions
public static <E> E getValue(Connection connection, String sql, Object[] args) {
    PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
    if (args != null) argumentsInject(preparedStatement, args);
    ResultSet resultSet = null;
    try {
        resultSet = preparedStatement.executeQuery();
        if (resultSet.next()) return (E) resultSet.getObject(1);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        CompleteJdbcUtils.releaseResource(connection, preparedStatement, resultSet);
    }
    return null;
}
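A minimal usage sketch tying these helpers together. The table t_user and its columns (id, user_name, birth) are hypothetical, not part of the original article:
// Hypothetical usage; assumes a table t_user(id, user_name, birth) exists in jdbc_db.
Connection connection = CompleteJdbcUtils.getConnectionByHikari();
int rows = CompleteJdbcUtils.update(connection,
        "INSERT INTO t_user(user_name, birth) VALUES(?, ?)",
        new Object[]{"tom", CompleteJdbcUtils.parseToSqlDate("1987-09-01")});
CompleteJdbcUtils.releaseResource(connection, null, null); // update() does not close the connection itself

// The query helpers release the connection in their finally block, so fetch a fresh one per call.
List<Map<String, Object>> userRows = CompleteJdbcUtils.queryToList(
        CompleteJdbcUtils.getConnectionByHikari(),
        "SELECT id, user_name, birth FROM t_user WHERE id < ?",
        new Object[]{100});
System.out.println(rows + " row(s) inserted, " + userRows);

Object total = CompleteJdbcUtils.getValue(
        CompleteJdbcUtils.getConnectionByHikari(),
        "SELECT COUNT(*) FROM t_user", null);
System.out.println(total);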