Spring Boot + Spring Batch + Hibernate + Quartz: a simple batch job that reads a file and writes to a database
This sample application integrates Spring Boot, Spring Batch, Spring Data JPA, Hibernate, Quartz, and H2. The complete code is shared on GitHub at https://github.com/birdstudiocn/spring-sample
The program simply reads records from a file in batch and saves them to the database; a Quartz scheduled job triggers the import every 20 seconds. The functionality is deliberately minimal and is only meant as a skeleton to build on.
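The post lists the stack but not the build file; a minimal Maven dependency sketch, assuming Spring Boot 2.x with its standard starters and versions managed by the Spring Boot parent (the pom.xml in the GitHub repository above is the authoritative version), might look like this:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-batch</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-quartz</artifactId>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>runtime</scope>
</dependency>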
First, the main class QuartzApplication.java:
package cn.birdstudio;

import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

/**
 * @author Sam Zhang
 */
@SpringBootApplication
public class QuartzApplication {

    public static void main(String[] args) {
        SpringApplication.run(QuartzApplication.class, args);
    }

    @Bean
    public JobDetail simpleJobDetail() {
        return JobBuilder.newJob(QuartzJob.class).withIdentity("simpleJob").storeDurably().build();
    }

    @Bean
    public Trigger simpleJobTrigger() {
        CronScheduleBuilder scheduleBuilder = CronScheduleBuilder.cronSchedule("0/20 * * * * ?");
        return TriggerBuilder.newTrigger().forJob(simpleJobDetail()).withIdentity("simpleTrigger")
                .withSchedule(scheduleBuilder).build();
    }
}
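In simpleJobTrigger(), the cron expression "0/20 * * * * ?" fires at seconds 0, 20 and 40 of every minute, which gives the 20-second cadence mentioned above. The application.properties file is not shown in the post; a hedged sketch of the settings this kind of setup typically relies on — an in-memory H2 datasource, Hibernate schema generation, and disabling Spring Batch's run-at-startup behaviour so the job is launched only by the Quartz trigger — could be (property names assume Spring Boot 1.x/2.x; check the repository for the real configuration):

# in-memory H2 datasource (assumed)
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driver-class-name=org.h2.Driver
# let Hibernate create/update the schema for the User entity
spring.jpa.hibernate.ddl-auto=update
# do not run batch jobs at startup; the Quartz trigger launches them every 20 seconds
spring.batch.job.enabled=false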
The Quartz job class QuartzJob.java:
package cn.birdstudio;

import javax.annotation.Resource;

import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.JobParametersInvalidException;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobExecutionAlreadyRunningException;
import org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException;
import org.springframework.batch.core.repository.JobRestartException;
import org.springframework.scheduling.quartz.QuartzJobBean;

import cn.birdstudio.service.UserService;

/**
 * @author Sam Zhang
 */
public class QuartzJob extends QuartzJobBean {

    private static final Logger logger = LoggerFactory.getLogger(QuartzJob.class);

    @Resource
    JobLauncher jobLauncher;

    @Resource(name = "importJob")
    Job job;

    @Resource
    private UserService qdParaWayinfoService;

    /**
     * Load the file data into the database in batch.
     */
    @Override
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        logger.info("start COMS daemon");
        try {
            JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
            jobParametersBuilder.addLong("run.id", System.currentTimeMillis());
            JobParameters jobParameters = jobParametersBuilder.toJobParameters();
            jobLauncher.run(job, jobParameters);
        } catch (JobExecutionAlreadyRunningException | JobRestartException | JobInstanceAlreadyCompleteException
                | JobParametersInvalidException e) {
            logger.error("failed to launch batch job", e);
        }
    }
}
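The batch configuration below maps each line of the input file to a cn.birdstudio.domain.User object and persists it with JPA. The entity class is not shown in the post; a minimal sketch, assuming an auto-generated id plus the two mapped fields (the real class lives in the GitHub repository), could look like this:

package cn.birdstudio.domain;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class User {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // first field of each line in user.txt
    private String name;

    // second field of each line in user.txt
    private String gender;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getGender() { return gender; }
    public void setGender(String gender) { this.gender = gender; }
}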
The batch configuration class BatchConfiguration.java reads a file and stores its field contents in the database. It consists of three main parts: reading the data, processing the data, and writing the data.
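The input file itself is not included in the post. Given the pipe-delimited tokenizer configured below, which maps the two fields of each line to name and gender, a hypothetical src/main/resources/user.txt could look like this:

Sam|male
Alice|female
Bob|male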
package cn.birdstudio.batch;

import javax.persistence.EntityManagerFactory;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.database.JpaItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

import cn.birdstudio.domain.User;

/**
 * Loads data into the database in three main parts: read, process, write.
 *
 * @author Sam Zhang
 */
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
public class BatchConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(BatchConfiguration.class);

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    /**
     * 1. Read the data.
     */
    @Bean(name = "reader1")
    @StepScope
    public ItemReader<User> reader() {
        logger.info("read txt");
        ClassPathResource pathResource = new ClassPathResource("user.txt");
        FlatFileItemReader<User> reader = new FlatFileItemReader<>();
        reader.setResource(pathResource);
        reader.setLineMapper(new DefaultLineMapper<User>() {
            {
                setLineTokenizer(new DelimitedLineTokenizer("|") {
                    {
                        setNames(new String[] { "name", "gender" });
                    }
                });
                setFieldSetMapper(new BeanWrapperFieldSetMapper<User>() {
                    {
                        setTargetType(User.class);
                    }
                });
            }
        });
        reader.open(new ExecutionContext());
        return reader;
    }

    /**
     * 2. Process the data.
     */
    @Bean(name = "processor1")
    @StepScope
    public SampleItemProcessor processor() {
        return new SampleItemProcessor();
    }

    /**
     * 3. Write the data.
     */
    @Bean(name = "writer1")
    @StepScope
    public ItemWriter<User> writer(EntityManagerFactory entityManagerFactory) {
        logger.info("write data in database");
        JpaItemWriter<User> writer = new JpaItemWriter<>();
        writer.setEntityManagerFactory(entityManagerFactory);
        return writer;
    }

    @Bean
    public Job importJob(@Qualifier("step1") Step s1, JobExecutionListener listener) {
        return jobs.get("importJob").incrementer(new RunIdIncrementer()).listener(listener).flow(s1).end().build();
    }

    @Bean
    public Step step1(@Qualifier("reader1") ItemReader<User> reader, @Qualifier("writer1") ItemWriter<User> writer,
            @Qualifier("processor1") ItemProcessor<User, User> processor, JobExecutionListener listener) {
        return steps.get("step1").<User, User>chunk(10).reader(reader).processor(processor).writer(writer).build();
    }
}
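The SampleItemProcessor returned by the processor1 bean is not shown in the post. Since step1 injects it as an ItemProcessor<User, User>, a minimal pass-through sketch (the actual class in the repository may do more) could be:

package cn.birdstudio.batch;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.item.ItemProcessor;

import cn.birdstudio.domain.User;

public class SampleItemProcessor implements ItemProcessor<User, User> {

    private static final Logger logger = LoggerFactory.getLogger(SampleItemProcessor.class);

    @Override
    public User process(User user) throws Exception {
        // pass the record through unchanged; real validation or transformation would go here
        logger.info("processing user {}", user.getName());
        return user;
    }
}

Note also that both importJob and step1 inject a JobExecutionListener, so the application context must provide such a bean (for example a JobExecutionListenerSupport subclass that logs when the job completes); it is not shown in the post, and the GitHub repository contains the actual implementation.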