Spring batch job runs automatically

I’m using Spring Batch to read a CSV file and write it to the DB, triggered from a controller. On starting the application, before I hit the URL from the browser, I see the print statements from my reader during startup. But it doesn’t print them for my processor or writer, which are in separate classes that I have autowired. Is it because the reader is a bean?

I see the print statements from my FlatFileItemReader in the log on application startup, but the print statements for my processor and writer only show up in the console when I hit the controller URL. I’ve tried adding spring.batch.job.enabled=false in the application.properties file, but it doesn’t stop the execution of the reader bean. How can I prevent auto-execution of the reader bean in the SpringBatchConfig class?

SpringBatchConfig class:

@Configuration
@EnableBatchProcessing
public class SpringBatchConfig {
    
    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    
    @Autowired
    private StepBuilderFactory stepBuilderFactory;
    
    @Autowired
    private DataSource dataSource;
    
    @Autowired
    private DBWriter writer1;
    
    @Autowired
    private Processor processor1;
    
    //Step 1 - CSV to DB
    @Bean
    public FlatFileItemReader<User> itemReader() {

        FlatFileItemReader<User> flatFileItemReader = new FlatFileItemReader<>();
        flatFileItemReader.setResource(new FileSystemResource("src/main/resources/users.csv"));
        flatFileItemReader.setName("CSV-Reader");
        flatFileItemReader.setLinesToSkip(1);
        flatFileItemReader.setLineMapper(lineMapper());
        System.out.println("inside file reader 1 !!!!!");
        return flatFileItemReader;
    }

    @Bean
    public LineMapper<User> lineMapper() {

        DefaultLineMapper<User> defaultLineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();

        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[]{"id", "name", "dept", "salary"});

        BeanWrapperFieldSetMapper<User> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(User.class);

        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(fieldSetMapper);

        return defaultLineMapper;
    }

    @Bean
    public Step step1() throws Exception{   // Step 1 - Read CSV and Write to DB
        return stepBuilderFactory.get("step1")
                .<User,User>chunk(100)
                .reader(itemReader())
                .processor(processor1)
                .writer(writer1)
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return this.jobBuilderFactory.get("BATCH JOB")
                .incrementer(new RunIdIncrementer())
                .start(step1())
                .build();
    }
}

DBWriter class:

@Component
public class DBWriter implements ItemWriter<User> {

    @Autowired
    private UserRepository userRepository;

    @Override
    public void write(List<? extends User> users) throws Exception {
    System.out.println("Inside DB Writer");
        System.out.println("Data Saved for Users: " + users);
            userRepository.save(users);
    }
}

Processor class:

@Component
public class Processor implements ItemProcessor<User, User> {

    private static final Map<String, String> DEPT_NAMES = new HashMap<>();

    public Processor() {
        DEPT_NAMES.put("001", "Technology");
        DEPT_NAMES.put("002", "Operations");
        DEPT_NAMES.put("003", "Accounts");
    }

    @Override
    public User process(User user) throws Exception {
        String deptCode = user.getDept();
        String dept = DEPT_NAMES.get(deptCode);
        user.setDept(dept);
        user.setTime(new Date());
        System.out.println(String.format("Converted from [%s] to [%s]", deptCode, dept));
        return user;
    }
}

Controller Class:

@RestController
@RequestMapping("/load")
public class LoadController {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job job;

    @GetMapping("/users")
    public BatchStatus load() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {


        Map<String, JobParameter> maps = new HashMap<>();
        maps.put("time", new JobParameter(System.currentTimeMillis()));
        JobParameters parameters = new JobParameters(maps);
        JobExecution jobExecution = jobLauncher.run(job, parameters);

        System.out.println("JobExecution: " + jobExecution.getStatus());

        System.out.println("Batch is Running...");
        while (jobExecution.isRunning()) {
            System.out.println("...");
        }

        return jobExecution.getStatus();
    }
}


Answer

The spring.batch.job.enabled=false property prevents Spring Boot from launching configured jobs automatically at application startup; it does not control when beans are created.
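
In application.properties that is simply:

# disables launching configured jobs automatically at startup
spring.batch.job.enabled=false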

The @Bean method that creates the reader is still called at configuration time, when Spring builds the application context, so it’s normal that you see its print statement at startup. That does not mean the reader was invoked inside a running job.
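
If you want the reader’s print statement to appear only when the job actually runs, one option (a sketch against the same itemReader() shown in the question, not part of the original answer) is to make the reader step-scoped with @StepScope, so Spring defers creating the actual bean until step1 starts executing:

    //Step 1 - CSV to DB
    @Bean
    @StepScope   // org.springframework.batch.core.configuration.annotation.StepScope
    public FlatFileItemReader<User> itemReader() {

        FlatFileItemReader<User> flatFileItemReader = new FlatFileItemReader<>();
        flatFileItemReader.setResource(new FileSystemResource("src/main/resources/users.csv"));
        flatFileItemReader.setName("CSV-Reader");
        flatFileItemReader.setLinesToSkip(1);
        flatFileItemReader.setLineMapper(lineMapper());
        // With @StepScope this prints when step1 runs, not at application startup
        System.out.println("inside file reader 1 !!!!!");
        return flatFileItemReader;
    }

Alternatively, just move the System.out.println out of the bean definition method: bean creation happens once at context startup, while actual reading happens only while the job is running.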
