From the course: Building High-Throughput Data Microservices

Spring Batch with Apache Kafka throughput tuning

- [Instructor] In this video, I'll show you another Spring Batch application that reads the payment transactions, but this time we'll publish the records to Apache Kafka. I'll open up the application code in IntelliJ. It reads the same CSV file containing the two million payment records that we're going to publish to a Kafka topic. Just as before, the batch config object in this project contains most of the code. We're using the same ItemReader to read the CSV file from a particular location and convert each row into a payment object. This time, our ItemWriter will be the KafkaItemWriter, which is another open source writer. It receives the chunk of records read by Spring Batch and sends each of them as a message to Kafka. Now, you should know that Kafka events have a key and a value structure. So in this case, I not only need to pass the value as the payment details, but I also need to provide a key. And the key is going to…
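A minimal sketch of the kind of batch configuration described here is shown below, assuming Spring Batch 5 and Spring for Apache Kafka. The Payment record, CSV path and column names, topic name, chunk size, and the choice of the payment id as the Kafka key are all illustrative assumptions, not the course's actual code.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.kafka.KafkaItemWriter;
import org.springframework.batch.item.kafka.builder.KafkaItemWriterBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchConfig {

    // Hypothetical payment record matching the assumed CSV columns.
    public record Payment(String id, double amount, String currency) {}

    // Reads each CSV row and maps it to a Payment object.
    @Bean
    public FlatFileItemReader<Payment> paymentReader() {
        return new FlatFileItemReaderBuilder<Payment>()
                .name("paymentReader")
                .resource(new FileSystemResource("data/payments.csv")) // assumed path
                .delimited()
                .names("id", "amount", "currency")                     // assumed columns
                .targetType(Payment.class)
                .build();
    }

    // Publishes each Payment to Kafka; itemKeyMapper supplies the event key.
    // Assumes a KafkaTemplate bean configured with a JSON value serializer.
    @Bean
    public KafkaItemWriter<String, Payment> paymentWriter(KafkaTemplate<String, Payment> kafkaTemplate) {
        kafkaTemplate.setDefaultTopic("payments"); // assumed topic name
        return new KafkaItemWriterBuilder<String, Payment>()
                .kafkaTemplate(kafkaTemplate)
                .itemKeyMapper(Payment::id)        // key = payment id (assumption)
                .delete(false)                     // send values, not tombstones
                .build();
    }

    @Bean
    public Step publishPaymentsStep(JobRepository jobRepository,
                                    PlatformTransactionManager txManager,
                                    FlatFileItemReader<Payment> reader,
                                    KafkaItemWriter<String, Payment> writer) {
        return new StepBuilder("publishPaymentsStep", jobRepository)
                .<Payment, Payment>chunk(1000, txManager) // chunk size is a throughput knob
                .reader(reader)
                .writer(writer)
                .build();
    }

    @Bean
    public Job publishPaymentsJob(JobRepository jobRepository, Step publishPaymentsStep) {
        return new JobBuilder("publishPaymentsJob", jobRepository)
                .start(publishPaymentsStep)
                .build();
    }
}
```

For throughput tuning of a setup like this, the usual levers are the step's chunk size and the Kafka producer's batching settings such as batch.size and linger.ms, though the exact values depend on the workload.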
