Spring Cloud Data Flow allows you to create, configure, and launch a simple single-step Spring Batch job application without writing any code. Here you will learn how to configure a single-step batch job.
The single-step batch job is composed of one item reader and one writer:
- An item reader provides data from different types of input.
- An item writer performs the inverse operation: it writes data out to a given destination rather than reading it in.
The single-step batch job provides four different types of readers:
- Flat File
- JDBC
- AMQP
- Kafka
Similarly, it offers four different types of writers:
- Flat File
- JDBC
- AMQP
- Kafka
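When the job is launched, the chosen reader and writer types translate into properties under the `spring.batch.job` prefix. As a sketch only, a flat-file reader paired with a JDBC writer might be configured as follows; the job-level keys come from the launch steps later in this guide, while the reader and writer keys are assumptions modeled on the flat-file reader and JDBC writer that you should verify against the drop-downs shown in the UI for your version:

```properties
# Job-level settings (these three keys appear in the launch steps below).
spring.batch.job.job-name=jobA
spring.batch.job.step-name=step1
spring.batch.job.chunk-size=5

# The reader/writer keys below are assumptions; confirm the exact key names
# in the Reader properties and Writer properties drop-downs in the UI.
spring.batch.job.flatfileitemreader.resource=file:///data/input.csv
spring.batch.job.flatfileitemreader.names=first_name,last_name
spring.batch.job.jdbcitemwriter.sql=INSERT INTO people (first_name, last_name) VALUES (:first_name, :last_name)
```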
Register single-step batch job application
Begin by registering the single-step batch job application. In the Spring Cloud Data Flow UI, click Applications on the left side menu. This will open the Applications page.
To register an application, click Add Application(s). When the Add Application(s) page appears, select Register one or more applications.
Fill in the registration form with the application's name, type, and URI, and click Import Application(s):
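If you prefer the Data Flow shell to the UI, the same registration can be done with the `app register` command. The Maven coordinates below are a placeholder assumption; substitute the URI of the actual single-step batch job artifact you intend to use:

```shell
# Register the single-step batch job as a task application.
# The --uri value is a placeholder; point it at the real artifact.
dataflow:> app register --name singlestepbatchjob --type task \
    --uri maven://io.spring:single-step-batch-job:VERSION
```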
Create task definition
To create a task in the Spring Cloud Data Flow UI:
- Select Tasks from the left navigation bar.
- Select Create task(s). This opens a graphical editor that you can use to compose tasks. The initial canvas contains `START` and `END` nodes. The left of the canvas lists the available task applications, including `singlestepbatchjob`, which was registered in the previous section.
- Drag the `singlestepbatchjob` task application to the canvas.
- Connect the task to the START and END nodes to complete the task definition.
- Click Create Task. You will be prompted to name the task definition, which is the logical name for the runtime configuration that you want to deploy. In this case, use the same name as the task application: `singlestepbatchjob`.
- Click Create Task. The main Tasks view appears.
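The same task definition can also be created from the Data Flow shell, assuming the application was registered under the name used above:

```shell
# Create a task definition whose definition is just the registered app name.
dataflow:> task create singlestepbatchjob --definition "singlestepbatchjob"
```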
Launch single-step batch job application
You can launch the single-step batch job from the Task UI.
To launch the task:
- Click the option control on the row of the `singlestepbatchjob` definition, and select Launch. This opens a form where you can add command line arguments and deployment properties.
- Click Edit under the Application Properties section of the Launch page.
- Click App Properties, and then select `spring.batch.job` from the drop-down menu. Enter:
  - chunk-size: The number of records to process before committing a transaction
  - step-name: The name of the step associated with the job
  - job-name: The name of the job to be processed
- Fill in the reader properties:
  - Click Reader properties.
  - Select the reader type (File, AMQP, JDBC, or Kafka).
  - Click the properties drop-down menu and populate the properties that determine how to read from the input resource.
- Fill in the writer properties:
  - Click Writer properties.
  - Select the writer type (File, AMQP, JDBC, or Kafka).
  - Click the properties drop-down menu and populate the properties that determine how to write to the output resource.
- Click Launch task. This runs the task on the Data Flow server's task platform and records a new task execution. You can track the execution progress by using the Task Progress Indicator.
- When the task is complete, check the status of the job by selecting Job executions on the left side of the page.
- Select the execution ID of the task that you just launched to review the status of the job execution.
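The launch and the status check can also be scripted from the Data Flow shell. The `--properties` values below pass the same `spring.batch.job` keys entered in the UI, prefixed with `app.singlestepbatchjob.` so they target the task application; treat this as a sketch and verify the property names against your version:

```shell
# Launch the task with the job-level properties from the steps above.
dataflow:> task launch singlestepbatchjob --properties "app.singlestepbatchjob.spring.batch.job.job-name=jobA,app.singlestepbatchjob.spring.batch.job.step-name=step1,app.singlestepbatchjob.spring.batch.job.chunk-size=5"

# Once the task completes, inspect the task and job executions.
dataflow:> task execution list
dataflow:> job execution list
```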