Updated: Jun 10, 2019
DataRow.io includes a Kafka connector. You can transform, ingest, and distribute Kafka messages to multiple destinations in different formats within a single data pipeline. This post focuses on writing Avro-formatted Kafka messages to Cassandra. The steps below walk you through building such a simple two-step pipeline.
1. First, open the DataRow Platform and create a new job, then modify the job settings
2. Locate the Kafka Reader activity as shown below
3. Drag the Kafka Reader activity into the designer and click the settings icon
4. Enter the Kafka server/broker details, including the Kafka topics to read from
5. In the toolbar, locate the Cassandra Writer activity as shown below
6. Drag the Cassandra Writer activity into the designer as a sub-activity of the Kafka Reader activity and click the settings icon
7. Enter the Cassandra host settings, including the source view (LiveOrders in this case)
8. Run the job
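To make the two steps concrete, here is a minimal, self-contained sketch of what the pipeline does to each message. It uses only the Python standard library, so both ends are stubbed: in a real deployment the reader would consume from Kafka and the writer would execute CQL through a Cassandra driver. The LiveOrders record schema (`order_id: long`, `item: string`) and the keyspace/table names are hypothetical, not DataRow defaults.

```python
def encode_long(n: int) -> bytes:
    """Avro binary encoding of a long: zigzag, then base-128 varint."""
    n = (n << 1) ^ (n >> 63)          # zigzag: small magnitudes -> small codes
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)      # high bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def decode_long(buf: bytes, pos: int):
    """Inverse of encode_long; returns (value, next position)."""
    shift = result = 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            break
        shift += 7
    return (result >> 1) ^ -(result & 1), pos

def encode_order(order_id: int, item: str) -> bytes:
    """Avro record body: fields back to back in schema order, no field tags."""
    data = item.encode("utf-8")
    return encode_long(order_id) + encode_long(len(data)) + data

def decode_order(payload: bytes) -> dict:
    """Step 1 stand-in: what the Kafka Reader does to each Avro message."""
    order_id, pos = decode_long(payload, 0)
    length, pos = decode_long(payload, pos)
    item = payload[pos:pos + length].decode("utf-8")
    return {"order_id": order_id, "item": item}

def to_cql(row: dict) -> tuple:
    """Step 2 stand-in: the statement a Cassandra Writer would execute.
    Keyspace/table names are illustrative only."""
    return ("INSERT INTO shop.live_orders (order_id, item) VALUES (?, ?)",
            (row["order_id"], row["item"]))

# Messages as they would arrive from a Kafka topic (values only).
messages = [encode_order(42, "coffee"), encode_order(43, "bagel")]
rows = [decode_order(m) for m in messages]
statements = [to_cql(r) for r in rows]
```

The same shape holds inside the designer: the Kafka Reader activity handles the decode step, and the nested Cassandra Writer activity handles the insert.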
Note that you can run the same pipeline for messages in JSON and plain-text formats.
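Because only the parsing step depends on the message format, switching the topic from Avro to JSON changes just the decode function; the downstream write is identical. A minimal sketch (field names hypothetical):

```python
import json

def decode_json_order(payload: bytes) -> dict:
    # Produces the same row shape as the Avro path; only the parsing differs.
    return json.loads(payload.decode("utf-8"))

# A JSON message as it would arrive from the Kafka topic.
msg = b'{"order_id": 42, "item": "coffee"}'
row = decode_json_order(msg)
```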
DataRow.io | Big Data Integration | Try it here.