Senior Developer Advocate (Presenter)
In this exercise, we’ll gain experience with Kafka Connect and create a source connector in Confluent Cloud, which will produce data to Apache Kafka. Then, we’ll consume this data from the command line.
From the Confluent Cloud Console, navigate to Data integration and then Connectors.
In the “Connectors” search bar, enter “Datagen” to narrow down the available connectors. Select Datagen Source.
Provide a topic for the connector to produce data into. You can do this directly as part of the connector creation process: select Add a new topic, name it inventory, and select Create with defaults. Then select the new inventory topic and Continue.
Create a new API key for the connector to use for communicating with the Kafka cluster. Select Global Access and Generate API key & Download, then Continue.
The Datagen source connector can auto-generate a number of predefined datasets. Select the “Inventory” template and serialize the messages as JSON.
Select Continue twice to first review the cost of the connector and then to give the connector a name and confirm the configuration choices.
Click Launch to start the connector. It may take a few minutes to provision the connector.
Once the connector is running, consume records from the inventory topic from the command line:

confluent kafka topic consume --from-beginning inventory
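Because we configured the connector to serialize messages as JSON, each consumed record value is a JSON object. As a minimal sketch of how you might deserialize one of these values in an application, here is a Python example; note that the field names in the sample record are illustrative assumptions, not the actual Datagen inventory schema.

```python
import json


def parse_inventory_record(raw_value: str) -> dict:
    """Deserialize one JSON-encoded record value consumed from the topic."""
    record = json.loads(raw_value)
    # Sanity check: a record value should be a JSON object, not a scalar.
    if not isinstance(record, dict):
        raise ValueError("expected a JSON object")
    return record


# Hypothetical record value; the real Datagen inventory fields may differ.
sample = '{"id": 5, "quantity": 100}'
print(parse_inventory_record(sample)["quantity"])  # → 100
```

The same approach applies whichever client library you use to consume: the value arrives as bytes or a string, and json.loads turns it into a native structure you can work with.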
In this exercise, we learned how to quickly get started with Kafka Connect and created a simple data-generating source connector. This is just the tip of the iceberg for Kafka Connect, so I encourage you to check out the Kafka Connect 101 course to really get a feel for what Connect has to offer.