Course: Kafka Connect 101

Hands On: Getting Started with Kafka Connect

4 min
Danica Fine

Senior Developer Advocate (Presenter)

In this exercise, you will create a new topic to hold order event data, and set up a Kafka Connect data generator to populate the topic with sample data.

Confluent Cloud offers dozens of pre-built, fully managed connectors. Use the Amazon CloudWatch Logs or Oracle Database source connectors (among many others!) to stream data into Apache Kafka® or choose from a number of sink connectors to help you move your data into a variety of systems, including BigQuery and Amazon S3. Leveraging these managed connectors is the easiest way to use Kafka Connect to build fully managed data pipelines.

Confluent Cloud

For this course, you will use Confluent Cloud to provide a managed Kafka service, connectors, and stream processing.

  1. Go to Confluent Cloud and create a Confluent Cloud account if you don’t already have one. Otherwise, be sure to log in.
  2. Create a new cluster in Confluent Cloud. For the purposes of this exercise, we’ll use the default configurations and choose the Standard cluster type. Name the cluster kc-101.
  3. When you’re finished with this exercise, don’t forget to delete your connector and topic in order to avoid exhausting your free usage. You can keep the kc-101 cluster since we will be using it in other exercises for this course.

Create a New Topic

  4. From the Topics page of your Confluent Cloud cluster, click on Add topic.

  5. Name the topic orders and ensure that the Number of partitions is set to 6.

  6. Click on Create with defaults.
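If you prefer to script this step, the topic can also be created programmatically. The following is an optional sketch using the confluent-kafka Python client’s AdminClient; the bootstrap server and API key/secret are placeholders you would replace with your own cluster’s values, and it is not part of the UI exercise itself.

# Optional sketch: create the "orders" topic with 6 partitions using the
# confluent-kafka Python client (pip install confluent-kafka).
# <BOOTSTRAP_SERVER>, <API_KEY>, and <API_SECRET> are placeholders for
# your own cluster's values.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

futures = admin.create_topics([NewTopic("orders", num_partitions=6)])
for topic, future in futures.items():
    future.result()  # raises an exception if topic creation failed
    print(f"Created topic {topic}")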

Create a Data Generator with Kafka Connect

In reality, our Kafka topic would probably be populated from an application using the producer API to write messages to it. Here, we’re going to use a data generator that’s available as a connector for Kafka Connect.
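For context, here is a minimal sketch of what such a producing application might look like, using the confluent-kafka Python client. The order payload and credential placeholders are illustrative only and are not part of this exercise.

# Illustrative sketch only: how an application might use the producer API
# to write order events to the orders topic. The order fields below are
# made up; credentials are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

order = {"orderid": 1, "itemid": "Item_42", "orderunits": 3.5}  # hypothetical payload
producer.produce("orders", key=str(order["orderid"]), value=json.dumps(order))
producer.flush()  # block until delivery completes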

  7. In Confluent Cloud, go to your cluster’s Connectors page. In the search box, enter datagen and select the Datagen Source connector.

  8. Under Configuration, click on Orders.

  9. Under Topic selection, click on Add a new topic.

  10. Name the topic orders and ensure that the Number of partitions is set to 6.

  11. Click on Create with defaults.
  12. Upon returning to Topic selection, click on orders and click Continue.

  13. Under Kafka credentials, click on Generate Kafka API key & download.
  14. When the API credentials appear, click Continue. There is no need to save them, as we will not be using them after this exercise.

  15. On the confirmation screen, the JSON should look like this:
{
  "name": "DatagenSourceConnector_0",
  "config": {
    "connector.class": "DatagenSource",
    "name": "DatagenSourceConnector_0",
    "kafka.auth.mode": "KAFKA_API_KEY",
    "kafka.api.key": "****************",
    "kafka.api.secret": "****************************************************************",
    "kafka.topic": "orders",
    "output.data.format": "JSON",
    "quickstart": "ORDERS",
    "tasks.max": "1"
  }
}
  16. Click Launch to provision the connector. This will take a few moments. Once the provisioning is complete, the view will automatically switch to show the connector Overview page.

  17. The Overview page shows the current status of the connector as well as several metrics that reflect its general health.

  18. From the Topics page of your cluster, select the orders topic and then Messages. You should see a steady stream of new messages arriving.
  19. Keep in mind that this Datagen Source Connector is only a jumping-off point for your Kafka Connect journey. As a final step, head on over to the Connectors page and take a look at the other connectors.
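If you would also like to confirm the data from outside the Confluent Cloud UI, one option is a short-lived consumer. This is an optional sketch using the confluent-kafka Python client; the consumer group name and credentials are placeholders.

# Optional sketch: read a few records from the orders topic to confirm
# the connector is producing data. Credentials and the consumer group
# name are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "kc-101-smoke-test",   # hypothetical consumer group name
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    for _ in range(10):                # read up to ten records, then stop
        msg = consumer.poll(5.0)
        if msg is not None and msg.error() is None:
            print(msg.value().decode("utf-8"))
finally:
    consumer.close()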

Use the promo code 101CONNECT to get $25 of free Confluent Cloud usage
