In this exercise, you will use the Producer class to write events to a Kafka topic.
Open a terminal window and navigate to the kafka-python directory that you created in the previous exercise.
If you are not currently using the kafka-env environment that was created in the last exercise, switch to it with the following command:
source kafka-env/bin/activate
Create a file called producer.py.
Open producer.py in an IDE of your choice.
Add the following import statements to the top of the producer.py file:
from confluent_kafka import Producer
from config import config
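The config module here is the one you created in the previous exercise; its config dictionary holds the client properties that will be passed to the Producer. As a rough reminder of its shape, a minimal config.py for a local broker might look like the following, though your existing file, with its own bootstrap servers and any credentials, is the one you should actually use:

# config.py -- minimal sketch for a local broker; your file from the
# previous exercise may instead point at Confluent Cloud and include credentials.
config = {
    'bootstrap.servers': 'localhost:9092',  # assumption: a broker running locally
    'client.id': 'hello-producer'           # optional; identifies this client in broker logs
}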
Create a function called callback() that can be passed to the produce() method.
def callback(err, event):
    if err:
        print(f'Produce to topic {event.topic()} failed for event: {event.key()}')
    else:
        val = event.value().decode('utf8')
        print(f'{val} sent to partition {event.partition()}.')
Create a function called say_hello() that takes a producer and a key.
def say_hello(producer, key):
    value = f'Hello {key}!'
    producer.produce('hello_topic', value, key, on_delivery=callback)
Add the following main block to pull it all together:
if __name__ == '__main__':
    producer = Producer(config)
    keys = ['Amy', 'Brenda', 'Cindy', 'Derrick', 'Elaine', 'Fred']
    [say_hello(producer, key) for key in keys]
    producer.flush()
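The final call to flush() blocks until every buffered event has been delivered, and it also serves the delivery callbacks, which is why all of the callback() output appears before the program exits. For a short script like this, flush() alone is enough; in a longer-running producer you would typically also call poll() inside the produce loop so callbacks fire as deliveries complete. A rough sketch of that pattern, reusing the say_hello() and keys defined above (not required for this exercise):

for key in keys:
    say_hello(producer, key)
    producer.poll(0)  # non-blocking; invokes callback() for any completed deliveries

producer.flush()      # drain whatever is still buffered before exiting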
Execute the program by running the following command:
python producer.py
Notice how the different names, which we are using as keys, result in specific partition assignments. To get a better idea of how this works, try changing some of the names and see how the partition assignments change.
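Under the hood, the producer hashes each event's key and maps that hash onto one of the topic's partitions, so the same key always lands on the same partition (as long as the partition count does not change). The simplified sketch below illustrates the idea; it is not the exact hash librdkafka uses, and the six-partition count is only an assumption for illustration:

import zlib

num_partitions = 6  # assumption for illustration; use your topic's actual partition count

for key in ['Amy', 'Brenda', 'Cindy', 'Derrick', 'Elaine', 'Fred']:
    partition = zlib.crc32(key.encode('utf8')) % num_partitions
    print(f'{key} -> partition {partition}')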
For a good explanation of partitions and partition assignments, please refer to the Kafka 101 course on Confluent Developer.