Senior Developer Advocate (Presenter)
Throughout this course, we’ll introduce you to developing Apache Kafka event streaming apps with Python through hands-on exercises that will have you produce data to and consume data from Confluent Cloud. This exercise will get you set up for the exercises that follow.
Note: If you already have a Confluent Cloud account, you can skip this section and proceed to step 8.
If you haven’t already signed up for Confluent Cloud, sign up now so when your first exercise asks you to log in, you are ready to do so.
Create a new environment named learn-kafka-python and click Create. Next, you will be prompted to enable one of the available Stream Governance packages. You need to do so, since the course exercises use Schema Registry, which these packages include.
In the learn-kafka-python environment, create a new cluster named kafka-python and click Launch cluster. Next, you will create the configuration properties needed to connect to your cluster.
Navigate to the Cluster Overview dashboard for the kafka-python cluster.
Click on the Python tile in the Connect to your systems section.
Create a cluster API key and secret by clicking Create Kafka cluster API key.
Review the key and the secret in the pop-up and click Download and continue.
This will download the key and secret to your local machine's Downloads directory.
To create the Schema Registry API key and secret, repeat the process by clicking Create Schema Registry API key.
Review the sample Python client configuration settings. The API keys and secrets should be populated in the properties in the middle of the page. If they are not visible, select the Show API keys option.
Click the Copy button.
Create a file named python.properties located in the ~/.confluent directory.
Note: The .confluent directory is the default location for Confluent Platform related configuration files.
Edit python.properties and paste the previously copied Python client Confluent Cloud properties.
Save python.properties, but leave it open in your IDE.
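Optionally, rather than copying values over by hand in the next steps, a minimal sketch like the following can read python.properties into a Python dictionary. The load_properties helper is just an illustration, and it assumes the downloaded client configuration uses plain key=value lines with # comments:
from pathlib import Path

def load_properties(path):
    """Read simple key=value lines, skipping blanks and # comments."""
    props = {}
    for line in Path(path).expanduser().read_text().splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        props[key.strip()] = value.strip()
    return props

props = load_properties('~/.confluent/python.properties')
print(props.get('bootstrap.servers'))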
Create a config directory named kafka-python for the course exercises and cd into it.
Create a file named config.py that will contain a Python dictionary.
Add the following lines to config.py:
config = {
    'bootstrap.servers': '<bootstrap-server-endpoint>',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanisms': 'PLAIN',
    'sasl.username': '<CLUSTER_API_KEY>',
    'sasl.password': '<CLUSTER_API_SECRET>'
}
Update the values for bootstrap.servers, sasl.username, and sasl.password to the values contained in python.properties.
We'll add more to config.py in later exercises.
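For context on where this dictionary goes, here is a minimal sketch of how a later exercise might pass it to a client; the topic name my_topic is a placeholder and is not created by this setup:
from confluent_kafka import Producer
from config import config

# Create a producer using the Confluent Cloud connection settings from config.py.
producer = Producer(config)

# Produce a test record to a placeholder topic (the topic must already exist
# in your cluster) and wait for delivery to complete.
producer.produce('my_topic', key='hello', value='world')
producer.flush()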
The course exercises depend upon Python 3.X and several related packages being installed. The steps in this prerequisites section satisfy this dependency. Complete the steps as needed on your local machine.
Note: These steps correspond to installing these packages on Ubuntu 20.04. Adjust the steps as needed to suit your local environment.
First, update the package index on your local machine:
sudo apt update
If any of the system packages on your local machine need to be upgraded, run the following command to do so:
sudo apt upgrade
Install python3:
sudo apt install python3
Install pip3:
sudo apt install python3-pip
Identify the default python version:
python --version
The version shown in the command response should be similar to Python 3.X.X. If a non-Python 3 version is returned, you will need to either update your alias for the python command or explicitly run the python3 and pip3 commands during the exercises for this course.
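If you prefer to check from inside the interpreter, a quick sanity check along these lines shows which version the python command resolves to:
import sys

# Print the full interpreter version string and fail fast if it is not Python 3.
print(sys.version)
assert sys.version_info.major == 3, "These exercises require Python 3"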
Install virtualenv:
pip install virtualenv
Use virtualenv to create a virtual environment:
virtualenv kafka-env
Activate the virtual environment:
source kafka-env/bin/activate
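To confirm that the kafka-env environment is actually active, one rough check (relying on sys.prefix pointing inside the environment directory when a virtual environment is in use) is:
import sys

# Inside an active virtual environment, sys.prefix points at the environment
# directory (e.g. .../kafka-env) instead of the system Python installation.
print(sys.prefix)
print("virtual env active:", sys.prefix != getattr(sys, "base_prefix", sys.prefix))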
Install the confluent-kafka package by running the following command:
pip install confluent-kafka
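To verify the installation, you can import the package and print its version; confluent_kafka.version() reports the Python client version and libversion() the bundled librdkafka version, each as a (string, int) tuple:
import confluent_kafka

# Print the client and librdkafka versions to confirm the package imports cleanly.
print("confluent-kafka:", confluent_kafka.version()[0])
print("librdkafka:", confluent_kafka.libversion()[0])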
You’re now ready for the upcoming exercises.