In this exercise, you will access and view your Confluent Cloud audit logs. We'll use the Command Line Interface (CLI), but you can use any Kafka client to consume audit logs as long as the client supports SASL authentication. At the end of the guide, we'll look at a sample configuration Confluent Cloud provides for connecting Java and C/C++ consumers.
To access the audit log information, make sure that you have been granted the OrganizationAdmin role.
Open a terminal and log in to your Confluent Cloud organization.
confluent login
Run the confluent audit-log describe command to identify which resources to use. The following example shows the audit log information provided for a sample cluster:
confluent audit-log describe
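The exact layout depends on your CLI version, but the output is a small table along these lines (the IDs shown are placeholders):
+-----------------+----------------------------+
| Cluster         | lkc-xxxxxx                 |
| Environment     | env-xxxxx                  |
| Service Account | sa-xxxxxx                  |
| Topic Name      | confluent-audit-log-events |
+-----------------+----------------------------+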
Be sure to remember the Cluster ID and Environment ID.
Specify the environment and cluster to use by running the confluent environment use and confluent kafka cluster use commands. The environment and cluster IDs are available from the data retrieved in the previous step.
confluent environment use <ENVIRONMENT-ID>
confluent kafka cluster use <CLUSTER-ID>
If you have an existing service account API key and secret that you would like to use, you can store it locally using the confluent api-key store command. Use the following format,
confluent api-key store <API-KEY> --resource <CLUSTER-ID>
where you replace <API-KEY> and <CLUSTER-ID> with your unique values.
If you aren't sure whether you have a service account API key in use, you can list all of the API keys for your audit log cluster by running the confluent api-key list command, appending --resource with your unique cluster ID. It looks something like this:
confluent api-key list --resource <CLUSTER-ID>
There is a limit of two API keys per audit log cluster. If you need to delete an existing API key for your audit log cluster, you can run the following command:
confluent api-key delete <API-KEY>
Since this is a new cluster, we'll go ahead and create a new API key and secret by running the confluent api-key create command with the --service-account and --resource options. Copy the Service Account and Cluster information from the confluent audit-log describe command we ran earlier.
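Using those values in place of the placeholders, the command looks like this:
confluent api-key create --service-account <SERVICE-ACCOUNT-ID> --resource <CLUSTER-ID>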
Remember to save the API key and secret to a safe location because you will not be able to retrieve it later.
Now let's tell the CLI to use our new API key by running the confluent api-key use command, adding --resource at the end.
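With our placeholders, that command is:
confluent api-key use <API-KEY> --resource <CLUSTER-ID>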
Now let's view some of the recent logs by running the CLI consumer, saving them to a file named audit-events.json:
confluent kafka topic consume confluent-audit-log-events > audit-events.json
This will save all audit logs to the audit-events.json file until you stop the command.
Stop the command by pressing Ctrl-C.
You can now take a look at the audit-events.json file and see all the logs that were saved.
Your output should look similar to the example below.
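Exact contents vary by event type, but each record is a single-line JSON object in the CloudEvents format that Confluent Cloud audit logs use. The abridged record below is purely illustrative, with every value a placeholder, and real events carry additional fields:
{"specversion":"1.0","id":"<EVENT-ID>","source":"crn://confluent.cloud/kafka=<CLUSTER-ID>","type":"io.confluent.kafka.server/authorization","time":"<TIMESTAMP>","data":{"methodName":"kafka.CreateTopics","authenticationInfo":{"principal":"User:<USER-ID>"},"authorizationInfo":{"granted":true,"operation":"Create","resourceType":"Topic","resourceName":"<TOPIC-NAME>"}}}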
Now, let's take this exercise one step further by filtering a specific log and formatting it to make things more readable.
Log in to Confluent Cloud and go into the cluster you created. Ours is the SecurityCourse environment with a Purchases cluster. Once there, we’re going to create a new topic by clicking on Topics on the left side of the screen.
Click the + Add topic button, name your topic, and click Create with defaults.
Next, we'll start the consumer we created before, exporting the logs to the same file, audit-events.json. Then we'll delete the topic from the Confluent Cloud web interface and stop our console consumer.
Start the consumer:
confluent kafka topic consume confluent-audit-log-events > audit-events.json
Delete the topic by clicking on your newly created topic and selecting the Configuration tab.
Click on Delete topic, enter the name of the topic in the dialog box that appears, and click Continue.
Go back to your CLI and stop the consumer by pressing Ctrl-C.
To filter and format our logs so they're more readable, we'll use the jq command-line JSON processor. If you don't already have jq, install it to follow along, or use another tool to format your JSON.
Run the following command to export our filtered and formatted log to a .txt file:
cat audit-events.json | grep 'kafka.DeleteTopics' | jq '.' > deletedTopic.txt
You should see something similar to the example below in your output.
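As before, this is an abridged, illustrative record rather than exact output; your CRNs, IDs, and timestamps will differ, and real events include additional fields:
{
  "specversion": "1.0",
  "id": "<EVENT-ID>",
  "source": "crn://confluent.cloud/kafka=<CLUSTER-ID>",
  "type": "io.confluent.kafka.server/authorization",
  "time": "<TIMESTAMP>",
  "data": {
    "methodName": "kafka.DeleteTopics",
    "resourceName": "crn://confluent.cloud/kafka=<CLUSTER-ID>/topic=<TOPIC-NAME>",
    "authenticationInfo": {
      "principal": "User:<USER-ID>"
    },
    "authorizationInfo": {
      "granted": true,
      "operation": "Delete",
      "resourceType": "Topic",
      "resourceName": "<TOPIC-NAME>",
      "patternType": "LITERAL"
    }
  }
}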
You can also obtain the configuration needed to connect Java and C/C++ consumers directly from the Confluent Cloud interface.
Click on the three bars in the upper-right corner of the screen and select Audit log from the menu.
You will now see the instructions for connecting via the CLI, Java, or C/C++ consumers. You can also create an API key that will auto-populate once created, making it as easy as a copy/paste into your consumer of choice.
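For reference, the Java consumer configuration generated there is broadly similar to the standard SASL_SSL client properties below (a sketch with placeholder values, not the exact file Confluent Cloud produces):
bootstrap.servers=<BOOTSTRAP-SERVER>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API-KEY>" password="<API-SECRET>";
group.id=<CONSUMER-GROUP-ID>
A C/C++ (librdkafka) client uses the same values through its security.protocol, sasl.mechanisms, sasl.username, and sasl.password settings.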
That's it! Hopefully, this quick exercise showed you how simple it is to access, read, and consume your Confluent Cloud audit logs.
For more information, be sure to check out the official documentation.