course: Data Mesh 101

Hands On: Creating Your Own Data Mesh

5 min
Danica Fine
Senior Developer Advocate (Presenter)

Rick Spurgeon
Integration Architect (Author)


This hands-on exercise walks you through building and using a data mesh prototype: a web server built on Confluent Cloud services, including the new Data Catalog API for data product management. The prototype also includes a browser-based client application that lets you interact with the data mesh once you complete the steps below.

Set Up Confluent Cloud

If you don't already have an account, go to the Confluent Cloud sign-up page. Enter your name, email address, and password; you'll use these credentials later to log into Confluent Cloud. Select the Start Free button, then watch your inbox for a confirmation email to continue. New accounts receive $400 of free usage within the first 30 days (subject to change). Be sure to also apply the promo code DATAMESH101 for an additional $25 of free cloud usage.

Note

For this hands-on exercise, you do not need to create an environment, Apache Kafka® cluster, or ksqlDB application manually, as the prototype application will provision these resources for you.

Prepare to Build the Data Mesh

First, open a browser to the data mesh prototype GitHub repository, where you can review the source code and documentation:

https://github.com/confluentinc/data-mesh-demo

To build and run the demo, you'll need a few developer tools installed on your system: git to clone the project and make to run the data mesh creation script and build the application. You likely already have these tools, but install any that are missing before proceeding.

Additionally, the data mesh creation script uses the jq JSON processor, and you'll need Docker to run the local web server.
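As a quick sanity check before continuing, you can confirm each prerequisite tool is available on your system. This is an optional sketch; the tool names simply match the prerequisites listed above:

```shell
# Check that each prerequisite tool is on the PATH.
# Prints "found" or "MISSING" for git, make, jq, and docker.
for tool in git make jq docker; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING -- install it before continuing"
  fi
done
```

Any tool reported as MISSING should be installed with your system's package manager before you run the data mesh creation script.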

Finally, you'll need the Confluent CLI installed; it is used to provision the cloud resources that support the data mesh. Note that the CLI must be on your system PATH for the project scripting to work properly. Also ensure the CLI is logged into your account, using the --save flag to prevent logouts while running the prototype:

confluent login --save
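Since the project's scripts look up the CLI by name, you can verify that the binary is actually discoverable on your PATH with something like the following (an optional sketch):

```shell
# Verify the Confluent CLI is on the PATH; if so, print its version.
if command -v confluent >/dev/null 2>&1; then
  confluent version
else
  echo "confluent CLI not found on PATH -- install it and retry"
fi
```

If the CLI is not found, install it per the Confluent documentation and ensure its install directory is included in your PATH before running the steps below.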

Run the Data Mesh

You can run the data mesh prototype either by using a pre-built Docker image or by building and running from source yourself. By default, we will run the application using Docker. If you prefer to build and run from source, please see the README.md in the repository for those instructions.

To run the prototype, clone the source code repository:

git clone https://github.com/confluentinc/data-mesh-demo.git

Then change into the project directory:

cd data-mesh-demo

The project has a script that will bootstrap the entire data mesh environment, including Confluent Cloud resources, a Java Spring Boot-based web server, and a browser-based client application you can use to interact with the mesh. Execute the following command to kick off the mesh creation:

make data-mesh

Note

This command creates real resources in your Confluent Cloud account. To avoid unexpected charges, be sure to destroy these resources after you are done evaluating the data mesh prototype.

The last module in this course provides instructions for destroying all resources.

The script will take about 15 minutes to complete, as it must wait for the various cloud resources to be provisioned and ready. You can continue with the next modules in the course while it is running in the background. We will come back to the prototype in Hands On: Exploring the Data Mesh and Hands On: Creating a Data Product to explore the created data mesh. In the meantime, continue to the next module to learn about data mesh principles.
