Senior Developer Advocate (Presenter)
Integration Architect (Author)
This hands-on exercise walks you through building and using a data mesh prototype: a web server built on Confluent Cloud services, including the new Data Catalog API for data product management. The prototype includes a browser-based client application that lets you interact with the data mesh once you complete the steps below.
If you don't already have an account, go to the Confluent Cloud sign-up page. Enter your name, email address, and password, which you will use later to log into Confluent Cloud. Select the Start Free button and watch your inbox for a confirmation email to continue. New accounts are given $400 of free usage within the first 30 days (this is subject to change). In addition, be sure to apply the promo code DATAMESH101 for an additional $25 of free cloud usage. You can also use the promo code CONFLUENTDEV1 to delay entering a credit card for 30 days.
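If you prefer the command line, the promo codes can also be applied with the Confluent CLI. The sketch below assumes the CLI is already installed and logged in, and that `confluent admin promo add` is the relevant subcommand in your CLI version (check `confluent admin promo --help` if yours differs):

```shell
# Hypothetical sketch: applying the promo codes from the Confluent CLI.
# Assumes the CLI is installed and logged in.
if command -v confluent >/dev/null 2>&1; then
  confluent admin promo add DATAMESH101    # $25 of additional usage
  confluent admin promo add CONFLUENTDEV1  # delays the credit card requirement
  promo_status="codes submitted via CLI"
else
  promo_status="CLI not found - apply the codes under the Billing section of the Cloud console"
fi
echo "$promo_status"
```

Either path ends with both codes attached to your account; the Cloud console's billing page shows the applied credit either way.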
For this hands-on exercise, you do not need to create an environment, Apache Kafka® cluster, or ksqlDB application manually, as the prototype application will provision these resources for you.
First, open a browser to the data mesh prototype GitHub repository, where you can review the source code and documentation:
https://github.com/confluentinc/data-mesh-demo
To build and run the demo, you'll need a few developer tools installed on your system: git to clone the project, make to run the data mesh creation script and build the application, the jq JSON processing tool (used by the creation script), and Docker to run the local web server. You likely already have most of these, but install any that are missing before proceeding.
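Before continuing, a small shell loop like the following can report any missing prerequisites. This is a convenience sketch, not part of the repository's scripts:

```shell
# Convenience sketch (not part of the repo): verify the prerequisite
# tools are on the PATH before continuing.
missing=""
for tool in git make jq docker; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -z "$missing" ]; then
  echo "all prerequisites found"
else
  echo "still need to install:$missing"
fi
```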
Finally, you'll need the Confluent CLI installed, which is used to provision the cloud resources that support the data mesh. Note that the CLI must be on your system PATH for the project scripting to work properly. Log the CLI into your account, and use the --save flag so your credentials are cached locally, which prevents logouts while running the prototype:
confluent login --save
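One way to confirm the saved session works is to run any read-only CLI command; `confluent environment list` is used below purely as an example:

```shell
# Optional sanity check that the saved session works. Any read-only
# command will do; `confluent environment list` is just one example.
if command -v confluent >/dev/null 2>&1; then
  if confluent environment list >/dev/null 2>&1; then
    login_check="logged in"
  else
    login_check="not logged in - run 'confluent login --save' again"
  fi
else
  login_check="confluent CLI not found on PATH"
fi
echo "login check: $login_check"
```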
You can run the data mesh prototype either by using a pre-built Docker image or by building and running from source yourself. By default, we will run the application using Docker. If you prefer to build and run from source, please see the README.md in the repository for those instructions.
To run the prototype, clone the source code repository:
git clone https://github.com/confluentinc/data-mesh-demo.git
Then change into the project directory:
cd data-mesh-demo
The project has a script that will bootstrap the entire data mesh environment, including Confluent Cloud resources, a Java Spring Boot-based web server, and a browser-based client application you can use to interact with the mesh. Execute the following command to kick off the mesh creation:
make data-mesh
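The script prints the client application's URL when it finishes. As a quick liveness check you can probe the local web server; port 8080 is an assumption here, so confirm the actual address against the script output and the repository README:

```shell
# Quick liveness probe of the local web server. Port 8080 is an
# assumption - confirm the address against the script output.
if curl -fsS http://localhost:8080/ >/dev/null 2>&1; then
  ui_status="data mesh UI is reachable"
else
  ui_status="server not reachable yet - the script may still be provisioning"
fi
echo "$ui_status"
```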
This command creates real resources in your Confluent Cloud account. To avoid unexpected charges, be sure to destroy these resources after you are done evaluating the data mesh prototype.
The last module in this course provides instructions for destroying all resources.
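If you'd rather clean up manually with the CLI, the sketch below shows the general shape. The delete command is commented out on purpose; confirm the prototype's environment ID with `confluent environment list` first, and treat the final module's instructions as the authoritative teardown:

```shell
# Sketch of a manual CLI cleanup. The delete command is commented out
# on purpose - confirm the environment ID before running it.
if command -v confluent >/dev/null 2>&1; then
  confluent environment list   # find the environment the prototype created
  # confluent environment delete <environment-id>   # removes the cluster, topics, and ksqlDB app inside it
  cleanup_note="review the list above before deleting"
else
  cleanup_note="confluent CLI not found"
fi
echo "$cleanup_note"
```

Deleting the environment removes everything provisioned inside it, so this single step covers the cluster, topics, and ksqlDB application the script created.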
The script will take about 15 minutes to complete, as it must wait for the various cloud resources to be provisioned and ready. You can continue with the next modules in the course while it is running in the background. We will come back to the prototype in Hands On: Exploring the Data Mesh and Hands On: Creating a Data Product to explore the created data mesh. In the meantime, continue to the next module to learn about data mesh principles.
This hands-on exercise will walk you through building and using a data mesh prototype application. This application is built as a web server with Java Spring Boot using Confluent Cloud services, including the new Data Catalog API for data product management. The browser-based client is built using Elm and is viewable from your browser after you complete a few simple build steps. Let's get started.

First, navigate to the data-mesh-demo GitHub repository, where you can find the source code and documentation on the prototype. You can find the link in the module text below.

In order to build and run the demo, there are some developer tool prerequisites you'll need installed on your system before proceeding. You'll need 'git' to clone the project, and also the 'make' tool to run the data mesh creation script. Additionally, you'll need the jq JSON processing tool.

You can choose to run the data mesh prototype either by using a pre-built Docker image or by building and running from source yourself. By default, we will run the application using Docker, so you'll need that. If you prefer to build and run from source, see the README in the repository for those instructions.

The data mesh prototype automates the creation of Confluent Cloud resources, including an environment, a basic cluster, topics, and a ksqlDB application. In order to run the script that provisions these resources, you will need the Confluent Command Line Interface installed on the system 'PATH'. The CLI can be installed using the provided installer script.

You will also need a Confluent Cloud account to proceed. If you do not already have an account, you can sign up using the Confluent web console or by visiting confluent.cloud/signup. New accounts are given $400 of free usage within the first 60 days. But keep in mind that this offer is subject to change over time, so you may see a different number.
You also have the option of signing up for an account using the Confluent CLI by running confluent cloud-signup and following the prompts as shown here. In addition to the new account usage, be sure to apply the promo code DATAMESH101 to your account for an additional $101 of free Confluent Cloud usage. You can apply the promo code using the Cloud console under the billing section, or directly from the CLI. Finally, you will need a payment method on file to proceed with the demonstration. You can add a payment method using the Cloud console, or again, with the CLI.

All right, I know that was a lot of prep work to get started. In case you missed anything, be sure to take a look in the module notes below to see a complete list of everything you need to have ready before you move on to creating the data mesh prototype. With those housekeeping items in order, let's dive in.

After cloning the source code repository, navigate into the directory where the project was cloned. As mentioned earlier, the project uses the Confluent CLI to provision the data mesh resources on Confluent Cloud. If you didn't use the CLI for the signup process, you need to be sure your CLI is logged into your account before proceeding with the next steps. Be sure to use the --save option when you log in. The '--save' flag simply allows your credentials to be saved locally to a file for future use, and doing so will prevent timeouts while the data mesh creation script is running.

The project uses a Makefile to bootstrap the entire data mesh environment, including Confluent Cloud resources, a Java Spring Boot web server container, and a browser-based client application that you can use to interact with the mesh. Kick off the mesh creation by running 'make data-mesh'. This script can take over 15 minutes to complete as it provisions various cloud resources.
So while this command is running, you may continue with the next modules in the course, and we'll pick back up when we explore the created data mesh.