
How a Data Mesh Works

Get hands-on experience exploring how to set up your data mesh, how to use it, and how it works in action.

3 min
Course: Data Mesh 101

Danica Fine

Senior Developer Advocate (Presenter)

Rick Spurgeon

Integration Architect (Author)

Hands On: How a Data Mesh Works

In the previous hands-on module, you built a data mesh prototype. In this module, let's see how it works in action.

Let's check in on the data mesh prototype we created in Hands On: Data Mesh Prototype Setup. If the prototype started successfully, you should see a log entry similar to:

Started DataMeshDemo in 2.874 seconds (JVM running for 3.31)

The server is now running and waiting to handle requests from clients. Open a browser to the data mesh UI at http://localhost:8080/.
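If you'd like to confirm the server is reachable before (or instead of) opening a browser, you can probe the same URL programmatically. The sketch below uses Java's built-in HttpClient; only the http://localhost:8080/ address comes from the steps above, and the class name is just illustrative.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DataMeshHealthCheck {
    public static void main(String[] args) throws Exception {
        // Probe the prototype's UI endpoint to confirm the server is accepting requests.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/"))
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // A 200 status code means the prototype is up and serving the UI.
        System.out.println("Status: " + response.statusCode());
    }
}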

[Image: data mesh landing page]

The data mesh landing page provides an overview of the features of the data mesh prototype. The UI is designed with a loose workflow from left to right, following the tabs across the top.

Start by exploring the data products in tab 1, "Explore Data Products."

[Image: Explore Data Products tab]

For the purposes of the prototype, you are a member of the hypothetical Analytics Domain. The "Explore Data Products" tab shows the data products that are available to you. The data products listed here are event streams, though in a production environment you would also expect to see other types of data products, such as those provided by request/response APIs.
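Because each data product listed here is ultimately an event stream backed by a Kafka topic, a consuming application in the Analytics Domain would read it with an ordinary Kafka consumer. The sketch below is illustrative only: the topic name, bootstrap server, and string deserializers are assumptions rather than the prototype's actual settings; in practice you would use the connection details and schema-aware deserializers appropriate to your cluster.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class DataProductConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical connection settings -- replace with your cluster's values.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "analytics-domain-app");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // "pageviews" is a placeholder standing in for a data product's backing topic.
            consumer.subscribe(List.of("pageviews"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}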

Clicking on a data product presents you with its attributes.

[Image: selected data product attributes]

Attributes include:

  • The "Domain" and "Owner" metadata values, representing governance responsibility over the data product
  • The "Quality" and "SLA" metadata values, which inform you of the data quality and service-level support you can expect when using the data product
  • A copy of the current schema, which helps you explore the data model fields and types that make up the data product
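To make those attributes concrete, you could imagine a data product's metadata modeled roughly like the record below. This is a hypothetical shape for illustration only, not the prototype's actual internal representation.

/**
 * Hypothetical, simplified model of the data product attributes shown in the UI.
 * The prototype's real types and field names may differ.
 */
public record DataProduct(
        String name,      // the event stream / data product name
        String domain,    // governing domain, e.g. "analytics"
        String owner,     // team or person responsible for the data product
        String quality,   // expected data quality level
        String sla,       // service-level expectation for consumers
        String schema) {  // current schema describing fields and types
}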

Below the "Data Products Detail" information are buttons that forward you to various views in the Confluent Cloud Console. Each of these views provide more detail about the data product:

  • The "Topic Detail" view in the cloud console provides information about the topic backing the data product, including production and consumption rates
  • "Data Lineage" provides detailed information about the data product’s relationships with other upstream and downstream applications and dependencies
  • "Export" links you to Kafka Connect, where you can set up an export connector to move data into your application's data store

In the next module we’ll look at creating a new event streaming application from an existing data product. Then we’ll publish our derived event stream as a new data product for other consumers to use.

Use the promo codes DATAMESH101 & CONFLUENTDEV1 to get $25 of free Confluent Cloud usage and skip credit card entry.


Hands On: Exploring the Data Mesh

Now that we're a little more well versed in data mesh, let's jump back into our data mesh prototype. At the end of the last hands-on module we had just kicked off our data mesh creation script. Let's check in on that. If the data mesh was successfully created, you should see log entries similar to the ones on the screen. If you do, that means the server is now running and waiting to handle requests from clients.

With it up and running, open a browser to the landing page, where you'll see an overview of the features of the data mesh prototype. The UI is designed with a loose workflow from left to right, following the tabs across the top. When you initially use the prototype, start by exploring the data products in the "Explore Data Products" tab, and then move on to creating and publishing data products.

For the purposes of the prototype, you are a member of the analytics domain, and the explore tab shows the data products that are available to you. The data products listed here are event streams, though in a production environment you could also expect to see other types of data products, such as those provided by synchronous APIs.

Let's take a closer look at one of the products. You'll be presented with additional information such as the domain and owner metadata values, representing governance responsibility over the data product; the quality and SLA fields, which indicate to consumers what they can expect from data quality when consuming the data product; and the schema, which allows you to explore the fields and data types that make up the data product.

Below the data product detail information are some buttons that forward you to different views in the Confluent Cloud Console. Each of these views provides more detail about the data product's underlying Kafka topic, schema, and more. In the topic detail view, you'll get information about the topic backing the data product as well as production and consumption rates. Data lineage provides detailed information about the data product's relationships with other upstream and downstream applications and dependencies. And finally, export allows you to export data products into your application data stores using Kafka Connect.

That wraps up the data products tour. You should now be fairly familiar with the data mesh console and data products. In the next module, we'll look at creating a new event streaming application from an existing data product. See you there.