How a Data Mesh Works

Course: Data Mesh 101 | 3 min

Get hands-on experience exploring how to set up your data mesh, how to use it, and how it works in action.
Danica Fine
Senior Developer Advocate (Presenter)

Rick Spurgeon
Integration Architect (Author)

Hands On: How a Data Mesh Works

In this hands-on module, you'll explore the data mesh prototype you built previously and see how it works in action.

Let’s check in on our data mesh prototype created in Hands On: Data Mesh Prototype Setup. If the prototype started successfully, you should see a log entry similar to:

Started DataMeshDemo in 2.874 seconds (JVM running for 3.31)

The server is now running and waiting to handle requests from clients. Open a browser to the data mesh UI at http://localhost:8080/.
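If you'd rather confirm the server is up from code, a quick HTTP request against the same address works as well. The snippet below is a minimal sketch using Java's built-in HttpClient; it only assumes the prototype is listening on localhost:8080 as shown above.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DataMeshHealthCheck {
    public static void main(String[] args) throws Exception {
        // The prototype serves its UI and REST endpoints from the same embedded server.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/"))
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // A 200 status means the DataMeshDemo server started and is handling requests.
        System.out.println("Status: " + response.statusCode());
    }
}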

[Screenshot: the data mesh landing page]

The data mesh landing page gives you an overview of the prototype's features. The UI follows a loose left-to-right workflow, with tabs across the top.

Start by exploring the data products in tab 1, "Explore Data Products."

[Screenshot: the Explore Data Products tab]

For the purposes of the prototype, you are a member of the hypothetical Analytics Domain. The "Explore Data Products" tab shows the data products that are available to you. The data products listed here are event streams, though in a production environment you would also expect to see other types of data products, such as those provided by request/response APIs.
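Because each data product here is an event stream backed by a Kafka topic, consuming one from an application looks like consuming any other topic. The sketch below is a minimal, hypothetical Java consumer: the topic name "users", the group ID, and the bootstrap server and API key/secret placeholders are assumptions you would replace with values from your own Confluent Cloud cluster and the data products listed in the Explore tab.

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class DataProductConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder connection settings -- substitute your Confluent Cloud bootstrap server and API key/secret.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "<BOOTSTRAP_SERVER>");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "analytics-domain-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // "users" is a hypothetical data product topic; use one listed in the Explore Data Products tab.
            consumer.subscribe(List.of("users"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}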

Clicking on a data product presents you with its attributes.

[Screenshot: a selected data product and its attributes]

Attributes include:

  • The "Domain" and "Owner" metadata values, representing governance responsibility over the data product
  • The "Quality" and "SLA" metadata values, which inform you of the data quality and service-level support you can expect when using the data product
  • A copy of the current schema, which helps you explore the data model fields and types that make up the data product
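To make that metadata concrete, here is a small illustrative Java record covering the attributes above. The DataProduct type and its field names are assumptions made for this sketch, not part of the prototype's actual API.

// Illustrative model of a data product's governance and quality metadata.
// All names here are assumptions for the sketch, not the prototype's real types.
public record DataProduct(
        String name,    // the event stream / topic name
        String domain,  // owning domain with governance responsibility
        String owner,   // team or individual accountable for the product
        String quality, // data quality level consumers can expect
        String sla,     // service-level support consumers can expect
        String schema   // a copy of the current schema, for exploring fields and types
) {}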

Below the "Data Products Detail" information are buttons that forward you to various views in the Confluent Cloud Console. Each of these views provide more detail about the data product:

  • The "Topic Detail" view in the cloud console provides information about the topic backing the data product, including production and consumption rates
  • "Data Lineage" provides detailed information about the data product’s relationships with other upstream and downstream applications and dependencies
  • "Export" links you to Kafka Connect, where you can set up an export connector to move data into your application's data store

In the next module we’ll look at creating a new event streaming application from an existing data product. Then we’ll publish our derived event stream as a new data product for other consumers to use.

Use the promo code DATAMESH101 to get $25 of free Confluent Cloud usage
