Course: Data Mesh 101

Implementing a Data Mesh

8 min
Tim Berglund, Sr. Director, Developer Advocacy (Course Presenter)
Ben Stopford, Lead Technologist (Course Author)
Michael Noll, Principal Technologist (Course Author)

This final module covers the typical journey you might go through when implementing a data mesh in your organization. It won't cover the technical nitty-gritty details (code, APIs, etc.), but will rather give architectural guidance.

You can view the four principles of the data mesh, covered in previous modules, as an evolution.


For example, you wouldn't start with Principle 3, self-service data available everywhere; rather, you'd start with Principle 1, data ownership by domain. From there you'd move to Principle 2, data as a product, and so on. Each principle generally brings an increasing level of difficulty, but as you move through them, your capabilities will develop as well.

Data mesh has a concrete implementation, but it is more importantly a set of ideas. Reducing those ideas to practice by building out the mesh is a journey. So if you get started now, you're probably not going to have a data mesh next week. But there will come a point when you can reasonably say "I've built a data mesh." There may not be an obvious threshold, but you will recognize when you've reached that point.

First Steps

To begin a data mesh, you first need to secure the necessary management commitment. Then start with a few concrete initial use cases: ideally ones that are contained, simple, and owned by high-capability, forward-looking teams. You also want them to be visible, i.e., you want results that you can show to the business.

While data mesh is a valuable concept, it isn't everything. As we've mentioned, it works in conjunction with other important practices such as microservices and domain-driven design. Those practices will most likely need to be part of your work, alongside and sometimes even orthogonal to data mesh. In short, apply the data mesh concepts as you see fit to gain the maximum benefit for your company.

Concrete Steps to Implement a Data Mesh in Practice

  • Centralize data in motion. Introduce a central event streaming platform; Kafka and Confluent Cloud are good solutions.
  • Nominate data owners. You should have firm owners for the key datasets in your organization, and you want everyone to know who owns which dataset.
  • Publish data on demand. You can store events in Kafka indefinitely, or they can be republished by data products on demand.
  • Handle schema changes. Owners are going to publish schema information to the mesh (perhaps in the form of a wiki, or data extracted from the Confluent Cloud Schema Registry and transformed into an HTML document), and you need a process to deal with schema change approval.
  • Secure event streams. You need a central authority to grant access to individual event streams. There are probably regulatory concerns here—perhaps even actual laws.
  • Connect from any database. There are source and sink connectors available for many supported database types, and you should make sure that your desired connectors exist so that you can easily provision new output ports and input sources.
  • Make a central user interface for discovery and registration of new event streams. This can be an application you create, or even a wiki. Ultimately, you're going to need to support searching for schemas for data of interest. You also need to support previewing event streams and requesting access to new event streams, and you need to support data lineage views.
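To make the "nominate data owners" and "discovery" steps above concrete, here is a minimal sketch of a dataset-ownership registry. It is an in-memory illustration only: in practice this role is played by a wiki or a dedicated discovery UI, and all names here (`DataCatalog`, `register`, `owner_of`, `search`) are hypothetical.

```python
class DataCatalog:
    """Maps each event stream (dataset) to its owning domain team."""

    def __init__(self):
        self._owners = {}

    def register(self, stream: str, owner: str) -> None:
        # Every key dataset should have exactly one firm owner.
        self._owners[stream] = owner

    def owner_of(self, stream: str) -> str:
        # Raises KeyError for unowned streams, surfacing ownership gaps early.
        return self._owners[stream]

    def search(self, term: str) -> list:
        # Simple substring search, standing in for a real discovery UI.
        return [s for s in self._owners if term in s]


catalog = DataCatalog()
catalog.register("orders.v1", "checkout-team")
catalog.register("payments.v1", "payments-team")

print(catalog.owner_of("orders.v1"))  # checkout-team
print(catalog.search("orders"))       # ['orders.v1']
```

The point of even a toy catalog like this is that ownership is looked up, never guessed: anyone in the organization can answer "who owns this dataset?" the same way.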
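The "handle schema changes" step is usually enforced by a registry such as the Confluent Cloud Schema Registry rather than hand-written code. As a sketch of what the backward-compatibility rule means, the hypothetical helper below checks one simplified condition on Avro-style record definitions: a new schema version may only add fields that carry defaults, so that existing events remain readable. This is an illustration, not a real Schema Registry API.

```python
def is_backward_compatible(old: dict, new: dict) -> bool:
    """Simplified backward-compatibility check for Avro-style records."""
    old_fields = {f["name"] for f in old["fields"]}
    for field in new["fields"]:
        if field["name"] not in old_fields and "default" not in field:
            return False  # new required field: old events can't be decoded
    return True


v1 = {"type": "record", "name": "Order",
      "fields": [{"name": "id", "type": "string"}]}

# Adding an optional field keeps consumers of v1 data working.
v2_ok = {"type": "record", "name": "Order",
         "fields": [{"name": "id", "type": "string"},
                    {"name": "coupon", "type": ["null", "string"],
                     "default": None}]}

# Adding a required field would break them.
v2_bad = {"type": "record", "name": "Order",
          "fields": [{"name": "id", "type": "string"},
                     {"name": "total", "type": "double"}]}

print(is_backward_compatible(v1, v2_ok))   # True
print(is_backward_compatible(v1, v2_bad))  # False
```

An approval process for schema changes then amounts to running checks like this one automatically, and escalating to the data owner only when a proposed change fails.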
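For the "secure event streams" step, here is a toy model of a central authority that records access grants. In Kafka this role is played by ACLs managed by the platform team; `AccessAuthority` and its methods are hypothetical illustration, not a Kafka API.

```python
class AccessAuthority:
    """Central record of who may do what on which event stream."""

    def __init__(self):
        self._grants = set()  # (principal, stream, operation) triples

    def grant(self, principal: str, stream: str,
              operation: str = "read") -> None:
        # In practice this is an audited approval step, possibly subject
        # to regulatory requirements.
        self._grants.add((principal, stream, operation))

    def is_allowed(self, principal: str, stream: str,
                   operation: str = "read") -> bool:
        return (principal, stream, operation) in self._grants


authority = AccessAuthority()
authority.grant("analytics-team", "orders.v1")

print(authority.is_allowed("analytics-team", "orders.v1"))  # True
print(authority.is_allowed("marketing-team", "orders.v1"))  # False
```

The key design point survives the simplification: access is deny-by-default, and every grant passes through one place where it can be reviewed and audited.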
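Finally, the "connect from any database" step usually means provisioning a connector rather than writing code. Below is a hedged sketch of a Kafka Connect JDBC source connector configuration, expressed as a Python dict for readability. The connector class name matches the Confluent JDBC connector, but the connection details (host, database, column names) are placeholders for illustration only.

```python
# Sketch of a JDBC source connector config; in practice this JSON is
# POSTed to the Kafka Connect REST API. Connection details below are
# hypothetical placeholders.
jdbc_source_config = {
    "name": "orders-db-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db.internal:5432/orders",
        "mode": "incrementing",             # stream only newly inserted rows
        "incrementing.column.name": "id",
        "topic.prefix": "orders-db-",       # output streams: orders-db-<table>
        "tasks.max": "1",
    },
}

print(jdbc_source_config["config"]["connector.class"])
```

Having vetted configurations like this ready for your common database types is what makes provisioning a new input source or output port a routine request instead of a project.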

This course is being recorded early in summer 2021. If you're working on a data mesh implementation around then, you are on the leading edge. Five years from now, we are likely to look back on this time as the early days of data mesh. We will probably consider its early adopters to be risk-taking visionaries, who paved the way for the frameworks and products built around data mesh that we'll be using in 2026. So if you're blazing this trail, you really are making a significant contribution to the community. We appreciate your efforts and look forward to hearing about the things you build.

