Confluent Cloud runs in any of the three major cloud providers, but sometimes one of them really is your home. And you might be interested to know what the integration points are within that cloud home of yours. Today, we're talking about Azure. I've got my colleague and Microsoft MVP, Alicia Moniz, and Microsoft Principal Solutions Architect and Kafka community stalwart, Israel Ekpo on the show to talk about some cool Azure integration points and things you might want to know if that's where you live.
Before we get to them, let me remind you Streaming Audio is brought to you by Confluent Developer, that's developer.confluent.io: the one place on the internet, the one website you want to go to, to learn about Kafka, Confluent Cloud, and everything. We've got in-depth technical articles, a cool patterns library for event-driven architecture, video courses, executable examples that you work through a step at a time, and all kinds of ways to get started on Confluent Cloud. In fact, if you use the code PODCAST100 when signing up on Confluent Cloud, you get an extra $100 of free usage. There are additional discount codes embedded within Confluent Developer. So check it out, developer.confluent.io. And now let's get to Israel and Alicia.
Hello, and welcome to another episode of Streaming Audio. I am your host, Tim Berglund. And I'm joined here in the studio today, the virtual studio, by two guests. And these people are Israel Ekpo and Alicia Moniz. Israel is a Principal Cloud Solutions Architect at Microsoft Global Partner Solutions, and Alicia is a Microsoft AI MVP and Cloud Partner Solutions Architect at Confluent. Israel and Alicia, welcome to Streaming Audio.
Thank you, Tim.
Now, I felt like that was a long introduction to both of you, but it's not enough, and I would love to know a little bit more about you. How did you get to do what you're doing? What are you really doing right now? Because titles don't always tell that story. So, Israel, please, what are you up to? You're certainly well enough known in the Kafka community, but for those who don't know you.
All right. So thank you very much, Tim, for that very kind introduction. Like Tim said, my name is Israel and I'm a Cloud Solutions Architect in Microsoft Global Partner Solutions. The Global Partner Solutions organization within Microsoft is the part of the team that focuses primarily on helping ISV partners build solutions for the cloud. So we have partners in general, but I focus on ISV partners and service partners. Confluent is one of the ISV partners that are building solutions that run on Azure. And then we have other partners, like Ascension, that help other partners and customers build solutions.
So I work with those partners. And then in my free time, I like to contribute to open-source communities, particularly those focused on event streaming and processing. So I contribute to the Kafka community and the Flink community. And I also create YouTube tutorials on Kafka, Flink, and anything to do with Microsoft. I live in Windermere, Florida with my wife and my two kids. So that's pretty much everything about me.
There you are. I will put some links to some of those videos in the show notes. And for your information and for the benefit of the listening public, I went to school in Melbourne, Florida, a little bit away, not so far.
Yeah. It's warm there.
I like it warm.
Alicia, same question.
So first of all, thank you so much for the opportunity to be on your show. And I know that Izzy is over the moon being able to have this talk with you today, and I am as well. And it's just been so great to have you be so open to.... I am fairly new to the Kafka space. So thank you. And that being said, I am a newcomer to the cloud in general, relative to how long the cloud has been out there. My name is Alicia Moniz. I'm one of the Microsoft MVPs. And as a traditional DBA, I definitely was a latecomer to the cloud. And over the last couple of years, I've really found these tidbits that helped me learn quicker and perform at a higher level at each of my positions.
And I think that's really what my contributions to the Azure community are: I like to piece together these little tidbits for data professionals to help them accelerate delivery on data projects. And I'm just so grateful and fortunate that my position at Confluent helps me work with my favorite technology, which is Microsoft, and help my community continue to grow in that space. So I'm delighted to be a solutions architect here at Confluent because, again, I'm playing with the latest and greatest tools and I'm dealing with business problems and challenges that are at the forefront of data professionals' minds. And so thank you for having me. As a newbie, I have started a bit of a YouTube series to categorize and document my journey into the Kafka environment. And you can find me on YouTube; my handle is kafkaonazure. And you can also see me speaking at different Microsoft data events throughout the year, related to AI and data technologies on Azure.
I did not know about your YouTube channel. That will be in the show notes. Probably tweeted later today. I knew about Israel's but I didn't know about yours. So certainly that's excellent, and that kind of stuff is what makes a community strong when there are just more and more people doing things like that. So I'm really excited to hear that and delighted that you're on the show. And by the way, newcomers to the community and to the technology always belong on Streaming Audio. And we cover a lot of things. We have super seasoned senior people, deep diving and talking about big lofty ideas. And it's also, I think, helpful for everybody to see people who are relatively new and are starting to get their feet under them because that story is helpful, because most of the people in the community are like that. And so seeing other people who are beginning to be established is always a thing that... Well, it's a thing I want to do. So it's-
Yeah. Thank you, Tim. And I have noticed the parallels between... So the Microsoft community is amazing for being an enablement environment where it's okay to ask questions and it's okay to not know the answers, and there's a lot of people out there who contribute to community projects. And I've found, with the Kafka community, it's very similar where there are just so many people who are so excited about this technology that, really, what they're wanting to do is to help everybody get up to speed that much quicker so that... And I'm just in love with the community and I'm grateful to be able to participate in it.
Awesome. Israel, you said ISVs. And, everybody, if you don't know what ISV stands for, it stands for independent software vendor. And a company like Microsoft has, for decades, had some sort of ISV support mechanism. Like, let's help you write stuff on Windows, back in the day, or something like that. Let's just support you as an application developer, and I'm using that word intentionally there, because the platform is more valuable if there are more ISVs making applications that run on the platform. So that's a goal of a platform company like Microsoft, which we used to think of as an operating system company. And I was surprised when you referred to us as an ISV. And I think this is just my bias, maybe a little bit of my age, I don't know what it is, but I would think, well, ISVs, that's like applications, and we're an infrastructure platform. So yeah, we're partners, but we're not an ISV. But you guys think of us as an ISV.
So number one, I guess I'm just sort of calling out that that struck me. And I think it's just me that's struck. I don't think it's like the wrong word or anything. But talk to us about what is meaningful about Confluent Cloud and Azure and what are some cool things people do? Because when people think about Confluent Cloud, they don't always think of a cloud provider, right. They think about ksqlDB, they think of just fully managed Kafka. So what are cool things that people are doing?
Yeah. So there are several questions in that question, but I'll start from the top. So the goal of the Microsoft organization, in general, is to enable everyone to succeed in whatever they're trying to do. And as part of that, part of the mission of the Global Partner Solutions organization is to enable every partner to be successful. So we have partners and customers that depend on the platform, and the ISVs. So we have several types of partners that we work with. There are people called service partners. These are people that provide consulting support or consulting services to help other customers or partners build solutions on Azure; they don't actually have a product that they're selling in the marketplace. So they're pretty much selling their expertise, their time, and their knowledge.
And then we also have ISV partners. These are partners that are building solutions for the marketplace that people are going to be using. So we have partners like Confluent and Elastic that are building foundational platform solutions, and then other customers or partners come to use those solutions from the marketplace to solve their own problems. So Confluent has a service running on Azure, and then we have healthcare partners, retail partners, education partners, manufacturing and IoT partners. We also have ISVs that are building solutions for their customers, or they're using another ISV's solution to do that.
So as part of the apps and data team in Global Partner Solutions, my expertise is working with customers that are using open source databases and open source solutions: MySQL, Postgres, MariaDB, Elastic, Kafka. And Confluent is one of my favorites because I really like Kafka a lot, and as you can see in my YouTube series, I talk about it over and over. So I help these partners build core solutions by guiding them on picking the right sizing and the right set of tools, bringing everything together to accomplish their objectives. I mentioned earlier that we have several kinds of partners: retail, IoT, and so on. For example, we have some retail partners that are using Confluent Cloud for inventory management. Nowadays there is this very high expectation from customers that they want to know everything that is happening. If you order a pizza, you want to know when it's in the oven, when it's coming out and-
What was the name of the person who put the pepperoni on it?
Exactly. So they want everything, not in a batch mode whereby you order it, and then you're waiting and waiting and waiting. Oh, suddenly it's done. So they want to see everything that is happening every step of the way. So we have those kinds of partners that are leveraging the Azure ecosystem with Confluent Cloud to get real-time status and real-time updates on their service. So we also have retail partners that are using Confluent Cloud to track inventory. Inventory is very important because if you don't keep track of what you are selling, what the suppliers have brought to the warehouse, what has been returned, all those things, it can end up creating a very bad customer experience, or you can end up with too much inventory where you have too much and you have to dispose of it in a way that you didn't want.
So we have customers that are using Confluent Cloud to integrate with Cosmos DB, with Microsoft SQL Server, with Azure Search, and all these different ecosystem components to provide their own customers with the best service. So, just to summarize, in any particular business, the objective is always to find the right people, provide the right process to get to the right product or service in this case. So Confluent Cloud really helps us to provide the right process for those customers because without that process in place, you may have the right people, but you will end up with the wrong service or the wrong product. So in general, that is what I do as part of the apps and data team in Global Partner Solutions with Microsoft.
Talking about inventory, it's just funny how consumer expectations have changed. And I measure consumer expectations by introspection. It's not like I gather data, but I look at myself, and I got a new iPad last week because my old one's battery couldn't make it through a day. It was fully depreciated, as our friends in finance might say. And I wanted the pleasure of walking into an Apple Store, which you could do in mid-August in the Denver area in 2021; pandemic-wise you have to wear a mask, but you could just walk in and buy stuff. I did check stock online to see if they had the one I wanted. And if they hadn't, that would have felt like, "Wow, you guys don't have your act together. The website said you had it, and you didn't." Maybe you sold it or somebody bought it or something, but that should be accurate.
And that's a hard problem. I'm not a retailer, but that's a really hard problem. But we expect that to be bulletproof. By the way, funny story about that, if you want to buy an iPad and you just want to walk in off the street, apparently, you're supposed to make an appointment. The greeter guy was like a little... like "What? You're just walking in and you don't have to appoint... You didn't order it. You believe this guy?" "I just wanted to go to the store and buy something."
But yeah, my expectation is that that inventory is accurate. On that, Israel, you mentioned some verticals; you mentioned healthcare and retail. With those different workloads, can you give us examples? Like, Microsoft wouldn't have a guy like you doing a thing like this if people didn't have different technical challenges. You wouldn't be aware of the verticals if things weren't different and differently hard between them. So what are some different technology problems that people have to solve depending on what kind of business they're in, specifically within our world here of Confluent and Azure?
Yeah. In the recent past, like the last three months, Confluent and Microsoft have had to partner together to solve a lot of those challenges. One of them has to do with security. There are some customers that want all their data to travel through a private network. Especially in healthcare and some government customers, they don't want any of their data traveling over any kind of public network. So we really had to partner together with the Microsoft engineering team, putting together some of the expectations that partners were asking about. We were able to optimize that so that now Confluent Cloud is able to run where you use private endpoints throughout, and nothing goes through the public internet.
Another solution that was brought in: there were a lot of customers that were using Cosmos DB, either as a source of data coming into Kafka or as a sink of data going out of Kafka, using the Kafka Connect ecosystem. Some of the people in Microsoft were able to create that connector. So now that gap has been closed: if Cosmos DB is the origin of the events that are intended to go into Kafka for, maybe, processing or filtering with Kafka Streams or ksqlDB before they go out again into a different sink or back into Cosmos DB, that integration is now set up. So those are some of the high-level challenges that customers were facing.
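To make that concrete, here's a rough sketch of what registering that open-source Cosmos DB source connector with a Kafka Connect worker can look like. The account endpoint, key, database, and topic names below are placeholders, and the config keys follow the open-source kafka-connect-cosmosdb project, so treat this as illustrative rather than authoritative:

```python
import json

# Illustrative (not authoritative) Kafka Connect registration payload for the
# open-source Cosmos DB source connector. Replace the endpoint, key, database,
# and topic map with your own values.
connector = {
    "name": "cosmosdb-source",
    "config": {
        "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector",
        "tasks.max": "1",
        "connect.cosmos.connection.endpoint": "https://<account>.documents.azure.com:443/",
        "connect.cosmos.master.key": "<cosmos-primary-key>",
        "connect.cosmos.databasename": "inventorydb",
        # container -> topic mapping: documents in the "orders" container
        # become events on the "orders-topic" Kafka topic
        "connect.cosmos.containers.topicmap": "orders-topic#orders",
        "key.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    },
}

payload = json.dumps(connector, indent=2)
print(payload)
```

You would POST this JSON to your Connect worker's REST endpoint (conventionally port 8083); the sink direction is symmetric, with a sink connector class instead of the source one.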
So the ability to meet requirements that are being set by the industry: HIPAA regulations, financial services industry regulations, just regular customer expectations. So we try to listen to the customer feedback, working between the Confluent engineering team and the Microsoft engineering team, putting together all this feedback to make sure that we're constantly improving the customer experience. Those are just some of the things I mentioned, but the typical process is that whenever there is any challenge, we take the customer's feedback very seriously, because we want to be able to provide our partners and customers with the right environment and the right support so that they can turn around and provide their customers with what they need without violating any regulations or expectations.
Without, as they might say on Twitter, a HIPAA violation.
Now, Alicia, you mentioned being a relative newcomer to the cloud. And I guess, with a background in AI, that could make sense, because AI workloads run on the cloud. But historically, there's a lot of AI that's more right in front of you and not hosted out there. But as you are making your own migration to the cloud, the segue is amazing, I just want you to know that. You being a solutions architect, that is, a person who works with customers after they have bought things, when they're actually having to build things that work, my guess is you also work with people who are migrating to the cloud. I can't get over how great that segue is. I'm sorry, when you call it out, it's like not a good segue anymore, but it's so good. I just had to mention it. You're making your migration to the cloud.
That's kind of part of the reason why I am one of the best people to have this conversation with. Everything is fresh in my brain, I have gone through all the latest and greatest documentation out there, I have talked to and spent time with people who are passionate about the topic, and my references aren't stale. So when people have conversations with me, I can definitely empathize with them as far as what their concerns are, what their pain points are, and what risks are in the back of their heads. So while I'm a little newer to the cloud environments, I have been a traditional on-prem DBA for many, many years. And I am the data guy who tells you, "No," on Thursday nights when you want to push that code change on Friday, because we just don't have the support over the weekend for break fixes.
So I love developers. I'm not going to say anything about people who prefer to do testing in prod and all that. But we all have these challenges. And data tends to be that central point and that central location. And traditionally, we've been putting all of this data in the same bucket, and you do talk about the spaghetti mess that architecture ends up in then. And we all know that when you have these rigid systems, they do tend to break. And traditional ETLs are a left-to-right process where what we're doing is we're taking data from point A and we're putting it in point B. And one of the first steps is we need to define the data schema for the data before we move it. Now, what's interesting to me is when we're looking at solving these new event-based challenges, because this is really where Kafka shines: getting that event-level data and solving those event-level transactional processes.
So we're moving from a traditional data warehousing and ETL process, with maybe daily, if you're lucky, hourly or 15-minute data updates, to making real-time transactions. And as an AI professional, I want more event-level, real-time transactions so that I can do AI on it in the future. So I need to build up my catalog of data at the event level. So when we're looking at building out those capabilities, it's keeping what is in place working by extending it. And Kafka is great at extending that. We want to be doing what we have been doing, but we want to enable ourselves to do just a little bit more. And I think the challenge sometimes as a data professional, is you're responsible for the security and consistency of your data. So you want to minimize risk.
And sometimes I think that new things like the cloud, and new things like open source technologies like Kafka, can be a little intimidating because now you're introducing risk. And at the end of the day, we're all data stewards and we're responsible for the data that we're holding. So I think, just over the last six months, I've seen that being able to build out the one Kafka pipe that enables multiple data consumers on the cloud end is really a strategy that I can add to multiple architecture patterns. So whether I'm working with data migration, which is taking data from these huge mainframe systems and sending that over to the cloud, or if I'm working with dev teams who are behind infrastructure projects, they want to get started on their Azure projects and they need data to hydrate their development environments. And, of course, we've got all these infrastructure projects that are top priority, but, hey, over here, we also want to service our developers.
So as a data professional, how can I help get them that access without creating duplicate work, and what is a safe way for me to do that? So I found Kafka has been a great option for getting that done super quickly and without the traditional ETL package development workloads.
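The "one pipe, many consumers" idea maps directly onto Kafka consumer groups: each distinct group.id keeps its own offsets on the same topic, so several teams can read the same stream independently. A minimal sketch with the confluent-kafka Python client, where the broker address and topic name are hypothetical placeholders:

```python
def consumer_config(group_id: str) -> dict:
    """Each distinct group.id gets its own position (offsets) on the same
    topic, so one Kafka 'pipe' can feed many independent consumers."""
    return {
        "bootstrap.servers": "<broker>.azure.confluent.cloud:9092",  # placeholder
        "group.id": group_id,
        "auto.offset.reset": "earliest",
    }

# Two independent readers of the same data, e.g. a dev-environment hydration
# job and an analytics job; neither takes records away from the other.
dev_cfg = consumer_config("dev-env-hydration")
analytics_cfg = consumer_config("analytics")

def read_one(cfg: dict, topic: str = "inventory-events"):
    """Poll a single message; import deferred so this sketch loads
    even where the client library and a broker aren't available."""
    from confluent_kafka import Consumer  # pip install confluent-kafka
    c = Consumer(cfg)
    c.subscribe([topic])
    msg = c.poll(timeout=5.0)
    c.close()
    return msg

print(dev_cfg["group.id"], analytics_cfg["group.id"])
```

The design point is that "servicing the developers" doesn't require a second ETL pipeline, only a second group.id on the pipe that already exists.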
And then, of course, I get really excited with data and AI. So when we're looking at doing real-time analysis on data and combining data across multiple data sources, this is where I want to be. Kafka really lets you pull data out of almost any data source. And then on top of that, you can go ahead and empower your developers and your data scientists with the Kafka APIs. So with Python, with R, with .NET, with Java, these are all languages you can leverage to pull these real-time data sources out of Kafka and add on to your existing analytics workloads. So when you look at the capabilities and when you look at what we're enabling, there is a lot of value add with Kafka, and it really empowers you at the data level to leverage this event-level data. Sorry, I'm excited.
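As a sketch of what the Python side of that can look like, here's one event produced to a Confluent Cloud cluster with the confluent-kafka client. The event shape, topic name, and broker address are illustrative assumptions, not anything from the episode:

```python
import json
import time

def make_event(sku: str, store: str, delta: int) -> bytes:
    """Serialize a hypothetical inventory-change event as JSON bytes."""
    return json.dumps({
        "sku": sku,
        "store": store,
        "delta": delta,          # +N received, -N sold
        "ts": int(time.time()),
    }).encode("utf-8")

def produce_event(event: bytes, topic: str = "inventory-events") -> None:
    """Send one event to Confluent Cloud; import deferred so the sketch
    runs even where the client library / a broker isn't available."""
    from confluent_kafka import Producer  # pip install confluent-kafka
    p = Producer({
        "bootstrap.servers": "<broker>.azure.confluent.cloud:9092",  # placeholder
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",       # Confluent Cloud API key
        "sasl.password": "<api-secret>",    # Confluent Cloud API secret
    })
    p.produce(topic, value=event)
    p.flush()

event = make_event("IPAD-9GEN", "denver-01", -1)
print(event.decode("utf-8"))
```

The same event, once on the topic, is what the downstream analytics or AI consumers would read; R, .NET, and Java clients speak the same protocol against the same topic.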
You should be excited. You said you said something in there I think that was really important, that was about extending, not replacing. So when people are building things out of Kafka, generally pushing in the direction of event-driven architecture, maybe starting with architecturally less ambitious things like real-time ETL, which is hard enough on its own, you're not tearing down an old thing, you're building a new function. And this is the way innovations like this always happen. Because you don't want to get fired. "Just give me nine months to break the old thing, and I'll make a new thing and add no new value, and guaranteed it won't have all the same features and it'll be buggy, but it's fine. It's cooler." Nobody does that. But we build new things on top of old things like you were describing.
And I think it's an easier conversation. And I know you have a huge following out there, and I know you have a variety of people in the audience there. So I'm coming into Kafka... I'm stepping in from a consulting background where I'm working with Fortune 10 companies to talk about their bigger systems and their options for transitioning to the cloud. And I encourage anybody who advocates for Kafka at work to look at things from their perspective just a little bit when you're having these conversations. And I know Kafka is a fantastic technology, but I think when you introduce a little bit of reference to risk mitigation... There's a lot of things that Kafka does for risk mitigation on data projects. And a lot of that flows through Confluent Cloud. So Schema Registry, you can do VPC peering, you can get dedicated clusters.
So I think if you bring those things into the conversations, and when you talk about how Kafka fits into current projects and to current implementations, then I think adding that risk mitigation helps have that conversation with other people. Because I think one of the things about talking about these bigger projects is bigger projects have bigger dependencies, they have more stakeholders. So when you are advocating for Kafka, I think keeping other people's risk tolerance in mind, will help with that conversation because I do see a lot of interest in Kafka in the larger enterprise project space. And we've definitely done a lot of work with Microsoft to address enterprise-level implementations. And I think we're still growing and we do benefit from that value add with risk mitigation.
Absolutely. Final question and a jump ball here. So either one of you. If you're listening.... Okay, Israel, that was amazing. If you're just listening, you're not watching on YouTube, you want to go back to YouTube and go to around the 29, 30-minute mark, Israel has an actual ball for jump ball questions. Two balls, in fact. He's not in the same room as Alicia.
I do not have a jump ball.
[inaudible 00:30:44] ...jump ball. So Confluent Cloud is, as far as it's concerned, to some degree cloud agnostic. You can run on any of the big three clouds. But obviously, there are strong points if you're committed, as some people are. Like, I'm going to be in Europe in October, and Azure is kind of a big deal there. They don't want to talk about being cloud agnostic. I was at an event last week with a major US retailer that starts with the letter W, and they're not particularly excited about running on AWS. So people have preferences. And so when we're talking about this Confluent Cloud-Azure partnership, briefly, what are some places where that works well? The word that wants to come out of my mouth is synergy, and I'm trying to make it not come out of my mouth. So in what cases is one plus one better than two when Confluent is running on Azure?
I would like to take this one.
You can both go. So Alicia first.
Okay. So I want to say that there are different levels of partnership, and within the Microsoft partnership here at Confluent we partner at all levels. So we partner at the engineering level, we partner at the sales and marketing level. And Israel and I have worked on a couple of projects together, but what you don't see is our development teams are working hand in hand together. We recently announced the new Cosmos DB connector, which was launched as a fully managed connector this week.
And it's really a very collaborative effort. Our development teams meet, maybe, every other week and it's a regular cadence and they get things done. And it's been fantastic to work with both teams. And I've done quite a few projects with Israel, and I am looking forward to working on more. But I don't really think I've seen this anywhere where there's so much engagement and there's so much velocity behind the collaborative projects. Usually, I see somebody's doing a bunch of work and then there's a couple of months, and then there's a response from the other side. And, no, since I've gotten here, Microsoft has been there step by step with us, and there have been huge changes in the capabilities of Confluent Cloud and Azure. And Izzy back to you.
Yeah. So this goes back to the fundamental idea of why Microsoft is still in business. Our objective is always to help every customer, every developer, and every partner to be successful. And in doing so, we try to do several things on the business front. Like we try to make sure that we stay out of the partners' space and let them thrive and let them succeed. So that's why one of the reasons several retailers are coming to us is because they don't see us as a competitor, instead they see us as an enabler. With partners like Confluent, we try not to just take their product and spin it as Azure Confluent, but we focus on working side by side with the Confluent engineering team.
As Alicia said, we collaborated between the engineering team, the partnership team, the marketing team, and several other teams to bring Confluent into the Azure portal. So now you can create Confluent Cloud accounts from the Azure portal or the Azure CLI. The whole point is to make this as integrated as possible, so that the experience to deploy, the experience to get billed, to track what you're spending, everything is integrated. You're not going to get one invoice from Confluent and another invoice from Azure. So simplifying the business capabilities, simplifying the business experience, simplifying the engineering experience, simplifying the interface, all those things make the experience much better for the partners.
We also have a lot of deployment strategies, which I didn't really get to cover. You can deploy Confluent Cloud, obviously, which we prefer you to do, especially if you don't have data engineers to manage the entire infrastructure. But we also have some customers that want to be in full control, where they want to deploy Confluent Platform. So if they want to do that, that offer is also available in the marketplace and they can...
So everything is really flexible. If you want less responsibility and just want to focus on your core business, Confluent Cloud is the way to go. If you want to be in control, some people deploy Confluent Platform in their on-prem environment and Confluent Cloud in the cloud, or they deploy Confluent Platform in Azure using VMs or Kubernetes. So it is very flexible. It is not take-it-or-leave-it if you want to do it in a particular way; our objective, meaning Microsoft Global Partner Solutions, is to help whatever customer accomplish their own objective. So whatever works best for them is what we're going to recommend for that person or for that organization. So that's everything in a nutshell as to why it is better to come to Azure to do business.
My guests today have been Israel Ekpo and Alicia Moniz. Israel and Alicia, thanks for being a part of Streaming Audio.
Thank you, Tim.
And there you have it. Thanks for listening to this episode. Now, some important details before you go. Streaming Audio is brought to you by Confluent Developer, that's developer.confluent.io, a website dedicated to helping you learn Kafka, Confluent, and everything in the broader event streaming ecosystem. We've got free video courses, a library of event-driven architecture design patterns, and executable tutorials covering ksqlDB, Kafka Streams, and core Kafka APIs. There's even an index of episodes of this podcast. And if you take a course on Confluent Developer, you'll have the chance to use Confluent Cloud. When you sign up, use the code PODCAST100 to get an extra $100 of free Confluent Cloud usage.
Anyway, as always, I hope this podcast was helpful to you. If you want to discuss it or ask a question, you can always reach out to me at TL Berglund on Twitter. That's T-L B-E-R-G-L-U-N-D. Or you can leave a comment on the YouTube video if you're watching and not just listening or reach out in our community Slack or forum. Both are linked in the show notes. And while you're at it, please subscribe to our YouTube channel, and to this podcast, wherever fine podcasts are sold. And if you subscribe through Apple Podcast, be sure to leave us a review there. That helps other people discover us, which we think is a good thing. So thanks for your support, and we'll see you next time.
When you order a pizza, what if you knew every step of the process from the moment it goes in the oven to being delivered to your doorstep? Event-Driven Architecture is a modern, data-driven approach that describes “events” (i.e., something that just happened).
A real-time data infrastructure enables you to provide such event-driven data insights in real time. Israel Ekpo (Principal Cloud Solutions Architect, Microsoft Global Partner Solutions, Microsoft) and Alicia Moniz (Cloud Partner Solutions Architect, Confluent) discuss use cases on leveraging Confluent Cloud and Microsoft Azure to power real-time, event-driven architectures.
As an Apache Kafka® community stalwart, Israel focuses on helping customers and independent software vendor (ISV) partners build solutions for the cloud and use open source databases and architecture solutions like Kafka, Kubernetes, Apache Flink, MySQL, and PostgreSQL on Microsoft Azure. He's worked with retailers and those in the IoT space to help them adopt processes for inventory management with Confluent. Having a cloud-native, real-time architecture that can keep an accurate record of supply and demand is important for keeping up with inventory and maintaining customer satisfaction. Israel has also worked with customers that use Confluent to integrate with Cosmos DB, Microsoft SQL Server, Azure Cognitive Search, and other services within the Azure ecosystem.
Another important use case is enabling real-time data accessibility in the public sector and healthcare while ensuring data security and regulatory compliance like HIPAA. Alicia has a background in AI, and she expresses the importance of moving away from the monolithic, centralized data warehouse to a more flexible and scalable architecture like Kafka. Building a data pipeline leveraging Kafka helps ensure data security and consistency with minimized risk.
The Confluent and Azure integration enables quick Kafka deployment with out-of-the-box solutions within the Kafka ecosystem. Confluent Schema Registry captures event streams with a consistent data structure, ksqlDB enables the development of real-time ETL pipelines, and Kafka Connect enables the streaming of data to multiple Azure services.
If there's something you want to know about Apache Kafka, Confluent, or event streaming, please send us an email with your question and we'll hope to answer it on the next episode of Ask Confluent.