Hello, you are listening to Streaming Audio in the last days of 2022. Yes. The winter solstice has just passed. The planet begins to turn a corner and make its way to 2023. And over here in the pod cave, we're digesting a few large, rich meals for the holiday, and I guess attempting to digest a large and rich year just gone by. Yeah, it's time to get reflective about 2022. For me, it was the first year in several where we've been able to get out into the world and meet up with people in person, find out what's been actually happening, take the pulse of the industry. So I thought it'd be fun to bring in a couple of my colleagues to wrap up the year, have a bit of festive reflection, and the New Year's look at the year to come. So joining me are my fellow developer advocates, Robin Moffatt and Danica Fine. And they're here with me, Kris Jenkins, for a 2022 wrap-up edition of Streaming Audio. Let's get into it. The end of the year is upon us and joining me to wrap it all up is the excellent Robin Moffatt. Hey, Robin.
And the ever present Danica Fine. How you doing?
Oh my God. Ever present. Amazing. So well.
Well, that's one of the trends of the year. I think you've been everywhere this year, haven't you, then?
Oh, man. Actually, earlier this week I went to count up all the conferences I went to. And I think it was quite a bit, like-
... 13 or 14.
A lot of conferences.
... a lot.
And all sides of the world.
It was pretty good for a first year as a DA. You know?
Yeah. You have got the air miles in a way that Robin has stopped doing.
I've completely, well, almost stopped doing it.
Just two conferences this year.
Which two did you go to?
The only two that were worth going to, Kafka Summit London and Current 2022.
You may have been biased, as you had something of a hand in those.
Somewhat biased, yes. And other good conferences do exist.
You were chair of the program committee for both of those this year.
Is that the first time you've chaired a committee?
Yes. Yeah, so I've been on the program-
... committee before, first time chairing it, which was quite an experience. It was good fun, actually.
And you are handing over that mantle to Danica, so-
What's it like? Because we've all been on the committee, like rifling through abstracts trying to find good ones. But what's it like being a program chair?
It's quite fun, actually, because I've been to many conferences over the years and you actually get to put your own mark on it. So you have hopefully a good feel for what the community are going to be interested in and you can actually push it in that direction. And so a bit more of this kind of topic, a bit less of that one, just a bit more opinionated. Sure, I do have opinions now and then, which I generally hold back on, except on Twitter.
Robin's never had an opinion in his life that I've seen. [inaudible 00:03:10]-
It actually says it on the job posting, like, "Shit posting and memes." It's there.
We'll bleep that out, thank you very much.
It's in the fine print.
There we go.
Well, handing over to Danica. I think Danica has opinions, too, so that'll be just fine.
Danica's going to be awesome at it.
Occasionally, I think.
You are doing Kafka Summit London '23, that's coming up.
Yeah, looking forward to it. But it is pretty reassuring. I didn't realize, Robin, that last year was your first year chairing.
So that makes me feel a lot better. So I can definitely-
... do it
It's definitely the kind of thing, having been on the inside of it as a program committee member before, even if just for the boring stuff, like the platform that we use for abstracts, and who the people are, and all that kind of stuff. So having that experience already then makes stepping up to chairing less of a daunting step than if you just come in completely from the cold.
Yeah, decent tools, there.
Already, I feel like I didn't realize what I already knew, and it's quite a bit. So just having experienced it for the past year. Yeah.
It'll be good.
It's going to be better.
Let's go onto that theme then, Robin. You say you've put your stamp on which kinds of talks to emphasize; did you spot any patterns over the past year? What trends emerged in the "what people want to talk about" field?
Yeah. And it's changed slightly with Current, because with Current we broadened the scope, so it's not comparing like for like. There's less of the, "Here's just how we run Kafka." Obviously, Kafka's well adopted now. Lots of people are using it. So simply the fact that you're running Kafka isn't in itself so interesting. And even Kafka on Kubernetes, thankfully, is also no longer particularly interesting. It's just like, "You do it or you don't do it. Either's fine by me." But conference talks need to be a bit more interesting than just that. And-
... so there's-
... makes it really alliterative. It's great.
That's- [inaudible 00:05:10]
Is that not enough? Is that not enough?
Not for me, it's not, no.
If you want to get into the Kafka Summit London program for next year, you've got to make your talk title alliterative for Danica's sake.
But yeah, we're going beyond the stage one thing of Kafka, I feel.
Yeah. So there's still interesting talks about running it, but it's much more like real experiences that people have had beyond just like, "Well I installed it, and ran it, and here's what I do, here's what I clicked on." But actually, "Here's the kind of things that went wrong, and how we spotted it, and how you can avoid it." And real useful stuff, which is why we go to the conferences in the first place.
So stories about stuff that people are building with it, we've always had those; there'll always be interesting stuff that people are building. I think one of the trends with Current, with opening the scope up much more, was that there were alternatives to Kafka there, and other stream processing technologies. But one of the big things was the way in which people are looking at stream processing: lots more people are getting into that now, lots more technologies, and lots more companies, sometimes, behind those technologies. Ways of either getting data into a streaming platform or processing it whilst it's there. So that's definitely one of the trends.
It definitely feels like it's starting to go somewhat mainstream, as not replacing batch, which some people would like to do, but certainly as part of a nutritious breakfast as they used to say.
Well, Kris you had your panel, right? So you know-
We did that panel at Current, people chewing over the discussion: "Is streaming the future? Is it part of the future? Is it a fad?" And there was definitely the sense that we're going to converge on something where streaming is part of everything we do.
Yeah, and I think it's definitely part of it. I'm going to be really interested to see how big a part that is. I was watching just the other day, Benn Stancil did a really great talk at Current, I think it was. It's called Big Data is Dead, or The End of Big Data, something like that. And in the last part of the talk, I think it starts at about 24 minutes in, he's talking about why streaming doesn't have widespread adoption. And his general point was, "Well, once it becomes boring and it's like, 'Well, I may as well stream than not,' that's quite useful." But a lot of the use cases, like realtime fraud detection or notifications, not that many people need them, particularly if you're looking at it with a data engineering slant. He put it in the context of the technologies and the conference: it's all about the technologies.
Like I've just said, like, "Is it this one or is it that one?" And people go to the conferences and talk about the technologies. Once that bit becomes just a given, it's like, "Well, it's as easy to stream it, so we may as well just use streaming instead of the batch option, because there are benefits, like your reports show more up-to-date data." Right? That's a useful thing.
It's not necessarily as useful, or it's not necessarily worth the effort of adopting a streaming thing, if you need to understand all of these things behind it.
Yeah, it does need to get to the point where it's boring infrastructure in the best possible way.
Danica, you've been keeping a close eye on the KIPs this year. Anything?
More or less.
Because you're the queen of our release notes these days. Is there anything in there that's helping it to become more mainstream and boring in a good way?
Ooh. Yeah, that's a great way to phrase that. Yeah, honestly I like that it's not boring yet, because my background in using Kafka was like every project we did was, "How do we make this the most complicated thing ever? And let's dive into it and figure out all of the nuances of everything." We were pushing the limits of Kafka Streams, which was just so much fun. But yeah, on that note, I was trying to use interactive queries quite a bit for Kafka Streams. And even a couple of years ago that was a great feature. You can access the information from your state stores and hit it with a REST endpoint, and do whatever you want to do with that information. But it was still kind of something you had to finesse. And so there have been a couple of KIPs coming out over the last year or so that just make interactive queries better. So we're at version two of those, and fetching a specific range of information. It just makes it easier for the average user, or maybe someone who wants Kafka to be boring, right-
... to get access to that information from a relatively complicated application. So there are a lot of KIPs like that making it boring, which is really good for the wider range of users that we want to be using Kafka, right?
Absolutely. And it frees us up to do the more interesting stuff. We've had some really interesting, what I would've thought were fringe use cases this year. Particularly, I'm thinking of on the podcast we had Simon Aubrey, who I'm sure you both know, talking about using AI to track animals in his back garden in real time. And that was really cool.
I read that blog, that was brilliant.
Yeah. It's like, "I'm getting an alert because there's an 89% chance there's a koala wandering through the backyard."
Did he show a picture of his backyard? I'm very curious. Is that something that we're at risk for, like a koala just wandering through his backyard? That's-
Not in Yorkshire-
... a question-
I'm pretty sure.
... for additional podcast.
I think there's a larger risk of a giant poisonous spider wandering through his backyard, but that's also worth having realtime data about, right?
Yeah. But meanwhile, again, I don't want to use the word boring as a pejorative, but we've had some of the more "living with it" type topics, like we had a podcast with Nikoleta Verbeck on the things you need to know once you've got past the basics.
Right. And I know you did a blog series-
... with every- [inaudible 00:11:17]
... so many of them.
So many of them. Yeah, that podcast was so, I'd say, inspiring, just because of my experience with Kafka. Because you wish that you knew ahead of time not all the things that can go wrong, but all the things that you might just miss, right? Because it's a pretty-
... complex system, for better or for worse. And so we got together and put together a blog series that we hope to continue indefinitely, because there is just that much information to cover. But really taking people back to just the beginning, the absolute beginning. And like, "Oh, if you're encountering an issue with Kafka, well, I'm not just going to give you a band-aid," because, well, most of what we want is, "How do I fix this specific issue that I encountered right now?" But when we do that, and I've been guilty of that, "I just want this fix right now. Help me," I don't understand what caused that problem, or if there's something else that I'm doing in my system that made this issue arise. So that's what the blog series walks through: "All right. How do we take everyone back to the beginning and actually understand the nuances of the system, and some configurations that we may be playing around with, not knowing that they can cause issues down the road?"
And maybe that will make things more boring as well. So that's the theme. Let's make Kafka boring.
Yeah. We might have to try and rephrase this for the title of the podcast, but it's like-
I don't know if that's going to fly.
We're reaching that point where you can build a producer and a consumer and get real time data, and think, "This is great. Suddenly my data is being processed in real time." And then to pick on a victim, linger.ms, you set linger.ms to 100 instead of whatever the default was, and things get a bit better. And you think, "Oh, great." But you've got to start developing that deeper mental model of what's going on in the way we probably have as an industry, quite a deep mental model of relational databases without realizing it. You know.
Mm-hmm. And with things like linger.ms, as soon as you start playing around with those levers, they're not just in isolation. Right?
That's going to affect 14 other configurations that you've never heard of and you never wanted to deal with.
Yeah. And you have to have the mental model of what it's doing, because otherwise you're just randomly setting numbers and hoping. Yeah.
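To make the linger.ms example concrete, here's a toy model we've added for illustration; it is not the real producer internals, just the rule of thumb that a batch is flushed when it either fills `batch.size` or its `linger.ms` timer expires, whichever comes first. All the numbers below are illustrative, not Kafka defaults.

```python
def batch_flushes(arrival_times_ms, record_bytes, linger_ms, batch_size):
    """Toy model: return the times at which producer batches are flushed.

    A batch flushes when it reaches batch_size bytes, or when linger_ms
    has elapsed since its first record - whichever happens first.
    """
    flushes = []
    batch_bytes = 0
    batch_open_ms = None
    for t in arrival_times_ms:
        # If a batch is open and its linger timer has expired, flush it first.
        if batch_open_ms is not None and t - batch_open_ms >= linger_ms:
            flushes.append(batch_open_ms + linger_ms)
            batch_bytes, batch_open_ms = 0, None
        if batch_open_ms is None:
            batch_open_ms = t  # first record opens a new batch
        batch_bytes += record_bytes
        if batch_bytes >= batch_size:
            flushes.append(t)  # batch filled: flush immediately
            batch_bytes, batch_open_ms = 0, None
    if batch_open_ms is not None:
        flushes.append(batch_open_ms + linger_ms)  # final partial batch
    return flushes


steady = [0, 10, 20, 30, 40]  # five small records, 10 ms apart
assert len(batch_flushes(steady, 100, 0, 16384)) == 5    # linger.ms=0: no coalescing
assert len(batch_flushes(steady, 100, 100, 16384)) == 1  # linger.ms=100: one batch
```

Which is exactly the "things get a bit better" effect Kris describes: raising linger.ms coalesces small records into fewer, larger requests, at the cost of latency, and in the real producer it interacts further with batch.size, compression, and buffer memory.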
Well, there's that great quotation, isn't there, about "randomly jiggling things until they unbreak." I can't remember who said it, but it's this idea that you actually have to understand, from the bottom up, what's actually going on. You can't just find the top hit on Google, make that change, and then hope that things will just fix themselves.
But usually that top hit on Google fixes it. And then we just move on with our lives.
Yeah, but you always end up-
And then it breaks two weeks-
... with a better longer-
... term fix if you understand just one step below the abstraction. Right?
Have you seen any good talks about that, Robin, the getting past Basecamp into stage two of Kafka adoption?
I'm not sure if it's what you're asking. I saw a great talk just last week from Brendan Gregg, I think from 2012, about-
... performance analysis.
So nothing to do with Kafka, but about troubleshooting performance problems. And it was fantastic. And it's funny, 2012. God, that sounds ancient. In technology years, it's like, "Surely that's out of date." But it is as relevant now as it was then, and it always will be, just talking about how you take a methodical approach to troubleshooting performance, and going through some of these great anti-patterns. There was the streetlight anti-pattern, and this idea that you hit a problem and you obsess over the things that you can examine. So you pull up a web GUI, and it's like, "This thing has got this spike. Oh, wow. Here's this spike. I need to go and look at that."
And the story that goes with it is there's a drunk on a pavement, he's looking for his keys underneath the streetlight. And the policeman comes along, he says, "Oh, can I help you, sir?" So he's like, "I've lost my keys." The policeman says, "Oh, did you drop them here?" He says, "No, I didn't drop them here, but it's the best light here." So it's this idea that you don't want to be looking underneath the streetlight. You need to actually understand what it is. And it's a really, really good talk. I'll make sure we put it in the show notes.
Yeah. But sometimes it's so hard to know. If, to stretch the metaphor, there's only one streetlight in the area, that's the only place you really can look. It's often hard to know what you should be looking at under the hood, like Nikoleta's-
... list of tuning parameters, you wouldn't know to go and look for those unless you had someone guiding you, right?
Well, I disagree. I think you need to-
... understand. Okay, so advertised listeners in Kafka, let's pull it back on topic, as it were. And that's one of those things, in case people listening to this haven't come across it: it's to do with how Kafka brokers communicate with the client, and you have to configure it correctly, particularly on things like Docker, or if you're on EC2 or whatever, because otherwise things don't work, or they work a little bit and then they stop. And it's really confusing. And it comes up again and again over the years on Stack Overflow and everywhere. And there'll be answers like, "Well, just hard-code your hosts file." Which is a horrible approach, but it kind of works, until it doesn't when you move into other environments. You have to understand what's going on.
And this is one of those things I get a little bit non-negotiable about, or whatever the word is, because whatever you're working with, you have to understand it enough to be able to troubleshoot it. And that's sometimes a bit idealistic; sometimes people have just got a deadline, and they need to get the thing written, and it's like, "Just give me the answer."
But if you're serious about understanding technology, you need to understand, like, "Well, if you've hit this error, what does the error mean?" And go and look up how the listeners work in Kafka. And I wrote a blog about it. And it gets a ton of hits, because the actual documentation in Kafka literally explains it, but it doesn't explain it enough. So it's, I guess, a gap in the docs. But that's why I like blogging. You find something that's not documented so well, and, "I've got to write a blog about it, and that'll help a bunch of other people hitting that problem."
And it's the same thing with performance and everything else. It's not like if you don't know it, then you're daft. But I do personally feel that if you're going to work on a technology seriously, you do need to understand under the covers, the next level down. Like, "Okay, so what is the end-to-end flow here?" Or, "When it does this, which bit happens next?" And that's the beauty of open source. You can actually go and look. And if you can't figure it out, there's a great community out there who will help you figure it through.
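For readers who haven't met the advertised listeners problem Robin describes, here is a small sketch we've added of the classic two-listener Docker setup. The listener names, hostnames, and ports are illustrative choices, not from the episode; the point is that clients reconnect to whatever address the broker *advertises*, so that address must be reachable from where the client runs, not just from inside the Docker network.

```python
def broker_listener_config(internal_host: str, external_host: str,
                           internal_port: int = 29092,
                           external_port: int = 9092) -> dict:
    """Build the listener-related broker settings as server.properties entries.

    INTERNAL is for other containers on the Docker network;
    EXTERNAL is advertised back to clients on the host machine.
    """
    return {
        # What the broker binds to (0.0.0.0 = all interfaces):
        "listeners": f"INTERNAL://0.0.0.0:{internal_port},"
                     f"EXTERNAL://0.0.0.0:{external_port}",
        # What the broker tells clients to connect back to:
        "advertised.listeners": f"INTERNAL://{internal_host}:{internal_port},"
                                f"EXTERNAL://{external_host}:{external_port}",
        # Map each named listener to a security protocol:
        "listener.security.protocol.map": "INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT",
        "inter.broker.listener.name": "INTERNAL",
    }


cfg = broker_listener_config("broker", "localhost")
# Containers bootstrap against broker:29092; host clients use localhost:9092.
assert cfg["advertised.listeners"] == "INTERNAL://broker:29092,EXTERNAL://localhost:9092"
```

Hard-coding the hosts file, the Stack Overflow answer Robin mentions, papers over exactly this: it makes the wrongly advertised name resolve, rather than advertising a name that was reachable in the first place.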
Yeah, sometimes I think you have to dig until you find the fix. But to future proof yourself, you want to keep digging until you really understand why the fix fixed it.
And sometimes you can't find the fix until you understand it. But we've all been guilty of just setting it to the magic number that seems to work.
Yeah. But another trend: there's the technical trend of getting to stage two, but I've definitely noticed the social trend of streaming technology maturing, which is that you start to get companies now, Data Mesh is the buzz term, thinking about treating their data as something that you talk to other departments with. Instead of talking to other departments with APIs, we're talking to each other with data now. And that's definitely becoming a bigger trend. I don't know, I'm going to put Danica on the spot, there.
Yeah, because you've been speaking to customers a lot, you've been out there, you've been dealing with KIPs. Do you think this is going to be part of the journey going forward?
Absolutely. Okay. So most of the conversations that come up around that, especially with customers and just people out in the field, it's always folks that are like, "Oh, this seems intriguing, Data Mesh. It's going to solve all of our problems. This is the next best thing." That's what happens with buzzwords. And then asking questions about how to actually implement it. And of course, we have a Data Mesh demo, and, "Here, this is exactly how you might implement it and use it." But as we get deeper into the conversations, like, "What are you actually getting out of Data Mesh?" Treating your data as a product and changing how you communicate with other areas of your company. And this is my personal belief: it's a set of best practices, right? You don't have to-
... implement it in any one certain way. It's just understanding how to operate in this data-driven space in a more responsible way. And when you get to that point, it's like, "Okay, is the fix Data Mesh? Or can we dig a little deeper and understand the best practices that are actually supporting Data Mesh, and just start to work with those, instead of just focusing on this buzzword as the fix?" Right?
So that actually comes pretty neatly full circle in that conversation, because that's what I've been pushing for. And it's really cool that in having conversations with people, they come to that on their own in the middle of the conversation. Like, "Oh, wait, okay. We can just focus on those couple things and if we do that we're going to get to that point." And I like that people are excited about that because those best practices are good, and they are best. Right?
It's always nice when the best practices actually are best, and they're not too complicated, and they make sense, right?
They're intuitive. And so I think that is going to be part of the journey moving forward. Because if you want to use your data efficiently, I think that's a really big part of it.
Yeah. You say best practices around solutions. I almost think that Data Mesh is partly a solution, but there's no codified way of doing it. It's partly a way of codifying what shape your problems look like. You have a problem that if you can't deliver data to people, it's going to be more difficult for them to cooperate without coordinating.
You have a problem that if you are publishing data to them, then you need some kind of SLA for that. You need some kind of guarantees about quality. And it's almost, is Data Mesh a solution or a description of the trouble you've got ahead unless you solve it?
Maybe it's just the streetlight. Is Data Mesh the streetlight?
There's your title for the podcast.
My new talk with Adam Bellemare.
Is Data Mesh the Streetlight?
He's got a book out. Anyone get a signed copy of that book at Current?
Missed that one, alas.
The line was-
There was a big line, too, for it, though, wasn't there?
There was. Yeah, it's a hot topic.
Man. I'm just going to set up my own line at the next conference, set up my own table. I don't know what book I'm going to sign, but it's not mine. I don't have one. But-
Signing people's house plants.
"Here. Here's a leaf from my plant."
That's another one of my favorite things that I've seen a bit of, again referencing Simon Aubrey but also you and other people. IoT related fun stuff. I know you have a future IoT related project. Tell me about that and how people have reacted to your plant project over the year.
Oh, man. I didn't expect people to be that excited about it. First year as a DA, I was just trying to find an interesting project, one that I find interesting, because then that's going to be a driving factor for me. I want to build this, selfishly, and it's going to be useful. And I think it checks a lot of boxes as far as what people should understand when they're trying to build an end-to-end pipeline. So it's been really, really well received. I think people are excited that it's a fun way to learn the end-to-end pipeline. And then inevitably, at every conference or speaking engagement, I have at least one person come up and say, "Okay, how do I build this in my house? Because I need it." Or like, "My girlfriend has too many plants, how do I build this for her?" and stuff.
Like, "All right, let's do it." It's so much fun to me that people are that excited about it, that they want to build this, and even if they haven't used Kafka before, they're inspired to do that or something similar. And I think I've got a whole backlog of random IoT projects that I want to build out now because, well, it was easy. It was pretty simple to integrate, and there are so many random things that you can put together.
Yeah, it's fun to do stuff you can actually put your hands on when we're dealing with a lot of abstract ideas. And it's fun to scratch your own itch in a way. This is completely off topic, but one thing I found this year that's really interesting me is have you heard of a program called PICO-8?
You remember the age of eight bit consoles, like Nintendo Entertainment System and the-
... Game Boy. PICO-8 is a development environment for writing those, for a fantasy console that never really existed. But it's this wonderful, tiny idea that you can draw sprites, and pixel maps, and compose music. If you have kids-
... start them on it to learn. It's based on Lua, and you can start them coding with a complete build-your-own-games environment. And the one thing it lacks, which galls me: there are no network primitives, so I can't connect it to Kafka, because I really want to do a-
That would make an amazing talk, right?
Wouldn't it? Like fantasy console game browsing-
... your topics or something. That'd be awesome.
Yeah, that would be-
I- [inaudible 00:25:05]
... super cool connect to your choose your adventure game. That would've been super cool.
One thing I meant to ask you, Robin, for ages, you had a matrix display on your back wall, like IoT thing.
Did you ever connect that to Kafka?
No, it's called, it's a Divoom, D-I-V, double O-M. And it didn't actually have an interface that I could find that I could use.
I don't know if there's an unofficial API somewhere. But you had the phone app, and you could draw a sprite or find one, and then send it to it.
So I guess you'd have to-
There's no SDK?
... sniff something. Not that I could find, no. Maybe there is and that would be fun. But yeah, definitely, that would be fun.
Because I have an Adafruit Matrix Portal, which I've been trying to do something with, but it runs a version of Python. And this is getting onto a trend I'd like to see in the coming year. It runs Python. I can't get a Kafka client onto it, because the Kafka Python library depends on librdkafka via C bindings, and I can't get the C library on there because it ... You know.
Too much pain. And it's like, I would like to start seeing next year, this is my hope for the future, that we start seeing more Kafka clients and more language libraries that aren't necessarily dependent on the underlying C library that we publish.
Did you try the REST API?
That's my backup plan. So far I've been struggling with SSL.
I keep chipping away.
That's another fun thing.
Yeah. SSL on IoT devices, that can be a lot of fun.
That was one of the things that took me a week to set up on my [inaudible 00:26:52]. I remember just like, "All right, Google, tell me the fix." Yeah, that was just one of those things again. So good stuff.
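As an aside for readers, the REST API backup plan Kris mentions can be sketched with nothing but the Python standard library. This assumes the Confluent REST Proxy v2 produce endpoint (POST to /topics/{name} with an application/vnd.kafka.json.v2+json body wrapping a records array); the host, port, topic name, and payload below are our illustrative choices, not from the episode.

```python
import json
import urllib.request


def build_produce_request(base_url: str, topic: str, value: dict) -> urllib.request.Request:
    """Build (but don't send) a REST Proxy v2 produce request.

    The v2 API wraps messages in a {"records": [...]} envelope and
    expects an embedded-format content type (JSON here), so any device
    that can make an HTTPS POST can produce without librdkafka.
    """
    body = json.dumps({"records": [{"value": value}]}).encode()
    return urllib.request.Request(
        url=f"{base_url}/topics/{topic}",
        data=body,
        method="POST",
        headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    )


req = build_produce_request("http://localhost:8082", "plant-readings",
                            {"moisture": 0.42})
# urllib.request.urlopen(req) would actually send it (needs a running proxy,
# and on an IoT device, working SSL - which is the painful part Kris hit).
```

The trade-off is the usual one: you lose batching, partitioner control, and most client tuning, but the dependency footprint shrinks to an HTTP stack the device already has.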
But that leads me to where I want to go, just finally: that's my hope for next year. Broader, more diverse client libraries. Getting away from this idea that Kafka is largely a Java shop, which is nonsense, but there is that-
... perception. What in your fantasy Kafka 2023 streaming world, do you hope to see, Robin?
Oh, like you say, more and better support for other languages. It's always been a problem. It's getting better, but there's a huge amount more to do. Probably better connectivity, just general open source, like [inaudible 00:27:46] fantastic example, more stuff like that. More connectivity in that area. But in general, it's not a Java thing. It is a Java thing, but once you've got managed services, who cares what the broker's running in, right?
Let's make sure all the [inaudible 00:28:04] clients can use that in whatever people want to code in.
Them. Yeah. Postgres is-
... an NFC thing, right?
... Haskell, Lua, or something like that.
Haskell. Yes. I'd like to see Jay Kreps announce proper support for Haskell at Current '23. Jay, if you're listening, I'm prepared to help out with that effort.
Dream big. Danica, what's your hope for '23?
Oh, man. Can you repeat the question? What was it, the wider picture? Was it for streaming, or for Kafka, or for everything?
What's my hopes-
For this world we-
... and dreams?
... are in, what innovation do you hope that we'll see? What's the main thing-
... you're crying out for, that you think might happen next year?
Oh, man. No. Oh, man.
It's the end of-
... the year.
... think it needs to be a wider innovation. No. You two covered that. What I want for my end of year, my winter gift that just keeps on giving for the next year or so, and what I'm trying to do with the blog posts that I write from now on, is just: teach a man to fish. No more band-aid fixes. I want as many people as possible to understand the nuances, the levers that they're pulling, for Kafka or whatever project they're building. I want them to understand. Even if it's only one person, that will make me happy.
Right. Moving away from-
That's what I want.
... enough rope to hang yourself into the enlightenment era of Kafka.
Yes. That's what we're hitting. Yes. The enlightenment, the streaming enlightenment. That's it.
That could be the catchy subtitle for Kafka Summit London, which will be the next time we all meet together, I think.
I'm so excited, looking forward.
It'll be here before we know it.
Yeah. Well, good luck with the program committee, Danica.
Robin, pass the torch successfully.
And thanks very much. We'll speak to you again soon.
Danica, Robin, thank you very much. I'm looking forward to seeing what they both do in the year to come, and I'm quite looking forward to some new guests telling me what they're doing in 2023. So if you are one of them, drop us a line. I'd like to hear from you. As ever, Streaming Audio is brought to you by Confluent Developer, so if you're spending this festive period taking a break to work on a pet project or brushing up on your technical knowledge in the downtime, then developer.confluent.io is there for you as a resource to teach you everything we know about streaming technology. But until next year, it remains for me to thank our guests, Danica Fine and Robin Moffatt for joining us today, and you for listening to our technical explorations all this year. I've been your host, Kris Jenkins, and I will catch you in '23.
The past year saw new trends emerge in the world of data streaming technologies, as well as some unexpected and novel use cases for Apache Kafka®. New reflections on the future of stream processing and when companies should adopt microservice architecture inspired several talks at this year’s industry conferences. In this episode, Kris is joined by his colleagues Danica Fine, Senior Developer Advocate, and Robin Moffatt, Principal Developer Advocate, for an end-of-year roundtable on this year’s developments and what they want to see in the year to come.
Robin and Danica kick things off with a discussion of the year’s memorable conferences. Talk submissions for Kafka Summit London and Current 2022 featured noticeably more varied topics than in previous years, with fewer talks focused on the basics of Kafka implementation. Many abstracts featured interesting and unusual use cases, in addition to detailed explanations of what went wrong and how others could avoid the same issues.
The conferences also made clear that a lot of companies are adopting or considering stream-processing solutions. Are we close to a future where streaming is a part of everything we do? Is there anything helping streaming become more mainstream? Will stream processing replace batch?
On the other hand, a lot of in-demand talks focused on the importance of understanding the best practices supporting data mesh and understanding the nuances of the system and configurations. Danica identifies this as her big hope for next year: No more Kafka developers pursuing quick fixes. “No more band aid fixes. I want as many people as possible to understand the nuances of the levers that they're pulling for Kafka, whatever project they're building.”
Kris and Robin agree that what will make them happy in 2023 is seeing broader, more diverse client libraries for Kafka. “Getting away from this idea that Kafka is largely a Java shop, which is nonsense, but there is that perception.”
Streaming Audio returns in January 2023.
If there's something you want to know about Apache Kafka, Confluent, or event streaming, please send us an email with your question and we'll hope to answer it on the next episode of Ask Confluent.