Financial services institutions, or banks, as laypeople call them, have been early adopters of Kafka. My co-worker Fotios Filacouris has been working in financial services his whole career. He currently works as a solutions engineer, helping banks figure out how to build things out of Kafka and Confluent. We talk to him today about a few key use cases he's seen emerge in that sector. And we also riff a little bit about the profession of being a solutions engineer. It's a pretty interesting conversation.
Before we get to it, I should note that Streaming Audio is brought to you by Confluent Developer, that's developer.confluent.io. Our intent for Confluent Developer is that it be a place on the web that has everything you need to get started using Kafka and Confluent. We've got video courses, we've got executable tutorials, a patterns library. There's an index of episodes of this podcast, all kinds of things. You should check it out. And if you do any of the tutorials and you have to sign up on Confluent Cloud, a good thing for you: we don't want you to have to spend money to do that learning. So you can use the code PODCAST100 when you sign up to get an extra hundred dollars of free credit. Check it out. But now, let's get to the episode.
Hello and welcome to another episode of Streaming Audio. I am your host, Tim Berglund, and I'm joined in the virtual studio today by Fotios Filacouris. Fotios is a senior solutions engineer here at Confluent, which means he works with people who are trying to build things, trying to figure out what it is they can build, what's going to work, and what's not. He works with them on a technical level. And Foti, well, welcome to the show.
Thank you for having me. It's a pleasure. Long-time listener, Tim. First-time caller.
You got it. Nice. I like it. I'm glad. Glad you've been a listener. I mean, you happen to work with customers and prospects in finance, and that's what I want to talk about today, which is exciting to me, because customer meetings were a big percentage of my job back in the day, when I was on the road all the time. That was a big thing: you'd go to a conference and you'd go meet with customers. And I feel like that's maybe coming back soon.
But financial services folks have always been some of my favorite conversations because I don't know, the problems are just cool and they tend to be fairly advanced in their thinking. So I want to dig into stuff you've found, like what are people building and how are people solving problems. But first, tell us about what you do and how you got to do that thing.
Absolutely. Yeah. So I've been working in data infrastructure, I guess, for the past six, seven years. So I've held kind of every job you possibly could within the IT world. I started working in tech support. Then I became an implementation consultant, traveled the world. And then I had the tremendous opportunity to actually become a solutions engineer. Someone gave me a shot, as does happen within one's career. And they said, "Hey, Foti, are you interested in actually talking with a bunch of different people about a bunch of different challenging problems that they have within their business?" I said, "Yeah, great. That sounds amazing." And the rest was history, as they say.
So as I live in New York here, I've been working in financial services for the better part of my career.
Yeah, absolutely. Yeah. And I've been working with Apache products for the past, I would say, about five years. So prior to working with Kafka, I worked with architectures around another Apache project called Apache Ignite. That led me to speak at a bunch of Java meetups worldwide and a bunch of Apache meetups, so I've spoken at a bunch of Apache Kafka meetups, and banking is my life, as they say. So thanks for having me here today.
You got it. And I've been in the Kafka world and with Confluent for a little more than four years, but I got seven years in the Apache world. So I'm kind of like you, prior to this, there was another Apache project and they tend to do well with, well, a lot of things, but the big infrastructure stuff tends to gravitate there.
So yeah, I just want to pick your brain about what people are doing with Kafka in financial services and what problems they're solving. What looks super good and is easy and enables whole new vistas of awesomeness, and what's hard to do. And let's just kind of talk about Kafka in FinServ or FinTech.
Sure. Yeah. So I think, initially, when I started working exclusively with Kafka, and even prior to that, in all the architectures that I worked on with customers and partners, Kafka was somewhere in there. This was over three years ago. So you could see that Kafka had a critical place in data infrastructure and architectures. But once I actually came to Confluent and started working with customers directly, I think the growing trend that we saw was messaging modernization. People in financial services viewed Kafka as this pipe where we could connect things from point A to point B and do it scalably.
Like a message queue, but a really big message queue.
But a giant one, right? Exactly.
What would be better than the message queues that we have? A much bigger message queue.
Yeah. That's very American of you.
Captures the spirit of not just the United States, but also the finance world.
Correct. Yes. And I think what we were seeing initially is that things like IBM MQ, your TIBCOs, and traditional JMS messaging that's been out there in the wild for close to a decade or more now, they were running into some scalability issues and some problems. And just the way that infrastructure works, and how Kafka is distributed infrastructure, there are many advantages there. They could push higher throughput, and the status quo was: I have a new project, or I have new data sources that are much more easily integrated with Kafka, so I'm going to move away from something like a JMS. So that was the world as I knew it, at least when I initially started working with Kafka.
And things progressed. Companies obviously discovered new and different ways of implementing Kafka, using things like event stores, with something like Kafka Streams and things like ksqlDB, and starting to make meaningful insights with data in real time directly within Kafka. So no longer was Kafka just a pipe. It was an intelligent pipe, where I could move things intelligently and get meaningful insights out of that data. But the real innovation, at least in my perspective, has been recently with the COVID-19 pandemic. And I say that because of digital modernization, that buzzword that's out there in the world. Banks have actually had to adopt that increasingly, because their number of impressions and transactions from a digital perspective has increased exponentially over the course of the pandemic.
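That "intelligent pipe" idea, deriving insight as data flows rather than just moving it, is what Kafka Streams and ksqlDB provide. As a rough illustration only (an in-memory Python sketch, not the actual Kafka Streams API; the event shape and names are made up), a continuously maintained aggregation over a stream of payment events looks conceptually like this:

```python
from collections import defaultdict

def aggregate_totals(events):
    """Consume a stream of payment events (finite here, unbounded in real
    life) and maintain a running total per account -- the kind of
    continuously updated materialized view ksqlDB or Kafka Streams keeps."""
    totals = defaultdict(float)
    for event in events:
        totals[event["account"]] += event["amount"]
        # In a real streaming app, each update would be emitted downstream
        # (e.g. to a changelog topic), not just held in memory.
    return dict(totals)

stream = [
    {"account": "acct-1", "amount": 40.0},
    {"account": "acct-2", "amount": 15.5},
    {"account": "acct-1", "amount": 9.5},
]
totals = aggregate_totals(stream)  # {'acct-1': 49.5, 'acct-2': 15.5}
```

The point of the sketch is the shape of the computation: state updated per record as it arrives, with results available in real time rather than after a batch job.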
Really? Okay. I guess people used to do retail banking and now they didn't, is that?
Well, I think the idea is now that a lot more people were doing digital and retail banking. So on the retail side of the house, you had e-commerce and those types of transactions went up exponentially during the pandemic.
Oh, sure. Okay. And all those payments got to get cleared.
Absolutely. Yeah. And so you just touched upon maybe three or four different sides of the bank that needed to scale tremendously. Right? The retail side, web presence side, payments on the backend, clearing payments and settlements, all of that stuff needed to be kind of revolutionized a bit. And tremendous investment has been pushed forward over the course of the past year and a half to help drive some of that innovation.
Nice. So let's dig into those. Pick your favorite and let's talk about how Kafka has impacted it, because again, I've talked through some of these stories and I just love them. So I want them from a guy who's boots on the ground all the time and knows the world.
Yeah. Yeah. So the first thing, I mean, that's been pretty interesting has been on the retail side of the bank. So I work with a couple of customers here in the Northeast that have completely transformed their whole retail banking built on top of Kafka. So one of the, well, I'll give you a perfect example. One of my partners initially, my customer here, decided, "Hey, you know what'd be a great idea? If we stored all of our data permanently, indefinitely, within Kafka."
Yes. Yes. That would be a great idea.
Well, for you and me, that sounds great. But for-
I don't mean just commercially. I just mean that architecturally, yeah. That's where the magic is.
But if Kafka is a giant pipe, why would anyone ever do that?
Yeah. Yeah. You don't store things in pipes, you flow things through pipes.
Correct. Right? And it was the perfect example, a resounding type of use case that made me understand why they wanted to do this. So if you're scrolling through your online banking transactions on your phone, and you're actually trying to look through the last 10 days' worth of your transactions, you scroll and you scroll and you scroll, and eventually you get to a point where you can't scroll anymore, because that data has been offloaded into some cold storage or some data lake or something like that. And from a user experience perspective, I don't want to have to go and download a statement to see what I actually purchased. I want to look back in time as far as I can, and I want to be able to search all of that data and index it and say, "Hey, when's the last time I went to Starbucks in October?"
And now, magically, keeping all of that data within Kafka, they were actually able to do that. They were able to seek back to the beginning of that topic, which was their transaction log directly within their phone and say, Hey, there it is. I don't need to go download my statement. It sounds trivial, but it's actually something that they measured from a user perspective that was tremendously, tremendously popular.
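Conceptually, "seeking back to the beginning of the topic" is just a consumer resetting its offset to zero and re-reading the log. Here is a minimal in-memory sketch of that idea (the class and record fields are invented for illustration; a real Kafka partition would be read with a consumer's seek-to-beginning, not a Python list):

```python
class TransactionLog:
    """Stand-in for a Kafka topic partition with indefinite retention:
    an append-only log addressed by offset."""
    def __init__(self):
        self.records = []

    def append(self, record):
        self.records.append(record)

    def read_from(self, offset):
        # A consumer that seeks to offset 0 replays the entire history.
        return self.records[offset:]

log = TransactionLog()
log.append({"month": "2021-09", "merchant": "Starbucks", "amount": 4.50})
log.append({"month": "2021-10", "merchant": "Grocery", "amount": 62.10})
log.append({"month": "2021-10", "merchant": "Starbucks", "amount": 5.25})

# "When's the last time I went to Starbucks in October?"
hits = [r for r in log.read_from(0)
        if r["merchant"] == "Starbucks" and r["month"] == "2021-10"]
last_visit = hits[-1]
```

Because nothing is ever deleted from the log, the full transaction history stays addressable, which is exactly why the bank no longer needs the "download an old statement" workflow.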
So we started seeing things like that. We started seeing innovations, one of our banks wanting to build everything on top of Kafka. Another one of our banks and banking partners here said, "Hey, you know what would be great? If we actually deployed Kafka for our retail bank directly within the cloud."
Okay. Again, it seems like that idea has come up more than once, but I love it.
It sounds good. But if you've worked in banking, or if anyone's worked in banking, you know that the cloud is a taboo term in a sense. It's not somewhere that banks tend to go.
And I don't want to sidetrack you from your story. So I want to get back here, because this is a very forward-looking Kafka user and everything, but could you just tell everybody why that is? I'll propose a reason and maybe you can tell me that that reason's not true.
Banks don't look to the cloud because they are stodgy and conservative and can't countenance change in their technology stack. Is that the reason or is there?
No, it's not. Is it?
No, no. It's not the reason. Although a lot of people would like to paint banks in that shape.
It's fun. It's easy. It's not true. So what's the deal there?
A lot of it is compliance related and that's just the fact of the matter. I mean, when you're a highly regulated business and this isn't unique to banks, but it's certainly true for banks. I mean, you see this in other industries as well, healthcare, anything that requires HIPAA compliance or anything like that, if there is any potential that there's a breach in your data or that your data is exposed externally, you can get heavily fined. And that's what comes with regulation. There's an operating expense for not being secure. Does that make sense?
Yeah. So when it comes to banks, it's not, "Oh, banks are conservative, they don't want to move to the cloud." It's that banks don't want to get fined. They don't want to lose money, and they want to secure their customers' data the best way that they can. And it's taken some time for the cloud vendors to get there with some of their security, but that's a separate conversation. This one forward-looking customer, as you mentioned, said, "Hey, we're ready to go. We want to deploy Kafka for our whole retail bank and build it in the cloud." And if you looked back two years ago and asked yourself, is that humanly possible? I would have told you, Tim, no way. No way would any type of customer ever think about doing that within banking. And I've got to say, I can't help but feel like the elasticity that they get in the cloud, and how they could scale not only their infrastructure but also how they build a lot of their applications on top of Kafka, a lot of that was impacted by the pandemic and how quickly they needed to get that solution off the ground.
Got it. Got it. And, well, I mean, we don't dance around the fact that we kind of like Confluent Cloud. This is a Confluent podcast. It's completely fine. And those scalability properties are things that have been built into Confluent Cloud, with great difficulty. We've watched them unfold over the past three years, and it's not like, well, it's in the cloud, so it's elastic. No, people had to build things that were super hard to build to make it that magical. But yeah, that's there, and has been there in Confluent Cloud since the beginning of the pandemic.
By the way, I want to clarify something. When I was introducing you and what you do, solutions engineer, and we're talking about customers, there's a very commercial focus to the way you approach things. So I think many of you know what an SE is; this is just typical enterprise software company speak. So if you work in enterprise, you know this. If you don't, you might not, and there's no reason for you to know how this stuff works. So solutions engineer is the title. Informally, I often just say sales engineer, and nobody likes that, but.
No, it's slimy. It's a slimy term.
It's bad. Sales sounds bad. Sales is in fact not bad, because you have to help people solve problems and do things. And people don't usually spend money... you don't use guns; they do it voluntarily, because it helps them. But the role of an SE is to be the technical component of that process. So you come from an engineering background, and you help people who are customers, or who maybe want to become customers, figure out how to build stuff, because I said "figure out how to build stuff" earlier, but you work with customers. So if you're detecting this commercial orientation, well, you'd better believe it, because that's Foti's job: to help that stuff happen. That's what an SE does. So, anyway.
Yeah. And just as a little coda on that, I would say, I'm quite often in a position to tell people where I don't think Kafka works, which may sound like, hey, you're a sales guy, why would you do that? Well, I'm not a sales guy. I help a sales guy.
Technically. Even in banking, I mean, I'll tie it back to banking. We have customers that come back to us and say, "Hey, Kafka's a great thing. I want to use it for everything. And oh, by the way, I want to do all these great things like ad hoc analytics on top of my Kafka cluster." And I'd say to them, "Well, Kafka does indexing, but it doesn't really do indexing in the database sense. So if you want to do full table scans, if you're familiar with databases, every time you want to do some kind of ad hoc query analysis or ad hoc analytics, use a database."
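The indexing point can be pictured with a toy comparison (illustrative Python only, not either system's API): Kafka addresses records by offset, so an arbitrary ad hoc filter means scanning the log end to end, while a database maintains secondary indexes that jump straight to the matching rows.

```python
records = [{"id": i, "merchant": f"m{i % 10}"} for i in range(1000)]

# Kafka-style access: no secondary index, so an ad hoc filter is a full scan.
def full_scan(log, merchant):
    return [r for r in log if r["merchant"] == merchant]

# Database-style access: build a secondary index once, then look up directly.
index = {}
for r in records:
    index.setdefault(r["merchant"], []).append(r)

scan_hits = full_scan(records, "m3")   # walks all 1000 records
index_hits = index["m3"]               # jumps straight to the 100 matches
```

Both return the same records; the difference is the cost model, which is why ad hoc analytics belongs in a database (or an analytics store like Druid or Pinot, as discussed next) rather than on the Kafka cluster itself.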
Yeah. The physics of that activity are different. And even an analytics database, I mean, we've talked about Druid on this show before. Have we talked about Pinot? Okay. I've done a bunch of stuff with the Pinot folks, but I don't know if they've been on the podcast yet. I need to check.
One of my former colleagues works there, I can hook that up for you.
Oh, no, I've got the StarTree contacts. But the point is, there are these other things that become a part of an ecosystem with Kafka that do this, because the physics of that activity is different. And so you're going to say that.
And you have to be objective. There's no way that you can build architectures and really gain credibility with anyone that's technical if you tell them that.
That is so true.
Kafka is a hammer for your foot. No. You know what I mean?
Yeah, yeah. If it is a hammer, well then let's find nails and hammer them.
And I don't want to sidetrack on this for too long, but it's an interesting topic because number one, you exist to relate to the technical people who are the ones building the thing out of Kafka at this company who maybe wants to spend money or is spending money with Confluent. And as we know, as people who have spent careers building things, it's got to work. And if it doesn't, and if it's the wrong thing, you're still there. You're in that relationship and you're dealing with it.
And we look horrible. Right?
And your life is terrible. So yeah, nobody wants that. Not a sales guy, you help a sales guy. And...
Put that on my headstone, Tim. Yeah.
And the funny thing is, my interaction, and sort of my team's interaction, with customers when we do customer meetings is always to come alongside and support the SE, because we'll kind of blow in for a day and do a talk and get a group together. And I always say, when you have real questions, I'm gone tomorrow, and the SE's not. The SE is there to actually help you over the long term, and has to be able to deliver the goods, or everything falls apart. So, anyway. I don't remember where we were, but that excursus on solutions engineering, I think, is important, because it's a super cool role. And if you're a person who's been writing software and building systems for a while, and you like people, and you don't mind traveling, I mean, this is, I think, an awesome place for a career to go.
Yeah, absolutely. Yeah. I do a little mentoring at my alma mater, Stony Brook University, out here on Long Island, and I always recommend folks check out solutions engineering, because no one's ever heard of it until they get into enterprise software. And it's a cool way to deal with complex problems, with really, really interesting use cases. And yeah, I mean, it's definitely where a lot of my passion in technology resides, so.
All right. Back to, let's talk about other Kafka use cases. Enough of the meta discussion that I keep pulling us into.
Yeah. So we talked a lot about the retail banking side. We've talked about front-end customer interactions, how, at least in the trends that I've seen, a lot of banks needed to scale up incredibly quickly to deal with a lot of this increased demand and throughput and transactionality of their business. But then there's the middle office and the back office side to that component as well, specifically within payments, where we've seen tremendous interest. And a lot of innovation has gone into the payments side of banking, whether that be treasury payments, or modernizing how ACH payments get settled. There's a lot of really interesting stuff going on on the payments side. So I think that'd be maybe a natural place to take the conversation.
Let's do it, let's dive in. I love payments.
Okay. Yeah. Great. So a lot of legacy payment infrastructure has been built on top of your traditional messaging systems out there. And messaging about 10 years ago, when we talk about things like the IBM MQs and TIBCOs of the world, they did one thing really well: they guaranteed message delivery. And I think what a lot of the leadership that we've been working with now enjoys about Kafka is that they get kind of the best of both worlds. They get increased scalability working with Kafka. They get guarantees with exactly-once semantics within the Kafka world. And they get the storage layer for replayability.
And within payments, when you're working with something like your MQs of the world, not to diminish that technology, it was great and revolutionary at the time for middleware, but it doesn't have those same facets from a feature set perspective that folks are really looking for today. So for example, if I read a payment from a queue and I lose that payment, or my client fails, and I need to try and reread that or replay that somehow, how do I do that today?
Without Kafka. And I think that involves a lot of different technologies, that involves a lot of complexity on the clients and the way that they need to build their clients to handle situations where they would get duplicates, or they would have to go back to an originating system and request a file. And once that file gets generated, they have to parse that file and then do a diff to see, Hey, we actually received these payments, but we didn't receive these payments and backing all of that stuff out was kind of a nightmare. In steps Kafka.
And so, for example, I'll give you an example of a use case. So one of our customers has been forced, and I shouldn't say forced like it's a bad thing, but they've been forced by the innovation of a lot of new SaaS-based technologies, like the Ubers of the world, to expand into developing markets. Remote locations in Africa, for example, remote locations in Asia, that traditionally haven't been big markets. Now, with the introduction of technologies like Uber, and things like Amazon shipping there, we require payments and payment settlement locally within a region, within an actual country. And there are some actual compliance and regulatory requirements in certain countries that say, hey, you need to settle all of your transactions locally before we can do anything with them.
And folks are starting to say, well, I have all of my payment infrastructure that lives here in the United States or in Europe, but I need to get all that data that's been settled locally from a payment perspective in under a day, because Uber is coming out there, or some other large e-commerce giant is saying, hey, I need all of your transactions settled within a day.
Oh, okay. Wow. Okay. And they're a big enough chunk of the business that what are you going to do?
Demands have been made. Technology needs to follow. Right?
So in steps Kafka and Kafka now, and we'll talk about Kafka because I feel like we've been very Confluent centric, but Kafka is-
I mean, we talk about Kafka a lot on the show. I mean, you're cool, but go on.
Well, so some of the things that we provide, I guess, from Kafka in general, and that you see out there in the community, are things like MirrorMaker 2, which allow for mechanisms of replicating data. So we can de-identify some of the payment information that's been settled locally in a remote location in Africa or in Asia, and roll up those settlements, through some method of replication, to a centralized payment infrastructure that's living somewhere in EMEA or Europe or somewhere in the United States.
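A rough sketch of that roll-up pattern: copy settled payments from a local-region topic to a central one while stripping personally identifying fields, so only settlement data leaves the region. This is the idea, not the MirrorMaker 2 API, and the field names are made up; in practice the scrubbing would live in a replication transform or an intermediate processing step.

```python
PII_FIELDS = {"account_holder", "national_id"}

def replicate_deidentified(source_topic, target_topic):
    """Copy every record from the local 'topic' to the central one,
    dropping PII fields so only settlement data crosses the border."""
    for record in source_topic:
        scrubbed = {k: v for k, v in record.items() if k not in PII_FIELDS}
        target_topic.append(scrubbed)

local_settlements = [
    {"txn_id": "t-1", "amount": 120.0,
     "account_holder": "A. Person", "national_id": "XX-123"},
]
central = []
replicate_deidentified(local_settlements, central)
# central now holds only {"txn_id": "t-1", "amount": 120.0}
```

The design point is that compliance is enforced at the replication boundary: records settle in full locally, and only the de-identified projection is ever replicated out.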
Now, where Confluent comes into play is, obviously, we provide, and are in the process of providing, things like multi-region clusters and cluster linking, some really cool things within our architecture that are allowing us to almost create this notion of namespacing within Kafka, which I think is kind of revolutionary, to a degree, of saying these are global payment topics, these are local payment topics. I can define all of them at the topic layer and send data where it needs to go, without impacting things like GDPR and the local types of regulatory and compliance settlement laws that exist in some of these municipalities and local governments. So really cool stuff that's going on there.
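That namespacing notion can be pictured as routing by topic name: topics under a global namespace are eligible to replicate across regions, while local topics never leave their cluster. The naming convention below is purely hypothetical, a sketch of the policy, not any Confluent feature's actual configuration:

```python
def should_replicate(topic_name):
    """Replicate only topics explicitly marked global; everything under
    the 'local.' prefix stays in-region to satisfy settlement rules."""
    return topic_name.startswith("global.")

topics = ["global.payments.settled", "local.payments.raw", "local.audit"]
replicated = [t for t in topics if should_replicate(t)]
```

Defining the boundary at the topic layer means the compliance decision is declared once, in the naming scheme, rather than re-implemented in every producing or consuming application.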
Now, maybe wrapping up kind of a top level question, I'm thinking of this thing Winston Churchill said, he said a lot of things, but he said, "We shape our buildings and then our buildings shape us" and you've got demands of the business and just general desire for modernization pressuring a Kafka adoption. But now you've got this new event streaming architecture beginning to underpin things. Do you see that driving the way banks behave or features they deploy or? You kind of hinted at that with Uber, but that's still exogenous from the outside. But how does it play out? How's banking change now that it's getting Kafkafied?
Well, what people need to understand, and I think a lot of people that are in the industry understand, but if you're a developer and you work in the open source world and you don't deal with the enterprise as much, this might be some insight for you: there are two sides to building and innovating within any type of organization, at least in the banking world in my experience, right? There's the business and the business side, and then there's the technology side. And the business says you have to do X, Y, and Z. And the technology side tells them how they can't do it. And then the business says, you need to get it done by yesterday. And the technology side says, "Hey, Foti, can you help me?" This is my experience.
And you say yes, maybe.
Yes, I can try. I can try. So, yeah. I mean, we touched on it a bit, but you have companies like Uber, you have companies like Amazon, you have companies like Google that are out there saying, "Hey, we have these giant marketplaces, and the ways of traditional banking just don't work for us anymore. We can't wait two to three days for an ACH payment to get settled." ACH has been around since the seventies, right?
And it shows. I mean, I've been around since the seventies, but I can get a lot of things done in less than two or three days.
Yeah. Well, ACH was designed for non-business-critical payments, and I think what we're finding is that the volume of payments going on nowadays is actually in these non-business-critical payments. And the things that go through separate types of clearinghouses, the critical types of payments, I mean, they're larger payments, and they're treasury payments, but they don't match the volume of what's going on in today's world. So banks need to adapt, they need to scale. And Kafka really does fill a big hole in a lot of the architectures that they have. Whether it's passing data from A to B, whether it's the incredible integration suite that Kafka has built around it with Connect, or whether it's complex event processing and the things that we've done with Kafka Streams and ksqlDB, Kafka checks a lot of different boxes as an ecosystem for banks, where they would probably need a multitude of other tools to get the same type of functionality.
And when they get unreasonable types of requests from the business to say, "Hey, build us a flux capacitor", they can come back and say, "Well, at least we can get 90% of the way there or 70 or 80% of the way there with Kafka." And I think that's been really powerful and a really, really compelling part of what I do within banking and kind of the things that we're seeing within banking.
My guest today has been Foti Filacouris. Foti, thanks for being a part of Streaming Audio.
Thank you very much, Tim. And I just want to shout out to the Northeast SE team and thank you again for having me. It's been a pleasure.
And there you have it. Thanks for listening to this episode. Now, some important details before you go. Streaming Audio is brought to you by Confluent Developer, that's developer.confluent.io, a website dedicated to helping you learn Kafka, Confluent, and everything in the broader event streaming ecosystem. We've got free video courses, a library of event-driven architecture design patterns, and executable tutorials covering ksqlDB, Kafka Streams, and the core Kafka APIs. There's even an index of episodes of this podcast. So if you take a course on Confluent Developer, you'll have the chance to use Confluent Cloud. When you sign up, use the code PODCAST100 to get an extra hundred dollars of free Confluent Cloud usage.
Anyway, as always, I hope this podcast was helpful to you. If you want to discuss it or ask a question, you can always reach out to me @tlberglund on Twitter. That's T-L-B-E-R-G-L-U-N-D. Or you can leave a comment on the YouTube video if you're watching and not just listening or reach out in our Community Slack or Forum, both are linked in the show notes. And while you're at it, please subscribe to our YouTube channel. And to this podcast, wherever fine podcasts are sold. And if you subscribe through Apple Podcast, be sure to leave us a review there. That helps other people discover us, which we think is a good thing. So thanks for your support and we'll see you next time.
It’s been said that financial services organizations have been early Apache Kafka® adopters due to the strong delivery guarantees and scalability that Kafka provides. With experience working and designing architectural solutions for financial services, Fotios Filacouris (Senior Solutions Engineer, Enterprise Solutions Engineering, Confluent) joins Tim to discuss how Kafka and Confluent help banks build modern architectures, highlighting key emerging use cases from the sector.
Previously, Kafka was often viewed as a simple pipe that connected databases together, which allows for easy and scalable data migration. As the Kafka ecosystem evolves with added components like ksqlDB, Kafka Streams, and Kafka Connect, the implementation of Kafka goes beyond being just a pipe—it’s an intelligent pipe that enables real-time, actionable data insights.
Fotios shares a couple of use cases showcasing how Kafka solves the problems that many banks are facing today. One of his customers transformed retail banking by using Kafka as the architectural base for storing all data permanently and indefinitely. This approach enables data in motion and a better user experience for frontend users scrolling through their transaction history, eliminating the need to download old statements that have been offloaded to cold storage or a data lake. Kafka also provides the best of both worlds: increased scalability plus strong message delivery guarantees comparable to queuing middleware like IBM MQ and TIBCO.
In addition to use cases, Tim and Fotios talk about deploying Kafka for banks within the cloud and drill into the profession of being a solutions engineer.
If there's something you want to know about Apache Kafka, Confluent, or event streaming, please send us an email with your question and we'll hope to answer it on the next episode of Ask Confluent.