Common Sense Virtual Roundtable:

Building New Products & Revenue with Event Streams Data

March 18th, 2021 (successfully held)

Request Detailed Session Notes

We are careful about who we send this key document to. The session notes will be sent based on the request, provided your profile matches our qualification criteria.

Presented by

Confluent

Here’s what we discussed:

In this VRT we discussed how companies can turn real-time event stream data into new products and services, and even prevent fraud before it occurs. Properly handled, event stream data can connect disparate legacy systems or even replace some of them. We shared examples of how this works in practice and discussed how you and your peers are using event stream data.

Moderated by 

Kai Waehner
Field CTO / Global Technology Advisor at Confluent
LinkedIn

Here is what we learned:

The session began with the challenges that call for event stream data. In the banking sector, the biggest challenge is retention, so there is strong demand for customization and personalization to drive the business.

One FSI company's problem is that its data is siloed: how do you make event stream data discoverable so that everyone in the organization can use it?

Kai observed that the point is not just to get data from A to B in real time; the point is to integrate many different systems and then store the data so you can correlate it, something like a “central nervous system” for event-based data. The data is encrypted end to end, so security and governance are covered.

This doesn’t mean everything needs to be real time, but there is added value in correlating the data in real time, as events happen, perhaps even with analytics and machine learning under the hood. You can combine this data on the back end with your CRM or loyalty system, so the end user gets the right interaction at the right time.
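As an illustrative sketch only (not Confluent's implementation), correlating an event stream with CRM context as events arrive can be pictured as a streaming lookup join. The event shapes and the `crm` table below are hypothetical:

```python
# Hypothetical sketch: enriching a live event stream with CRM data.
# In practice the stream would come from a platform such as Kafka;
# here a plain Python iterable stands in for it.

crm = {
    "cust-1": {"tier": "gold", "preferred_channel": "app"},
    "cust-2": {"tier": "basic", "preferred_channel": "email"},
}

def enrich(events, crm):
    """Join each incoming event with its customer's CRM profile."""
    for event in events:
        profile = crm.get(event["customer_id"], {})
        yield {**event, **profile}

events = [
    {"customer_id": "cust-1", "type": "card_payment", "amount": 120.0},
    {"customer_id": "cust-2", "type": "login", "amount": None},
]

enriched = list(enrich(events, crm))
```

With the CRM tier attached at the moment the event occurs, a downstream service can decide on the right interaction immediately rather than in a nightly batch.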

Analytics and machine learning integration can also modernize legacy services and mainframe data. Beyond the mainframe, you can integrate or replace hundreds of IBM MQ systems with a single Kafka cluster, which can process 100,000 or more messages per second, and the same cluster can be used to accept global payments. Integration and replication can happen between data centers, and even between regions and continents. It’s not just integration between legacy systems and modern technology like a database or data lake: the architecture is open and very scalable.
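To make the consolidation idea concrete, here is a minimal, hypothetical sketch of funneling messages from many legacy queues into per-source topics on one cluster. The queue names, message shapes, and the in-memory “cluster” are invented for illustration; a real deployment would use a Kafka client and connectors instead:

```python
# Hypothetical sketch: routing messages from many legacy MQ queues
# into per-source topics. A dict stands in for the streaming cluster.

from collections import defaultdict

def route(messages):
    """Group legacy-queue messages into topics named after their source."""
    cluster = defaultdict(list)  # stand-in for a single Kafka cluster
    for msg in messages:
        topic = f"legacy.{msg['source_queue']}"
        cluster[topic].append(msg["payload"])
    return cluster

legacy_messages = [
    {"source_queue": "MQ.PAYMENTS", "payload": b"pay-001"},
    {"source_queue": "MQ.ORDERS", "payload": b"ord-042"},
    {"source_queue": "MQ.PAYMENTS", "payload": b"pay-002"},
]

topics = route(legacy_messages)
```

The point of the sketch is the topology, not the code: many independent queue systems collapse into one cluster with a topic per source, which is what makes cross-system correlation and cross-region replication possible.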


About Confluent:

Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Our cloud-native offering is the foundational platform for data in motion – designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization.

Participation in the Virtual Roundtable is free of charge to qualified attendees. Once you’ve completed the registration, we’ll confirm your invitation and send you a calendar invite with a link to the meeting.

If you don’t qualify, we’ll suggest other learning events that may be a better fit for you.
