Common Sense Virtual Roundtable:

4 Limitations of Data Warehouses in a World of Infinite Data

August 10th, 10 AM – 11 AM CT, Successfully held

Request Detailed Session Notes

We are careful about who receives this document. The session notes will be sent on request, provided your profile matches our qualification criteria.


Here’s what we discussed:

A 2021 Wakefield study reveals that 94% of data leaders have serious concerns about their data warehouse investments, and there are four clear reasons why:

Multiple data copies: Redundant data copies throughout the organization's architecture sharply increase infrastructure setup and maintenance costs and lengthen time to value.

Inconsistent and inaccurate data: Chief data officers and data architects constantly worry about data drift and lack of data governance and security as they deal with the complexity that comes with multiple data copies.

Complexity slows access: Data consumers often must rely on overwhelmed data teams to extract the data they need from the data warehouse. It can take weeks or even months for users to get access to the data in the form that suits their reporting and dashboard needs.

Vendor lock-in: The architecture and pricing models of most data warehouses are designed to increase dependency on the proprietary vendor, to the point where the organization cannot access its own data without paying that vendor.

In this session, we discussed how companies are taking another look at the best way to democratize data access while avoiding these limitations.

Solution Expert

Louis Bedard
Director, Solutions Architect at Dremio

Here is what we learned:

Puneet Bhargava, Chief Architect, Securities Services Technology, Citi

• Puneet is involved in Citi’s data modernization initiatives and programs and joined the session to understand what his peers are going through.
• Citi has experimented with data lakes without much success; the promise that putting all the data in one place would magically solve their problems has not come to fruition.
• Relying on domain-driven operational data stores feeding into a warehouse has started gaining traction; this conformed layer has provided a single consolidated view.
• Data lakes do have a place when moving to cloud-based platforms and away from Hadoop.
• Systems are getting somewhat more complex, but expectations have multiplied many times over.
• The publishing of data needs to be standardized, shifting left as much as possible.


About Dremio:

Dremio is a high-performance SQL lakehouse platform built on an open data architecture that accelerates BI and analytics directly on cloud data lake storage. Created by veterans of open-source and big-data technologies, and by the creators of Apache Arrow, Dremio takes a fundamentally new approach to data analytics that helps companies get more value from their data, faster. Dremio makes data engineering teams more productive and data consumers more self-sufficient.

Participation in the Virtual Roundtable is free of charge to qualified attendees. Once you’ve completed the registration, we’ll confirm your invitation and send you a calendar invite with a link to the meeting.

If you don’t qualify, we’ll suggest other learning events that may be a better fit for you.

Talk To Us About Attending Future Events
