Common Sense Virtual Roundtable:

4 Limitations of Data Warehouses in a World of Infinite Data

June 22nd, 2022, 10–11 AM CT (successfully held)

Request Detailed Session Notes

We are selective about who receives this document. Session notes are sent on request, provided your profile meets our qualification criteria.

Presented by

Here’s what we discussed:

A 2021 Wakefield study reveals 94% of data leaders have serious concerns about their data warehouse investments, and there are four clear reasons why:

Multiple data copies: Redundant data copies throughout the organization’s architecture exponentially increase infrastructure setup costs, maintenance costs, and time to value.

Inconsistent and inaccurate data: Chief data officers and data architects worry constantly about data drift, and about gaps in data governance and security, as they manage the complexity that multiple data copies introduce.

Complexity slows access: Data consumers often must rely on overwhelmed data teams to extract the data they need from the data warehouse. It can take weeks or even months for users to get access to the data in the form that suits their reporting and dashboard needs.

Vendor lock-in: The architecture and pricing models of most data warehouses are designed to increase dependency on the proprietary vendor, to the point where an organization cannot access its own data without paying that vendor.

In this session, we discussed how companies are taking another look at the best way to democratize data access while avoiding these limitations.

Solution Expert

James Chien
Senior Solutions Architect at Dremio
LinkedIn

Here is what we learned:

Bappa Roy, Enterprise Cloud Architect, The TJX Companies, Inc.

• Bappa joined the session to learn from others with similar challenges.
• Once the data is in the data lake, you have to get out of the ETL business. That’s the goal they are trying to reach.

Ryan Legner, Staff Product Manager – Data & Analytics, Genesys

• Ryan joined the session to discuss multitenant models and the partners who want that model.
• Challenges arise with customers who want to take all the data from their relational models and copy it into their own systems. It has to be transformed to fit their SQL database because the mapping isn’t one-to-one.
• There can also be scalability problems. People want the cloud model, but they also want control over it.
• It’s hard to figure out all the clients’ needs to be able to push the right data into the warehouse.
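Ryan’s point about non-one-to-one mapping can be sketched as a small row transform. This is purely illustrative; all table and field names below are hypothetical, not taken from the session.

```python
# Illustrative sketch: reshaping a row from a source relational model
# into a tenant's target SQL schema when fields don't line up one-to-one.
# Every field name here is a hypothetical example.

def transform_row(source_row: dict) -> dict:
    """Rename, merge, and flatten source fields to fit the target schema."""
    return {
        # direct rename: source "id" becomes target "customer_id"
        "customer_id": source_row["id"],
        # two source fields merged into one target column
        "full_name": f'{source_row["first_name"]} {source_row["last_name"]}',
        # nested structure flattened into a top-level column
        "region": source_row["address"]["region"],
    }

source = {
    "id": 42,
    "first_name": "Ada",
    "last_name": "Lovelace",
    "address": {"region": "EMEA"},
}

print(transform_row(source))
# → {'customer_id': 42, 'full_name': 'Ada Lovelace', 'region': 'EMEA'}
```

Because each tenant’s target schema differs, a transform like this has to be maintained per customer, which is part of why pushing the right data into the warehouse is hard to scale.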


About Dremio:

Dremio is a high-performance SQL lakehouse platform built on an open data architecture that accelerates BI and analytics directly on cloud data lake storage. Created by veterans of open source and big data technologies, and the creators of Apache Arrow, Dremio takes a fundamentally new approach to data analytics that helps companies get more value from their data, faster. Dremio makes data engineering teams more productive and data consumers more self-sufficient.

Participation in the Virtual Roundtable is free of charge to qualified attendees. Once you’ve completed the registration, we’ll confirm your invitation and send you a calendar invite with a link to the meeting.

If you don’t qualify, we’ll suggest other learning events that may be a better fit for you.
