This IDC white paper discusses managing data for broad-based AI and analytics use cases, both in the formal context of a data warehouse and in the more dynamic context of a data lake. It examines the challenges of managing rapidly growing volumes and types of data while keeping that data current, coordinated, consistent, and coherent across platforms for trustworthy analytics and scalable artificial intelligence/machine learning (AI/ML) workloads. It considers the need for a technology that mediates between data warehouses and data lakes, and it describes the data lakehouse in this context, showing how the approach addresses the needs outlined above. It then considers the various approaches offered to enterprises for building a data lakehouse. Finally, it examines watsonx.data, IBM's fit-for-purpose data store built on an open lakehouse architecture that enables enterprises to share a single copy of data across multiple fit-for-purpose query engines to optimize and scale AI/ML and analytics workloads, and it shows how watsonx.data addresses the issues and concerns raised in the paper.
Contents:
- Current Trends in the Market for Data and Analytics.
- The Data Lakehouse.