
Beyond Data Lakehouses: Delivering the Last Mile of Trusted, Transient Data for Insurers

Paul Moxon | August 15, 2025


Insurance companies have long relied on data to assess risk, price products, detect fraud, and comply with regulatory requirements. But the challenge today isn't just collecting more data—it's taming the data tide created by constant, rapid changes in information flowing into insurers from countless sources.

Many insurers have invested in modern data lakehouses to store and analyze huge volumes of data more cost-effectively. Lakehouses blend the flexibility of data lakes with the structure of data warehouses, supporting large-scale analytics and artificial intelligence (AI). However, lakehouses alone often struggle with the last mile of data delivery: fast, trusted, and business-ready insights at the moment they're needed.

The Role of Transient Data

The main reason for this last-mile challenge is the growing importance of transient data—short-lived, fast-changing information that may not be permanently stored but is critical for timely decisions. In insurance, this transient data includes the following.

  • Real-time claims updates during a catastrophe, where minutes matter not only for relevant customer communication but also for loss reserves
  • Internet of Things (IoT) sensor feeds for usage-based insurance, in which streaming data affects premium adjustments
  • Market data shifts, such as daily foreign exchange (FX) rates, that affect International Financial Reporting Standards (IFRS) 17 calculations and financial reporting
  • Instant fraud signals, such as behavioral anomalies during online claims submissions
  • Digital customer interactions, in which the next-best action depends on data from the latest clickstream

Individually, these data points might seem small, but collectively they fuel competitive advantage. Yet this transient data can overwhelm traditional data architectures. Loading it all into a lakehouse for batch processing often adds delays that insurers can't afford.

Why Data Lakehouses Are Not Enough

Data lakehouses have proven valuable for storing massive historical data sets, supporting advanced analytics and AI training, and reducing the costs of traditional data warehouses. However, they have a few limitations.

  • They rely on batch-oriented data integration and ingestion pipelines to load data, yet time-sensitive transient data use cases often can't wait for batch schedules.
  • They rarely hold everything: many insurers still keep data in siloed systems, separate software as a service (SaaS) platforms, and external providers, so access to data remains fragmented.
  • Raw data in a lakehouse often lacks business definitions and context, making it harder to produce trusted, actionable insights at the moment they're needed.
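
To put the batch-latency point in concrete terms, the short Python sketch below estimates when an event becomes queryable if it must wait for a nightly load. The schedule, load duration, and timings are illustrative assumptions, not figures from any particular platform.

    # Illustrative sketch of why batch ingestion delays transient data: an event
    # is not queryable in the lakehouse until the next scheduled load completes,
    # however urgent it is right now. All parameters are assumed, not measured.
    from datetime import datetime, timedelta

    def visible_in_lakehouse_at(event_time: datetime,
                                daily_load_hour: int = 2,
                                load_duration: timedelta = timedelta(hours=1)) -> datetime:
        """Return when an event landing in a source system becomes queryable,
        assuming a single nightly batch load that starts at `daily_load_hour`."""
        next_load = event_time.replace(hour=daily_load_hour, minute=0,
                                       second=0, microsecond=0)
        if next_load <= event_time:
            next_load += timedelta(days=1)      # missed today's load window
        return next_load + load_duration        # plus time for the pipeline to run

    # A fraud signal raised at 10:15 a.m. on June 3 is not visible until about
    # 3:00 a.m. on June 4 -- a lag of nearly 17 hours for a decision that may be
    # needed in minutes.
    event = datetime(2025, 6, 3, 10, 15)
    print(visible_in_lakehouse_at(event) - event)   # 16:45:00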

Logical Data Management: Delivering the Last Mile

Logical data management solves this problem by introducing a virtual data layer that connects data across all systems—lakehouse included—without requiring the data to first be physically moved. This capability enables insurers to do the following.

  • Instantly access transient data. Whether it's streaming IoT data, daily financial market feeds, or claims updates, logical data management enables insurers to query data in place, delivering insights in real time.
  • Publish trusted data products. Logical data management transforms raw, scattered data into reusable, governed data products tailored for specific business needs—whether for IFRS 17 reporting, fraud detection, or underwriting decisions.
  • Tame the data tide. Instead of trying to load every data change into the lakehouse, logical data management platforms provide a unified logical view across systems, reducing data duplication and unnecessary data movement.

For example, an insurer leveraging logical data management can combine real-time telematics data with policy records for dynamic pricing decisions, merge live claims feeds with historical patterns to detect fraud faster, and integrate daily FX rates with accounting systems for timely IFRS 17 compliance, all without having to wait for physical data pipelines to refresh.
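
A minimal sketch of what the first scenario could look like in practice appears below. It assumes, purely for illustration, that the logical data management platform exposes an ODBC endpoint under a DSN named logical_layer and publishes two virtual views, live_telematics and policy_records; the view, column, and DSN names are hypothetical, and the exact SQL dialect will vary by platform.

    # A sketch of "querying data in place" through a logical data layer.
    # Assumptions (illustrative only): the platform exposes an ODBC endpoint
    # reachable via the DSN "logical_layer" and publishes two virtual views --
    # live_telematics (streamed from an IoT source, transient) and
    # policy_records (stored in the lakehouse, historical).
    import pyodbc

    def drivers_flagged_for_repricing(min_harsh_braking_events: int = 5):
        """Join live telematics events with stored policy records in a single
        query, without first copying the streaming data into the lakehouse."""
        conn = pyodbc.connect("DSN=logical_layer")  # hypothetical DSN
        sql = """
            SELECT p.policy_id,
                   p.current_premium,
                   COUNT(t.event_id) AS harsh_braking_events
            FROM   policy_records  AS p              -- lakehouse (historical)
            JOIN   live_telematics AS t              -- streaming source (transient)
                   ON t.policy_id = p.policy_id
            WHERE  t.event_type = 'HARSH_BRAKING'
              AND  t.event_time >= CURRENT_TIMESTAMP - INTERVAL '7' DAY
            GROUP  BY p.policy_id, p.current_premium
            HAVING COUNT(t.event_id) >= ?
        """
        with conn:
            rows = conn.cursor().execute(sql, min_harsh_braking_events).fetchall()
        return rows  # candidates for a usage-based premium adjustment

The same pattern applies to the other two scenarios: swap in a live claims view joined to historical fraud patterns, or a daily FX rate feed joined to accounting balances for IFRS 17 reporting.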

Data Lakehouses + Logical Data Management: A Strategic Combination

Data lakehouses and logical data management work best together. Lakehouses remain the ideal place for large-scale analytics and AI workloads. Meanwhile, logical data management enables insurers to access both historical and transient data, delivering the trusted insights needed in the last mile for operational and regulatory success.

Together, they empower insurers to accomplish the following.

  • Respond rapidly to emerging risks.
  • Meet regulatory deadlines with confidence.
  • Create personalized customer experiences based on the most current data.
  • Reduce operational risks tied to inconsistent or delayed data.

When decisions are time-critical and trust is nonnegotiable, logical data management enables insurers to tame the data tide and deliver the last mile of data that truly matters. Insurers evaluating their data strategies should ask: Can we deliver the last mile of trusted, transient data our teams need for risk decisions and compliance? If the answer is "no," logical data management might be the key.


Opinions expressed in Expert Commentary articles are those of the author and are not necessarily held by the author's employer or IRMI. Expert Commentary articles and other IRMI Online content do not purport to provide legal, accounting, or other professional advice or opinion. If such advice is needed, consult with your attorney, accountant, or other qualified adviser.