For years, enterprise data has existed in two separate domains: operational systems (OLTP), which run day-to-day applications, and analytical systems (OLAP), which deliver insights. That divide was born of early infrastructure limits, but it also shaped how organisations worked—duplicating effort, isolating teams and slowing decisions.
While developers focused on keeping applications running, analysts worked with delayed or incomplete data. Even as cloud infrastructure has removed many of the original technical barriers, the divide persists, upheld now more by legacy software, vendor lock-in and inertia than by genuine necessity. It’s time to challenge this model and rethink how we manage data.
Once data lands in a transactional system, it becomes hard and expensive to move. Proprietary storage formats and tightly coupled architectures trap data inside operational systems and block integration with modern data and AI workflows. Organisations end up working around infrastructure that no longer fits their needs.
Today’s AI agents and applications require fast and reliable access to live data.
But when operational data is stuck in legacy environments, it becomes much harder to enable automation, personalisation or real-time decision-making. This not only slows development, but it also limits responsiveness, scalability and the ability to extract timely insights from rapidly growing data volumes.
More organisations are now seeking alternatives that remove these constraints and offer a unified, responsive foundation for modern data-driven systems.
From fragmentation to unification
The original OLTP/OLAP split made sense when compute was scarce: running analytics alongside operational workloads simply wasn’t viable. But with cloud-native storage and open table formats, organisations no longer need separate pipelines to make operational data available for analytics. Yet many enterprises still rely on architectures where operational data must be extracted, transformed and loaded before it can be analysed, introducing delays, duplication and overhead.
The impact is significant. Analysts base decisions on outdated information. Developers spend time maintaining fragile pipelines instead of building new capabilities. Innovation slows and opportunity costs mount.
In response, more organisations are moving to unified data architectures, where operational and analytical workloads share a single data foundation, utilising engines optimised for each specific task. This reduces complexity, improves efficiency and enables faster iteration—all critical benefits in the AI era.
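As a rough illustration of what a shared data foundation can look like in practice, the sketch below uses the open-source deltalake and duckdb Python packages: an operational process appends rows to a table stored in an open format, and an analytical engine queries the same files directly, with no extraction or loading step in between. The path, table and columns are invented for the example and do not refer to any specific platform.

```python
# Minimal sketch: one copy of the data, two engines.
# Assumes `pip install deltalake duckdb pandas`; paths and columns are illustrative.
import pandas as pd
import duckdb
from deltalake import DeltaTable, write_deltalake

# Operational side: append new orders to an open-format table as they arrive.
new_orders = pd.DataFrame(
    {"order_id": [101, 102], "amount": [49.99, 12.50], "status": ["paid", "paid"]}
)
write_deltalake("/tmp/orders", new_orders, mode="append")

# Analytical side: query the very same table, with no pipeline in between.
orders = DeltaTable("/tmp/orders").to_pandas()
print(duckdb.sql("SELECT status, SUM(amount) AS revenue FROM orders GROUP BY status"))
```

The design choice is the point: because the table lives in an open format on shared storage, each workload can bring its own engine without copying or transforming the data first.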
Agentic AI changes the data ecosystem
AI agents are driving a step-change in application development. These intelligent systems can perform complex, multi-step tasks by reasoning over proprietary data and interacting with other components in real time. With the ability to coordinate decisions and actions throughout an entire data ecosystem, these technologies are evolving beyond basic automation to become fundamental parts of organisational operations.
To support this shift, infrastructure must evolve. AI agents need low-latency access to live data, smooth integration across systems and modern development workflows. A new concept known as a lakebase tackles these problems head-on. It delivers the reliability of an operational database and the openness of a data lake in one place, so teams can run transactions and analytics without juggling systems. It gives fast access to data, scales easily through separated storage and compute, and fits modern development habits like instant branching and versioning. Built for today’s AI-driven workloads, a lakebase lets both developers and AI agents build, test, and ship applications quickly, without the constraints of old OLTP setups.
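To give a feel for the branching workflow, here is a deliberately simplified, hypothetical sketch in Python. The LakebaseClient class and its methods are invented for illustration and do not represent any real product’s API; the point is only the shape of the workflow: an instant, copy-on-write branch of live data, a test or agent run against that branch, and a cheap teardown that never touches production.

```python
# Hypothetical sketch of branch-and-test against live data.
# "LakebaseClient" is a toy, in-memory stand-in, not a real client library.
from copy import deepcopy

class LakebaseClient:
    """Toy stand-in for a branching, transactional data store."""
    def __init__(self):
        self.branches = {"main": {"orders": [{"order_id": 101, "status": "paid"}]}}

    def create_branch(self, source: str, name: str) -> str:
        # Real systems would do this as copy-on-write metadata, not a deep copy.
        self.branches[name] = deepcopy(self.branches[source])
        return name

    def drop_branch(self, name: str) -> None:
        del self.branches[name]

client = LakebaseClient()
branch = client.create_branch(source="main", name="agent-test")
try:
    # An AI agent or test suite mutates the branch freely...
    client.branches[branch]["orders"][0]["status"] = "refunded"
finally:
    # ...and the branch is discarded; production data is never touched.
    client.drop_branch(branch)

print(client.branches["main"]["orders"][0]["status"])  # still "paid"
```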
Looking ahead, the trajectory points clearly towards openness and convergence. Organisations need infrastructure that breaks down silos, supports both analytical and operational needs and gives developers the flexibility to move fast without compromise.
Traditional OLTP systems, with their rigid architectures and heavy vendor lock-in, are increasingly at odds with this direction. What’s needed is a new approach: open, interoperable platforms that unify workloads and support the performance, scale and agility required by AI-native applications.
This transition won’t happen overnight. But organisations that act now—reducing fragmentation, embracing openness and designing for intelligent systems—will be better positioned to lead in the AI era.
