Daily Newsletter

Google introduces Nested Learning

The method conceptualises neural models as collections of interconnected optimisation problems functioning across multiple levels.

10 November 2025

Google Research has introduced Nested Learning, a machine learning approach designed to address the problem of catastrophic forgetting in continual learning.

The method is detailed in the paper “Nested Learning: The Illusion of Deep Learning Architectures”, presented at NeurIPS 2025.

It frames neural models as sets of interconnected optimisation problems that operate at multiple levels, each with distinct context flows and update frequencies.

Catastrophic forgetting remains a recognised limitation in current large language models (LLMs), where adaptation to new tasks can result in the loss of previously acquired knowledge.

Conventional strategies tend to treat architectural design and optimisation algorithms as separate entities.

Nested Learning challenges this separation by treating both as layers of a single system of nested optimisation problems, according to Google.

In this view, learning occurs across a spectrum of modules, each managing its own internal information and update cycle.
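To make the idea concrete, the sketch below is a toy illustration (hypothetical names, not code from the paper) of optimisation levels nested by update frequency: an inner level adjusts model parameters on every example, while an outer level revises its own state, here the inner learning rate, only once every few steps.

```python
import numpy as np

# Toy illustration of nested optimisation levels (hypothetical, not Google's code).
# An inner "fast" level updates parameters every step; an outer "slow" level
# updates its own state (here, the inner learning rate) every `outer_period` steps.

rng = np.random.default_rng(0)
w = rng.normal(size=3)          # inner-level parameters, updated every step
lr = 0.1                        # outer-level state, updated less frequently
outer_period = 10

def loss_and_grad(w, x, y):
    err = x @ w - y
    return err ** 2, 2 * err * x

recent_losses = []
for step in range(100):
    x, y = rng.normal(size=3), rng.normal()
    loss, grad = loss_and_grad(w, x, y)

    # Inner level: fast context flow, updated on every example.
    w -= lr * grad
    recent_losses.append(loss)

    # Outer level: slower context flow, updated once per `outer_period` steps.
    if (step + 1) % outer_period == 0:
        # Crude meta-rule: shrink the inner learning rate if loss stops improving.
        first, second = np.mean(recent_losses[:5]), np.mean(recent_losses[-5:])
        lr *= 0.5 if second > first else 1.05
        recent_losses.clear()
```

The point of the toy example is only that each level has its own information (parameters versus learning-rate statistics) and its own update cycle, which is the structure Nested Learning formalises.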

Google Research’s team tested these principles by developing Hope, a self-modifying recurrent architecture built on Titans memory modules but augmented with continuum memory systems (CMS).

The CMS structure allows for variable update rates across memory components, aiming to more closely align with patterns observed in human neuroplasticity.

Google reported that Hope produces lower perplexity and higher accuracy than standard transformers and recurrent models on several public language modelling and reasoning benchmarks.

Nested Learning generalises both optimisers and key architectural elements as associative memory systems, formalising their role as mapping functions between data points and error signals or sequence relationships.
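As a rough illustration of that associative-memory view (a minimal sketch under my own assumptions, not code from the paper), a linear associative memory stores key-value pairs with an outer-product write and retrieves them with a matrix-vector read; both attention-style memory modules and optimiser state can be described as mappings of this general kind.

```python
import numpy as np

# Minimal linear associative memory (illustrative only).
# Writing stores a key->value association; reading retrieves an approximate value.

d_key, d_val = 4, 4
M = np.zeros((d_val, d_key))           # memory matrix

def write(M, key, value, lr=1.0):
    # Store the association as an outer-product update to the memory matrix.
    return M + lr * np.outer(value, key)

def read(M, key):
    # Retrieve the value associated with (something close to) `key`.
    return M @ key

k = np.array([1.0, 0.0, 0.0, 0.0])
v = np.array([0.0, 2.0, 0.0, 0.0])
M = write(M, k, v)
print(read(M, k))                      # approximately recovers v
```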

The Hope architecture exploits nested optimisation by enabling memory components to update at multiple frequencies, forming what the authors refer to as a continuum memory system.

This allows for self-referential modification and integration of new data without discarding existing information.
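A hypothetical sketch of that continuum idea (illustrative names only, not the Hope or CMS implementation): several memory banks share one interface but refresh at different rates, so slowly updated banks retain older information while fast banks absorb new data.

```python
import numpy as np

# Hypothetical sketch of a continuum of memory banks with different update
# frequencies (not the actual Hope/CMS implementation).

class MemoryBank:
    def __init__(self, dim, period):
        self.state = np.zeros(dim)   # the bank's stored summary
        self.period = period         # update only every `period` steps
        self.buffer = []             # inputs accumulated since the last update

    def observe(self, x, step):
        self.buffer.append(x)
        if (step + 1) % self.period == 0:
            # Blend new information into the existing state rather than overwrite it.
            new_info = np.mean(self.buffer, axis=0)
            self.state = 0.9 * self.state + 0.1 * new_info
            self.buffer.clear()

rng = np.random.default_rng(0)
banks = [MemoryBank(dim=8, period=p) for p in (1, 8, 64)]  # fast, medium, slow

for step in range(256):
    x = rng.normal(size=8)
    for bank in banks:
        bank.observe(x, step)
# Fast banks track recent inputs closely; slow banks change gradually,
# so earlier information is not immediately overwritten.
```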

Google is inviting the broader machine learning community to explore the approach further.

Recently, the US Department of Justice (DOJ) concluded its antitrust review of Google’s $32bn Wiz acquisition, removing a major regulatory barrier for the Alphabet subsidiary as it seeks to advance in the cloud security market.

The Federal Trade Commission (FTC) confirmed on its website that early termination for the DOJ antitrust review was granted on 24 October 2025.
