Reflection AI, a startup founded by former Google DeepMind researchers, has launched with $130m in early-stage funding.

The funding was secured over two rounds. The initial $25m seed investment was led by Sequoia Capital and CRV.

CRV also co-led the subsequent $105m Series A round with Lightspeed Venture Partners.

Other notable investors include Nvidia’s venture capital arm, LinkedIn co-founder Reid Hoffman, and Scale AI CEO Alexandr Wang.

The company, valued at $555m, aims to develop superintelligence, defined as an AI system capable of performing most computer-related tasks, Bloomberg reported.

As a first step, Reflection AI plans to create an autonomous programming tool, which it sees as providing the technical building blocks for potential superintelligence.

Reflection AI is spearheaded by co-founders Misha Laskin and Ioannis Antonoglou.

Announcing the launch in a LinkedIn post, Reflection AI CEO Laskin said: “I’m launching ReflectionAI with my friend and co-founder Ioannis Alexandros Antonoglou.

“We’ve assembled a team that drove major breakthroughs in AI over the last decade including AlphaGo, AlphaZero, and Gemini. At Reflection, we’re building superintelligent autonomous systems. Starting with autonomous coding.”

Laskin contributed to the training workflow for Google’s Gemini large language model series, while Antonoglou focused on Gemini’s post-training systems, which optimise LLMs to enhance output quality, the news publication added.

The company plans to initially develop AI agents to automate specific programming tasks.

These agents will scan code for vulnerabilities, optimise memory usage, and test for reliability issues.

Additionally, Reflection AI’s technology will generate documentation and help manage application infrastructure.

Reflection AI intends to power its software with LLMs and reinforcement learning, which simplifies dataset creation by eliminating the need for explanations to accompany each data point.
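To illustrate the point, reinforcement learning can train a model from a simple pass/fail signal rather than detailed per-example annotations. The sketch below is a hypothetical illustration, not Reflection AI’s actual pipeline: a model-generated code snippet is scored by whether it behaves correctly on a test case, so no human-written explanation is needed for each data point.

```python
# Hypothetical illustration of outcome-based rewards for code generation.
# Instead of annotating each training example with an explanation,
# a reward function checks whether the generated code behaves correctly.

def reward(candidate_code: str, test_input: int, expected: int) -> float:
    """Return 1.0 if the candidate code passes the test case, else 0.0."""
    namespace = {}
    try:
        exec(candidate_code, namespace)           # define the function
        result = namespace["square"](test_input)  # run it on the test case
    except Exception:
        return 0.0
    return 1.0 if result == expected else 0.0

# Two model-generated candidates for "write a function that squares a number":
good = "def square(x):\n    return x * x"
bad = "def square(x):\n    return x + x"

print(reward(good, 3, 9))  # correct behaviour earns full reward: 1.0
print(reward(bad, 3, 9))   # incorrect behaviour earns none: 0.0
```

In an actual training loop the reward would feed a policy-optimisation step; the point here is only that the signal comes from checking outcomes, not from labelled explanations.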

The company is also exploring new AI architectures, potentially moving beyond the Transformer neural network architecture.

A job posting indicates that Reflection AI plans to train its models on tens of thousands of graphics cards while working on “vLLM-like platforms for non-LLM models”; vLLM is a serving engine that reduces the memory usage of language models.