Antimatter Launches World’s First Vertically Integrated Neocloud for AI Inference


Published: 30 Apr 2026

Author: Gautam Mahajan


Antimatter, a new type of neocloud purpose-built for the fragmented AI economy, announced its launch, formed through the strategic combination of three companies: Data Factory, a U.S.-based energy and power infrastructure provider; Policloud, a modular micro data center network; and Hivenet, a distributed cloud provider. Together they introduce the industry's first fully integrated AI infrastructure platform spanning energy supply, physical hardware, and cloud software. It is designed to serve the rapidly expanding global demand for AI inference at a fraction of hyperscale costs and with faster time to market.

Antimatter

Antimatter is rapidly deploying capital to build the first global neocloud network optimized for AI inference. The company has secured €300 million to date to fund the deployment of its first 100 Policloud units by the end of 2027, representing 40,000 GPUs and nearly 3.6 exaFLOPS of active compute capacity.

According to Precedence Research, the AI inference-as-a-service market was valued at USD 18.60 billion in 2025 and is projected to grow from USD 23.40 billion in 2026 to approximately USD 197.50 billion by 2035, expanding at a CAGR of 26.80% from 2026 to 2035. Growth is driven by increasing demand for cost-effective, scalable, and high-performance deployment of ML and generative AI models, as well as real-time data analysis in sectors such as healthcare and finance.

Antimatter's co-founder and CEO said, "In the age of AI, intelligence is not the bottleneck; energy is. The infrastructure built for the first era of cloud and AI was designed around centralized scale. But the inference era requires a different model: more distributed, faster to deploy, and sovereign by design. That is the infrastructure Antimatter is building."

A recent report by Precedence Research highlights that the AI inference-as-a-service market is benefiting from the greater scalability and reduced complexity offered by AI inference platforms, as well as the rapid adoption of hybrid and multi-cloud models that balance strict data-security requirements with the scalability of public cloud AI services.
