Multiverse Computing and Axelera AI have launched a strategic collaboration to deploy artificial intelligence models on energy-efficient edge devices, potentially expanding AI capabilities beyond traditional data centers. The partnership will integrate Multiverse Computing’s compressed AI models directly into Axelera’s Metis and future Europa platforms, utilizing Axelera AI’s Titania chiplet, so that complex AI workloads can run on compact hardware. This development aims to reduce energy consumption and enhance data privacy by enabling local model fine-tuning and cost-effective scaling of AI deployments. “Our mission is to make AI radically more efficient and accessible,” said Enrique Lizaso, Co-founder & CEO of Multiverse Computing, highlighting the partnership’s focus on bringing powerful reasoning to devices where latency is critical. The initiative also supports broader European strategies to strengthen technological sovereignty in both semiconductors and artificial intelligence.
CompactifAI Compression Achieves 95% LLM Reduction with Minimal Precision Loss
Multiverse Computing has achieved 95% compression of large language models (LLMs) using its CompactifAI technology, with a precision loss of only 2–3%. This level of reduction promises to redefine what is possible for artificial intelligence deployment on resource-constrained devices. The company’s origins in quantum software and AI research directly informed the development of CompactifAI, positioning it as a leading provider of compressed AI models and enabling applications previously ruled out by computational demands. The compression does more than reduce file sizes; it fundamentally alters the energy profile and accessibility of advanced AI. Complex AI workloads that traditionally required substantial data-center infrastructure can now operate on compact, energy-efficient edge devices. That accessibility is further enhanced by the ability to fine-tune models locally, a feature crucial for preserving data privacy and meeting increasingly stringent regulatory requirements.
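The article does not describe CompactifAI’s internals, but the general family it belongs to, factorizing large weight matrices into much smaller components (CompactifAI is reportedly based on quantum-inspired tensor networks), can be sketched with a plain low-rank factorization. The layer size and rank below are illustrative assumptions, not the company’s actual method or parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def compress_layer(W, rank):
    """Replace an m x n weight matrix with two low-rank factors A (m x r) and B (r x n)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

m, n, rank = 512, 512, 32        # assumed layer shape and target rank
W = rng.standard_normal((m, n))  # stand-in for a trained weight matrix
A, B = compress_layer(W, rank)

orig_params = W.size             # 512 * 512 = 262144
comp_params = A.size + B.size    # 2 * 512 * 32 = 32768
print(f"parameter reduction: {1 - comp_params / orig_params:.1%}")  # → 87.5%

# Note: a random Gaussian matrix is a worst case for low-rank approximation;
# trained weights typically have much lower effective rank, so the
# reconstruction error here overstates what compression costs in practice.
err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
```

The inference-time win is that the dense product `x @ W` becomes `(x @ A) @ B`, cutting both memory footprint and multiply-accumulate count by the same factor.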
This advancement is particularly significant in the context of European technological independence. Multiverse Computing is collaborating with Axelera AI to integrate these compressed models into the Metis and future Europa platforms, together with Axelera AI’s Titania chiplet. According to Ekaterina Zaharieva, Commissioner for Startups, Research and Innovation, “Europe’s competitiveness in the next decade will depend on our ability to combine chips with trustworthy, efficient AI.” The partnership aims to reduce reliance on non-European AI infrastructure while empowering regional industries and institutions with locally designed capabilities. The resulting solution will also enable more cost-effective scaling of AI deployments across large device fleets.
Metis & Europa Platforms Enable Edge AI Deployment for European Sovereignty
The current push to deploy artificial intelligence at the edge, processing data on devices rather than relying on distant servers, is creating a demand for increasingly efficient models and specialized hardware. Large language models and other advanced AI systems offer impressive capabilities, but their computational demands often preclude their use in resource-constrained environments. This necessitates a shift toward optimization strategies that can deliver performance without excessive energy consumption or infrastructure requirements. Several companies are addressing this challenge, and a collaboration announced last year between Multiverse Computing and Axelera AI focuses specifically on bringing compressed AI models to Axelera’s Metis and forthcoming Europa platforms, with a clear emphasis on bolstering European technological independence.
The collaboration will facilitate the development of ultra-efficient inference and fine-tuning engines. These will allow organizations to run complex AI models on low-power devices, preserve data privacy through local model adjustments, and scale AI deployments more cost-effectively across large device fleets. Multiverse Computing’s CompactifAI technology, capable of compressing large language models by up to 95% with minimal precision loss, is central to this effort, and Axelera AI’s Titania chiplet will also play a key role in enabling these capabilities.
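To make the headline 95% figure concrete, a back-of-the-envelope memory estimate is useful. The model size and numeric precision below are assumptions chosen for illustration, not figures from Multiverse or Axelera:

```python
# Hypothetical illustration of what 95% compression means for an edge memory budget.
params = 7e9                 # assume a 7-billion-parameter LLM
bytes_per_param = 2          # assume fp16 weights
compression = 0.95           # the 95% reduction reported for CompactifAI

original_gb = params * bytes_per_param / 1e9
compressed_gb = original_gb * (1 - compression)
print(f"original: {original_gb:.1f} GB, compressed: {compressed_gb:.2f} GB")
# → original: 14.0 GB, compressed: 0.70 GB
```

Under these assumptions, a model that would otherwise demand data-center-class memory shrinks to well under a gigabyte, the kind of footprint that plausibly fits the on-device budgets edge accelerators are designed around.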
“Axelera AI is committed to delivering the most powerful and efficient AI inference solutions to the world,” said Fabrizio Del Maffeo, co-founder & CEO of Axelera AI.
Source: https://multiversecomputing.com/
