Ainekko, a startup pioneering open, software-defined AI infrastructure, today announced the launch of AI Foundry, an open-source platform that extends open principles all the way down to silicon. The initiative includes silicon RTL, emulation tools, developer APIs, and community resources, all now available on aifoundry.org. This marks a major milestone in making advanced AI hardware and tooling accessible, modular, and community-driven. Ainekko co-founder Tanya Dadasheva emphasized the platform’s potential to revolutionize AI infrastructure, comparing its impact to that of Linux and Kubernetes. The launch addresses growing demand for energy-efficient, affordable alternatives to closed AI platforms.
Open-Source Principles Extend to AI Hardware
Ainekko today launched AI Foundry, an open-source platform extending open principles to AI hardware development. This initiative reimagines how hardware and software co-evolve, making chip-level innovation accessible and community-driven, according to the company announcement. By open-sourcing modular building blocks, Ainekko aims to let developers construct everything from lightweight edge devices to high-performance AI inference systems, a shift the company says could fundamentally change the landscape of AI infrastructure.
The platform includes silicon RTL, emulation tools, developer APIs, and community resources, all available today. Ainekko’s approach addresses the growing demand for energy-efficient and affordable AI alternatives, as electricity demand from data centers is projected to more than double by 2030, according to the IEA. Against that backdrop, AI Foundry offers a modular, open, software-defined foundation for building the next generation of inference systems, promising significant improvements in both performance and energy efficiency. Tanya Dadasheva from Ainekko emphasized that the platform’s openness mirrors the impact of Linux on operating systems and Kubernetes on cloud infrastructure, bringing a similar spirit of openness to AI hardware.
Roman Shaposhnik from Ainekko stated the company is committed to building a movement, not just a product, and invites developers, researchers, and partners to collaboratively shape the future of AI infrastructure. The platform’s foundation includes highly energy-efficient and programmable RISC-V Minion cores, complete with attached accelerators and configurable on-die SRAM. This open ecosystem will support community collaboration, contributions, and adoption across a wide range of edge and embedded AI applications, from robotics and retail to industrial automation and smart devices.
Modular Building Blocks for Next-Gen AI Systems
AI Foundry’s modular approach centers on RISC-V Minion cores designed for efficient AI acceleration. These cores are paired with configurable on-die SRAM and the Next-Generation Esperanto Mesh Interface (NEMI), enabling flexible system design. According to the company, this architecture allows developers to tailor hardware configurations to specific application demands, ranging from low-power edge devices to high-performance inference systems. This granular control over resources represents a significant departure from traditional, fixed-function AI accelerators.
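To make the tile-based modularity concrete, here is a minimal, hypothetical sketch of how such a design might be described in code. The class names, fields, and default values (MinionTile, MeshConfig, SystemConfig, clock rates, SRAM sizes) are illustrative assumptions, not AI Foundry’s actual configuration format or API.

```python
# Illustrative sketch only: these names and values are hypothetical and do
# not reflect AI Foundry's real configuration format.
from dataclasses import dataclass, field

@dataclass
class MinionTile:
    """One RISC-V Minion core with its attached accelerator and local SRAM."""
    core_clock_mhz: int = 500
    sram_kib: int = 256          # configurable on-die SRAM per tile
    accelerator: str = "matmul"  # e.g. a matrix-multiply unit

@dataclass
class MeshConfig:
    """Hypothetical NEMI-style mesh linking the tiles together."""
    rows: int
    cols: int
    link_width_bits: int = 128

@dataclass
class SystemConfig:
    mesh: MeshConfig
    tiles: list = field(default_factory=list)

    @classmethod
    def edge_profile(cls):
        # Small, power-constrained layout for an edge device
        return cls(MeshConfig(rows=2, cols=2),
                   [MinionTile(sram_kib=128) for _ in range(4)])

    @classmethod
    def inference_profile(cls):
        # Larger layout for a high-throughput inference system
        return cls(MeshConfig(rows=8, cols=8, link_width_bits=256),
                   [MinionTile(core_clock_mhz=1000, sram_kib=512) for _ in range(64)])

edge = SystemConfig.edge_profile()
server = SystemConfig.inference_profile()
print(len(edge.tiles), "edge tiles;", len(server.tiles), "inference tiles")
```

The point of the sketch is the design choice it implies: the same small set of building blocks scales from a four-tile edge part to a sixty-four-tile inference system simply by changing the configuration, rather than by designing a new fixed-function chip.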
Building on this foundation, Ainekko provides a collection of pre-designed building blocks and developer APIs. These tools streamline the integration of specialized accelerators and memory configurations, accelerating development cycles. The platform also supports emulation tools, allowing developers to test and refine designs before committing to silicon. Furthermore, the NEMI interface facilitates high-bandwidth, low-latency communication between cores and accelerators, crucial for demanding AI workloads. Jayesh Iyer from Esperanto Technologies highlighted the importance of interconnectivity in maximizing system performance.
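The emulate-before-silicon workflow can be sketched in the same spirit. The function below is a hypothetical stand-in with a toy cost model; its name, parameters, and numbers are assumptions for illustration and do not represent AI Foundry’s actual developer APIs or emulation tools, which would exercise the open silicon RTL against real workloads.

```python
# Hypothetical workflow sketch: every name here is an illustrative
# placeholder, not AI Foundry's published developer API.

def emulate(num_tiles: int, ops: int) -> dict:
    """Stand-in for an emulation run: estimates cycles and energy from a
    crude analytic model instead of running the actual RTL."""
    cycles = ops // max(num_tiles, 1)
    energy_mj = cycles * num_tiles * 0.002   # toy energy model
    return {"cycles": cycles, "energy_mj": energy_mj}

# Compare two candidate designs on the same workload before committing
# either one to silicon.
workload_ops = 4_000_000
for name, tiles in [("edge", 4), ("inference", 64)]:
    result = emulate(tiles, workload_ops)
    print(f"{name:10s} {result['cycles']:>10,d} cycles  {result['energy_mj']:10.1f} mJ")
```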
This modularity addresses the growing demand for energy-efficient AI infrastructure, particularly as electricity consumption from data centers continues to rise. The company announced that AI Foundry’s architecture aims to reduce power consumption while maintaining high performance. Tanya Dadasheva from Ainekko emphasized that the platform empowers developers to create customized solutions that optimize both energy efficiency and cost. This approach is particularly relevant for applications in robotics, industrial automation, and smart devices, where power constraints are often critical.
Building on this open-source foundation, Ainekko’s AI Foundry could enable a new wave of customized AI solutions, particularly for edge and embedded applications. For industries reliant on adaptable, efficient AI, this represents a significant shift toward accessible hardware innovation. According to Tanya Dadasheva from Ainekko, the platform extends the composable spirit of projects like Linux and Kubernetes down to the silicon level.
The implications extend beyond immediate applications, potentially fostering a more collaborative and rapid development cycle for specialized AI systems. Roman Shaposhnik from Ainekko envisions a future where developers have unprecedented control over their AI infrastructure, ultimately accelerating progress across diverse fields like robotics and industrial automation.
