Ampere® today announced two major milestones in its mission to deliver open, scalable, and efficient infrastructure for AI and cloud workloads: the launch of the Ampere Systems Builders program and expanded availability of 12-channel platforms based on AmpereOne® M processors. Together, the announcements underscore Ampere’s commitment to ecosystem collaboration and modular innovation, advancing AI inference infrastructure to meet growing demand.
Ampere Systems Builders Program: Driving Open, Modular Platform Innovation
- Collaborative Ecosystem: The program unites industry leaders including Broadcom, Supermicro, Giga Computing, ASRock Rack, Jabil, and Rebellions to co-develop modular, standards-based AI and cloud-native platforms.
- Open Standards and Flexibility: Built on shared architectures such as the Open Compute Project’s DC-MHS (Data Center Modular Hardware System) standard, the program fosters interoperability, faster time to market, and customization for diverse AI workload demands.
- Turnkey Solutions: Members can integrate hardware, software, and services, accelerating deployment of AI inference infrastructure optimized for scalability and efficiency.
Jeff Wittich, Chief Product Officer at Ampere, emphasized,
“As the industry accelerates AI Compute build-out, maintaining open, efficient, and scalable principles is critical. Ampere Systems Builders enables delivery of high-performance infrastructure ready for AI’s future.”
Expanding 12-Channel AmpereOne M Platforms for Data-Intensive AI Workloads
- Next-Gen Platform Rollout: Following the December 2024 release of AmpereOne M, new 12-channel memory platforms broaden its market reach, pairing exceptional memory bandwidth with performance per watt optimized for AI inference.
- Available and Upcoming Systems:
- Giga Computing’s 12-channel platform is commercially available, delivering scalable, high-density AI and cloud-native compute.
- Jabil’s DC-MHS host processing module based on AmpereOne M will sample by Q3 2025, designed for edge to hyperscale environments.
- ASRock Rack has developed a 12-channel AmpereOne M platform, further expanding modular compute options.
Industry Perspectives
- ASRock Rack: Weishi Sa, President, highlighted the opportunity to innovate scalable AI infrastructure through Ampere collaboration.
- Broadcom: Jas Tremblay, VP and GM of Data Center Solutions, emphasized the importance of open modular designs for growing AI workloads.
- Giga Computing: Vincent Wang, Sales VP, showcased a unique 2U server that stacks dual AmpereOne M sockets for massive memory density, well suited to large language models such as LLaMa.
- Jabil: Ed Bailey, CTO and SVP, reaffirmed their commitment to agile, scalable AI infrastructure aligned with open standards.
- Supermicro: Michael Clegg, VP & GM, Edge, detailed MegaDC modular systems supporting CPU-only and CPU-plus-accelerator configurations tailored for high-volume AI inference.
Looking Ahead: Building an Open AI Infrastructure Future
Ampere’s dual announcements reinforce its vision of open, scalable AI infrastructure built for the complex demands of modern AI inference. The new systems will be showcased at Computex 2025, spotlighting innovation driven by ecosystem collaboration and modular design.