
COMPAL Optimizes AI Workloads with AMD Instinct MI355X at AMD Advancing AI 2025 and International Supercomputing Conference 2025

Rhea-AI Impact: Neutral
Rhea-AI Sentiment: Very Positive
Tags: AI
Compal Electronics unveiled its latest high-performance server platform SG720-2A/OG720-2A at AMD Advancing AI 2025 and ISC 2025, featuring AMD Instinct MI355X GPU architecture. The platform supports up to eight AMD Instinct MI350 Series GPUs and offers both single-phase and two-phase liquid cooling configurations. Key features include 288GB HBM3E memory, 8TB/s bandwidth, PCIe Gen5 connectivity, and AMD Infinity Fabric for multi-GPU orchestration. The server platform is designed for next-generation generative AI and large language model training, offering compatibility with mainstream AI frameworks like ROCm, PyTorch, and TensorFlow. The system supports both EIA 19" and ORv3 21" rack standards and includes a two-phase liquid cooling solution developed in partnership with ZutaCore.

Positive

  • Advanced technical specifications with support for up to eight AMD Instinct MI350 Series GPUs
  • Dual cooling architecture offering both air and liquid cooling options for enhanced thermal efficiency
  • High-performance capabilities with 288GB HBM3E memory and 8TB/s bandwidth
  • Comprehensive compatibility with major AI frameworks and rack standards

Negative

  • None.

SAN JOSE, Calif., June 12, 2025 /PRNewswire/ -- As AI computing accelerates toward higher density and greater energy efficiency, Compal Electronics (Compal; Stock Ticker: 2324.TW), a global leader in IT and computing solutions, unveiled its latest high-performance server platform, the SG720-2A/OG720-2A, at both AMD Advancing AI 2025 in the U.S. and the International Supercomputing Conference (ISC) 2025 in Europe. The platform features the AMD Instinct™ MI355X GPU architecture and offers both single-phase and two-phase liquid cooling configurations, showcasing Compal's leadership in thermal innovation and system integration. Tailored for next-generation generative AI and large language model (LLM) training, the SG720-2A/OG720-2A delivers exceptional flexibility and scalability for modern data center operations, drawing significant attention across the industry.

With generative AI and LLMs driving increasingly intensive compute demands, enterprises are placing greater emphasis on infrastructure that offers both performance and adaptability. The SG720-2A/OG720-2A emerges as a robust solution, combining high-density GPU integration and flexible liquid cooling options, positioning itself as an ideal platform for next-generation AI training and inference workloads.

Key Technical Highlights:

  • Support for up to eight AMD Instinct MI350 Series GPUs (including MI350X / MI355X): Enables scalable, high-density training for LLMs and generative AI applications.
  • Dual cooling architecture – Air & Liquid Cooling: Optimized for high thermal density workloads and diverse deployment scenarios, enhancing thermal efficiency and infrastructure flexibility. The two-phase liquid cooling solution, co-developed with ZutaCore®, leverages the ZutaCore® HyperCool® 2-Phase DLC liquid cooling solution, delivering stable and exceptional thermal performance, even in extreme computing environments.
  • Advanced architecture & memory configuration: Built on the CDNA 4 architecture with 288GB HBM3E memory and 8TB/s bandwidth, supporting FP6 and FP4 data formats, optimized for AI and HPC applications.
  • High-speed interconnect performance: Equipped with PCIe Gen5 and AMD Infinity Fabric™ for multi-GPU orchestration and high-throughput communication, reducing latency and boosting AI inference efficiency.
  • Comprehensive support for mainstream open-source AI stacks: Fully compatible with ROCm™, PyTorch, TensorFlow, and more—enabling developers to streamline AI model integration and accelerate time-to-market.
  • Rack compatibility & modular design: Supports EIA 19" and ORv3 21" rack standards with modular architecture for simplified upgrades and maintenance in diverse data center environments.
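To put the quoted memory figures in context, the capacity and bandwidth numbers above imply how quickly a single GPU can stream through its entire HBM3E pool, and how much memory a fully populated chassis aggregates. The sketch below is an illustrative back-of-the-envelope calculation only; it assumes decimal units (1 TB = 10¹² bytes), as is conventional for bandwidth figures, and uses peak rather than sustained bandwidth.

```python
# Back-of-the-envelope illustration of the per-GPU memory specs quoted above.
# Assumption: decimal units (1 GB = 1e9 bytes, 1 TB/s = 1e12 bytes/s);
# sustained bandwidth in real workloads will be lower than peak.

HBM_CAPACITY_BYTES = 288e9  # 288 GB HBM3E per GPU
HBM_BANDWIDTH_BPS = 8e12    # 8 TB/s peak memory bandwidth

# Time for one full sweep of a GPU's memory at peak bandwidth.
sweep_seconds = HBM_CAPACITY_BYTES / HBM_BANDWIDTH_BPS
print(f"Full-memory sweep at peak bandwidth: {sweep_seconds * 1000:.0f} ms")  # → 36 ms

# Aggregate HBM3E across the eight GPUs the chassis supports.
total_hbm_gb = 8 * 288
print(f"Aggregate HBM3E per server: {total_hbm_gb} GB")  # → 2304 GB
```

At these rates, a full pass over GPU memory takes tens of milliseconds, which is why high bandwidth matters as much as raw capacity for LLM training throughput.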

Compal has maintained a long-standing, strategic collaboration with AMD across multiple server platform generations. From high-density GPU design and liquid cooling deployment to open ecosystem integration, both companies continue to co-develop solutions that drive greater efficiency and sustainability in data center operations.

"The future of AI and HPC is not just about speed, it's about intelligent integration and sustainable deployment. Each server we build aims to address real-world technical and operational challenges, not just push hardware specs. SG720-2A/ OG720-2A is a true collaboration with AMD that empowers customers with a stable, high-performance, and scalable compute foundation." said Alan Chang, Vice President of the Infrastructure Solutions Business Group at Compal.

The series made its debut at AMD Advancing AI 2025 and was concurrently showcased at ISC 2025 in Europe. Through this dual-venue exposure, Compal is further expanding its global visibility and partnership network across the AI and HPC domains, demonstrating a strong commitment to next-generation intelligent computing and international strategic development.

For more information, visit the website: https://www.compalserver.com

AMD, Instinct, ROCm, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners.

About Compal

Founded in 1984, Compal is a leading manufacturer in the notebook and smart device industry, creating brand value in collaboration with various sectors. Its groundbreaking product designs have received numerous international awards. In 2024, Compal was recognized by CommonWealth Magazine as one of Taiwan's top 6 manufacturers and has consistently ranked among the Forbes Global 2000 and Fortune Global 500 companies. In recent years, Compal has actively developed emerging businesses, including cloud servers, automotive electronics, and smart medical solutions, leveraging its integrated hardware and software R&D and manufacturing capabilities. For more information, please visit https://www.compal.com

FAQ

What are the key features of Compal's new SG720-2A/OG720-2A server platform?

The platform features AMD Instinct MI355X GPU architecture, supports up to eight AMD Instinct MI350 Series GPUs, offers dual cooling options (air and liquid), includes 288GB HBM3E memory with 8TB/s bandwidth, and supports PCIe Gen5 connectivity.

How does the cooling system work in Compal's new server platform?

The platform offers both single-phase and two-phase liquid cooling configurations, with the two-phase solution co-developed with ZutaCore using their HyperCool 2-Phase DLC technology for optimal thermal performance.

What AI frameworks are supported by Compal's SG720-2A/OG720-2A?

The platform is fully compatible with ROCm, PyTorch, TensorFlow, and other mainstream open-source AI stacks.

What rack standards does the Compal SG720-2A/OG720-2A support?

The server platform supports both EIA 19" and ORv3 21" rack standards with a modular architecture for easy upgrades and maintenance.

What is the primary application of Compal's new server platform?

The platform is primarily designed for next-generation generative AI and large language model (LLM) training, offering high-density GPU integration and flexible cooling options for modern data center operations.