ASUS has unveiled its latest AI server lineup at COMPUTEX 2024, featuring a variety of solutions for applications ranging from generative AI to advanced storage; let’s find out what they are.

As the title suggests, the highlight of the exhibit is the ASUS ESC AI POD (ESC NM2N721-E1 with NVIDIA GB200 NVL72), developed in collaboration with NVIDIA. This scale-up, larger-form-factor system is built on the NVIDIA Blackwell architecture, featuring the NVIDIA GB200 Grace Blackwell Superchip and fifth-generation NVIDIA NVLink technology.

Designed for high-performance and efficient AI computing, the full rack solution integrates GPUs, CPUs, and switches for rapid communication, enhancing trillion-parameter LLM training and real-time inference. It also supports both liquid-to-air and liquid-to-liquid cooling solutions for optimal performance.

The brand also introduced a range of servers based on the NVIDIA MGX architecture, namely the 2U ESC NM1-E1 and ESC NM2-E1 servers with NVIDIA GB200 NVL2, and the 1U ESR1-511N-M1, to tackle the increased demands of generative AI applications. Powered by the NVIDIA GH200 Grace Hopper Superchip, these servers are designed for large-scale AI and HPC workloads, enabling efficient data transfers, deep-learning training and inference, data analytics, and high-performance computing.

Concluding the list are ASUS’s latest HGX servers: the ESC N8, which runs on 5th Gen Intel Xeon Scalable processors and NVIDIA Blackwell Tensor Core GPUs, and the ESC N8A, powered by AMD EPYC 9004 processors and NVIDIA Blackwell GPUs. Packing improved thermal solutions for optimal performance and reduced PUE (power usage effectiveness), these servers are tailored for AI and data science, offering a one-GPU-to-one-NIC configuration to maximize throughput for compute-intensive tasks.
