Key takeaways
Micron DDR5 128GB memory and 5th Gen AMD EPYC™ CPUs with 128 cores offer a powerful solution for AI and database infrastructures. This solution effectively handles the computational complexity of large model sizes and vast datasets characteristic of AI, machine learning, data analytics and IMDB applications.
- SVM: 1.3x higher bandwidth use, driven by the higher memory clock speeds of Micron® 128GB RDIMMs and the increased core counts of 5th Gen AMD EPYC™ processors.1
- Redis IMDB: 30% better average latency when Redis runs on the higher-capacity, higher-core-count configuration.
- SAP SD: 201,000 users on the SAP Sales and Distribution (SAP SD) benchmark. With 30% higher memory capacity and a 30% higher memory clock, the two-socket score exceeds the previous best six-socket score.
1 Pairing Micron® 128GB DDR5 RDIMMs with 5th Gen AMD EPYC™ processors (codenamed Turin).
2 As compared to Micron® 96GB DDR5 RDIMMs and 4th Gen AMD EPYC™ processors with 96 cores (codenamed Genoa).
Modern data centers require high-capacity memory and significant processing power to run a variety of workloads that support enterprise artificial intelligence (AI) and machine learning (ML) initiatives. Pairing Micron® DDR5 128GB RDIMMs with 5th Gen AMD EPYC™ processors delivers outstanding performance and capability for the wide range of server workloads that data centers run, including powering large-scale cloud-based infrastructures and hosting demanding business applications.
In this blog, we showcase benchmark test results for an AI/ML support vector machine (SVM), the Redis in-memory database (IMDB) and SAP SD, comparing the following hardware setups:
- Micron DDR5 128GB DIMMs with 5th Gen AMD EPYC processors (codenamed Turin)
- Micron DDR5 96GB DIMMs with 4th Gen AMD EPYC processors (codenamed Genoa)
Our testing shows that performance for SVM, SAP SD and Redis IMDB is enhanced by Micron DDR5 RDIMMs with higher capacity (128GB) and bandwidth (capable of up to 8000 MT/s).
Hardware and system setup
The details of the system architecture are shown in the table below. We compared two systems, denoted in this blog as A and B. System A combined 4th Gen AMD EPYC processors (96 cores) with Micron 96GB DDR5 DIMMs, and system B combined 5th Gen AMD EPYC processors (128 cores) with Micron 128GB DDR5 DIMMs. Both systems were configured at 12GB of memory per core: the 128-core CPU with a 128GB DIMM in each of its 12 memory channels, and the 96-core CPU with a 96GB DIMM in each of its 12 memory channels.
| | System A | System B |
|---|---|---|
| Hardware | 4th Gen AMD EPYC™ processors (codenamed Genoa) | 5th Gen AMD EPYC™ processors (codenamed Turin) |
| Memory | Micron 96GB DDR5 4800 MT/s, dual rank, 12 channels | Micron 128GB DDR5 6400 MT/s, dual rank, 12 channels |
| CPU | Dual-socket AMD EPYC 9654 (96-core) | Dual-socket AMD EPYC (128-core) |
| Storage (for SVM) | Micron 9400 8TB (3) | Micron 9400 8TB (3) |
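As a quick check of the 12GB/core figure, the snippet below recomputes the per-socket capacity from the table above. It assumes one DIMM per channel, which matches the configurations listed, and is purely illustrative arithmetic.

```python
# Back-of-the-envelope check of the 12GB/core figure quoted above
# (assumes one DIMM per memory channel on a single socket).
configs = {
    "System A": {"dimm_gb": 96, "channels": 12, "cores": 96},
    "System B": {"dimm_gb": 128, "channels": 12, "cores": 128},
}

for name, c in configs.items():
    total_gb = c["dimm_gb"] * c["channels"]   # capacity per socket
    per_core = total_gb / c["cores"]          # GB per core
    print(f"{name}: {total_gb} GB total, {per_core:.0f} GB/core")

# System A: 1152 GB total, 12 GB/core
# System B: 1536 GB total, 12 GB/core
```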
AI/ML support vector machine (SVM)
SVM is a machine learning algorithm widely used to preprocess datasets for many data science services deployed in the cloud. In our testing, we processed a 2TB dataset using the Intel HiBench benchmark suite with the Spark ML engine.
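For context, the sketch below shows the kind of Spark ML SVM training job this workload drives. It is a minimal, illustrative example, not the HiBench configuration itself; the app name, dataset path and parameters are placeholders.

```python
# Minimal, illustrative Spark ML SVM training job; not the HiBench workload itself.
from pyspark.sql import SparkSession
from pyspark.ml.classification import LinearSVC

spark = SparkSession.builder.appName("svm-sketch").getOrCreate()

# Placeholder path: HiBench generates and stages its own synthetic dataset.
data = spark.read.format("libsvm").load("hdfs:///data/svm_sample")

svm = LinearSVC(maxIter=10, regParam=0.01)  # linear SVM classifier
model = svm.fit(data)                       # training step that stresses memory capacity and bandwidth
print("coefficients:", model.coefficients)

spark.stop()
```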
Faster execution time
For SVM, system B achieved an execution time 30% faster than system A. This is primarily due to the higher capacity and bandwidth of the 128GB memory modules, the higher core count of system B's processor, and the more effective use of the available memory bandwidth.
Higher bandwidth use
Our results show 1.3 times higher bandwidth use for SVM on system B than on system A due to the faster memory clock (6400 MT/s vs. 4800 MT/s) and the additional Zen 5 cores enabled by the 5th Gen AMD EPYC processors with 128 cores.1
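As a rough sanity check on that figure, theoretical peak per-socket bandwidth scales with the memory transfer rate. The short calculation below assumes a 64-bit data bus per channel and 12 populated channels per socket, and ignores real-world efficiency losses; it simply shows the roughly 1.33x headroom behind the measured result.

```python
# Theoretical peak DDR5 bandwidth per socket (64-bit data bus = 8 bytes per transfer,
# 12 channels, one DIMM per channel). Effective bandwidth in practice is lower.
def peak_gbs(mt_s, channels=12, bytes_per_transfer=8):
    return mt_s * bytes_per_transfer * channels / 1000  # GB/s

a = peak_gbs(4800)   # System A: 460.8 GB/s
b = peak_gbs(6400)   # System B: 614.4 GB/s
print(f"System A: {a:.1f} GB/s, System B: {b:.1f} GB/s, ratio: {b / a:.2f}x")
```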
The higher capacity of system B (128GB vs. 96GB) enables SVM to keep more data in memory, which minimizes storage input/output. We held the memory capacity per core constant at 12GB/core for both configurations. This approach enabled us to isolate the effect of additional compute capacity and increased memory clock speed over the baseline configuration (system A).
Redis
Redis is a fast, in-memory database (IMDB) used to store and access data for applications that require low latency. The memtier benchmark exercises Redis with numerous set:get operations, mimicking a multithreaded and multiclient execution model.
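To make the set:get pattern concrete, here is a minimal single-client sketch using the redis-py client against a local Redis instance. memtier_benchmark itself drives many threads and clients in parallel, so this is illustrative only; the host, port, key names and request count are assumptions.

```python
# Minimal single-client sketch of the set:get pattern that memtier exercises at scale.
# Assumes a local Redis server on the default port and the redis-py client library.
import time
import redis

r = redis.Redis(host="localhost", port=6379)
latencies = []

for i in range(10_000):
    start = time.perf_counter()
    r.set(f"key:{i}", "x" * 32)                            # write a small value
    r.get(f"key:{i}")                                      # read it back
    latencies.append((time.perf_counter() - start) * 1e3)  # round-trip time in ms

latencies.sort()
p99 = latencies[int(len(latencies) * 0.99)]
avg = sum(latencies) / len(latencies)
print(f"avg: {avg:.3f} ms, p99: {p99:.3f} ms")
```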
We achieved a 1.2 times speedup when running Redis on system B (128GB and 128 cores).2 Furthermore, this same combination improves average latency by 30% and p99 latency by 60%. Compared to previous generations of AMD EPYC processors, higher core counts, such as the 128 cores in the 5th Gen, can better use both the higher capacity and bandwidth of Micron 128GB DIMMs. The additional cores enable more throughput, which effectively allows enterprise data centers to serve more users.
SAP Sales and Distribution (SAP SD)
SAP (Systems, Applications and Products) is a widely used software suite for enterprise resource planning (ERP), made up of multiple subcomponents within the SAP ecosystem. The SAP Sales and Distribution (SAP SD) component covers all sales and distribution operations and processes. Benchmarked on a Dell PowerEdge R6725 server equipped with Micron DDR5 128GB RDIMMs and 5th Gen AMD EPYC processors, SAP SD set a new two-socket world record of 201,000 benchmark users, exceeding even the best six-socket score. The higher number of benchmark users shows the performance advantage of pairing Micron memory with 5th Gen AMD EPYC processors on Dell PowerEdge servers for database use cases. For details, check out Dell's blog.
AI in data centers
High-capacity memory — along with high memory bandwidth and low latency — is key for data center infrastructures to effectively handle the computational complexity, large model sizes and vast datasets characteristic of AI, machine learning, data analytics and IMDB applications. Micron DDR5 128GB memory modules paired with 5th Gen AMD EPYC processors offer a powerful solution for these environments, as shown by our workload results.
If you're upgrading your enterprise, AI or HPC infrastructure and want to find the right DDR5 configuration, contact the Micron Sales Network.