Optimised for Hyperscale Data Centre, Infrastructure, Front-End Server and Enterprise Server workloads. Ampere® Altra® processor. Dual redundant 800W power supplies. 10x 2.5" hot-swap NVMe drive bays.
Ampere-powered MegaDC servers deliver AI inference performance for environments from edge to cloud, with three purpose-built server models and recommended configurations available with or without AI accelerators.
For greater flexibility, all MegaDC servers support open standards, including OpenBMC for customised control over functionality and versioning, Advanced I/O Modules (AIOM) that accept OCP V3.0 SFF cards, and common redundant power supplies (CRPS) in 2U systems. Ampere Cloud Native Processors combined with Ampere Optimised AI Frameworks offer GPU-free AI inference at performance levels that meet client needs across AI workloads, whether generative AI, NLP, recommender engines, or computer vision.
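OpenBMC exposes the standard DMTF Redfish REST API, so the baseboard management controller can be queried without vendor-specific tooling. A minimal sketch, assuming a BMC reachable at the placeholder address 192.0.2.10 with placeholder credentials:

```shell
# Query an OpenBMC-based controller over Redfish (HTTPS).
# BMC address and credentials below are placeholders for your environment.
BMC=192.0.2.10
USER=admin
PASS=changeme

# Service root: lists the Redfish collections the BMC exposes
curl -sk -u "$USER:$PASS" "https://$BMC/redfish/v1/"

# Enumerate managed systems, then inspect the system resource
# (OpenBMC typically publishes it at /redfish/v1/Systems/system)
curl -sk -u "$USER:$PASS" "https://$BMC/redfish/v1/Systems"
curl -sk -u "$USER:$PASS" "https://$BMC/redfish/v1/Systems/system"
```

The `-k` flag skips certificate verification, which is common against a factory-default BMC certificate; in production, install a trusted certificate and drop it.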
Key Applications: Hyperscale Data Centre, Infrastructure, Front-End Server, Enterprise Server

CPU Family: Ampere
Drive Bays: 10 front hot-swap 2.5" NVMe drive bays
Expansion Slots: 1 PCIe 4.0 x16 AIOM slot (OCP 3.0 compatible); 3 PCIe 4.0 x16 low-profile slots
Form Factor: 1U Rackmount
IPMI: IPMI 2.0 with virtual media over LAN and KVM-over-LAN support
Manufacturer: Supermicro
Memory Slots: 16 DIMM slots
Memory Voltage: 1.2V
Network Connectivity: 2x SFP28 25GbE
System Cooling: 6 heavy-duty 4cm fans
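The IPMI 2.0 interface listed above can be driven out-of-band with the standard ipmitool utility over the RMCP+ LAN interface. A minimal sketch, with the BMC address and credentials as placeholders:

```shell
# Out-of-band management via IPMI 2.0.
# -I lanplus selects RMCP+, the transport required for IPMI v2.0 over LAN.
# Address and credentials are placeholders for your BMC.
BMC=192.0.2.10
USER=admin
PASS=changeme

# Chassis power state
ipmitool -I lanplus -H "$BMC" -U "$USER" -P "$PASS" chassis status

# Sensor readings (fan speeds, temperatures, PSU status)
ipmitool -I lanplus -H "$BMC" -U "$USER" -P "$PASS" sdr list

# Redirect the serial console over the network (Serial-over-LAN)
ipmitool -I lanplus -H "$BMC" -U "$USER" -P "$PASS" sol activate
```

Passing the password with `-P` exposes it to the process list; for interactive use, `-a` prompts for it instead.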
To help our clients make informed decisions about new technologies, we have opened up our research and development facilities, and we actively encourage customers to try the latest platforms using their own tools and, where necessary, alongside their existing hardware. Remote access is also available.