The Nvidia DGX-2 isn't the first off-the-shelf Nvidia server to be aimed at AI. That honour goes to the DGX-1, built around a mix of Intel Xeon processors paired with Nvidia's own AI-optimised Tesla V100 Volta-architecture GPUs. The DGX-2 continues that approach, but instead of eight Tesla V100s joined via Nvidia's NVLink bus, the DGX-2 features 16 of these mighty GPUs connected using its more scalable NVSwitch technology. According to Nvidia, this arrangement allows the DGX-2 to handle deep learning and other demanding AI and HPC workloads up to 10 times faster than its smaller sibling.
Although it was announced at the same time, it has taken a further six months for the larger model to appear. One of the first to make it to the UK was installed in the labs of Nvidia partner Boston Limited, who asked if we'd like to take a look: we did, and here is what we found.
Along with performance, size is a big differentiator with the DGX-2, which has the same crackle-finish gold bezel as the DGX-1 but is physically a lot bigger, weighing in at 154.2kg (340lbs) compared to 60.8kg (134lbs) for the DGX-1 and consuming 10 rack units instead of three.
It's also worth noting that the Nvidia DGX-2 requires a lot more power than its little brother, calling for up to 10kW at full tilt, rising to 12kW for the recently announced DGX-2H model (about which more shortly). The picture below shows the power arrangements required at Boston to keep this little monster happy. Cooling, likewise, will need careful consideration, especially where more than one DGX-2 is deployed or where it's installed alongside other equipment in the same rack.
Distributing that power is a set of six hot-swap, redundant PSUs that slide in at the rear of the chassis, along with the various modules that make up the rest of the system. Cooling, meanwhile, is handled by an array of ten fans located behind the front bezel, with room on either side for 16 2.5in storage devices in two banks of eight.
Nvidia includes eight 3.84TB Micron 9200 Pro NVMe drives as part of the base configuration, equating to just over 30TB of high-performance storage. This, however, is mostly to handle local data, with separate storage on the main motherboard for the OS and application code. That also leaves eight bays empty to add more storage if needed. On top of that, the Nvidia DGX-2 bristles with high-bandwidth network interfaces to connect to further capacity and build server clusters if required.
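For readers budgeting storage, the quoted figures are easy to sanity-check. A minimal sketch (the drive count and per-drive capacity come from the review above; the headroom calculation for the eight empty bays is our own extrapolation, assuming matching drives):

```python
# Base configuration: eight 3.84TB Micron 9200 Pro NVMe drives
drive_count = 8
drive_tb = 3.84

base_tb = drive_count * drive_tb        # capacity as shipped
print(f"Base storage: {base_tb:.2f}TB")  # 30.72TB -> "just over 30TB"

# Hypothetical: filling the eight spare bays with the same drives
full_tb = (drive_count + 8) * drive_tb
print(f"Fully populated: {full_tb:.2f}TB")
```

Note these are raw marketing terabytes (decimal), before any filesystem or RAID overhead.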