NVIDIA’s DGX Cloud infrastructure, which lets organizations lease supercomputing hardware for training generative AI models, is now generally available. First announced in March, the service costs $36,999 per instance per month and offers an alternative to NVIDIA’s own $200,000 DGX server. It runs on Oracle Cloud Infrastructure and on NVIDIA hardware located in the US and the United Kingdom.

DGX Cloud is effectively a remote-access version of NVIDIA’s DGX hardware, backed by thousands of NVIDIA GPUs hosted on Oracle Cloud Infrastructure. DGX systems are the hardware ChatGPT was originally trained on, so NVIDIA has the right pedigree for organizations that want to spin up their own generative AI models. To train ChatGPT, Microsoft linked together tens of thousands of NVIDIA A100 graphics chips to get the power it needed; now NVIDIA wants to make that process much easier by offering AI training as a service. Pharmaceutical companies, manufacturers, and financial institutions using natural language processing and AI chatbots are among DGX Cloud’s existing customers, NVIDIA said.
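For context on what "linking together" thousands of GPUs means in practice, below is a minimal, generic sketch of multi-GPU data-parallel training using PyTorch's DistributedDataParallel. It is illustrative only: the linear model, batch size, and learning rate are placeholders, and this is not NVIDIA's DGX Cloud tooling or the actual ChatGPT training stack.

```python
# Generic sketch of data-parallel training across GPUs with PyTorch DDP.
# Placeholder model and data; not NVIDIA's or OpenAI's actual pipeline.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model; a real job would build a large transformer here.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(100):
        x = torch.randn(32, 1024, device=local_rank)  # dummy batch per GPU
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()   # gradients are all-reduced across all GPUs here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nproc_per_node=8 train.py`, each process drives one GPU and gradients are synchronized automatically; scaling to many nodes adds rendezvous flags but follows the same pattern.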
Full story: NVIDIA DGX Cloud AI Supercomputing Brings AI Training as-a-Service.