Nvidia has unveiled 65 new and updated software development kits at GTC 2021, alongside a partnership with industry leaders to speed up quantum research.
The company’s roster of accelerated computing kits now exceeds 150, supporting the almost three million developers in Nvidia’s Developer Program.
Four of the major new SDKs are:
- ReOpt – Automatically optimises logistical processes using advanced, parallel algorithms. This includes vehicle routes, warehouse selection, and fleet mix. The dynamic rerouting capabilities – shown in an on-stage demo – can reduce travel time, save fuel costs, and minimise idle periods.
- cuNumeric – Implements the popular NumPy application programming interface and enables scaling to multi-GPU and multi-node systems with zero code changes.
- cuQuantum – Designed for quantum computing, it enables large quantum circuits to be simulated faster. Researchers can use it to simulate near-term variational quantum algorithms for molecules and error-correction algorithms for identifying fault tolerance, and to accelerate popular quantum simulators from Atos, Google, and IBM.
- CUDA-X accelerated DGL container – Helps developers and data scientists working on graph neural networks (GNNs) quickly set up a working environment. The container makes it easy to work in an integrated, GPU-accelerated GNN environment combining DGL and PyTorch.
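To illustrate cuNumeric’s “zero code changes” claim, here is a minimal sketch: the snippet below is ordinary NumPy, and per Nvidia, swapping the import for cuNumeric’s module (the exact module name is an assumption here) is the only edit needed for the same program to scale across GPUs and nodes.

```python
# Plain NumPy today; Nvidia's claim is that replacing this line with
# "import cunumeric as np" (module name assumed) is the only change
# needed to run the same code on multi-GPU and multi-node systems.
import numpy as np

def jacobi_step(grid):
    # One Jacobi relaxation sweep -- a typical stencil workload used to
    # demonstrate array-programming frameworks like cuNumeric.
    return 0.25 * (np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
                   + np.roll(grid, 1, 1) + np.roll(grid, -1, 1))

grid = np.ones((64, 64))
grid = jacobi_step(grid)
```

Because the API surface matches NumPy, existing array code needs no restructuring; the framework decides how to partition and distribute the arrays.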
Some existing AI-related SDKs that have received notable updates are:
- DeepStream 6.0 – introduces a new graph composer that makes computer vision accessible with a visual drag-and-drop interface.
- Triton 2.15, TensorRT 8.2, and cuDNN 8.4 – assist with the development of deep neural networks by providing new optimisations for large language models and inference acceleration for gradient-boosted decision trees and random forests.
- Merlin 0.8 – boosts recommender systems with new capabilities for predicting a user’s next action with little or no user data, plus support for models larger than GPU memory.
Accelerating quantum research
Nvidia has established a partnership with Google, IBM, and a number of small companies, national labs, and university research groups to accelerate quantum research.
“It takes a village to nurture an emerging technology, so Nvidia is collaborating with Google Quantum AI, IBM, and others to take quantum computing to the next level,” explained the company in a blog post.
Nvidia’s initial contribution to the partnership is cuStateVec, the first library in the aforementioned cuQuantum SDK. It accelerates the state vector simulation method, which tracks the full state of the quantum system in memory and can scale to tens of qubits.
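The state vector method that cuStateVec accelerates can be sketched in a few lines of NumPy. An n-qubit state is a dense vector of 2**n complex amplitudes, which is why memory grows exponentially and simulators top out at “tens of qubits”; this toy version (not Nvidia’s implementation) applies a gate by contracting its 2x2 matrix against the target qubit’s axis.

```python
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    # Reshape the flat 2**n vector into an n-dimensional tensor with one
    # axis per qubit, contract the 2x2 gate with the target axis, then
    # restore the axis order and flatten back.
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in |000>

# A Hadamard on every qubit yields the uniform superposition:
# each of the 8 basis states has probability 1/8.
for q in range(n):
    state = apply_gate(state, H, q, n)

probs = np.abs(state) ** 2
```

Each added qubit doubles both the vector length and the work per gate, which is what makes GPU acceleration of the method attractive.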
cuStateVec has been integrated into Google Quantum AI’s state vector simulator qsim and can be used through the open-source framework Cirq.
“Quantum computing promises to solve tough challenges in computing that are beyond the reach of traditional systems,” commented Catherine Vollgraff Heidweiller at Google Quantum AI.
“This high-performance simulation stack will accelerate the work of researchers around the world who are developing algorithms and applications for quantum computers.”
In December, cuStateVec will also be integrated with Qiskit Aer – a high-performance simulator framework for quantum circuits from IBM.
Among the national labs using cuQuantum to accelerate their research are Oak Ridge, Argonne, Lawrence Berkeley, and Pacific Northwest. University research groups include those at Caltech, Oxford, and MIT.
Nvidia is helping developers to get started by creating a ‘DGX quantum appliance’ that puts its simulation software in a container optimised for its DGX A100 systems. The software will be available early next year via the company’s NGC Catalog.
(Image Credit: Nvidia)