Nvidia explains how ‘true adoption’ of AI is making an impact

Ryan Daws is a senior editor at TechForge Media with over a decade of experience in crafting compelling narratives and making complex topics accessible. His articles and interviews with industry leaders have earned him recognition as a key influencer by organisations like Onalytica. Under his leadership, publications have been praised by analyst firms such as Forrester for their excellence and performance. Connect with him on X (@gadget_ry) or Mastodon (@gadgetry@techhub.social)

Nvidia Senior Director of Enterprise David Hogan spoke at this year’s AI Expo about how the company is seeing artificial intelligence adoption making an impact.

In the keynote session, titled ‘What is the true adoption of AI’, Hogan provided real-world examples of how the technology is being used and enabled by Nvidia’s GPUs. But first, he highlighted the momentum we’re seeing in AI.

“Many governments have announced investments in AI and how they’re going to position themselves,” comments Hogan. “Countries around the world are starting to invest in very large infrastructures.”

The world’s most powerful supercomputers are powered by Nvidia GPUs. ORNL Summit, the current fastest, uses an incredible 27,648 GPUs to deliver over 144 petaflops of performance. Vast amounts of computational power are needed for AI, which puts Nvidia in a great position to capitalise.

“The compute demands of AI are huge and beyond what anybody has seen within a standard enterprise environment before,” says Hogan. “You cannot train a neural network on a standard CPU cluster.”
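Hogan’s point about CPU clusters comes down to parallelism: neural-network training is dominated by large matrix multiplications, which map naturally onto thousands of GPU cores. As a rough, illustrative sketch (not anything from the keynote), even on a single CPU the gap between serial Python loops and parallel-friendly vectorised maths hints at why purpose-built hardware matters:

```python
import time
import numpy as np

# Neural-network training is dominated by large matrix multiplications.
# This toy comparison shows why hardware that parallelises them wins:
# even on a CPU, vectorised BLAS beats a naive Python loop by orders of
# magnitude, and GPUs extend that gap much further.
n = 128
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
naive = [[sum(a[i, k] * b[k, j] for k in range(n)) for j in range(n)]
         for i in range(n)]
naive_time = time.perf_counter() - start

start = time.perf_counter()
fast = a @ b  # vectorised matrix multiply
blas_time = time.perf_counter() - start

assert np.allclose(naive, fast)  # same result, vastly different speed
print(f"naive loop: {naive_time:.4f}s, vectorised: {blas_time:.6f}s")
```

The same arithmetic, expressed as one dense operation, is exactly what GPU architectures accelerate at scale.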

Nvidia started off by creating graphics cards for gaming. While that’s still a big part of what the company does, Hogan says the company pivoted towards AI back in 2012.

A great deal of the presentation was spent on autonomous vehicles, which is unsurprising given the demand and Nvidia’s expertise in the field. Hogan highlights that you simply cannot train driverless cars using CPUs and provided a comparison in cost, size, and power consumption.

“A new type of computing is starting to evolve based around GPU architecture called ‘dense computing’ – the ability to build systems that are highly-powerful, huge amounts of computational scale, but actually contained within a very small configuration,” explains Hogan.

Autonomous car manufacturers need to train on petabytes of data per day, iterate on their models, and redeploy them in order to get those vehicles to market.

Nvidia has a machine called the DGX-2, which delivers two petaflops of performance. “That is one server that’s equivalent to 800 traditional servers in one box.”
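A quick sanity check on that equivalence (my arithmetic, not Hogan’s): dividing the DGX-2’s quoted throughput by 800 implies roughly 2.5 teraflops per traditional server, a plausible figure for a dual-socket CPU node of the era.

```python
# Sanity-check the "one box = 800 servers" comparison.
dgx2_flops = 2e15          # 2 petaflops, as quoted for the DGX-2
equivalent_servers = 800
per_server = dgx2_flops / equivalent_servers
print(per_server)  # 2.5e12 FLOPS, i.e. ~2.5 teraflops per server
```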

Nvidia has a total of 370 autonomous vehicle partnerships, which Hogan says cover most of the world’s automotive brands. Many of these partners are investing heavily and racing to deliver at least ‘Level 2’ driverless cars in the 2020-21 timeframe.

“We have a fleet of autonomous cars,” says Hogan. “It’s not our intention to compete with Uber, Daimler or BMW, but the best way of us helping our customers enable that is by trying it ourselves.”

“All the work our customers do we’ve also done ourselves so we understand the challenges and what it takes to do this.”

Real-world impact

Hogan notes how AI is a “horizontal capability that sits across organisations” and is “an enabler for many, many things”. It’s certainly a challenge to come up with examples of industries that cannot be improved to some degree through AI.

Following autonomous cars, Nvidia sees the next mass scaling of AI happening in healthcare (as our dear readers already know, of course).

Hogan provides the natural example of the UK’s National Health Service (NHS) which has vast amounts of patient data. Bringing this data together and having an AI make sense of it can unlock valuable information to improve healthcare.

AIs that can interpret medical imaging on a par with, or even better than, some doctors are starting to become available. However, the scans themselves are still 2D images that are alien to most people.

Hogan showed how AI is able to turn 2D imagery into 3D models of the organs, which are easier to understand. In the GIF below, we see a radiograph of a heart being turned into a 3D model.

We’ve also heard about how AI is helping with the field of genomics, assisting in finding cures for human diseases. Nvidia GPUs are used for Oxford Nanopore’s MinIT handheld, which enables DNA sequencing of things such as plants to be conducted in the field.

In a blog post last year, Nvidia explained how MinIT uses AI for basecalling:

“Nanopore sequencing measures tiny ionic currents that pass through nanoscale holes called nanopores. It detects signal changes when DNA passes through these holes. This captured signal produces raw data that requires signal processing to determine the order of DNA bases – known as the ‘sequence.’ This is called basecalling.

This analysis problem is a perfect match for AI, specifically recurrent neural networks. Compared with previous methods, RNNs allow for more accuracy in time-series data, which Oxford Nanopore’s sequencers are known for.”
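To make the blog post’s description concrete, here is a minimal, purely illustrative sketch of the idea: a recurrent network reads the raw current signal one sample at a time, carrying a hidden state, and emits per-step scores over the four DNA bases. The weights are random and everything here is invented for illustration; Oxford Nanopore’s production basecallers are far larger trained networks.

```python
import numpy as np

# Toy RNN basecaller sketch: hidden state carries context across the
# time-series signal; each step scores the four bases A, C, G, T.
rng = np.random.default_rng(0)
hidden, bases = 16, 4
Wxh = rng.normal(size=(hidden, 1)) * 0.1       # signal -> hidden
Whh = rng.normal(size=(hidden, hidden)) * 0.1  # hidden -> hidden (recurrence)
Why = rng.normal(size=(bases, hidden)) * 0.1   # hidden -> base scores

signal = rng.normal(size=100)  # stand-in for raw nanopore ionic current
h = np.zeros(hidden)
calls = []
for x in signal:
    h = np.tanh(Wxh[:, 0] * x + Whh @ h)       # recurrent state update
    scores = Why @ h                           # logits over A, C, G, T
    calls.append("ACGT"[int(np.argmax(scores))])

print("".join(calls))  # one (untrained) base call per signal sample
```

The recurrence is what suits time-series data: each call can depend on the signal’s recent history, not just the current sample.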

Hogan notes how, in many respects, eCommerce paved the way for AI. Data collected for things such as advertising helps to train neural networks. In addition, eCommerce firms have consistently aimed to improve and optimise their algorithms for things such as recommendations to attract customers.

“All that data, all that Facebook information that we’ve created, has enabled us to train networks,” notes Hogan.

Brick-and-mortar retailers are also being improved by AI. Hogan gives the example of Walmart which is using AI to improve their demand forecasting and keep supply chains running smoothly.

In real time, Walmart is able to see where potential supply challenges are and take action to avoid or minimise them. The company is even able to see where weather conditions may cause issues.
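The core loop being described is demand forecasting feeding a supply alert. As a hypothetical sketch (Walmart’s real system is far more sophisticated and GPU-accelerated; all names and numbers here are invented), smoothing recent sales into a demand estimate and flagging stores whose stock would run out before the next delivery captures the idea:

```python
# Hedged sketch of a forecasting-driven supply alert.
def forecast_demand(daily_sales, alpha=0.3):
    """Exponentially smoothed estimate of next-day demand."""
    estimate = daily_sales[0]
    for sales in daily_sales[1:]:
        estimate = alpha * sales + (1 - alpha) * estimate
    return estimate

def at_risk(stock, daily_sales, days_to_delivery):
    """True if projected demand exhausts stock before restocking."""
    return forecast_demand(daily_sales) * days_to_delivery > stock

# Sales are trending up, so 50 units won't last 3 days.
print(at_risk(stock=50, daily_sales=[18, 20, 25, 30], days_to_delivery=3))
# -> True
```

Running something like this across every product and store, continuously, is where the scale (and the hardware) comes in.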

Hogan says this has saved Walmart tens of billions of dollars. “This is just one example of how AI is making an impact today not just on the bottom line but also the overall performance of the business”.

Accenture is now detecting around 200 million cyber threats per day, claims Hogan. He notes how protecting against such a vast number of evolving threats is simply not possible without AI.

“It’s impossible to address that, look at it, prioritise it, and action it in any other way than applying AI,” comments Hogan. “AI is based around patterns – things that are different – and when to act and when not to.”
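Hogan’s “patterns – things that are different” framing is essentially anomaly detection: score events by how far they deviate from a learned baseline and act only on the outliers, rather than inspecting 200 million events by hand. Real systems use learned models; this toy z-score filter (all values invented) just shows the triage principle:

```python
import statistics

# Flag only events that deviate sharply from the baseline pattern.
def flag_anomalies(baseline, events, threshold=3.0):
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    return [e for e in events if abs(e - mean) / stdev > threshold]

baseline = [100, 102, 98, 101, 99, 100, 103, 97]  # normal traffic levels
events = [101, 99, 250, 100, 12]                  # incoming measurements
print(flag_anomalies(baseline, events))           # -> [250, 12]
```

Only the two outliers surface for action; everything matching the learned pattern is ignored, which is what makes the volume tractable.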

While often we hear about what AI could one day be used for, Hogan’s presentation was a fascinating insight into how Nvidia is seeing it making an impact today or in the not-so-distant future.

Interested in hearing industry leaders discuss subjects like this and their use cases? Attend the co-located AI & Big Data Expo events with upcoming shows in Silicon Valley, London, and Amsterdam to learn more. Co-located with the IoT Tech Expo, Blockchain Expo, and Cyber Security & Cloud Expo.

