Smells Like AI Spirit: Baidu will help develop Intel’s Nervana neural processor

Ryan Daws is a senior editor at TechForge Media, with a seasoned background spanning over a decade in tech journalism. His expertise lies in identifying the latest technological trends, dissecting complex topics, and weaving compelling narratives around the most cutting-edge developments. His articles and interviews with leading industry figures have earned him recognition as a key influencer by organisations such as Onalytica. Publications under his stewardship have since gained recognition from leading analyst houses like Forrester for their performance. Find him on X (@gadget_ry) or Mastodon.

Intel announced at Baidu’s Create conference this week that Baidu will help develop Intel’s Nervana Neural Network Processor.

Speaking on stage at the conference in Beijing, Intel corporate vice president Naveen Rao made the announcement.

“The next few years will see an explosion in the complexity of AI models and the need for massive deep learning compute at scale. Intel and Baidu are focusing their decade-long collaboration on building radical new hardware, codesigned with enabling software, that will evolve with this new reality – something we call ‘AI 2.0.’”

Intel’s Neural Network Processor for Training, codenamed NNP-T 1000, is designed to train deep learning models at speed. 32GB of high-bandwidth memory (HBM) and local SRAM are placed close to where computation happens, allowing more model parameters to be stored on-die – saving significant power while increasing performance.

The NNP-T 1000 is set to ship alongside the Neural Network Processor for Inference (NNP-I 1000) chip later this year. As the name suggests, the NNP-I 1000 is designed for AI inferencing and features general-purpose processor cores based on Intel’s Ice Lake architecture.

Baidu and Intel have a history of collaborating in AI. Intel has helped to optimise Baidu’s PaddlePaddle deep learning framework for its Xeon Scalable processors since 2016. More recently, Baidu and Intel developed the BIE-AI-Box – a hardware kit for analysing the frames of footage captured by cockpit cameras.

Intel sees a great deal of its future growth in AI. The company’s AI chips generated $1 billion in revenue last year, and Intel expects that figure to grow by around 30 percent annually to reach $10 billion by 2022.

