Intel 14th Gen Meteor Lake Brings Dedicated NPU for Offline AI Processing

Intel, at its Innovation 2023 event, announced the 14th Gen Meteor Lake processor lineup, which comes with a wide variety of improvements and new features. While the Intel 4 process, a disaggregated architecture, and Intel Arc integrated graphics are exciting upgrades, users will appreciate the AI features that Meteor Lake chips come integrated with. Intel announced it is bringing artificial intelligence to modern PCs through an all-new neural processing unit (NPU) in 14th Gen Meteor Lake processors. But what is the NPU, and how will it help? Let's talk about it below.

Intel Brings NPU to Client Chips for the First Time

While artificial intelligence is widely used online, relying on the cloud has limitations, the most prominent of which are high latency and connectivity issues. A Neural Processing Unit (NPU) is a dedicated unit that sits alongside the CPU to process AI tasks. So, instead of relying on the cloud, all AI-related processing is done on the device itself. Intel's new NPU will be able to do exactly that.

While Intel has been experimenting with AI for a while, this is the first time the company is bringing an NPU to client silicon. The NPU on Meteor Lake chips is a dedicated low-power AI engine that can handle sustained AI workloads both offline and online.

This means that instead of relying on cloud-based AI programs to do the heavy lifting, you will be able to use the Intel NPU for the same jobs, such as AI image editing, audio separation, and more. Having an on-device NPU for AI processing and inference will undoubtedly have a lot of advantages.

Since there will be no network latency, users can expect lightning-fast processing and output. Furthermore, Intel's NPU will help improve privacy and security through the chip's security protocols. It is safe to say you will be able to use AI more freely, and daily at that.

Intel NPU Can Sustainably Interchange Workloads

The Intel NPU is divided into two main portions that each do their own job: Device Management and Compute Management. The former supports a new driver model from Microsoft called the Microsoft Compute Driver Model (MCDM), which is vital to facilitate AI processing on the device. Furthermore, because of this driver model, users will be able to see the Intel NPU in the Windows Task Manager alongside the CPU and GPU.

Intel has been working on the aforementioned driver model with Microsoft for the last six years. This has enabled the company to ensure that the NPU works through tasks efficiently while dynamically managing power, quickly dropping into a low-power state to sustain workloads and ramping back up when needed.

The Intel Meteor Lake NPU employs a multi-engine architecture. Built inside it are two neural compute engines, which can work on two different workloads or together on the same one. Within each neural compute engine lies the Inference Pipeline, which acts as a core component and helps the NPU function.
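To make the dual-engine idea concrete, here is a toy sketch of the two scheduling modes described above: two engines taking two different workloads in parallel, or splitting one workload between them. This is purely illustrative Python using thread-pool workers as stand-ins for the compute engines; it is not Intel's actual NPU API, and real NPU access goes through vendor runtimes such as OpenVINO.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one of the two neural compute engines.
def run_on_engine(engine_id, workload):
    # Pretend the engine processes its workload and reports back.
    return f"engine-{engine_id} finished {workload}"

# Mode 1: the two engines handle two different workloads in parallel.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_on_engine, [0, 1],
                            ["image-upscale", "noise-removal"]))

# Mode 2: one large workload is split across both engines.
halves = ["frames 0-499", "frames 500-999"]
with ThreadPoolExecutor(max_workers=2) as pool:
    shared = list(pool.map(run_on_engine, [0, 1], halves))

print(results)
print(shared)
```

The split-workload mode is why a second engine can roughly halve the time of a single large inference job, while the independent mode lets, say, a video call's background blur run alongside noise suppression.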

Intel’s NPU Makes On-Device AI Possible

While the actual workings of the NPU go far deeper, users can expect a lot of firsts with the new Meteor Lake architecture. With on-device AI processing coming to Intel’s consumer chips, it will be exciting to see their range of applications. Also, the presence of the Intel 14th Gen Meteor Lake’s NPU in the Task Manager, while sounding simple, indicates that consumers will be able to capitalize on it. However, we will have to wait for the rollout to see how far we can push it.

Featured Image Courtesy: Intel

Upanishad Sharma

Combining his love for Literature and Tech, Upanishad dived into the world of technology journalism with fire. Now he writes about anything and everything while keeping a keen eye on his first love of gaming. Often found chronically walking around the office.
