Intel explains the perks AI PCs provide to users, including the upcoming ability to run Microsoft Copilot locally

Intel AI Summit 2024 - AI PCs exponentially increasing.

What you need to know

  • Intel held an AI Summit for developers in Taipei this week.

  • During the event, Intel explained that Microsoft Copilot will run locally on AI PCs in the not-so-distant future.

  • This shift to local AI services on PC is made possible by AI-ready CPUs such as Intel Core Ultra processors.

  • Intel listed the benefits of being able to run AI services like Copilot locally, which include cost reduction, more reliable performance, better privacy for personal and organizational data, and more personalization capabilities.


This week in Taipei, Taiwan, thousands of developers gathered for an Intel AI Summit, and I was able to attend the event in person. During a question-and-answer session with Intel executives, I learned that Microsoft Copilot will soon be able to run locally on AI PCs rather than operating entirely via the cloud, as it does today. This had previously been rumored but hadn't been confirmed. Intel's Vice President of Client Computing Group Todd Lewellen clarified that certain aspects of the Copilot service could still require cloud access to work properly on an AI PC.

"As we go to that next gen, it's just going to enable us to run more things locally," Lewellyn said, "just like they will run Copilot with more elements of Copilot running locally on the client. I don't mean that everything in Copilot is running local, but you'll get a lot of key capabilities that will show up running on that NPU. We're starting now with Core Ultra, but we're gonna be right there with Microsoft as we go gen to gen."

Dell XPS 14 (9440) for 2024

Unsurprisingly, Intel and Microsoft have been meeting to discuss what makes an AI PC and how to provide the best user experience.

"I would say, we've gotten incredibly aligned with Microsoft over the last three months [regarding AI PCs]," said Lewellyn. "We [at Intel] would say that an AI PC is an Ultra that has an integrated NPU. Microsoft, if they were sitting with me right now would say, 'Todd you're right, but there are two other things that go with it, Copilot and the Copilot key. Those three things.'"

You might recall that Microsoft previously announced a new Copilot key that would appear on AI PC keyboards, and we've already seen it on the likes of the new Dell XPS laptops (see our Dell XPS 14 review), ASUS Zenbooks, and the newly announced Surface Pro and Surface Laptop. Additional AI PCs will continue to roll out as the year goes on.

Intel Core Ultra processor with NPU

But let's turn our attention back to Intel. As you've probably heard by now, Intel Core Ultra processors (also known as Meteor Lake) allow PCs to work far more efficiently than before because they include a Neural Processing Unit (NPU) in addition to the traditional CPU and GPU. You can learn more in my NPU guide, but the gist is that NPUs are specifically designed to take on certain AI and machine learning (ML) tasks, lightening the load on the CPU and GPU so they can run more efficiently. As a result, the system performs better overall.
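To make that a little more concrete, here is a minimal sketch of what offloading an AI task to the NPU can look like in practice, using Intel's OpenVINO toolkit as one illustrative route. The model file name is hypothetical, and targeting the "NPU" device assumes a Core Ultra system with Intel's NPU driver and the OpenVINO NPU plugin installed:

```python
# Minimal sketch: compiling and running a model on the Intel NPU with OpenVINO.
# Assumes a Core Ultra machine with the NPU driver/plugin installed;
# "model.xml" is a hypothetical OpenVINO IR model file with a static input shape.
import numpy as np
import openvino as ov

core = ov.Core()
print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on an AI PC

model = core.read_model("model.xml")
compiled = core.compile_model(model, device_name="NPU")  # target the NPU rather than the CPU or GPU

# Run a single inference with placeholder input data.
request = compiled.create_infer_request()
dummy_input = np.zeros(compiled.input(0).shape, dtype=np.float32)
result = request.infer({0: dummy_input})
```

The key point the example illustrates is simply that the NPU shows up as another compute device, so AI work can be steered to it while the CPU and GPU stay free for everything else.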

On another note, it turns out that giving AI tasks to the NPU rather than the GPU also helps a system's overall battery life. As explained by Lewellen, Intel has been talking frequently with Microsoft about the best way to get AI services to run locally on PCs, and better battery life was a key aspect of those talks.

Microsoft Copilot


"We had these long discussions [with Microsoft] over the course of the last year," said Lewellyn, "and we're like, 'Why can't we just run that on the GPU?' They're like, 'no, no. We want to make sure that the GPU and the CPU are freed up to do all this other work. But also, we want to make sure it's a great battery life experience and if we started running Copilot and some of those workloads on the GPU, suddenly you're going to see a huge hit, you know, on the battery life side."

"So, it really is about the right workflow map to the right processor. And we're doing that with Microsoft, but we're also doing that with the ISVs (independent software vendors)."

Why you should care about AI PCs and Copilot running locally

Intel AI Summit 2024 sign.

But why should you care about AI PCs and Copilot working locally? During the Intel AI Summit's opening keynote, Computer Vision Software Engineer Jamie Chang explained the key benefits of switching from cloud-based AI services to local AI operation, including cost reduction, better reliability in performance, more privacy for both personal and organizational data, and additional personalization options.

These are all very good points, but as I've been sitting here thinking about our AI PC future, I've thought of some additional ones. For one thing, being able to run AI services like Copilot locally could eventually mean that I don't need an internet connection for certain AI or ML tasks on my laptop. There have been plenty of times when I was out and about, away from trusted Wi-Fi, and wanted to access certain AI services on my laptop but had to wait until I got back home. Being able to run Copilot locally would certainly help with AI and ML tasks on the go.

My second thought builds on Intel's point about reliability in performance. As it currently stands, Copilot is cloud-based, which means users have to wait for the service to process and then generate responses to their prompts. Since this work is done on a distant server, there are often latency issues that can disrupt or hamper the experience. Running the work locally on your computer could ensure smoother (and faster) responses that aren't beholden to the internet.

From my time at the Intel AI Summit, it's become clear that both Intel and Microsoft are focused on creating a good user experience as they plan the future of AI PCs and AI services. Being able to run ML and AI tasks locally on our devices could make using a laptop far more convenient than it already is, and that's an aspect of the future I'm excited about.