In an era when interest in AI PCs is soaring and injecting life into what was a stagnant endpoint device market, the enthusiasm has been met with a lack of clearly defined use cases.
The early use cases touted by hardware and software vendors often revolved around unified communications and collaboration. While that is a great way to demonstrate discrete hardware and its positive impact on audio and video quality, it was solving a problem that didn't need solving, at least for most users. It was cool, but not "drop everything, we have to have this" cool.
Across the industry, we've gone back and forth trying to decide whether a broad, must-have AI PC use case will emerge; I wrote as much in 2023. Or perhaps AI will just quietly infiltrate everything we do, which is where I eventually came around to. These days, I find myself somewhere in the middle. I know it will be useful, but I'd still like to see a truly compelling reason for widespread adoption.
No matter where we look, we're not finding what we thought would be there. When we search for broad, everybody's-got-to-have-it use cases for local AI resources, we just can't find them. Some are emerging, such as security and agentic AI, but when we look at the places where AI is already being used on the endpoint, they're almost invariably relying on a cloud-based service. Those cloud-based services are extremely useful, widely adopted and driving tangible benefits almost everywhere you look. But cloud services don't use local AI, so why do we need local AI?
Over the past 12-18 months, use cases have emerged in support of local AI, but with mixed usefulness and reception. Copilot+ launched with Recall, which was received with a response best described as a cross between "just because you can, doesn't mean you should" and "oh, h— no!" Others have touted the ability to build models using open source large language models (LLMs) and distribute finely tuned smaller models to end users (mostly developers, but there are use cases outside of this, too).
The problem is that training your own model is:
Expensive.
Hard-pressed to keep up with the rapid pace of innovation that cloud-scale LLMs are setting.
Likely to become outdated quickly.
In need of frequent retraining, so the cycle repeats.
So where does that leave us? I'm trying to fight off that "solution in search of a problem" feeling. That sounds harsh, but I used an AI PC for two months in my regular office-worker job, and the only time I tickled the neural processing unit (NPU) meter was when I used Teams.
But not all is lost. In fact, a broad use case is emerging in the form of security, which could very well be the universal use case and justification we've been looking for. It could help anchor AI PC usefulness while other use cases, like agentic AI, evolve alongside AI PC adoption.
Security and agentic AI emerging as AI PC use cases
Before we move on, it's worth defining the AI PC, since I'm often asked, "Isn't my machine with a beefy GPU an AI PC already?" I recently heard someone from Intel define it this way, and I liked it enough to try to paraphrase it here:
An AI PC is one that has dedicated hardware divided up for specific purposes. The CPU is suited to quick and lightweight tasks. The GPU is meant for data-intensive AI operations. And the NPU is an "AI accelerator" for workloads that need to run persistently on the device in a low-power manner.
So, a GPU alone can enable AI PC workloads in the same way that a sledgehammer can drive in a nail. It's just that GPUs are expensive and not needed in all situations. An AI PC, with its NPU, sits somewhere in the middle between a CPU and a GPU. If you're an AI researcher or work in ways that require a ton of AI resources, an AI PC won't move the needle much; you'll still need GPUs. But for the rest of us, NPUs can be helpful, and we're starting to see more ways this can happen.
Security
Consider the audio and video touchups the lowest branch on the AI PC tree; the next level up is endpoint security. In fact, endpoint security that uses local AI is one of the things I'll be looking for at RSA Conference this year.
I was disappointed last year when the AI endpoint security angle could be summed up in a single word: chatbots. This year, I've already seen emerging uses, like ESET's announcement about leveraging Intel NPUs, shifting some workloads to the NPU when appropriate, increasing speed and reducing the impact on system resources. I'm sure they're not alone in that regard, and I hope to learn as much as I can at RSAC.
Agentic AI
Next on the tree of AI is agentic AI, the buzzword of 2025. The thing about agentic AI is that while its eventual usefulness is off the charts, there are so many angles that need to be considered before using it. If the agents are truly independent of end users (meaning fully autonomous agents acting on behalf of the organization itself rather than individual end users), there are security, identity, compliance and trust issues that need to be overcome. This will happen, but it will be slow.
The middle ground for agentic AI could be on the endpoint, where agents work on behalf of the end users themselves to accomplish tasks. An agent might file your expenses, compile TPS reports, build a go-to-market plan based on key inputs and meeting notes, and so on.
It's the latter use case that could benefit from local AI. Yes, there will always be cloud-based (or perhaps organizationally centralized) services that can do this. But offloading some of the more menial tasks to the endpoint would free up cloud resources for more intensive or big-picture work.
Conclusion
While we wait for the killer app that makes AI the Excel of the modern era, AI's "Excel moment," to borrow phrasing from a recent interview in which Satya Nadella compared AI agents to the way the PC, thanks to Excel, changed corporate forecasting workflows, it's good to see useful use cases emerging.
I recently had the opportunity to learn more about how ESET is using AI PCs to improve its endpoint security products, so look for a post about that in the next few days. And after RSAC, I'll hopefully have even more interesting, tangible uses for AI PCs to share.
Gabe Knuth is the senior end-user computing analyst at Enterprise Strategy Group, now part of Omdia.
Enterprise Strategy Group is part of Omdia. Its analysts have business relationships with technology vendors.