Microsoft’s Copilot AI could soon run locally on PCs rather than relying on the cloud.
Intel told Tom’s Hardware that the chatbot could run on future AI-enabled PCs equipped with neural processing units (NPUs) capable of more than 40 trillion operations per second (TOPS), a performance threshold that no consumer processor currently available can meet.
Intel said these AI PCs would be able to handle “more elements of Copilot” directly on the machine. Copilot currently handles most tasks in the cloud, which introduces noticeable delays, even for minor requests. Shifting more of that processing onto local hardware is expected to reduce those delays, potentially improving both performance and privacy.