Edge AI isn’t just a buzzword anymore—it’s turning into real licensing revenue.
Ceva, Inc. says 2025 marked a breakout year for its artificial intelligence business, signing 10 NeuPro neural processing unit (NPU) agreements and generating more than 20% of its annual licensing revenue from AI-related deals. For a company long known for connectivity and DSP IP at the smart edge, that’s more than incremental growth—it’s a revenue mix shift.
The takeaway: AI at the edge is moving from experimentation to deployment, and silicon vendors are standardizing on dedicated NPUs to make it happen.
AI Becomes a Material Revenue Driver
Ceva licenses silicon and software IP used in everything from Bluetooth chips to automotive platforms. But in 2025, AI emerged as a major growth engine.
According to the company, AI-related licenses accounted for over one-fifth of total licensing revenue—an inflection point that signals its NeuPro NPU line is gaining traction across multiple verticals.
That traction spans the full range of the portfolio, from ultra-low-power NPUs for microcontroller-class inference up to high-performance NPUs aimed at PCs, automotive systems, and other compute-intensive platforms.
As AI capabilities become embedded directly into operating systems and applications, dedicated NPUs are increasingly viewed as essential silicon blocks rather than optional accelerators.
A Major PC OEM Design Win
One of Ceva’s most significant wins this year was a strategic NPU licensing agreement with a leading global PC OEM. The customer selected Ceva’s high-performance NeuPro NPUs as foundational IP for its next-generation on-device AI compute architecture.
While the OEM wasn’t named, the implications are notable. The PC market is in the middle of an architectural shift toward AI-native designs, driven by Microsoft’s Copilot+ PC push and chipmakers like Intel, AMD, and Qualcomm embedding NPUs into client processors.
By securing a role in a next-gen AI PC platform, Ceva validates its scalable NPU strategy and positions itself within a segment expected to drive sustained demand for local inference—particularly as privacy, latency, and cost concerns push more AI workloads off the cloud and onto devices.
Microchip and the Move to Standardized NPUs
Earlier in the year, Ceva signed a comprehensive NeuPro NPU portfolio license with Microchip Technology, a major supplier of microcontrollers and connectivity solutions.
Under the agreement, Microchip will embed Ceva’s NPUs broadly across its product families. That’s significant because it reflects a broader semiconductor trend: AI acceleration is becoming a standard feature, not a premium add-on.
For microcontroller vendors, adding NPUs enables on-device vision, audio, and sensor processing without dramatically increasing power budgets. For Ceva, it expands its footprint into high-volume embedded markets where long product cycles can translate into durable royalty streams.
Additional NPU wins in 2025 included agreements with ALi Corp and Nextchip, extending Ceva’s AI reach into consumer electronics, video platforms, and automotive advanced driver assistance systems (ADAS).
From Evaluation to Deployment
Of its 10 NPU agreements, Ceva expects six customers to have silicon back by the end of 2026. That matters because IP licensing revenue is only part of the story; the larger payoff typically comes from royalties once chips ship in volume.
The company also noted that several NPU deals came from existing high-volume connectivity customers expanding into AI. That increases content per device and strengthens Ceva’s “licensing-to-royalty flywheel,” where each additional IP block embedded in a design compounds long-term revenue potential.
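To see why that flywheel matters, consider a rough back-of-the-envelope sketch. Every figure below is an invented assumption for illustration only (Ceva does not disclose per-unit royalty rates), but it shows how an additional IP block multiplies royalty revenue on the same shipment volume:

```python
# Illustrative arithmetic only: all figures are invented assumptions,
# not Ceva disclosures, chosen to show how added IP content per device
# compounds royalty revenue on the same unit volume.

units_per_year = 100_000_000   # assumed shipments for a high-volume design
royalty_connectivity = 0.05    # assumed royalty ($/unit) for a connectivity block
royalty_npu = 0.10             # assumed royalty ($/unit) for an added NPU block

before = units_per_year * royalty_connectivity
after = units_per_year * (royalty_connectivity + royalty_npu)

print(f"royalties, connectivity only: ${before:>12,.0f}/yr")  # $5,000,000/yr
print(f"royalties, connectivity+NPU:  ${after:>12,.0f}/yr")   # $15,000,000/yr
```

Same chip, same customer, triple the royalty stream: that is the compounding the flywheel refers to.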
In practical terms, the industry appears to be moving beyond AI proof-of-concept phases. Instead of testing neural workloads on general-purpose DSPs or CPUs, chipmakers are locking in dedicated NPUs as standard architecture components.
AI DSPs Complement the NPU Push
While NPUs dominated headlines, Ceva also signed multiple AI-focused digital signal processor (DSP) licensing agreements across automotive and consumer markets.
AI DSPs often work alongside NPUs, handling signal pre-processing from cameras, microphones, and other sensors before inference occurs. In vision-heavy or sensor-rich applications—such as ADAS or smart home devices—that division of labor improves performance and power efficiency.
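To make that division of labor concrete, the short Python sketch below mimics the pattern in software. Everything here is a hypothetical placeholder rather than a Ceva API: real SoCs expose vendor-specific drivers for the DSP and NPU blocks, and NumPy stands in for both.

```python
# Minimal sketch of the DSP-preprocess / NPU-infer handoff described above.
# All names here are hypothetical placeholders, not Ceva APIs; NumPy stands
# in for the hardware blocks so the pipeline shape is visible end to end.

import numpy as np

def dsp_preprocess(raw_frame: np.ndarray) -> np.ndarray:
    """Work typically offloaded to an AI DSP: normalize and
    standardize raw sensor data before it reaches the NPU."""
    frame = raw_frame.astype(np.float32) / 255.0          # scale pixels to [0, 1]
    return (frame - frame.mean()) / (frame.std() + 1e-6)  # zero-mean, unit-variance

def npu_infer(tensor: np.ndarray) -> np.ndarray:
    """Stand-in for dispatching a prepared tensor to the NPU;
    a random-weight linear classifier simulates the model."""
    rng = np.random.default_rng(seed=0)
    weights = rng.standard_normal((tensor.size, 10))      # fake 10-class head
    return tensor.reshape(1, -1) @ weights                # logits, shape (1, 10)

# Pipeline: sensor -> DSP pre-processing -> NPU inference
raw = np.random.randint(0, 256, size=(32, 32), dtype=np.uint8)  # fake camera frame
logits = npu_infer(dsp_preprocess(raw))
print("predicted class:", int(logits.argmax()))
```

The point of the split is that the cheap, regular math (normalization, filtering) runs on the power-efficient DSP, so the NPU only wakes up for the matrix-heavy inference step.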
In many deployments, Ceva’s AI IP is paired with its connectivity portfolio, allowing customers to combine wireless communication, sensing, and inference in a single SoC. That convergence aligns with the company’s broader “Physical AI” thesis: intelligent systems where connectivity, sensing, and decision-making happen directly inside edge devices.
Riding the Edge AI Wave
The broader industry backdrop supports Ceva’s momentum.
As AI expands beyond hyperscale data centers, edge devices—from PCs and smartphones to vehicles and industrial sensors—are becoming inference engines in their own right. Power efficiency, thermal constraints, and real-time requirements are driving demand for specialized architectures rather than cloud-dependent models.
At the same time, regulatory and privacy concerns are accelerating interest in on-device processing. Keeping data local reduces latency and avoids transmitting sensitive information to centralized servers.
Ceva’s diversified customer base—spanning automotive, consumer, industrial, PC, and infrastructure markets—gives it exposure across multiple edge AI growth vectors rather than tying its fate to a single segment.
What to Watch in 2026
The real test will come as licensed designs move into production.
With six customers expected to reach silicon by the end of 2026, Ceva is entering the phase where AI-related royalty ramps could materially impact financial results. If edge AI adoption continues at its current pace, AI’s share of revenue could climb further.
Competition in the NPU IP market is intensifying, with players like Arm, Synopsys, and various in-house silicon teams vying for design wins. But Ceva’s 2025 results suggest that demand for proven, licensable NPU architectures is strong—particularly among companies that prefer not to build AI accelerators from scratch.
For now, the numbers tell the story: AI is no longer a side bet for Ceva. It’s a core revenue driver—and a signal that edge AI silicon is shifting from roadmap slide to shipping product.