
VisionOS 3 AI Features: 7 Spatial Intelligence Upgrades Expected at WWDC 2025
June 5, 2025
Four days. That’s all that stands between us and what could be the most significant software update Apple Vision Pro has ever received. WWDC 2025 kicks off on June 9, and if the leaks are even half right, visionOS 3 AI features are about to redefine what spatial computing actually means. After months of rumors from Bloomberg’s Mark Gurman, patent filings, and insider reports, the picture is finally coming together—and it’s far more ambitious than most people expected.

Let me be clear: this isn’t another incremental update. Apple Intelligence arrived on Vision Pro just two months ago with visionOS 2.4, bringing writing tools and notification summaries. That was the appetizer. What’s coming at WWDC 2025 looks like the full course—spatial intelligence that understands your environment, object recognition that knows what it’s looking at, and an AI layer that could make the $3,499 headset finally feel like the future Apple promised.
The Naming Shift: Why visionOS 26 Matters More Than You Think
Before we dive into the visionOS 3 AI features themselves, there’s a significant branding change to address. Multiple sources, including MacRumors, report that Apple is rebranding from sequential numbering to a year-based convention. That means visionOS 3 will likely debut as “visionOS 26” at WWDC 2025, aligning with iOS 26, macOS 26, and the rest of the ecosystem.
This isn’t just cosmetic. The naming unification signals Apple’s intent to bring spatial computing into the same update cadence and feature parity as its other platforms. When every OS shares the same number, it’s easier to build cross-platform AI features—and that’s exactly what the rumors suggest is happening with Apple Intelligence on Vision Pro.
visionOS 3 AI Features: Spatial Intelligence and Object Recognition
The most exciting visionOS 3 AI features center around spatial intelligence—the headset’s ability to understand, interpret, and interact with the physical world around you. According to 9to5Mac’s reporting on Bloomberg’s Gurman leak, Apple has been planning a “feature-packed” update that prioritizes software over hardware, and AI-powered spatial awareness is at the core of that strategy.
Here’s what we’re expecting based on accumulated leaks and reports:
1. Real-Time Object Recognition
Vision Pro already maps your room using LiDAR and camera arrays, but current capabilities are mostly limited to surface detection for placing virtual objects. The expected upgrade would let visionOS 3 actually identify what objects are—your coffee table, your keyboard, your bookshelf—and enable contextual interactions. Imagine pointing at a product and getting instant information, or having your workspace automatically adapt based on what’s on your desk.
2. Apple Intelligence Deep Integration
visionOS 2.4 laid the groundwork in March 2025. Apple’s own newsroom confirmed that writing tools and notification summaries were just the beginning. For visionOS 3, the expectation is full Apple Intelligence integration: text summarization across spatial apps, smart suggestions based on your spatial context, and AI-powered content generation directly within the headset’s interface. BGR reports that these writing and summarization capabilities will expand significantly, leveraging the M2 chip’s Neural Engine for on-device processing.
3. Enhanced Siri with Spatial Context
A more conversational Siri powered by large language models has been in development across all Apple platforms. However, AppleMagazine reports that the fully advanced LLM-powered Siri may not arrive until spring 2026. What we might see at WWDC 2025 is an intermediate step: Siri that understands spatial commands better, can reference objects in your environment, and integrates with Apple Intelligence-powered Shortcuts creation on Vision Pro.
4. Brain-Computer Interface Accessibility
This one sounds straight out of science fiction, but MacRumors reports that visionOS 3 may include brain-computer interface support for Switch Control accessibility features. This would allow users with physical disabilities to control Vision Pro using neural signals—a remarkable application of Apple’s machine learning capabilities for accessibility.
Hardware Interaction Upgrades: Controllers and Eye Tracking
While the visionOS 3 AI features grab the headlines, the hardware interaction improvements are equally important for the platform’s future.
5. Sony PSVR2 Controller Support
One of the most surprising leaks: Apple is reportedly working with Sony to bring PSVR2 Sense controller compatibility to Vision Pro. As reported by Cult of Mac, this VR hand controller support would dramatically expand gaming and productivity possibilities on the platform. For developers who’ve been limited to hand tracking and eye gaze, physical controllers could unlock entirely new categories of spatial applications.
6. 3x Faster Hand Tracking
According to community reports compiled by AppleInsider, hand tracking performance is rumored to improve by up to three times in visionOS 3. Combined with the AI-powered object recognition features, faster hand tracking would make interactions feel significantly more natural and responsive—addressing one of the most common complaints from current Vision Pro users.
7. Advanced Eye Tracking Features
Eye tracking is already Vision Pro’s primary input method, but visionOS 3 is expected to refine it significantly. The AI layer could enable predictive gaze targeting—the system anticipating where you’re about to look based on learned patterns—and more nuanced interaction models where different gaze durations trigger different actions. This kind of spatial intelligence makes the headset feel less like a tool and more like an extension of your intent.
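To make the dwell-duration idea concrete, here is a minimal sketch of how gaze-duration-based interaction could be modeled. This is purely illustrative plain Swift: the type names, thresholds, and action tiers are my assumptions, not any Apple API.

```swift
import Foundation

// Hypothetical dwell-time classifier: the longer the user holds their gaze
// on a target, the "deeper" the action that fires. Thresholds are invented
// for illustration.
enum GazeAction: Equatable {
    case none          // glance too brief to act on
    case highlight     // short dwell: visual feedback only
    case select        // medium dwell: activate the target
    case contextMenu   // long dwell: surface secondary options
}

struct GazeDwellClassifier {
    let highlightThreshold: TimeInterval = 0.15
    let selectThreshold: TimeInterval = 0.40
    let menuThreshold: TimeInterval = 1.00

    // Map how long the gaze has rested on a target to an action tier.
    func action(forDwell dwell: TimeInterval) -> GazeAction {
        if dwell < highlightThreshold { return .none }
        if dwell < selectThreshold { return .highlight }
        if dwell < menuThreshold { return .select }
        return .contextMenu
    }
}

let classifier = GazeDwellClassifier()
print(classifier.action(forDwell: 0.5))  // a half-second dwell selects
```

A real implementation would also smooth noisy gaze samples and reset the timer when the gaze leaves the target, but the tiered-threshold core is the same.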
Software Focus Over Hardware: Apple’s Strategic Pivot
Perhaps the most telling detail from all the pre-WWDC 2025 leaks is what’s not coming. Multiple sources confirm no new Vision Pro hardware in 2025. Road to VR reports that while a second-generation Vision Pro with a newer M-series chip is in the development pipeline, and cheaper variants are being explored, Apple’s 2025 XR strategy is entirely software-driven.
This is actually the smart play. The current Vision Pro hardware is powerful enough—the M2 chip and R1 co-processor can handle significantly more than visionOS 2 asks of them. By pouring resources into visionOS 3 AI features, Apple can make every existing Vision Pro owner feel like they got a free upgrade, while giving developers a richer platform to build on before the next hardware generation arrives.
Gurman’s description of a “feature-packed” update reinforces this approach. When Apple’s most reliable leaker uses that language, it typically means the company has been stockpiling features specifically to make a software release feel like a new product launch.
Developer Impact: What visionOS 3 AI Features Mean for the Ecosystem
For developers, the expanded AI capabilities in visionOS 3 could be transformative. Object recognition APIs would enable entirely new application categories—from interior design apps that understand your furniture to educational tools that can identify and explain real-world objects in real time. Spatial intelligence features could allow apps to adapt their interfaces based on the user’s physical environment, creating truly context-aware experiences.
The Apple Intelligence integration also means developers could tap into on-device language models for spatial applications—imagine a cooking app that reads your recipe, tracks your ingredients on the counter via object recognition, and provides step-by-step spatial guidance. These aren’t fantasies; they’re the logical extension of the visionOS 3 AI features being rumored.
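The context-aware adaptation described above boils down to a simple mapping: recognized object labels in, interface mode out. The following plain-Swift sketch shows that shape; every label, mode, and rule here is a hypothetical example, since Apple has published no such API.

```swift
import Foundation

// Hypothetical context-aware mode selection: the set of objects recognized
// in the user's surroundings decides which interface a spatial app presents.
enum WorkspaceMode: Equatable {
    case cooking, desk, reading, neutral
}

struct SpatialContext {
    // Decide an app mode from recognized object labels. Rules are checked
    // in priority order; the first match wins.
    static func mode(forObjects labels: Set<String>) -> WorkspaceMode {
        if !labels.isDisjoint(with: ["stove", "cutting board", "mixing bowl"]) {
            return .cooking
        }
        if labels.contains("keyboard") && labels.contains("monitor") {
            return .desk
        }
        if labels.contains("bookshelf") || labels.contains("book") {
            return .reading
        }
        return .neutral
    }
}

print(SpatialContext.mode(forObjects: ["keyboard", "monitor", "mug"]))
```

In practice the labels would come from whatever object recognition API visionOS exposes, and the rules would likely be learned rather than hand-written, but the app-side logic stays this small.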
Enhanced virtual environments and refined UI scaling, as highlighted in the AppleInsider community wishlist, would also give developers more creative flexibility. First-party app improvements typically signal expanded frameworks that third-party developers can leverage.

AI Battery Management: The Practical Side
Not every visionOS 3 AI feature is flashy. MacRumors reports an AI battery management feature coming to Vision Pro—something that could make a real difference in daily usage. The current external battery pack provides roughly two hours of use, and intelligent power management that learns your usage patterns could squeeze meaningful extra minutes from each charge. It’s the kind of practical AI application that doesn’t make headlines but makes the product significantly more usable.
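"Learns your usage patterns" likely means something statistically modest. As a sketch of the idea, the plain-Swift example below keeps an exponential moving average of past session lengths and suggests a power-saving mode when the remaining charge would not cover a typical session. All names and numbers are illustrative assumptions, not Apple's implementation.

```swift
import Foundation

// Hypothetical usage-pattern power management: an exponential moving
// average of past session lengths predicts the next session, and power
// saving kicks in when projected runtime falls short of that prediction.
struct SessionPredictor {
    private(set) var expectedMinutes: Double
    let smoothing: Double  // weight given to the newest session

    init(initialEstimate: Double = 60, smoothing: Double = 0.3) {
        expectedMinutes = initialEstimate
        self.smoothing = smoothing
    }

    // Fold a finished session's length into the running estimate.
    mutating func record(sessionMinutes: Double) {
        expectedMinutes = smoothing * sessionMinutes + (1 - smoothing) * expectedMinutes
    }

    // `drainPerMinute` is percent of battery consumed per minute.
    func shouldEnterPowerSaving(batteryPercent: Double, drainPerMinute: Double) -> Bool {
        let projectedMinutes = batteryPercent / drainPerMinute
        return projectedMinutes < expectedMinutes
    }
}

var predictor = SessionPredictor()
predictor.record(sessionMinutes: 120)  // user tends toward long sessions
print(predictor.shouldEnterPowerSaving(batteryPercent: 40, drainPerMinute: 0.83))
```

Even this naive version captures the payoff: a device that knows you usually watch two-hour films can dim earlier and more aggressively than one tuned for ten-minute check-ins.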
The Bigger Picture: Spatial Computing’s AI Moment
Stepping back from individual features, what makes these visionOS 3 AI features significant is the convergence they represent. Apple Intelligence has been rolling out across iPhone, iPad, and Mac throughout late 2024 and early 2025. Vision Pro was notably behind—until now. WWDC 2025 appears to be the moment Apple brings its spatial computing platform into full AI parity with the rest of the ecosystem.
Object recognition plus spatial intelligence plus Apple Intelligence equals something no other headset platform currently offers: an AI that understands both your digital and physical worlds simultaneously. Meta’s Quest devices have their own AI features, but Apple’s advantage lies in the tight hardware-software integration and the Neural Engine’s on-device processing capabilities.
The public release is expected in September 2025, giving developers the summer to prepare. But the real excitement starts Monday, June 9, when we’ll finally see which of these rumors were on target and which features Apple has been keeping secret entirely.
Whether you’re a developer building for spatial computing, a Vision Pro owner waiting for the next big software leap, or simply watching where AI meets hardware, WWDC 2025 is shaping up to be a pivotal moment. The visionOS 3 AI features we’ve outlined here represent just what’s leaked—Apple’s track record suggests the actual announcement will include at least a few surprises nobody predicted. Four days to go.
Spatial computing, AI integration, emerging hardware—the tech landscape is moving fast. If you need consulting on how these technologies can transform your workflow or business, Sean Kim can help.



