Apple turns its attention to AirPods with cameras after the iPhone 16e and iPad Air

Apple has been on a relentless product launch spree over the past few weeks, unveiling everything from new Macs and iPads to the much-anticipated iPhone 16e. With the dust settling on its latest hardware releases, the company is now turning its attention toward future innovations, including an ambitious project: AirPods with integrated cameras. According to Bloomberg’s Mark Gurman, Apple is “actively developing” a version of AirPods that will feature built-in cameras, a groundbreaking addition that could redefine how users interact with their surroundings. However, this technology is not expected to debut with the upcoming AirPods Pro 3, set to launch later this year. Instead, it is likely being prepared for a future iteration, possibly the AirPods Pro 4, which could arrive around 2027.

Apple’s vision for AirPods with cameras appears to align with its broader strategy of integrating artificial intelligence (AI) into everyday devices. Gurman suggests that these AirPods will leverage their cameras and AI-powered software to analyze the user’s environment in real time, effectively bringing some of the capabilities of smart glasses to a more compact and discreet form factor. This development could help Apple strengthen its position in the AI race, especially as competitors like Meta and Google continue to push the boundaries of wearable technology.

With the launch of the iPhone 16 lineup, Apple introduced "Camera Control," a new physical button that allows users to quickly capture photos and adjust settings. Alongside this, Apple unveiled "Visual Intelligence," a feature designed to enhance the way users interact with their devices through AI-powered contextual awareness. The new AirPods with cameras are expected to build on these innovations, potentially offering features such as object recognition, augmented reality overlays, and enhanced spatial audio experiences based on real-world inputs.

While Apple is exploring the potential of AirPods with cameras, the company is also reportedly working on smart glasses akin to Meta’s Ray-Ban Stories. This could be a way to leverage the billions of dollars invested in the Vision Pro’s visual intelligence technology, which is designed to scan and interpret the surrounding environment. Such an expansion would allow Apple to diversify its wearables lineup, offering both audio-focused AI wearables (AirPods) and visual AR-enhanced products (smart glasses).

Apple Intelligence Delays and iOS 19’s AI Catch-Up

While Apple is pushing forward with long-term AI-focused projects, its immediate AI roadmap appears to be lagging. At WWDC 2024, Apple introduced "Apple Intelligence," its suite of AI-powered features, but the rollout has been slower than expected. The most anticipated upgrade, an advanced, conversational Siri powered by large language models (LLMs), has reportedly been delayed until at least iOS 20.

Gurman’s latest report reveals that Apple’s upcoming iOS 19 update will primarily focus on expanding existing Apple Intelligence features rather than introducing entirely new ones. This suggests that Apple is using iOS 19 as a "catch-up" year, refining the AI tools it introduced in iOS 18 while ensuring that they work seamlessly across its ecosystem. However, no groundbreaking AI features are expected at WWDC 2025.

"The bad news is that Apple is unlikely to unveil groundbreaking new AI features at this coming WWDC. Instead, it will likely lay out plans for bringing current capabilities to more apps," Gurman states.

The launch of the LLM-powered Siri backend, originally slated for iOS 18.4, has also been postponed. Some of these delayed AI enhancements are now expected to arrive around iOS 19.4, but the full conversational Siri experience may not materialize until 2026.

Apple’s slow progress in AI development stands in contrast to rivals like OpenAI, Google, and Microsoft, which have been aggressively rolling out next-gen AI models with real-time reasoning, multimodal capabilities, and deep integration into their respective ecosystems. While Apple remains focused on privacy and on-device AI processing, it is clear that the company is facing increasing pressure to accelerate its AI roadmap.

What’s Next?

In the short term, Apple is prioritizing incremental AI improvements, refining the current Apple Intelligence framework, and ensuring a seamless user experience across devices. In the long term, its push toward AI-enhanced wearables—such as AirPods with cameras and potential smart glasses—could position the company as a leader in AI-driven consumer technology. However, with major AI advancements still years away, Apple’s ability to keep pace with its competitors will depend on how effectively it can balance innovation with execution.

With iOS 19 expected to focus more on refinement than innovation, all eyes will be on WWDC 2025 to see whether Apple can finally deliver the AI revolution it has been promising.
