Apple Considers Using GPS to Control Adaptive Audio Levels in AirPods Pro

With iOS 17 now available, Apple has also released a firmware update for the second-generation AirPods Pro that brings new features, including Adaptive Audio. This feature combines Transparency and Active Noise Cancellation modes and balances between them depending on the external environment. Interestingly, Apple once considered using the iPhone's GPS to control Adaptive Audio levels, but it ultimately didn't happen.

How adaptive audio works in AirPods Pro 2

In an interview with TechCrunch, Apple executives Ron Huang and Eric Treski talked about the new features of AirPods Pro.

In addition to Adaptive Audio, the new firmware also features Personalized Volume, which adjusts media volume based on environmental conditions, and Conversation Awareness, which reduces media volume and amplifies voices in front of you when you start talking.

As for adaptive audio, it may sound similar to adaptive transparency, a feature announced last year for AirPods Pro. With Adaptive Transparency, the headphones constantly monitor external sound to reduce some annoying noise, even when Transparency mode is turned on. On the other hand, Adaptive Audio does much more.

Treski explained that the new mode is slower than Adaptive Transparency, but that's because there's a “much more methodical process” for detecting what you're listening to and intelligently adjusting the spectrum between transparency and noise cancellation.

The system detects whether you're listening to a song or a podcast, and the microphones inside the AirPods measure the sound level in your ears to understand how loud the content actually is for the user. But Apple tried a completely different approach when developing this feature.
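As a rough illustration of the balancing idea only (not Apple's actual algorithm), the blend between transparency and noise cancellation can be pictured as a function of the measured ambient level. The function name and dB thresholds below are invented for the sketch:

```python
def blend_level(ambient_db, quiet_db=40.0, loud_db=80.0):
    """Map a measured ambient loudness (dB) to a mix between full
    transparency (0.0) and full noise cancellation (1.0).

    Hypothetical sketch: thresholds are illustrative, not Apple's values.
    """
    if ambient_db <= quiet_db:
        return 0.0          # quiet surroundings: lean fully toward transparency
    if ambient_db >= loud_db:
        return 1.0          # loud surroundings: lean fully toward noise cancellation
    # Linearly interpolate in between.
    return (ambient_db - quiet_db) / (loud_db - quiet_db)
```

In a quiet room this returns 0.0 (pure transparency), on a loud street 1.0 (pure noise cancellation), and values in between blend the two modes gradually rather than snapping between them.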

The feature almost relied on GPS

Instead of relying on microphones and other sensors, Apple considered using the iPhone's GPS to detect when the user is in a noisy environment and adjust Adaptive Audio levels accordingly. For example, the AirPods could automatically switch to Transparency mode when the user walks down the street.

“In the early stages of exploring Adaptive Audio, we basically put you in ANC mode rather than Transparency mode, depending on where you were,” Huang explained. “You can imagine the phone being able to give a hint to the AirPods and say, ‘Hey, you're home,’ and so on.”

“After all our learnings, we didn't think that was the right way to do it, and we didn't do it that way. Of course, your home is not always quiet, and the streets are not always noisy. We decided that instead of relying on location cues from your phone, the AirPods should monitor your surroundings in real time and make smart decisions on their own,” he added.

Even more interesting information about AirPods

For Personalized Volume, Apple says it analyzed “hours of different data” about how users listen to different content in different environments to understand their preferences. The AirPods also remember the user's preferences based on where they are and the noise level – and all of this happens on-device.
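A toy sketch of how on-device preference memory of this kind could work: remember a smoothed preferred volume per ambient-noise bucket, entirely in local state. The class name, bucketing scheme, and smoothing factor are all assumptions for illustration, not Apple's implementation:

```python
class PersonalizedVolume:
    """Remember the user's preferred media volume per ambient-noise bucket.

    Hypothetical stand-in for on-device preference learning.
    """

    def __init__(self, step=10):
        self.step = step   # width of one noise bucket, in dB
        self.prefs = {}    # bucket index -> smoothed preferred volume (0.0-1.0)

    def _bucket(self, ambient_db):
        return int(ambient_db // self.step)

    def observe(self, ambient_db, user_volume, alpha=0.3):
        """Exponentially smooth toward the volume the user just chose."""
        b = self._bucket(ambient_db)
        prev = self.prefs.get(b, user_volume)
        self.prefs[b] = (1 - alpha) * prev + alpha * user_volume

    def suggest(self, ambient_db, default=0.5):
        """Suggest a volume for this noise level; fall back to a default."""
        return self.prefs.get(self._bucket(ambient_db), default)
```

Keeping the whole table in a small local dictionary mirrors the article's point that the preference data never needs to leave the device.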

As for Conversation Awareness, the feature doesn't just wait until it detects a dominant voice: it also uses accelerometers to detect jaw movement, ensuring that it's the user who is speaking and not someone else nearby.
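The two-signal gate described above can be sketched as a simple AND of both conditions. The function name and acceleration threshold are assumptions for illustration:

```python
def should_duck(voice_detected, jaw_accel_g, threshold_g=0.05):
    """Decide whether to lower media volume for a conversation.

    Hypothetical sketch: volume ducks only when a dominant voice is heard
    AND the accelerometer registers jaw movement above a threshold,
    indicating the wearer (not a bystander) is the one speaking.
    """
    return voice_detected and jaw_accel_g > threshold_g
```

Requiring both signals is what prevents a nearby conversation, which triggers the voice detector but not the wearer's jaw motion, from ducking the volume.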

The full interview contains other details as well, such as Apple executives confirming that the updated, USB-C version of the second-generation AirPods Pro uses a new 5GHz wireless protocol for lossless audio when connected to Apple Vision Pro.
