AirPods got only a passing mention during the keynote at Apple’s event. That’s understandable: the iPhone 15 and Apple Watch Series 9 (and Ultra 2) were center stage. Besides, the earbuds didn’t receive the same kind of hardware updates. As a press release issued after the event confirmed, the biggest physical change to the AirPods Pro 2 is the (admittedly long-awaited) arrival of a USB-C charging case.
You would be forgiven for thinking the AirPods news ended there. However, Apple’s high-end earbuds also received a meaningful software update in the form of new listening modes, accessible with a few taps in iOS 17 on both versions of the AirPods Pro 2 (USB-C and Lightning).
With the earbuds connected, swipe down to pull up Control Center, then long-press the volume slider. Three mode selections will pop up below it: Noise Cancellation, Conversational Awareness and Spatial Audio. It’s the first two that are getting the love this year.
Adaptive Audio has been added to the options, alongside standard Noise Cancellation, Transparency and Off. Tap the new option and it gets highlighted with a rainbow backdrop. The feature seamlessly flits between the different settings in real time, a bid to bring both ends of the spectrum into a single setting, so you can walk down a crowded street with situational awareness while not getting the full noise impact of the trash truck as it drives by.
Although its name echoes last year’s Adaptive Transparency feature, Adaptive Audio offers a full spectrum of modes, with both transparency and noise cancellation playing a role.
“Adaptive Transparency, which we announced last year, that has to happen really quickly,” Product Marketing Director Eric Treski said in a conversation with TechCrunch. “That happens at 40,000 times a second. That’s not only the monitoring, that’s the reduction as well. In order to bring that down quickly, it needs to be happening in real time. Adaptive Audio is a little bit slower, over the course of a few seconds, because it’s meant to be a much more methodical process to know what you’re listening to. We’re going from Adaptive Audio into transparency, so — in order to make it less jarring and more comfortable — it’s much more purposely slower for that reason.”
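For the technically curious, here’s one way to picture that two-speed design in code. This Swift sketch is purely illustrative; the names and thresholds are invented, and it isn’t Apple’s implementation. The fast path clamps spikes on every sample, while the slow path eases between modes over seconds.

```swift
import Foundation

// Conceptual sketch: two control loops running at very different rates.
final class HypotheticalAudioPipeline {
    private(set) var blend: Float = 0.5        // 0 = full transparency, 1 = full ANC
    private let limiterThreshold: Float = 0.8

    // Fast path, called tens of thousands of times per second:
    // clamp sudden spikes (a siren, a passing truck) immediately.
    func processSample(_ sample: Float) -> Float {
        min(max(sample, -limiterThreshold), limiterThreshold)
    }

    // Slow path, called every few seconds: ease toward the target mode
    // rather than jumping, so transitions feel deliberate, not jarring.
    func reclassify(ambientNoise: Float) {
        let target: Float = ambientNoise > 0.6 ? 1.0 : 0.2
        blend += (target - blend) * 0.3
    }
}
```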
The system also factors in whether the content you’re listening to is music or a podcast, determined based on tagging from apps like Apple Music. A microphone also measures the volume inside your ear to get a true sense of the levels you’re experiencing. “Because if you only measure the loudness that you think you’re playing into someone’s ears,” VP of Sensing and Connectivity Ron Huang explains, “depending on how they’re wearing it and other factors, it may be less accurate.”
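As a rough illustration of the distinction Huang is drawing, here’s a hedged Swift sketch (hypothetical names throughout, not Apple’s code) that derives loudness from what an in-ear microphone actually captures, rather than trusting the nominal volume setting.

```swift
import Foundation

// Sketch: estimate the level actually reaching the ear from in-ear mic
// samples (RMS converted to dB), instead of assuming the nominal playback
// volume arrives at the eardrum intact.
func measuredLevelDB(inEarSamples: [Float]) -> Double {
    guard !inEarSamples.isEmpty else { return -.infinity }
    let meanSquare = inEarSamples.reduce(0) { $0 + $1 * $1 } / Float(inEarSamples.count)
    return 10 * log10(Double(meanSquare))   // dB relative to full scale
}

// Fit and seal vary per wearer, so the level at the ear can differ
// meaningfully from what the volume setting alone would suggest.
let nominalDB = -12.0
let atEar = measuredLevelDB(inEarSamples: [0.05, -0.04, 0.06, -0.05])
let fitLossDB = nominalDB - atEar   // positive means quieter at the ear than intended
```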
Huang tells TechCrunch that the company considered leveraging your device’s GPS to determine sound levels based on location. In real-world testing, however, the method proved unreliable.
“During early exploration for Adaptive Audio, we basically put you in ANC versus transparency, based on where you are,” says Huang. “You can imagine the phone can give a hint to the AirPods and say, ‘hey, you’re in the house,’ and so forth. That is a way to do that, but after all our learnings, we don’t think that is the right way to do it, and that is not what we did. Of course, your house is not always quiet and the streets are not always loud. We decided that, instead of relying on a location hint from the phone, the AirPods monitor your environment in real time and make those decisions intelligently on their own.”
Personalized Volume is also a big part of the Adaptive Audio experience. The system combines a pool of user data with personal preferences to build a fuller picture of listening habits, paired with “machine learning to understand environmental conditions and listening preferences over time to automatically fine-tune the media experience,” according to Apple. Several different signals feed into it.
“We took tens of thousands of hours of different data — different users listening to different content and with different background noise — to really understand different listening preferences, and what are distractors and aggressors from a noise standpoint to keep your content really clear,” Huang adds. “We also remember your personal preferences. Given a type of environment, the amount of noise there, how loud you typically listen to your content, and remember it for you. We add it to our machine learning model to make it work even better for you.”
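Here’s a toy Swift version of that “remember your preference per environment” idea, assuming a simple moving average keyed by ambient-noise level; Apple’s actual model is, by its own description, a far richer machine-learning system, and every name below is hypothetical.

```swift
import Foundation

// Sketch: remember how loud a listener plays content in a given kind of
// environment, then suggest that level when similar conditions recur.

enum NoiseBucket: Hashable {
    case quiet, moderate, loud

    init(ambientDB: Double) {
        switch ambientDB {
        case ..<45.0: self = .quiet
        case ..<70.0: self = .moderate
        default:      self = .loud
        }
    }
}

struct VolumePreferences {
    private var preferred: [NoiseBucket: Double] = [:]

    // Fold each observed adjustment into the stored preference, so recent
    // habits gradually outweigh older ones.
    mutating func record(ambientDB: Double, chosenVolume: Double) {
        let bucket = NoiseBucket(ambientDB: ambientDB)
        let old = preferred[bucket] ?? chosenVolume
        preferred[bucket] = old * 0.8 + chosenVolume * 0.2
    }

    func suggestion(ambientDB: Double) -> Double? {
        preferred[NoiseBucket(ambientDB: ambientDB)]
    }
}
```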
The other big mode introduced through iOS 17 is Conversational Awareness, which turns down the track’s volume when you begin speaking. External voices won’t trigger the effect, though; only the wearer’s will. Apple accomplishes this without keeping on-board voice profiles, instead leveraging a number of on-board sensors: when the mics hear a voice and the accelerometer detects jaw movement, the feature kicks in. How long it lasts depends on a variety of factors. I was impressed with the feature’s ability to avoid being triggered by things like a throat clear or a yawn.
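That trigger logic lends itself to a tiny sketch. This Swift snippet (hypothetical names, not Apple’s code) shows the two-signal gate described above: both speech at the mics and jaw motion at the accelerometer have to agree before playback ducks.

```swift
import Foundation

// Sketch: duck playback only when the mics hear speech AND the
// accelerometer detects jaw motion, so other people's voices, throat
// clears and yawns don't trip the feature.
struct ConversationDetector {
    var micHearsSpeech = false   // e.g., from an on-device voice activity detector
    var jawIsMoving = false      // e.g., from the accelerometer

    var shouldDuckPlayback: Bool {
        micHearsSpeech && jawIsMoving   // both must agree: it's the wearer talking
    }
}

var detector = ConversationDetector()
detector.micHearsSpeech = true       // someone nearby is talking...
print(detector.shouldDuckPlayback)   // false: no jaw motion, so not the wearer
detector.jawIsMoving = true          // now the wearer is speaking
print(detector.shouldDuckPlayback)   // true: lower the volume
```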
The team also took a stab at another longstanding earbud bugbear: switching. That five-second gap between picking up a call and hearing it in your earbuds feels like forever. Taking advantage of the new switching speeds requires being locked into the Apple ecosystem.
“Connection times for our AirPods to our devices are way faster with this new software update,” says Huang. “That comes from all of the different ways we are using to discover nearby devices. It’s really key for us to know what the iPhone is doing, what the iPad is doing, what the Mac is doing. A phone call is more important than music, so when you’re answering a phone call, we make sure we take the route away from the iPhone and connect with your Mac for the conference call, for example.”
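One way to think about that routing decision is as a simple priority contest between nearby devices. The Swift sketch below is a guess at the shape of the logic, not Apple’s proprietary switching stack; all names are invented.

```swift
import Foundation

// Sketch: the earbuds follow whichever nearby device is doing the most
// important thing, and a call outranks music.
enum Activity: Int, Comparable {
    case idle = 0, music, videoCall, phoneCall
    static func < (lhs: Activity, rhs: Activity) -> Bool { lhs.rawValue < rhs.rawValue }
}

struct Device {
    let name: String
    let activity: Activity
}

func chooseRoute(among devices: [Device]) -> Device? {
    devices.max { $0.activity < $1.activity }
}

let route = chooseRoute(among: [
    Device(name: "iPhone", activity: .music),
    Device(name: "Mac",    activity: .videoCall),
])
print(route?.name ?? "none")   // "Mac": the conference call wins over music
```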
The last big piece of the AirPods announcement is Vision Pro connectivity. For the full audio experience, those using Apple’s upcoming spatial computing headset should bring along the new AirPods Pro for ultra-low-latency lossless audio.
“Bluetooth typically runs on 2.4 gigahertz, and that airspace is very, very noisy,” says Huang. “Everybody’s running on 2.4. That’s why Wi-Fi routers, for example, are typically dual-band if not tri-band, because the 5GHz spectrum is that much cleaner. To get to really, really low latency audio, and to get to really high fidelity, lossless audio — it’s all about a very, very clean and real-time channel between the two. The combination of 5GHz and the fact that they are very proximal allowed us to do that. We’re able to basically redesign a brand new audio protocol over the 5GHz for AirPods.”
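To see why a clean, high-throughput channel matters, a quick back-of-the-envelope calculation helps. The sample rate and bit depth below are illustrative assumptions, not Apple’s published spec for Vision Pro audio:

```swift
// Back-of-the-envelope: sustained bitrate of uncompressed stereo PCM.
// 48kHz / 24-bit are illustrative values, not a confirmed Apple spec.
let sampleRate = 48_000.0   // samples per second, per channel
let bitDepth   = 24.0       // bits per sample
let channels   = 2.0

let bitsPerSecond = sampleRate * bitDepth * channels
print(bitsPerSecond / 1_000_000)   // ≈ 2.3 Mbps sustained, before protocol overhead
```

Standard Bluetooth audio codecs compress well below that sustained rate, which is roughly the gap a dedicated 5GHz protocol between headset and earbuds is meant to close.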