iOS 17 is on its way to the masses: developer and public beta programs of the upcoming software update are currently available, and the final release is expected to debut soon after the September 12 event.
But aside from the changes coming to your iPhone later this month, iOS 17 also brings all-new software features to AirPods, and they’re made possible by Apple silicon.
If you’re an Apple user who owns a Mac, you’ve probably heard the term ‘Apple silicon’ before. Apple ditched Intel processors in its Mac lineup back in 2020 in favor of its own in-house systems-on-a-chip (SoCs), which are referred to by the umbrella term Apple silicon.
However, Apple has made custom chips for its products for more than a decade, starting long before it brought Apple silicon to the Mac. Its first-ever SoC, the A4 chip, was designed for the iPhone 4 and the original iPad.
Apple still makes powerful A-series chips for iPhones, but the company also branched out to develop custom silicon for its other products. It has an S-series chip for the Apple Watch and M-series chips for iPads and Macs. What might come as a surprise to some Apple fans is that the company makes its own audio processors for AirPods, and these chips are the reason why some of the earbuds’ best features even exist.
The custom audio chips found within AirPods
The first AirPods, released in 2016, featured a W1 chip that powered many of the earbuds’ features considered groundbreaking at the time, including automatic pairing, truly wireless connectivity, and efficient battery life. Beyond the original AirPods, the W1 chip also made its way into some Beats headphones, like the Beats Solo3.
The second-generation AirPods came with a new processor in tow: the H1 chip. It’s unclear why Apple changed the naming scheme between generations, but the H-series moniker is the one it stuck with. Compared to the prior W1, the H1 chip supported Bluetooth 5.0 instead of Bluetooth 4.2.
In practical use, AirPods with the H1 chip could switch between devices twice as fast as before. Latency and connection speeds for phone calls, streaming, and gaming also improved thanks to the H1 chip.
Plus, if you have AirPods with an H1 chip, you can activate Siri using only your voice. Battery life is also better on the H1 platform, with double the talk time compared to W1.
Fast-forward to the second-generation AirPods Pro that debuted at the end of 2022, and you’ll find the newest Apple silicon audio chip, the H2. The H2 chip brought improvements to Spatial Audio, active noise canceling, and transparency mode. Apple says the processor runs computational audio algorithms tailored to your specific ear shape, provided you use Personalized Spatial Audio.
At the time, we thought that was all the changes brought by the H2 chip, but WWDC 2023 shed some more light on the upgrades.
The H2 chip is making AirPods Pro even better on iOS 17
With iOS 17, the second-generation AirPods Pro can take advantage of new features like Adaptive Audio and Conversation Awareness. Adaptive Audio serves as a middle ground between active noise canceling and transparency mode, automatically blending the two based on environmental conditions.
This ties in with Conversation Awareness, which lowers the media volume and enhances voices when your AirPods detect that you’ve started speaking. There’s also Personalized Volume, which uses machine learning to fine-tune your AirPods’ volume based on your environment and listening preferences over time.
These three features are limited to AirPods Pro 2 and are reliant on the H2 chip.
For people who use AirPods Pro often, the upcoming features set to debut this fall could be game-changing, and they’re a direct result of Apple silicon’s presence in the earbuds.