Apple is working on technologies that could turn future AirPods into more than just headphones: devices capable of recording and interpreting certain electrical signals from the brain, directly from the ear. This isn't science fiction arriving tomorrow, but it is a line of research that, in the medium to long term, could change how we understand digital health.
Behind this idea there is not a single project but a combination of scientific studies, advanced artificial intelligence techniques applied to electroencephalography (EEG) signals, and patent applications related to measuring biosignals from the ear canal. Although no commercial product has been announced, the documents the company has published offer fairly clear clues about where things might be headed.
Apple's bid to decipher EEG without human annotations
One of the cornerstones of this potential evolution of AirPods is a study in which Apple researchers present a method for an AI model to learn the temporal structure of brain electrical activity without relying on experts to manually label the data. This approach focuses on leveraging large volumes of raw EEG recordings.
The method, named PARS (PAirwise Relative Shift), falls under the category of self-supervised learning. Instead of telling the system which fragments correspond to each sleep stage or to an epileptic seizure, the model receives randomly extracted pairs of EEG signal segments and must guess the time shift between them.
By forcing the AI to solve this "puzzle" of relative positions, PARS gets the model to gradually capture the global composition of neuronal signals over time, beyond simple local patterns. In this way, useful internal representations are obtained that can then be reused for clinical tasks such as sleep stage classification or seizure detection.
According to the published results, transformers pre-trained with PARS matched or exceeded the performance of other self-supervised learning strategies in three out of four tests with different EEG datasets. This suggests that the model not only "fills in the gaps" in the signal, but also understands long-range relationships between different points of brain activity.
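As a rough illustration, the pretext task can be sketched as sampling two segments from the same recording and asking the model to predict their relative offset. The segment length, shift range, and function name below are assumptions for illustration, not details from the paper:

```python
import numpy as np

def make_pars_pair(eeg, seg_len=256, max_shift=4, rng=None):
    """Sample one training pair for a PARS-style pretext task.

    Returns two equal-length segments from the same EEG recording and the
    relative shift (in segments) between them, which the model must learn
    to predict. Hypothetical sketch; parameters are illustrative.
    """
    rng = rng if rng is not None else np.random.default_rng()
    n_segs = len(eeg) // seg_len
    shift = int(rng.integers(-max_shift, max_shift + 1))   # relative offset
    start = int(rng.integers(max_shift, n_segs - max_shift))
    a = eeg[start * seg_len:(start + 1) * seg_len]
    b = eeg[(start + shift) * seg_len:(start + shift + 1) * seg_len]
    return a, b, shift   # the model is trained to predict `shift` from (a, b)
```

Training on many such pairs forces the model to encode where in time a given pattern of activity sits, which is what lets the learned representations transfer to downstream tasks like sleep staging.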
One of the most interesting aspects of the study is that it includes recordings taken with ear-EEG systems, that is, measurements taken from the ear instead of the scalp. This variant is much more discreet and comfortable, and fits perfectly with the idea of integrating it into devices like AirPods.
From the lab to the ear: why ear-EEG fits with AirPods
Among the datasets used to evaluate PARS is EESM17 (Ear-EEG Sleep Monitoring 2017), which collects nighttime recordings made with a portable multi-channel ear-EEG system, combined with a traditional scalp EEG. This type of hardware demonstrates that it is feasible to obtain relevant brain information from the ear.
Ear EEG uses electrodes placed inside or around the outer ear. Although the signals are somewhat weaker and noisier than those recorded directly on the head, they still capture clinically useful information, such as sleep stages or certain patterns linked to epileptic seizures. The obvious benefit is that the device is less bulky and much more discreet.
In the context of Apple, the company has been adding health sensors to its devices for years: ECG on the Apple Watch, blood oxygen measurement, advanced heart-rate tracking and, more recently, photoplethysmography (PPG) sensors in its headphones. The next step, according to the research and patents, would be to explore monitoring brain activity from the ear.
The idea of a future AirPods model incorporating EEG sensors doesn't seem so far-fetched: combining a widely adopted format, such as wireless headphones, with new medical or wellness capabilities fits well with the digital health strategy Apple has been promoting for years.
Alongside the scientific study on PARS, intellectual property documents have also surfaced that describe, in fairly concrete terms, how these sensors could be implemented in a mass-market consumer device.
The patent that points to AirPods with biosignal sensors
In 2023, Apple filed a patent application for a "Portable electronic device for measuring biosignals" from the user's ear. Although the text does not explicitly mention AirPods, the illustrations and descriptions are quite reminiscent of the format of in-ear headphones.
The patent indicates that brain activity can be measured not only with electrodes on the scalp, but also with electrodes placed inside or around the outer ear. This second option offers clear advantages over traditional clinical systems with their electrode caps and conductive gels: the sensors are less visible, the user is more comfortable, and mobility is much greater.
However, the document itself acknowledges that, to achieve an accurate measurement with an ear-EEG device, it might be necessary to customize the fit to each person's ear. Areas such as the concha, the ear canal, or the tragus vary greatly from one individual to another, which makes it difficult for a single design to guarantee optimal electrode placement in all cases.
To work around this limitation, the patent proposes a system in which the headphones would integrate more sensors than strictly necessary, distributed around the eartips. An AI model would then identify which of these electrodes offer the best signal quality, based on metrics such as impedance, noise level, and skin contact quality.
The device would combine the signals from the different measurement points, assigning a different weight to each electrode and thus generating an optimized, "fused" brainwave signal. The same patent even describes touch or press gestures on the earpiece to start or stop biosignal capture, along with several design alternatives to fit different ears.
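A minimal sketch of this kind of quality-weighted fusion, in Python with hypothetical quality metrics (the actual weighting scheme in Apple's patent is not public):

```python
import numpy as np

def fuse_electrodes(signals, impedance, noise_power):
    """Combine per-electrode EEG channels into one weighted signal.

    `signals` has shape (n_electrodes, n_samples). Electrodes with low
    impedance and low noise receive higher weight. The quality score below
    is a hypothetical illustration, not the formula from the patent.
    """
    quality = 1.0 / (impedance * noise_power)   # crude per-electrode quality score
    weights = quality / quality.sum()           # normalize weights to sum to 1
    return weights @ signals                    # weighted average across electrodes
```

In practice, the weights would likely be recomputed continuously as fit and skin contact change, rather than fixed once at startup.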
How the system might work in future AirPods
Connecting the pieces of the puzzle—the PARS method, the ear-EEG, and the patent—suggests a likely operating scenario for headphones with brain signal reading capabilities. The AirPods would integrate bioelectric electrodes and sensors in the inner casing, in the areas that make contact with the ear canal and the outer ear.
These sensors would record small variations in electrical potential associated with neuronal activity and other nearby biosignals, such as muscle activity or blood volume pulse. An integrated chip in each earbud would preprocess the signal, segment it into windows, and apply basic filters to reduce some of the noise.
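As a toy illustration of this kind of on-device preprocessing (the actual filters, sampling rates, and window sizes Apple would use are unknown), a sketch in Python:

```python
import numpy as np

def preprocess_eeg(raw, smooth_win=5):
    """Toy preprocessing sketch for a raw in-ear EEG trace.

    Removes the DC offset, then smooths high-frequency noise with a short
    moving average. A real device would use proper band-pass filtering;
    the window length here is an arbitrary assumption.
    """
    centered = raw - raw.mean()                  # drop the DC offset
    kernel = np.ones(smooth_win) / smooth_win    # moving-average kernel
    return np.convolve(centered, kernel, mode="same")
```

The heavier lifting, such as running a PARS-style model, would then happen on the paired device rather than on the earbud itself.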
The data would then be wirelessly transmitted to the iPhone, iPad, or Apple Watch, where a more complex AI model—possibly based on approaches similar to PARS—would analyze the information. Thanks to self-supervised pre-training, the system could recognize sleep patterns or detect anomalies without the need for a neurologist to label each record beforehand.
Among the potential uses mentioned in the documents are sleep stage monitoring, detection of seizures or other neurological irregularities, and tracking other health indicators related to the nervous system and circulation. Applications related to concentration, alertness, or fatigue prevention are also not ruled out.
In day-to-day use, the user wouldn't have to do much more than put on their headphones as usual. The system could run in the background, activated manually by a gesture or on a scheduled basis—for example, overnight—to record activity and then present summary reports on the quality of rest or possible incidents.
Advantages and limitations of measuring the brain from the ear
The choice of the ear as the measurement point is not accidental. On a practical level, an ear-EEG system has a much lower visual profile than traditional EEG headsets, something that in Europe could facilitate everyday use both at home and in public spaces without attracting attention.
Furthermore, headphones are an accessory that many people already use daily, which opens the door to frequent and longitudinal measurements without needing a doctor's appointment for each recording. This could be especially useful for those undergoing neurological treatments or who need to monitor their sleep regularly.
However, it's not all advantages. The weaker, noisier signals collected in the ear demand far more sophisticated processing strategies. Everyday activities such as talking, chewing, or exercising can introduce artifacts that distort the reading.
The fit of the earphone itself also plays a role: a slight change in position can alter the quality of skin contact and, consequently, the accuracy of the measurement. Solving these problems requires combining very careful ergonomic design with noise-compensation algorithms and continuous calibration.
From a clinical point of view, it is likely that, at least in an initial phase, this type of device will be closer to general wellness monitoring than to formal medical diagnosis. In Europe, any step toward regulated health uses would involve oversight by bodies such as the European Medicines Agency (EMA) or national health product agencies.
Implications for privacy and regulation in Europe
Beyond the technical challenges, the possibility of headphones reading brain activity opens up an important debate about privacy, data security and regulation, especially relevant in the European Union, where the General Data Protection Regulation (GDPR) is particularly strict.
EEG recordings can reveal extremely sensitive information: sleep patterns, states of wakefulness, possible neurological disorders, or even, in the future, indirect indicators of attention level. All of this falls squarely within the category of health data, which the GDPR protects with reinforced safeguards and for which it requires clear legal bases and explicit consent.
In this context, a company like Apple would have to ensure the strong encryption of biosignals both in transit and at rest, as well as transparent mechanisms so that the user knows what is being measured, for what purpose, and for how long the data is retained. Control over the deletion and portability of information would also be key.
Another sensitive point is the boundary between a consumer product and a regulated medical device. If future AirPods with EEG sensors are marketed as a tool for general well-being—for example, to measure sleep quality—they will follow a different regulatory path than a device designed for the diagnosis or treatment of diseases, which would require formal clinical assessments and specific certifications.
European regulators have already shown interest in the expansion of digital health and wearables. Any advance in reading brain signals from headphones will likely force them to review and refine regulatory frameworks to prevent innovation from outpacing guarantees for citizens.
Current state of development and time horizon
With everything published so far, the current situation can be summarized on three fronts: research on AI applied to EEG, patents describing hardware designed to measure biosignals from the ear, and examples of other manufacturers already taking steps in this direction. None of these, for now, implies an imminent launch of AirPods that read the brain.
Apple's own study on PARS makes it clear that this is research and experimentation, not a specific product in its final stage. The main objective is to test whether a model can learn the temporal structure of brain waves on its own from unlabeled data, and whether this improves performance on different decoding tasks.
The 2023 patent, meanwhile, details a possible implementation of electrodes and sensors in headphones, but patents do not always translate into products. They often serve to protect development paths or to set aside ideas that may be revisited years later, if the technology matures and the market responds favorably.
Meanwhile, companies like Aware Custom Biometric Wearables have already launched headphones designed specifically to capture brain activity and signals linked to the vagus nerve and blood vessels in the ear canal. These examples demonstrate that the biometric wearables sector is not an isolated oddity, but a field in full swing.
Given the industry's usual pace and the need for rigorous validation of these technologies, it doesn't seem likely that AirPods with advanced EEG readings will reach the market immediately. Experts estimate that, if research results continue to be favorable, we could be talking about timeframes of several years or even the next decade to see robust and well-integrated solutions in consumer products.
Apple is building the components needed for future AirPods to go beyond audio and function as a discreet window into the user's brain activity, combining in-ear sensors, AI algorithms capable of learning without human labels, and a software layer focused on wellness and health. The leap from laboratories and patents to everyday life in Spain and the rest of Europe will depend on technical maturity, the response of regulators, and the extent to which users are willing to entrust their brain data to a device that, for now, is only used for listening to music.