Apple is once again facing scrutiny in France as prosecutors reopen an investigation into how the company handled voice recordings from Siri, its popular virtual assistant. The probe, led by France's cybercrime office OFAC, comes nearly six years after the 2019 revelation that external contractors were analyzing Siri conversations for “quality control” purposes.
This time, however, French authorities seem determined to dig deeper into whether users’ privacy was compromised and how extensive the data collection truly was.
The case originated from a complaint filed in February by the French human rights organization Ligue des droits de l’Homme. It relies heavily on the testimony of whistleblower Thomas Le Bonniec, a former contractor at Globe Technical Services in Ireland. Le Bonniec’s revelations back in 2019 exposed that Apple’s subcontractors were routinely listening to audio snippets captured by Siri, including moments that were never meant to be recorded – such as private conversations or background discussions accidentally triggered by the assistant’s wake phrase.
While Apple maintains that the recordings were anonymized and solely used to improve Siri’s accuracy, the implications are far more troubling. According to Le Bonniec, the recordings included sensitive personal data: intimate conversations, health-related discussions between patients and doctors, and even background noises from homes and offices. All of it, he says, was gathered without users’ explicit awareness or meaningful consent.
At the time, Apple argued that less than 1% of Siri interactions were reviewed manually and that such data never left the company’s secure systems. But the French investigation questions just how true that was – and whether the data, even anonymized, could still reveal user identities. Critics argue that anonymization often fails when combined with contextual information such as voices, accents, and background sounds.
The controversy echoes similar scandals that have haunted tech giants for years. Apple has already faced a class-action lawsuit in France tied to these allegations, and earlier this year, the company agreed to a $95 million settlement in the United States to resolve a related case. The settlement compensated users up to $20 per Siri-enabled device, though Apple emphasized that it did not admit any wrongdoing. The company maintained that Siri recordings were never used for marketing purposes and that it does not sell user data.
Nonetheless, the French probe highlights a deeper issue: trust in voice assistants. These devices are designed to blend seamlessly into users’ daily lives – but that very intimacy makes them potential privacy risks. Each time Siri, Alexa, or Google Assistant misfires and starts recording unprompted, it captures fragments of our lives we never intended to share. Regulators, therefore, are increasingly demanding transparency on how long companies store these snippets, who can access them, and for what purpose.
Apple’s defense has long been that its privacy philosophy sets it apart from competitors; the company frequently contrasts its practices with the data-hungry advertising models of Google and Meta. Yet even Apple’s most loyal fans are beginning to wonder: if a company so focused on privacy can mishandle voice data, what hope is there for everyone else?
Le Bonniec has called for a full audit of Apple’s data-handling practices, including how many recordings have been made since 2014 and where exactly they are stored. Privacy advocates hope the renewed French investigation will set a precedent for greater accountability and perhaps inspire broader EU action under the GDPR framework.
Ultimately, the Siri investigation is not just about Apple; it’s about how much control users truly have over their digital lives. Whether the case leads to penalties or policy changes, it’s a reminder that in the age of AI assistants, convenience often comes at the cost of invisible compromise.
2 comments
Man, 6 years later and we’re still talking about Siri spying on us. Kinda wild that these assistants keep listening even when they’re not supposed to
People really need to start reading those terms and conditions… yeah they’re boring, but this stuff is usually right there in fine print 😩