Chances are, you or someone in your family uses Apple’s Siri multiple times a day to check a calendar, look up an actor’s name, and more. Then the oddly specific ads come in when you’re browsing social media. You see an ad for shampoo after you’ve talked with friends about your hair care routine. You get a Spotify suggestion for workout playlists after you’ve told your brother you want to try CrossFit. It’s definitely creepy, but a new report suggests that’s just the tip of the iceberg.

Siri is supposed to wake up only when an Apple user says, “Hey Siri.” However, as The Guardian reports, Siri sometimes activates even when those words aren’t spoken. An anonymous source told the paper that Siri can wake up when it hears phrases that merely sound like “Hey Siri,” and even the sound of a zipper can trigger it. The whistleblower said the Apple Watch and Apple’s HomePod smart speaker appear to be the most prone to these accidental recordings.

Like other tech companies that make voice assistants, Apple hires contractors to review Siri recordings. This helps improve Siri’s accuracy and its ability to make suggestions. But it also means those contractors hear things many people would rather keep private. The anonymous source, one of these contractors, told The Guardian that reviewers regularly hear all kinds of confidential information, including drug deals, intimate encounters, and medical details.

But that’s not all. The recordings are accompanied by detailed user data, including location and contact details. According to The Guardian’s source, Apple’s contractors are given no procedure for handling these accidental recordings beyond reporting them as technical problems.

Apple isn’t the first tech company to land in hot water over its voice assistant. Earlier this summer, Amazon confirmed that it keeps Alexa recordings indefinitely unless a user specifically requests that they be deleted. Amazon has also admitted that its employees listen to small samples of that audio.