Apple Inc. was sued over claims that the privacy of Siri users was violated when human reviewers listened to customer recordings. The lawsuit was filed as a class action Wednesday, just days after Apple said it would pause its program in which company contractors would listen to a small portion of Siri inputs to improve the voice recognition service.
Amazon.com Inc. and Alphabet Inc. took similar steps after reports emerged about contractors hearing private information. Bloomberg News reported earlier this year that Apple had a team that listened to select Siri recordings.
The use of human reviewers by Apple, Google and Amazon already has spurred examinations by lawmakers and regulators in the U.S. and Europe. Privacy advocates have voiced concern that the companies’ practices could violate users’ rights, particularly in cases where devices begin recording unintentionally or without the user’s knowledge.
Apple has said that it listened to fewer than 1% of commands and that it intends to review only commands that users deliberately direct to Siri. The company said that when it re-enables the listening program, it will allow users to opt out of participating.
Apple didn’t immediately respond to a request for comment on the lawsuit. The company also has said that the recordings are stripped of personally identifiable data, but reports have indicated that contractors could see some location information.
In Wednesday’s complaint, filed in federal court in San Jose by the adult guardian of a child in California, both of whom use iPhones, Apple is accused of violating a California privacy law that prohibits recording people without their permission.
The allegation is based on a story in the Guardian newspaper in the U.K., which said Apple contractors “regularly” listened to recordings without the knowledge of the people recorded. The unauthorized recordings included confidential medical information, drug deals and sexual encounters, according to a person described in the story as a company whistle-blower, but who wasn’t identified.
Apple’s user agreement gives the company the right to record users when they activate Siri with the “Hey Siri” command. But Siri “can be activated by nearly anything,” including the sound of a zipper or a user raising an arm, according to the complaint.
The plaintiffs also accuse Apple of lying to Congress in written answers to questions about its privacy policies. One question asked, “Do Apple’s iPhone devices have the capability to listen to consumers without a clear, unambiguous audio trigger?”
Apple answered: “iPhone doesn’t listen to consumers except to recognize the clear, unambiguous audio trigger ‘Hey Siri.’”
The case is Lopez v. Apple Inc., 5:19-cv-04577, U.S. District Court, Northern District of California (San Jose).