Apple and Google Facing Lawsuits for Recording User Conversations Without Consent

Apple is facing a lawsuit for recording people's conversations without them knowing. (Image: tookapic via Pixabay)

Big tech firms Apple and Google are facing lawsuits over reports that their voice assistants violate user privacy. The assistants begin recording when users speak certain trigger words, but at times they accidentally record conversations without the user's knowledge.

On Sep. 2, U.S. District Judge Jeffrey White ruled that Apple must face a proposed class-action lawsuit. Apple sought to dismiss the case, but the judge said the plaintiffs could try to prove that Siri, Apple's voice assistant, routinely records private conversations through "accidental activations."

In addition, Apple is accused of allowing third parties, such as advertisers, to access the recordings. According to two users, their discussion about Air Jordan sneakers and Pit Viper sunglasses led them to receive ads for these items. Another user who discussed a specific treatment with his doctor received targeted ads afterward.

“The private setting alone is enough to show a reasonable expectation of privacy,” White wrote. The plaintiffs claim that the tech company violated the California privacy law and federal Wiretap Act.

In response to the lawsuit, Apple said that the recordings made by Siri were not associated with any "identifiable individual." The company said it believes in user privacy and noted that Siri is designed so that users can disable it at any time.

“Apple actively works to improve Siri to prevent inadvertent triggers and provides visual and audio cues (acknowledged by several Plaintiffs) so users know when Siri is triggered,” the company said in its motion to dismiss.

Back in July, U.S. District Judge Beth Labson Freeman ruled that Google must face a similar lawsuit, in which plaintiffs accused the firm of illegally recording and disseminating conversations of people who had accidentally activated the Google Assistant feature on their smartphones.

Freeman said that the plaintiffs had used their Google Assistant-enabled devices often enough to have a reasonable expectation of privacy when engaged in conversations. Though Google discloses in its privacy policy how it collects information for ads, "it does not sufficiently apprise users that it will use recordings made in the absence of manual activation or a hot word utterance." The term "hot word" refers to phrases that trigger the Google Assistant into action.

Google sought to dismiss the suit, arguing that the company "never promises" that its voice assistant will activate only when users intend it to. The company also stated that the plaintiffs failed to show that Google had broken any contractual guarantees or that they were harmed in any way.

According to market research company eMarketer, around 128 million Americans used voice assistants at least once a month as of late 2020. “I think this lawsuit is part of people finally starting to realize that Siri doesn’t work for us, it works for Apple,” Nicole Ozer, the technology and civil liberties director of the American Civil Liberties Union (ACLU) of California, said of the lawsuit against Apple.

Back in 2019, a controversy regarding Amazon’s Alexa voice assistant emerged. A report by The Sun showed that Amazon staff listened to British couples having private conversations and even engaging in intercourse. The noises made during intercourse were among the triggers that activated Alexa’s recording feature. 

In the same year, Apple announced that it was suspending its use of human reviewers who listened to and graded Siri recordings. However, months later, the company reinstated the program, giving users the option to opt out of it.
