U.S. Tech Giant Apple Under Scrutiny for Secret Government Surveillance Feature

Darren Maung
Published: September 15, 2021
Apple Store photographed on August 08, 2021 in Dusseldorf, Germany. (Image: Jeremy Moeller via Getty Images)

While Chinese telecom giant Huawei has been at the center of several surveillance-related controversies, its U.S.-based rival Apple has recently sparked outrage of its own.

A planned surveillance program intended to scan pictures on the company's messaging app has come under fire from users nationwide, with many demanding that Apple drop the program altogether.

Combating child exploitation or government surveillance?

Apple's new message-scanning program is intended to help combat child exploitation and abuse by scanning photos on iMessage, Apple's main messaging app, for potential Child Sexual Abuse Material (CSAM).

The scanning, however, would require comparing photos against a secret government database of known child abuse images, expanding the government's surveillance and censorship capabilities.

According to the Electronic Frontier Foundation (EFF), the feature "breaks the promise of end-to-end encryption," while also potentially putting kids at risk, especially if the self-learning algorithm flags all images of child abuse, including those that could serve as evidence against an abuser.

The feature includes some safeguards, such as an "opt-in" that lets child accounts choose whether to send or receive flagged images. Yet it would still be easy for users who are not children to create child accounts, and the feature would give Apple oversight of relationships between parents and children, or between children and their friends.

Such easy access to children's content could give potential child abusers, many of whom may be parents, a far more dangerous grip on their children. Even victims who are not children could be forced by their abusers to create child accounts so they can be monitored at all times.

Opposing the plan

Despite Apple's promise to reject government "demands to build and deploy government-mandated changes that degrade the privacy of users," the EFF fears that this promise would not be enough, and that the scanning feature could expose Apple to privacy, legislative, and court-related pressure.

An EFF political analyst said, “Users want the devices they have purchased to work for them – not to spy on them for others. Delaying the program is a step in the right direction, but it is not enough. Apple needs to take the next step to protect its users and abandon the program.”

In opposition to Apple's scanning feature, protesters from the EFF and Fight For the Future (FFTF) plan to gather in front of the company's stores to criticize the program.

Moreover, more than 90 digital and human rights organizations from around the world published an open letter urging Apple to drop the program, warning that the surveillance feature would not stop the spread of CSAM, but would instead censor free speech and threaten the privacy and security of users, especially children.

Apple versus Epic

This has not been the only issue causing trouble for Apple.

A court case between Apple and game company Epic, which is partly owned by Beijing-based Tencent, involved a dispute over the removal of Epic's Fortnite game from the App Store after the tech giant barred in-app links and promotions that directed players away from Apple's payment system.

The case concluded with Apple maintaining its decision to exclude Fortnite from the App Store. However, the company was ordered to allow other developers and apps to steer users toward alternative ways of purchasing content.

With this change of policy, Apple could potentially lose billions of dollars in commission revenue from its users.