Apple to start remotely scanning billions of iPhones for kiddie porn, findings to be reported

What better way to target and suppress millions of dissidents and refuseniks?

Report to authorities what you just "found" on their phones, through a search with no probable cause, no warrant, no consent and no knowledge that it even took place, until you get hut hutted at oh dark thirty.

Can we finally drop the ridiculous notion that somehow this is OK because it's "their" private company?



Apple to Begin Scanning All iPhones for Images of Child Sexual Abuse

https://www.breitbart.com/tech/2021...all-iphones-for-images-of-child-sexual-abuse/

Lucas Nolan 6 Aug 2021

Tech giant Apple recently announced a new feature that will allow it to scan iPhone and iPad photos to detect if they contain sexually explicit imagery involving children, which Apple will report to authorities — however, many privacy experts are worried about the implications of Apple snooping on user content. One expert points out that Apple’s move is well-intentioned, but they should be thinking about one important question: “What will China want them to block?”

Forbes reports that this week Apple announced a new addition to its upcoming iOS 15 and iPadOS 15 firmware for iPhones and iPads. The new feature will allow Apple to scan user photos stored in Apple’s iCloud service and determine if they contain sexually explicit activities involving children.

In a statement on its website, Apple said: “This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).” NCMEC works as a reporting center for child sexual abuse material (CSAM) and collaborates with multiple law enforcement agencies in the U.S.

Apple claims that the way it detects CSAM is “designed with user privacy in mind,” and it is not directly accessing iCloud users’ photos, but rather utilizing a device-local, hash-based lookup and matching system to cross-reference the hashes of user photos with the hashes of known CSAM. If there is a match between a user’s photos and the CSAM database, Apple manually reviews the issue and will then disable the user’s account before sending a report to NCMEC.
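The article's description of a device-local, hash-based lookup can be illustrated with a toy sketch. This is an assumption-laden illustration, not Apple's actual system: it uses exact SHA-256 digests and a made-up hash set, whereas Apple reportedly uses a perceptual hash ("NeuralHash") and a private set intersection protocol, neither of which is reproduced here.

```python
import hashlib

# Hypothetical "known image" hash database (placeholder strings, not real data).
known_hashes = {
    hashlib.sha256(b"example-known-image-1").hexdigest(),
    hashlib.sha256(b"example-known-image-2").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_hash(b"example-known-image-1"))  # True
print(matches_known_hash(b"family-vacation-photo"))  # False
```

Note the design difference this toy version glosses over: an exact hash only matches byte-identical files, while perceptual hashes are built to also match re-encoded or resized copies, which is exactly where questions about thresholds and false positives arise.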

However, many privacy experts are extremely worried about the new system.

NSA whistleblower Edward Snowden tweeted about the issue stating: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk

— Edward Snowden (@Snowden) August 6, 2021

Alec Muffett, a noted encryption expert and former Facebook security staffer, also expressed his worries over the system, telling Forbes: “How such a feature might be repurposed in an illiberal state is fairly easy to visualize. Apple is performing proactive surveillance on client-purchased devices in order to defend their own interests but in the name of child protection. What will China want them to block?”

Apple has always branded itself as a privacy-focused tech firm, even ridiculing companies like Google with a large billboard at CES which stated: “What happens on your iPhone stays on your iPhone.” It seems that may not be the case for much longer.

Read more at Forbes
 
Porn isn't what they are looking for. They are looking to CONTROL SPEECH, PERIOD.

When we have Individual Censorship, they are disrupting the people's ability to organize a MEANINGFUL RESISTANCE. Porn, as always, is the excuse they hide behind.

And for the record, they couldn't give two fucks about children. I am sure that the heads at Apple, Microshaft, Twatter, Netfux, and Googlag, in addition to a good number of people in high positions in government around the world, are actively involved in Human Trafficking.
 
It's just an excuse to plant crap on someone they don't like.

Oh look, somebody said something bad about leftists, let's put some child porn on his system and arrest him.
 
are they finally gonna ALLOW ME to delete the gay porn they pushed on our phone...asking for a friend
 
It's just an excuse to plant crap on someone they don't like.

Oh look, somebody said something bad about leftists, let's put some child porn on his system and arrest him.

If a person can guarantee that NO ONE, including the Manufacturer, has ANY access to their phone, then and ONLY then should they be held FULLY accountable for what is on it. But as you just said, it is an EASY way for them to plant evidence.

+Rep
 
This is why I'm going to get the freedom phone, the only phone made by conservatives for conservatives. It's endorsed by Candace Owens!
 
https://twitter.com/kaepora/status/1423738825369604106
 
Previously they were scanning everything uploaded / backed up from your phone. Now they're going to scan things that you keep only on your phone.

But the point is this: the probability that this news item was published before the scanning actually began is approximately... zero.
 
On matching software: It doesn't matter if they are matching pictures, faces, signatures, etc. The concept is the same. The threshold can be set low enough to guarantee hits. Anyone could be targeted.

If history is any guide, the way it works will be with a derived confidence code of some kind. For example, on a scale of 1-10, drones can be set not to fire unless they get a confidence code of 9 (enemy or not). The problem is that the people who set these levels will have no concept of what that number means or how it is derived. Often, levels as high as 8 will actually be nearly meaningless. A roll of the dice. But those who issue the orders will be clueless.

Yep, huge differences depending upon how the signature is obtained.

Lol. There's an ad for Gartner at the bottom of the page. They are a big player in this space. Being familiar with this type of matching software, I can say it is by no means perfect. Usually there will be multiple parameters set that determine how the matching is done, for example what confidence code or percentage constitutes a match. While you might think an 80% match is pretty good, it's not. I would like to see what this software accepted and rejected.
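The concern about lax thresholds is easy to demonstrate numerically. The sketch below is a toy model under stated assumptions, not any real vendor's algorithm: it compares unrelated random 64-bit hashes by the fraction of bits they agree on, the same style of similarity score that perceptual-hash matchers report.

```python
import random

random.seed(0)  # deterministic run for the illustration

def hamming_similarity(a: int, b: int, bits: int = 64) -> float:
    """Fraction of bit positions on which two hashes agree."""
    return 1 - bin((a ^ b) & ((1 << bits) - 1)).count("1") / bits

# One "target" signature and 1,000 completely unrelated random signatures.
target = random.getrandbits(64)
crowd = [random.getrandbits(64) for _ in range(1000)]

def count_matches(threshold: float) -> int:
    """How many unrelated items clear the match threshold."""
    return sum(hamming_similarity(target, h) >= threshold for h in crowd)

# Unrelated hashes agree on about half their bits purely by chance,
# so a lax threshold flags nearly everyone while a strict one flags no one.
print(count_matches(0.9))  # 0
print(count_matches(0.4))  # hundreds of spurious "matches"
```

In this model a 40% threshold flags roughly 95% of completely unrelated items. Real perceptual-hash scores are not uniform random bits, but the direction of the effect is the same: the lower the threshold, the more innocent material "matches."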

Adam Laxalt interview with Shannon Bream



He also says in the interview that the resolution of the signatures on file is not high enough to adequately use the matching software.

A 40% verification setting with matching software is a complete joke.
...
 
I have never owned an apple product and never will.

What if somebody has a picture of their kid in the bathtub? Are they getting turned over to the feds?

Does anyone actually believe this is about kiddy porn anyway?
 