Just because Apple has a plan — and a forthcoming security feature — designed to combat the spread of child sex abuse images, that doesn't mean everyone's getting on board.
WhatsApp boss Will Cathcart joined the chorus of Apple critics on Friday, stating in no uncertain terms that the Facebook-owned messaging app won't be adopting this new feature once it launches. Cathcart then went on to lay out his concerns about the machine learning-driven system in a sprawling thread.
"This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control," Cathcart wrote midway through the thread. "Countries where iPhones are sold will have different definitions on what is acceptable."
While WhatsApp's position on the feature itself is clear enough, Cathcart's thread focuses mostly on raising hypothetical scenarios that suggest where things could go wrong with it. He wants to know if and how the system will be used in China, "what will happen when" spyware companies exploit it, and how error-proof it really is.
The thread amounts to an emotional appeal. It isn't terribly helpful for those who might be seeking information on why Apple's announcement raised eyebrows. Cathcart parrots some of the top-level talking points raised by critics, but the approach is more provocative than informative.
As Mashable reported on Thursday, one piece of the forthcoming security update uses a proprietary technology called NeuralHash that generates a hash — a signature, basically — for each image file and checks it against the hashes of known Child Sex Abuse Materials (CSAM). All of this happens before a photo gets stored in iCloud Photos, and Apple can't do or look at a thing unless the hash check sets off alarms.
The hash check approach is fallible, of course. It's not going to catch CSAM that isn't catalogued in a database, for one. Matthew Green, a cybersecurity expert and professor at Johns Hopkins University, also pointed to the possible risk of someone weaponizing a CSAM file hash inside a non-CSAM image file.
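To make the general idea concrete, here's a minimal sketch of database-style hash matching. This is not Apple's actual system: NeuralHash is a perceptual hash (visually similar images produce matching signatures), while this sketch uses a cryptographic hash purely to illustrate the check-against-known-signatures step. The `KNOWN_HASHES` set is a hypothetical stand-in for the real CSAM hash database.

```python
import hashlib

# Hypothetical database of known-bad signatures (hex digests).
# The entry below is the well-known SHA-256 digest of an empty file,
# used here only so the example is self-contained and testable.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_signature(data: bytes) -> str:
    """Compute a hex signature for a file's contents."""
    return hashlib.sha256(data).hexdigest()

def flags_match(data: bytes) -> bool:
    """True if the file's signature appears in the known-hash database."""
    return file_signature(data) in KNOWN_HASHES
```

Note the limitation this sketch makes obvious: changing even one byte of a file produces a completely different cryptographic hash, so only exact copies of catalogued material would match. That is why a perceptual hash like NeuralHash is used in practice, and also why any hash-based approach can only catch material already in the database.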
There's another piece to the security update as well. In addition to NeuralHash-powered hash checks, Apple will also introduce a parental control feature that scans images sent via iMessage to child accounts (meaning accounts that belong to minors, as designated by the account owners) for sexually explicit materials. Parents and guardians who activate the feature will be notified when Apple's content alarm trips.
The Electronic Frontier Foundation (EFF) released a statement critical of the forthcoming update shortly after Apple's announcement. It's an evidence-supported takedown of the plan that offers a much clearer sense of the issues Cathcart gestures at vaguely in his thread.
There's a reasonable discussion to be had about the merits and risks of Apple's plan. Further, WhatsApp is perfectly within its rights to raise objections and commit to not making use of the feature. But you, a user who might just want to better understand this thing before you form an opinion, have better options for digging up the info you want than a Facebook executive's Twitter thread.
Start with Apple's own explanation of what's coming. The EFF response is a great place to turn next, along with some of the supporting links shared in that write-up. It's not that voices like Cathcart and even Green have nothing to add to the conversation; it's that you're going to get a fuller picture if you look beyond the 280-character limits of Twitter.