@@cardboardpackage Tim Cook has been hiding while he throws Craig under the bus. I think the CEO of the company should be the one explaining this to its customers and media, not his VP of Software Engineering. Cook just threw him under the bus and started driving it.
@@MidNiteR32 nah, I think it was the right move. First of all, Craig is more agreeable, and second, the risk is lower. And to be honest, Craig managed it admirably imo
@@albinjt probably because Apple restricts certain channels on Telegram. It’s insane that I have to go to the browser version of Telegram to view those restricted channels.
@@bigrich9654 What sort of channels are they restricting tho? And what are the restricted channels on iOS you have been accessing? Could ya send us the links?
3:21 “pornography of any other sort” I’m glad Craig essentially said that Apple knows and understands that people simply just have nudes on their phones
@@nicolelea615 My friend, there are people out there that are just as bad as mentioned, but there are also people who simply have private photos of their partners/spouses. Don’t put everyone and everything under one group, it’s not fair. Hope you understand (:
If they are stored on their servers in this day and age I feel as if it’s your fault for trusting big tech. Either way, we’ll all forget about this in a couple of weeks. We basically already have
If they can install a program that tells me my battery is at 10% after 10 minutes of use, when a quick hard restart brings it back to 100%, there is no telling what they can install on your phone. If Ur f-ingdeau gives Apple a couple hundred million of our tax dollars because we proved that the vax was ineffective and self immunity has an 80% success rate at beating the virus, there’s no telling what those greedy blasters will do.
Tim Cook is the CEO; Craig is the software person. He knows what everything does, he made it. Tim Cook does not do software. And anyone that stores things to the cloud doesn't own the servers; all they own is the main device storage. They don't scan on the device, they scan in the cloud only
“It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.” -Warren Buffett Apple is feeling this hard, hence the panicked response to media.
@Carrot Cruncher I'm an Android guy through and through, but as a network engineer I can tell you programs are much more flawed than people realize; they're doing this for a margin of error
Now that this whole news has gotten out, actual pedophiles aren’t going to be storing their photos on iPhone anymore. So basically this feature is useless now.
The people stupid enough to store highly illegal material in cloud storage won't be stopped by this news. It was always one of the easiest ways to get caught
It's definitely a little more complicated. I can "own" a car, but there's a lot of restrictions on what I can do with it or to it, especially if you want to use it on a road. Ownership doesn't really imply full control most of the time; even with land you have tons of laws limiting what you can do with it
@@joshgribbon8510 Exactly, we as consumers don't really own anything anymore, and that's true the world over. We don't have any rights, just privileges, until someone decides to take them away.
I’m glad that she pushed the “who owns your phone” question and the conclusion. I applaud WSJ for pushing the exec on something; it didn’t feel like a scripted Apple BS interview. Now, how do we know the pictures being provided by those associations won’t be manipulated into searching for other stuff? At the end of the day Apple has no idea what those hashes are. Who knows what the hash provided was.
First of all, thank you for covering the issue. I wish you had pressed him on what type of audit he mentions, because to me anyone can force Apple to add a database via the FISA court. I want to know what is done to prevent that from happening instead of taking Apple at its word.
@@zonka6598 If that's what works for you and gives you a sense of privacy then by all means, but just note that if you use Google they’re already doing it, and worse, so yeah…
I don't want my images to be scanned even if I don't engage in any illegal activities. It doesn't matter if it's AI or a human looking through my photos it just makes me feel uncomfortable.
@@johansm97 lol I don't understand how people think everything in their iPhones aren't already being touched by AI, especially photos. How do you think your photos look so good? Computational photography using AI. How do you think they group faces and show you memories? AI. This is just Apple using AI, in a much more careful way than other companies, to do something. That's all it is, and people are losing their minds
@@7billza they actually aren't. Facial recognition on iPhone is done on device; Apple doesn't scan anything. It's the only company that believes in privacy
@@bouzianenadhir8503 then they wouldn't have destroyed end-to-end encryption to Apple servers, aka iCloud. If they can snoop around while a photo is uploading to the cloud, it's not end-to-end encrypted. It's not private. As simple as that.
Same tech can be used to identify political dissidents, protesters, and just about anybody. Imagine matching memes commonly shared by people of the groups to identify people for political persecution.
Yes. Even If we take them at their word and accept that they can't see other photos because they can only see the ones that neural network has very tightly matched for. They still haven't said anything about the possibility of them searching for other stuff.
I'm sure your isp, phone provider, Google, facebook (including Instagram), and any other social media or messaging platform do that. If you truly care about privacy, you have to get an opensource operating system, and only use opensource apps. There's no way around it
"It's not a backdoor. But it can be manually verified by humans in case our algorithm finds a match." Hmmmmmmm 🤔 That sounds suspiciously like a backdoor to me.
@@ifiwantyoutofeel no, it's the fact that it might be a faulty system. How can it differentiate an image of a child posing in a sexual manner in lingerie from a baby taking a bath? Will it flag both, neither, or one of those images? Simple things like that can really impact a person's future
As a longtime Apple customer (since 1986), I was thrilled with Tim Cook's statement about privacy and your history of resisting law enforcement and government when it comes to privacy. Now you have appointed yourself the law. And now you are going to scan my phone without my permission. At least the government has to get a warrant. Just a month ago I got rid of my Fitbit watch because Google bought the company, and bought an Apple Watch because of Apple's supposed commitment to privacy. You are not the government, so I have no recourse if you abuse my privacy. You can do whatever you think is right and I have no recourse. There are only two operating systems in the world, and we just have to accept that Big Brother Apple, like Big Brother Google, knows what’s best for the unwashed. We have just about as much recourse as people in China.
Or Apple avoided government parties such as the FBI and CIA for so long (through past features such as destroying all user data should the phone's password be typed wrong 10 times) that it could jeopardize the company. Donald Trump single-handedly managed to issue an executive order for Google to stop providing the official version of Android and its services to Huawei, and Huawei was almost ready to exit the market. Now imagine Apple being forced to show all users' iCloud data to governments due to child pornography claims even though you don't have any. That would suck for them and for users' privacy. Apple (for now) found an in-between solution that still protects legit users' data on iCloud and protects Apple from governments, by giving an actual "backdoor" to them after many years (as it seems from Craig's tone). The only way this feature gets out of hand is if it expands to political parties or political correctness, such as someone posting an LGBTQ funny image that seems insulting in Apple's eyes. Then things will not look good for Apple.
@@milantoth6246 It was extreme. My concern is the fact that the internet and media companies are becoming a necessity. Most businesses or utility companies assume you have internet access. The problem is the tools you need to access the internet are companies that can make arbitrary decisions that change your access to the internet, and you have no recourse. There are only two operating systems in reality Apple and Android, private companies.
He doesn't seem to understand the fundamental reason people are upset. The hash database is on your phone. The scanning is on your phone. This means that we have no guarantee that our phones will be private in the future.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now. Why are you in an uproar now? When they first said your phone was Private, why didn't you roll your eyes and say "ya, but what about the future?"
@@bhavinbijlani They already built the tech to do it. That was their argument against creating a backdoor in 2015. Now it exists and Apple has no excuse that they “can’t comply.” They’ve already stated that they developed the technology to comply.
People are upset because they don’t understand the underlying technology, the same way that a lack of education about natural forces and science leads people, still, to call someone a witch and persecute them.
The reason to worry about this photo scanning is that there's no way it doesn't evolve. Currently, it only checks 1) photos being uploaded to iCloud 2) that match a database of known CSAM. Importantly, this doesn't do anything about new CSAM created in the abuse of children. Catching new material is the obvious next step. And there's no way to achieve that with the current hashing architecture. It has to be done by constantly monitoring all media on the phone, probably with some "AI moderator". And there's no way that some government doesn't demand that this monitoring be used to detect something other than CSAM, like political dissent (remember: China is Apple's biggest market). That's the worry. This new tech is only *kinda ok* as long as it doesn't evolve a single step beyond what it is now. And there's virtually no chance of that happening.
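The distinction the comment above draws can be made concrete. A minimal sketch, with the caveat that Apple's real system uses a perceptual hash ("NeuralHash") rather than the cryptographic SHA-256 used here as a stand-in; the matching logic against a fixed database is the same either way, and it shows why material that was never catalogued can never be caught by this design:

```python
import hashlib

# Hypothetical database of hashes of *previously catalogued* images.
# (Stand-in: real CSAM matching uses perceptual hashes, not SHA-256.)
KNOWN_HASHES = {
    hashlib.sha256(b"catalogued-image-1").hexdigest(),
    hashlib.sha256(b"catalogued-image-2").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """True only if this image was already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# A previously catalogued image matches...
assert matches_known_database(b"catalogued-image-1")
# ...but brand-new material, never catalogued, can never match.
assert not matches_known_database(b"newly-created-image")
```

The design choice cuts both ways: it limits what the system can see today, but catching new material would require an entirely different (and far more invasive) architecture, which is the commenter's worry.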
Yeah, this tech should evolve, because this step alone doesn’t solve the problem. Regardless, it was either going to get created to do the right thing or the wrong thing. That’s just how it works. For now its use case is positive.
This explanation from Apple is even more worrying. They describe a technical solution where no one will be able to independently evaluate what content triggers the alert. Hash results will be encrypted so that no one will know what content matches what "hash of interest", on the device or on the backend. Any politically sensitive content could be part of the database without anyone ever knowing. Never trust anyone's word to keep you safe from technology abuse.
The explanation is literally a lie. "We don't process the images on your phone, here's the misunderstanding: oeighoihzgoiehrg hieogheorighe oriheoirhgoierhg scanning on your phone, yes, but, eoitgoiehgeihgo." That could be the TL;DW of the video, tbh
The alternative is, as Google and MS do, to scan the whole cloud content of all users. Apple wants to be in a position of not being able to see our data. And that's the way to protect our privacy while trying to follow the laws of the US and EU, which want more and more supervision.
Oh, it gets worse. At some point audits (real humans) get involved. At this level who knows what can happen… and if anything did go afoul at Apple, how would you know? What happens when hackers find a way to inject foul hashes, or FISA requests force Apple to apply this tech for political reasons (under the guise of domestic terrorists)? In fact, the timing is extraordinarily on point with recent updates on terrorism.
@@tomboss9940 The alternative is better. You, as a user, can decide whether or not your content undergoes the screening. While your data rests on your computer or phone, it remains yours and under your sole control. What Apple is doing is potentially removing that control from your hands: any data on your phone may be monitored without you even granting that right. The only thing preventing them from doing that is their good will. Technology history has taught us that you should never trust anyone's word to prevent technology abuse (be it knowing or not).
It doesn’t matter what the steps are between if A is uploading a photo and Z is them reviewing/alerting authorities. They “Review your private photos” despite the letters in between. Don’t get lost in the steps.
They also announced it on a Friday afternoon because they knew there would be blowback and they just wanted people to forget about it during the weekend. Well that’s not happening.
@Apple Genius Jokes on you, I dont use Samsung or Apple. Sad for u Apple fanboy, need to pay 3X the price of a smartphone, just to get low battery, low storage/ram, ugly notch, slow charging and 60hz display in 2021. LOL so sad. Just make sure u dont break the glass, the repair price is more expensive than 1 android phone.
This was a weak interview. Craig threw some big fancy words when asked to simply describe the system. No hard questions asked and this seemed more like a PR move than an interview
"Craig, tell us why it's okay to treat your customers as if they are guilty until proven innocent, and why you want to foist the system resources onto the users instead of your data centers..." That's what should have been asked.
@@ssud11 Yep, they also pay through future access to things, not just through money. So if they cover this in a way that Apple likes they get future access to news first because they're seen as trusted.
It seems like Apple still doesn't understand just how strange this has made their most loyal and fervent customers feel. This has the potential to really spiral out of control in PR terms, much like the 'Apple purposely slows down phones' headlines that came out of the throttling due to battery age. This loyal base kinda sets the tone for the sentiment around Apple, and right now they are spewing and the issue isn't going away. I don't think the way Craig handled this will do anything to dampen the concerns either: condescendingly dismissing the backdoor concerns while giving no details on how it will be expanded, or on how we can guarantee Apple is limiting it to child porn. I understand Apple has a new head of PR; this is making people question just what Apple has been up to before that their slick PR glossed over. Some kinda line has been crossed here that I've never felt/seen in my 25 years of using and following Apple.
I feel exactly the same way. I’ve been apple only since I was 8… huge fan of the company… they basically bought my house… some line is being crossed here. Like maybe I’m not in love anymore…
The apple loyalists will always fall in line. As a person who used to think "this is surely the last straw for Apple fans" I don't doubt anymore. I buy the stock and get "rich" with the winning team. Public backlash needs to be HUGE to stop this. Apple hard-core fans aren't revolting against Apple. I've put my money on that.
But isn’t it better than having the door wide open as it is on many cloud services? I think this is the best balance they could find between not hosting CSAM on their servers and also protecting customer privacy.
@@Dlawderek absolutely not. However good their intentions are, I will never agree to having my private data monitored. Something many forget is that one’s privacy is protected by law. Even if police were to illegally obtain legitimate evidence against someone (be it through illegal wiretapping or otherwise), that evidence will be rejected as unlawfully obtained. What Apple is doing here is basically rephrasing “we will hack into your storage and check if you have anything illegal” into “we will scan all your photos, and if you don’t agree then we will stop providing service to you even if you paid for it”. Barbarism.
@@Dlawderek NO. If you upload to a cloud service, it's not your hardware or a private space. Apple is now saying my hardware is actually theirs too, to do with as they please...
Many of us understood exactly what this was from day one, this "talking down to" by Apple is gross. You don't control what's in the database and a government can change it from just CSAM to anything they want. Creating the backdoor is the problem.
I disagree completely with calling this a "backdoor". Apple is not *entering* your phone to do anything, Apple is scanning what *you decide to send to them*. This is more of a bouncer than a backdoor.
This question cannot and should not be asked of Apple directly but of the government and the entities responsible for controlling data security. All companies have a similar or identical technology, and unlike Apple they’ve been using it for decades now.
@@gobi817 Because you have a reasonable expectation of privacy on your personal cell phone and companies don't have the right to search and report your content to the police. They shouldn't be looking at your data beyond what is necessary to provide cell phone service. iCloud was marketed as a way to store your data, not a service to scan for and prevent illegal activity.
Just don’t upload your photos to Apple then? Also, I don’t think people send well known pictures of drugs to other people. Funnily enough, if Apple has a hash for your drug photo, this proves you didn’t take it yourself.
I've been repairing Apple products for 2 years, and aside from battery replacements I wouldn’t recommend that people who aren't techie/qualified/confident do other things like replacing screens, Lightning ports, Face ID sensors, etc.
From apple: “Could governments force Apple to add non-CSAM images to the hash list? -> Apple will refuse any such demands” So once again… you’re missing the point. Yes, a government could force this… but trust us. Why should we trust them? Who’s the next leadership team? Apple needs to stop this now. I’m honestly considering breaking up with them for the first time in 30 years.
@@hsing-kaichen5062 they also gave in to China and put a data center in China for Chinese iCloud. So China just has to walk over to their iCloud data center in China, pull the physical data, and they have Chinese iPhone data. They already caved to China once. China will ask to add their own CSAM database; will Apple disagree then? And what's in that database? We won't know.
"Who owns this phone?" "Well, customers do, but good luck running any software other than ours on it." Answers to moot questions keep average consumers misinformed.
7:05 should have dug deeper here. The reference hashes belong to child pornography today; tomorrow some state might want to force Apple to add additional reference hashes, e.g., of Winnie the Pooh pictures. If too many Winnie the Pooh pics get uploaded to the cloud, we have our manual verification prompt, thus our backdoor. One could try to weaken the hashes, too, so they cover more pictures, prompting the manual verification on all kinds of pictures. In the end, you still need to trust Apple to only check for the hashes they tell you about. Not quite the advertised "you don't have to trust a single entity".
I’m so lost with why people are upset 🤷🏻♂️ So what if a manual verification prompt occurs if we have too many Winnie the Pooh pictures? Are you saying that Disney could then advertise to us more or something? Like Apple isn't gonna report you to the police for having Winnie the Pooh on your phone
@@UnkleRiceYo they will if you're in China. That’s the point, slick. In China it’s a hidden law not to have the photo referencing their leader as Winnie the Pooh, so they arrest people who do. Apple's software could easily be rolled out to match the picture and report people in China.
@@Chaser-mw1fb So Apple is to blame because of China’s unfair censorship laws? Also, there’s no indication whatsoever that they will be doing anything of the sort.
@@Dlawderek I believe you don't understand the issue. A country like China could say "hey Apple, in addition to CSAM, also scan for the following images when uploading to iCloud (e.g. Hong Kong freedom activism photos)". If Apple then goes "nah, we promised our customers not to do that", China could go "do it or you're no longer allowed to sell your products in China" (a huge market that brings a lot of revenue). It really isn't hard to understand. The problem is not what Apple is doing but the possibility of misuse.
I appreciate the tone and balance of this interview. Nice job. My biggest problem with these features is that Apple is assuming a moral position. Let me say I am 100% aligned on these behaviors being immoral/heinous. What concerns me is simply that they are taking a moral position. What happens when next month, it’s not child porn but “hate words” in iMessage? Hate defined however Silicon Valley defines it. Applying tech to moral subjects is a very slippery slope. To suggest they can’t or won’t misuse this kind of tech in the future is just ignorant/naive.
Legislation or court systems in other countries could easily add requirements to Apple’s scanning database. It’s hard to believe Apple executives could be this short-sighted about a technology. In order to save face Apple can simply say there are problems with the technology and shelve this for the time being.
I think CSAM and “hate words” are not even nearly in the same league. CSAM is illegal and demonstrably dangerous. The 1st amendment protects your “hate words” so I find it hard to believe that Apple would scan or flag this content. This is a “slippery slope” logical fallacy.
@@Dlawderek “Hateful content” like Nazi imagery is illegal in some European countries. What’s to stop governments from requiring to Apple to include that in the database of images they scan for?
@@davehugstrees Maybe they will. If they start censoring political speech by looking through people's images and reporting them, I would be mad. This is not that. If that day comes, we can all turn off our iCloud storage and/or get rid of our Apple products. I don't think outrage is justified in a case where they are taking very cautious steps to curb the storage of CSAM on their servers. It takes 30 instances of hashcodes matching known CSAM before there is an audit. Even if some photos are flagged mistakenly (which I understand to be very rare) it would never reach 30 by mere chance. Even if it did, I would not mind someone at Apple verifying that I have no illegal images in my iCloud. There shouldn't be anything here to worry about.
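The threshold the comment above describes can be sketched in a few lines. This is a simplification: the "30 matches" figure is taken from the comment, and Apple's actual design reportedly uses threshold secret sharing rather than a plain counter, so treat this as an illustration of the gating logic only:

```python
# Hypothetical review threshold taken from the comment above.
THRESHOLD = 30

def needs_human_review(match_count: int) -> bool:
    """A human audit is triggered only once enough matches accumulate."""
    return match_count >= THRESHOLD

# One or two stray false positives do nothing on their own...
assert not needs_human_review(1)
assert not needs_human_review(29)
# ...only at the threshold does any human see anything.
assert needs_human_review(30)
```

The point of such a gate is that isolated false positives never reach a reviewer; the dispute in this thread is over whether that safeguard can be trusted to stay in place.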
@@Eugenepanels yeah it’s really strange how they try to underplay this change in a way. They should have done a comprehensive press release from the start, considering how important this change is.
Dude the whole thing was leaked before they could properly present this. That’s why it’s causing problems, because it wasn’t officially presented by Apple.
"I think our customers own their phones, huh, for sure." Too bad his thinking is not reflecting what is really happening... #righttorepair p.s. this is not an interview, merely a communication from Apple...
In a practical sense, Apple at least has good intentions in doing this. It's unarguably a good thing that they are planning on tracking down phones that happen to have child abuse material on them. I do see why people are mad tho. Apple has always had a long history of keeping information secure for its customers, and this seems like a slap in the face to those who use iPhone because of its security.
Because Apple has always been so vocal about privacy and not letting other apps track your data. And also cuz Apple has started to show ads on their platforms and is hiring people to create a targeted ad network of the kind they have been opposing for so long, pushing out the whole competition. Google scans your data all the time and flags illegal stuff on gDrive, but it's not a big deal cuz they never said they won't do it, or that tracking personal data is a bad thing, like Apple has been saying.
@@blackhatson13 they are big tech companies… and they follow many rules and regulations. It isn't very simple for every employee over there to come and view our private iCloud photos…
Didn't Facebook also do this with its team of "human moderators"? I'm not saying that's not gonna invade my privacy, but without those human moderators, we could be seeing terrorism, porn and those nasty things in our messages/chats
They actually cut the Apple campus out from any word Craig said which could be used to make memes! That means there has to be an agreement for this interview. I wonder if that includes other limitations as well since the interviewer didn’t pressure Apple that much. This feels more like Apple marketing than journalism
Given that this is an exclusive, this is most likely a way for Apple to take control of the situation. Most companies will only agree to these types of interviews if only certain questions are asked to control the narrative.
I was shocked he used that language. I’m guessing Craig, Tim and anyone else giving media interviews are demanding the questions upfront. Then Apple legal, corp comm and marketing can train the two of them with exactly what to say that will answer SOME questions, but not enough to commit to anything that could lead up to being used in a courtroom or in Congress against them.
If the customers owned their phones, they'd be able to install software from wherever they wanted to obtain it. They'd also be able to replace the battery themselves, even if it meant buying a special tool for the job.
They aren’t looking at your photos. The only people that should be worried about this are child predators… which may be telling of why you care so much.
@@cmtheone I’ve worked with law enforcement to put predators in jail before. It’s funny that you’re too dim to see how having your privacy tampered with in the name of the greater good isn’t concerning. Then again, you’re the ideal complacent sheeple that big companies and governments want us all to be. Enjoy your ignorance friend.
@lol what makes you think child predators will store their photos on their phones? Same idiocy as using gun registration to stop violent criminals from using guns to rob a bank.
Even more than that, this whole thing probably started from China, because Huawei got banned. So the CCP lost its surveillance tools and turned to Apple for an answer. What else could force Apple to suddenly launch such an opposite program
@@akhileshjayaranjan5628 If Apple cannot see your photos then what's even the point of this system? Algorithms are faulty and Apple admits that if this system flags something, there will have to be a human to double-check. And that's the big problem right there: they *can* check your photos. Who is to say that they or a local/US government agency wouldn't just check every photo instead of only the ones that have been flagged?
It sounds like a blind raid without probable cause or a warrant: they can't see exactly what you have in your house as they rummage around, but they'll check anyway. It's either private or it's not.
Apple is like: a man comes to a lady during her shower, saying he will keep his eyes closed and just scan for security. People just don’t believe it and don’t buy it. The point is not “a safe way to scan phones”. The point is “DON’T scan my phone”. Don’t means don’t
I like your woman-in-the-shower analogy, but you should have elaborated more on that story. It left me wondering what happens next. When can the man open his eyes?
Did you watch the video? Or did you just not understand how the system works? Yes, Apple scans iCloud images for child porn, but the scans on device stay on device and Apple doesn't know what photos you have, unless the neural hashes match the csam database when you upload it to iCloud, and after levels of scanning.
@@ShubhamKumar-xu2od mm, nice one. It hadn’t occurred to me, but agreed. This is the worry with technology: little by little, we’re getting to that point
I wish Joanna would have asked about the future "enhancement and expansion" of this thing, as Apple announced. Dystopian world we are about to live in.
@@samsonsoturian6013 NO. DON'T TOUCH MY PHONE. DON'T USE MY IPHONE'S COMPUTATIONAL POWER TO DO THE FIRST HALF OF THE WORK. NONE OF MY BUSINESS. I DON'T WANT TO BE INVOLVED.
He gave a vague answer. In the future Apple is planning to scan our entire phone. People like you still don't understand and still think that Apple is a god, that whatever they do is perfect. I feel bad for you, brother.
Another part of this issue is the idea of who owns the content. Regardless of where it is stored. If the police need a warrant to search a safety deposit box at a bank, shouldn't Apple need a warrant before searching photos? The idea of fiduciary duty and trust. If someone is purposely posting items to a public location by all means search away. But when photos are privately being stored in the cloud it feels very invasive.
@@crusherman2001 And that is the issue. When I store paper files in a safety deposit box (1) the bank can't nosy through my stuff and (2) the bank has the responsibility of keeping my files secure. I still own the documents. For all Apple's talk of privacy this could be manipulated to be very big brother...
@@crusherman2001 If you have a reasonable expectation of privacy, they can't just go through your images to report them to the police. For example, if you pay to store your physical items at a storage place, they can't go through and search your stuff and report it to the police. Now, if they have a reason to think you are doing something illegal (smell of weed coming out, for example), they can report it to the police who will then need reasonable suspicion or a warrant to search your stuff. This proactively searching and reporting people to the authorities is not only a terrible invasion of privacy, but it's one that could create legal issues for innocent users.
With Dropbox, Google, MS, this is happening now. Apple wants to safeguard iCloud. That's why they came up with this (complex) solution to not having to watch all your photos. The plan is to encrypt all parts of iCloud in a way that Apple cannot read it. This solution is a counter-offer to the US and EU's intrusive laws in the works for "child protection" (as a scapegoat for sniffing through all our cloud data and communications).
7:18 This is FACTUALLY WRONG. First, only the database of HASHES of CSAM is stored on the device. By the nature of hashing, it’s IMPOSSIBLE to get the source image that produced a hash, meaning you CANNOT CHECK for yourself whether any given image will be classified as CSAM without trusting the US authorities’ database to include only relevant hashes (and not hashes of political or religious content). In fact, I believe even Apple has to trust the authorities to provide relevant hashes only. Second, at 7:30 Mr. Federighi didn’t mention that the system will launch in the US only, thus he could say that the database will be THE SAME. No, there exists no universe in which the Russian or Chinese government will allow iPhones to be shipped with magic hashes that were SECRETLY produced by the US authorities. I bet the first thing they will do, perhaps justly, is DEMAND either ACCESS to the US database or (more likely) use of their own databases for their citizens. And I can guarantee that those databases will match against WHATEVER doesn’t suit the government. Finally, here’s why all of this is an attack on our rights: there exists a way to bypass this mechanism by simply turning off iCloud Photos. In other words, while the pervs will lose the convenience of syncing CSAM across their devices, everyone else will be surveilled for no reason.
"In other words, while the pervs will lose the convenience of syncing CSam across their devices, everyone else will be surveilled for no reason." Lolololol! Nicely put.
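The one-way property the comment above describes can be sketched in a few lines of Python. This is an illustration only: SHA-256 stands in for Apple's NeuralHash (a perceptual hash, not a cryptographic one), and the database contents are made up. The point is that you can test membership against the list, but you cannot recover the images behind its entries, so you are forced to trust whoever compiled it.

```python
import hashlib

# A hypothetical on-device database: just opaque digests, no images.
hash_db = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Hash the image and check membership. The database entries
    never reveal which images produced them."""
    return hashlib.sha256(image_bytes).hexdigest() in hash_db

print(matches_database(b"known-bad-image-bytes"))  # True
print(matches_database(b"vacation-photo-bytes"))   # False
# There is no inverse of hexdigest(): given an entry in hash_db,
# you cannot reconstruct the image that produced it.
```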
Actually they aren’t scanning any files. They are creating an encrypted hash that is checked against their database of CP hashes. The way hashes work, they cannot be decoded, and the only way to identify them is to have a matching hash. Thus, the only data that is “revealed” in this process is CP data, which should be banned. However, this is not to say that I agree with what they are doing, or that I don’t recognize the potential of what this may become as it relates to privacy, but the fundamental feature actually doesn’t breach privacy unless the user uploads CP.
Hashing isn’t scanning. The whole point of hashing is to efficiently store and retrieve data without scanning. The hash does not know the contents of the file; it’s just a calculated number used during transport to check whether errors occurred (the checksum is calculated at the source and the destination) and whether the data needs to be sent again. Did you guys take a computer networking class or not?
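The checksum idea the comment above describes can be sketched like this. It's a toy example with made-up payloads, using CRC32 as the checksum; the sender attaches a checksum, the receiver recomputes it, and a mismatch means the data was corrupted in transit and should be resent:

```python
import zlib

def send(payload: bytes) -> tuple[bytes, int]:
    # Sender computes a CRC32 checksum over the payload and
    # transmits both.
    return payload, zlib.crc32(payload)

def receive(payload: bytes, checksum: int) -> bool:
    # Receiver recomputes the checksum over what arrived.
    # A mismatch means transmission errors: ask for a resend.
    return zlib.crc32(payload) == checksum

data, crc = send(b"hello")
print(receive(data, crc))       # True: arrived intact
print(receive(b"hellp", crc))   # False: corrupted in transit
```

Note that this kind of error-detecting checksum is a different animal from the perceptual hashes used for image matching; the comment's point is only that a hash is a number derived from the data, not the data itself.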
As a tech worker, I don't think we misunderstood you, Apple. You are still using MY iPhone's computational power to do something you want to do, without my consent (generating the hashes of photos, described by Craig as the "first half" of the process). I don't want to be a part of YOUR company's sense of social responsibility. I should have a say in whether you use MY phone to do anything. I am extremely disappointed with Apple (but thanks to WSJ for this interview). And as a 10-year loyal iPhone and Mac customer, I will reconsider whether I should use Apple products if this 'feature' goes live.
While I am against this, that claim is simply not true. They don't even scan images that you took yourself (apparently); only images that came from another source and are then uploaded to the cloud.
Child pornography is really bad; scan for it and alert the authorities. Stop child abuse. Authorities, please take action, especially when that person is very politically connected, like your policy makers.
I'm also a developer, but I kinda don't agree with him. Providing any information that can identify your data (which can in turn identify you through another process) is quite scary. That is not a backdoor YET, but it'll surely become one when someone finds a way to access it (and we aren't even talking about using it). This is very innovative, but also very dangerous (as a developer or a client) due to the potential of this kind of technology. On the good side, it's awesome tech that could help improve other research fields :)
@@datahearth1738 Apple should really implement proper zero trust here. The feature is OK, but it's a matter of who they trust. As a developer, I would never touch this kind of sensitive user data.
You can just opt out by deactivating iCloud Photos, and if that still isn’t enough for you, switch to Android, BlackBerry, build a OS yourself or pay someone else who you trust to do it for you or just use pigeons, or don’t ever install iOS 15!
And that's why companies hire likeable people to do their pr campaigns. And it worked. Hence why I own the stock. Apple could kill your kids, and you'd still buy the phones.
I appreciated that she ran this fiercely straightforward interview (more of an interrogation, really) for the good of every Apple device user. Thank You ✌🏼
So: 1) Don't use the cloud (Apple's or otherwise). 2) You can now blackmail people by getting access to their cloud account. 3) How does Apple or the legal authorities know that you uploaded it? 4) How does Apple know that it's working? If it's not, will the feature be discontinued?
Even if this feature remains harmless, the implications of future abuse of said feature are bad. Apple should stop this before it gets worse for their reputation.
Edward Snowden wrote earlier today: “Neither the message nor the messenger was a mistake. Apple dispatched its SVP-for-Software Ken doll to speak with the Journal not to protect the company's users, but to reassure the company's investors.”
They even made a cut when Federighi said “pornography” so the Apple campus won’t be in the background if it gets memed. Sneaky little weasels... I can see your tricks!
Though I was concerned about the privacy aspect at first, I think Craig explained it pretty well here, and I kind of get his viewpoint from a software development / iCloud-as-a-service standpoint. 1. *Without* cloud services, the device itself is secure and encrypted. 2. When you're using iCloud, as images are uploaded they perform a comparison against the reference CSAM image database on-device (which I guess is a trained neural network doing the flagging), and upload the result along with the actual photo. The actual photo never gets opened; only the jumbled CSAM neural-net encoding gets processed in the second half of the neural network in the cloud. Now the 2nd part may sound like a potential invasion of *on-device privacy*. Maybe. From my experience, even "secure" cloud services like MEGA routinely flag accounts for takedown when hashes of the files match copyrighted content. I think the major difference is in the way the actual cryptography is performed, as far as I understand. With most cloud providers, all the hashing / neural-network encoding is done on your files in the cloud. But in Apple's case, as Craig tells it, it's more like: hey, we don't want our servers to know this much about the original image, so we'd rather have part of the encoding performed on-device and then sent over to the cloud. This benefits both Apple and users, in that the cloud doesn't need to perform as much workload (maybe), and the cloud doesn't need access to the actual image in order to encode it. Put simply, I think the way they see it is: 1. on-device encoding is better, because they still can't access user data *while also complying with the law on cloud services, without being able to give the authorities the actual data*; 2. processing images on-device is considered *part of the uploading process*, rather than "scanning of all photos".
With recent iPhones sporting ever more advanced neural network accelerators, I think it's a logical step from a development standpoint to simply perform the encoding of images on-device (it's very efficient for the server, and the server never needs to go through the actual image data). So on-device security is not compromised; it's just that part of iCloud support is built into iOS, and that part is not used at all when you don't use the cloud services (i.e. Craig's "and you don't have to" around 2:34). It's the iCloud uploading process that does the scanning. And so iCloud is still secure "data-wise": nobody can read your data, *but* your cloud account can still get flagged for suspicious activity. Pretty neat engineering, but whether it's an invasion of privacy is up for debate. On-device still sounds pretty secure to me.
Completely missed the point then. It's not about CSAM, it's about putting scanning code onto devices, meaning in the future another update comes along to scan for "potential" terrorists, and so on. It's the concept of putting scanning code on devices in the first place that's the issue here.
@@DrumToTheBassWoop No, really, I understand people's concerns over the code being there on the device. But from an engineering perspective, iCloud is just another preinstalled piece of software, which is optional to use. Your perspective is that iCloud Photos is a core part of iOS and that scanning happens on every phone. To me, it's just another service that is optional and happens to be preinstalled, and it won't ever be used except by the exact action of uploading photos to iCloud. Kind of like: what's on your device is yours, but if you want to put stuff in this cloud service, then the service will scan before uploading. Like I said, it depends on the perspective. I see it as just another cloud service that happens to be preinstalled; you see it as a major intrusion into the OS itself. The only way I would consider this an invasion of on-device privacy is if the uploads/analyses were performed without disclosure. This announcement is mostly just them saying their cloud services *will* comply with the law.
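The "scanning only happens as a step of the upload path" argument can be illustrated with a toy sketch. This is not Apple's code: the class, hash database, and photo bytes are all hypothetical, and a plain SHA-256 stands in for NeuralHash. The matching logic lives entirely inside the upload routine, so with the service disabled it never runs:

```python
import hashlib

# Hypothetical on-device database of known-image digests.
HASH_DB = {hashlib.sha256(b"known-bad").hexdigest()}

class CloudPhotos:
    def __init__(self, enabled: bool):
        self.enabled = enabled
        self.uploaded: list[bytes] = []
        self.flags: list[bytes] = []

    def upload(self, photo: bytes) -> None:
        if not self.enabled:
            return  # service off: the matching code below never runs
        # Matching is performed only as a step of the upload pipeline.
        if hashlib.sha256(photo).hexdigest() in HASH_DB:
            self.flags.append(photo)
        self.uploaded.append(photo)

on = CloudPhotos(enabled=True)
off = CloudPhotos(enabled=False)
on.upload(b"known-bad")
off.upload(b"known-bad")
print(len(on.flags), len(off.flags))  # 1 0
```

Whether "the code only runs on upload" is reassuring is exactly the disagreement in this thread: the code is still shipped on every device, and only a policy decision keeps it gated behind the upload path.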
There's no confusion: you are using child pornography as an excuse to scan people's phones and literally spy on them. Who is Apple to decide they will police people? Drop Apple phones fast; this is not ok. They will use any wording to confuse or convince you this is ok. Watch Edward Snowden's video on this bs.
I don't get why the hashing has to be done on my phone. That could easily be done once the image is in iCloud (if you choose to use that service). Kind of scary that my phone is generating hashes to identify the files I store on my device. You don't need to think too hard to realize this can be weaponized by the government.
@@alexkobzin557 Yes, but not to identify my files by comparing their hashes to a government database. As I said, that can be weaponized if you swap the CSAM database for any database of content the government doesn't like. And no, I don't condone CSAM. My concern is about privacy, something Apple has been using over and over to sell their products.
Actually hashing is a security feature meant to protect the user. It means that your data isn't in plain sight: your photo from Utah might look like "hwhizb&37$;8€![€|*..." when stored on a server instead of the actual image, which is what you want in case their database is ever breached. On-device hashing happens for the same reasons.
@@CalienteFrijoles They have the hashes in the database; therefore they have both the hash itself and the file that generates that hash. Yes, Apple cannot see the content directly, but if the hash matches, the government can go to their database and check the file. Now apply this hashing to a database of content the government doesn't like and you have an easy way to identify people who are 'against' the system. This has the potential to become something similar to the social scoring system China already has.
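The "swap the database" worry raised above is easy to illustrate: the matching code is completely agnostic about what the database contains. A minimal sketch, with made-up file contents and SHA-256 standing in for a perceptual hash:

```python
import hashlib

def build_db(files: list[bytes]) -> set[str]:
    # Whoever controls this list controls what gets flagged;
    # nothing in the matching code is specific to CSAM.
    return {hashlib.sha256(f).hexdigest() for f in files}

def flagged(user_file: bytes, db: set[str]) -> bool:
    return hashlib.sha256(user_file).hexdigest() in db

csam_db = build_db([b"abuse-image"])           # the intended use
dissent_db = build_db([b"protest-flyer.pdf"])  # the feared repurposing

leaflet = b"protest-flyer.pdf"
print(flagged(leaflet, csam_db))     # False
print(flagged(leaflet, dissent_db))  # True: same code, different list
```

Since the database ships as opaque digests, a user auditing their device could not tell which of these two lists they were being matched against; that asymmetry is the crux of the commenter's concern.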
@@soylocomoco1162 If you don't have iCloud on, it incessantly pesters you throughout the day. They also made it very difficult to do physical backups of your phone the way you used to be able to.
Two questions this interviewer should have asked: 1. If I don't use iCloud, will this have any effect on me? 2. If there are only 29 CSAM images, will there be no alert for that account?
I wonder just how damaging to Apple this will become. Amazing how a company like this could not see past its own clever technology to the larger ramifications; it's almost like they are so busy showing how clever they are that they didn't stop to think! It's really amateurish and naive. And this guy is trying to dig Apple out of a hole, and failing.
It's their defence strategy against the full surveillance which the US and EU want. At least Apple can use it against the attacks that "Apple protects pedophiles" when they want to safeguard iCloud.
Here is the issue: the US government cannot just go around, go through each device, and report what is on there. So neither should Apple or any other private company, for that matter. This is a much bigger issue than it is being presented as.
Apple: "We're breaking into your house!" Everyone: "What??" Apple: "No, don't worry, we're only looking for bad things! We'll ignore everything else! We're not criminals!"
The real problem is that this technology could be used to identify protesters and activists in the name of “CSAM”. Any of your photos and their metadata might be read by Apple’s employees, because Apple can’t guarantee the accuracy of the neural network. My real concern is that law enforcement could take advantage of that, forcing Apple to access someone’s gallery to identify social activists, with Apple explaining it away as a technical defect.
The database they refer to for these images comes strictly from NCMEC. NCMEC does not have a database of protesters and activists. Governments do not have access to the verified CSAM images, and I repeat, CSAM images only, until the manual verification process, which only happens after repeated algorithmic matches.
Apple has blocked law enforcement before from getting private information off the phone of a literal terrorist. If any precedent was set, it's that Apple values your privacy, and these what-if scenarios will not happen until they do. And even if it does happen, Apple will be held accountable for it, as Craig mentioned.
@@frappes_ that's not quite the objection at hand. It's NCMEC today. Same approach could be applied to any other database. With regard to the precedent, access was denied to the United States. Courts are able to protect Apple corporation from arbitrary requests. End-to-end encryption was also an excuse used. It's a different ballpark when it comes to authoritarian regimes.
@@justshad937 I get that, and I share that concern, but based on Apple sources this will only laucnh at the US for now. I know it can become a slippery slope but I have faith in the precedent set by Apple in denying governments backdoors to their technology, and even if it does get that bad, the choice for consumers will be clear; do not buy Apple stuff anymore.
@@frappes_ That government could simply make it illegal for Apple to disclose this information, "in the name of national security", the same way US telcos didn't disclose NSA surveillance.
Craig Federighi looked highly uncomfortable during the interview and his explanations were a concerning mess. Thumbs up for the questions made by the reporter, they usually go much softer on them.