
Apple Has Begun Scanning Users Files EVEN WITH iCloud TURNED OFF 

Mental Outlaw
641K subscribers
189K views

In this video I discuss how several news outlets have reported that Apple will no longer pursue scanning people's iCloud accounts for CSAM, and a blog post that appears to show that Apple is indeed scanning local filesystems on macOS without users' consent (iCloud and analytics turned off).
sneak.berlin/20230115/macos-s...
₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿
Monero
45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436
Bitcoin
3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV
Ethereum
0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079
Litecoin
MBfrxLJMuw26hbVi2MjCVDFkkExz8rYvUF
Dash
Xh9PXPEy5RoLJgFDGYCDjrbXdjshMaYerz
Zcash
t1aWtU5SBpxuUWBSwDKy4gTkT2T1ZwtFvrr
Chainlink
0x0f7f21D267d2C9dbae17fd8c20012eFEA3678F14
Bitcoin Cash
qz2st00dtu9e79zrq5wshsgaxsjw299n7c69th8ryp
Ethereum Classic
0xeA641e59913960f578ad39A6B4d02051A5556BfC
USD Coin
0x0B045f743A693b225630862a3464B52fefE79FdB
Subscribe to my YouTube channel goo.gl/9U10Wz
and be sure to click that notification bell so you know when new videos are released.

Science

Published: 23 Jan 2023

Comments: 2.2K
@paull1248 · 1 year ago
Every time I hear "Think about the children", it makes me sick to my stomach. I'm sure a trillion-dollar company with ridiculous margins, which exploits people in third-world countries to make production costs even lower, truly cares deeply about children. The fact that they're trying to encroach on your privacy is bad enough; hiding behind children while doing so is disgusting.
@O1OO1O1 · 1 year ago
"that's for using children" - Trinity, Matrix Resurrections
@powerdude_dk · 1 year ago
Solid point there
@christiangonzalez6945 · 1 year ago
Another thing that makes me sick to my stomach: "are you hiding something?" Yeah, then put 24/7 cameras in your bedroom, since you aren't hiding anything.
@Mustachioed_Mollusk · 1 year ago
We’re going to need you to stop posting opposing opinions because what if some children read what you said? Be responsible.
@ichigo_nyanko · 1 year ago
I can totally see this being adapted (quietly) to scan and detect copyrighted material and snitching to the police whenever it is found.
@jer1776 · 1 year ago
Yep, as well as memes that promote "hate".
@RaaynML · 1 year ago
Anyone can have a different definition of freedom
@xXx_Regulus_xXx · 1 year ago
that's definitely one of the primary goals, they just use the unobjectionable goal as the one to get their foot in the door. "we're gonna make sure there's no CP on your computer, which we know you don't have anon so don't worry about it. And while we're in there we're gonna check for pirated anime, maybe see if that netflix login in your keyring was borrowed from anybody. But you're not some TOS breaking freak, so you have nothing to worry about right bro?"
@zeppie_ · 1 year ago
I think it’s more likely that it straight up deletes those files from your hard drive and when you try to open it again you get some annoying as fucc notification like “we couldn’t find your files :(“
@youreyesarebleeding1368 · 1 year ago
@@jer1776 These heckin trolls on 4chan spreading H8 sp33ch memes are currently the BIGGEST THREAT TO OUR DeMoCrAcY!!!! We HAVE to scan your personal files, it's to protect Democracy! and the children! and minorities! and immigrants!!!
@kinomoto3633 · 1 year ago
one step closer to the day when having "offline storage" will be a suspicious, borderline criminal thing to have
@BlazeEst · 1 year ago
Bruh 💀
@swish6143 · 1 year ago
Same as cash.
@leandrogastonlovera8909 · 1 year ago
You can be jailed for failing to decrypt a hard-drive or file.
@SuperMassiveMax · 1 year ago
Time to switch to SD cards and USB drives with Veracrypt-encryption.
@NihongoWakannai · 1 year ago
@@leandrogastonlovera8909 well yeah no shit, if they have a warrant then they have the right to demand you show it to them. The difference is that they need a court ordered warrant to do that, it's not a private company looking at your files whenever they want.
@DarkMetaOFFICIAL · 1 year ago
Trusting Apple with your privacy is like having Epstein as the family babysitter
@anglepsycho · 1 year ago
Or Lena Dunham as the homeschool teacher.
@DarkMetaOFFICIAL · 1 year ago
@@anglepsycho lmao
@diablo.the.cheater · 1 year ago
Or trusting me with running a government.
@JayRagon · 1 year ago
@@diablo.the.cheater that's honestly not bad enough to compare to trusting apple with privacy
@Yorkshire42069 · 1 year ago
What do you do to get around it?
@SergioLeonardoCornejo · 1 year ago
Protecting children is always the excuse to impose authoritarian measures.
@ilearncode7365 · 1 year ago
The same people that think a bad sequence of pixel values is the worst thing ever because they care about children so much, are the same people that support a woman’s right to kill their own child while in the womb.
@HisCarlnessI · 1 year ago
We both thinking about gun rights, and statistically crazy rare events?
@flamestoyershadowkill6400 · 1 year ago
Or minorities
@UninspiredUsername40 · 1 year ago
Or terrorism
@Akab · 1 year ago
@@HisCarlnessI a bit of regulation like a background check should always happen(i mean it is already anyways in most states) but guns should definitely not be banned or even over regulated, yes. And yeah, "think of the children" is really such an old excuse, you would think people would've realized that farce by now but they still don't 🙄
@bremsberg · 1 year ago
Really miss the times when we had more hardware choices rather than between "don't be evil" trash bin and "think different" trash bin.
@nevabeensmart · 1 year ago
Monopoly is more than a kids game...
@Arimodu · 1 year ago
It's not really the hardware so much as the pre-packaged software. If someone took a recent Samsung, slapped a decent ROM on it and re-sold it, you would have your good choice, but that person would also have a BIG FAT LAWSUIT in their lap the next day.
@villagernumber77 · 1 year ago
You forgot the where do you want to go today rubbish bin
@TheOfficalAndI · 1 year ago
And the first trashbin dropped the whole not being evil thing.
@vallisdaemonumofficial · 1 year ago
@@Arimodu this is why it should be as easy to build a phone as it is a PC. Seriously, fuck a lotta this hardware that's glued together.
@mad_vegan · 1 year ago
Soon we'll have "personalized ads" based on the types of files we have on our devices.
@Dave102693 · 1 year ago
Google and MS already do this
@riftsquid7659 · 1 year ago
You’re already getting ads based on every single thing that comes out of your mouth. 99% of your apps have baked in microphone access.
@Dave102693 · 1 year ago
@@riftsquid7659 exactly
@Hackanhacker · 1 year ago
@@Dave102693 they aren't on my device and it shows xD Mostly I have no ads anywhere, and when one or two slip through they have nothing related to me xD it's pretty hard to get rid of those systems though (phone)
@Hypnotically_Caucasian · 1 year ago
company: “wOnT sOmEbOdY pLeAsE tHiNk oF tHe cHiLdRen?!?” Also company: * uses child labor in India to make $1,000 phones in horrific conditions *
@decoy3418 · 1 year ago
It didn't happen in America so it's not real.
@sorryi6685 · 1 year ago
There is definitely no child labour in India making iPhones. Child labour exists in India, but in the unorganised sector and in the interior of India, definitely not in iPhones. That would be a PR nightmare. Why take the risk when there is a huge adult population that will work for a very low wage?
@avyam7509 · 1 year ago
It's china.
@justaweeb14688 · 1 year ago
It’s impossible not to have human rights violations somewhere in the manufacturing process. Fairphone tried and failed.
@Schopenhauer69 · 1 year ago
China maybe. Child labor is a crime in India.
@turtleswithbombs · 1 year ago
Brb, going to go break into people's houses to make sure they're not abusing children
@soda3185 · 1 year ago
The hero we need but not the one we deserve /s
@users4007 · 1 year ago
@you will see it I suspect this to be rick roll
@whitelily2942 · 1 year ago
@@users4007 no it’s spam
@idiotontheweb · 1 year ago
@TruthfulIy thx bro I needed it
@namesurname4666 · 1 year ago
during your trip you want some cupcakes?
@kevina.4036 · 1 year ago
"We know what is best for you. Think of the children". Atrocities have been committed for less.
@wrongthinker843 · 1 year ago
"Think of the children" - a megacorporation that exploits child labor
@YouKnowMeDuh · 1 year ago
Less? You can drop a lot of words: "We know what's best." Yep, it's happened.
@paulosullivan3472 · 1 year ago
I think it's safe to assume all tech companies are doing this with the full support of the governments.
@Micchi- · 1 year ago
Always has been *gunshot
@-_Somebody_ · 1 year ago
@@Micchi- I see what you did there
@warlockpaladin2261 · 1 year ago
Apple's relationship with China adds an interesting wrinkle to this fabric.
@benni1015 · 1 year ago
If you look at what the EU commission does right now i feel like many governments want it to go much much further.
@Angelarius82 · 1 year ago
One thing about this that no one seems to have considered is this: Whatever this software is, it will only ever be able to make suggestions about what the photo "might" be. Before any police or authorities are involved, a real human will need to look at the suggestions that the AI has given. If you have private photos of yourself on your computer, that might be enough to trigger a suggestion, and next thing you know some stranger is looking at it because an AI said it "might" be something.
@Axman6 · 1 year ago
The technology they were proposing to use was specifically not AI based, but Mental Outlaw just made assumptions because he’s too lazy to do any research at all. There’s not a single piece of evidence that anything here has anything to do with CSAM - go read the original blog post and see if you can find anything. IIRC, they were even going to require several hits against known CSAM before any action at all would be taken.
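The "several hits" design this comment describes is, at its core, thresholded set membership: nothing is flagged on a single match against the database of known image hashes. A toy sketch of that scheme (the hash strings and threshold below are invented for illustration; real systems use perceptual hashes, and Apple's published design reportedly required on the order of 30 matches):

```python
# Toy sketch of threshold-based matching against a database of known
# image hashes: a single hit does nothing; only reaching the threshold
# of distinct matches triggers any action. All values are made up.
KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}
THRESHOLD = 3  # hypothetical; far lower than any real deployment

def count_matches(image_hashes):
    """Count how many of a device's image hashes appear in the database."""
    return len(set(image_hashes) & KNOWN_HASHES)

def should_flag(image_hashes):
    """Flag only when the number of matches reaches the threshold."""
    return count_matches(image_hashes) >= THRESHOLD

print(should_flag(["a1b2", "zzzz"]))                  # False: one hit
print(should_flag(["a1b2", "c3d4", "e5f6", "zzzz"]))  # True: three hits
```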
@nonormies · 1 year ago
@@Axman6 do corporate boots actually taste like apples?
@Axman6 · 1 year ago
@@Channel-gz9hm I’m not a fan of any of this, I’m glad Apple reversed the decision, but just making up facts without evidence and getting mad is pathetic and delusional. This is like QAnon levels of leaps of logic. Paid shill, I wish; I’m just interested in verifiable facts instead of sensationalism with zero evidence.
@anglepsycho · 1 year ago
It probably will be both AI and human error in the way since, currently, AI has to mess up and blend in with brush tools to make a form of a match like DALLE.
@thesenamesaretaken · 1 year ago
@@Axman6 it's a Qanon level leap to logic to think tech companies that datamine customers and that collude with governments might do both at the same time, wew
@Blood-PawWerewolf · 1 year ago
I called it the moment they “quietly” abandoned the CSAM BS last year. We all knew it would be snuck in without our knowledge.
@21N13 · 1 year ago
Do you have any evidence that this has been snuck in? The process that this YouTuber, who earns money off this video, mentions without going into detail while basing the entire video on it, has existed on macOS for over a decade. You can now select text in images, images are scanned for subjects, and Visual Look Up fetches related information over the web. Both you and the YouTuber earning money off this clickbait video could've done one web search and discovered everything you could possibly want to know about how the process works. Instead you're spreading fake news and FUD here.
@timewave02012 · 1 year ago
Apple's original whitepaper described the scanning as host-side, using a form of homomorphic encryption to avoid revealing the perceptual hashes needed to craft preimage or collision attacks (think: attackers being able to craft false positive images). I don't support it in any way, but it should be no surprise, considering this is exactly how the system was originally designed to work.
@GareWorks · 1 year ago
Yeah, I think a lot of us had a pretty good idea of what they were really up to.
@RD-eh3tz · 1 year ago
@@timewave02012 typical Apple being homomorphic, when will they learn to tolerate minorities.
@janpapaj4373 · 1 year ago
HOW THE FUCK DID APPLE TRAIN THE DETECTION MODEL
@Bloom_HD · 1 year ago
3 letter agencies have huge databases of cp. That's where it comes from whenever one of their whistleblowers wakes up to find 18TB of it uploaded onto his HDD and the local police "anonymously" tipped.
@alifahran8033 · 1 year ago
The elites' private collection. A lot of people were regulars on Epstein's island.
@o-hogameplay185 · 1 year ago
it is actually not that difficult. they just hash every image, and see if it matches hashes of known csam
@hihihihi3806 · 1 year ago
@@Bloom_HD the agencies r the real pedos
@Bloom_HD · 1 year ago
@@o-hogameplay185 take the image, slightly alter it by manipulating brightness by 0.1% or adding a single off-color pixel into the corner. And boom, new hash. That's not what they do.
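The one-pixel objection above is the avalanche property of cryptographic hashes, and it is exactly why these systems use perceptual hashes instead. A toy comparison on a fake 16-pixel image (the naive "average hash" below is an illustration, not Apple's NeuralHash):

```python
import hashlib

# Fake 16-pixel grayscale "image" (values 0-255) and a copy with one
# pixel brightened by 1, mimicking the tiny-edit scenario above.
image = [200] * 8 + [10] * 8
tweaked = [201] + image[1:]

def crypto_hash(pixels):
    """Cryptographic hash: a single changed byte flips ~half the output."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """Naive perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image mean. Small edits usually leave it unchanged."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

print(crypto_hash(image) == crypto_hash(tweaked))    # False
print(average_hash(image) == average_hash(tweaked))  # True
```

So "add one off-color pixel" defeats a checksum, but not a perceptual fingerprint; the real debate is over how robust and how collision-prone perceptual hashes are.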
@anthonyeid6534 · 1 year ago
TLDR: mediaanalysisd is a process that has been around for years; its purpose is to send neural hashes to Apple and get back information on what those hashes mean, such as whether there's a cat in a photo, a painting, etc. It can be disabled by turning off Siri Suggestions under System Settings > Siri & Spotlight. Note that disabling mediaanalysisd will turn off Visual Look Up.

mediaanalysisd is part of Visual Look Up (VLU), a system Apple uses to recognize objects in photos (buildings, animals, paintings and more) and give users information about the photo. VLU works by computing a neural hash locally that distinguishes photos/videos by the objects within them. During VLU, Apple receives the neural hash, computes what that hash represents, and sends the result back to the user. I'm assuming the database used to compute the hash's meaning is too massive to be used locally and wouldn't be able to stay updated with new information.

I really didn't like the article this video is based on, because it makes claims without any real evidence or research. mediaanalysisd has been around for years, and it's very easy to look up what it's used for and how to disable it. The author is close to a conspiracy theorist, in my opinion. Anyway, a much more in-depth read on this topic can be found here: eclecticlight.co/2023/01/18/is-apple-checking-images-we-view-in-the-finder/
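The client/server split described in that comment (a compact hash computed locally, only the hash leaving the device, the server answering with what it represents) can be mocked in a few lines. This is a sketch of the flow only, not Apple's protocol; the fingerprint function and the label table are stand-ins:

```python
import hashlib

# Pretend server-side table mapping fingerprints to labels ("lives at Apple").
SERVER_DB = {}

def fingerprint(photo_bytes):
    """Stand-in for a locally computed neural hash: a fixed-size digest."""
    return hashlib.sha256(photo_bytes).hexdigest()[:16]

def server_lookup(fp):
    """Server side: sees only the fingerprint, never the photo itself."""
    return SERVER_DB.get(fp, "unknown")

# Seed the fake server with one known image.
cat_photo = b"cat-pixels"
SERVER_DB[fingerprint(cat_photo)] = "cat"

# Client side: only the 16-character fingerprint crosses the network.
print(server_lookup(fingerprint(cat_photo)))      # cat
print(server_lookup(fingerprint(b"dog-pixels")))  # unknown
```

The privacy question the thread is arguing about is what such a fingerprint reveals, not whether the raw photo is uploaded.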
@mapl3mage · 1 year ago
literally the only person who bothered to look up and search what the program actually does. no one else bothered. to be fair, the program name is suspicious and enough to ring alarm bells for anyone who believes the government is out to get them, which is literally the target audience of this channel.
@bcj842 · 1 year ago
So if I turn that off I won’t be able to grab people out of a photo and make a clipart of them?
@poweron3654 · 1 year ago
Should be much higher up.
@Me-eb3wv · 1 year ago
Interesting
@js32096 · 1 year ago
Sad that I had to scroll this far, to see your comment. I recently subscribed to Mental Outlaw and if he doesn't address this misinformation, I'll stop trusting his channel. Takes minutes to debunk. Mental Outlaw should be doing this due diligence before disseminating to his audience.
@4.0.4 · 1 year ago
Using children as human shields for surveillance technology is in line with using them in sweatshops to make the iPhones 👍
@nigeltheoutlaw · 1 year ago
I'm really grateful to guys like you and Upper Echelon calling attention to problems like this that the mainstream is in full, unhesitating support of. This is some truly frightening, dystopian crap, and it's eternally disappointing how unintelligent the average NPC is that they lack any and all pattern recognition to realize "protect the children" is never the goal.
@ProteinFeen · 1 year ago
Thanks for the shoutout to upper echelon bouta check him out❤
@gamtax · 1 year ago
@@ProteinFeen That guy went from a gaming channel to a commentary channel. Worth checking out.
@ProteinFeen · 1 year ago
@@gamtax yeah I just subscribed to him, I really like the cyber security info channels especially this one. Keeps things fresh.
@maxia8302 · 1 year ago
While I like UEG, he sometimes gets stuff wrong. Problem is, we don‘t track false predictions. Remember his discussion that Musk would never buy Twitter or that SBF would be Epsteined?
@RT-qd8yl · 1 year ago
That's why we need to round up the NPCs and put them all in a prison camp.
@cubusek5849 · 1 year ago
Maybe they should also detect photos of you repairing your Apple device, so they can send the secret anti-right-to-repair police
@canardchronique3477 · 1 year ago
Are you under the impression that they aren't currently doing exactly that? On the remote chance they aren't yet, maybe you shouldn't give them any ideas...
@Hackanhacker · 1 year ago
at some point apple is gonna make transformers for real damn it
@Network126 · 1 year ago
@@Hackanhacker Lol how old are you?
@KyzoVR · 4 months ago
@@Network126 i am 4
@beardalaxy · 1 year ago
There was the story from i think a year or 2 ago where Google flagged a guy who had sent a text of a rash on his child's genitals to their doctor, and he got visited by the cops and everything. Had a really bad time with it. I can't imagine the stress from that happening with everyone thinking you actually had CSAM on you. That's messed up.
@janmaker227 · 1 year ago
This!!! Had a similar situation, and you know what the craziest thing is? As a parent, you go crazy thinking about how many people get to look at it who are not you or your doctor!
@beardalaxy · 1 year ago
@@xCDF-pt8kj right, it's impossible for an AI to know intent unless you very clearly spell it out to them on a case-by-case basis, and sometimes not even then.
@tangentfox4677 · 1 year ago
The true highlight of that particular story is how even after proving innocence and getting everything sorted out, Google upheld banning him from their platforms for CP. Which means this guy can never work at any company that uses Google products, can't access any of his data that Google held, can't use Android, can't access his email, can't ever use any tool built on a Google login. He is effectively locked out of a significant portion of society simply because an automated system doesn't understand context matters.
@evil_radfem9162 · 1 year ago
so, the scanner worked
@beardalaxy · 1 year ago
@@evil_radfem9162 it did, but it lacks context and thus fucks over someone
@Reaya · 1 year ago
The people who store this type of data aren't stupid enough to put it in the cloud, and I'm sure Apple is aware of this. The only reason this was even considered is that they want to collect even more information from their users under the guise of protecting the children.
@YouAreStillNotablaze · 1 year ago
They actually really are that stupid and the thing is so many do this out in the open it's astronomical and LEAs can't even keep up.
@NewWarrior21st · 1 year ago
I actually cracked up when you were talking about Apple using the children pretext and then it cut to EDP 😆
@Dave102693 · 1 year ago
Facts
@arghpee · 1 year ago
>petite woman nudes false flagged by Apple Bot
>sent to Apple HQ
all part of the plan.
@appalachiabrauchfrau · 1 year ago
tfw 4ft11 womanlet with babyface realizing Apple has turned me into a weapon, gg.
@arghpee · 1 year ago
@@appalachiabrauchfrau mfw 6'5 and own an Android. 🗿
@BlazeEst · 1 year ago
Just deleted nsfw content of petite aged women because this video got me paranoid; in the future there are gonna be laws against being attracted to youthful petite women
@synexiasaturnds727yearsago7
@@arghpee damn, should've trolled big bro whatever, you're putting the "big bro" in the first sentence anyways
@anglepsycho · 1 year ago
Lmao, I'm concerned for East Asian middle-aged women now, better get that surgery if you don't want anything bad done to your chest online.
@0xsupersane920 · 1 year ago
The worst part is that they advertise: "Privacy, that's iPhone" and then do shxt like this. They have the money to advertise to get the most reach possible, like for major sporting events, awards shows and all over the internet, which makes this problem even worse.
@paracelsus407 · 1 year ago
If Apple wanted to protect children, they'd build it into a Camera app, rather than search files on the device. All this does is make it easier to frame innocent people.
@brandonw1604 · 1 year ago
They don’t search on device. This article is 100% wrong.
@xinfinity4756 · 1 year ago
@@brandonw1604 could you provide a reputable article demonstrating or at least explaining how/ why it doesn't?
@brandonw1604 · 1 year ago
@@xXGeth270Xx but they’re not scanning anything so there’s that.
@Axman6 · 1 year ago
Having it in the camera makes no sense, this was supposed to detect *known* CSAM material, which is derived from a database of image hashes of material collected by law enforcement agencies. It isn’t capable of detecting new material being generated, which is a massively more difficult problem and one significantly more prone to false positives.
@brandonw1604 · 1 year ago
@@xinfinity4756 the other part is common sense. The scanning was done server side on iCloud. They’re not going to use a process calling home that you can block with an application firewall like Little Snitch.
@stephaneduhamel7706 · 1 year ago
How would they even train an AI to recognize this kind of images? I can't imagine any ethical or legal way of getting training data.
@sprtwlf9314 · 1 year ago
Ethical and legal aren't considerations for the global elite.
@sampletext9426 · 1 year ago
@Happy Hippie Hose why is it ok for the government to have the largest collection of see pea, but not us?
@masterloquendo0 · 1 year ago
@@sampletext9426 nigga what?
@ali-1000 · 1 year ago
@@sampletext9426are you saying that you should have the right to store and view CSAM material 😟🤨
@sampletext9426 · 1 year ago
@@ali-1000 absolutely not, but we both can agree that our leaders can store and keep all they want
@VBYTP · 1 year ago
One more reason I'm really glad I didn't buy an Apple device this Christmas. IMO if you want to protect children, start putting in real punishments for child abusers and leave good normal people alone.
@notafbihoneypot8487 · 1 year ago
It's like getting rid of encryption, it would only hurt everyone. Even if it's for a good cause. I don't trust that it wouldn't be abused on a massive level
@pluto8404 · 1 year ago
That would be the logical thing to do. But as we have seen in the gun debate or war on drugs, it is about controlling the normal people, they dont care about the criminals.
@ryderostby · 1 year ago
It's always the safety excuse, always, when it comes to taking away your privacy, because 'the less control you have the better'. Yeah, I feel even better giving my data to the biggest data whores on this fucking planet
@m0-m0597 · 1 year ago
come on guys, who really needs privacy? Let them just look through you. All will be fine. Also, isn't it just annoying being constantly concerned about stuff like that? Enjoy life and stop thinking :-)
@SergioLeonardoCornejo · 1 year ago
Children are the excuse because they know that appeals to the feelings of people
@matthewsjardine · 1 year ago
Apple wasn't happy with the old adage: "The cloud is just someone else's computer...". They decided to take it one step further, 'your computer is just someone else's computer'.
@MunyuShizumi · 1 year ago
A moment of silence for all the conversations where our disapproval will immediately be rebuked with "So you're defending child predators?" followed by incoherent screeching noises.
@DsiakMondala · 1 year ago
reeeeEEEE
@PsRohrbaugh · 1 year ago
There are thousands of people who had charges dropped when they proved that a virus or hacker downloaded the illegal images, not them. There are thousands in jail right now who claim to be innocent but were unable to prove it. That should terrify you.
@PsRohrbaugh · 1 year ago
@@bacon222 No. I can't post links here, but search things like "State Worker's Child Porn Charges Dropped; Virus Blamed" "Child porn downloaded by mistake" "Computer porn hacker is making our lives a misery" There used to be a lot more examples easily available through Google, but I've spent 20 minutes searching and can't find any. Hmm... 🤔
@fatalityin1 · 1 year ago
@@bacon222 It is a common "hack", first used by 4chan during their Scientology raids: they scanned for WLANs with drive-by laptops and then downloaded illegal material onto the Scientology routers' storage. That software is now a lot more sophisticated; some viruses even deliberately download illegal images from a government honeypot into a hidden folder in your system directory, making them impossible to find, while you can expect an "FBI, open up" within a week. The shocking thing is: the more versatile you are with PCs, the more likely you are to get sentenced, because in theory you should have been able to prevent it. People working in IT in my country already fight this kind of legislative move; just because you work in IT doesn't mean you are omnipotent, and even the accusation is enough to make you unemployed forever
@Dave102693 · 1 year ago
Idk why people don't think that this happens more often than not.
@fayenotfaye · 1 year ago
“There are thousands in jail right now who claim to be innocent but we’re unable to prove it” Source: it came to them in a dream
@Memelord18 · 1 year ago
source?
@BrandonCurington1 · 1 year ago
People, this is what we have been talking about. The future. We need to stop it while we still can. (Remember; you will own nothing, and you will be happy) I do not want to live in a world like this.
@totallynotsarcastic7392 · 1 year ago
there is no political way to stop it
@appalachiabrauchfrau · 1 year ago
I'm just going back to disposable cameras. Got a roll of film developed and it felt like opening a present.
@MOTH_moff · 1 year ago
Print all your photos and keep them in a book. I'm only half joking.
@BrandonCurington1 · 1 year ago
@@MOTH_moff yeah thanks im gonna print out my messages and mail them mf
@Tathanic · 1 year ago
say you don't like apple, "your icloud" suddenly finds 100TB of CP they found while scanning
@sampletext9426 · 1 year ago
its scary that sea pe has become a weapon that can incriminate people. why aren't the citizens worried?
@25thDaveWalker · 1 year ago
I love it that there's a reply under this comment but I can't view it, because YouTube hid it from me. What a time to be alive
@audigamer8261 · 1 year ago
@@25thDaveWalker same
@warlockpaladin2261 · 1 year ago
I too have noticed that the comment count is frequently off on YT.
@fayenotfaye · 1 year ago
@@25thDaveWalker it’s spam. There’s no proof as of yet that YouTube shadow bans people from comments, they sometimes do it from the recommended feed but not from comments.
@by9798 · 1 year ago
You're my favorite YouTuber bro! Always alerting me of important tech news without stressing me out and making me rage. Easy listening even when the matter is serious or infringing.
@ZucchiniCzar · 1 year ago
It's always in the name of "security".
@B1anticabal999 · 1 year ago
Or "officer safety "
@XXLuigiMario · 1 year ago
Just not *your* security
@SimGunther · 1 year ago
More reason to use the BSDs, Linux, Haiku, and TempleOS
@nigeltheoutlaw · 1 year ago
RIP Terry
@frog7362 · 1 year ago
RIP brother terry
@hereticanthem5652 · 1 year ago
BDSM as well
@steelfox1448 · 1 year ago
RIP Terry
@RandyHanley · 1 year ago
So true! These crooked companies are no different than crooked politicians, trying to sell it as something for the good of the people.
@mario7501 · 1 year ago
There's a reason the government always calls laws that infringe on privacy something like "kid's online protection act". It's a very sinister game
@raj18x · 1 month ago
Agreed
@gamergaminggmod · 1 year ago
I love Apple: pay 3 times more for a laptop that's overpriced af, and have your privacy violated, because "think of the children". I remember those classic Simpsons episodes where the Reverend's wife screams "Won't somebody, please, think of the children!"
@GSFigure · 1 year ago
Madness is about to flourish.
@notafbihoneypot8487 · 1 year ago
Don't worry. Give me your Phone number and I'll send You secure link to protect your DATA. SPONSORED BY NORDVPN
@MaxDankOG · 1 year ago
@Vetrussuper based
@totallynotsarcastic7392 · 1 year ago
"about to"?
@big_red_machine3547 · 1 year ago
And AI is about to magnify what’s already so wrong about this sick society
@captainsmirk9218 · 1 year ago
A LOT of people put their intellectual property in iCloud: ideas, business plans, unpublished book scripts, trade strategies, engineering designs, personal and bank information, etc. It's only a matter of time before this is stolen by a bad actor given this level of access "for the children" (whether a hacker or a contractor paid to look through files). I've been an iPhone supporter since the iPhone 2G; I am now getting rid of ALL my Apple products. The access isn't just images and videos.
@Haise-san · 1 year ago
Me too, fuck that company
@bcj842 · 1 year ago
Don’t throw your stuff out unless you plan on buying some next-level privacy shizz to replace it. Throwing out your Apple device over a headline just to go out and buy something from Xiaomi or Google is just trading one prison cell for another.
@MrTonyBarzini · 1 year ago
@@bcj842 is pinephone viable?
@bcj842 · 1 year ago
@@MrTonyBarzini Sadly, this is about where my expertise ends… I know there’s more privacy-oriented devices out there, but what I don’t know is which ones are solid and which ones to avoid. I don’t have the technical background to make that call.
@raphaelcardoso7927 · 1 year ago
@Kougami where in their user agreement?
@Icee47
@Icee47 Год назад
Bro I found your channel yesterday, and I’m binging all your vids man. So good! I’ll start becoming more private as well…
@h.e.pennypacker4728
@h.e.pennypacker4728 Год назад
This could be the best youtube channel for any topic, definitely for anything tech related
@veirant5004
@veirant5004 Год назад
The joke is that you can be imprisoned for a decade just for clicking on a "download" button somewhere on the Internet. I don't even give a cr*p about the content of what is being downloaded. Jail for saving a picture from the 'net. It's the f*cking end. And yeah, hey to the freedomest state of America ❤️.
@JamesWilson01
@JamesWilson01 Год назад
The sad thing is that most Apple users are so brainwashed that they either don't care or think this kind of thing is a great idea 😬
@forbidden-cyrillic-handle
@forbidden-cyrillic-handle Год назад
Good for them. They get exactly what they want. The last good Apple for me was Apple ][.
@nitebreak
@nitebreak Год назад
@@forbidden-cyrillic-handle I have an apple phone and I like it well enough because of some of the features but this makes me nervous. I have music on my iTunes that was not all purchased and I wonder if they are gonna start dmca on private files…
@localvoid69420
@localvoid69420 Год назад
@@nitebreak same, hope they don't dmca me cuz I don't wanna get in trouble just because I was listening to music that's not available on itunes
@ArdivKmen
@ArdivKmen Год назад
@@forbidden-cyrillic-handle Every single phone manufacturer copies Apple in one way or another, so what makes you think it will only be Apple that does stuff like this? No matter how much we complain, all phones will get this eventually. You see what kind of wiretaps people install in their homes; all they have to say is "Think of the children" and suddenly disabling that feature makes you a suspected child molester.
@Skullet
@Skullet Год назад
No, the sad thing here is making gross generalisations about people based on the products they buy.
@bradhaines3142
@bradhaines3142 Год назад
this is going to be a great way for apple to destroy people's lives for nothing. and even worse, this could end up a boy-who-cried-wolf issue, with all the false positives making people ignore it when there's a legit positive.
@gorofujita5767
@gorofujita5767 Год назад
This has less to do with scanning for copyrighted material, and more to do with spying on (or even falsely incriminating, when really needed) political or ideological opponents marked by the NSA. Also, in extreme situations, think of what they could do with undesirable counter-establishment speakers. If they can scan your files, they can also make something "bad" mysteriously appear on your phone during police custody. There's a world of possibilities to this, and I wish more people thought about the implications seriously.
@geeshta
@geeshta Год назад
So if Google automatically makes collections out of my photos like "Animals", "Food", "Nature" etc. does it mean it scans my photos as well?
@PvtAnonymous
@PvtAnonymous Год назад
is this a rhetorical question?
@tmacman0418
@tmacman0418 Год назад
Yes, all your photos are fed into its AI, including the faces of everyone you took a picture with, so they can track you and send ads to you more effectively.
@totallynotsarcastic7392
@totallynotsarcastic7392 Год назад
do you really need to ask?
@DarthChrisB
@DarthChrisB Год назад
No, the computer just happens to know what's in the photo without looking at it...
@filiphabek271
@filiphabek271 Год назад
For me it doesn't happen. Which google product do you use?
@xE92vD
@xE92vD Год назад
I wouldn't be surprised if iPhone users still continued using their spyware filled devices after witnessing this.
@ygx6
@ygx6 Год назад
@Vetrus cuz they isleep when people state facts
@fanban2926
@fanban2926 Год назад
Same with Samsungs tho
@digi3218
@digi3218 Год назад
witnessing what? Sent on iPhone
@ygx6
@ygx6 Год назад
@@fanban2926 duh, google devices but apple just exposed themselves for scanning local images and forwarding them to their servers, google hasn't _yet_
@staidey5994
@staidey5994 Год назад
@@fanban2926 it's a completely different thing when the company doesn't actively advertise privacy features. Afaik, neither Google nor Samsung nor any other Android device manufacturer has ever advertised their phones as being private; Apple, however, had been touting the privacy features of their phones until they recently got exposed for their phones not being as private as they've claimed this entire time.
@bedazzledmisery6969
@bedazzledmisery6969 Год назад
My prediction is this is gonna really bring back old school film photography and developing in a dark room to prevent images uploaded to any types of clouds.
@hmr1122
@hmr1122 Год назад
I wouldn't be surprised if windows already has some type of file scanning in the works.
@by9798
@by9798 Год назад
Onedrive randomly sends me notifications about how it made various photo albums for me every few months and it freaks me out a little bit. Even if it's just looking at timestamps in metadata it's like WTF I did not ask for this.
@anglepsycho
@anglepsycho Год назад
@@by9798 I forgot they do that-
@ra2enjoyer708
@ra2enjoyer708 Год назад
They did play with the idea of showing you ads right in the file manager, so it's not so far off. Especially since Windows normalizes having superuser privileges at all times, so any program can send and receive arbitrary data over the internet.
@DsiakMondala
@DsiakMondala Год назад
@FLN 764 Trop kek. There is no such thing as deleting something you uploaded to the internet, new friend. It is there forever.
@DsiakMondala
@DsiakMondala Год назад
@FLN 764 You... think you get to choose what goes on your windows installation? A-are you t-that new? OwO'
@ProteinFeen
@ProteinFeen Год назад
Just found your account and man you are an awesome creator. I’m surprised I haven’t seen your videos before. I know nothing about half the stuff you talk about but you got a like and sun from me!❤
@ProteinFeen
@ProteinFeen Год назад
Sub*
@MentalOutlaw
@MentalOutlaw Год назад
Thanks, glad you enjoy the videos
@mattgamei5vods649
@mattgamei5vods649 Год назад
@@MentalOutlaw when do you launch your onlyfans?
@ProteinFeen
@ProteinFeen Год назад
@@MentalOutlaw great stuff watching this on my iPhone 14 kms😂😂
@ProteinFeen
@ProteinFeen Год назад
@TruthfulIy someone has to clean the toilets everyone can’t just play in the computer
@aguywithaytusername
@aguywithaytusername Год назад
wait a minute. if it's an ai, how did they train it?
@upcomingweeb136
@upcomingweeb136 Год назад
Good question
@Dratchev241
@Dratchev241 Год назад
by feeding it tons of images that would get us tossed in a nice hotel with grey bars and doors.
@Ryfinius
@Ryfinius Год назад
Watch the SNL skit where Dwayne Johnson trained his evil robot.
@RogueA.I.
@RogueA.I. Год назад
You know how…
@IaMaPh1991
@IaMaPh1991 Год назад
The glowies have literal terabytes of material in their possession, which they very likely fap to when they aren't arresting and prosecuting innocent citizens for "evidence" they likely planted in the first place. Of course they are willing to share it with a powerful corporation. It's certainly not illegal when THEY possess and distribute it amongst one another...
@kotzpenner
@kotzpenner Год назад
Corporations and Governments try not to spy on their customers/citizens (IMPOSSIBLE, GONE SEXUAL)
@daverei1211
@daverei1211 Год назад
You've got to wonder about the potential misuse of this, where bad actors use adware or drive-by malware to drop "suspicious" images, then try to extort you by showing that the file is there and will be caught by this scanning, and that by paying them 1 BTC they will "protect" you...
@smolbirb4
@smolbirb4 Год назад
I love your content, especially the more Linux/privacy focused stuff, and even more so when you cover Gentoo. Hardly anyone talks about Gentoo.
@Wampa842
@Wampa842 Год назад
They can't talk until they're done compiling.
@notafbihoneypot8487
@notafbihoneypot8487 Год назад
Gentoo takes so long to compile.
@notafbihoneypot8487
@notafbihoneypot8487 Год назад
@@Wampa842 😂😂😂
@smolbirb4
@smolbirb4 Год назад
@@Wampa842 true
@MillywiggZ
@MillywiggZ Год назад
Adobe is doing the same with everyone’s Photoshop, Illustrator, etc. files. But that’s to train their A.I. art program.
@Spaghetti742
@Spaghetti742 Год назад
That's actually a decent idea, if it were consent-based. I mean, think about all of the people who use Photoshop/Illustrator. But there needs to be a way to opt out.
@Memelord18
@Memelord18 Год назад
@Spaghetti8696 I think it should be opt in for copyright reasons
@users4007
@users4007 Год назад
damn, if I ever use photoshop I’m def gonna pirate an old version then
@Spaghetti742
@Spaghetti742 Год назад
@@Memelord18 True, didn't think about that
@nothing_
@nothing_ Год назад
Can't they just scrape the internet like everyone else?
@MisterPancake778
@MisterPancake778 Год назад
Me keeping memes on my various cloud accounts to make the FBI agents look at thousands of memes while they work
@BastianInukChristensen
@BastianInukChristensen Год назад
Both Google and MS already do CSAM scanning, but for the time being I only know of it on their cloud services.
@IAmAlpharius14
@IAmAlpharius14 Год назад
Whenever I use any cloud platform, I treat it as if all my files are publicly visible and just encrypt everything I upload to it using gpg.
@rejvaik00
@rejvaik00 Год назад
Can you teach me this?
@alexandervowles3518
@alexandervowles3518 Год назад
@@rejvaik00 you can just use 7zip to lock your files if you're lazy
@jamhamtime1878
@jamhamtime1878 Год назад
Yeah, but are there better methods that integrate more easily into any OS? I'm currently just using gpg: simple on Linux, easy to install and use on Windows, but very annoying to use on my phone when I do need it. Another comment said 7zip; maybe that would be more convenient? I never knew 7zip could even lock files with passwords, I'll definitely try it later. Are there any more convenient methods for Linux+Windows+Android? And honestly, gpg on Android is not THAT inconvenient.
@Slugbunny
@Slugbunny Год назад
Can practically see the privacy being dusted out of that ecosystem.
@o-hogameplay185
@o-hogameplay185 Год назад
Louis Rossmann made a video about a man who was asked by his son's doctor to send pictures of his son's injuries so he could get the correct medicine faster. Google scanned the photos and alerted the police because it mistook them for CSAM. The worst part is that, if I remember correctly, before talking to the police a human went through those photos, CONFIRMING they were CSAM... so not only will AI scan your photos (or files), but some Karens will too.
@Justin-vq9co
@Justin-vq9co Год назад
thank you for making videos about this stuff. Scary!
@c-LAW
@c-LAW Год назад
Just the attorney fees defending against an accusation is thousands of dollars if not tens of thousands.
@kevinmiller5467
@kevinmiller5467 Год назад
Don't worry investor, it doesn't cost Apple a dime.
@Basieeee
@Basieeee Год назад
I bet they will just scan for known checksums of csam files, but it could obviously be extended to anything they want to track.
@Bloom_HD
@Bloom_HD Год назад
Yes. Give them the benefit of the doubt. Always. Apple is your friend. /s
@homuraakemi9556
@homuraakemi9556 Год назад
Scanning the checksum wouldn't work since any alteration to the photo would change the checksum and that wouldn't catch any new CSAM either
@pluto8404
@pluto8404 Год назад
I hear Tim Cook liked "partying" on a famous island in the US Virgin Islands. I am sure he truly cares about this cause.
@ygx6
@ygx6 Год назад
They check for fuzzy hashes, not 1:1 checksums
@ygx6
@ygx6 Год назад
@@homuraakemi9556 they check for fuzzy hashes, not the exact checksum so cropping and editing won't do much but yes, new csam images would slide by the check
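The checksum-vs-fuzzy-hash distinction in this thread can be shown with a toy sketch. This is a hypothetical, pure-Python average-hash over a fake 4x4 grayscale "image"; real systems (e.g. PhotoDNA or NeuralHash) are far more elaborate, but the principle is the same: a cryptographic digest flips completely on any edit, while a perceptual hash barely moves.

```python
import hashlib

def average_hash(pixels):
    """Tiny perceptual hash: 1 bit per pixel, set when brighter than the mean."""
    avg = sum(pixels) / len(pixels)
    return tuple(int(p > avg) for p in pixels)

def hamming(a, b):
    """Number of differing bits between two bit-tuples."""
    return sum(x != y for x, y in zip(a, b))

# a fake 4x4 grayscale "image" and a slightly brightened copy of it
img = [10, 200, 30, 220, 15, 210, 25, 230, 12, 205, 28, 215, 11, 208, 27, 225]
edited = [p + 3 for p in img]

# a cryptographic checksum changes completely on any edit...
print(hashlib.sha256(bytes(img)).digest() == hashlib.sha256(bytes(edited)).digest())  # False

# ...while the perceptual hash stays (near-)identical
print(hamming(average_hash(img), average_hash(edited)))  # 0
```

This is why cropping or re-encoding defeats a plain checksum list but not a fuzzy hash, as the comment above notes.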
@DoublesC
@DoublesC Год назад
9:15 Apple is a honeypot for people who want privacy but don't understand technology
@OblateSpheroid
@OblateSpheroid Год назад
Thank you for your work.
@LarsLarsen77
@LarsLarsen77 Год назад
One of MANY reasons why I've never owned an apple product.
@c-LAW
@c-LAW Год назад
7:46 "Little Snitch" is a great little utility. I've used it on Mac for many years.
@warlockpaladin2261
@warlockpaladin2261 Год назад
Explain?
@c-LAW
@c-LAW Год назад
@@warlockpaladin2261 It's an outbound app firewall
@sirflimflam
@sirflimflam Год назад
mediaanalysisd is related to visual lookup. I have my own qualms with the inability to disable VSL but it's not inherently related to the whole CSAM thing. it's also been around forever in one form or another.
@buddylee6203
@buddylee6203 Год назад
I'm kinda tired of xmrig and lominer being reported as viruses
@11alekon
@11alekon Год назад
I wonder what would happen to weapon artists for games, because we have tons and tons of images/videos/documents of all sorts of weapons, even of ourselves holding the gun to understand how to use it. Apple is going to go mental over it in a strict gun country like the UK.
@ra2enjoyer708
@ra2enjoyer708 Год назад
Just get a friendly visit by a glowie every now and then. And also don't forget to pay an attorney each time.
@mimikyu_
@mimikyu_ Год назад
as an artist I didn't even think about this. I've been learning to draw weapons recently, and I know my phone already scans images and guesses what objects are in them. Like, it made a folder for my cats full of all the cat photos on my phone without me doing anything. So it can easily see me having weapon images and think I'm some sort of criminal.
@floppa9415
@floppa9415 Год назад
That explains why battery life is going down the shitter.
@spencer3752
@spencer3752 Год назад
Unfortunately, the author fails to establish that `mediaanalysisd` is actually scanning their files and transmitting any data about those files to Apple's APIs in a way that is inconsistent with their policies. In fact, quick research suggests this has already been investigated, with conclusions to the contrary. The only thing the author proves is that a network connection was attempted, which can be for any purpose, including purposes for which the user has provided consent. For example, Apple will OCR images with text automatically, a capability which is powered by mediaanalysisd and is available irrespective of whether images are stored in iCloud or not...
@LukeLane1984
@LukeLane1984 Год назад
I wonder, how do they train an AI to detect such content? Don't they need a shit ton of training data in the form of images and videos to do that?
@sampletext9426
@sampletext9426 Год назад
@@SookaNooka isn't it funny how citizens find no issue with their government keeping all that kid data?
@Jarczenko
@Jarczenko Год назад
@@sampletext9426 yeah like they care
@Dave102693
@Dave102693 Год назад
@@sampletext9426 the FBI has dark-web and normal-web repositories of CSAM.
@Memelord18
@Memelord18 Год назад
@Sample Text there are 2 choices for the government when they seize such content. 1. Delete it forever. 2. Keep the content and use it to make locking up creeps easier. I will not give my personal opinion on which I think is best, just putting that out.
@ScorgRus
@ScorgRus Год назад
@@Memelord18 are gov servers leak proof?
@ozymandias_yt
@ozymandias_yt Год назад
Two things first: 1. The following criticism isn't supposed to defend Apple; it's just about getting a broader perspective. 2. The approach of analysing personal data is alarming, which makes the main point of this video absolutely valid. BUT I think there are a few things wrong here.
1. "mediaanalysisd" is not new. It has been around for some time and is used for many different things, one of them being Spotlight. Everything typed into Spotlight is sent to Apple servers for analysis, because there is a web search feature in this program as well. Even without iCloud, I can imagine the intelligent file search is still turned on, which lets the user search for stuff and have macOS find local pictures with the corresponding thing in them. I see no evidence that Jeffrey Paul's pictures are sent to Apple or that they are scanned for CSAM content.
2. Apple uses hashes for comparison with the NCMEC database. This means there is no AI analysing photos the way some people seem to imagine. Theoretically it is even supposed to "check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures" (The Verge). That last claim is questionable, because the algorithm apparently cannot guarantee perfect avoidance of false positives. Still, this entire concept can't be compared with full image recognition software, even though it is scary.
3. Another thing so many people got wrong: Apple never said they will stop working on their CSAM scanning software. They just stated that they will push the release further into the future because further development is needed. This means we definitely still need to keep an eye on this.
4. There are at least a few ways of transferring photos from iOS devices to the Mac without the Photos app. The Finder, which is the file manager, is indeed one of them.
@nousquest
@nousquest Год назад
The fact that there are closed-source media analysis tools running in the background, which have to contact Apple's servers to work, makes the suspicion valid given their track record.
@Mantikal
@Mantikal Год назад
Man, these guys are really helping to make UNIX & Linux distros more popular
@LokiScarletWasHere
@LokiScarletWasHere Год назад
Client-side scanning is exactly what they advertised they'd do, before claiming they're rolling it back. What's changed is they're doing it even with iCloud turned off.
@ussenterprise3156
@ussenterprise3156 Год назад
When your family are Apple fans but you are not
@goofylookincat5028
@goofylookincat5028 Год назад
I can’t remember where I heard it, but even if image scanning isn’t actually involved, Apple will at least scan the hashes of the files and see if they match their database. This is still bad, because when this technology gets to governments, they can scan for ANY kind of file, such as a picture of Xi Jinping photoshopped onto Winnie the Pooh.
@ra2enjoyer708
@ra2enjoyer708 Год назад
> image scanning isn’t actually involved > scan the hashes of the files What do you think is the process of getting a hash of the file, lol?
@Fixer_Su3ana
@Fixer_Su3ana Год назад
What do you mean photoshopped? He already looks that way.
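The hash-database lookup described in this thread can be sketched in a few lines. The digest in the set below is purely illustrative (this is not Apple's actual database or API); the sketch also shows why an exact-checksum list only catches byte-for-byte copies of known files.

```python
import hashlib

# hypothetical database of digests of known flagged files (illustrative only)
known_digests = {hashlib.sha256(b"known flagged file").hexdigest()}

def matches_database(file_bytes: bytes) -> bool:
    """Exact-match check: hash the file and look the digest up in the set."""
    return hashlib.sha256(file_bytes).hexdigest() in known_digests

print(matches_database(b"known flagged file"))   # True: byte-for-byte copy
print(matches_database(b"known flagged file."))  # False: a single added byte
```

The worry raised above is that nothing in this mechanism is specific to CSAM: any digest (a dissident meme, a leaked document) could be added to the set.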
@Spumoon
@Spumoon Год назад
Cleaned out my temp files on Windows the other day and my bing search before and after yielded different results. Maybe I'm just a hecking n00b, but that came as quite a surprise to me.
@benni1015
@benni1015 Год назад
Since barely anybody seems to know: in the EU, the EU Commission is trying to implement something similar to CSAM scanning, but instead of files in cloud storage it targets our private communication. With client-side scanning, they want an AI to read through your messages, scan your photos etc. before they are encrypted and sent to the recipient.
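The order of operations that makes client-side scanning controversial can be sketched as follows. Everything here is hypothetical: a stand-in blocklist and a toy encrypt callback, just to show that content is inspected before encryption ever happens, so end-to-end encryption no longer protects it.

```python
import hashlib

# hypothetical blocklist of digests (the value is purely illustrative)
BLOCKLIST = {hashlib.sha256(b"forbidden-image-bytes").hexdigest()}

def scan_then_encrypt(attachment: bytes, encrypt):
    """The contested design: content is inspected BEFORE it is encrypted."""
    if hashlib.sha256(attachment).hexdigest() in BLOCKLIST:
        return None  # flagged client-side; never encrypted or sent
    return encrypt(attachment)

# toy "encryption" stand-in so the sketch is runnable
print(scan_then_encrypt(b"holiday photo", lambda b: b[::-1]))          # b'otohp yadiloh'
print(scan_then_encrypt(b"forbidden-image-bytes", lambda b: b[::-1]))  # None
```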
@andre-le-bone-aparte
@andre-le-bone-aparte Год назад
Question: Can you do a monthly news update of these kinds of topics? - This was helpful for us who use Unix / Linux but have to interact with MacOS / WinOS for work.
@RegenerationOfficial
@RegenerationOfficial Год назад
While discussing founding a company that installs private home servers and rents out leftover space, we ran into the issue of distinguishing between legitimate childhood photos and abusive material. Like flirting, there is a fine line towards denying it ever was what it seemed to be.
@tp6335
@tp6335 Год назад
What interests me the most is: if Apple is training a closed-source AI in house to look for CSAM, they are bound to have a training set in their possession, or provided to them. Is the existence of such a training set not highly immoral? What if there is a bad actor involved somewhere and a leak occurs?
@nsfeliz7825
@nsfeliz7825 Год назад
you're right, someone somewhere is being paid to possess child pron for the purpose of training AI. The mere possession of CP is indeed illegal.
@pyromcr
@pyromcr Год назад
Some engineer at Apple is just looking for an excuse to jack off all day and came up with this project.
@SuperTort0ise
@SuperTort0ise Год назад
9:24 "What happens on your iPhone, stays on your iPhone." Until your iPhone tells us what happened on it.
@yevoidstar
@yevoidstar Год назад
This video either needs to be corrected or outright deleted, since the network connections made by mediaanalysisd were never shown to actually report the results of scanning images for "CSAM". And it's coming out now that this was an outright bug... Nice video though.
@klausb.7505
@klausb.7505 Год назад
YES !!! To everything you said! Sad enough though, thx 4 spreading the news!
@Jennn
@Jennn Год назад
Thank you for explaining this so well to us. It really is spooky! I understand the good that comes of it but can also understand how this can be exploited. Wild world, wild days these are.
@ra2enjoyer708
@ra2enjoyer708 Год назад
What exactly is good about it? You know file-hash comparison only works for known files, right? So it won't stop ongoing abuse or prevent future abuse, and it can only work as long as the FBI keeps collecting terabytes of Epstein files ASAP, continuously.
@Mustachioed_Mollusk
@Mustachioed_Mollusk Год назад
This HAS to be a major breach in privacy. What information is going to be stolen using the, “Think of the children” excuse?
@alexander1989x
@alexander1989x Год назад
It's a very thin line between "scanning for crime" and "scanning for criticism, dissent, journalism and negativity".
@warlockpaladin2261
@warlockpaladin2261 Год назад
Apple's relationship with China adds an interesting wrinkle to this fabric.
@kxuydhj
@kxuydhj Год назад
when this controversy first showed up i thought "good, if there's anything i hate more than children it's child predators", but the road to hell is paved with good intentions and this really is a great example. me getting pissed off at targeted ads might also have had something to do with my switch in attitude, but whatevs.
@aldrickfondracul9297
@aldrickfondracul9297 Год назад
It's times like this that I'm glad I've stuck with dumbphones for so long, even though I am pretty much fighting progress at this point. I seriously dread the day I finally get a smartphone.
@moki5796
@moki5796 Год назад
You should post an update to the Jeffrey Paul story. This "issue" has been resolved now; turns out it was a bug where the service would send empty requests whenever a media file was previewed. No information about the files was ever transmitted without consent.
@Nightcaat
@Nightcaat Год назад
Normally I find your videos informative and entertaining (sure they have filler, but they're nice to listen to), so I'm disappointed in you for not checking this further. This article is fear-mongering, and the comments are eating it up thanks to their confirmation bias against Apple. mediaanalysisd is related to Spotlight and has been a part of macOS since long before Apple was interested in CSAM detection, used for things like face, text, and object recognition.
@pqsk
@pqsk Год назад
MS has been spying on Windows users for decades. I learned about this in a Windows administration class. Up to Windows XP there was a way to block it, but since Vista, if you block it, Windows no longer works.
@perrywood3839
@perrywood3839 Год назад
Would love a video on how to setup something like Little Snitch/Glasswire/open source alternatives for those of us still using OSX/Windows to try and cull stuff like this
@WhiteGirlHeaven
@WhiteGirlHeaven Год назад
Little Snitch is a firewall application that monitors and controls outbound internet traffic (paid, proprietary, Mac).
@PatrickVogelius
@PatrickVogelius Год назад
This claim has already been debunked as false. Are you going to upload a video to clarify this like Louis Rossmann did?
@alouisschafer7212
@alouisschafer7212 Год назад
By the way, as of today I am one of your kind. My desktop now runs Arch Linux (Garuda to be precise, I mean what else did you expect from a normie like me) and I'm more than happy with how it all works. I even got a game to run pretty well without a Win10 VM, something I believed to be near impossible until the Steam Deck came out; now Linux compatibility is literally built into the world's biggest game vendor and launcher. Slowly becoming a based, ghost-to-the-feds, redpilled FOSS user by the day. Now I need to convert my laptop and do something about my shitty Android... thinking about a Google Pixel + GrapheneOS. Should be doable for me. Never buying an Apple, that's for sure, not with headlines like "oh by the way, we scan all pictures on your phone without your consent and when you are offline :)" hell no Apple!
@laterskater6341
@laterskater6341 Год назад
Another one filtered
@rejvaik00
@rejvaik00 Год назад
Can you please teach me this?
@BrainEatPenguin
@BrainEatPenguin Год назад
@@rejvaik00 mental outlaw has videos on some of it
@TheSuperBoyProject
@TheSuperBoyProject Год назад
@Happy Hippie Hose if you're using manjaro you should look back
@synexiasaturnds727yearsago7
@Happy Hippie Hose Don't. Use what OP is using, Midjaro is unsecure af
@GuideZer0
@GuideZer0 Год назад
I wish I could tell companies, "I don't care about the specificity of your advertising. The violation of privacy and commodification of personal information for the purpose of targeted advertising is creepy to me and I don't want you to try to sell me things on that basis!!"
@revco196
@revco196 Год назад
So is it csam scanning or scanning photos to identify people to search in your library?
@jaytrip420
@jaytrip420 Год назад
Every day I see more and more reasons why my switch to android based devices was more than necessary
@georgelsgomes9634
@georgelsgomes9634 Год назад
as far as I remember, Apple's NeuralHash white paper says it doesn't actually require parsing the image contents; it's much like image dedupe. From a given file you compute a hash and then compare its value against the CSAM database: if the distance to a known hash is below a threshold, it's flagged as CSAM, otherwise not. sorry my english 😅
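The distance-threshold comparison described above can be sketched as a toy model. Real perceptual hashes are much longer bit strings and the threshold is tuned by the vendor; the 8-bit tuples and threshold of 2 below are purely illustrative.

```python
def hamming(a, b):
    """Number of differing bits between two bit-tuples."""
    return sum(x != y for x, y in zip(a, b))

def is_match(candidate, database, threshold=2):
    """Flag only when the candidate is within `threshold` bits of a known hash."""
    return any(hamming(candidate, known) <= threshold for known in database)

db = [(0, 1, 0, 1, 1, 0, 1, 0)]  # stand-in for the known-hash database
print(is_match((0, 1, 0, 1, 1, 0, 1, 1), db))  # True: only 1 bit differs
print(is_match((1, 0, 1, 0, 0, 1, 0, 1), db))  # False: far from every known hash
```

This is also where the false-positive worry discussed in this comment section lives: a looser threshold catches more edited copies but also flags more innocent images.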