
Are You Feeding a Powerful Facial Recognition Algorithm? 

NOVA PBS Official
937K subscribers
14K views

Facial recognition technology has great potential to help law enforcement identify suspects. But collecting and storing data from online photos has raised concern among critics.
Clearview AI, which has more than three billion faces in its database, is the largest-known facial recognition database in the U.S. And because it continuously gathers data from open-source internet pages (news, mugshot, and social media sites, and even "private" platforms like the money transfer app Venmo), Clearview AI's database is always growing.
Supporters argue that services like Clearview AI were essential to help identify (and ultimately charge) more than 400 of the January 6 Capitol rioters, many of whom were found through “digital breadcrumbs” like photos, location data, and surveillance footage. While traces of online data can be leveraged to help law enforcement investigate suspects, artificial intelligence (AI) software, particularly facial recognition technology, has a darker side.
"Artificial intelligence has the veneer of being objective," says Janai Nelson, Associate Director-Counsel at the NAACP. "We have been very concerned about the inputs into these systems that often produce racially-discriminatory results." Typically trained on a majority of white faces, AI often incorrectly identifies people, particularly those of color, and therefore shouldn’t serve as the only means of identifying a suspect, former FBI agent Doug Kouns says. Privacy breaches are also a concern, critics say.
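The misidentification pattern Kouns describes can be illustrated with a toy sketch. Everything below is invented for illustration (the names, the one-dimensional "embeddings", the noise value); the idea is that a feature extractor trained mostly on one group tends to spread that group's faces far apart in embedding space while compressing the under-represented group together, so identical capture noise produces far more false matches for the latter.

```python
# Toy gallery: each enrolled identity has a 1-D "embedding" (invented numbers).
# Pretend the feature extractor was trained mostly on group A, so it spreads
# A identities far apart but compresses group B identities into a narrow
# range (it never learned the features that distinguish them).
gallery = {
    "A-alice": 0.0, "A-amir": 2.0, "A-ana": 4.0,      # well separated
    "B-bo":    10.0, "B-bina": 10.2, "B-badu": 10.4,  # compressed together
}

def identify(probe_embedding):
    """Return the gallery identity nearest to the probe embedding."""
    return min(gallery, key=lambda name: abs(gallery[name] - probe_embedding))

# A probe photo never lands exactly on its enrolled embedding; apply the
# same fixed capture noise (+0.3) to everyone and see who gets confused.
noise = 0.3
results = {name: identify(emb + noise) for name, emb in gallery.items()}
wrong = [name for name, pred in results.items() if pred != name]
print("misidentified:", wrong)  # → misidentified: ['B-bo', 'B-bina']
```

With identical noise for every probe, all three group-A identities are matched correctly while two of the three group-B identities are confused with each other, because the model left them too close together to tell apart.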
Canada has already outlawed Clearview AI, stating it poses a violation of privacy rights, and has ordered the app to remove Canadian faces from its database. Now, its use is also being challenged in Illinois and California.
Are the benefits of facial recognition technology worth the general public's loss of privacy, and possibly even control?
PRODUCTION CREDITS:
Produced by:
Emily Zendt
Production assistance:
Christina Monnen
Amanda Willis
© WGBH Educational Foundation 2021

Science

Published: 21 Aug 2024

Comments: 48
@LisaBeergutHolst · 3 years ago
"Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should."
@joesmith1142 · 2 years ago
I've always believed that the end of humanity won't come from another country's weapons arsenal but from the blundering of a scientist.
@LisaBeergutHolst · 2 years ago
@joesmith1142 Yet here you are, watching a science program lol
@frogz · 3 years ago
"we do what we must; because we can"
@SolaceEasy · 3 years ago
Fool 'em! Don't paste on the dull, complacent to corporate bullying face anymore. Show Some Emotion. Put Expression In Your Life. Light Up When You're Feeling Happy! When You're Sad, Just Let Those Tears Flow Down.
@SolaceEasy · 3 years ago
Acts of humanity confuse the machines.
@ooee8088 · 3 years ago
So much for the right to refuse to identify yourself.
@derekspitz9225 · 2 years ago
'Artificial intelligence' is an oxymoron, and the perception that AI is somehow an infallible super-technology is really concerning. As for facial recognition? In the UK, where we are essentially living in a surveillance state, the kind of facial recognition that Clearview thinks is OK raises serious questions about individual freedom and privacy. Once cash disappears (which it will) and social credit (which is already here) becomes ubiquitous, the Orwellian nightmare will be complete. Dark days ahead.
@ronkirk5099 · 3 years ago
I don't post ANY photos and I keep my computer camera taped over. Paranoia? Maybe, but it doesn't mean they're not out to get you.
@kdavis4910 · 3 years ago
I don't post photos either. The only camera is my phone. I flip it off regularly just so they know, I know. Better paranoid than blindsided.
@aaronreeves5880 · 3 years ago
@kdavis4910 Is it truly paranoia if it is happening, or could happen? I think it's called being prepared, and that sounds better than labeling it as paranoia.
@frogz · 3 years ago
I hate to say it, but I could make you 100x more paranoid in less than 5 minutes of digging through the internet for traces of what you've done and said under all associated names and identities, and probably even your real identity. Sometimes you only need to submit a comment on a random webpage to be linked to a real identity.
@jdraga1263 · 7 months ago
They loaded driver's licenses and state ID photos into the system, along with military IDs and school IDs. At some retail places you shop, if you have a photo ID on your membership card, that's also loaded into Clearview AI. I know because I saw the system and what is being loaded into it. The company called ID.me also loads to Clearview AI.
@Red-Magic · 3 years ago
Thing is, with the racial involvement, race *is* a factor when it comes to identifying people. It's part of how it would tell you apart from someone else. I do wish people would stop making such a big deal about segregation in stuff like this; a more appropriate word for these kinds of applications is just categorizing. Everyone is different from everyone else (except, as far as faces are concerned, identical twins lol). That said, I'm wary of having a system that invades so much of your personal life. China has already been doing this at a whole other level for a while. I see the pros, I see the cons, and can't really pick a side. It will probably be fine as long as it stays with law enforcement *only.* I don't want to see China's level of surveillance anywhere near me or my communities. That's stepping leagues over the line.
@christopherlindeman4210 · 2 years ago
Race is one of many factors to consider when eliminating bias in machine learning algorithms. In fact, an algorithm often amplifies, or at least perpetuates, existing structural racism, in that it is trained that 'correct' means the status quo. Example: I live in a historically impoverished area of my city. I just moved here because it happens to be very close to my work, and the area is currently enjoying a bit of a renaissance (call it gentrification if you will). A bank's loan-processing algorithm will use this against me, regardless of my personal qualifications, because it has been correct historically in determining that borrowers in this area often default on loans. As people of similar race tend to live near one another, this produces outcomes that are strongly biased. It's exactly why redlining voting districts is not appropriate. Data scientists are responsible for eliminating biases that come from correlations with protected classes (race, age, disability status, etc.). If I can't show that my algorithm isn't discriminatory in one of these ways in certain tasks (like loan/insurance approval), it's a no-go. Doesn't matter if it's 99.99% accurate and will save the company tens of millions: it's a no-go. Illegal. Correlation does not always imply causation, yet most easy-to-explain machine learning algorithms are taught to use all available information, and sometimes neighborhood has a great correlation to loan repayment. The real argument isn't about using race to identify a person from a visual cue. Sure, race is an easy way to eliminate a lot of other potential suspects: 'she was white' immediately excludes all POC from consideration. But the argument is more nuanced than that.
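The proxy effect described in the comment above can be sketched in a few lines. All of the data here is invented, and "north"/"south" stand in for any neighborhood feature correlated with a protected class: a rule that never sees the protected attribute can still produce a 100%-vs-0% approval gap, because the neighborhood feature carries the same information.

```python
# Invented toy data: the model's only input is neighborhood, never group.
applicants = [
    # (id, protected_group, neighborhood, individually_creditworthy)
    ("p1", "majority", "north", True),
    ("p2", "majority", "north", True),
    ("p3", "majority", "north", False),
    ("p4", "minority", "south", True),
    ("p5", "minority", "south", True),
    ("p6", "minority", "south", False),
]

# Rule distilled from historical repayment data: approve only the
# neighborhood whose past borrowers mostly repaid. The protected
# attribute never appears in the rule...
def approve(neighborhood):
    return neighborhood == "north"

# ...yet because group and neighborhood are perfectly correlated in this
# toy data, outcomes split exactly along group lines.
def approval_rate(group):
    rows = [a for a in applicants if a[1] == group]
    return sum(approve(a[2]) for a in rows) / len(rows)

print(approval_rate("majority"), approval_rate("minority"))  # → 1.0 0.0

# Individually creditworthy applicants in the "wrong" neighborhood
# (p4, p5) are still denied, exactly as the comment describes.
denied_but_worthy = [a[0] for a in applicants if a[3] and not approve(a[2])]
print(denied_but_worthy)  # → ['p4', 'p5']
```

This is the core of proxy discrimination: removing the protected attribute from the inputs does nothing if another input is strongly correlated with it.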
@governmentgaslighting5232 · 3 years ago
They can now pinpoint you whenever you go for a dump in the toilet! They call it 'faecal recognition'.
@governmentgaslighting5232 · 3 years ago
@MEDIA THING Probably. They will be able to analyse it and tell if you have been eating red meat, sugar or too much fat. Frightening.
@gerriehrmantraut5914 · 3 years ago
It has to be because of the way people are now!
@LindaCasey · 3 years ago
They're welcome to recognize MY face anytime because I live my life honorably, but I'm sure glad they're able to utilize this technology in cases where people DON'T behave.
@RoxanneM- · 3 years ago
Linda, really? And a woman! Naïveté is the most dangerous psychological reaction we have as humans. Is this because you are not aware of the abuses occurring right now with ALL the new digital technologies? It is not the police, or the good police I should add, using this. To begin with, where this data is stored is NOT safe. Don't be so naive; it's not good, caring father figures who are using this technology to come and help you. Normalcy, and fear of stepping out of the cozy safe world, is as bad as being paranoid all the time. Grow up. You are projecting that need for your father to come and protect you. Don't you see who is storing this, and how unsafe this technology is? It is not the police. Come and see what my neighbors do with this technology... it will open your eyes.
@Me-ou5tc · 3 years ago
Dear, the tone of your statement is a bit naive. Google the cheerleader mom who manipulated high school videos and images to get her daughter's teammates in trouble. Software is so sophisticated that experts can't always tell anymore what's real and what's fake! Also, Facebook/social media can LEGALLY use ANY image posted on their site however THEY CHOOSE, as of 2019! They can use your face to make money or to make a political statement you may not agree with... And let's not forget this is done through AI and software that is now used by law enforcement. What if the technology is WRONG and an innocent person, who works so hard to live their life RIGHT, is falsely accused? Lots of people in jail today are proven innocent through the DNA project every day. People who lived their lives right too, and were wrongfully convicted, perhaps based on the technology that was available at that time. For your sake and our children's and grandchildren's sake... this story scares me! Be aware and be careful. Your face, or my grandchildren's faces, could be front and center for a political event you may not agree with... Or your/their image could be portrayed as a villain or a "Karen", or... they could be so darn cute, their face is everywhere with no compensation from corporate America.
@kdavis4910 · 3 years ago
Dear God. I really hope you're not serious?
@m.x. · 3 years ago
Your naivety is so cute.
@KyleHohn · 3 years ago
Thanks Linda! Very cool
@bearg4019 · 3 years ago
Good, we need tech like this... CCTV has been used forever in the U.K., so we need this because of the world of lunatics today. It's not about race... we need this. Your phone already tracks everything 😩😫😪 you do... we need this now, so stop crying 😢
@amxaas4450 · 1 year ago
It helps to have a brain...