
How I'm fighting bias in algorithms | Joy Buolamwini 

TED
25M subscribers
298K views

MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.
TEDTalks is a daily video podcast of the best talks and performances from the TED Conference, where the world's leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design -- plus science, business, global issues, the arts and much more.
Find closed captions and translated subtitles in many languages at www.ted.com/translate
Follow TED news on Twitter: / tednews
Like TED on Facebook: / ted
Subscribe to our channel: / tedtalksdirector

Science

Published: 28 Mar 2017

Comments: 654
@Nick-kb2jc · 7 years ago
I can see that lots of the people in the comments have no clue how machine learning works...
@HarikrishnanR95 · 7 years ago
Nick, just like the lady in the video, who has no clue how facial recognition works
@thebrochu1983 · 6 years ago
Red
@fakeapplestore4710 · 6 years ago
"MIT grad student Joy Buolamwini was working with facial analysis software "
@ckilr01 · 4 years ago
Do you understand how bias works? Bias is an unconscious preference. It's related to the mirror principle, where we are unconsciously attracted to those like us and repelled by things we unconsciously dislike. She is saying she is fighting your unconscious choices, in other words forcing your choices not to be based in racism or hate.
@SuperMsmystery · 3 years ago
@HarikrishnanR95 She has a PhD, but continue mansplaining. How about you publish your reasoning?
@MarkArandjus · 7 years ago
Okay, let me break it down for you folks real simple-like: if a webcam finds it difficult to detect, for example, dark skin tones, then the functionality of the webcam is biased against dark-skinned users, because it performs poorly due to their appearance. She's not saying this is a result of racism on the part of programmers, or that webcams are racist; it's just an unfortunate by-product of the technology, and she's working to correct it. Facial recognition is a powerful tool with wide application from privacy to security to entertainment; this isn't some SJW nonsense. Jeez.
@Samzillah · 7 years ago
Seriously. Imagine cops are trying to find a criminal with the technology but can't find them because of this mistake. There are billions of non-white people, so it needs to work on them too.
@DenGuleBalje · 4 years ago
@gbmpyzochwfdisurjklvanetxq You obviously have no idea how a camera works.
@DenGuleBalje · 4 years ago
@gbmpyzochwfdisurjklvanetxq Are you unaware that a camera relies on light hitting the sensor? Darker skin reflects less light. The less light the longer the exposure needs to be to give a good visual representation of what you're looking at. A webcam sets the auto exposure to get a set amount of light to the sensor. If the background is a lot lighter than the person's skin then the face will look even darker, because the camera shortens the exposure to reduce the overall brightness. Another factor is that face recognition relies on contrast to make out what an eyebrow, mouth or nose is. Dark brown on black is just harder for a computer to define than for example brown on "white".
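In code, the auto-exposure trade-off described in this comment looks roughly like the toy sketch below; the target and base values are made-up illustrative numbers, not any real camera's firmware:

```python
# Toy model of auto-exposure: scale exposure time so the frame's mean
# luminance hits a fixed target. All constants are illustrative.
def auto_exposure_ms(mean_luminance, target=0.18, base_ms=10.0):
    return base_ms * (target / mean_luminance)

# Face against a dark background: the camera exposes longer.
print(auto_exposure_ms(0.10))  # 18.0 ms

# Same face against a bright background: mean luminance jumps, exposure
# is cut, and the already dark face is rendered darker still.
print(auto_exposure_ms(0.45))  # 4.0 ms
```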
@amychittenden8993 · 4 years ago
This would not happen if the coders had dark skin. It depends on who the coders are. So, yes, it really is racism, albeit a more passive form, but with the same results. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-gV0_raKR2UQ.html
@MarkArandjus · 4 years ago
@amychittenden8993 Okay, sure, if we look at it by outcome then it is racism, the same way systemic racism is racism even if that was not the intended design.
@ijousha · 7 years ago
Personally, I wish my face were less detectable by surveillance cameras.
@AjahSharee · 4 years ago
Yeah until you get misidentified and arrested because of an algorithm.
@SuperMsmystery · 3 years ago
The problem that you don't see is: why surveillance in the first place?
@insearchof9903 · 2 years ago
@AjahSharee But if you're innocent, why worry? You will just be like "bye" when they see they have the wrong person.
@aelf_ears9119 · 2 years ago
@insearchof9903 But if the system that determines right/wrong is flawed, then it doesn't matter if you view yourself to be innocent or not...
@ShonTolliverMusic · 7 years ago
Reminds me of that video explaining how Kodak film couldn't replicate brown skin tones right until the mid-'80s.
@whipshaw · 7 years ago
I liked her introduction, "a poet of code"; as a programmer I'm feeling flattered.
@BERNARDO712 · 7 years ago
Nice resume: Joy Buolamwini is a poet of code on a mission to show compassion through computation. She is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, a Stamps President's Scholar, and a Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology.
@austinjohn8713 · 2 years ago
And she does not understand how AI works. The problem she is calling bias isn't bias; it was not coded. An AI algorithm is given a data set to learn features, and this learning is then used to make predictions given new data. If the AI struggled with her face, it was because it wasn't trained with a data set that contained dark skin tones. To fix the problem, feed more black faces to it. It has nothing to do with bias. If the AI were trained only on black faces, it would not recognize white faces unless the face was covered with a black mask.
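The training-set effect this comment describes is easy to reproduce on synthetic data. A minimal sketch, assuming NumPy and scikit-learn are installed; the two "groups" are made-up feature clusters standing in for over- and under-represented faces, not real face data:

```python
# A classifier trained on an imbalanced dataset performs worse on the
# underrepresented group, with no bias written into the algorithm itself.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, center):
    # n "non-face" and n "face" examples for one group of users
    X = np.vstack([rng.normal(center, 1.0, (n, 5)),
                   rng.normal(center + 3.0, 1.0, (n, 5))])
    y = np.array([0] * n + [1] * n)
    return X, y

Xa, ya = make_group(1000, 0.0)  # heavily represented group
Xb, yb = make_group(30, 10.0)   # underrepresented group

model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

Xa_test, ya_test = make_group(500, 0.0)
Xb_test, yb_test = make_group(500, 10.0)
print("group A accuracy:", model.score(Xa_test, ya_test))
print("group B accuracy:", model.score(Xb_test, yb_test))  # markedly lower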
@abdullahb4453 · 2 years ago
@austinjohn8713 That is what she said. The algorithm is biased because of that. :)
@austinjohn8713 · 2 years ago
@abdullahb4453 Algorithms are not biased. What a machine learning algorithm does is not explicitly programmed, so it makes no sense to accuse it of bias. If she is an expert in the field, she was supposed to retrain it using black faces only and see that the same way it behaved with black faces, it would behave with white faces. She is looking for racism where it does not exist. I say this as a black person.
@randomguy_069 · 2 years ago
@austinjohn8713 Correct. Algorithms are learning from what we are teaching them. They are not biased; we are biased in feeding the training data. It feels as if humans are masking an unethical aspect of their society by calling it the fault of AI. And in recent years I have actually seen many leaders in this algorithmic-bias movement moving ahead and designing proper ethical AIs instead of crying about bias or whatnot and blaming everything on an AI evil god which they, ironically, trained themselves.
@austinjohn8713 · 2 years ago
@abdullahb4453 It is not algorithmic bias. If any bias exists at all, it is in the data fed to the AI.
@TheEndofThis · 7 years ago
How interesting that a video on algorithmic bias has equal parts likes to dislikes.
@aizenplay · 3 years ago
What's the name of the facial tracking system? Thanks.
@laurencross6240 · 5 years ago
This is so interesting! Joy Buolamwini rocks.
@skdooman · 5 years ago
I appreciate the content of the video, but I wish she would have included more statistical examples. It's one thing to claim your face wasn't recognized. It's another thing to display data on many people whose faces were scanned and were or were not recognized.
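The disaggregated numbers this comment asks for take only a few lines to compute once each scan is logged as a (group, detected) record; the records below are hypothetical placeholders, not measured data:

```python
# Per-group detection rates instead of a single anecdote.
from collections import Counter

scans = [
    ("lighter", True), ("lighter", True), ("lighter", False),
    ("darker", True), ("darker", False), ("darker", False),
]  # one (skin-tone group, detected?) record per scanned face

totals, hits = Counter(), Counter()
for group, detected in scans:
    totals[group] += 1
    hits[group] += detected  # True counts as 1

for group in totals:
    print(f"{group}: {hits[group]}/{totals[group]} detected "
          f"({hits[group] / totals[group]:.0%})")
```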
@pettersonvitorinomorais8894 · 2 years ago
She has no data. Only political speech on a purely scientific subject.
@Nick-kb2jc · 7 years ago
The real reason some people are triggered in the comments: they hate seeing a smart, educated African American MIT student.
@IronLungProductionsOfficial · 2 years ago
Haha, nope. She's got nothing productive to bring to the table in the field of AI, constantly touting, lol. First-world problemos.
@mtngirl4944 · 2 years ago
😂😂😂
@tigerlilly9038 · 2 years ago
Humans forget that computers are only as smart as you make them. There is no secret to be unfolded. This was a wonderful talk.
@Ashberrysoda · 7 years ago
🙄 I am always confused by people who comment on and dislike things they haven't seen. Clearly shows a bias of the viewers. Good job, Joy.
@daniels7568 · 7 years ago
1,092 people didn't watch the video before rating.
@gg-wk2ww · 2 years ago
Keen curiosity brings things to light, good and bad. Good job.
@SergioLongoni · 7 years ago
I agree with the content of the video about the potential bias of algorithms, but I have a problem with the example of face recognition. My phone has no problem tracking the speaker's face, so I think this is not a real problem for marketable applications that weren't trained on outdated and small data samples.
@giordanoparisotto5617 · 1 year ago
Excellent!!! Loved her! She's awesome!
@robbieh6182 · 7 years ago
You can tell most of the people who disliked the video don't have a science background. It IS a bias if the algorithm only recognizes a certain type of face. The word bias has no negative connotation by itself; it simply means preference or "works better with". She isn't saying the algorithms are "racist".
@maximilianpohlmann9106 · 7 years ago
Thank you TED for not disabling comments!
@chinweogedegbe5449 · 1 year ago
This is really great... even 7 years later this information is very relevant. Thank you, Joy!!
@LKKMD · 3 years ago
Good work, thank you.
@miss118jess · 2 years ago
This is such a powerful talk! Let's imagine that the reason facial recognition works 'better' on light-skinned people is that cameras are good at picking up light (and not biased datasets). If facial recognition technology is not working on darker skin, then it's not working at all. It's not recognising faces! Facial recognition is still unreliable using high-quality photos of members of Congress, and honestly, surveillance cameras using this tech would be low quality anyway. AI, years later, still excludes a large proportion of the population and needs to be a big topic of discussion as society increasingly relies on its decision making.
@mouath_14 · 2 years ago
Raising concern over this technology's ability, or better phrased, inability to detect and identify faces is just scratching the tip of the iceberg, because facial analysis is physiognomy, and we all know that that idea is horrible. Yet facial analysis is also becoming super popular. Some propose complete bans from certain domains, and they are not wrong for proposing that either...
@karthickkrish5098 · 1 year ago
It isn't because of the brightness/contrast/camera quality; it's because of the datasets we've got till now. These algorithms have been trained on a certain age group and a certain colour. If the subject matches, it recognises the face in a fraction of a second; if it's quite the opposite, there's a problem. Lack of brightness/contrast/camera quality just makes the problem worse! I'm an MSc Artificial Intelligence student, so I know what we use to train the systems! It's hard to digest, but it's the truth!
@PatrickInCayman · 1 year ago
@karthickkrish5098 Right, so I guess Microsoft also failed at this, because their entire team didn't think to train their facial recognition on people of color. I think MIT and other AI students should return to learn basic physics.
@erricomalatesta2557 · 7 years ago
You could use this to your advantage instead of fixing it. Being anonymous these days is a gift.
@Melusi47 · 3 years ago
Snitching on the whole race. Now they will pay attention 😂
@moneysittintall3611 · 3 years ago
Why does this have so many dislikes? She makes a valid point.
@impanthering · 3 years ago
Willful ignorance
@GhostMillionairesTV · 3 years ago
Because she doesn't make a valid point, and you can't think well enough to even figure out why.
@pinegulf · 7 years ago
I'd like to see the code she writes.
@socdemigod · 6 years ago
Brilliant. But I don't think anyone should want to have their face recognized by software. Doesn't that seem a bit intrusive?
@vonneely1977 · 7 years ago
Is this to be an empathy test? Capillary dilation of the so-called "blush response?" Fluctuation of the pupil, involuntary dilation of the iris?
@bunkertons · 3 years ago
This is so informative, thank you for sharing.
@TripodJonas · 7 years ago
Also, why not use things other than visible light to do the same thing? Darker faces are darker in visible light, not under other sources.
@newdeflabs · 7 years ago
Thanks
@tunjilegba · 7 years ago
Hopefully when RoboCop comes to fruition it will mistake me for a tree 😊
@dividedperceptions6626 · 7 years ago
Tunji Legba, that is the kind of positive thinking we all should learn from :)
@jillgaumet8416 · 4 years ago
I hope we don't have RoboCops. I want smart humans, not smart machines.
@hannahl1387 · 3 years ago
@tunjilegba You can but dream.
@hvbris_ · 4 years ago
She's great
@josephinegrey4517 · 4 months ago
I wonder how this can be applied to ageing faces?
@leviengstrom7359 · 4 years ago
Why does the background look like it was built out of Solo cups?
@sundar72 · 4 months ago
A few years ago I was in Frankfurt airport on transit. The restrooms have automatic soap dispensers... they could only detect light-skinned hands for some reason. I am from India. There were two people in that restroom trying every dispenser, a black person and myself; we were telling each other that the damn things were broken. In walks a white guy and uses it. We asked him to try the other soap dispensers, and they all worked for him! We laughed, shook our heads and moved on, saying "someone did not design and test their product right!" At the end of the day, design and test should always consider the spectrum of end users. Always remember this mantra when designing things with AI: "You are not your user"!
@theaminswey9733 · 7 years ago
Great talk. I'll leave without checking the comments section now. Thank you, Joy ❤
@st8of1der · 7 years ago
Couldn't this be addressed by using light that's outside of the visible spectrum? What about combining a high-resolution camera with infrared light?
@jorgerincon6874 · 4 years ago
OK, I wasn't too keen on seeing this video, mainly because of the title, but honestly it's a good theme.
@israelip · 4 years ago
For those who don't know how machine learning works and can't even hear her: try reading about training sets.
@austinjohn8713 · 2 years ago
If she knew it was due to the training set and not bias, she would not have made this talk calling it bias. The AI would do the same to a white face if it were trained only on black faces.
@matthewfanous8468 · 7 years ago
I was waiting for her to say "if you can't tell, this is because I am black" BUT SHE DIDN'T AND NOW I DON'T KNOW WHY THEY COULDN'T DETECT HER
@mr_lrb6879 · 3 years ago
I'm a bit embarrassed to admit that every time she said "coded gaze", I heard it as "coded gays" and got really confused about what that had to do with coding until she showed us the Coded Gaze webpage. Still a good and eye-opening talk though.
@tammieknuth6020 · 3 years ago
That would mean she's biased against gays and the LGBTQ+ community, and she's literally a different race.
@Zoza15 · 7 years ago
Well, I once had to put my face on a cam and the cam didn't recognize my face either. She actually does something about it, so why the dislikes for this video? I support her actions, as long as it doesn't result in consequences that leave other groups out for the sake of the main group.
5 years ago
Because she's displacing science for ideology.
@dud3man6969 · 3 years ago
In my opinion, the ability to defeat AI facial recognition is an advantage.
@nerdtuts797 · 7 years ago
All those who are saying that she doesn't make sense don't know anything about machine learning. If the training data doesn't have enough images of black people, the algorithm will have a hard time detecting them. It's not about the lighting or the camera. I am surprised by the number of dislikes on this video!
@viperxiiii · 7 years ago
Love how her example was, as she called it, a cheap webcam and not a more complex one.
@theegreatestever2420 · 4 years ago
This was such an important TED Talk. I can't believe I only recently found out about it, when diving deep into AI and using it in my apps, but I am glad I didn't find out later. It's unfortunate the domain name is now for sale and no longer operated by them, but I loved this.
@tracykarinp · 7 years ago
Thank you for a very informative presentation! It's wonderful that you brought this issue to the front burner! Kudos, Joy! :-)
@coulorfully · 6 years ago
The people programming/coding the algorithms that impact us all have implicit (internal) biases that become embedded in their code.
@betterdaysbetterdaysbetterdays · 3 years ago
It's not so much the people programming it; it's the training dataset that a neural network uses.
@derekvaillant6303 · 4 years ago
Take back the algorithm and open up those black boxes. Thanks for the encouraging message, Joy.
@thierrybock3847 · 7 years ago
That ain't no bug, it's a privacy feature. Don't break features.
@betterdaysbetterdaysbetterdays · 3 years ago
Do you know... do you know how machine learning works?
@robertsolem9234 · 2 years ago
Yes, what we need to do is *improve* facial recognition technology /s
@letmetranslate4249 · 1 year ago
😂
@robertschaaf7192 · 1 year ago
Why?
@inachu · 2 years ago
In years to come this will be an issue. Companies will need test subjects or images of all races to make sure the technology truly works for all races. What if the next super-smart techie nerd is born in India and the camera only works with people from India? It can and will happen.
@luisfernandez7426 · 2 years ago
Great talk, Joy! It's important that this topic is getting visibility. Great work you're doing toward combating these biases.
@andychung1068 · 3 years ago
Very inspiring. I can see that she is a very hard-working person.
@brendarua01 · 7 years ago
Unless there is a wide variance in the delivery of email notices, at least 17 people disliked this before they saw the first 5 minutes. That says a lot about some very foolish people. This is a great presentation! She's a wonderful presenter who is dynamic and entertaining on what is a technically and socially complex topic. It's exciting to have an example of discrimination that, while unintended or even unconscious, is very real and has very concrete results. Thank you for sharing, TED.
@ArtInMotionStudios · 7 years ago
It is the title more than anything, and the way she starts off the video. The issue is more complicated than just "it can't detect black faces", which is simply not true; it does, just not as many. I met someone who has been trying to solve this for years, because I live in a black-majority country, and well, it is easier said than done.
@Wegnerrobert2 · 7 years ago
Brenda Rua, tell me, don't you ever like a video the moment you click on it?
@brendarua01 · 7 years ago
Golo, I don't click like or dislike until I've listened to at least half of a clip. Sure, I have plenty of topics and presenters that I'm attracted to. But whether I agree or not, I try to listen objectively and critically. I can't recall ever disliking something because of the subject matter. I will do so, and I'll post a comment, if I find "alt facts" or fallacious arguments.
@brendarua01 · 7 years ago
OK, Golo. I can see how using "bias" in the title would be a trigger. One would have to listen for several minutes to realize she wasn't talking about social issues but about skewed data used in the training. Even then that might not get through to some listeners.
@Wegnerrobert2 · 7 years ago
My point is that most people on YouTube frequently like videos immediately because they simply expect that they will be good. Disliking a video directly is just as normal as liking it. But since that doesn't apply to you, I will give you some other arguments why immediately liking a video is no problem. You mentioned it already in the other comment: I think a title can be enough for a video to get a dislike, because it's a simple method of feedback. And a rating is only temporary anyway. I don't pretend that my brain doesn't immediately form an opinion when it sees the video in a feed. But you can always change the rating.
@stew_baby7942 · 6 months ago
Not good for some computer to judge by facial features... or to judge anyone.
@evilplaguedoctor5158 · 7 years ago
I wish she did more research, as in, the details of what part of the algorithms causes them to fail with different skin colours, and how to fix those issues. Because it kind of sounds like she is just complaining, wanting others to fix this problem for her... but I could be mistaken.
@betterdaysbetterdaysbetterdays · 3 years ago
Look up "tone policing".
@evilplaguedoctor5158 · 3 years ago
@betterdaysbetterdaysbetterdays ... When I say 'sounds like' I wasn't commenting on her tone of voice. Or is that not what you meant?
@CandyLemon36 · 9 months ago
This content is relevant and timely. A book I read on this topic was equally pertinent: "Game Theory and the Pursuit of Algorithmic Fairness" by Jack Frostwell.
@timojissink4715 · 7 years ago
I've actually studied everything to do with 3D printing, and so also 3D scanning. I've learned that there are three things that are difficult for a 3D scanner to scan: the first is shiny objects, the second is translucent objects, and the last is black objects. "Black objects": it's true, light gets absorbed by the color.
@betterdaysbetterdaysbetterdays · 3 years ago
Do you have... do you know how machine learning works?
@morgrimx5732 · 7 years ago
It seems there is also a human bias toward facial recognition software. The bias gives it more credibility since it's a computer!
@BERNARDO712 · 7 years ago
Great accomplishments, Joy: Joy Buolamwini is a poet of code on a mission to show compassion through computation. She is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, a Stamps President's Scholar, and a Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology.
@tbpp6553 · 7 years ago
More dislikes than likes?? MY GOD! This is a real issue. My racist Coolpad camera doesn't recognize my face when I select face-detection mode. It is so embarrassing!!
@CrazySondre · 7 years ago
What happened to TED Talk...?
@readingrebellion9758 · 5 years ago
I agree with the premise of making services equitable in access and fair, but I think "unlocking equality" through digital technology is a vague and concerning mission. Equality of what? Between which groups/sub-groups? And who decides?
@Yirsi · 7 years ago
While it's true that the algorithm does not work correctly in this case, I don't think it's connected to bias at all. But you certainly have to point out where the problem lies within the code, so the people behind it can fix it. Focusing on that issue seems more important to me.
@Joe-yr1em · 4 years ago
Bias just means it is geared towards certain features more than others. It is connected to bias. Not in the sense that you have a coder that's biased or anything, but in the sense that the model is making predictions based on datasets that don't accurately represent the target market.
@Chronomatrix · 7 years ago
Nothing wrong can come from facial analysis software...
@Dataanti · 7 years ago
I bet it has more to do with the camera having a hard time picking up darker skin tones... because cameras in general have a harder time with darker colours. I don't see how you will be able to fix this without upgrading all cameras to have direct light sources or IR depth sensors. This has nothing to do with algorithms.
@swordwaker7749 · 7 years ago
Well, after an AI has learned once, it rarely tries to go against that learning. It should identify reasons. What about training while working? If it finds many faces near this spectrum, then it recognizes those faces. But beware not to do it for gorillas.
@barneypotts9868 · 7 years ago
It turns out that if you get a cheap webcam with cheap face recognition software, you don't get very good face recognition.
@jamespharris2494 · 4 years ago
It's simple math. How many bubbles are in a bar of soap?
@Alitari · 7 years ago
I agree that this is a problem, but she seems to have a real shotgun approach to trying to create new, or take over existing, phrases/memes/acronyms... feels to me like she's hoping one or more of them will gain traction for her own self-aggrandisement... self-promotion is one thing, but it feels like this speaker took it to another level, beyond that which TED is normally known for.
@shell9918 · 3 years ago
She is so well-spoken and likeable.
@dermihai · 7 years ago
Wow, people do overreact... It is very true what she said; just watch the whole video. One thing though... when she says that we need diversity among coders so that they can fill each other's gaps, I hope she means diversity of experience and of field of study, not racial/national/sexual diversity.
@jyotiswamy6305 · 5 years ago
Thank you, Joy! This world needs you, because programmers (as evidenced by the comments) have no understanding of the SOCIAL IMPACT of their work. Of course there may be other solutions, but it is the SOCIAL structure of your field that matters. This would not be a TECHNICAL issue if every programmer were black; but because RACIAL MINORITIES are highly disadvantaged due to unethical practices and historical processes (that have kept them from learning about such software relative to others, depending on race and gender), it cannot be dismissed as a "glitch" in the system. WAKE UP, PEOPLE!!! Racial paradigmatic bias exists in computer science as well. Also, this is a BLACK WOMAN talking about facial analysis software, which is changing the SOCIAL STRUCTURE of the field, and that is needed to prevent issues like this in the future. You can most definitely argue that there are other ways to fix this, but you can't argue that the minority elite does not look like Joy. I swear this world needs to be more reflexive... UGH. JOY, YOU ARE A QUEEN! Thank you so much for speaking up and being a voice in a very underrepresented field. (Simply look at the representation of the audience.) WAKE UP, Y'ALL.
@austinjohn8713 · 2 years ago
No. She is mischaracterizing the problem. The AI would behave the same to a white person if it were trained only on black faces. It is not bias; the dataset was skewed.
@TheRoomcleaner · 7 years ago
Didn't like the choice of words, but I really liked the talk. A nice surprise for such a video title :-)
@mstfstone · 7 years ago
Can we get a Turkish translation for this?
@Tripp393 · 7 years ago
Guys, this is just something she's doing with her life. It would be dumb if they didn't talk about it.
@ocubex · 3 years ago
UK Passport Agency, are you listening?
@Sirius_Blazing_Star · 9 months ago
If the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect...
@inachu · 2 years ago
Better camera and better lighting. But I do agree with her in part: some coding could use some changes, but it is not about race. If Bob invents a robot from scratch, all by himself, to see him, then this is not bias or racism. So if a technology is brand new and she invented it, then just maybe the robot would see only black faces and not light-skinned people. There are so many takes on this that it is crazy; you can go so many ways with this. So if I create a new system to only recognize dogs, do all the cat owners get in a hissy fit? No. Just build your own thing. But after the technology matures and is ready for global release, then yes, it should be usable for all races.
@mouath_14 · 2 years ago
Except that it IS all about race when we're talking about racial discrimination in FRT, and these issues go way beyond the system's performance and accuracy in perceiving and identifying a colored face. Have you ever heard of physiognomy? I recommend a paper by Luke Stark and Jevan Hutson which covers these problematic systems very clearly. This technology is not brand new and is prevalent everywhere, so being super serious about its implications for society is not cosmetics, it's a necessity!
@inachu · 2 years ago
@mouath_14 Physiognomy, or any sort of psychology, is not programmed into computers. Computers do not currently have any sort of AI that can hate. Computers cannot hate. You cannot sue a company that makes a low-tech camera. Now, 10 to 20 years from now, it could be a thing where a computer senses skin tone and acts on it. That too will be hacked one day, to truly make companies liable, but today cameras operate in dumb mode.
@jddebr · 7 years ago
Awful lot of folks in this comments section who don't understand what the word bias means in the scientific community. Bias is a real thing in machine learning. All algorithms have inductive bias. She is not saying that algorithms are racist...
@davlmt · 7 years ago
Yep, the face detection on my Sony camera always ignores black people's faces and tracks white faces flawlessly.
@stephenclement3349 · 7 years ago
Cool initiative! I am sure coders would love you helping them identify their bugs and provisioning them with free data. Just make sure you remember they probably didn't do it intentionally and approach them kindly. Otherwise you will end up as crusaders fighting someone who isn't really your enemy.
@Skinny97214 · 3 years ago
Might want to google "tone policing."
@Frozlie1 · 7 years ago
When security cameras start using facial recognition, this will cease to be an issue.
@aitortilla5128 · 4 years ago
Many security cameras in many countries already use facial recognition.
@p3ncill3ad12 · 7 years ago
I would be happy if a computer could not recognize me.
@canUfeelMYface · 4 years ago
"Someone else will solve this problem."
@phantomcruizer · 1 year ago
Yes, "Colossus/Guardian… SkyNet/Legion"!
@DeoMachina · 7 years ago
>nonpolitical video about software
>mass dislikes
Tell me again how there isn't a problem with racism in this channel's audience
@DeoMachina · 7 years ago
What's debatable about it? Why doesn't the same thing happen when white guys talk about software?
@emmanuelezenwere · 7 years ago
Nice one, Joy. I'll join your fight!
@milanpaudel9624 · 7 years ago
WTF... what's with all those dislikes? This is a genuinely good TED talk.
@garfield2406 · 4 years ago
Could never be related to the fact that dark colors are harder to get contrast on.
@betterdaysbetterdaysbetterdays · 3 years ago
what is your experience in the field of machine learning, photography, or both?
@jacobcromer7192 · 7 years ago
Nobody's fighting anything; no one is trying to stop you from fixing this. Why is she framing this like a civil cause?
@theetravelingfoodie · 11 months ago
It's OK; with 247,000 views she's got something to say, and people enjoy listening 😉
@randomguy_069 · 2 years ago
The problem was and always has been with humans, not with the algorithms. And guess what: it has been shown that even if we feed biased data into the algorithms, they are less biased than humans. If we show algorithms that most of the people convicted of a crime belong to a single race, they will learn this and do the same thing. These algorithms are at best a mirror of ourselves; we should do our best to show an ethical image instead of our real image if we want the algorithms to be 'fair'.
@phantomcruizer · 1 year ago
Yes, it is like teaching a child all the WRONG things in life. It doesn't know any better, but we do, or should by now!
@IshtarNike · 7 years ago
This always annoys me. Taking selfies with my mates, I never get facial recognition. It's a small peeve, but it's quite annoying.
@ArtArtisian · 7 years ago
+
@dansadler · 7 years ago
But it also means you have kinda natural visual surveillance protection because your face is less contrastive.
@premier69 · 7 years ago
+Dan Sadler rofl
@MysticScapes · 7 years ago
She is just seeing the problem from only one perspective. Hardware like webcams matters as much as these algorithms do. I'm not a black person, and even my webcam sometimes doesn't recognize my face because I have a long hipster beard. She was trying to make this bias as political and racial as possible, but science doesn't care about all these labels. To avoid overfitting, simply look outside the box and stop blaming others.
@TheAquaticBeef · 7 years ago
AMAZING!! :D
@narayantx · 7 years ago
So the machines are taking over after all. 😃
@Fnidner · 7 years ago
Let my people code!
@TheSkipper1921 · 7 years ago
Everybody has their face in a database. That is, anyone with a driver's licence or government-issued ID. Have you heard of "Real ID"?
@missachol24 · 7 years ago
Oh my goodness, did people even watch the whole video? People are crazy 🙄
@jacob5208 · 7 years ago
missachol24, the algorithm she is talking about is outdated and will soon be replaced by pattern tracking software.