
Gender Shades 

MIT Media Lab

The Gender Shades Project pilots an intersectional approach to inclusive product testing for AI.
Gender Shades is a preliminary excavation of inadvertent negligence that will cripple the age of automation and further exacerbate inequality if left to fester. The deeper we dig, the more remnants of bias we will find in our technology. We cannot afford to look away this time, because the stakes are simply too high. We risk losing the gains made with the civil rights movement and women's movement under the false assumption of machine neutrality. Automated systems are not inherently neutral. They reflect the priorities, preferences, and prejudices (the coded gaze) of those who have the power to mold artificial intelligence.
Video produced by Joy Buolamwini and Jimmy Day
Many thanks to the Natural Sciences and Engineering Research Council of Canada | Conseil de recherches en sciences naturelles et en génie du Canada for translating the captions into French.
More information at: www.media.mit.edu/projects/ge...
License: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License (creativecommons.org/licenses/...)

Science

Published: 8 Feb 2018
