WEBVTT

00:01.333 --> 00:04.933 align:left position:0%,start line:80% size:100%
HEFFNER: I'm Alexander Heffner,
your host on The Open Mind.

00:46.400 --> 00:48.633 align:left position:0%,start line:80% size:100%
It was heartening to learn
from today's guest about

00:48.733 --> 00:51.333 align:left position:12.5%,start line:80% size:87.5%
the formation of the
Algorithmic Justice League

00:51.433 --> 00:53.533 align:left position:25%,start line:80% size:75%
at the MIT Media
Lab last year.

00:53.633 --> 00:56.166 align:left position:12.5%,start line:80% size:87.5%
That's because she's
building a prescription

00:56.266 --> 00:59.733 align:left position:12.5%,start line:80% size:87.5%
for the unethical artificial
intelligence with which

00:59.833 --> 01:02.333 align:left position:12.5%,start line:80% size:87.5%
scholar activists
Virginia Eubanks

01:02.433 --> 01:04.666 align:left position:12.5%,start line:80% size:87.5%
and Cathy O'Neil most
recently grappled

01:04.766 --> 01:06.066 align:left position:25%,start line:80% size:75%
on The Open Mind.

01:06.166 --> 01:08.933 align:left position:0%,start line:80% size:100%
We are thrilled to welcome
Algorithmic Justice League

01:09.033 --> 01:11.433 align:left position:12.5%,start line:80% size:87.5%
founder Joy Buolamwini.

01:11.533 --> 01:13.766 align:left position:12.5%,start line:80% size:87.5%
Her organization's
mission is to highlight

01:13.866 --> 01:16.000 align:left position:25%,start line:80% size:75%
algorithmic bias
through media,

01:16.100 --> 01:19.000 align:left position:12.5%,start line:80% size:87.5%
art and science, to
provide space for people

01:19.100 --> 01:21.966 align:left position:12.5%,start line:80% size:87.5%
to voice concerns and
experiences with coded

01:22.066 --> 01:24.466 align:left position:25%,start line:80% size:75%
bias and to
develop practices for

01:24.566 --> 01:26.633 align:left position:25%,start line:80% size:75%
accountability
during the design,

01:26.733 --> 01:29.733 align:left position:25%,start line:80% size:75%
development, and
deployment of coded systems.

01:29.833 --> 01:33.800 align:left position:0%,start line:80% size:100%
In her efforts to document bias
and restore trust to technology,

01:33.900 --> 01:36.866 align:left position:12.5%,start line:80% size:87.5%
Buolamwini recently
delivered a presentation

01:36.966 --> 01:39.866 align:left position:12.5%,start line:80% size:87.5%
to the Federal Trade
Commission with her MIT

01:39.966 --> 01:43.933 align:left position:0%,start line:80% size:100%
thesis "Findings on Gender
and Racial Bias in Facial

01:44.033 --> 01:46.866 align:left position:12.5%,start line:80% size:87.5%
Analysis Technology"
developed by IBM,

01:46.966 --> 01:50.400 align:left position:0%,start line:80% size:100%
Microsoft, and other cognitive
service technologies.

01:50.500 --> 01:54.066 align:left position:12.5%,start line:80% size:87.5%
Their ultimate effect, if
unchecked, can be a cycle

01:54.166 --> 01:56.433 align:left position:12.5%,start line:80% size:87.5%
of computer-generated
discrimination,

01:56.533 --> 01:58.300 align:left position:25%,start line:80% size:75%
and I'll ask Joy
to expound on that.

01:58.400 --> 01:59.400 align:left position:25%,start line:80% size:75%
A pleasure to
see you again.

01:59.500 --> 02:01.633 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: Great to
be here. Thank you.

02:01.733 --> 02:03.200 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: It was like a
spark went off when

02:03.300 --> 02:04.533 align:left position:12.5%,start line:80% size:87.5%
I was in your lab.

02:04.633 --> 02:06.600 align:left position:25%,start line:80% size:75%
You're in Ethan
Zuckerman's lab

02:06.700 --> 02:09.800 align:left position:0%,start line:80% size:100%
and you're giving this terrific
presentation on bias

02:09.900 --> 02:13.000 align:left position:12.5%,start line:80% size:87.5%
that is institutionalized
in effect through

02:13.100 --> 02:16.066 align:left position:0%,start line:80% size:100%
technologies today because
we had recently hosted two

02:16.166 --> 02:18.233 align:left position:25%,start line:80% size:75%
authors on that
very subject who were

02:18.333 --> 02:20.566 align:left position:25%,start line:80% size:75%
documenting the
problems, but here you are

02:20.666 --> 02:22.766 align:left position:25%,start line:80% size:75%
addressing it
with solutions.

02:22.866 --> 02:25.400 align:left position:12.5%,start line:80% size:87.5%
So can you give our
viewers a sense of the

02:25.500 --> 02:28.766 align:left position:0%,start line:80% size:100%
origin of your project, how you
created this organization?

02:28.866 --> 02:31.700 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: Sure, so I
didn't start out thinking

02:31.800 --> 02:34.833 align:left position:12.5%,start line:80% size:87.5%
about algorithmic justice
when I was at the Media

02:34.933 --> 02:36.466 align:left position:0%,start line:80% size:100%
Lab for my first semester.

02:36.566 --> 02:38.533 align:left position:12.5%,start line:80% size:87.5%
I took a course called
Science Fabrication.

02:38.633 --> 02:40.800 align:left position:12.5%,start line:80% size:87.5%
You read science fiction
and you try to build

02:40.900 --> 02:42.933 align:left position:12.5%,start line:80% size:87.5%
something fanciful
that would probably be

02:43.033 --> 02:44.266 align:left position:12.5%,start line:80% size:87.5%
impractical otherwise.

02:44.366 --> 02:46.833 align:left position:12.5%,start line:80% size:87.5%
So I built this project
called the Aspire Mirror.

02:46.933 --> 02:49.133 align:left position:12.5%,start line:80% size:87.5%
You look into what seems
like a regular mirror

02:49.233 --> 02:51.333 align:left position:12.5%,start line:80% size:87.5%
and then when the camera
detects your face,

02:51.433 --> 02:53.100 align:left position:25%,start line:80% size:75%
suddenly you can
become a lion,

02:53.200 --> 02:55.166 align:left position:12.5%,start line:80% size:87.5%
or in my case, I wanted
to be Serena Williams,

02:55.266 --> 02:57.566 align:left position:12.5%,start line:80% size:87.5%
so whatever you want
to be in the mirror.

02:57.666 --> 02:59.933 align:left position:25%,start line:80% size:75%
And as I was
building this project,

03:00.033 --> 03:02.966 align:left position:12.5%,start line:80% size:87.5%
I was using a Webcam
that had computer vision

03:03.066 --> 03:05.400 align:left position:25%,start line:80% size:75%
software meant to
detect the face,

03:05.500 --> 03:08.033 align:left position:0%,start line:80% size:100%
but it didn't consistently
detect my face until

03:08.133 --> 03:10.166 align:left position:25%,start line:80% size:75%
I literally put
on a white mask.

03:10.266 --> 03:13.100 align:left position:12.5%,start line:80% size:87.5%
So it was this experience
of building what was

03:13.200 --> 03:15.766 align:left position:12.5%,start line:80% size:87.5%
essentially an art
installation and running

03:15.866 --> 03:18.133 align:left position:12.5%,start line:80% size:87.5%
into issues with the
technology that I started

03:18.233 --> 03:22.733 align:left position:0%,start line:80% size:100%
questioning: why am I wearing a
white mask to be detected?

03:22.833 --> 03:25.100 align:left position:12.5%,start line:80% size:87.5%
I have lighter skinned
colleagues who seem to use

03:25.200 --> 03:27.066 align:left position:12.5%,start line:80% size:87.5%
this just fine. Is it
because of the lighting

03:27.166 --> 03:28.500 align:left position:25%,start line:80% size:75%
conditions?
What's going on?

03:28.600 --> 03:31.766 align:left position:12.5%,start line:80% size:87.5%
So that's really when I
started exploring facial

03:31.866 --> 03:36.000 align:left position:0%,start line:80% size:100%
analysis technology, which is
being powered by AI techniques.

03:36.100 --> 03:37.600 align:left position:37.5%,start line:80% size:62.5%
So that's
where it started.

03:37.700 --> 03:40.266 align:left position:25%,start line:80% size:75%
And so I gave a
TED Talk about this,

03:40.366 --> 03:42.933 align:left position:12.5%,start line:80% size:87.5%
it got over a million views, and
I thought somebody might

03:43.033 --> 03:45.666 align:left position:12.5%,start line:80% size:87.5%
want to check my claims,
right, about not having

03:45.766 --> 03:47.100 align:left position:25%,start line:80% size:75%
my face detected.

03:47.200 --> 03:48.800 align:left position:25%,start line:80% size:75%
So why don't I
check myself?

03:48.900 --> 03:52.233 align:left position:12.5%,start line:80% size:87.5%
So I took the TED profile
image and I ran it across

03:52.333 --> 03:55.133 align:left position:25%,start line:80% size:75%
various systems
from IBM, Microsoft,

03:55.233 --> 03:58.266 align:left position:12.5%,start line:80% size:87.5%
Google, etc. And I found
that some of their systems

03:58.366 --> 04:00.433 align:left position:25%,start line:80% size:75%
didn't detect
that image at all,

04:00.533 --> 04:04.400 align:left position:0%,start line:80% size:100%
and the ones that did detect the
image labeled me as male.

04:04.500 --> 04:07.666 align:left position:0%,start line:80% size:100%
I am not male, right, either
in gender expression or

04:07.766 --> 04:11.400 align:left position:12.5%,start line:80% size:87.5%
identity and so I wanted
to know if it was just my

04:11.500 --> 04:14.700 align:left position:0%,start line:80% size:100%
unique facial features or if it
was something more systematic.

04:14.800 --> 04:17.933 align:left position:12.5%,start line:80% size:87.5%
So that's what ended up
forming the basis of my

04:18.033 --> 04:21.400 align:left position:12.5%,start line:80% size:87.5%
Media Lab thesis where I
wanted to see how accurate

04:21.500 --> 04:24.433 align:left position:12.5%,start line:80% size:87.5%
are these facial analysis
systems when it comes to

04:24.533 --> 04:26.833 align:left position:12.5%,start line:80% size:87.5%
guessing the gender of
a face, and does your skin

04:26.933 --> 04:30.466 align:left position:0%,start line:80% size:100%
type make a difference, does
your gender make a difference?

04:30.566 --> 04:31.600 align:left position:25%,start line:80% size:75%
And it might
seem like, okay,

04:31.700 --> 04:33.600 align:left position:12.5%,start line:80% size:87.5%
you got mis-gendered,
does that really matter?

04:33.700 --> 04:36.266 align:left position:12.5%,start line:80% size:87.5%
And so I came
across a report called

04:36.366 --> 04:39.000 align:left position:0%,start line:80% size:100%
The Perpetual Lineup, which
came out of Georgetown

04:39.100 --> 04:42.000 align:left position:12.5%,start line:80% size:87.5%
Law, it showed that one
in two adults in the US,

04:42.100 --> 04:45.433 align:left position:12.5%,start line:80% size:87.5%
that's over 117 million
people, has their face

04:45.533 --> 04:47.933 align:left position:0%,start line:80% size:100%
in a face recognition network
that can be searched

04:48.033 --> 04:50.033 align:left position:12.5%,start line:80% size:87.5%
by law enforcement,
unwarranted, using

04:50.133 --> 04:53.366 align:left position:12.5%,start line:80% size:87.5%
algorithms that haven't
been audited for accuracy.

04:53.466 --> 04:55.900 align:left position:12.5%,start line:80% size:87.5%
And in the UK where they
actually have real world

04:56.000 --> 04:59.300 align:left position:12.5%,start line:80% size:87.5%
performance reports, you
have false positive match

04:59.400 --> 05:03.033 align:left position:12.5%,start line:80% size:87.5%
rates over 90 percent and
they even had instances

05:03.133 --> 05:06.066 align:left position:12.5%,start line:80% size:87.5%
where two women were
falsely identified as

05:06.166 --> 05:08.733 align:left position:25%,start line:80% size:75%
innocent men, so
this situation,

05:08.833 --> 05:11.666 align:left position:0%,start line:80% size:100%
which in one context might
seem like an annoyance,

05:11.766 --> 05:14.733 align:left position:12.5%,start line:80% size:87.5%
in the real world can
actually lead to issues

05:14.833 --> 05:17.433 align:left position:12.5%,start line:80% size:87.5%
and so this was a bit
of the backdrop against which

05:17.533 --> 05:20.366 align:left position:12.5%,start line:80% size:87.5%
I decided to go ahead
and test these systems,

05:20.466 --> 05:24.333 align:left position:12.5%,start line:80% size:87.5%
but I ran into a major
issue which was that the

05:24.433 --> 05:27.600 align:left position:12.5%,start line:80% size:87.5%
existing benchmarks, the
data sets of faces that

05:27.700 --> 05:31.300 align:left position:0%,start line:80% size:100%
are used to judge how well
these systems work are not

05:31.400 --> 05:32.833 align:left position:12.5%,start line:80% size:87.5%
very representative.

05:32.933 --> 05:35.133 align:left position:12.5%,start line:80% size:87.5%
So I started looking at
various benchmarks that

05:35.233 --> 05:37.766 align:left position:0%,start line:80% size:100%
have been used as the gold
standard to say how well

05:37.866 --> 05:40.366 align:left position:12.5%,start line:80% size:87.5%
are we doing in the
facial analysis space?

05:40.466 --> 05:43.266 align:left position:12.5%,start line:80% size:87.5%
And one of the early gold
standard benchmarks turned

05:43.366 --> 05:48.700 align:left position:0%,start line:80% size:100%
out to be 77 percent male and 83
percent white individuals.

05:48.800 --> 05:50.400 align:left position:12.5%,start line:80% size:87.5%
Then I looked at a
benchmark coming from the

05:50.500 --> 05:52.933 align:left position:12.5%,start line:80% size:87.5%
National Institute of
Standards and Technology.

05:53.033 --> 05:56.133 align:left position:12.5%,start line:80% size:87.5%
It's a government agency
that's tasked with making

05:56.233 --> 05:59.133 align:left position:12.5%,start line:80% size:87.5%
benchmarks for this type
of technology and I looked

05:59.233 --> 06:01.666 align:left position:12.5%,start line:80% size:87.5%
at their benchmark, a
slight improvement,

06:01.766 --> 06:06.033 align:left position:12.5%,start line:80% size:87.5%
75 percent male and 80
percent lighter skin,

06:06.133 --> 06:08.833 align:left position:0%,start line:80% size:100%
so I realized that if we have
these pale male data sets,

06:08.933 --> 06:11.033 align:left position:12.5%,start line:80% size:87.5%
we're not actually
going to have a good sense

06:11.133 --> 06:14.500 align:left position:12.5%,start line:80% size:87.5%
of performance and so
I made a new data set,

06:14.600 --> 06:16.200 align:left position:25%,start line:80% size:75%
one that was more
inclusive called the

06:16.300 --> 06:18.900 align:left position:0%,start line:80% size:100%
Pilot Parliaments Benchmark
and people are like,

06:19.000 --> 06:20.933 align:left position:12.5%,start line:80% size:87.5%
well, how did you make
it more gender balanced,

06:21.033 --> 06:23.500 align:left position:12.5%,start line:80% size:87.5%
how did you get more
skin type variation?

06:23.600 --> 06:26.166 align:left position:12.5%,start line:80% size:87.5%
I went to the UN women's
website and I got a list

06:26.266 --> 06:29.200 align:left position:12.5%,start line:80% size:87.5%
of the top 10 nations by
their representation of

06:29.300 --> 06:30.666 align:left position:12.5%,start line:80% size:87.5%
women in parliament.

06:30.766 --> 06:33.133 align:left position:12.5%,start line:80% size:87.5%
Rwanda led the way there
in the sixties and you

06:33.233 --> 06:35.233 align:left position:25%,start line:80% size:75%
have progressive
Nordic countries in there:

06:35.333 --> 06:37.833 align:left position:25%,start line:80% size:75%
Iceland, Finland,
Sweden, a few more African

06:37.933 --> 06:41.266 align:left position:0%,start line:80% size:100%
countries, and so I decided to
choose three African countries

06:41.366 --> 06:45.033 align:left position:12.5%,start line:80% size:87.5%
and three European countries
to get a bit more of a

06:45.133 --> 06:47.800 align:left position:25%,start line:80% size:75%
gender-balanced
benchmark, but also get

06:47.900 --> 06:51.266 align:left position:0%,start line:80% size:100%
a spread of skin types. Right?
Very dark skinned individuals,

06:51.366 --> 06:52.866 align:left position:25%,start line:80% size:75%
very light
skinned individuals.

06:52.966 --> 06:56.033 align:left position:12.5%,start line:80% size:87.5%
So I made this benchmark
because we currently had

06:56.133 --> 06:57.833 align:left position:25%,start line:80% size:75%
these pale male
benchmarks. Right?

06:57.933 --> 06:59.566 align:left position:25%,start line:80% size:75%
And with this
new benchmark,

06:59.666 --> 07:02.766 align:left position:12.5%,start line:80% size:87.5%
this is where it started
to get interesting.

07:02.866 --> 07:07.400 align:left position:12.5%,start line:80% size:87.5%
I tested systems from IBM,
from Microsoft, and from

07:07.500 --> 07:09.966 align:left position:12.5%,start line:80% size:87.5%
Face Plus Plus, a leading
billion-dollar tech

07:10.066 --> 07:12.633 align:left position:12.5%,start line:80% size:87.5%
company in China that's
actually used by the

07:12.733 --> 07:13.966 align:left position:12.5%,start line:80% size:87.5%
Chinese government.

07:14.066 --> 07:17.666 align:left position:12.5%,start line:80% size:87.5%
So they have access to a
large store of Chinese

07:17.766 --> 07:19.500 align:left position:25%,start line:80% size:75%
faces and I
wanted to see, okay,

07:19.600 --> 07:22.000 align:left position:25%,start line:80% size:75%
how accurate are
these systems?

07:22.100 --> 07:23.466 align:left position:25%,start line:80% size:75%
So if you do it
in aggregate,

07:23.566 --> 07:26.066 align:left position:12.5%,start line:80% size:87.5%
if you just look at
the overall accuracy

07:26.166 --> 07:29.333 align:left position:12.5%,start line:80% size:87.5%
for that whole benchmark,
you had accuracy that went

07:29.433 --> 07:33.933 align:left position:0%,start line:80% size:100%
from 88 percent with IBM up to
94 percent with Microsoft,

07:34.033 --> 07:36.733 align:left position:0%,start line:80% size:100%
which might seem okay, but
once we started to break down

07:36.833 --> 07:40.433 align:left position:12.5%,start line:80% size:87.5%
the benchmark results by
gender and by skin type,

07:40.533 --> 07:42.500 align:left position:12.5%,start line:80% size:87.5%
that's when we saw
very large disparities.

07:42.600 --> 07:44.633 align:left position:25%,start line:80% size:75%
So for lighter
skinned males,

07:44.733 --> 07:47.433 align:left position:12.5%,start line:80% size:87.5%
error rates were no
more than one percent for

07:47.533 --> 07:51.433 align:left position:12.5%,start line:80% size:87.5%
guessing the gender of a
face in that benchmark.

07:51.533 --> 07:53.333 align:left position:25%,start line:80% size:75%
When you go to
lighter skinned females,

07:53.433 --> 07:54.766 align:left position:25%,start line:80% size:75%
no more than
seven percent.

07:54.866 --> 07:57.700 align:left position:25%,start line:80% size:75%
When you go to
darker skinned males,

07:57.800 --> 07:58.933 align:left position:0%,start line:80% size:100%
you get around 12 percent.

07:59.033 --> 08:00.733 align:left position:12.5%,start line:80% size:87.5%
And when you go to
faces like mine,

08:00.833 --> 08:04.766 align:left position:12.5%,start line:80% size:87.5%
darker skinned females,
you're at around 34,

08:04.866 --> 08:08.133 align:left position:25%,start line:80% size:75%
35 percent error
rates in aggregate.

08:08.233 --> 08:10.566 align:left position:12.5%,start line:80% size:87.5%
If you disaggregate
that and you look at the

08:10.666 --> 08:13.100 align:left position:12.5%,start line:80% size:87.5%
darkest-skinned females,
the highly melanated like

08:13.200 --> 08:16.266 align:left position:12.5%,start line:80% size:87.5%
myself, you actually had
error rates as high as

08:16.366 --> 08:20.333 align:left position:0%,start line:80% size:100%
47 percent in commercially
sold products,

08:20.433 --> 08:22.600 align:left position:12.5%,start line:80% size:87.5%
which for me was really
surprising because we were

08:22.700 --> 08:25.966 align:left position:0%,start line:80% size:100%
doing gender in a way that
was reduced to a binary.

08:26.066 --> 08:30.600 align:left position:0%,start line:80% size:100%
So you have a 50/50 shot of
getting it right by just
guessing.

08:30.700 --> 08:33.100 align:left position:12.5%,start line:80% size:87.5%
And so I sent the results
to the companies to see

08:33.200 --> 08:35.066 align:left position:25%,start line:80% size:75%
what their
response would be.

08:35.166 --> 08:37.166 align:left position:12.5%,start line:80% size:87.5%
And it actually turned
out to be something many

08:37.266 --> 08:42.333 align:left position:12.5%,start line:80% size:87.5%
people were overlooking.
So after the study came out,

08:42.433 --> 08:45.133 align:left position:12.5%,start line:80% size:87.5%
companies have released
new systems showing some

08:45.233 --> 08:46.633 align:left position:12.5%,start line:80% size:87.5%
marked improvement.

08:46.733 --> 08:49.366 align:left position:25%,start line:80% size:75%
But even if these
systems are more accurate,

08:49.466 --> 08:51.400 align:left position:12.5%,start line:80% size:87.5%
how they're used is
just as important.

08:51.500 --> 08:54.200 align:left position:12.5%,start line:80% size:87.5%
So for example, IBM
made maybe a tenfold

08:54.300 --> 08:56.933 align:left position:12.5%,start line:80% size:87.5%
improvement on their
worst performing group.

08:57.033 --> 08:59.400 align:left position:12.5%,start line:80% size:87.5%
But this summer, the
Intercept came out with a

08:59.500 --> 09:02.300 align:left position:12.5%,start line:80% size:87.5%
report showing that they
had secretly equipped

09:02.400 --> 09:05.433 align:left position:12.5%,start line:80% size:87.5%
the NYPD with video
analytics that could search

09:05.533 --> 09:08.333 align:left position:12.5%,start line:80% size:87.5%
for people by their skin
tone that could search for

09:08.433 --> 09:11.366 align:left position:0%,start line:80% size:100%
people by their facial hair and
what they were wearing.

09:11.466 --> 09:14.533 align:left position:12.5%,start line:80% size:87.5%
So essentially tools for
racial profiling, right?

09:14.633 --> 09:16.833 align:left position:12.5%,start line:80% size:87.5%
So it's not just a
question of how accurate

09:16.933 --> 09:19.133 align:left position:12.5%,start line:80% size:87.5%
these systems are, it's
a question about how

09:19.233 --> 09:20.000 align:left position:12.5%,start line:80% size:87.5%
they're being used.

09:20.100 --> 09:21.433 align:left position:25%,start line:80% size:75%
HEFFNER: Well
thank you, Joy,

09:21.533 --> 09:24.166 align:left position:12.5%,start line:80% size:87.5%
because that really is an
illuminating overview and

09:24.266 --> 09:26.133 align:left position:12.5%,start line:80% size:87.5%
really informative for
our viewers who are not

09:26.233 --> 09:27.433 align:left position:25%,start line:80% size:75%
familiar with AI.

09:27.533 --> 09:31.633 align:left position:12.5%,start line:80% size:87.5%
And let's talk about
the policy implications,

09:31.733 --> 09:34.833 align:left position:12.5%,start line:80% size:87.5%
that is what is most
meaningful in the way in

09:34.933 --> 09:38.733 align:left position:12.5%,start line:80% size:87.5%
which those algorithms,
if not audited,

09:38.833 --> 09:41.100 align:left position:12.5%,start line:80% size:87.5%
if not improved from
what you described,

09:41.200 --> 09:43.733 align:left position:25%,start line:80% size:75%
could really
cause harm to people.

09:43.833 --> 09:45.933 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: One of
the reasons I was quite

09:46.033 --> 09:47.666 align:left position:25%,start line:80% size:75%
concerned is I
learned that in the US,

09:47.766 --> 09:50.866 align:left position:12.5%,start line:80% size:87.5%
there are no federal
regulations for facial

09:50.966 --> 09:53.300 align:left position:12.5%,start line:80% size:87.5%
recognition technology.
So you have...

09:53.400 --> 09:54.633 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: To this day?

09:54.733 --> 09:57.933 align:left position:0%,start line:80% size:100%
BUOLAMWINI: To this day, there
are still no federal regulations.

09:58.033 --> 10:00.566 align:left position:25%,start line:80% size:75%
So you have a
space where companies,

10:00.666 --> 10:03.833 align:left position:25%,start line:80% size:75%
right, can sell
systems to government

10:03.933 --> 10:06.466 align:left position:12.5%,start line:80% size:87.5%
entities and other types
of organizations without

10:06.566 --> 10:08.733 align:left position:12.5%,start line:80% size:87.5%
any kind of oversight.

10:08.833 --> 10:10.633 align:left position:12.5%,start line:80% size:87.5%
So let's talk about the
implications of this.

10:10.733 --> 10:14.166 align:left position:0%,start line:80% size:100%
Here's one example, you have a
company called HireVue.

10:14.266 --> 10:15.933 align:left position:25%,start line:80% size:75%
They have over
600 clients,

10:16.033 --> 10:18.933 align:left position:25%,start line:80% size:75%
including people
like Unilever and Nike.

10:19.033 --> 10:23.400 align:left position:12.5%,start line:80% size:87.5%
They use AI to help with
hiring decisions and in

10:23.500 --> 10:25.766 align:left position:0%,start line:80% size:100%
their marketing materials,
their own marketing

10:25.866 --> 10:29.400 align:left position:12.5%,start line:80% size:87.5%
materials say they use
verbal and nonverbal cues

10:29.500 --> 10:31.566 align:left position:12.5%,start line:80% size:87.5%
to help you better
understand people's

10:31.666 --> 10:34.466 align:left position:12.5%,start line:80% size:87.5%
problem-solving ability
in all sorts of things

10:34.566 --> 10:39.600 align:left position:0%,start line:80% size:100%
from their facial movements and
other cues, and in reports

10:39.700 --> 10:41.966 align:left position:12.5%,start line:80% size:87.5%
about the system they
say the way they train

10:42.066 --> 10:46.266 align:left position:12.5%,start line:80% size:87.5%
the system is by looking
at current top performers,

10:46.366 --> 10:49.433 align:left position:12.5%,start line:80% size:87.5%
so given everything we
know about how bias can be

10:49.533 --> 10:52.866 align:left position:12.5%,start line:80% size:87.5%
reflected when you have
homogeneous groups of

10:52.966 --> 10:56.833 align:left position:12.5%,start line:80% size:87.5%
people in the data set, a
worry for me is that this

10:56.933 --> 10:59.433 align:left position:12.5%,start line:80% size:87.5%
system, which is being
deployed to hopefully

10:59.533 --> 11:01.933 align:left position:12.5%,start line:80% size:87.5%
increase diversity
and reduce bias,

11:02.033 --> 11:04.700 align:left position:12.5%,start line:80% size:87.5%
could actually do the opposite
and be in breach

11:04.800 --> 11:07.466 align:left position:12.5%,start line:80% size:87.5%
of Title VII of the
Civil Rights Act, right?

11:07.566 --> 11:10.366 align:left position:12.5%,start line:80% size:87.5%
Which says you can't
discriminate by skin type.

11:10.466 --> 11:13.366 align:left position:0%,start line:80% size:100%
We show these algorithms can
have issues with skin type.

11:13.466 --> 11:14.966 align:left position:37.5%,start line:80% size:62.5%
You can't
discriminate by gender.

11:15.066 --> 11:17.600 align:left position:12.5%,start line:80% size:87.5%
We show these algorithms
can have these sorts of

11:17.700 --> 11:20.866 align:left position:12.5%,start line:80% size:87.5%
issues, so here you might
not necessarily know this

11:20.966 --> 11:23.666 align:left position:12.5%,start line:80% size:87.5%
is even going on, which
is another issue and where

11:23.766 --> 11:25.500 align:left position:25%,start line:80% size:75%
policy can
come into place.

11:25.600 --> 11:28.166 align:left position:12.5%,start line:80% size:87.5%
Policy can say we need
affirmative consent.

11:28.266 --> 11:29.233 align:left position:25%,start line:80% size:75%
HEFFNER: Right.

11:29.333 --> 11:30.733 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: If my face
is going to be scanned

11:30.833 --> 11:34.133 align:left position:0%,start line:80% size:100%
in the first place, do I know?
Let's look at Facebook.

11:34.233 --> 11:39.400 align:left position:0%,start line:80% size:100%
Facebook has one of the largest
stores of labeled data of faces.

11:39.500 --> 11:40.500 align:left position:25%,start line:80% size:75%
How and why?

11:40.600 --> 11:44.033 align:left position:12.5%,start line:80% size:87.5%
Well we've been uploading
our images and tagging

11:44.133 --> 11:47.666 align:left position:0%,start line:80% size:100%
people, right, and this has
enabled Facebook to

11:47.766 --> 11:50.466 align:left position:25%,start line:80% size:75%
get very valuable
information and develop

11:50.566 --> 11:52.866 align:left position:0%,start line:80% size:100%
facial analysis technology.

11:52.966 --> 11:56.700 align:left position:0%,start line:80% size:100%
In 2014, Facebook came out
with a paper called

11:56.800 --> 12:00.600 align:left position:12.5%,start line:80% size:87.5%
DeepFace that showed a marked
improvement in facial

12:00.700 --> 12:02.433 align:left position:12.5%,start line:80% size:87.5%
recognition abilities.

12:02.533 --> 12:04.300 align:left position:12.5%,start line:80% size:87.5%
And where did that marked
improvement come from?

12:04.400 --> 12:07.266 align:left position:25%,start line:80% size:75%
Having access
to more data, our data.

12:07.366 --> 12:11.366 align:left position:12.5%,start line:80% size:87.5%
So now Facebook actually
stores a unique face print

12:11.466 --> 12:13.700 align:left position:25%,start line:80% size:75%
of your face,
your image, right?

12:13.800 --> 12:16.533 align:left position:0%,start line:80% size:100%
While you're on the system
you might be able to

12:16.633 --> 12:18.733 align:left position:25%,start line:80% size:75%
opt out if you go
to certain settings,

12:18.833 --> 12:21.133 align:left position:12.5%,start line:80% size:87.5%
but that doesn't mean
they're deleting your

12:21.233 --> 12:23.666 align:left position:12.5%,start line:80% size:87.5%
biometric information.

12:23.766 --> 12:26.000 align:left position:0%,start line:80% size:100%
This biometric information
could be used by law

12:26.100 --> 12:29.200 align:left position:12.5%,start line:80% size:87.5%
enforcement, it could be
sold to other companies,

12:29.300 --> 12:32.333 align:left position:12.5%,start line:80% size:87.5%
and so we don't even have
a sense of what's going on

12:32.433 --> 12:33.400 align:left position:12.5%,start line:80% size:87.5%
in the first place.

12:33.500 --> 12:35.666 align:left position:12.5%,start line:80% size:87.5%
Going back to the hiring
example, you show up.

12:35.766 --> 12:37.500 align:left position:25%,start line:80% size:75%
They say, oh, we
have this cool new app,

12:37.600 --> 12:40.100 align:left position:12.5%,start line:80% size:87.5%
just do the interview
and you should be fine.

12:40.200 --> 12:43.000 align:left position:12.5%,start line:80% size:87.5%
Do you even know they're
running AI analytics in

12:43.100 --> 12:46.800 align:left position:12.5%,start line:80% size:87.5%
the first place? And if
you have a poor decision,

12:46.900 --> 12:48.533 align:left position:0%,start line:80% size:100%
right, how do you contest?

12:48.633 --> 12:50.833 align:left position:12.5%,start line:80% size:87.5%
So that's another place
that policy can come into

12:50.933 --> 12:53.000 align:left position:25%,start line:80% size:75%
play when it
comes to due process.

12:53.100 --> 12:54.100 align:left position:25%,start line:80% size:75%
HEFFNER: Right.

12:54.200 --> 12:55.300 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: Because right
now if you don't know

12:55.400 --> 12:59.633 align:left position:12.5%,start line:80% size:87.5%
there's AI in play, who do
you go to? What do you ask?

12:59.733 --> 13:02.100 align:left position:25%,start line:80% size:75%
HEFFNER: Now that
organizations like the

13:02.200 --> 13:05.633 align:left position:12.5%,start line:80% size:87.5%
ones you mentioned, but
also Facebook and the

13:05.733 --> 13:09.500 align:left position:12.5%,start line:80% size:87.5%
newer companies have the
storehouse of data,

13:09.600 --> 13:13.700 align:left position:12.5%,start line:80% size:87.5%
how are you advocating
that it be managed

13:13.800 --> 13:18.000 align:left position:0%,start line:80% size:100%
ethically now that it is the
property of the companies?

13:18.100 --> 13:19.100 align:left position:25%,start line:80% size:75%
BUOLAMWINI: Sure.

13:19.200 --> 13:22.500 align:left position:12.5%,start line:80% size:87.5%
One of the first things
is affirmative consent.

13:22.600 --> 13:24.633 align:left position:12.5%,start line:80% size:87.5%
Do we know how these
systems are being used?

13:24.733 --> 13:27.266 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: But it needs
almost to be retroactive

13:27.366 --> 13:30.533 align:left position:12.5%,start line:80% size:87.5%
to every image that
was ever recorded.

13:30.633 --> 13:31.266 align:left position:0%,start line:80% size:100%
Not... you know what I mean?

13:31.366 --> 13:32.700 align:left position:12.5%,start line:80% size:87.5%
Not just new consent.

13:32.800 --> 13:35.033 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: True, but
there are also systems

13:35.133 --> 13:37.333 align:left position:12.5%,start line:80% size:87.5%
that we don't necessarily
know how they're being

13:37.433 --> 13:38.400 align:left position:12.5%,start line:80% size:87.5%
employed right now.

13:38.500 --> 13:40.733 align:left position:12.5%,start line:80% size:87.5%
So even in New York
there's a bill

13:40.833 --> 13:43.833 align:left position:12.5%,start line:80% size:87.5%
that has been proposed
to say in a place like

13:43.933 --> 13:46.266 align:left position:25%,start line:80% size:75%
Madison Square
Garden, right,

13:46.366 --> 13:49.600 align:left position:12.5%,start line:80% size:87.5%
where reports have shown
facial analysis technology

13:49.700 --> 13:52.300 align:left position:12.5%,start line:80% size:87.5%
is being used even though
it hadn't necessarily been

13:52.400 --> 13:55.500 align:left position:0%,start line:80% size:100%
disclosed that it's being used
in the first place, right.

13:55.600 --> 13:58.866 align:left position:12.5%,start line:80% size:87.5%
So I do think it's
important that we say even

13:58.966 --> 14:01.633 align:left position:12.5%,start line:80% size:87.5%
though these systems have
been deployed without our

14:01.733 --> 14:04.800 align:left position:12.5%,start line:80% size:87.5%
knowing, starting at a
place where we're aware of

14:04.900 --> 14:07.833 align:left position:25%,start line:80% size:75%
their use is
absolutely important.

14:07.933 --> 14:08.900 align:left position:12.5%,start line:80% size:87.5%
Now there's another step.

14:09.000 --> 14:10.633 align:left position:25%,start line:80% size:75%
You know where
it's being used, right?

14:10.733 --> 14:12.633 align:left position:12.5%,start line:80% size:87.5%
Do you have voice? Do
you have control?

14:12.733 --> 14:13.866 align:left position:12.5%,start line:80% size:87.5%
Do you have consent?

14:13.966 --> 14:16.166 align:left position:12.5%,start line:80% size:87.5%
So for Facebook, what I
would like to see is the

14:16.266 --> 14:19.466 align:left position:12.5%,start line:80% size:87.5%
option to purge your
biometric data, right?

14:19.566 --> 14:22.433 align:left position:12.5%,start line:80% size:87.5%
So you don't necessarily
have to say I'm using the

14:22.533 --> 14:26.466 align:left position:12.5%,start line:80% size:87.5%
service, and by my using the
service you automatically

14:26.566 --> 14:30.700 align:left position:0%,start line:80% size:100%
get to keep and store biometric
data about me. Right.

14:30.800 --> 14:32.833 align:left position:12.5%,start line:80% size:87.5%
So even though they've
been doing that in the

14:32.933 --> 14:36.300 align:left position:12.5%,start line:80% size:87.5%
past, it doesn't mean we
can't change the practice

14:36.400 --> 14:37.366 align:left position:25%,start line:80% size:75%
moving forward.

14:37.466 --> 14:39.333 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: How about those
other companies, the

14:39.433 --> 14:42.033 align:left position:25%,start line:80% size:75%
ones to which you
sent your research?

14:42.133 --> 14:45.533 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: So I sent my
research to Microsoft,

14:45.633 --> 14:48.400 align:left position:12.5%,start line:80% size:87.5%
I sent it to IBM; I
sent it to Face Plus Plus.

14:48.500 --> 14:52.600 align:left position:12.5%,start line:80% size:87.5%
Microsoft and IBM got
back to me saying this is

14:52.700 --> 14:53.933 align:left position:12.5%,start line:80% size:87.5%
something we care about.

14:54.033 --> 14:56.200 align:left position:12.5%,start line:80% size:87.5%
And again, I mentioned
the Intercept article that

14:56.300 --> 14:57.266 align:left position:12.5%,start line:80% size:87.5%
came out with IBM.

14:57.366 --> 15:01.066 align:left position:12.5%,start line:80% size:87.5%
They are selling these
systems and so in some

15:01.166 --> 15:03.233 align:left position:25%,start line:80% size:75%
cases when I
talk to companies,

15:03.333 --> 15:06.433 align:left position:12.5%,start line:80% size:87.5%
I use the term the
under-sampled majority,

15:06.533 --> 15:10.133 align:left position:12.5%,start line:80% size:87.5%
so if you're missing
women and people of color,

15:10.233 --> 15:14.400 align:left position:0%,start line:80% size:100%
right, this isn't necessarily a
minority concern.

15:14.500 --> 15:15.633 align:left position:12.5%,start line:80% size:87.5%
But again, are you...

15:15.733 --> 15:17.333 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: So how do you
ensure that the new

15:17.433 --> 15:20.033 align:left position:12.5%,start line:80% size:87.5%
algorithms they're coming
up with are ethical?

15:20.133 --> 15:24.033 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: So when we're
thinking about ethical

15:24.133 --> 15:26.700 align:left position:12.5%,start line:80% size:87.5%
algorithms, we have to
think about systems

15:26.800 --> 15:27.866 align:left position:25%,start line:80% size:75%
and not products.

15:27.966 --> 15:30.166 align:left position:25%,start line:80% size:75%
So if you just
think about a product,

15:30.266 --> 15:31.700 align:left position:25%,start line:80% size:75%
right, then
you're like, okay,

15:31.800 --> 15:33.900 align:left position:12.5%,start line:80% size:87.5%
what kind of bias
does this product have,

15:34.000 --> 15:36.533 align:left position:0%,start line:80% size:100%
etc. and so forth, but the
questions we really should

15:36.633 --> 15:39.900 align:left position:12.5%,start line:80% size:87.5%
be asking are how are
these products being

15:40.000 --> 15:42.500 align:left position:37.5%,start line:80% size:62.5%
designed,
developed and deployed?

15:42.600 --> 15:45.400 align:left position:12.5%,start line:80% size:87.5%
What are the mechanisms
for oversight or

15:45.500 --> 15:47.700 align:left position:25%,start line:80% size:75%
accountability in
the first place?

15:47.800 --> 15:50.900 align:left position:12.5%,start line:80% size:87.5%
So to deploy ethical
facial recognition,

15:51.000 --> 15:52.933 align:left position:12.5%,start line:80% size:87.5%
I've been coming up with
something called the Safe

15:53.033 --> 15:55.700 align:left position:12.5%,start line:80% size:87.5%
Face Pledge, and with
the Safe Face Pledge,

15:55.800 --> 15:58.300 align:left position:12.5%,start line:80% size:87.5%
there are four major
things we're asking for.

15:58.400 --> 16:01.633 align:left position:0%,start line:80% size:100%
The first thing is to show
value for human life,

16:01.733 --> 16:04.800 align:left position:12.5%,start line:80% size:87.5%
dignity and rights,
right, and this means not

16:04.900 --> 16:07.933 align:left position:0%,start line:80% size:100%
developing facial analysis
technology for use for

16:08.033 --> 16:10.200 align:left position:0%,start line:80% size:100%
lethal autonomous weapons.

16:10.300 --> 16:13.033 align:left position:12.5%,start line:80% size:87.5%
Let's say there are
categorical areas where we do

16:13.133 --> 16:14.733 align:left position:25%,start line:80% size:75%
not want to use
the technology.

16:14.833 --> 16:17.266 align:left position:25%,start line:80% size:75%
This is one thing
companies can step up and

16:17.366 --> 16:21.066 align:left position:12.5%,start line:80% size:87.5%
say, we won't supply
facial analysis technology

16:21.166 --> 16:23.266 align:left position:12.5%,start line:80% size:87.5%
in a way that could
lead to bodily harm, right?

16:23.366 --> 16:25.966 align:left position:25%,start line:80% size:75%
Like, let's make
categorical bans.

16:26.066 --> 16:28.200 align:left position:12.5%,start line:80% size:87.5%
Those are some steps
that you can take.

16:28.300 --> 16:31.000 align:left position:12.5%,start line:80% size:87.5%
The other thing is to
address bias continuously,

16:31.100 --> 16:33.333 align:left position:25%,start line:80% size:75%
which means
doing it internally,

16:33.433 --> 16:35.166 align:left position:12.5%,start line:80% size:87.5%
right, where you're
checking throughout the

16:35.266 --> 16:37.533 align:left position:25%,start line:80% size:75%
entire design
process, design,

16:37.633 --> 16:40.000 align:left position:12.5%,start line:80% size:87.5%
development, deployment
of these systems,

16:40.100 --> 16:42.366 align:left position:12.5%,start line:80% size:87.5%
and you're not reacting
the way companies had to

16:42.466 --> 16:44.366 align:left position:25%,start line:80% size:75%
do when Gender
Shades came out,

16:44.466 --> 16:47.000 align:left position:12.5%,start line:80% size:87.5%
but it's actually part of
your process, right?

16:47.100 --> 16:49.933 align:left position:12.5%,start line:80% size:87.5%
Not an end product, but a
process of continuously

16:50.033 --> 16:51.366 align:left position:12.5%,start line:80% size:87.5%
checking for bias.

16:51.466 --> 16:53.466 align:left position:12.5%,start line:80% size:87.5%
The other thing is
facilitating transparency,

16:53.566 --> 16:56.433 align:left position:12.5%,start line:80% size:87.5%
so submitting your models
to the National Institute

16:56.533 --> 16:58.600 align:left position:25%,start line:80% size:75%
of Standards and
Technology or other

16:58.700 --> 17:01.500 align:left position:12.5%,start line:80% size:87.5%
standards bodies so we
have a better sense of how

17:01.600 --> 17:05.066 align:left position:12.5%,start line:80% size:87.5%
they're actually working
and also reporting on real

17:05.166 --> 17:07.466 align:left position:12.5%,start line:80% size:87.5%
world deployment because
the benchmarks will only

17:07.566 --> 17:08.966 align:left position:25%,start line:80% size:75%
tell you so much.

17:09.066 --> 17:10.900 align:left position:12.5%,start line:80% size:87.5%
And then the final thing
is actually embedding

17:11.000 --> 17:14.966 align:left position:0%,start line:80% size:100%
these types of practices into
their contracts, right?

17:15.066 --> 17:19.733 align:left position:12.5%,start line:80% size:87.5%
Where you say we provide
a service that another

17:19.833 --> 17:22.633 align:left position:0%,start line:80% size:100%
company can integrate into
their product and this

17:22.733 --> 17:25.366 align:left position:12.5%,start line:80% size:87.5%
means bias can propagate
rapidly, right?

17:25.466 --> 17:29.166 align:left position:12.5%,start line:80% size:87.5%
So a company like Amazon
might have a cloud service

17:29.266 --> 17:31.200 align:left position:12.5%,start line:80% size:87.5%
that gives you facial
recognition and then you

17:31.300 --> 17:34.466 align:left position:0%,start line:80% size:100%
have thousands of companies
that are using that service.

17:34.566 --> 17:37.333 align:left position:12.5%,start line:80% size:87.5%
So if you can say, okay,
we also require ethical

17:37.433 --> 17:39.866 align:left position:25%,start line:80% size:75%
use of these
systems we're providing,

17:39.966 --> 17:41.566 align:left position:25%,start line:80% size:75%
I think that can
go a long way.

17:41.666 --> 17:43.766 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: Joy, we're
talking about an Internet

17:43.866 --> 17:46.400 align:left position:25%,start line:80% size:75%
Bill of Rights in
effect broadly,

17:46.500 --> 17:52.233 align:left position:12.5%,start line:80% size:87.5%
or at least the right of
the user to know how the

17:52.333 --> 17:55.066 align:left position:12.5%,start line:80% size:87.5%
algorithm is functioning
and how it might

17:55.166 --> 17:57.933 align:left position:12.5%,start line:80% size:87.5%
practically affect
them in their life.

17:58.033 --> 18:00.800 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: And not only
to know, to have agency.

18:00.900 --> 18:02.066 align:left position:25%,start line:80% size:75%
HEFFNER: And to
have agency.

18:02.166 --> 18:04.966 align:left position:25%,start line:80% size:75%
So how do you
collect those imperatives,

18:05.066 --> 18:08.100 align:left position:0%,start line:80% size:100%
they're really imperatives
as you described them in a

18:08.200 --> 18:13.066 align:left position:12.5%,start line:80% size:87.5%
way that can be tangible
beyond the internal

18:13.166 --> 18:14.966 align:left position:25%,start line:80% size:75%
conduct of
these companies,

18:15.066 --> 18:16.600 align:left position:12.5%,start line:80% size:87.5%
which is important, what
we've talked about with

18:16.700 --> 18:20.600 align:left position:25%,start line:80% size:75%
IBM and Facebook,
beyond internal reform,

18:20.700 --> 18:23.366 align:left position:25%,start line:80% size:75%
what is your hope
for external action?

18:23.466 --> 18:25.533 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: GDPR, the
General Data Protection

18:25.633 --> 18:28.233 align:left position:12.5%,start line:80% size:87.5%
Regulation that came out
of the European Union,

18:28.333 --> 18:31.300 align:left position:12.5%,start line:80% size:87.5%
attempts to say
what does it look like to

18:31.400 --> 18:34.100 align:left position:25%,start line:80% size:75%
empower a data
citizen? Right?

18:34.200 --> 18:37.966 align:left position:12.5%,start line:80% size:87.5%
So if you are a citizen
of the European Union,

18:38.066 --> 18:41.033 align:left position:0%,start line:80% size:100%
these are certain rights that
you have with your data.

18:41.133 --> 18:42.733 align:left position:25%,start line:80% size:75%
And that's
been interesting.

18:42.833 --> 18:46.833 align:left position:0%,start line:80% size:100%
You might've seen earlier this
year when it was enacted.

18:46.933 --> 18:48.366 align:left position:25%,start line:80% size:75%
Maybe you got a
ton of emails,

18:48.466 --> 18:50.866 align:left position:12.5%,start line:80% size:87.5%
right, telling you our
privacy policy has been

18:50.966 --> 18:52.300 align:left position:12.5%,start line:80% size:87.5%
updated and so forth.

18:52.400 --> 18:55.500 align:left position:12.5%,start line:80% size:87.5%
And I believe people who
are in violation of GDPR

18:55.600 --> 19:00.333 align:left position:0%,start line:80% size:100%
face a 20 million Euro fine or
four percent of global turnover.

19:00.433 --> 19:04.366 align:left position:0%,start line:80% size:100%
So it's not without consequence
to be in breach of these regulations.

19:04.466 --> 19:07.366 align:left position:12.5%,start line:80% size:87.5%
What's been interesting
to me is looking at

19:07.466 --> 19:09.800 align:left position:12.5%,start line:80% size:87.5%
conversations that have
been happening in the EU,

19:09.900 --> 19:12.400 align:left position:12.5%,start line:80% size:87.5%
at the UN; there's the
Montreal Declaration.

19:12.500 --> 19:15.466 align:left position:12.5%,start line:80% size:87.5%
There are quite a few
declarations out there

19:15.566 --> 19:19.266 align:left position:12.5%,start line:80% size:87.5%
talking about principles
for good governance of AI

19:19.366 --> 19:22.133 align:left position:12.5%,start line:80% size:87.5%
or more ethical AI.

19:22.233 --> 19:25.300 align:left position:12.5%,start line:80% size:87.5%
They tend to come from
European countries, right?

19:25.400 --> 19:29.866 align:left position:12.5%,start line:80% size:87.5%
Or western countries, and
I'm quite concerned about

19:29.966 --> 19:34.800 align:left position:12.5%,start line:80% size:87.5%
what the implications are of
having these systems, or

19:34.900 --> 19:38.200 align:left position:0%,start line:80% size:100%
like a GDPR that's focused
on European citizens.

19:38.300 --> 19:40.000 align:left position:12.5%,start line:80% size:87.5%
What does it mean for
the rest of the world?

19:40.100 --> 19:41.700 align:left position:25%,start line:80% size:75%
So let me give
you an example.

19:41.800 --> 19:44.966 align:left position:12.5%,start line:80% size:87.5%
Let's say you have data
protections for European

19:45.066 --> 19:48.533 align:left position:0%,start line:80% size:100%
citizens, but companies want to
go gather data, right?

19:48.633 --> 19:50.566 align:left position:25%,start line:80% size:75%
So where's the
next place you go?

19:50.666 --> 19:52.800 align:left position:25%,start line:80% size:75%
You go to the
global south;

19:52.900 --> 19:54.566 align:left position:12.5%,start line:80% size:87.5%
you're starting to
see this with facial

19:54.666 --> 19:56.766 align:left position:25%,start line:80% size:75%
recognition
systems right now,

19:56.866 --> 20:00.833 align:left position:0%,start line:80% size:100%
where you have Chinese companies
going to African nations.

20:00.933 --> 20:03.666 align:left position:12.5%,start line:80% size:87.5%
So there's one instance,
we have a Chinese company

20:03.766 --> 20:06.866 align:left position:12.5%,start line:80% size:87.5%
going to Zimbabwe saying,
we'll give you, right, this

20:06.966 --> 20:09.500 align:left position:0%,start line:80% size:100%
facial analysis technology
in return for something

20:09.600 --> 20:12.433 align:left position:25%,start line:80% size:75%
very precious,
which is the data, right?

20:12.533 --> 20:14.000 align:left position:0%,start line:80% size:100%
The data of your citizens.

20:14.100 --> 20:16.866 align:left position:12.5%,start line:80% size:87.5%
So for me, what I'm
starting to see is almost

20:16.966 --> 20:18.866 align:left position:12.5%,start line:80% size:87.5%
like a parallel, right?

20:18.966 --> 20:23.000 align:left position:0%,start line:80% size:100%
A bit to the transatlantic
slave trade where you have

20:23.100 --> 20:27.966 align:left position:0%,start line:80% size:100%
bodies but now digital bodies
being sourced and exploited.

20:28.066 --> 20:28.966 align:left position:12.5%,start line:80% size:87.5%
Right?

20:29.066 --> 20:31.033 align:left position:12.5%,start line:80% size:87.5%
And then being used
in other systems....

20:31.133 --> 20:32.666 align:left position:25%,start line:80% size:75%
HEFFNER: I had
asked you when we met,

20:32.766 --> 20:35.400 align:left position:25%,start line:80% size:75%
if you had seen
Mr. Robot, have you?

20:35.500 --> 20:39.333 align:left position:0%,start line:80% size:100%
I told you -- I'll invite you
here if you go see Mr. Robot!

20:39.433 --> 20:42.333 align:left position:12.5%,start line:80% size:87.5%
(laughs) Your Algorithmic
Justice League is the

20:42.433 --> 20:45.700 align:left position:12.5%,start line:80% size:87.5%
antidote to what is a
dystopian future that you

20:45.800 --> 20:49.066 align:left position:25%,start line:80% size:75%
describe when in
effect the Chinese are

20:49.166 --> 20:51.600 align:left position:25%,start line:80% size:75%
harvesting, not
organs, but all the data

20:51.700 --> 20:54.400 align:left position:12.5%,start line:80% size:87.5%
associated with people,
just what you described.

20:54.500 --> 20:56.800 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: I call this
the pending exploitation

20:56.900 --> 20:59.566 align:left position:12.5%,start line:80% size:87.5%
of the data wealth
of the global south,

20:59.666 --> 21:00.933 align:left position:0%,start line:80% size:100%
but it's actually happening.

21:01.033 --> 21:02.866 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: So whether
you want to call it a

21:02.966 --> 21:05.566 align:left position:0%,start line:80% size:100%
dystopia or not, I'll take
that liberty and say that

21:05.666 --> 21:06.800 align:left position:12.5%,start line:80% size:87.5%
is a kind of dystopia

21:06.900 --> 21:09.000 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: That is
happening...

21:09.100 --> 21:10.600 align:left position:25%,start line:80% size:75%
HEFFNER: That is
happening in real time.

21:10.700 --> 21:12.900 align:left position:25%,start line:80% size:75%
But from the
American perspective,

21:13.000 --> 21:14.200 align:left position:25%,start line:80% size:75%
we have some
minutes remaining.

21:14.300 --> 21:17.433 align:left position:25%,start line:80% size:75%
So my question
more specifically is,

21:17.533 --> 21:21.266 align:left position:12.5%,start line:80% size:87.5%
who in American politics
is demonstrably concerned

21:21.366 --> 21:23.733 align:left position:25%,start line:80% size:75%
about this issue
and acting on it,

21:23.833 --> 21:25.466 align:left position:25%,start line:80% size:75%
any particular
politicians?

21:25.566 --> 21:27.866 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: So this
summer I actually had the

21:27.966 --> 21:30.166 align:left position:25%,start line:80% size:75%
opportunity to
brief staffers of the

21:30.266 --> 21:33.333 align:left position:0%,start line:80% size:100%
Congressional Black Caucus
and also shared some of my

21:33.433 --> 21:37.433 align:left position:12.5%,start line:80% size:87.5%
research findings with
U.S. Senator Kamala Harris,

21:37.533 --> 21:40.100 align:left position:12.5%,start line:80% size:87.5%
and she, along with
seven other senators,

21:40.200 --> 21:42.833 align:left position:12.5%,start line:80% size:87.5%
wrote letters to the
FBI, to the Federal Trade

21:42.933 --> 21:47.433 align:left position:0%,start line:80% size:100%
Commission and to the EEOC
specifically asking that

21:47.533 --> 21:50.500 align:left position:12.5%,start line:80% size:87.5%
they look at the risks
posed by facial analysis

21:50.600 --> 21:54.100 align:left position:12.5%,start line:80% size:87.5%
technologies: could these
breach civil liberties and

21:54.200 --> 21:57.900 align:left position:12.5%,start line:80% size:87.5%
civil rights laws?
And they absolutely can.

21:58.000 --> 22:02.300 align:left position:12.5%,start line:80% size:87.5%
So we do have people in
congress who are concerned

22:02.400 --> 22:06.033 align:left position:12.5%,start line:80% size:87.5%
and are pushing more
for regulation and more

22:06.133 --> 22:08.366 align:left position:12.5%,start line:80% size:87.5%
government consideration
about these systems.

22:08.466 --> 22:11.533 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: Are there
any draft pieces of

22:11.633 --> 22:13.566 align:left position:25%,start line:80% size:75%
legislation
that would help?

22:13.666 --> 22:14.666 align:left position:25%,start line:80% size:75%
BUOLAMWINI: Sure.

22:14.766 --> 22:18.466 align:left position:12.5%,start line:80% size:87.5%
So Georgetown Law in 2016
when they released the

22:18.566 --> 22:20.100 align:left position:25%,start line:80% size:75%
"Perpetual
Lineup" report, right,

22:20.200 --> 22:22.966 align:left position:12.5%,start line:80% size:87.5%
the one showing over 100
police departments using

22:23.066 --> 22:24.900 align:left position:37.5%,start line:80% size:62.5%
facial
analysis technology,

22:25.000 --> 22:27.133 align:left position:25%,start line:80% size:75%
unregulated,
unwarranted, all of that.

22:27.233 --> 22:30.566 align:left position:12.5%,start line:80% size:87.5%
They actually proposed
draft legislation for

22:30.666 --> 22:34.400 align:left position:0%,start line:80% size:100%
facial analysis technology
that could fill in this

22:34.500 --> 22:36.933 align:left position:0%,start line:80% size:100%
gap we currently have where
we have no federal laws.

22:37.033 --> 22:39.300 align:left position:0%,start line:80% size:100%
HEFFNER: And what did
that espouse?

22:39.400 --> 22:41.900 align:left position:0%,start line:80% size:100%
BUOLAMWINI: First, it was
to set a high standard for

22:42.000 --> 22:45.633 align:left position:25%,start line:80% size:75%
the use of facial
analysis technology in
the first place. Right.

22:45.733 --> 22:48.200 align:left position:12.5%,start line:80% size:87.5%
Is there imminent danger?
Is there an immediate threat

22:48.300 --> 22:49.266 align:left position:25%,start line:80% size:75%
to bodily harm?

22:49.366 --> 22:52.666 align:left position:12.5%,start line:80% size:87.5%
Do you have a warrant to
even use this technology

22:52.766 --> 22:53.900 align:left position:12.5%,start line:80% size:87.5%
in the first place?

22:54.000 --> 22:56.633 align:left position:12.5%,start line:80% size:87.5%
Another component is
actually checking the

22:56.733 --> 22:59.566 align:left position:12.5%,start line:80% size:87.5%
accuracy of these systems
in the first place, right?

22:59.666 --> 23:01.400 align:left position:25%,start line:80% size:75%
You say we want
these systems to help,

23:01.500 --> 23:03.900 align:left position:12.5%,start line:80% size:87.5%
but are you sending
parachutes with holes

23:04.000 --> 23:04.966 align:left position:12.5%,start line:80% size:87.5%
in the first place?

23:05.066 --> 23:08.766 align:left position:12.5%,start line:80% size:87.5%
So I think it was a good
draft proposal, right?

23:08.866 --> 23:12.000 align:left position:0%,start line:80% size:100%
Showing some substantive action
that can be taken now.

23:12.100 --> 23:14.400 align:left position:25%,start line:80% size:75%
HEFFNER: How do
you see, if at all,

23:14.500 --> 23:18.733 align:left position:12.5%,start line:80% size:87.5%
the facial recognition
concerns intersecting

23:18.833 --> 23:20.333 align:left position:25%,start line:80% size:75%
the banking and
financial sector?

23:20.433 --> 23:22.633 align:left position:12.5%,start line:80% size:87.5%
We've talked about it in
the context of criminal

23:22.733 --> 23:26.766 align:left position:12.5%,start line:80% size:87.5%
justice and employment,
and mentioned the EEOC in your

23:26.866 --> 23:28.933 align:left position:25%,start line:80% size:75%
correspondence
with Senator Harris.

23:29.033 --> 23:32.600 align:left position:25%,start line:80% size:75%
But I'm going
to the point of,

23:32.700 --> 23:34.633 align:left position:12.5%,start line:80% size:87.5%
of Mr. Robot, which
maybe you'll watch now,

23:34.733 --> 23:38.100 align:left position:12.5%,start line:80% size:87.5%
(laughter) is given the
inequities that exist in

23:38.200 --> 23:43.266 align:left position:12.5%,start line:80% size:87.5%
our society, unlike in
the projected dystopia

23:43.366 --> 23:47.033 align:left position:12.5%,start line:80% size:87.5%
of Mr. Robot where
all debt is just cleared

23:47.133 --> 23:50.800 align:left position:0%,start line:80% size:100%
and, you know, all bank
accounts are, in effect,
equalized.

23:50.900 --> 23:54.066 align:left position:12.5%,start line:80% size:87.5%
It's more likely from the
American perspective that

23:54.166 --> 23:57.766 align:left position:12.5%,start line:80% size:87.5%
those without means or
with less means are going

23:57.866 --> 24:03.266 align:left position:12.5%,start line:80% size:87.5%
to suffer, in the case of
a hack or in the case of

24:03.366 --> 24:06.500 align:left position:25%,start line:80% size:75%
an economic Pearl
Harbor or 9/11 scenario.

24:06.600 --> 24:11.600 align:left position:0%,start line:80% size:100%
How, if at all, will AI be
relevant to the future of

24:15.333 --> 24:17.266 align:left position:25%,start line:80% size:75%
the banking and
financial sector?

24:17.366 --> 24:18.366 align:left position:25%,start line:80% size:75%
BUOLAMWINI: Yes.

24:18.466 --> 24:21.966 align:left position:12.5%,start line:80% size:87.5%
So this summer I had the
opportunity to share my

24:22.066 --> 24:25.700 align:left position:12.5%,start line:80% size:87.5%
research at a conference
called Fund Forum

24:25.800 --> 24:28.766 align:left position:12.5%,start line:80% size:87.5%
International where you
have some of the leading

24:28.866 --> 24:31.933 align:left position:12.5%,start line:80% size:87.5%
hedge fund managers,
etc., excited about the

24:32.033 --> 24:36.266 align:left position:12.5%,start line:80% size:87.5%
possibilities of AI for
financial services, right?

24:36.366 --> 24:40.666 align:left position:12.5%,start line:80% size:87.5%
So whether it's seeing
how creditworthy somebody

24:40.766 --> 24:43.133 align:left position:25%,start line:80% size:75%
might be for a
particular opportunity,

24:43.233 --> 24:46.900 align:left position:12.5%,start line:80% size:87.5%
so as you're having AI
influence decisions about

24:47.000 --> 24:48.366 align:left position:25%,start line:80% size:75%
whether you have
access to credit,

24:48.466 --> 24:50.033 align:left position:37.5%,start line:80% size:62.5%
access to
loans, et cetera,

24:50.133 --> 24:53.233 align:left position:12.5%,start line:80% size:87.5%
and so forth, any type
of systematic bias

24:53.333 --> 24:57.066 align:left position:0%,start line:80% size:100%
that's embedded within AI can
lead to digital redlining.

24:57.166 --> 25:00.000 align:left position:12.5%,start line:80% size:87.5%
So because of something
about your identity but

25:00.100 --> 25:02.800 align:left position:25%,start line:80% size:75%
not necessarily
your own actions,

25:02.900 --> 25:07.600 align:left position:12.5%,start line:80% size:87.5%
you are denied access to
something like financial
services.

25:07.700 --> 25:09.366 align:left position:25%,start line:80% size:75%
I think another
way to look at it,

25:09.466 --> 25:11.900 align:left position:25%,start line:80% size:75%
if we're thinking
about economic impact,

25:12.000 --> 25:16.100 align:left position:12.5%,start line:80% size:87.5%
is the use of AI to look
at social media to infer

25:16.200 --> 25:18.433 align:left position:25%,start line:80% size:75%
something about
somebody's personality.

25:18.533 --> 25:21.600 align:left position:12.5%,start line:80% size:87.5%
So a recent example of
this is a report that came

25:21.700 --> 25:24.800 align:left position:12.5%,start line:80% size:87.5%
out in the Washington
Post where a startup called

25:24.900 --> 25:31.533 align:left position:12.5%,start line:80% size:87.5%
Predictim allows parents
to vet babysitters or

25:31.633 --> 25:34.666 align:left position:12.5%,start line:80% size:87.5%
potential babysitters, so
what the babysitter has to

25:34.766 --> 25:38.433 align:left position:12.5%,start line:80% size:87.5%
do is submit their social
media account information

25:38.533 --> 25:40.933 align:left position:0%,start line:80% size:100%
and then the company looks
through their social media

25:41.033 --> 25:44.600 align:left position:25%,start line:80% size:75%
to see how
positive this person is,

25:44.700 --> 25:47.100 align:left position:25%,start line:80% size:75%
do they have any
indication of potential

25:47.200 --> 25:49.333 align:left position:25%,start line:80% size:75%
drug use, et
cetera and so forth.

25:49.433 --> 25:53.033 align:left position:12.5%,start line:80% size:87.5%
So you have a system
that's doing some kind of

25:53.133 --> 25:55.833 align:left position:25%,start line:80% size:75%
data analysis on
faulty assumptions,

25:55.933 --> 26:00.200 align:left position:0%,start line:80% size:100%
but then is actually impacting
somebody's true job prospects.

26:00.300 --> 26:02.500 align:left position:25%,start line:80% size:75%
HEFFNER: We're
running out of time.

26:02.600 --> 26:06.866 align:left position:25%,start line:80% size:75%
I wonder what the
implications are for Uber

26:06.966 --> 26:10.700 align:left position:12.5%,start line:80% size:87.5%
and a host of other
companies when it comes to

26:10.800 --> 26:14.800 align:left position:12.5%,start line:80% size:87.5%
the data analytics and
the identification of

26:14.900 --> 26:17.400 align:left position:12.5%,start line:80% size:87.5%
prospective employers,
prospective consumers. Just

26:17.500 --> 26:19.666 align:left position:25%,start line:80% size:75%
seconds left,
Joy, but tell us....

26:19.766 --> 26:22.600 align:left position:12.5%,start line:80% size:87.5%
BUOLAMWINI: Well, another
thing to consider is the

26:22.700 --> 26:24.066 align:left position:0%,start line:80% size:100%
advent of self-driving cars.

26:24.166 --> 26:27.333 align:left position:12.5%,start line:80% size:87.5%
So if we're talking about
human-sensing AIs and

26:27.433 --> 26:29.166 align:left position:12.5%,start line:80% size:87.5%
we're talking about
pedestrian tracking,

26:29.266 --> 26:32.233 align:left position:12.5%,start line:80% size:87.5%
what happens if you don't
track certain pedestrians

26:32.333 --> 26:35.600 align:left position:0%,start line:80% size:100%
in the first place, in the
self-driving car incident,

26:35.700 --> 26:38.433 align:left position:12.5%,start line:80% size:87.5%
and even if you are
tracking pedestrians,

26:38.533 --> 26:41.266 align:left position:12.5%,start line:80% size:87.5%
there's also the issue of
moral dilemmas if you have

26:41.366 --> 26:43.866 align:left position:12.5%,start line:80% size:87.5%
to choose between saving
the driver or saving

26:43.966 --> 26:47.100 align:left position:12.5%,start line:80% size:87.5%
someone who's outside,
who's making that decision

26:47.200 --> 26:50.100 align:left position:25%,start line:80% size:75%
and are these
determinations based on

26:50.200 --> 26:54.733 align:left position:12.5%,start line:80% size:87.5%
your value as a person,
as assessed by an AI...

26:54.833 --> 26:55.833 align:left position:25%,start line:80% size:75%
HEFFNER: Joy.

26:55.933 --> 26:58.766 align:left position:12.5%,start line:80% size:87.5%
You run an organization
with the coolest name of

26:58.866 --> 27:01.533 align:left position:12.5%,start line:80% size:87.5%
any guest I have hosted
here going on five years

27:01.633 --> 27:03.500 align:left position:12.5%,start line:80% size:87.5%
and you're also doing
really important work.

27:03.600 --> 27:04.533 align:left position:12.5%,start line:80% size:87.5%
Thank you for that.

27:04.633 --> 27:06.066 align:left position:25%,start line:80% size:75%
BUOLAMWINI: Thank
you, appreciate it.

27:06.166 --> 27:07.833 align:left position:12.5%,start line:80% size:87.5%
HEFFNER: And thanks to
you in the audience.

27:07.933 --> 27:10.500 align:left position:12.5%,start line:80% size:87.5%
I hope you join us again
next time for a thoughtful

27:10.600 --> 27:12.933 align:left position:25%,start line:80% size:75%
excursion into
the world of ideas.

27:13.033 --> 27:14.933 align:left position:25%,start line:80% size:75%
Until then,
keep an open mind.

27:15.033 --> 27:16.733 align:left position:25%,start line:80% size:75%
Please visit The
Open Mind website at

27:16.833 --> 27:19.066 align:left position:12.5%,start line:80% size:87.5%
Thirteen.org/OpenMind to
view this program online

27:19.166 --> 27:22.733 align:left position:12.5%,start line:80% size:87.5%
or to access over 1,500
other interviews and do

27:22.833 --> 27:27.100 align:left position:12.5%,start line:80% size:87.5%
check us out on Twitter
and Facebook @OpenMindTV

27:27.200 --> 27:28.933 align:left position:25%,start line:80% size:75%
for updates on
future programming.
