How Human Bias Affects Machine Learning
Let's play a game: close your eyes and picture a shoe.
Okay, did anyone picture this? This? How about this?
We may not even know why, but each of us is biased toward one shoe over the others.
Now imagine that you're trying to teach a computer to recognize a shoe.
You may end up exposing it to your own bias.
That's how bias happens in machine learning.
But first, what is machine learning?
Well, it's used in a lot of technology we use today.
Machine learning helps us get from place to place,
gives us suggestions, translates stuff, even understands what you say to it.
How does it work?
With traditional programming, people hand-code the solution to a problem, step by step.
With machine learning, computers learn the solution by finding patterns in data.
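To make that contrast concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the "sole width" feature, the numbers, and the toy learning rule (a midpoint between class averages) stand in for real features and real training algorithms.

```python
# Hypothetical example: deciding "shoe" vs. "not shoe" from one invented
# feature, sole width in cm. Nothing here is a real product or dataset.

# Traditional programming: a person hand-codes the rule, step by step.
def is_shoe_hand_coded(sole_width_cm):
    return 5.0 <= sole_width_cm <= 15.0  # thresholds chosen by the programmer

# Machine learning (sketch): the rule is derived from labeled examples.
def learn_threshold(examples):
    """Midpoint between the average shoe width and the average non-shoe width."""
    shoes = [w for w, is_shoe in examples if is_shoe]
    others = [w for w, is_shoe in examples if not is_shoe]
    return (sum(shoes) / len(shoes) + sum(others) / len(others)) / 2

# Whatever bias is in these examples becomes part of the learned rule.
data = [(9.0, True), (11.0, True), (25.0, False), (31.0, False)]
threshold = learn_threshold(data)

def is_shoe_learned(sole_width_cm):
    return sole_width_cm < threshold
```

The point of the sketch: in the first function a human wrote the decision rule; in the second, the rule came entirely from the data, so whoever chose the data shaped the rule.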
So it's easy to think there's no human bias in that.
But just because something is based on data doesn't automatically make it neutral.
Even with good intentions, it's impossible to separate ourselves from our own human biases.
So, our human biases become part of the technology we create in many different ways.
There's interaction bias, like this recent game where people were asked to draw shoes for the computer.
Most people drew ones like this,
so as more people interacted with the game, the computer didn't even recognize these.
Latent bias; for example, if you were training a computer on what a physicist looks like,
and you're using pictures of past physicists,
your algorithm will end up with a latent bias skewing towards men.
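A tiny sketch of how that latent skew gets absorbed. The "model" below just memorizes label frequencies, and the 9:1 split is invented; a real model and a real archive would be far more complex, but the mechanism is the same.

```python
# Hypothetical sketch of latent bias: a trivial "model" that memorizes label
# frequencies from its training pictures. The 9:1 split is invented.
historical_physicist_photos = ["man"] * 9 + ["woman"] * 1

def learned_probability(label, training_labels):
    """What fraction of the training data carried this label."""
    return training_labels.count(label) / len(training_labels)

# The skew in the historical pictures becomes the model's belief; it reflects
# who was photographed, not who can be a physicist.
p_woman = learned_probability("woman", historical_physicist_photos)
```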
And selection bias; say you're training a model to recognize faces.
Whether you grab images from the internet or your own photo library,
are you making sure to select photos that represent everyone?
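One way to start answering that question is to audit who is actually in the training set before training. A minimal sketch, assuming hypothetical "group" labels on each photo (the labels and counts below are invented for illustration):

```python
from collections import Counter

# Hypothetical audit of a face-photo training set; ids and group labels
# are made up for illustration, not real data.
training_photos = [
    {"id": 1, "group": "A"},
    {"id": 2, "group": "A"},
    {"id": 3, "group": "A"},
    {"id": 4, "group": "B"},
]

def group_shares(photos):
    """Fraction of the dataset contributed by each group."""
    counts = Counter(photo["group"] for photo in photos)
    return {group: n / len(photos) for group, n in counts.items()}

shares = group_shares(training_photos)  # group B is underrepresented
```

A lopsided result like this is a signal to gather more representative data before the model bakes the imbalance in.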
Since some of our most advanced products use machine learning,
we've been working to prevent that technology from perpetuating negative human bias.
From keeping offensive or clearly misleading information from appearing at the top of your search results page, to adding a feedback tool on the search bar,
so people can flag hateful or inappropriate autocomplete suggestions.
It's a complex issue and there's no magic bullet,
but it starts with all of us being aware of it so we can all be part of the conversation.
Because technology should work for everyone.