About this episode’s guest:
Yaël is a thought leader, democracy activist and strategist working with governments, tech companies, and investors focused on the intersection of ethics, tech, democracy, and policy. She has spent 20 years working around the globe as a CIA officer, a White House advisor, the Global Head of Elections Integrity Operations for political advertising at Facebook, a diplomat, a corporate social responsibility strategist at ExxonMobil, and the head of a global risk firm. Currently, she is a Visiting Fellow at Cornell Tech’s Digital Life Initiative, where she explores technology’s effects on civil discourse and democracy and teaches a multi-university course on Tech, Media and Democracy.
Yaël has become a key voice and public advocate for transparency and accountability in tech, particularly where real-world consequences affect democracy and societies around the world. Her recent TED talk addresses these issues and proposes ideas for how government and society should hold the companies accountable. Yaël travels internationally as a keynote speaker at venues seeking informed, inspirational women to help make sense of our world’s most difficult challenges. She can be booked through the Lavin Agency.
Yaël was named to Forbes’ 2017 list of “40 Women to Watch Over 40”. She is also an Adjunct Professor at NYU’s Center for Global Affairs, a member of the Council on Foreign Relations, and she provides context and analysis on national security, elections integrity, political and foreign affairs in the media. She has been published in the New York Times, the Washington Post, Brookings Techstream, TIME, WIRED, Quartz and The Huffington Post, has appeared on CNN, BBC World News, Bloomberg News, CBS News, PBS and C-SPAN, in policy forums, and on a number of podcasts. She earned an M.A. in International Affairs from the Johns Hopkins School of Advanced International Studies (SAIS).
More than anything, she is passionate about using her background and skills to help foster reasoned, civil discourse.
She tweets as @YaelEisenstat.
This episode streamed live on Thursday, October 29, 2020. Here’s an archive of the show on YouTube:
About the show:
The Tech Humanist Show is a multimedia-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.
Subscribe to The Tech Humanist Show channel, hosted by Kate O’Neill, on YouTube for updates.
Transcript
Kate O’Neill [02:21]: All right, hello humans! Welcome to The Tech Humanist Show. Come on in, start gathering around and joining in; I see some numbers ticking up, glad to see you. I hope you’re getting settled in. I, of course, am your host, Kate O’Neill, and I’m glad to see you joining us. Go ahead and start commenting, wherever you are, and let us know where you’re joining in from; say hello. Today we’re going to be talking a lot about the intersection of democracy, ethics, and tech, as well as social divisions and reasoned discourse, so start thinking of questions or thoughts about those topics now. You’re welcome to start queuing those up or asking them in the comments; the more you do that in advance, the more we’ll be able to screen them and make sure we’re including them in the discussion. So keep saying hello and checking in; that’ll help us get to as many of you as possible.

This, as you may know already if you’ve tuned in before, is what I call a multimedia-format program. That means that as I’m speaking, it’s being broadcast live across a bunch of different channels: YouTube, Facebook, Twitter, and LinkedIn, as well as Twitch, although nobody watches on Twitch. Although with AOC streaming on Twitch, that may change things; we’ll see about that. It’ll also live on as an archive on all those channels, so people will be finding it later. So hello to those of you in the future from those of us in the past. Each episode also gets turned into an audio podcast a week and a day after it goes live, so tomorrow there will be a release of last Thursday’s discussion as an audio podcast, and then this discussion will be available next Friday as an audio podcast. Each week we explore different facets of how data and technology shape the human experience, so please subscribe and follow wherever you’re watching or listening so you won’t miss any new episodes, because there’s some good stuff coming up.

In addition to this good stuff, please do note that as a live show, we will do our best to include comments and questions and vet them in real time. We may not get to all of them, so we very much appreciate you participating, being here, chatting with us, and being part of the show.
04:39
introduce
04:40
our wonderful guest today we are
04:43
chatting with yael
04:44
eisenstadt a thought leader democracy
04:46
activist
04:47
and strategist working with governments
04:50
tech companies and investors
04:51
focused on the intersection of ethics
04:53
tech democracy and policy
04:55
she spent 20 years working around the
04:57
globe as a cia officer
04:59
a white house advisor the global head of
05:01
elections integrity operations for
05:03
political advertising at facebook
05:05
a diplomat a corporate social
05:06
responsibility strategist exxonmobil
05:08
and the head of a global risk firm
05:11
currently she is a visiting fellow at
05:13
cornell tech’s digital life initiative
05:15
where she explores technology’s effects
05:16
on civil discourse and democracy
05:19
and teaches a multi-university course on
05:21
tech media and democracy
05:23
y’all has become a key voice and public
05:25
advocate for transparency and
05:27
accountability in tech
05:28
in particular where real world
05:30
consequences affect democracy and
05:32
societies around the world
05:34
her recent ted talk which you should
05:35
definitely watch addresses these issues
05:37
and proposes ideas for how
05:39
government and society should hold the
05:40
companies accountable y’all travels
05:42
internationally when she can i’m sure
05:44
as a keynote speaker at any number of
05:46
venues seeking informed inspirational
05:48
women to help make sense of our world’s
05:50
most difficult challenges
05:52
yao was named to forbes 2017 list of 40
05:55
women to watch over 40
05:57
she’s also an adjunct professor at nyu
05:59
center for global affairs
06:00
a member of the council on foreign
06:02
relations and she provides context and
06:04
analysis on national security elections
06:06
integrity political and foreign affairs
06:08
in the media
06:08
it’s been published in the new york
06:10
times the washington post brookings tech
06:12
stream
06:12
time wired quartz and the huffington
06:14
post has appeared
06:15
on cnn bbc world news bloomberg news cbs
06:19
news
06:19
you know you get the idea she’s been
06:21
everywhere in policy forums and a number
06:23
of podcasts
06:24
she earned an m.a in international
06:25
affairs from the johns hopkins school of
06:27
advanced information studies
06:29
and i love this the last line of the
06:31
bios that i i’ve cobbled together to to
06:33
present
06:34
this to you the last line is more than
06:36
anything she is passionate about using
06:37
her background and skills
06:39
to help foster reasoned civil discourse
06:42
so how impressive is that please
06:45
get those questions ready for our truly
06:47
outstanding guest
06:48
and with that please welcome y’all
06:51
eisenstadt
Kate [06:52]: Hey, you are live on The Tech Humanist Show. Thank you so much for being here and joining us.

Yaël Eisenstat: Thank you. That was the most comprehensive bio intro.

Kate: I understand it’s sometimes a little overwhelming when you’re hearing your own life story being rattled off for you.

Yaël: I’m excited for this conversation.

Kate: Me too. And one thing that’s really funny about this, which I have to say right up front because I think it’s hilarious to acknowledge it right at the beginning: you and I connected when you meant to tag Cathy O’Neil, the author of Weapons of Math Destruction, in a post on LinkedIn, and you tagged me by accident. Which happens all the time, I have to tell you, and probably happens for her the other way too. And when I saw your bio, I was like, “Hey, you tagged the wrong person, but we should probably know each other.”
Yaël [07:45]: Yeah, so Cathy and I had been on, I think it was an NPR show together; I was posting something that she and I had both been interviewed for. And what is funny, I’m just going to put this out there right now: hundreds of people, you must get this too, try to connect with you on LinkedIn, and they don’t say why; they don’t send a message. So I don’t respond to people who don’t tell me why they’re trying to connect. So when you wrote to me and you were like, “Hey, I don’t think you meant me, but we should know each other,” at first my immediate reaction was, “Should we?” And then I looked you up, like, “Oh yeah, no, it looks like we should know each other.”

Kate: Yeah, I’m so glad that we did connect, and here we are, and it all pays off, or at least this is the beginning of it paying off; we may see plenty of other collaboration in the future. I was thinking, as I was reading and researching and watching your TED talk, which was wonderful by the way, that one of the things I wanted to do is frame this discussion around your background and the various roles and responsibilities that you’ve had. Which, I was going to say, first of all, your bio reads like you’re the most interesting woman in the world. Really fascinating. And to me it reads as someone who, when you encounter ambiguity, engages with it fully and tries to understand it, or something like that. Is that how you, in reflection, would characterize the trajectory of your career, or is there a different thing that you think of?

Yaël: I love that. I might have to... I mean, I haven’t thought of it that way before, but it’s really accurate. It’s funny, for a while I was struggling with what the through line is, you know, the “what do you do” question. When people say, “What do you do?” I’m like, how do I put that into words? But the through line, I think, is pretty clear at this point: it’s that I’m one of those people who runs head-on into the fire, into the fight. I am passionate about fighting for this democracy. Really, it’s the people who are affected that I fight for the most; I don’t know if that makes sense. Even when I was... you know, I joined government before September 11th, and during some of the darkest counterterrorism days, all of my work was about trying to focus on the people who were affected by all of these things around the world. And the same thing led me to Facebook. So I don’t know; ambiguity, definitely, I like that. I just run head-on into a challenge. And I kind of wish, at some point, that I could stop running head-on into every challenge and maybe sell muffins or bread or something and sleep better at night. Maybe it’ll be a little pause, and then you can get back into it. I don’t know.
Kate [10:32]: Well, it’s interesting to me that your story seems like it begins with this CIA analyst position. Is that about where your professional entry into work was, or were you doing some little things here and there before then?

Yaël: Yeah, I mean, it was 1999; I had just finished at SAIS. Which, can I just make one minor correction to your bio? It’s International Affairs, not Information Affairs. But it’s a tech show, so I can see how you went there right away. And, you know, I wanted to work on foreign policy and global affairs; that was my passion. The State Department and USAID had hiring freezes, and I ended up at the Agency, and that’s where my global international affairs career really kicked off.

Kate: That’s cool. And also, All Tech Is Human commented “muffins for democracy,” so there’s a possible branching out that you could use.

Yaël: Can I just completely tell a random quick little story about muffins for democracy? I was in a musical in high school, because I was a musical theater geek, called Of Thee I Sing, which I’m sure not one person has ever heard of. For some reason that was the show, and there is an actual song in there, and it’s all about democracy. I played a Supreme Court justice; a tap-dancing Supreme Court justice. And there’s a song where they’re trying to do a whole campaign; actually, it’s very apropos for today if you think about it. It’s very 1950s; it’s about U.S. politicking. I haven’t thought about this in a long time, but there’s a plot where they’re trying to portray the president’s wife as the perfect 1950s housewife, and there’s an entire song about how she can make corn muffins.

Kate: Oh my gosh. Yeah, we’re going to need to search this out. I’ll try to put it in the show notes, because we can find a link to anything. Well, this is so interesting. So you get into the CIA, and you end up, if I have this right, stationed in Kenya. Is that right?
Yaël [12:30]: Well, not exactly. I joined the Agency in ’99; it takes a while to get your clearances, so I really started in 2000, and worked mostly on Africa issues. Of course, a lot of us, after September 11th, also worked on Afghanistan and all sorts of crazy things. And then in 2004 I moved over to the State Department, and I was stationed in Kenya from 2004 to 2006, posted at the embassy there as a diplomat. I’m one of those weird people: I worked for like five different government agencies over my federal career.

Kate: It’s so interesting, because some of the folks I know who have worked in government, their descriptions to the outside world don’t sound that different, and yet they’re like, “Oh no, no, this is a very, very different role.” I get it, I totally do. It’s amazing. But your work there in Kenya especially, it sounds like it had to do with dealing with communities where constituencies were most at risk of radicalization. Is that right?
Yaël [13:29]: Yeah, one of my multiple roles was to be sort of the embassy’s outreach person to certain marginalized areas. And Kenya, just for context, was a country that had suffered a few terrorist attacks: our U.S. embassy had been bombed there in ’98, and then in 2001 there had been another attack against an international hotel and an Israeli airliner. It also bordered Somalia and Sudan. So it was a country where we really... it was an ally, an important ally. And so it was really sort of hearts-and-minds type work: building bridges and mutual understanding in communities that are really marginalized. And I can go into as little or as much detail... sorry, I realize I’m rambling here.

Kate: Oh, that’s great; I think it’s really important setup. Because you talk in a few different places, or you wrote in a few different places, about the work that you did in that context and how it gave you this framework of understanding the human tools that we have for connecting with one another; I know you talked about listening and empathy and so on. And you have this really powerful story that you wrote, in Quartz I think, about a man, and you being at a town hall meeting in Kenya. Do you know the story I’m referencing, and do you mind me telling it?
Yaël [14:51]: I love how well you’ve done your homework. Yeah, that was an article I wrote; obviously that and the TIME article were both before I went to Facebook, and they were kind of two sides of the same story. It was where I was really exploring, and people have heard me talk about this, how it really struck me that I had an easier time engaging with suspected terrorists overseas than I feel I have right now engaging with Americans about political issues. And that particular story: I was heading a town hall in a town called Lamu, up in the very northern part of the coast, near Somalia, and all of the councillors from the region attended. I was there with one of the U.S. military reps to explain why the U.S. military was even in the region, and to really answer questions, listen to the community’s concerns, and make sure that we were not doing anything the community didn’t want. And one individual showed up who had been a suspected terrorist. I won’t get into the whole story, but I had been following his trial; I knew a lot about the man. To be frank, the charges against him by the Kenyans were somewhat dubious to begin with, but he had been suspected of harboring the man who was responsible for bombing our embassies. So when he showed up, I just realized, “Oh, shoot.” I don’t know why I hadn’t realized this person would be there, and I thought he would start attacking me verbally and trying to argue with me. And he was the one who actually made me feel comfortable. He was the one who stood up and thanked me for being there, and talked about how important it was that I was willing to show up, to listen, and to answer questions. I didn’t give a lecture; I didn’t show up with promises; I didn’t show up with money. And he was just profound, going on and on about our ability to listen and to make people feel heard. And it doesn’t change the fact that I can hold two ideas in my head at the same time: one, that I don’t know that I’m supposed to feel okay about this person; but at the same time, he was showing me the power of civil discourse, because the whole community took their cues from him and how he interacted with me. So yeah, it was a pretty foundational moment in how I thought about the rest of my career.

Kate: Yeah. That’s why, when I saw that story, it felt to me like such an underpinning to so much of the work that comes later. It’s complicated, for sure, but it does begin to unpack what the value of civility is. And as you say, when you’ve got people in the community taking cues from these leaders, that seems like a really important dynamic to at least be aware of, right?
Yaël [17:45]: Yeah. I mean, that was probably the most powerful part. The whole community saw him stand up to ask a question; they probably saw me stiffen up a little bit; and then they saw how he engaged with me, and the whole room changed. At the time I didn’t realize how foundational a moment it was, but a few years later, when things really started to break down here in the U.S., I looked back on that moment a lot.

Kate: Yeah. And it’s been interesting to me, because I’ve watched these schools of thought unfold, kind of idealist philosophies versus incrementalism, and the way those are positioned as poles: you’re supposed to be on one side or the other, either going all the way toward radical ideals or being branded an incrementalist. And the idea, which I think you said in your TED talk, that compromise has become a dirty word, feels like a really important framing, a construct to acknowledge, if we’re going to do something about creating reasoned civil discourse. Would you agree with that?
Yaël [18:57]: Yeah. It’s funny, you’re the first person I’ve heard use the word “incrementalist” in a while. Because internally, I’m probably pretty radical in my thinking about what I think needs to happen and what I would love to see in the world. But as I’ve said before, I’m a little bit more of the RBG incrementalist school of thought: if I can fix this one thing first, that’s a building block to this, which is a building block to that. Of course, that doesn’t mean there isn’t part of me that’s like, “Burn the whole thing down,” right? But I think it is incredibly important to understand that my opinion isn’t the only opinion in the country, and the incrementalist part of it is about trying to figure out how you can do the most good while also bringing the most people into the fold.

Kate: Yeah, I think you’re right. And you talked earlier about how you are constantly thinking about the people impacted, right? So that’s an important piece too: we don’t want to ignore the real-world harms that are happening at a systemic level, like police brutality and racist immigration policies and things like that. And yet it feels like, in this whole process of trying to gain momentum toward legislation and policy, kind of the solutions to these things, we seem to have lost some of the nuance in how we talk about any of those kinds of issues.
Yaël [20:26]: It’s a really hard line to figure out right now, and I think it leads directly into these social media and tech and ethics discussions. It’s what led me into the social media world to begin with.

Kate: So then how do you end up going from the CIA role to working in the White House, or advising in the White House?
Yaël [20:49]: So at that point in time I was at the National Counterterrorism Center. I had come back from Kenya and gone over to, I’m going to use all sorts of government acronyms here, NCTC, the counterterrorism center, heading up some of the Africa programs for the counter-extremism, hearts-and-minds type work there. And right before the change in administration, in 2008 (yeah, that makes sense; I’m checking my own timeline right now), they knew that a new incoming administration would require... so, without getting into too much detail, the White House is staffed by multiple different kinds of people: some are political appointees, and some are experts who are brought in and are still government civil servants. So they had put out a call to all these government agencies for everyone to put in their top two people to fill a role advising in the Vice President’s office. I went through the process and I was picked, so it was an incredible honor; I got to serve. It was funny, they were actually looking for a counterterrorism advisor, which, I mean, is kind of the path I fell into, but it wasn’t the path that motivated me. And then I was interviewing, and we started talking about what I thought we should do in Sudan, and they were like, “Oh, actually, we need an Africa advisor.” I was like, “Great, I’ll do both.” Which nobody should ever think there are enough hours for. But yeah, it was amazing: I got to advise the Vice President, lead his trip to Africa that year, represent the U.S. at the World Cup (you’re welcome; these are the real tough assignments for my country). And there is no greater honor and no bigger challenge than to be in that building, because the biggest challenges in this country and in the world show up on the desks of the people in the White House every day. So it was pretty incredible.

Kate: It feels like, conceptually, it’s not that difficult to draw the parallel between the work you were doing in these communities, understanding radicalization, and national security and counterterrorism and, you know, democracy protection and protecting the most vulnerable, and things like that. But as you’re thinking about this through line, is there something that becomes clearer to you when you kind of reconstruct those moves in retrospect?
Yaël [23:24]: Well, those moves within government all were within a certain thread. No matter what I was doing overseas, I was representing the United States, right? So all of that work was still within the greater context of U.S. national security, U.S. foreign policy, and really U.S. global leadership. So coming to the White House to be the Vice President’s advisor on those issues was absolutely full circle.

Kate: And then it sounds like, within the time frame that you were working in the White House, perhaps that’s when you began to pay attention more to the digital divide. Is that the right time frame? What sort of brought that to your attention?

Yaël: Yeah, just a little bit later, as I was starting to think about transitioning out of government. My first thing, which I now think is kind of funny: people were just saying, “But what do you want to do?” And so finally I said, you know what, I want to find one of the biggest, baddest companies in the world, with a huge, profound impact globally, who are screwing up local communities, and help them do better. And at the time it wasn’t Facebook; that could describe Facebook now, but at the time it was ExxonMobil. And so somehow I went from making that statement to moving to Texas to head the corporate social responsibility... not to head it, but to work on corporate social responsibility strategy for Exxon for two years. And then I moved to New York, and that’s where it really started to hit me: this breakdown in civil discourse in the U.S., this growing polarization. What’s happening? I never planned to talk about my past publicly; I never planned to publish, to write, to speak. But as I was watching the last presidential election heating up and seeing Americans tearing each other apart over things that we shouldn’t be fighting about, I just started thinking this was all a bigger threat to our democracy than all of the national security issues I had spent my career working on.

Kate: Yeah, it’s interesting, because I think Eli Pariser’s filter bubble talk was around 2011, around that time, and when I think about it now, it always seems quaint in retrospect. But I also recognize that it’s the beginning; it’s the construct that sort of blossoms into the algorithmic filtering of content that creates the algorithmic polarization we see today. But even at the time that you were recognizing it, in like 2012, or 2016 I guess...

Yaël: It was really 2015 that I started exploring it.

Kate: What did you see? Do you remember an example of something that really stood out to you, like, “Oh, this is what I have to speak about; this is what I have to write about”?
Yaël [26:25]: I don’t remember what the exact thing was. You know, I moved to Texas and was there from 2013 to 2015, and it’s funny: anyone who knew me at the time would have been less surprised if I’d moved to Saudi Arabia than to Texas. They always expected me to move overseas. But I had such an amazing time living in Dallas, and it occurred to me that I, who would always get really upset at Americans who would stereotype countries overseas, particularly in Africa and the Middle East, had stereotyped Texas without having spent time there. And so I kind of called myself out in that moment: why have I done exactly what I’ve accused other people of? I mean, yes, in Texas I had friends who were living an obviously very different reality than my liberal, elitist, East Coast, whatever you want to call it, crowds in New York and D.C. So that started making me look a little bit more at how we’re engaging with each other in the U.S. But really it was some of the rhetoric in the very early days of the presidential election, so like in 2015, and the way Americans were starting to talk to each other. There were some attacks that were starting to happen; all of the signs that I had seen my whole life overseas, I was starting to see here. And again, I know it seems like I love being in the public spotlight; I actually don’t. My friends had to push me to speak up and to write in those early days. But I thought I could see the steps of things I had seen in war-torn countries overseas starting to happen here. And I didn’t immediately pin it to being about social media, and I still don’t; there are lots of factors. But I certainly recognized there was a huge difference in how we were engaging here in the U.S. and how we had engaged in those communities that were not as connected to the internet and everything. And that’s when I started digging into what’s happening, to the point where people who shouldn’t be fighting with each other are, and what’s happening to the way we’re engaging here.

Kate: So at some point this leads to you writing, publishing, putting these op-eds into publications, which got a lot of traction at that time, from what I saw looking into the articles you had. When does it lead to talking to Facebook about this role?
Yaël [29:02]: So, yeah, I wrote a few pieces, and I remember the very first time I got invited to keynote this tech festival in Berlin. And it’s like, me, at a tech festival? That doesn’t even make sense. And understand, I’d also spent my whole life not ever saying “CIA” out loud or talking about it, and here I was going to sit on a stage in Berlin and just blast my past to the world. But from that keynote it started: I started being on these tech podcasts, and more and more it started being about me talking about how fundamentally important it is to learn how to engage with people who aren’t like-minded, and to get some part of me off of this social media hamster wheel. And shortly after that, Facebook reached out, and I started talking to one of the recruiters. It was a bit of a journey; I wasn’t one to easily say, “Yes, I want to go work at Facebook.” But then, you might recall the famous Senate hearing in 2018 with Mark Zuckerberg, the one where he said, “Yes, Senator, we sell ads.” A lot of that hearing had been about elections; Mark Zuckerberg had talked about finally recognizing that they need to do better on elections. Cambridge Analytica had just become this really publicly known scandal. And they called me one minute after that hearing ended, and they said, “We have a different role: we want you to come head a new team on elections integrity on our advertising side.” And I was just kind of silent; I was like, “Oh. That sounds like a big job.” So yeah, that’s where that ride began. It began the day of Mark Zuckerberg’s Senate hearing.

Kate: You know, I think about a lot of people I know who have gone to take jobs in organizations that they think of as being off track, and they’re hoping that by joining, they’re going to be able to help be part of the process of bringing them back on track. And so I wonder, in retrospect, if you have any advice for your former self, or for other people who find themselves in that situation, as you go back and look at that opportunity knowing what you know now; whether that means you wouldn’t take the job, or whether you would but you’d take a different approach when you get in. What do you think?

Yaël: It’s a great question. Funny enough, just yesterday I published a piece about culture at Facebook, and how the culture there will never be the type of culture that could actually fix these problems. And I bring that up only because the cultural question is really important. So when I agreed to take the job, I had no illusions that I was going to fix it all. I mean, I wasn’t right out of college; I wasn’t going to Facebook for the free kombucha. I didn’t think that one individual was going to make a huge dent. I was very clear in my interview, and if anyone’s considering roles like this, I would highly advise being very, very clear on what you want and what your expectations are. I said, “This is how I work; this is who I am. If this is not what you’re looking for, don’t hire me.” I also asked, “Do you have full support, from the top down to the bottom, for my role? Because this is a role that is going to be an outside-the-box thinker, that’s going to be pushing you and challenging your assumptions, and things like that. Do you have support?” The one thing I would give myself advice about: four out of the five people who interviewed me, I knew were so on board with it, yes, yes, yes. The final person, I could tell from the second that interview started, and I should have listened to my gut: I knew that person did not want me to come build this role. That would be my only advice. The role was so important, and the challenge was so important to me, that I sort of pushed myself to believe, okay, that’s just one person; I’ll be okay. But the advice I would give is: listen. It’s not just about what somebody sells you in the recruiting; the recruiters were wonderful, they told me everything that I needed to hear. But I do think, from both my Exxon experience and my Facebook experience, that people downplay corporate culture, and whether that company’s culture is the right place for you. And this is why, yesterday, when everyone expected me to write some piece about Section 230 because of the Senate hearing on Section 230, I instead wrote a piece on corporate culture at Facebook and why they will never be able to solve the challenges that we want them to solve. I think you have to figure out if it is the right culture fit for you. And by culture I don’t mean wearing hoodies and sneakers and playing ping pong; that, to me, was kind of annoying anyway. I mean a culture of: do they value what you’re bringing? What I bring is not protecting profit, the bottom line, and scale. What I’m bringing is asking really tough questions, trying to solve actual, real, hard challenges, and pushing you in a direction that puts the public over profit. That’s not the culture there; that’s not who they are. So I would recommend really digging in, in addition to the job description: do you truly believe this is a culture that’s going to embrace what you’re bringing? Because that age-old expression, “culture eats strategy for breakfast,” really smacked me in the face when I was there.

Kate: So does that spell that it’s only regulation that’s going to change the structure and the outcomes of what’s happening with Facebook, and perhaps other entities that are similar?
Yaël [35:05]: Well, it depends on which problem we’re talking about, right? Because there are many. I’m going to put Facebook in a different category than most other companies, for a few reasons. One, because I worked there, so I have at least some sort of internal insight. Two, its sheer power, size, and dominance put it in a category of its own. And three, there’s also no accountability at all for Facebook right now. Normally you’ve got four, maybe five, structures that can incentivize a company to change. One is the markets, and the markets have proven they’re not going to incentivize Facebook to change, because they actually reward them right after bad behavior. One is shareholders, but Mark Zuckerberg set this up as a dual-class share structure, so shareholders have no power; he cannot be replaced. One is employees, and I applaud the fact that more and more employees are speaking up about the things they’re seeing that do not sit right with them; unfortunately, though, I don’t know that it changes the incentive structure. And just remember, and I know I said this in my talk, so I’m trying not to re-quote my own TED talk: they’re not breaking the law. For most of the things I care about, which are how they’re contributing to polarization, to radicalization, to hate speech; how their platform is designed and monetized in ways that really break down civil discourse; what that does to democracies (there are a million other issues; these are the ones I focus on), they’re not breaking the law. And so I fully believe that the only way we’re going to right some of the wrongs, in terms of how they’re affecting democracies and how they’re affecting elections, will be through legislation. That doesn’t mean I think everything has to be regulated, but for those things, absolutely.
Kate [36:57]: And you made such a good point; I know you just said you don’t want to quote your whole TED talk, but I think this is a really good point that you brought up. First of all, one of the main components of how Facebook, as well as other social media platforms, operates is user engagement and growth, right? It’s all about generating these very strong reactions. It’s been known for a while that the angry reaction emoji on Facebook generates possibly the most traction in the feed, so you’ll tend to see a lot more of the things that you engage with with those kinds of angry reactions. But as you point out in your talk, the fundamental business model is about those operations; if those are the functions that reward the business model, then there is no operating incentive for the company to shift away, algorithmically, from those kinds of feedback mechanisms. So have you thought about what kinds of regulations it would take to externally impact those algorithmic factors, or to help reshape those incentives in some way?
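To make the mechanism Kate is describing concrete, here is a minimal sketch of an engagement-weighted ranker. Everything in it is an illustrative assumption for this post: the reaction weights, the Post fields, and the function names are invented, not Facebook’s actual code or values.

```python
# Illustrative sketch only: a toy engagement-ranked feed, not any
# platform's real ranking system. All names and weights are invented.

from dataclasses import dataclass, field

# Hypothetical per-reaction weights. The premise under discussion is that
# stronger emotional reactions count for more than a plain "like".
REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 2.0,
    "angry": 5.0,  # outrage keeps people engaged, so it scores highest here
}

@dataclass
class Post:
    text: str
    reactions: dict[str, int] = field(default_factory=dict)  # name -> count
    comments: int = 0
    shares: int = 0

def engagement_score(post: Post) -> float:
    """Score a post purely on weighted engagement signals."""
    reaction_total = sum(
        REACTION_WEIGHTS.get(name, 1.0) * count
        for name, count in post.reactions.items()
    )
    # Comments and shares are engagement too, so they also raise the score.
    return reaction_total + 3.0 * post.comments + 6.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Optimizing only for engagement surfaces the most reaction-provoking
    # content first, with no term for accuracy or civility.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a scorer like this, a post drawing a thousand angry reactions outranks one drawing a thousand likes five to one, and nothing in the objective asks whether the content is true or healthy; that is the incentive problem the conversation keeps returning to.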
Yaël [38:15]: Yeah. I mean, right now, again, not only are they not breaking the law, but essentially Facebook is doing what it’s supposed to do, right? They’re focusing on shareholder returns. And the U.S. government’s job, one of its jobs, is to protect its citizens from harm, to protect its citizens from the externalities caused by dangerous practices of companies, in addition to the national security and global affairs side of things. And it’s that simple. When the car industry was told that they had to put seat belts in all of their cars, you didn’t see the car industry... well, maybe they did, but I doubt you saw the car industry say, “Oh, but there are so many cars on the road; it’s going to be so hard to find them all and put seat belts in them.”

Kate: I suspect they did.

Yaël: But you’re right, ultimately it doesn’t matter. Ultimately, the science and the data (I don’t know if this is scientific; I would assume it’s the data) showed that wearing a seatbelt protects lives, and they regulated that. With a company like Facebook, there’s zero accountability for any of the real-world harms caused by their platform. They’re not breaking the law; they’re not incentivized to change. Because right now, and I don’t want to get too far into the weeds on Section 230 and all of this, but essentially they have full immunity for any third-party content because of Section 230. And I’m not saying that you should get rid of Section 230; there are some things Section 230 does that are very important. It protects companies like Facebook so that they’re actually able to engage in content moderation. But to say that a company like Facebook is just this neutral platform where people are just posting content, and therefore Facebook shouldn’t be responsible for any of it, completely ignores the fact that their tools are not just allowing people to post speech. Their tools are deciding what they’re amplifying, what they’re recommending, how they’re connecting people, what you’re seeing, which direction you’re being steered in. Their targeting tools are allowing political operatives to use your human behavioral data to target you with specific messages. Maybe that made sense for, like, Adidas versus Nike; I cannot imagine that anyone thinks it makes sense for political rhetoric. And all of these things they’re completely immune from, because they claim they have Section 230 immunity. So I do think, before we get into the weeds of what’s the actual law we need to write, because this is where every conversation breaks down, all the tech policy people start arguing with each other: “No, but this is more important,” or “Free speech is more important,” or “Antitrust is more important.” Why not start from a higher level: what should government’s job be about? What is it that is wrong that we need to protect or fix? I think it’s pretty clear to many people that we have a broken-down information ecosystem. The information ecosystem as it exists today is completely broken. It’s dominated by a few unelected, unaccountable, incredibly rich white men, and I’m just going to throw that in there, who, in the case of Facebook, had this idea... You know, another thing I don’t talk about a lot, but I was thinking about it for my culture piece the other day: you had this guy in a dorm room, the great story, right? I mean, it shouldn’t be such a great story; he created a company to rate women’s looks. But he decides that the whole entire world is going to benefit, and we’re going to have world peace, and the world’s going to be great, if he can connect everyone. You know what’s interesting? In government, you wouldn’t just be able to have an idea like that and then put all of the U.S. government’s resources behind it. You’d have to have commissions and intel studies, and you would have to really go through every potential scenario first to validate whether that’s actually true. He just said, “I know that connecting the world is the most important thing that we can do,” and people threw money at him. And he’s gone and achieved that: he has dominated our information ecosystem. And there is not one stoplight, not one brake that says, “Oh, but this is a problem.” He has free rein. I know I’m being really vague right now; it’s just that I get into the Section 230 debate so frequently... maybe, I’m sorry, I’m a little tired. But I think most people can agree we want a healthier information ecosystem. And what does that look like? Well, I would think it looks like a system where we’re not rewarding outrage, anger, and hate more than we’re rewarding expertise, diligence, wonky facts, or even just slowing down. I think we want a world where everybody has the freedom to say whatever they want, but that doesn’t mean that some guy who’s never spent any time researching anything, but screams in the most salacious way, gets a bigger platform than somebody who spent their entire life studying the very same thing. We need to realign this, and as long as we continue to allow business models that prey on our human behavioral data to exploit our emotions and to target our core biases to keep us engaged, we’re never going to get there.
Kate [43:58]: And it sounds like one of the things that’s challenging is, you know, the premise that connecting people is a good is not a difficult one, I think, for many people to accept; that probably is true. But once you bring the advertising, the monetization model, into it, and you start talking about how we’ll leverage the data that gets collected in the authentic exchanges people have with one another, and then monetize that, and then, in turn, political campaigns will be part of monetizing that and will be part of using the same data, that gets to be a very complicated equation to then break.

Yaël: Can I back up a step? I’m not sure I agree that everybody would say, “Yeah, connecting everyone sounds like a great idea.” I mean, don’t get me wrong, I do think the things that Facebook accomplished early on are incredible. I use it; I stay in touch with my friends all over the world. I want to be crystal clear: I’m not one of those people who just wants to see the company die. That said, there’s a real naiveté in not understanding that there are also bad actors in the world. Every time somebody tells me their startup idea and I say, “Oh, that sounds really cool, but I could completely see how this particular community is going to exploit that, and these criminals are going to do this,” they get upset. Like, how dare you tell me that my brilliant idea is going to in any way cause harm? And that’s what’s interesting to me; it’s this “No, but my idea is amazing.” And I don’t think we talk about that enough with Facebook. When Mark Zuckerberg said nobody could have seen this coming... this is why it’s my last line in my TED talk, because they used that line a lot, “Nobody could have seen this coming,” when they were talking about Russian interference in the 2016 elections. Nobody could have seen this coming? I called BS. Yes, people could have seen it coming. Anybody who spent their career studying Soviet propaganda and studying the Cold War, or who worked in that world, could have seen it coming; they just had no idea how Facebook worked, because those are two different worlds. But this “My idea is wonderful, I’m going to connect the world, nobody could have ever anticipated that a global actor would exploit this for their own geopolitical gain”... of course people could have anticipated that.
Kate [46:23]: I want to ask you, too, while you’re talking about that, about the idea of faulty information, the idea of propaganda, the notions of misinformation and disinformation and bad actors. Is it meaningful, is it important, to distinguish and know the difference between misinformation and disinformation in this conversation? Because obviously one implies a different kind of intent, and so there is that bad-actor component. What do you think we need to be thinking about as we disambiguate between those two when it comes to this discussion?
Yaël [46:59]: I mean, that’s a great question. They are different things. Misinformation can be intentional, sort of, but it can also be much more innocent. It can just be when you share something that you didn’t realize is not real, or when you hear something, mischaracterize it, write about it, and then that spreads; that’s misinformation. Or you take a grain of something... But misinformation is abundant, there’s no question. It always has been, though not at the scale it is now, because social media gives a platform for anybody to scream out to three billion people. Disinformation is more intentional. Disinformation is an intentional misrepresentation of information with an actual agenda attached to it. And this is why you’ll start to hear these longer and longer sentences when people say “misinformation and disinformation”: they are different. I don’t necessarily blame Facebook for the fact that there are people saying things that aren’t necessarily true. I do blame them for, A, allowing operatives to spread disinformation on their platform; and, B, not listening to critics for years. Well before I started talking about this stuff, there were critics. I mean, you mentioned Eli; the foresight Eli had to already start talking about this, whatever the disputes about whether filter bubbles are real, he was talking about this years ago. And it’s just this “All critics hate tech; all critics are trying to stifle our ability to innovate,” and they never listen to the critics. And this idea that this one guy who had this idea has this much power, and doesn’t want to even acknowledge that maybe the idea of not just connecting everybody and giving everybody a platform to the world, but then, as you mentioned, selling their human behavioral data to be able to target people with information, and deciding how you’re going to keep people engaged... all of that, that’s the stuff I want to regulate around. I don’t want to regulate around whether or not you write something on Facebook tomorrow that offends somebody. It’s about the tools they have created that enable bad actors to do the things they’re doing.

Kate: Yeah. You know, I think about your comments in a couple of places about reclaiming our public square, and I just think, orienting this conversation in this humanist lens: what does that really look like? Especially knowing that you don’t want to shut down the social media companies; maybe we see them reform and take on some different structures and incentives. But what does it really look like in a more comprehensive way? Is it just social media, in these contemporary times and beyond, or are we talking about news media as part of that ecosystem?
50:12
100 that’s and that’s the other thing
50:15
like
50:16
you speak about something and then all
50:18
the haters attack and go oh well you
50:20
didn’t mention fox news or oh you didn’t
50:22
mention this or you
50:23
i’m talking about one thing it doesn’t
50:25
mean that’s the only thing
50:26
honestly it’s these for-profit
50:32
enterprises that make money off of
50:36
controlling the information that we see
50:38
how we see it and how we consume it
50:40
with no incentive
50:43
to care about fostering healthier debate
50:47
listen we got rid of the fairness
50:49
doctrine we used to have i don’t want
50:51
the us government
50:52
regulating speech my gosh anyone who
50:54
thinks i do doesn’t understand like
50:56
unlike mark zuckerberg i did swear an
50:58
oath to protect the constitution
51:00
of course i don’t want u.s government
51:02
stifling free speech
51:03
that said we have incredibly wealthy
51:07
individuals with
51:08
agendas whether it’s political agendas
51:10
which we see with some of the
51:12
um some of the cable news networks
51:14
whether it’s just profit and power
51:16
motivation like we see with some of the
51:18
social media leaders
51:19
and they are dominating our public
51:22
square
51:23
and i do think it will take people
51:26
standing up and saying this is not what
51:27
i want
51:29
i don’t know that we can get back to a
51:31
place — and it wasn’t
51:32
a perfect time either — but can we get to
51:34
a place where there’s a more
51:36
public utility version of a public
51:37
square can we get to a place
51:40
where there are certain rules
51:43
of the road about media and about
51:46
incentivizing
51:47
better behavior can we get to a place
51:50
where
51:51
there are social media platforms that
51:53
encourage civil debate
51:55
and do not purposely try to keep us
51:58
addicted to their platforms by
51:59
feeding us the most salacious content
52:02
can we
52:02
i mean i don’t have all the answers i’m
52:05
one human being
52:06
this is where people love to attack — if i
52:07
don’t have all the answers
52:09
but you do have a lot of answers and i
52:10
think i feel like you you advance a few
52:12
ideas in in some of your work you’ve
52:14
talked about
52:15
retraining algorithms around something
52:16
other than engagement
52:18
you’ve talked about building in guard
52:19
rails to stop content going viral before
52:22
it’s reviewed
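
To make those two ideas concrete — ranking by something other than engagement, and a guardrail that holds fast-spreading content until it is reviewed — here is a minimal sketch. Every name, weight, and threshold in it is an illustrative assumption, not a description of any platform’s actual system.

```python
# Sketch of (1) a ranking objective blended with a quality signal instead of
# engagement alone, and (2) a virality "circuit breaker" that withholds
# algorithmic amplification from fast-spreading, unreviewed content.
# All fields, weights, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    engagement_score: float  # model's predicted clicks/reactions (assumed signal)
    quality_score: float     # e.g., source-reliability or civility signal (assumed)
    shares_last_hour: int
    human_reviewed: bool

VIRALITY_THRESHOLD = 1000  # shares/hour before the brake engages (assumed)

def ranking_score(post: Post) -> float:
    """Retrained objective: blend engagement with a quality signal."""
    return 0.3 * post.engagement_score + 0.7 * post.quality_score

def eligible_for_amplification(post: Post) -> bool:
    """Guardrail: viral-but-unreviewed content gets no algorithmic boost."""
    going_viral = post.shares_last_hour >= VIRALITY_THRESHOLD
    return post.human_reviewed or not going_viral

def rank_feed(posts: list[Post]) -> list[Post]:
    """Only amplification-eligible posts compete for ranked placement."""
    candidates = [p for p in posts if eligible_for_amplification(p)]
    return sorted(candidates, key=ranking_score, reverse=True)
```

Note that gating eligibility before ranking means the brake applies only to algorithmic distribution, not to what people deliberately post or share — which matches the distinction drawn in this conversation between regulating the tools and regulating speech.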
52:23
uh and you know as you said
52:26
they could do
52:26
all of this without becoming what they
52:28
call the arbiters i mean at the end of
52:30
the day
52:30
one of the biggest things that are
52:31
lacking is transparency i mean
52:33
really, so much of how we
52:36
engage with
52:37
each other is controlled in a
52:39
black box
52:41
at these companies and i do believe that
52:44
transparency is one of the major keys
52:46
here
52:47
whether it’s transparency i believe for
52:49
example
52:50
if the companies had to have some sort
52:53
of oversight i’m not saying exactly what
52:55
that looks like whether it’s
52:56
a government oversight body and it’s not
52:59
the facebook oversight board
53:00
like they might do some interesting work
53:02
but that’s never gonna do enough
53:04
i’m talking about whether it’s
53:07
monthly reports, quarterly reports — i don’t have the
53:09
answers of exactly how it looks
53:11
but i do believe if there was oversight
53:13
into for example how the recommendation
53:14
engines work
53:16
i want to be able to know and i think we
53:19
should be able to know i don’t need your
53:20
secret sauce you don’t need to send me
53:22
the code of how your algorithms work
53:24
but i think i have the right to know if
53:25
the two guys that went and murdered
53:28
a federal officer in oakland
53:30
pretending to be part of the black lives
53:32
matter movement but were actually
53:34
boogaloo boys
53:35
they met in a facebook group don’t you
53:38
think that we should have the right to
53:39
know
53:40
if facebook’s recommendation engines
53:42
recommended them into that group and
53:44
connected the two of them to each other
53:47
like these are the tools that we have no
53:49
visibility into — we just have to trust
53:50
facebook when they say
53:52
that that’s not how it happened and so
53:55
it’s about transparency i think we
53:57
should
53:58
have transparency into how the
54:00
recommendation engines work
54:02
i want to know for example i think we
54:04
should have the right to know if a
54:05
particularly harmful piece of content
54:07
whether it’s anti— let’s go to a
54:10
covid example
54:11
the movie plandemic i mean things like
54:14
that
54:14
have real world consequences i think
54:18
that some government oversight should be
54:19
able to know
54:20
did that movie — or did that
54:23
post — truly
54:24
organically reach the millions or did
54:26
your recommendation engines and your
54:28
algorithms boost
54:29
promote, amplify and send it? those
54:32
are the things that we don’t have any
54:34
answers to and that’s where i think the
54:36
responsibility lies i think if we could
54:38
create
54:38
actual transparency and again it doesn’t
54:42
mean that they have to give up their
54:43
secret sauce and it doesn’t mean that i, a
54:45
private citizen, have to be able to know
54:47
every ounce but some oversight
54:50
function within government to know
54:53
did your algorithms boost that
54:56
content to a level that it never would
54:58
have reached on its own
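
For concreteness, one way such an oversight number could be computed — a minimal sketch in which the event format, surface names, and fields are all assumptions for illustration, not any platform’s real schema:

```python
# Sketch of the transparency question above: what fraction of a post's reach
# came from algorithmic placement versus organic paths? Surface names and the
# impression-log format are hypothetical.

from collections import Counter
from typing import Iterable

ALGORITHMIC_SURFACES = {"recommended_feed", "suggested_groups", "trending"}
ORGANIC_SURFACES = {"follower_feed", "direct_share", "search"}

def amplification_ratio(impressions: Iterable[dict]) -> float:
    """Fraction of impressions attributable to algorithmic distribution."""
    counts = Counter(event["surface"] for event in impressions)
    algorithmic = sum(counts[s] for s in ALGORITHMIC_SURFACES)
    organic = sum(counts[s] for s in ORGANIC_SURFACES)
    total = algorithmic + organic
    return algorithmic / total if total else 0.0
```

Under these assumptions, a ratio near 1.0 would answer the question posed here: the content reached its audience mainly because the recommendation engines boosted it, not organically.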
55:00
that’s what i’m talking about and we
55:02
have oversight in all sorts of
55:03
industries
55:04
that’s what happens when you have a
55:06
product that is creating an externality
55:08
in the world
55:09
that is where regulation and oversight
55:11
comes in
55:12
and i think many of us would agree that
55:14
facebook is an interesting product but
55:16
it’s certainly creating externalities
55:20
yeah — i wish we had
55:23
hours to go. it’s so funny — we’ve
55:24
talked in advance of this discussion
55:27
for like an hour, it’s so long, and i
55:29
think
55:30
it goes so fast to me because i think
55:32
there’s just so much richness to talk
55:33
about
55:34
you know one thing i wanted to do is
55:36
give you the opportunity to have an
55:38
uplift out of the discussion we only
55:40
have a few minutes left and i wonder
55:41
i hope it’s an uplift um as you think
55:44
about you know rebuilding american
55:46
society and beyond of course you know
55:48
i’m thinking about american society —
55:49
we’re both americans but i know we have
55:50
international listeners and viewers
55:53
um but i think we know that the us
55:56
is in some ways a sort of litmus test
55:58
for
55:58
a lot of what happens
56:00
internationally do you think
56:02
that are you optimistic i guess
56:05
about our prospects to rebuild as a
56:08
society
56:10
and what do you think needs to happen if
56:12
we’re going to do that
56:14
possibly so it’s funny i’ve had a few
56:16
people introduce me as being optimistic
56:18
which i thought was weird because i’m
56:19
always the one screaming about all the
56:21
fires
56:21
but they said why would you keep
56:23
fighting if you didn’t believe there was
56:24
something worth fighting for
56:27
um i think a few things coming out of
56:32
this pandemic this is not necessarily a
56:35
social media answer
56:36
but coming out of this pandemic i really
56:38
i don’t know if it’s going to happen
56:40
but i hope that we have at least with
56:43
some people
56:44
a shift toward a more public service mindset
56:47
a
56:47
realization
56:51
of the true disparities that are
56:53
happening in this country
56:55
um more i think with what’s happening
56:57
politically right now
56:58
i will tell you one of the things i’m
57:00
optimistic about is an incredible
57:02
increase in civic engagement
57:04
um i mean i’ve always been a public
57:05
servant at heart which is why i didn’t
57:07
do well at places like facebook
57:09
um but i think part of what brought us
57:12
to where we are
57:13
is also that we have very much lost
57:17
this civic engagement um part of
57:20
american identity so i do
57:24
hope — i am hopeful — i hope it’s not just
57:26
everyone’s showing up to vote for this
57:28
election and then everyone goes back to
57:29
not caring about
57:31
what their role is at the end of the day
57:34
we
57:34
all have a role in building the society
57:36
we want
57:37
and for a long time i think that sort of
57:40
started backsliding a bit in the us
57:43
i hope that more and more people stay
57:45
engaged and really fight
57:47
for what they want to see and whether
57:48
that’s their fights about
57:50
wanting a healthier information
57:51
ecosystem whether it’s their fights
57:53
about wanting
57:54
facebook to step up and stop allowing
57:57
conspiracy theorists and hate groups to
57:59
thrive whether it’s racial justice
58:01
whatever it is
58:03
we can’t just sit at home and like
58:04
things on facebook and think that we’ve
58:06
solved the world
58:07
and i do think there’s an incredible — i
58:09
work with young technologists and
58:10
startups who are trying to think through
58:12
how to create
58:13
more interesting products that
58:16
work for our benefit but when i see
58:19
those kinds of pitches that makes me
58:20
pretty optimistic so i don’t know i mean
58:23
by the time this comes out in the
58:24
podcast it’s gonna be a different world
58:26
yeah
58:26
the election’s gonna have
58:28
concluded and like
58:30
from today when we’re doing this now to
58:32
when this will come out in the podcast
58:34
could be an unbelievable amount of
58:36
change in this country
58:38
um i just hope that everybody remembers
58:40
that we all have a
58:41
part to play and this individualism
58:44
versus collectivism has always been a
58:46
huge tension in the us
58:48
i just hope that we nudge just a tiny
58:50
bit more towards the collectivism
58:52
direction. i so appreciate that and
58:55
actually i was going to ask you what
58:56
tools or technologies or things adjacent
58:58
to technologies are you most hopeful
59:00
about but you talked about some of the
59:02
startups and the founders that are doing
59:03
things around
59:04
you know using technologies in other
59:06
ways that are kind of bringing
59:08
people together are there other things
59:09
that come to mind for you of
59:11
a democracy tool or civic tools or
59:13
anything like that that are
59:15
encouraging
59:16
yeah i mean just look at
59:18
all the things that have popped
59:19
up just around this election right just
59:21
look at how people have used tools to
59:23
really help get out the vote and not
59:25
just — i mean some of it — listen
59:27
i still believe — i mean
59:30
i don’t believe technology is the answer
59:32
to everything
59:33
i just don’t. um but technology that
59:36
helps us
59:37
get the right information that we need
59:38
that helps connect people without
59:40
exploiting
59:41
us there are people trying to solve all
59:43
these challenges i couldn’t name any one
59:45
that’s the most
59:46
exciting i really like that there’s more
59:48
civic tech
59:49
coming up i really like that there are
59:50
investors i mean i’ll do a little plug i
59:52
mean the reason why i
59:54
i started working with betalab at
59:57
betaworks is because they are trying to
59:59
invest in companies
60:00
that are focused on privacy on a
60:04
healthier internet
60:05
on different business models people are
60:07
really trying to solve these challenges
60:10
but it’s going to require a whole
60:13
systemic change
60:15
from how our markets reward
60:17
companies
60:18
to how investors invest in them to how
60:21
people
60:22
engage in their democracy and so i do
60:24
hope that
60:25
this civic engagement remains on
60:28
the rise
60:29
excellent i also want to give a reward
60:31
to people who’ve stuck around thus far
60:34
you shared with me one of the things
60:35
that’s keeping you
60:37
busy right now and i’m gonna go ahead
60:38
and pop some pictures up on the screen
60:41
uh can you tell us about what
60:43
we’re seeing on our side here with this
60:45
adorable
60:46
pup. this is — i call him my little
60:48
terrorist
60:49
uh this is maggie’s little puppy who’s
60:52
trying to you know he’s supposed to
60:53
bring down my anxiety about the election
60:55
but if any of you have ever had a really
60:57
little puppy oh my gosh
60:59
it’s like 24/7. i’m
61:02
very tired but then you see those cute
61:04
little eyes and you kind of can’t help
61:06
it
61:06
oh my gosh, the most adorable. so that
61:09
was really — that was worth it
61:12
everything that
61:13
we just went through to have this
61:14
conversation all the exhaustion i’m sure
61:17
you feel
61:17
digging all these topics up again and
61:19
again at least we got to see
61:21
that puppy of yours
61:26
hey uh before we close out can you make
61:28
sure that folks know where to find you
61:30
or follow your work uh you know
61:32
accidentally tag you on social media
61:34
like, how should they connect with you?
61:37
um all of the stuff i’ve written — all
61:39
the main things — are on my website which
61:41
is just my name
61:42
yaeleisenstat.com. or i’m on twitter — it’s
61:45
always under my true name so that the
61:46
russians can’t troll me or hack me or
61:49
pretend to be me — so just @yaeleisenstat
61:51
instead
61:53
and i just want to point out uh so all
61:54
texas human has said great conversation
61:57
absolutely. uh carolyn burch said a very
61:59
important dialogue to advocate —
62:01
keep voicing it. and georgia o’neil
62:04
said
62:04
earlier that was a great question that’s
62:06
my mom by the way so thanks for tuning
62:08
in mom
62:10
and yael, thank you so much for being
62:12
here with us today and
62:14
i just can’t thank you enough for
62:15
taking this time and sharing your
62:17
energy
62:18
with us to unpack these issues and
62:21
make sure we all leave here a
62:22
little smarter and
62:23
a little more sophisticated in the way
62:25
we’re thinking about the interplay
62:26
between
62:27
tech democracy and ethics thank you so
62:29
much thank you for having me
62:32
all right thank you bye