The Tech Humanist Show: Episode 6 – Kaitlin Ugolik Phillips

About this episode’s guest:

Kaitlin Ugolik Phillips is the author of The Future of Feeling: Building Empathy in a Tech-Obsessed World. She is a journalist and editor whose writing on law, finance, health, and technology has appeared in The Establishment, VICE, Quartz, Institutional Investor magazine, Law360, Columbia Journalism Review, Lithub, Scientific American, NY Post, Salon, and Narratively, among others. Kaitlin writes a blog and newsletter about empathy featuring reportage, essays, and interviews.

She tweets as @kaitlinugolik.

This episode streamed live on Thursday, August 20, 2020. An archive of the show is available on YouTube.

Highlights:

2:46 welcome Kaitlin

About the show:

The Tech Humanist Show is a multimedia-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.

Subscribe to The Tech Humanist Show hosted by Kate O’Neill on YouTube for updates.

Full transcript:

00:03
hey everybody
00:06
hello humans
00:10
uh hi clark i see you’re already on uh
00:12
everybody else let me hear from you
00:14
um you’re out there listening and let me
00:16
know where you are
00:18
in the world and where you’re tuned in
00:19
from i know or maybe even like what
00:22
channel you’re tuned in from
00:23
that’s always fun to see i see clark
00:25
here in from
00:26
linkedin it shows right in your little
00:28
icon there so that’s cool
00:32
we have uh we’re going to be rocking and
00:34
rolling in just a few minutes
00:36
if you have been tuned in before you
00:39
already know this is the tech humanist
00:40
show of course
00:41
it is a multimedia format program so
00:43
we’re doing this live
00:45
across youtube linkedin twitter facebook
00:48
and even twitch although nobody watches
00:50
on twitch
00:51
it’s just to like check all the boxes uh
00:54
and then
00:55
that so it’s we’re talking about how
00:57
data and technology shape the human
00:58
experience with a
01:00
guest each week and then it becomes an
01:02
audio
01:03
format so it goes to podcast and it
01:06
becomes available on
01:08
ios and google and stitcher and all that
01:10
good stuff so
01:11
please subscribe or follow along
01:13
wherever you’re watching from wherever
01:14
you’re listening from
01:16
and you won’t miss any new episodes so
01:19
uh
01:19
looks like we are uh live and rolling
01:23
so i’m gonna go ahead and and bring in
01:26
our guest
01:28
we got um i see some some numbers
01:30
starting to to pick up i want to make
01:32
sure i don’t
01:33
run too soon if people are still tuning
01:36
in but we’ll go ahead
01:37
all right marco all the platforms
01:41
yeah sorry twitch twitch you’re just not
01:43
you’re i mean maybe i’m just not hip
01:45
enough for twitch that’s more the issue
01:47
that’s really what’s going on there
01:50
but we got it out there anyway so you
01:52
know if you’re a die die-hard twitch
01:54
user or viewer you’re more than welcome
01:56
to view the show there
01:58
all right let’s get to uh to our guest
02:01
we are talking with kaitlin ugolik
02:03
phillips
02:04
the author of the future of feeling
02:07
building empathy in a tech obsessed
02:09
world
02:10
she is a journalist and editor whose
02:12
writing on law finance health and
02:14
technology has appeared
02:15
in the establishment vice quartz
02:18
institutional investor magazine
02:20
law 360 columbia journalism review
02:23
lithub
02:24
scientific american new york post vice
02:27
salon and narratively
02:28
among others so that’s just a sampling
02:31
there’s even more just
02:32
insert name of publication here she has
02:35
probably
02:36
published in it kaitlin writes a blog
02:38
and newsletter about empathy featuring
02:40
reportage essays and interviews so start
02:42
getting your questions ready for our
02:44
guest
02:46
and kaitlin you are live
02:49
on the tech humanist show thank you so
02:50
much for joining us thank you for having
02:52
me i’m excited
02:54
yeah me too this seems like it’s been
02:56
kind of a um
02:57
an inevitable conversation between us
02:59
i’ve been looking forward to it very
03:00
much so
03:01
i’ll just jump right in and ask you how
03:02
did you first get on on researching and
03:04
writing about how tech
03:05
is changing the future of human empathy
03:08
well it really came
03:09
from um just the the concern that about
03:12
that question
03:13
you know i um as an old millennial i’m
03:16
32
03:17
so i kind of grew up online um
03:21
first with aol instant messenger and
03:22
live journal and then
03:24
myspace and i really miss myspace um but
03:28
you know on and on
03:29
to where we are now um and a few years
03:32
ago
03:33
2014 2015 i just was kind of having a
03:36
lot of really negative experiences
03:38
um friends and strangers on social media
03:41
and just kind of
03:42
trying to think more broadly about like
03:44
man this is where we’ve gotten just in
03:46
the past 10 years
03:47
how are things going to change as
03:49
technology continues to change i was
03:52
um on the tech finance staff at the time
03:55
um at institutional investor and i was
03:57
just realizing you know
03:59
okay all this investment is happening in
04:01
the internet of things and
04:03
in virtual reality and all these other
04:05
new
04:06
emerging technologies and i was just
04:09
worried
04:09
you know what is going to happen to our
04:11
ability to communicate with one another
04:13
and um
04:14
it was really a situation where i tried
04:16
to find a book that focused on
04:18
solutions or you know the concerns about
04:21
that and i
04:22
didn’t find one at the time so i i had
04:24
to write it
04:25
that’s your job you identify a gap now
04:28
you got to write it
04:28
[Laughter]
04:30
well it and it’s it’s such an important
04:32
book it’s like covers such an
04:33
interesting topic and we’ll we’ll dig in
04:35
in a lot of different ways um but i
04:37
actually want to do a little quick
04:38
side note and just ask you know how tech
04:40
affects uh
04:42
human life and human uh
04:45
just experience in general uh this this
04:48
pandemic has certainly changed
04:50
human life and human experience and tech
04:52
has been a part of it so i’m curious
04:53
what your life
04:54
during the pandemic has has looked like
04:56
have you taken up any new
04:57
hobbies you know developing new skills
04:59
sort of picked up any new wide-ranging
05:01
research projects that you’ll eventually
05:02
publish
05:04
well i don’t i haven’t been that
05:06
productive but i
05:07
i will say that i you know the book came
05:10
out in february
05:11
like a month before everyone started
05:13
going into lockdown and i definitely
05:15
did not imagine that some of the kind of
05:17
dystopian seeming
05:19
um situations described in the book
05:22
would
05:22
become reality so quickly um and kind of
05:25
do this big social experiment right away
05:27
on
05:27
you know what does it mean to be plugged
05:29
in constantly all the time and
05:32
have to communicate that way um i have
05:35
been
05:36
trying to actually not spend as much
05:38
time
05:39
with technology because you know all of
05:42
my work day is on
05:43
on the computer um except that i have
05:45
taken up
05:46
animal crossing on the nintendo switch animal
05:48
crossing everybody’s crazy about animal
05:50
crossing
05:51
it’s so it’s perfect like i’m not a big
05:53
gamer but so it’s perfect for me because
05:55
it’s
05:56
just mindless tasks cute and you know
05:59
that’s you know i have started
06:01
um doing some research for a potential
06:04
new
06:05
project but that’s actually about
06:06
something totally different
06:08
um people keep asking me you know oh you
06:10
should write a follow-up to your book
06:12
about
06:12
and i’m like i don’t think i need to
06:14
write a book i think everybody knows
06:16
what’s happening now
06:17
you know like well and also your book
06:19
just came out
06:21
i know that’s that’s inevitable though i
06:22
mean everybody who writes a book knows
06:24
that
06:24
you know as soon as the book is handed
06:26
over and done
06:28
you’re on to the next idea like you
06:29
already are thinking about it
06:32
i don’t think people realize that how
06:34
how long that
06:35
process takes so maybe in 2023
06:39
we’ll see something new will it be about
06:42
animal crossing do you think
06:44
probably not unfortunately i’ll keep
06:46
that for myself
06:48
well i loved i loved the book um i loved
06:51
a lot that happened in it what you
06:53
covered and i loved the people that you
06:55
introduced us to there was one in
06:57
particular dylan
06:58
marron or marrone i don’t know how you
07:00
pronounce his
07:01
last name but um i wasn’t aware of his
07:04
work but
07:05
but what you described about
07:07
conversations with people who hate me
07:08
and unboxing every single word and
07:11
sitting in bathrooms with trans people
07:13
they’re all brilliant concepts i did
07:15
end up watching a few episodes of the
07:18
every single word because they’re only
07:19
like 11 seconds long you know which
07:21
just for the audience in case you
07:22
haven’t run across these if you look
07:24
uh go to youtube and look up dylan
07:27
m-a-r-r-o-n
07:28
every single word and it’s what he’s
07:30
done is
07:31
splice together or just edit down
07:34
feature films
07:35
to just those lines that were spoken by
07:38
actors of color
07:39
and it’s sobering and i mean i’m sure
07:42
that many uh um
07:44
people of color were already aware of
07:46
this phenomenon and white people may
07:47
have been aware of it too but it’s such
07:49
a striking
07:50
striking illustration so i think the one
07:53
i looked at was moonrise kingdom and it
07:55
was 11 seconds long
07:57
so you know it really gets the point
07:59
across but anyway he’s just a brilliant
08:02
uh creator and i loved how you you um
08:05
wove in what he’s doing with uh how
08:09
especially with conversations with
08:10
people who hate me with uh with how
08:12
people
08:12
are interacting online so have you found
08:14
any other cool people since you’ve
08:16
published and that you you are like oh i
08:17
wish i could have included them
08:19
i’m so glad you asked that question
08:21
because yes and i think that happens so
08:23
often but
08:23
then authors don’t really get to say
08:25
that but first dylan is great i got to
08:28
meet him in person
08:29
and his podcast i think it’s still
08:32
ongoing is
08:33
really an exercise in empathy i highly
08:36
recommend listening to it
08:37
um but to answer your question so i
08:39
recently interviewed dr matthew
08:41
at the university of northern colorado
08:43
and he does a lot of research
08:44
on um gaming um in education
08:48
so specifically for social emotional
08:50
learning for younger kids and the use of
08:52
games in schools
08:54
and he and i had a conversation about
08:56
like ed tech
08:57
and virtual schooling but he just had so
09:00
many
09:01
fascinating things to say about how kids
09:03
learn
09:04
social emotional skills through
09:05
technology and
09:07
we talked a little bit about vr and
09:09
afterwards i was like well
09:10
either you should have written my book
09:12
or i wish i had talked to you
09:15
four years ago yeah that hit on at least
09:17
three or four ethics right
09:18
yeah and then i also um
09:22
the organization color of change i i
09:24
didn’t come across them until
09:25
uh relatively recently but it’s um it’s
09:28
an online civil rights organization
09:30
and they deal with a lot of these kinds
09:32
of things um
09:34
from like the policy arena um of you
09:37
know
09:38
not just empathy but civil rights on the
09:40
internet um and it would have been great
09:42
to include them
09:42
and honestly you
09:46
i came across your work a little bit too
09:47
late as well and you would have been a
09:49
great person to include
09:50
well we’re connected now and we’re doing
09:52
yeah yeah that’s wonderful
09:54
now those are and those are brilliant
09:55
examples too um although
09:57
um at least on my own the audio cut out
09:59
when you gave the name of the first
10:01
researcher can you just say his name his
10:02
or her name again
10:04
yeah it’s dr matthew farber he’s at the
10:06
um
10:07
university of northern colorado great
10:09
yeah i’m sure that there were uh
10:12
listeners who were like oh i didn’t
10:13
catch that name uh you know what what i
10:15
think was so interesting for me
10:17
about your book is it’s
10:20
empathy is this concept that i think
10:22
people talk about a lot
10:24
and then they do it in this very
10:26
abstract broad
10:28
you know un intellectually undisciplined
10:31
way
10:32
but what you’ve done with the book is
10:35
really
10:36
unpack it and and go into different
10:39
aspects of
10:40
it with intellectual rigor and so even
10:42
at one point
10:43
uh you quoted an essayist named leslie
10:46
jamison
10:47
who was writing about medical empathy
10:49
and she wrote what i think may have been
10:50
the most beautiful thing i’ve ever
10:52
read about empathy and it was this
10:53
empathy isn’t just listening it’s asking
10:55
the questions whose answers need to be
10:57
listened to
10:57
empathy requires inquiry as much as
10:59
imagination
11:01
empathy requires knowing you know
11:02
nothing and
11:04
empathy means acknowledging a horizon of
11:06
context that extends perpetually beyond
11:09
what you can see and so i to me that was
11:12
just like it seems like she’s really
11:13
isolated
11:14
these elements that make empathy such a
11:16
special concept like
11:18
it’s imaginative it’s active it’s
11:20
curious and it’s dimensional
11:22
so do you find that you
11:25
collect definitions of empathy as part
11:27
of what of what you’re doing here
11:30
do you have a favorite that definitely
11:33
wasn’t
11:34
something i expected but that has
11:35
definitely happened um
11:37
and yeah you thank you for reading that
11:40
bit because that’s from empathy exams
11:42
it’s her collection of essays and it
11:44
gives me chills
11:45
um because that was actually the book
11:47
that i picked up thinking i might find
11:49
what i was looking for and it was
11:50
something totally different but
11:52
so so beautifully written and i think
11:54
one of the most important
11:55
parts is when she says the perpetual i
11:58
don’t remember the exact phrasing but
11:59
that
12:00
your inability sort of knowing what you
12:03
don’t know
12:04
and you may not ever know but
12:08
it’s kind of like what’s in that gap
12:10
there that you are still making an
12:12
effort
12:12
to um show emotion and show that you
12:16
care
12:16
and i think that’s a really important
12:19
piece that often gets missed when
12:20
empathy becomes a buzzword especially in
12:23
the tech community and in the business
12:24
community especially in business where
12:26
you have
12:27
all of the you know empathy
12:31
the empathy index and all of these you
12:32
know it’s just used as
12:35
to actually just mean oh you know we
12:37
kind of care about our employees
12:38
or you know we um have a great hr
12:42
department um but in terms of a favorite
12:46
definition i liked um
12:50
thinking and this is actually from i
12:52
think a buddhist
12:53
some buddhist writing i can’t remember
12:55
exactly where it came from but
12:57
um that sympathy is feeling for someone
13:00
and empathy is feeling with someone
13:02
it’s a simple you know and further to
13:05
say
13:05
even just making the effort to feel with
13:08
someone
13:09
acknowledging that you might not
13:11
literally know what it’s like to
13:12
experience what they’re experiencing i
13:14
think that’s kind of like
13:16
an active form of empathy yeah it’s a
13:19
that’s a really
13:20
a great distillation phrase because
13:23
i think that’s what a lot of it comes
13:24
down to is you can work around the
13:26
nuance and the abstraction and i love
13:28
nuance and abstraction i’ll work around
13:30
nuance and abstraction all day
13:32
but when you’re actually trying to
13:34
deliver
13:35
something that’s applicable for people
13:37
that people can actually
13:38
take and understand in in real useful
13:42
ways
13:42
it it requires that level of
13:44
intellectual rigor and
13:45
i find this too a lot of my work centers
13:48
around meaning
13:49
and meaning is a similarly like a
13:51
concept that people
13:53
talk about in very abstract terms and so
13:56
a lot of the work i’ve done over the
13:57
last few decades is to develop
14:00
like concepts and clear clear constructs
14:03
that people can apply
14:04
and how how they can break down what
14:06
we’re really talking about when we talk
14:08
about meaning and specifically
14:09
meaningful experiences so did you
14:11
was that part of in any way what drew
14:12
you to the topic that it lacked that
14:14
intellectual rigor and you mentioned
14:16
going to to the
14:17
one book and being disappointed and and
14:19
kind of feeling like now i need to
14:20
create this
14:22
yeah i think that might have been part
14:23
of it i mean i think what i
14:25
at the beginning thought was missing was
14:27
a future looking
14:28
um piece of intellectualism i guess that
14:34
that specifically focused on that tech
14:36
and human
14:37
interconnection um but as i went on with
14:40
my reporting i really started to think
14:42
more about
14:43
the danger of having a word that can
14:46
kind of be used
14:48
in different ways i know that a lot of
14:51
the things that i wrote about and i
14:52
tried to be
14:53
[Music]
14:55
clear that there were pros and cons and
14:56
that there were criticisms of different
14:58
things especially like when it comes to
15:00
virtual reality experiences that are
15:03
meant to build empathy
15:04
it can be it’s so easy to say okay this
15:08
is an empathy building experience
15:10
put on the headset pretend to be someone
15:12
else just end it
15:13
there and now this person walks away
15:16
with this idea that they know what it’s
15:17
like to be
15:19
to have a disability or to live in a
15:21
different
15:22
um skin color and i think that’s
15:26
that’s kind of why i was looking
15:30
to really lay out you know talking about
15:32
when i’m talking about empathy
15:33
so that it’s clear that i don’t
15:36
subscribe to those
15:37
things yeah that was a really uh
15:40
compelling section of the book when you
15:42
wrote about virtual reality augmented
15:45
reality mixed reality like
15:46
all of the the uh kinds of um
15:50
mixed reality options and what was
15:53
happening
15:54
with different experiments that people
15:56
were trying
15:57
you know there’s this concept that
15:58
virtual reality is the the empathy
16:00
machine as chris milk called it right
16:01
and
16:02
right and then i think you know there’s
16:04
there’s a lot of um
16:05
sort of default acceptance of that and i
16:08
love how well you unpacked it so
16:11
walk us through some of what you found
16:13
when you started to
16:14
you know examine what was happening in
16:16
some of the research that was actually
16:18
out there
16:19
um well one interesting thing is that
16:21
there’s not a ton
16:23
like a lot of what i wrote about is so
16:26
ongoing so it was kind of difficult to
16:28
try and you know
16:30
i couldn’t really prove much um but i
16:33
could talk about my experiences and a
16:35
lot of other people’s and a little bit
16:36
of research
16:38
the most eye-opening experience and
16:40
interview was with dr courtney cogburn
16:42
um and she i believe is at columbia
16:46
and she helped create this experience
16:48
called 1000 cut journey
16:50
and it’s an embodied virtual experience
16:53
so you put on a headset and also
16:55
hold these controllers and you move
16:57
around
16:58
and you kind of go through different
17:00
stages of life
17:01
of this young black man when he’s a kid
17:03
when he’s
17:04
a little bit older when he’s an adult um
17:06
and it’s
17:07
it’s um it’s animated and you’re kind of
17:11
embodied so you can sort of look down
17:13
and you know your
17:14
your arms are the color of his arms and
17:16
you you’re being spoken
17:18
to as if you’re him and um what i wrote
17:21
about
17:22
like i went into that experience excited
17:25
but also skeptical
17:26
for the reasons i just explained and
17:29
when i came out of it i really felt like
17:31
it didn’t i felt like more than anything
17:34
it highlighted to me just how little
17:36
i actually know about what it’s like to
17:39
be
17:40
this particular type of other person so
17:42
it wasn’t that
17:43
oh now i could just put this in my you
17:46
know
17:47
white ally backpack and say that i you
17:49
know had this experience but that
17:51
it it emphasized to me how different my
17:53
experience was
17:54
which i think triggered maybe a more
17:57
useful kind of empathy that was like
17:59
okay let me
18:00
learn a little more um
18:03
or or kind of keep going but i think
18:06
some research that i read also
18:08
emphasized that
18:10
that’s not going to be everyone’s
18:11
experience in part because the
18:14
experience and sort of
18:17
behaviors and personality that you have
18:19
going
18:20
into these experiences makes a huge has
18:23
a huge impact on
18:24
how you feel going out so like any other
18:27
tool and i say this over and over in the
18:29
book like these things could be used
18:31
to help us build empathy but also to
18:33
potentially be manipulative
18:35
or you know cause other problems right i
18:38
mean that’s
18:39
one of the recurring themes in my work
18:41
is that there’s a lot of good
18:43
to be harnessed here and there’s also a
18:46
lot of bad and destructive power and so
18:47
i think we have to be very eyes wide
18:49
open about both of those things about
18:52
recognizing
18:52
what the harms are and what could
18:54
possibly go wrong in order to
18:56
build the best futures and and create
18:58
the best possible outcomes
19:00
with tech and i think kind of in general
19:02
if anything
19:03
right yeah but i thought it was very
19:05
interesting that
19:06
your example of um the one scenario that
19:09
seemed like it did
19:11
uh offer some dimensional
19:14
empathy or or some re-contextualizing of
19:17
empathy was
19:18
cathy hackl who’s a friend of mine and
19:21
and i’m she may be watching the show
19:24
so cathy had talked about losing
19:27
her feeling of empathy from having been
19:29
subjected to
19:30
horrendous video footage at cnn i think
19:34
it was
19:34
and and then having that sort of
19:37
reactivated by a virtual reality
19:40
experience or a 360
19:42
360 degree sort of video experience is
19:44
that right yeah
19:45
yeah i think she she told me she felt
19:47
kind of desensitized
19:48
totally makes sense because if your job
19:50
is to watch
19:52
sort of screen horrific images to see
19:55
what’s acceptable to put
19:57
on tv um how do you how do you do that
20:00
for more than a few minutes without
20:02
turning some
20:04
you know something off um but yeah she
20:06
said she used the metaphor of like a
20:08
switch flipping when she
20:10
um i guess she she had moved on from
20:12
that position and then
20:13
experienced this 360 video uh i think it
20:16
was one about
20:17
solitary confinement and having
20:21
i don’t know it just kind of was like oh
20:25
okay like that that muscle is still
20:27
there and actually that’s how i like to
20:29
think about empathy is that it’s a kind
20:31
of a muscle that if we
20:32
if we let it atrophy
20:36
which tech can unfortunately help us do
20:39
it becomes more of an issue but if we
20:41
practice it more regularly
20:44
we can you know have better
20:47
communication
20:48
and relationships yeah and i think
20:49
that’s a really
20:51
important concept and an important
20:53
discussion i’d like to
20:55
just set that aside for just a moment
20:56
because i think i definitely want to get
20:58
there and talk about
21:00
how can people develop better empathy
21:02
and use those skills and all of that but
21:04
i wanted to stay just for a second with
21:05
this mixed reality discussion because
21:07
you also touched on a point just a
21:08
moment ago
21:09
about how some of these experiences um
21:12
like the 1000 cut journey where you
21:15
are inhabiting
21:17
this fictitious black young man
21:20
character
21:21
and i think you you made a really
21:24
interesting point and you had other
21:25
experts who were
21:26
you quoted as making the point that
21:29
this is a really kind of tenuous uh
21:32
process like you’re you’re experimenting
21:34
in ways or or these ex-
21:36
these experiments are being done in ways
21:39
that
21:41
you know they that you hope is gonna
21:42
create more empathy
21:44
but there is this risk of number one
21:48
of exposing something someone to an
21:50
experience that they then think they
21:52
understand fully and they can be sort of
21:54
dismissive of
21:55
now and that of course is a huge risk
21:58
but also the other thing that was
21:59
pointed out
22:00
i think i have it in my notes here uh
22:03
kang uh wrote in this in slang that the
22:05
the very concept of empathy creation
22:07
through vr
22:08
is an othering process that right that’s
22:11
that was such a brilliant observation so
22:13
yeah so it felt like you you really uh
22:15
got into some some great critique of of
22:18
that
22:19
that notion of vr as an empathy machine
22:22
and what did you feel like you walked
22:23
away with is sort of the
22:24
the take away the summary and the
22:26
recommendation for people
22:28
yeah it was definitely a journey while
22:31
reporting the book because i
22:33
did i did start with this idea that i’m
22:36
looking for solutions i’m looking for
22:38
positive
22:39
future looking things and then having to
22:42
realize as i came across these different
22:45
other perspectives
22:46
that you know not that i thought
22:48
everything was going to be
22:50
puppies and roses but that you know
22:52
there are real
22:53
important significant um caveats and
22:56
really i just kind of came away with
22:58
like more of an ambivalence about
23:01
you know what what is possible and at
23:04
least currently
23:05
um and
23:09
i guess i the final chapter of the book
23:12
ended up
23:13
ends up being a little bit of a soapbox
23:14
which anyone who reads it will see
23:16
about why i think tech companies need to
23:19
take all these things into more
23:21
consideration but
23:22
um i just think that i still think that
23:25
exposing yourself to as many other
23:27
perspectives as possible is always a
23:29
good thing whoever you are
23:30
um but that piece again that we talked
23:33
about at the beginning that part of
23:34
empathy is knowing what you don’t
23:36
know and i think right now in in the
23:38
world and politics and
23:40
culture and science we’re all trying to
23:42
get comfortable with cognitive
23:44
dissonance and with
23:46
not knowing things and and with having
23:48
conversations about things that we don’t
23:50
have the answers about and i think
23:52
social media in particular really
23:54
encourages you to just
23:56
be concrete and to
23:59
you know say this is what i believe and
24:01
stand by it
24:02
in a debate even if really you know
24:05
you’re more ambivalent or you know and i
24:07
think
24:08
getting more comfortable with not
24:10
knowing um
24:11
is really at the core of of all of this
24:13
and i and i think that
24:14
um that that othering piece is something
24:18
i think about a lot and i worry about a
24:19
lot and i don’t know exactly how
24:22
to remedy and maybe we don’t but i’m
24:25
glad to be
24:26
thinking about that and being able to
24:28
say you know i don’t know
24:30
and and let me try to imagine more what
24:32
that feels like
24:33
yeah and it seems like an important
24:35
caveat or caution to be able to
24:37
offer to anyone who’s trying to create
24:39
virtual experiences with the aim
24:41
of increasing an empathetic response in
24:45
people that that there are those
24:47
considerations
24:48
too that there need to be sort of that
24:50
holistic way of thinking about it that
24:52
this could have the opposite effect of
24:54
what you’re intending
24:55
and it could also create a sense of of
24:59
outsidership in the people that you’re
25:00
trying to build allyship for
25:03
and you know one way to to you know
25:06
address that is just to include
25:08
different kinds of people
25:09
you know i mean that’s kind of part of
25:11
the soapbox i ended up on at the end of
25:13
the book is like
25:14
more women more people of color people
25:15
with disabilities people
25:18
um who you know are
25:21
are descendants of native americans or
25:24
um just people who aren’t just all the
25:28
same kind of silicon valley guy
25:29
that and i say guy for a reason that
25:32
you know not to say that
25:36
every problem we’re currently having
25:38
with you know
25:39
social media for example can be traced
25:41
exactly to identity issues but
25:44
hello if you don’t involve people
25:47
and the at the beginning of the process
25:49
it should not be surprising when
25:52
these things come up years later or
25:54
weeks later
25:55
that oh i didn’t think about that i mean
25:57
that’s not an excuse anymore
25:58
in 2020 yeah no it’s such it’s such a
26:02
brilliant point i actually want to pivot
26:04
there to uh talking about
26:05
the social media interactions in general
26:07
but just this little
26:08
side note from one of our viewers
26:10
christopher dan’s
26:11
told us that it was an excellent topic
26:13
and also was curious about the chemical
26:14
compound on your necklace is there
26:16
relevance oh
26:17
um this is serotonin or at least that’s
26:20
how it was advertised
26:23
um i’m a i’m an anxious gal and
26:26
when i have to do it um an interview i
26:29
like to have my my extra serotonin armor
26:32
oh i love that so much what a what a
26:34
great reveal thank you
26:36
christopher for asking that question uh
26:39
christopher had also commented
26:40
um when we were talking about cathy
26:43
hackl and
26:44
and her uh story no one should have to
26:46
shoulder the ugliness of an entire
26:48
species and
26:49
so true thank you for sharing that so
26:52
uh yeah and uh sorry and christopher
26:55
just follows up and says thought so you
26:56
rock
26:57
[Laughter]
26:59
but let’s go back to so online
27:01
interactions are kind of a funny topic
27:04
because i think it’s
27:05
kind of funny that you and i met
27:07
originally through twitter in fact i
27:09
think almost everyone i’ve had as guests
27:11
on this show i’ve met
27:12
originally through twitter so you know
27:14
there is this possibility of having
27:16
real connection and making a really
27:19
meaningful
27:20
relationships with people but there’s
27:21
certainly knowing that twitter and other
27:23
social platforms are
27:24
hotbeds of what we might call antisocial
27:26
activity as well
27:28
so you really delved into this in in a
27:31
lot of ways it felt like
27:33
you know you’re you’re talking about uh
27:36
all of the well the one topic was um
27:40
dylan’s show conversations with people
27:42
who hate me and how he was able to
27:44
confront people who had been you know
27:46
toxic commenters and trolls with him
27:49
that seemed like it made an impression
27:51
on you in in the way you wrote about it
27:54
i just was baffled by the level of
27:58
empathy
27:58
that dylan was able to have for these
28:01
people and i don’t even know if that’s
28:03
what he would have called it
28:04
you know at the beginning but but
28:06
listening to that show
28:08
you know it’s so easy to stop seeing
28:10
people as
28:11
people i mean there’s even some research
28:13
i think i cite in the book about how
28:16
when when people disagree with us we
28:18
tend to see them as less
28:19
human not just online but in general and
28:23
if i saw someone saying nasty things to
28:26
me swearing at me calling me names
28:28
my gut instinct is not and actually this
28:30
is changing a little bit because i think
28:32
about this so much but my gut instinct
28:34
normally would not be
28:35
you know oh what’s this person going
28:37
through like what’s their life like why
28:39
is this what they feel like they need to
28:40
be doing um it would be
28:42
you know things i’m not gonna say out
28:45
loud right now
28:46
um but his his ability to
28:50
first of all convince people to have
28:53
these
28:54
offline conversations and record them
28:56
but then to just like quietly sit and
28:58
just listen to them talk and
29:00
um you know i think it doesn’t always
29:02
work out exactly how you would hope but
29:04
in a lot of cases these people
29:06
end up admitting oh i forgot you were a
29:09
human
29:10
like i didn’t think you would read it or
29:12
this is how i
29:14
relieve stress because my life is
29:16
difficult and i don’t feel like i have
29:17
other ways of relieving stress and
29:19
you know i try to make the point in the
29:22
book too that
29:23
um empathizing with someone isn’t
29:24
endorsing them and i think that’s
29:26
actually
29:26
might be the name of dylan’s new book
29:28
that he’s writing or has written
29:29
oh no empathy is not endorsement um but
29:33
that you know it takes
29:36
effort and it takes working that muscle
29:39
to say
29:40
i can empathize with the fact that this
29:42
person
29:43
is stressed they’ve experienced a lot of
29:45
hardship this is how they take it out
29:48
on people and without thinking that’s
29:50
okay
29:51
you know like it’s really about boundary
29:54
setting i think
29:55
and about um
29:58
moving away from the way that social
30:01
media makes conversations into games
30:03
so he yeah i just think that his work
30:06
has really kind of like given you a
30:08
great example of what it actually could
30:10
look like
30:11
if we had more empathy in conversations
30:13
about really difficult
30:14
things yeah and especially so you know
30:17
you talked about
30:18
and it’s an obvious place to go in
30:20
politics and disagreement and certainly
30:22
we’re going through
30:24
in the u.s you know this big
30:25
presidential election cycle
30:27
uh feels like we always are i know it
30:30
does it doesn’t ever stop anymore
30:32
from the moment there’s an election it’s
30:34
like the beginning of the next campaign
30:37
and i feel that even if even for viewers
30:39
who aren’t in the us
30:41
you must feel subjected to this too it
30:43
feels like it completely overruns
30:45
international news coverage
30:47
so i think one of the things that’s um
30:51
that’s interesting about that to me is
30:53
that there is this
30:54
duality um that i’ve written before
30:58
about how uh where we live online and
31:02
and you know we’ve we’re increasingly
31:04
our digital selves our aspirational
31:06
selves
31:06
and that because they contain data
31:09
representations of
31:10
the things we love and like
31:13
and admire and the
31:14
the date of relationships to the people
31:16
we love and like and admire and so in
31:18
one sense
31:19
it’s like we’re wearing our hearts on
31:21
our sleeves more but of course we're
31:23
simultaneously more vulnerable to
31:25
as bill vincent says in comments this
31:27
may have been discussed but
31:29
unfortunately too many online want their
31:32
anonymity to be
31:33
bad or to act badly and
31:36
what i think so it’s this simultaneous
31:39
vulnerability to this proliferation of
31:41
anonymous trolls and bots and harassment
31:43
and other bad behavior
31:45
so what’s the approach there do we
31:48
how do how do we best you know use that
31:51
the empathy that we we want to have
31:53
that’s a
31:54
human centric skill and navigate this
31:57
complex dualistic
31:59
world what what’s our best bet there oh
32:02
man
32:02
i wish i knew what the best but i mean
32:04
honestly my mind actually
32:06
first this doesn’t really answer the
32:08
question but my mind first goes to
32:11
having better um
32:14
tech like including
32:17
more things about communicating with
32:19
people via social media in
32:21
social emotional learning for kids so
32:25
and just focusing more on empathy
32:29
on that muscle for young people um
32:32
but that aside i think that
32:36
for me so instagram is my
32:40
like horrible boogeyman like that’s
32:42
where i see this happen the most is that
32:44
duality and i love the way you put that
32:46
because that really is where we are
32:47
is that you both are trying to be your
32:51
aspirational self
32:52
on instagram you know you’re you’re
32:54
posting the highlights or
32:56
you’re posting that you’re having a bad
32:57
day but in a way that you
32:59
think will get the most attention um but
33:02
then at the same time
33:04
you will see you know all these
33:07
anonymous users who make accounts
33:09
troll people um i’m currently fascinated
33:12
and horrified by this trend of
33:14
um mom influencers who
33:18
who are really into the qanon
33:21
stuff and are talking like they’re using
33:24
their platforms which are branded they
33:26
have these branded
33:27
partnerships to talk about all of these
33:29
things and
33:31
um but anyway that that part of
33:34
instagram also exists where
33:35
a lot of anonymous arguments and
33:38
trolling happening within there
33:40
i don’t like i don’t i don’t know that
33:43
they’re
33:43
i think the problem is that we’re all
33:46
being we’re all asked
33:47
and at this point it just seems natural
33:49
to all go into the same place
33:51
and talk about the same things in the
33:53
same way
33:54
but we actually all have very different
33:56
needs and
33:58
i wrote about this a little bit in an
34:00
article about
34:03
the mental health culture on places like
34:06
instagram and pinterest where you have
34:08
these pretty
34:10
images that have nice sayings and
34:13
sort of the commodification of like
34:15
anxiety and depression and
34:17
it’s cool to be not okay but like
34:20
then you’re comparing your not okayness
34:24
to other people’s
34:26
um and that that connects to this
34:29
because i think
34:31
being more in touch with what our
34:33
boundaries actually
34:34
are and what we’re actually comfortable
34:38
and capable of talking about and how
34:40
i think that actually is a place to
34:42
start for empathy for others because
34:44
a lot of times when i’ve found myself in
34:47
these kind of
34:49
quagmire conversations which i don’t do
34:51
so much anymore but definitely have in
34:53
the past
34:54
um i realized like it was about that i
34:56
was anxious about something or i was
34:58
angry
34:59
i’m really being triggered by what this
35:01
person is saying that’s about me
35:04
when when i mean that’s a pretty common
35:06
thing
35:07
in psychology and just in general that
35:10
when someone is trolling you
35:11
or being a bully it’s usually about them
35:14
and i think if we get better at
35:16
sort of empathizing with ourselves or
35:18
just setting better boundaries we’re
35:20
going to be
35:20
less we’re going to
35:24
wade into these situations less i mean
35:26
that’s a big ask i know for
35:29
millennials gen z gen x and anyone
35:32
trying to survive right now on the
35:34
internet
35:34
um but you gave some very helpful
35:38
you know useful guidelines so i want to
35:39
talk about one of the studies that you
35:41
did write about
35:42
that is talking about moderating social
35:45
spaces and comment sections and news and
35:47
so on
35:48
but the guidelines that that the study
35:51
presents
35:52
actually seem like they’re a pretty
35:54
interesting uh sort of framework for
35:56
us all to to think about how we approach
35:59
interactions online so i’m sure you know
36:02
the
36:03
the study i’m referencing where there’s
36:05
this the staging
36:07
or sorry the three the three basic
36:09
conversational rhetorical moves
36:12
um so i i have them here in my notes so
36:14
you don’t have to
36:15
reference it it was staging evoking and
36:17
inviting can
36:18
can you do you remember enough to you
36:20
know yeah talk through those
36:22
for us so that was fascinating to me and
36:24
i can’t remember but i think that was
36:26
a google api thing where they were kind
36:29
of keeping track of what was
36:31
happening in these comment threads um
36:33
and
36:34
so right they’re those three different
36:36
basic rhetorical moves in a conversation
36:39
and what what i found when i tested it
36:42
because you could
36:43
put chunks of text into um
36:46
this little bot and it would it would
36:48
kind of show you all the different
36:49
rhetorical moves that were happening
36:51
i had had i don’t even remember what the
36:53
conversation was about but kind of an
36:55
argument
36:55
with a group of people and i put a part
36:58
of our conversation into this bot
37:00
and it we were it showed that we just
37:03
all kept
37:04
staging we were all in the staging state it just
37:06
never really went past staging it just
37:08
kept being like
37:09
this is my argument well this is my
37:10
argument okay but this was my art you
37:12
know just
37:13
cyclical staging is is you’re just
37:16
saying hey look
37:17
this is this is the conversation i’m
37:19
trying to have right right
37:20
setting up your view setting up your
37:23
expectation or your
37:25
point and to see that like
37:29
in a graph you know was like oh okay
37:32
it’s like a little bit shameful that wow
37:35
i you know
37:36
that felt so urgent and necessary for me
37:38
to do that and i’m sure
37:40
that other people felt the same way but
37:41
we were not having a conversation
37:43
that was not productive and an exchange
37:47
right so those later two stages seem
37:49
like they’re
37:50
key to to having that more of an
37:52
exchange right so the evoking
37:53
so pointing out relationships between
37:56
so kind of if you if you um
38:00
and unfortunately people kind of do
38:02
weaponize this on facebook where it’s
38:03
really easy to tag people and twitter
38:05
but calling out someone’s name and like
38:08
specifically addressing them or asking
38:10
them a question
38:11
um that’s not a yes or no question and
38:14
that’s not a leading question
38:16
or identifying common ground um i think
38:19
that’s kind of more evoking and then
38:20
what was the third one third one
38:22
inviting so directly soliciting
38:24
participation by asking a question or
38:26
requesting a comment okay right so yeah
38:28
evoking
38:29
i think is more common ground piece and
38:32
um inviting is direct questions but
38:36
yeah i think in an ideal world if we
38:39
could all learn to have better
38:40
conversations like that but i just think
38:42
the way
38:42
that these platforms built it just is
38:45
not
38:46
it’s not intuitive to talk that way it’s
38:48
so gamified
38:49
so how many notifications can we get how
38:52
many
38:53
you know how long can we make this go on
38:55
or how how can i win
38:56
um this conversation but but going
38:59
forward
39:00
um yeah i try to i try not to stage
39:03
as much yeah that’s a it’s a really
39:06
great reminder so
39:07
so as a quick reminder for the listeners
39:10
it was staging
39:11
evoking and inviting and we all spend
39:13
way too much time
39:14
in that staging saying this is how i
39:16
want to have this conversation this is
39:17
my argument
39:18
but if we move toward you know spending
39:20
more time
39:21
in the evoking the relationship between
39:24
the participants and the inviting sort
39:26
of
39:26
in asking for feedback and asking for
39:28
participation
39:29
that could be more helpful and healthy
39:31
and i think what that highlights too
39:33
is that it’s really hard to have like an
39:37
argument with someone or a debate
39:38
or a disagreement with someone you don’t
39:41
trust
39:42
and who doesn’t trust you and so i think
39:44
that’s the main reason and a lot of
39:45
other people have written about this but
39:47
i think that’s the main reason a lot of
39:48
conversations
39:49
online don’t go well because you have no
39:52
reason to think that person
39:54
you know to give them the benefit of the
39:56
doubt you have you don’t know what you
39:58
have in common
39:59
you don’t know um anything about their
40:02
experience and so it’s really
40:04
better obviously to have these kind of
40:06
conversations with
40:07
people that you do know but we’re in
40:09
this weird
40:10
backwards situation where a lot of us
40:12
are afraid to talk to our family and
40:14
friends about
40:14
difficult things but it’s really easy to
40:17
fight with someone on instagram
40:19
you know about the same thing yeah so we
40:22
have a question here from andy polaine
40:24
and i think you just answered my
40:25
question but i’m going to throw it up on
40:26
the broadcast anyway so he says
40:28
the performative aspect of social media
40:30
definitely leads to some toxic
40:32
situations either performative trolling
40:34
or
40:35
performative fake lifestyle but the flip
40:38
side of the promise of social media or
40:39
the internet
40:40
is that you can find people like you
40:42
with your niche interests etc which is
40:44
also a comfort
40:45
so what’s your view on this and it does
40:47
seem like you know you sort of spoke to
40:48
that but anything
40:49
to add there yeah i want to say that
40:52
like you said earlier
40:53
there are so many opportunities for
40:55
connection and i
40:57
like have made so many real friends and
41:00
learned so many fascinating things and i
41:02
do love the ability to connect
41:04
with people who have similar interests
41:06
but maybe have them in a slightly
41:08
different way
41:08
or you know they’re a very different
41:10
person but we connect on this one thing
41:13
um but one thing i’ve noticed is that
41:17
the longer so groups are ways to kind of
41:21
focus on that part of media
41:23
but there have been several big facebook
41:25
groups that i’ve been
41:26
a part of that the bigger they get and
41:27
the longer they go on
41:29
the more they move away from the thing
41:32
that we all have in common
41:34
and become these just as bad as the rest
41:37
of social media
41:38
so i think yeah when did this knitting
41:41
group become this
41:42
fascist group all of a sudden right
41:45
but that actually also brings up so so
41:47
the influencer thing i was talking about
41:49
there’s a lot of people who create
41:51
communities and then they
41:55
you know can make them something
41:57
else all of a sudden just
41:58
and it’s just about making money or
42:01
followers
42:02
which then makes you money and but the
42:04
people who are following don’t know that
42:05
they think it’s
42:06
real to a certain extent because we’re
42:08
human and you know we believe what we
42:10
see
42:11
um but i think that groups
42:14
i still have faith in groups i still
42:16
think that we
42:18
um there’s got to be some way that we
42:20
can we can
42:21
have maybe a new platform or change one
42:23
of these platforms to where
42:26
you know there can be smaller groups
42:28
with maybe moderating that is more
42:30
humanist human focused empathy focused
42:34
um that you know makes them usable
42:37
honestly reddit is better for me these
42:40
days than
42:41
any other platform and i think part of
42:42
that is because they’ve really taken a
42:44
hard line
42:46
recently i know it took a lot to get
42:48
there but they’ve taken a hard line
42:49
recently
42:50
on the really bad stuff and just said we
42:53
don’t care
42:53
it’s just not going to be here anymore
42:55
and it’s in in their
42:57
moderation teams on the different um
43:00
subreddits you know really stick to
43:03
their guns and put a lot of effort into
43:05
it and i think that’s really what it
43:06
takes to
43:07
um you know to to
43:10
to focus on that community building part
43:13
and not get bogged down and
43:14
the rest of it yeah i like that
43:16
statement that i still have faith in
43:18
groups that’s a
43:19
it’s a good takeaway i also have a a
43:22
question here from christopher dance
43:23
saying i found success in providing
43:26
a reflection of what is being
43:27
communicated with the person they say
43:29
they want to be
43:30
did you come across anything in your
43:32
research regarding this approach and it
43:33
seems like you did
43:34
actually uh talk about that a little bit
43:37
do you have a
43:38
so is that like is that sort of like the
43:40
idea of
43:42
if someone disagrees with you you kind
43:43
of restate
43:45
what you think they’re saying to make
43:47
sure you’re on the same page
43:49
yeah christopher you're welcome to
43:50
um clarify
43:52
yourself but it seems to me that it’s
43:54
partly that and partly
43:55
you know kind of trying to see the the
43:58
person
43:58
who’s who’s communicating which i think
44:00
you you wrote well about
44:02
you know one of the things seemed to be
44:04
you know getting offline
44:06
as soon as possible right well i think i
44:09
think that
44:11
it so depends but
44:14
if you’ve had if you’ve had a lot of
44:16
internet conversations
44:18
you can probably tell at this point like
44:20
when
44:21
you’re talking to someone who wants to
44:23
have a conversation and when you’re
44:24
talking to someone who just wants to get
44:25
a rise out of you
44:26
and i think that if you’re able to
44:28
identify when
44:29
there’s a potential productive
44:32
conversation that could be productive
44:33
for you and or the other person um
44:37
trying to take it to dm or you know if
44:41
you feel safe doing that
44:42
um if it’s someone that you know and you
44:45
maintain that relationship
44:46
trying to have the conversation just in
44:49
a less
44:51
just there’s just something about the
44:52
fact that so many people are like
44:54
watching
44:55
quote-unquote that makes like facebook
44:58
twitter instagram feels
44:59
it’s so much more high-stakes and it’s
45:01
so much harder to access empathy
45:04
when you are in like fight-or-flight
45:06
mode so removing that piece
45:08
um unfortunately a lot of times people
45:10
gonna say they’re just gonna call you
45:12
names and say
45:14
you know i’m not or just like i have a
45:15
cousin who i’ve tried that with and he
45:17
just doesn’t talk to me anymore so like
45:19
yeah i mean i think christopher says i
45:21
go to the person’s character
45:23
nobody wants to be associated with bad
45:24
behavior or character flaws and i think
45:26
you know to the point
45:27
where you’re actually interacting with a
45:29
person and not a bot and you know not
45:30
somebody who
45:32
has somehow been lured into the
45:34
anonymity of troll behavior
45:36
you know then then you may have some
45:38
success with that what is your thinking
45:40
there
45:40
yeah but that makes sense to me my only
45:44
concern is that people get very
45:47
and and this isn’t just this isn’t a
45:49
critique this is how our brains work
45:51
if you feel like someone is judging your
45:53
your
45:54
morals or your identity that’s what
45:57
heightens that fight-or-flight response
46:00
and
46:00
raises your blood pressure and makes you
46:02
less reasonable and yeah
46:04
so if there’s a way of engaging someone
46:07
on that level without like
46:09
accusing them of things um but again it
46:13
depends on the person
46:14
and that’s where i go back to you have
46:16
to know
46:17
yourself and what you’re comfortable
46:19
with and what your boundaries are and
46:21
remember that
46:22
not everyone has the same boundaries i
46:24
think if more people were able to
46:26
come at social media with that we would
46:28
have less of
46:30
the issue that we have but again it’s a
46:32
tall a tall order
46:34
it’s really hard and so the follow-up
46:35
another question what are your thoughts
46:37
on cancel culture and purity tests and
46:39
it does seem like we
46:40
you know we sort of danced around that a
46:42
little bit or touched on it a little bit
46:44
but it’s in your book you talk about uh
46:46
and it was again
46:47
um uh dylan i think was talking about
46:50
this right the idea i
46:52
have the quote here it can feel weird to
46:54
be empathizing with someone you
46:55
profoundly disagree with
46:57
especially in an age when people say
46:59
you’re just as bad as them if you
47:00
empathize with them
47:02
i thought that was a really profound
47:04
observation on his part
47:05
and we do have that problem where you
47:07
know we’re trying to
47:08
you know on one hand i think that um
47:12
we imagine that if we did the right
47:13
things or said the right things that we
47:14
could reach across and influence people
47:16
who don’t agree with us and yet on the
47:18
other hand
47:19
we do kind of have this sort of purity
47:22
standard and a desire not to be seen
47:24
to be weak or to be giving ground to
47:27
people who disagree
47:28
and so you know i i wonder if we have
47:31
any kind of
47:32
tools coming out of what you know what
47:34
you’ve learned and what you’ve seen in
47:35
the research that
47:36
that give us you know kind of the
47:39
necessary steps to be able to
47:41
communicate with people and engage
47:44
engage at the level we’re comfortable
47:45
with and in a way that’s going to be as
47:47
productive as we can make it and
47:48
and also disengage when we need to
47:52
this is a really great question i think
47:54
that right now
47:55
most of the
47:59
um solutions around this are more
48:02
band-aid
48:02
things like twitter making it easier for
48:05
you to control who replies to you
48:07
or um for a long time there have been
48:09
things where if you’re getting like a
48:10
ton of responses there are bots that can
48:13
just
48:13
make them all go away um but i actually
48:16
think
48:18
i think that there’s a lot going on here
48:19
at the same time i think
48:21
in general i don’t like the phrase
48:24
cancel culture because i think it’s kind
48:26
of been used to
48:28
to lump a bunch of different things
48:30
together and say that
48:31
they suck um which i don’t think is fair
48:34
um i and i think you’re right that again
48:36
we have a duality where we’re
48:40
social media in addition to helping
48:42
people connect with one another has
48:43
given
48:43
a lot of people a platform who would not
48:45
otherwise have a platform so
48:47
a lot of people of color a lot of women
48:49
younger people
48:50
people with disabilities all kinds of
48:52
people who we normally wouldn't hear from
48:54
um you know survivors of sexual assault
48:56
just people who can
48:58
take this platform and they’re on the
49:00
same level as like a new york times
49:02
journalist and
49:03
say things and i think that’s really
49:05
important and that’s part of what’s led
49:06
to a lot of the reckoning we’ve had
49:08
where people have gotten
49:09
quote-unquote canceled um but i think as you
49:12
mentioned
49:14
there is also a concern where it becomes
49:17
a performative thing where people say oh
49:19
who do we hate today how can i torture
49:22
them um
49:23
and that is i think the piece that
49:29
again like it’s it’s such a personal
49:31
thing it’s such a like
49:33
tell everyone like can you examine why
49:35
you feel the need to do this but
49:37
barring that i think that some of the
49:40
little tools that the platforms are
49:42
creating
49:42
that are kind of band-aids are helpful
49:45
um i don’t really know
49:48
like i think the only way you stop that
49:49
is by removing
49:51
the reward for doing it right so that’s
49:54
why i think
49:55
you know in some like instagram you can
49:59
there are some there was a test where
50:01
you could um
50:02
[Music]
50:04
delete or um not allow comments i think
50:07
you can
50:07
you can do that you can say no comments
50:09
but another thing is
50:10
these these moderation tools that are
50:13
being created and they’re a little
50:14
controversial because
50:16
you know it’s a as i talk about in the
50:18
book and i’ve seen you talk about too i
50:20
think
50:21
you know ai knows what we teach it and
50:24
so if we teach it
50:25
bias it learns bias but there are some
50:27
ai tools
50:28
um including the one that we were
50:30
talking about that that recognizes the
50:32
different
50:33
um rhetorical tools um or moves
50:37
that are being created to
50:40
monitor conversations and kind of nudge you
50:43
and say like are you sure you want to do
50:45
this so i know nextdoor has this
50:47
twitter i think started rolling it out a
50:50
little bit
50:50
but it’s i mean these are private
50:53
companies who
50:55
don’t really have that much at stake
50:57
with this and they’re not going to
50:59
so people you know i saw when it got
51:00
rolled out a little bit on twitter
51:02
people were getting things that were
51:04
like are you sure you want to say this
51:05
and they just screenshot that and tweet
51:07
it and are like
51:09
lol twitter is asking yeah i’m gonna
51:11
never tweet yeah
51:12
i’m gonna say it twice now i’m gonna say
51:14
it 15 times so
51:16
i don’t know yeah you know you’re making
51:19
a really good point uh
51:20
in what you’re presenting that that i
51:23
think
51:23
needs to be spelled out even more which
51:25
is it’s important that we figure
51:27
between mere differences of opinion
51:30
that can actually make us better people
51:33
by understanding each other’s
51:34
perspective and make us more
51:36
well-rounded
51:37
and better able to appreciate different
51:39
life experiences
51:40
and what we each bring to the
51:42
interactions that we have
51:44
and that’s assuming good faith though
51:45
that’s assuming that people are coming
51:47
to interact with each other and meet
51:49
on common ground
51:50
with the intention of understanding one
51:53
another and that of course
51:54
is not it’s an idealist uh situation
51:57
that doesn’t happen
51:58
all that often because we have that army
52:01
of trolls and bots
52:02
and bad actors who are acting
52:04
anonymously and it’s really hard
52:05
sometimes to disambiguate between those two
52:07
or distinguish between them
52:09
so that’s a level of sophistication
52:10
that’s kind of on us
52:13
as as users of these platforms because
52:15
the platforms
52:16
haven’t done a sophisticated enough job
52:19
exactly
52:20
disambiguating those accounts and i
52:22
don’t know if they can i mean if we’ve
52:24
been
52:24
especially as women if we’ve been now
52:26
conditioned that anytime
52:28
you know we put something out on the
52:30
internet it’s likely a man is gonna say
52:33
to us i mean now you’re on alert for
52:35
that right and if you’ve been trolled a
52:37
lot you’re on alert for that and
52:39
how do you we can’t just tell people
52:41
who’ve been
52:42
you know significantly harmed in some
52:44
cases like having their houses swatted
52:46
having to move just for tweeting um
52:49
that oh you should just try to you know
52:52
assume good faith in every
52:53
and so we kind of but i think that’s a
52:56
good point that there is a piece
52:58
of this where
53:00
there is a personal responsibility
53:04
piece of this because the tech just
53:06
doesn’t do that and i don’t know if we
53:08
want it to yeah i’m not sure we wanted
53:10
to i think that is a level of
53:11
sophistication that
53:12
we ought to equip ourselves with you
53:14
know more and more
53:16
is the ability to recognize good faith
53:18
from bad faith
53:19
you know to the extent that we can and
53:20
and to the extent that it
53:22
that it’s a dichotomy such as that and
53:25
the ability to say
53:28
do i feel capable of reading 10 000
53:31
other people’s thoughts today
53:33
healthily and if not can i convince
53:35
myself that i don’t need
53:37
to do that because i think we’ve all
53:38
just convinced ourselves that
53:40
you have to check twitter 15 times you
53:43
have to scroll on instagram for 15
53:45
minutes and it’s like
53:47
i mean you don’t have to like if you’re
53:50
blowing my mind here
53:52
hey i mean i i’m talking about myself
53:55
like i have a struggle i have the
53:57
time limit thing i very often will say
54:00
ignore
54:01
you know and keep going so yeah but i
54:03
think we
54:04
if we think of it i think just thinking
54:06
about that more
54:07
yeah is a step in the right direction
54:09
sure so i think with our last few
54:12
minutes here it would be helpful to
54:13
circle around to that
54:14
that topic of how do we increase our
54:16
empathy because you did actually have
54:18
some studies and some
54:19
uh examples that you cited that can
54:22
actually help us you know i feel like
54:23
this is a topic that’s evergreen when i
54:26
speak at companies on digital
54:28
transformation and when i talk
54:29
to corporate leaders uh and design teams
54:32
and so on
54:33
so often one of the questions that comes
54:34
back is we understand it’s important to
54:37
empathize with people
54:38
but what if you just aren’t very good at
54:41
it like how do we actually
54:43
develop that muscle as you mentioned
54:45
earlier and and i think
54:47
too as a team how do people build
54:50
empathy as a team
54:52
um so so do you have some specific
54:55
uh guidelines or suggestions that came
54:57
from the research
54:59
um one thing that has actually been
55:01
researched that has nothing to do with
55:03
technology actually is just reading
55:04
fiction
55:05
this has been shown especially in kids
55:07
that you know
55:09
i mean if you think about it it makes
55:10
sense you’re literally putting yourself
55:12
in the shoes of a character
55:13
that’s life is probably really different
55:15
from yours and learning how to process
55:17
the emotions of someone who isn’t you
55:20
and learning to do counterfactual
55:22
thinking which is something i learned
55:24
from the amazing jane mcgonigal who i'm
55:26
sure you know um where you’re kind of
55:28
imagining
55:30
something that’s never happened but is
55:32
possible and like
55:33
how your body and brain might react to
55:36
that
55:37
um that is huge i i just realized it’s
55:39
amazing
55:40
that that’s a that’s a thing that just
55:42
reading literary fiction
55:44
yeah can actually increase your empathy
55:46
and it doesn’t take that much effort and
55:47
it’s fun
55:48
right absolutely but as a team thing i
55:51
think is a really interesting
55:52
point because especially you know
55:56
this summer with there being you know so
55:59
much unrest related to the death of
56:01
george floyd and
56:02
black lives matter and police brutality
56:03
there’s this rush to
56:05
um like remember diversity and inclusion
56:08
initiatives let’s
56:10
let’s do those again and and i think you
56:12
can’t
56:13
there’s a lot of pros and cons to those
56:16
things that i won’t go into and i’m not
56:18
an expert on but
56:19
i think trying to have a
56:22
meeting or a or a workshop
56:26
about building empathy just kind of out
56:28
of the blue i don’t think is gonna
56:30
be effective i think if people don’t
56:32
trust each other like we mentioned
56:34
before people don’t feel safe if they
56:36
don’t feel like they know
56:36
each other um it’s not gonna be
56:39
effective
56:40
and i think a lot of times when
56:42
businesses are talking about empathy
56:44
they’re talking how can we have more
56:47
empathy for our customers
56:49
and like there are frankly just
56:52
things that don’t make business sense
56:54
that make empathy sense
56:55
you know um so i don’t know that’s
56:59
that’s
57:00
really hard but i think the
57:04
just basic things like exposing yourself
57:06
to more ideas and
57:08
and people that are different from you
57:10
um following
57:12
people on social media who are different
57:13
from you not to hate them and not to
57:16
you know make yourself angry but to just
57:18
kind of absorb and not
57:20
respond like just remembering you don’t
57:22
always have to respond
57:23
you can take a breath um a lot of this
57:27
is kind of like cognitive behavioral
57:28
therapy stuff
57:30
um and celeste headlee um she just
57:33
has a new book out but she had a book
57:35
out a few years ago
57:36
called we need to talk and i mentioned
57:38
that in the book and
57:39
there’s amazing tips in there for just
57:42
like conversational
57:43
um tools or moves that that really work
57:48
to show empathy um but
57:51
yeah but that the question about you
57:53
know as a group building empathy as a
57:55
group that’s something i’ll have to
57:56
think about more because that’s really
57:57
interesting
57:58
great i i hope you do and i hope you’re
57:59
able to publish some some work on it
58:01
because i think it is a needed area i
58:02
mean obviously
58:03
you alluded earlier to um
58:06
the notion that technology shouldn’t be
58:09
built and tools shouldn’t be built
58:11
for a population that don’t include that
58:13
population so certainly that’s one of
58:15
the answers
58:16
is this is why we need diversity and
58:19
inclusion
58:20
is because we need to be able to address
58:23
holistically and meaningfully like what
58:25
the lived experiences of a diverse group
58:27
of people are
58:29
yeah i mean maybe it is that i’m i’m
58:31
sitting here trying to think about like
58:33
something really
58:34
intellectually rigorous but maybe the
58:36
answer is literally just like oh
58:37
if if if everyone in your development
58:41
suite looks the same like that might be
58:43
part of that might be a place to start
58:45
it’s certainly a good place to start
58:46
you know one of the other little tidbits
58:48
that i thought was so interesting
58:51
um was that you had a study that you
58:53
cited
58:54
that said that we actually empathize
58:57
better to things that we hear
58:59
as opposed to what we read that’s so
59:02
fascinating this is part of why i am a
59:04
huge proponent of podcasts and it’s why
59:06
i talked about dylan marron’s podcast so
59:08
much
59:09
um and there’s a few other podcasts i
59:11
listen to where people
59:13
like if i was reading what this person
59:16
was saying
59:17
i might be like ugh you know
59:20
roll my eyes i i’m not interested or not
59:22
finish it but
59:23
it’s so it’s such a passive way to
59:25
consume things
59:27
but you can there’s no way to
59:30
immediately
59:31
have a knee-jerk response so i found
59:34
myself listening to like hour and a half
59:35
long interviews with people i
59:38
whose beliefs i kind of despise but i
59:42
want to understand in a way that doesn’t
59:45
like raise my blood pressure or make me
59:47
feel like i need to respond in the same
59:49
way that social media
59:50
does um and yeah there was research that
59:53
showed that people who
59:55
um people who only could hear each other
59:59
and couldn’t also see each other scored higher
60:02
on empathy in conversations with one
60:05
another so
60:06
interesting yeah i mean we definitely
60:08
need more research in
60:09
that area um but yeah that’s
60:12
so i’m a big podcast person i love it i
60:14
love that that answer and
60:16
so one last quick thought before we wrap
60:18
up and get you know
60:19
the information on where people can find
60:20
you in your work uh i noticed you say
60:23
on twitter that you’re pregnant is that
60:25
right yes congratulations that’s amazing
60:28
uh have you given any thought yet to
60:31
what you’re going to be doing as your
60:33
child grows as far as tech exposure and
60:36
and limitation and things like that i
60:39
it is so crazy i’m i’m not due until
60:43
january and it’s amazing the number of
60:44
things i already need to
60:46
apparently have figured out um but i’ve
60:49
thought about this a little bit i am
60:50
reading a bit about
60:52
um some of the new research on how
60:55
people are really more concerned the
60:58
experts are now saying parents
61:00
should be more concerned with what kids
61:02
are doing
61:03
on screens than how much time they spend
61:05
doing it um
61:06
which i find really interesting and so i
61:09
guess
61:10
my thought is that i’m not gonna
61:14
i’m not a um
61:17
like cut xyz out of your life kind of
61:20
person i find that doesn’t
61:21
work for me and for a lot of people so
61:24
you know i’m not gonna give my
61:26
one-year-old an ipad probably but
61:28
i i think being involved um
61:32
in that exposure is really something i’m
61:36
gonna try to do
61:36
i i loved this thing that i read
61:38
recently um from a researcher whose name
61:41
escapes me at the moment
61:42
who said that we basically need to have
61:44
new dinner table conversations with our
61:46
kids
61:47
where instead of saying you know like or
61:49
in addition to saying what did you do at
61:51
school today
61:52
um or you know what did you learn today
61:55
but
61:55
specifically what did you learn on the
61:57
internet today or like
61:59
what what’s like a funny thing you saw
62:01
on the internet today
62:02
making that kind of the new dinner table
62:04
conversation so you’re involved
62:06
um and then you’re also acknowledging
62:09
that
62:09
it’s not always a waste of time like
62:12
kids are
62:12
looking for support on the internet
62:14
they’re learning things
62:16
they’re being super creative and um
62:19
i yeah so i so i like the direction that
62:21
that’s going i try not to worry about it
62:23
too much until it comes up but it’s good
62:25
they may even someday connect with a
62:27
really brilliant author and have them on
62:28
their live show
62:30
i mean it’s possible if possible
62:33
so kaitlin thank you so much for being
62:35
here with us today how can people find
62:37
you
62:37
and your work what’s the best way to
62:39
connect with you online so i am on
62:41
twitter
62:42
um despite everything i just said i am
62:45
on twitter
62:45
at kaitlinugolik so it’s
62:51
k-a-i-t-l-i-n-u-g-o-l-i-k
62:52
um and i i also have a website um
62:56
you can actually go to
62:57
thefutureoffeeling.com which is the name
62:59
of my book
63:00
or kaitlinugolik.com and there um
63:03
i have a newsletter there that you can
63:04
sign up for that has been kind of on
63:06
hiatus during quarantine but hopefully
63:09
uh we’ll get back up and running soon
63:11
sadly not the cake newsletter though
63:13
no i started a cake newsletter at the
63:15
beginning of quarantine and it was
63:16
fun but then it just became too much
63:18
work um
63:20
but yeah if you want to read other
63:22
things i’ve written it’s at my website
63:23
and there’s a link to
63:25
to buy the book there as well if you’re
63:26
interested all right and christopher dan
63:28
says thanks so much kaitlin i appreciate
63:30
your perspective this was enlightening
63:32
so thank you thank you uh thanks caitlin
63:35
i’m gonna
63:36
go ahead and sign off with you and um i
63:39
so much appreciate you being here
63:40
thank you thank you all right everyone
63:44
wasn’t that wonderful
63:45
it was it’s such a fantastic subject and
63:48
i i guess i feel like
63:49
again we talk so much about empathy and
63:51
we so rarely
63:52
have really good tools to talk about it
63:54
with so i hope this was helpful to you i
63:56
hope you learned
63:57
some things that you can share and i
63:59
hope you’ll go and buy kaitlin’s book
64:02
so please do that and please subscribe
64:05
and follow everywhere
64:06
share with friends say the word and
64:08
we’re going to come back next week
64:10
with another great guest i look forward
64:12
to seeing you all then
64:13
thank you for being here
