About this episode’s guest:
Calli Schroeder is an attorney in privacy/data security, as well as a privacy advocate and self-described “die-hard nerd.” Her brief stint as a wedding singer led her to wonder if she had violated copyright law, an interest which transitioned to tech law broadly and privacy law specifically. While in law school, Schroeder interned for FTC Commissioner Julie Brill and published an article on consent issues and IRBs in the Colorado Technology Law Journal, among other accomplishments and distinctions. She developed a focus on consumer protection issues, surveillance, data breaches, and freaking people out at parties.
She tweets as @Iwillleavenow.
This episode streamed live on Thursday, July 30, 2020. Here’s an archive of the show on YouTube:
About the show:
The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.
Subscribe to The Tech Humanist Show hosted by Kate O’Neill channel on YouTube for updates.
Full transcript:
00:23
[Music]
00:43
all right
00:56
welcome to the tech humanist show
00:59
a multimedia format program
01:02
exploring how data and technology shape
01:04
the human experience
01:06
i’m your host kate o’neill please
01:09
subscribe or follow wherever you’re
01:10
watching or listening to this so that
01:11
you won’t miss any new episodes because
01:13
we have a lot of great guests coming up
01:15
and today’s guest is no exception today
01:18
we are talking with
01:19
calli schroeder an attorney in privacy
01:22
and data security
01:23
she’s a privacy advocate and
01:25
self-described die-hard nerd
01:27
i think we many of us can relate to that
01:30
i love this part her brief stint as a
01:32
wedding singer led her to wonder
01:35
if she had violated copyright law an
01:37
interest
01:38
interest which transitioned into tech law
01:41
broadly and privacy law specifically
01:43
while in law school schroeder interned
01:45
for ftc commissioner julie brill
01:47
published an article on consent issues
01:49
and irbs in the colorado
01:51
technology law journal among other
01:54
accomplishments and distinctions
01:55
she developed a focus on consumer
01:57
protection issues surveillance
01:59
data breaches and freaking people out at
02:01
parties
02:03
among the certifications she holds are
02:04
privacy designations for the u.s
02:06
europe and canada she’s originally from
02:08
whitefish montana
02:10
and calli holds an undergraduate degree
02:12
in peace studies from whitworth
02:13
university
02:15
outside of work she enjoys traveling
02:17
music and extensive debates
02:18
about comic books i’ve been so looking
02:21
forward to this so
02:22
please help me welcome i think maybe
02:25
that might have fixed it
02:26
uh someone please oh here we go uh
02:29
michael rockwell kate o’neill we lost
02:30
your audio we can hear
02:31
calli but not you thank you i am having
02:34
such a
02:35
a tough time with the audio for this
02:37
show i don’t know why
02:38
i definitely need to look into that but
02:41
let me
02:42
let me uh just go through and make sure
02:45
that
02:45
everybody knows that we’re talking to
02:47
calli schroeder and she’s an attorney
02:49
who focuses in data and privacy issues
02:52
all right michael rockwell confirms i’m
02:55
back now thank you michael
02:58
so um back to talking about the
03:02
copyright law around uh the wedding
03:06
singing
03:06
but that you know so so that leads you
03:08
into into um
03:10
tech law broadly so it’s like you’re
03:12
interested in the copyright application
03:14
but how does that then translate into
03:16
becoming interested
03:17
in in tech law and privacy
03:20
uh it was it was an interesting shift i
03:23
started taking copyright and ip classes
03:25
in law school just
03:26
because i thought that was interesting
03:27
given my background
03:29
and part of what got me into tech more
03:31
broadly and into privacy specifically
03:33
was
03:34
i really loved the professor i had for
03:36
copyright law so i just started taking
03:38
more and more of his classes
03:39
um his name is paul ohm he now heads up
03:42
the tech law and
03:43
policy division over at georgetown shout
03:45
out to paul
03:46
um all right he’s great um but he
03:50
because he’s got a programming
03:51
background he taught some really
03:52
fascinating classes so he had a jan term
03:54
class on uh
03:55
technology law and coding for lawyers uh
03:58
privacy for lawyers uh
04:00
he taught a computer crimes class and oh
04:03
that sounds really
04:03
fascinating oh it was so cool i loved
04:06
that class
04:07
um i read a lot about snowden for a
04:10
stretch
04:11
uh and and through that i broadly got
04:13
more and more interested in privacy more
04:15
and more interested in surveillance tech
04:17
uh i joined and later worked for the
04:20
colorado
04:20
technology law journal and that led to
04:23
me
04:24
getting more and more into these these
04:27
tech issues
04:28
so it it kind of came about by accident
04:30
i’ve never been a particularly techy
04:32
person
04:33
i’m a privacy and tech lawyer who will
04:36
openly admit because i i know it’s a
04:38
huge mistake to pretend you know more
04:40
than you do
04:40
i i don’t know how to code i’m trying to
04:43
teach myself and i’m very slow because
04:45
i’m not good about setting aside
04:47
practice time
04:49
but i i just find the area so
04:52
fascinating
04:53
and and now i’m i’m deep in the weeds of
04:56
it
04:56
yeah increasingly so right like so
04:58
you’re you have certifications
05:00
uh in privacy as a intern and what does
05:04
the cipp stand for
05:05
remind us uh it’s the certified
05:08
information privacy professional
05:10
okay and you have that
05:13
professional
05:14
information privacy professional uh say
05:16
that a few times fast i guess
05:18
so you have that designation applying to
05:21
is it canada
05:22
the us and europe are those those three
05:25
canada u.s europe and now i have the
05:27
cipm which is
05:29
the the privacy management and the the
05:31
fip which is the fellow of information
05:33
privacy i
05:35
full disclosure part of why i have so
05:36
many certifications is because
05:38
i worked for the iapp after law school
05:40
and while i was working there i could
05:42
take the tests for
05:43
for free when they offered them so
05:45
that’s great that’s that’s part of why i
05:47
just took a test
05:48
every single time they offered the
05:50
chance to the smart i i remember i
05:52
used to work at a language school and
05:54
one of the perks we had was that we got
05:55
to take free language tests
05:57
i just kind of i just kept racking up
05:59
all kinds of language
06:00
classes that’s amazing certification so
06:03
uh
06:03
you know but but as you as you’ve gotten
06:06
deeper into that and you have all these
06:07
certifications and you are working in
06:08
the space
06:09
it’s become a more interesting space i
06:11
mean every year it seems like it becomes
06:13
a more interesting space
06:15
so you’ve got you know gdpr hitting the
06:18
scene and what was it 2018
06:20
right yep yeah my my big projects when i
06:23
was at the iapp
06:24
the the gdpr had been passed and was
06:27
going to go into effect shortly and so
06:29
we were doing a lot of
06:30
redrafting the the eu documentation and
06:34
trying to put out guidance and really
06:36
interpret
06:37
how that was going to be applied so i
06:38
was doing all of the prep work at the
06:40
iapp
06:41
and then jumped to a firm afterwards
06:42
where i was doing more of the practical
06:44
client guidance on how to be compliant
06:47
before the effective date hit
06:48
okay okay i was going to ask you to sort
06:50
of walk us through what that kind of
06:52
what that type of work looked like but
06:53
i think so many of us received emails
06:56
notifying us that it was coming that we
06:57
probably
06:58
have a sense of what that looked like
07:01
what was that
07:02
like boots on the ground was that what
07:03
that was there’s a lot of guidance and
07:05
handholding about like who you’re gonna
07:06
have to notify and by when and stuff
07:08
like that or was it
07:09
uh it was it was a lot a lot of working
07:12
directly with clients that way and a lot
07:14
of it was really building out an
07:15
internal privacy program for clients
07:17
that
07:18
for a lot of u.s.-based clients it’s not
07:20
that they were purposely negligent when
07:22
it came to data protection or data
07:23
mapping practices
07:24
they just had never had any real
07:29
regulatory motivation to pay attention
07:31
to that and so it always fell by the
07:33
wayside when it came to allocating
07:34
resources so
07:36
we were building a lot of programs from
07:38
the bottom up we were
07:39
doing data mapping for the first time in
07:41
companies you know they were figuring
07:42
out exactly where all the information
07:44
that they
07:45
collected went and what they collected
07:47
and why they used it
07:48
and what legal bases we could use to
07:51
justify that or whether they needed to
07:52
start deleting stuff
07:53
and um it was it was interesting you get
07:57
a really good
07:58
idea of kind of the inner workings of
07:59
companies and
08:01
uh start to develop a lot of strategies
08:03
for
08:04
figuring out what questions to ask and
08:06
when you need to
08:08
push a little deeper on on things
08:11
nobody likes to admit that they may not
08:13
know exactly where everything goes
08:15
within their company but
08:16
particularly for large companies or
08:18
companies that collect a lot of data
08:19
it’s extremely common
08:21
so yeah i would imagine and so and then
08:24
of course we have
08:25
the california privacy act that comes
08:28
online that came online what january
08:29
this year
08:31
is that right so was it again sort of a
08:34
repeat of
08:35
gdpr that was interesting we had a
08:37
really big split with the ccpa
08:40
for gdpr i feel like uh it kind of put
08:43
the fear of god in a lot of u.s
08:44
companies so
08:46
everyone was scrambling for compliance
08:47
we had a huge influx of people asking to
08:49
help
08:50
uh help them with compliance before the
08:52
due date hit
08:53
with ccpa i saw a real divergence in
08:56
approach there were there were a lot of
08:58
companies that were very responsible and
09:00
um came to us wanting us to help them be
09:03
compliant with the ccpa
09:05
right as the enforcement date hit so
09:06
they were ready to go
09:08
um and then not necessarily my clients
09:11
but i
09:11
i did see in quite a lot of instances
09:13
there seemed to be
09:15
uh more companies that were willing to
09:17
gamble when it came to compliance that
09:18
way
09:20
i don’t know if that’s because they they
09:23
thought the attorney general wouldn’t be
09:24
able to enforce
09:25
en masse or whether they thought that
09:28
you know their their privacy practices
09:29
were maybe not fully compliant but
09:31
probably good enough to skate by for a
09:33
while and kind of wanted to get the lay
09:34
of the land before figuring out
09:36
resources but
09:37
uh it was surprising seeing that split
09:39
from the gdpr
09:40
and now with the cpra on the ballot
09:42
it’ll be even more interesting because
09:44
if that passes it
09:46
it will definitely raise the amount of
09:48
enforcement actions
09:50
yeah and so do you feel like d did you
09:52
observe anecdotally
09:53
that it changed the discourse at all
09:56
when
09:56
uh the gdpr um suits started happening
10:00
and
10:00
that the enforcement was happening and
10:03
and google and other companies were
10:05
actually receiving
10:06
fines was that changing the way
10:08
companies were taking it seriously
10:10
from what i’ve seen yes and and in
10:12
particular one of the things i noticed
10:13
is that
10:14
clients even of smaller mid-level
10:16
companies that maybe hadn’t paid much
10:18
attention to privacy or data security
10:19
before
10:20
uh were paying attention to these
10:22
lawsuits they were coming to
10:24
us saying hey i just read about this
10:26
should i be worried do i need to change
10:27
things
10:28
what should i be looking at i mean for
10:30
example we just had the schrems 2 case
10:31
happen
10:32
i had a bunch of clients reach out to me
10:34
proactively before we had issued our
10:36
statement on it
10:37
saying this seems like it changes things
10:40
what do we need to do
10:41
how should we take care of this and
10:43
frankly it was pretty refreshing to see
10:45
that so many of our clients were now
10:47
actively paying attention to this on
10:48
their own as an issue
10:50
yeah that makes sense and so i guess you
10:53
know and one thing i’m really curious
10:54
about
10:55
to talk about with you and and i i want
10:56
to um
10:58
i want to spill that you uh told me via
11:01
email as we were prepping for the show
11:02
you said
11:03
that you have minors in both philosophy
11:05
and theology
11:06
in your undergrad and so you
11:08
occasionally get a little into privacy
11:09
affecting the nature of society and
11:11
humanity
11:11
and you said but i will try to reel it
11:13
in and i’m like no don’t reel that in
11:15
that’s exactly what we want to bring to
11:17
the show so i guess
11:19
you know one question i have for you on
11:20
that in in that more esoteric
11:23
you know kind of uh philosophical bent
11:25
is
11:26
do you think that the way we conceive of
11:28
privacy
11:29
has changed or is changing or is due to
11:32
change
11:33
uh is like where do you think the
11:36
uh you know what externalities do you
11:38
think kind of come into the way that
11:40
that
11:40
humans conceive of privacy and what it
11:43
what is going to
11:44
happen over the next few years over the
11:46
next
11:47
few milestones what do you what do you
11:49
see those as being
11:51
yeah not a small question i understand
11:53
not a small question
11:54
but an important one um to preface
11:56
there’s
11:57
i i’m a little split in my uh
12:00
professional and personal approaches to
12:01
privacy so
12:02
obviously personally i’m i’m very very
12:05
privacy protective and i’m
12:06
very loud about that online uh and
12:09
personally i
12:10
i believe myself to maintain those
12:14
those uh values but i i need to channel
12:17
them in practical ways that clients can
12:19
actually use like there’s
12:20
there’s the idealistic form of privacy
12:23
that i think i have in my personal life
12:25
and then there’s also the practical form
12:27
of well okay but this company is not
12:29
going to just stop collecting data
12:31
right right and so how can i how can i
12:33
guide them in a way that makes them do
12:35
it as responsibly as possible and as
12:36
ethically and transparently as possible
12:39
while meeting regulatory requirements so
12:41
this i’m gonna go full into my
12:43
personal answer on privacy perspectives
12:46
but
12:47
uh so i do believe that our our um
12:51
approach toward privacy
12:53
has been fundamentally changing i
12:55
believe that for many years it’s been
12:56
changing
12:57
in the direction where people haven’t
13:00
been paying attention to it and have
13:02
lost a good
13:03
amount of uh the kind of privacy that we
13:06
we were used to and part of that comes
13:08
from the development of technology you
13:10
know we have
13:11
um so much more prevalent technology
13:13
that’s really permeated every area of
13:14
life you have
13:15
you know in i there there’s smart
13:17
fridges on the market right it’s really
13:19
hard for me to find
13:20
non-connected kitchen devices when i was
13:23
redoing yeah we just bought an
13:24
air conditioner an in-window air
13:26
conditioner for our apartment and it was
13:28
so hard to pick one that didn’t have you
13:30
know smart features built in
13:32
like i don’t want that i don’t want a
13:33
smart air conditioner yeah i had to get
13:36
a new television
13:37
oh my gosh it was such a nightmare
13:38
trying to find one that wasn’t a smart
13:40
tv it wasn’t connected somehow and
13:42
it’s it’s so hard good luck next time
13:44
you have to buy a toilet
13:45
i mean oh i know it’s ridiculous all
13:47
over the place
13:49
but yeah because of all of those changes
13:50
and because it’s so interwoven with just
13:53
the aspects of our daily life i i feel
13:55
like there’s been this gradual shift in
13:58
that where before
13:59
you you got to be a lot more deliberate
14:01
about what
14:02
aspects of who you are and what pieces
14:04
of information about you you shared with
14:06
different people in different settings
14:09
now it’s just ingrained and so the the
14:11
choice is gone in a lot of ways but
14:13
i’ll also say that i’ve seen a much more
14:15
positive trend in privacy in that
14:17
there’s been a lot more pushback in in
14:20
at least recent years
14:22
so some of it’s regulatory the gdpr is a
14:24
great
14:25
great prompt for considering privacy
14:27
concerns
14:28
um the u.s is very sectoral in its
14:30
privacy approach but there’s been
14:32
individual state laws that are
14:34
incredible like the the illinois
14:36
biometric information privacy act is
14:39
is great and having some really cool
14:40
enforcement ccpa
14:42
is prompting some really great
14:44
conversations around
14:45
what information we protect and how we
14:47
protect it and what level of control we
14:49
have
14:49
and whether that should be left to
14:50
people on a societal level
14:54
the interesting thing for me is seeing
14:55
younger generations and what their
14:57
approach is to privacy and to being
14:58
watched
14:59
yeah and you know i’m i’m millennial so
15:03
i grew up in a little bit of the cusp
15:04
you know when i was
15:05
very young we had a computer in my house
15:08
when i was
15:09
probably 10 or 11. but they were so old
15:13
and you didn’t use them all that often
15:15
for a long stretch you used them for you
15:17
know word documents
15:18
but you didn’t necessarily use them for
15:20
internet all that often
15:21
and we really didn’t have smartphones in
15:23
the way that we do now and
15:25
and living through all those changes and
15:27
seeing
15:28
how gradually that happens and how
15:30
suddenly every aspect of your life
15:32
is watched uh is really interesting to
15:35
me
15:36
particularly seeing this net this
15:37
current generation that
15:39
is growing up with all of that just
15:40
already existing already in place you
15:42
know
15:43
um but but one thing that i find so
15:46
interesting is the ways that they find
15:48
to push back against that
15:49
there’s there’s a lot of really creative
15:52
uh loopholes and workarounds and kind of
15:55
hacks of tech
15:56
that younger generations are using so
15:59
even though they’ve
16:00
been put into this world where they’re
16:01
always watched they’re
16:03
actively fighting for privacy and for
16:06
space to kind of figure out who they are
16:07
without eyes on them all the time
16:10
yeah i just think that’s really valuable
16:11
it’s such an interesting dichotomy
16:13
because
16:14
it is a more connected generation uh in
16:16
many ways
16:17
and i think they they are inherently
16:20
that the the younger you skew i know
16:22
that there are some differences
16:23
generationally where
16:24
uh some actually prefer less online time
16:27
uh but but i think in general like i’m
16:29
gen x so the the younger you skew in
16:31
general away from
16:32
my generation and above you know you see
16:35
a lot more
16:36
native sort of familiarity with being
16:38
online but yes you’re right and then
16:40
those attitudes also change like there’s
16:41
there
16:42
isn’t an inherent acceptance of
16:45
the trade-off of that i’m i’m gonna give
16:47
up so much of my self and you know what
16:51
quantifies who i am
16:52
so that you can then sell to me more
16:55
effectively yep
16:58
well it’s interesting i i agree with i
17:00
think i i’m
17:01
probably very aligned with your your
17:03
distinction on the personal and
17:04
professional level
17:06
on the professional level is a really
17:08
interesting one though because i think
17:10
you do have to have this kind of
17:11
pragmatic it’s not it’s not defeatism i
17:14
don’t think but it’s a pragmatic view
17:16
that you know that
17:17
companies are going to collect data and
17:19
they are going to use it
17:21
in aggregate and in segments to be able
17:24
to you know better understand
17:26
targeting offers or better understand
17:30
even just to improve user experiences
17:32
you know inherent to
17:33
to the uh to the pages and sites that
17:36
they serve up
17:37
so so how do you now how do you take
17:39
over from the the personal
17:41
approach and carry it over into the
17:42
professional like where do you
17:44
where do you go with that well one thing
17:47
is that i i’m really grateful that my
17:51
super privacy protective perspective
17:53
from my personal life
17:54
can carry over in an ethical and
17:56
responsible way where i’m still advising
17:58
my clients well when i’m telling
18:00
them to you know practice data
18:02
minimization
18:03
and you know make sure that they’re only
18:05
collecting the information they need and
18:07
that they’re
18:07
putting in really good security
18:09
protocols and that they’re very clear
18:10
and transparent about what they’re doing
18:12
and honest with their users
18:14
um i i like to push really hard for
18:17
people
18:18
for the companies to understand what’s
18:20
going on with their data because a lot
18:21
of companies
18:22
use vendors that are just common you
18:24
know you use google analytics or
18:26
you you use uh different storage vendors
18:30
and
18:31
share information with you know people
18:34
that are
18:35
using behavioral advertising algorithms
18:37
and things like that and so
18:38
being able to advise them and have
18:40
discussions with them where i’m saying
18:42
i really think you should evaluate what
18:44
this
18:45
costs you in effort in disclosing all of
18:48
this and making sure you’re doing it
18:49
correctly versus what you’re getting
18:50
from it i’ve had some really interesting
18:52
discussions with clients about the value
18:54
of behavioral advertising and
18:56
that i don’t think there’s much value in
18:58
it and not to mention
19:00
protecting it against breaches and leaks
19:02
so you know the risk
19:03
yeah the more information you have the
19:05
more risk you have of a breach and the
19:07
more liability you have you know if
19:09
if you have a data breach and all you’ve
19:10
collected about your clients is
19:12
uh name and mailing address for the
19:14
services that’s hugely different than if
19:16
you’ve also collected their
19:18
you know race and gender and income and
19:21
financial account information and uh
19:24
sexual orientation
19:25
and people collect so much nowadays just
19:28
because they can
19:29
because the technology is capable of
19:31
that and until you push back a little
19:33
bit and challenge them and say okay but
19:35
why that why do you need that why do you
19:37
want that and
19:38
really highlight for them that they’re
19:40
raising their own risk
19:42
it’s that that’s something that helps me
19:45
kind of justify to myself the work i do
19:47
because i i can see
19:49
and i know that there are some privacy
19:52
really really purists that would argue
19:54
that
19:54
me working with clients that collect
19:56
data is unethical and
19:58
while i can understand the motivation
20:00
behind that argument i
20:02
i find a lot more value in engaging
20:05
and making the world more privacy
20:07
protective by talking to companies that
20:09
are engaged with it
20:10
and trying to make them do it in a
20:12
responsible way than i do in just
20:14
pulling myself out entirely and not
20:15
engaging yeah and beyond that
20:18
like i can advise my clients and i do
20:20
advise my clients and i want to do
20:21
do right by them and do right by the
20:23
people whose information they’re
20:24
collecting
20:25
but you also can do other things i mean
20:27
i i also
20:29
read a bunch of policy proposals and
20:31
send in
20:32
um formal comments on privacy laws that
20:35
are
20:36
being proposed or uh different
20:38
approaches to cases or you know we
20:40
we discuss this all the time
20:44
with with the different privacy
20:46
community you know the the privacy
20:48
community tends to be very active in debates and
20:50
discussions about
20:51
regulations and cases and proposals and
20:54
and you can still be engaged in all of
20:56
that and push for kind of regulatory and
20:58
social change while also doing the
21:01
practical work of telling a client in
21:03
this circumstance that you’re doing
21:05
right now
21:06
here’s how we make this safer yeah and i
21:08
think you know
21:09
in my own consulting my own work i find
21:11
that that conversation
21:13
about uh the increasing risk when you’re
21:15
talking about more and more data
21:17
is a really useful one to have at the
21:19
sort of c-suite level
21:21
but also it seems like even just at a
21:23
strategy level
21:24
saying that you know you shouldn’t
21:25
collect data if you can’t
21:27
think of an aligned reason why you would
21:30
want to have it right like an
21:31
alignment between what benefits you and
21:33
what benefits the customer or the person
21:34
outside the company
21:36
like if it’s only benefiting you and
21:38
it’s not going to benefit
21:39
them in any way ever then there’s
21:42
probably really good reason
21:44
not to even bother collecting it
21:45
completely agree
21:47
and also the whole mentality of we
21:48
collect it just so you have it just in
21:50
case that’s
21:52
such an irresponsible perspective both
21:54
both
21:55
from a general privacy perspective and
21:57
for them why would you want
21:59
piles more information than you’re even
22:00
using it just increases your risk
22:03
right right we have a question from one
22:05
of our followers here one of the
22:07
audience members david ryan polgar who i
22:09
love
22:10
so should social media platforms offer
22:11
data portability to move valuable data
22:14
across different platforms what do you
22:15
think about that
22:18
i i think more and more as social media
22:20
has
22:21
changed in in what it does in different
22:23
contexts i
22:24
i think absolutely um i do think
22:28
partly let me clarify that partly
22:29
because people use different social
22:32
media as
22:32
as tools both professionally and
22:34
creatively now as well as to connect
22:36
with other people and for networking and
22:38
for
22:39
other services so you know artists and
22:41
photographers constantly share their
22:43
artwork through social media
22:44
and uh writers um practice their craft
22:48
through social media and
22:49
people that are in professional spheres
22:51
like i constantly
22:52
i mean i’m not necessarily recommending
22:54
this because i’m sure they get annoying
22:56
sometimes but i constantly write long
22:57
twitter threads kind of
22:59
analyzing privacy issues or legal issues
23:01
that i see because i want people to
23:02
engage with it more
23:03
so i absolutely see the value in data
23:06
portability and in making sure that you
23:08
can
23:08
preserve those things should you leave a
23:10
platform for whatever reason
23:12
um i don’t know how useful it always is
23:15
i think it’s
23:16
it’s very useful in that you get a copy
23:17
of everything that you’ve put
23:19
there i don’t know how portable it is
23:21
because i don’t know
23:22
like if i if i took all of my
23:24
information from twitter
23:25
tomorrow i don’t know that i could
23:27
actually transfer that to another social
23:29
media
23:30
i don’t think they’re compatible in a
23:32
lot of ways so i think they’re
23:34
there are absolutely uses to data
23:35
portability i think there’s absolutely
23:37
value to it and i think there’s value to
23:39
forcing the company to go through the
23:40
exercise of saying
23:41
we know where all of your information is
23:43
and we we’ve
23:44
organized this internally sure and
23:48
you know just to make sure that
23:49
internally they’re practicing good
23:52
data structuring and are able to track
23:54
all of that but
23:56
yeah i i don’t know about actually
23:58
transferring it to other platforms
23:59
that’s kind of an interesting question
24:01
right it is an interesting thing i’m
24:02
going to keep thinking about that
24:04
i’m also wondering what you think about
24:06
the uh
24:07
own your own data sort of movement you
24:10
know where
24:10
do you stand on that concept of you know
24:13
the notion that eventually
24:14
at some point we may be able to move to
24:16
a model where people
24:18
are in uh control of their own data and
24:20
they can be paid for it and that sort of
24:22
thing
24:22
where you are on that so i understand
24:25
the appeal of it
24:26
i i absolutely understand that it’s it
24:29
can be a kind of
24:31
easier way to wrap your mind around how
24:32
data works to think of it as a piece of
24:34
property that you own
24:36
um i absolutely get the appeal of it i
24:38
think that there’s some very thoughtful
24:40
people that support that view i don’t
24:44
but the reasons that i don’t are first
24:47
of all
24:47
i think that puts way too much
24:49
responsibility on the individual
24:51
we don’t even know how much information
24:54
gets collected and by whom half the time
24:55
so having to
24:57
uh track that or parcel out
25:00
portions of your data for different uses
25:03
i think just puts too much of
25:04
both a
25:07
it puts too much of a time
25:09
responsibility on people certainly
25:10
i mean that’s a hugely time consuming
25:12
process but it also
25:14
puts a lot of research responsibility on
25:16
people like i work in privacy so i do a
25:18
lot of research into
25:21
the trade-offs of sharing information
25:23
just for my work
25:24
which is great but other people have
25:26
full-time jobs in other areas and are
25:28
taking care of small children or dealing
25:29
with family members and
25:31
have lives to live they don’t
25:32
necessarily have time to sit down and do
25:34
all of that
25:35
the other issue i have is it sets up
25:39
an inequality when it comes to being
25:41
paid for your information you know who’s
25:43
going to have the most incentive to
25:44
share their information
25:45
even if they wouldn’t normally if
25:48
they’re getting paid it’s it’s going to
25:49
be people with lower incomes
25:51
and so that sets up a system where the
25:53
wealthy can kind of
25:54
afford to keep their information private
25:57
and
25:57
the less wealthy have a lot more
26:00
incentive to
26:00
say i i don’t love this but i need to
26:03
pay rent
26:04
so here pay me for this i’ll i guess
26:07
um so i don’t like that split either and
26:10
then the the other
26:11
concern i have with it is i don’t know
26:13
how much bargaining power you would have
26:15
with big companies even if you’re the
26:17
owner of your information
26:19
big companies are still big companies
26:21
and they have a lot of power
26:22
and depending on what they’re offering
26:25
you may not actually get that much for
26:28
your information you may not get a ton
26:29
of money or value for it
26:31
because you know if you don’t want to
26:32
sell it there’s
26:34
a hundred thousand other people that are
26:36
willing to and they’ll do it at x
26:38
price so i i just think there are a lot
26:41
of complications with that approach that
26:43
don’t necessarily benefit people in the
26:45
way it’s intended to benefit people
26:46
yeah no that’s a wonderful thorough well
26:50
thought out answer and i think
26:51
another aspect of the point the last
26:53
point you were just making
26:54
is that i i’ve seen the numbers
26:56
somewhere that suggests that
26:58
individually your facebook data for
27:00
example would be worth like 20 cents or
27:01
40 cents or something like
27:03
that right compared to you know in the
27:05
aggregate
27:06
what facebook is going to get in in
27:09
terms of
27:09
monetizing it through the advertising
27:11
platform yeah
27:13
and being able to offer the the hyper
27:14
targeting that it does
27:16
so yeah so i mean there’s absolutely
27:18
very little incentive to educate
27:20
yourself and spend the time
27:21
and and become a merchant of your own
27:24
data when it means pennies
27:28
as opposed to frankly just on a personal
27:30
level i don’t like working in sales so i
27:32
don’t want to do it
27:33
in my free time absolutely
27:36
well that’s it’s interesting and by the
27:38
way david who asked us the question
27:40
about the uh
27:41
portability says great answer and loving
27:43
the show
27:45
and also another point here from kyle
27:47
johnson is also the
27:48
connections you have on a social network
27:50
aren’t portable to another
27:52
so you know and and i can even say you
27:54
know for me
27:56
the the sort of model that i use mental
27:59
model that i use for who i connect with
28:01
on twitter versus facebook versus
28:03
linkedin
28:04
very very different you know kind of
28:06
constantly
28:07
i’m sure that’s true for most people and
28:08
so certainly it would seem very
28:11
meaningless unless you were moving from
28:13
a twitter-like
28:15
platform to another twitter-like
28:17
platform uh
28:18
you know that you might be able to
28:19
maintain some of the integrity of those
28:21
connections but
28:22
most likely not so that’s a good point
28:25
well what about you know how do you how
28:27
do you think about the emerging concerns
28:30
around privacy i mean certainly we have
28:31
a lot more surveillance technology
28:33
coming
28:34
into uh into being we have um you know
28:36
facial recognition is
28:37
an enormous topic within the space where
28:40
do you
28:41
sort of find yourself most concerned
28:43
about privacy issues in emerging tech
28:46
i mean you’ve you’ve mentioned facial
28:48
recognition i am hugely concerned with
28:50
facial recognition partly because it
28:52
feels
28:53
i i really hope it doesn’t prove to be
28:55
but it feels very much like one of those
28:56
technologies where
28:58
you know once it’s out it’s you can’t
29:00
put it back in
29:02
the this is this doesn’t exist don’t use
29:04
this category
29:06
um i feel i i hope that we’re still
29:09
early enough
29:10
in the use of facial recognition that
29:12
there could be steps to
29:14
restrict it and to put more responsible
29:16
use cases around it i
29:18
frankly just really don’t like it as a
29:19
technology partly because of
29:22
you know there’s been tons of much more
29:24
informed people than than
29:26
me talking about um the the issues when
29:28
it comes to identifying
29:30
uh darker skin tones or uh accuracy with
29:33
non-binary or trans individuals
29:35
um which is a huge issue because you
29:38
know in
29:38
a lot of places including the us there’s
29:41
there’s higher likelihood that those
29:43
communities get targeted under these
29:44
systems anyway and if those all
29:46
those same communities that are being
29:48
targeted also have a really high
29:50
instance of inaccuracy
29:51
in identification it just it really
29:53
facilitates
29:54
the opportunity for abuse of those
29:56
systems um
29:58
so i hate that and even beyond that even
30:00
if facial recognition worked
30:02
perfectly and identified people
30:03
correctly every time i still wouldn’t
30:05
think we should use it because
30:07
it really does change something about
30:09
the nature of being tracked everywhere
30:10
you go it changes something about
30:13
your freedom to move around a public
30:14
space and
30:16
um to restrict your your information and
30:19
your whereabouts and
30:21
that sort of thing and and the the
30:23
counterpoint i hear that all the time is
30:25
well why would that bother you if you’re
30:26
not doing anything bad but the
30:28
i don’t have anything to hide argument
30:29
has never held water i mean
30:31
this is gross in case there are people
30:34
watching or listening
30:35
who don’t know why that argument doesn’t
30:36
hold water would you take a moment to
30:38
say why you think that i mean i
30:40
certainly feel like it doesn’t hold
30:42
water but i’d love to hear you
30:43
uh give if you have a concise
30:45
explanation as to why
30:47
the notion if you have nothing to hide
30:49
that there’s no reason
30:50
for privacy why is that wrong
30:53
oh it’s wrong for so many reasons but so
30:55
the best example i’ve heard is a little
30:57
gross and i’m so sorry but uh
30:59
basically there’s a big difference
31:01
between privacy and secrecy secrecy is
31:04
is more about concealing something you
31:06
just don’t want anyone to know about
31:08
because
31:08
because it’s you know nefarious or uh
31:11
perceived badly or whatever
31:13
privacy just means that you want to
31:15
conceal it because
31:16
you don’t feel like sharing it so for
31:19
example if someone’s
31:20
if i’m hanging out at my friend’s house
31:22
and i leave i get up to go to the
31:24
bathroom there i close the door
31:26
and i close the like people know what
31:28
i’m doing in there
31:29
it’s not secret what’s probably going on
31:32
in the bathroom but i close the door
31:34
because i want privacy that’s
31:36
gross but but beyond that there’s
31:39
privacy issues when it comes to you know
31:40
sometimes you may be
31:42
just wanting to feel out something um
31:44
and not like like
31:46
kids trying to figure out what their
31:47
political views are sometimes you want
31:49
to be able to explore that or their
31:50
sexual orientation you want to be able
31:52
to explore that
31:53
in a safe and secure way without having
31:55
to come out with a hard stance or making
31:56
a decision
31:57
or or having people observing you
32:00
because there’s also been
32:01
multiple studies that just the act of
32:03
being observed and knowing you’re being
32:04
observed changes behavior and changes
32:06
thinking and doesn’t allow people to
32:07
freely
32:09
express themselves or explore ideas and
32:11
and it’s a really stifling
32:13
situation so a good portion of why i
32:17
oppose that is the the stifling aspect
32:20
of it
32:20
but yeah when it comes down to the i’ve
32:22
got nothing to hide argument
32:24
that also changes hugely based on
32:26
context i mean
32:27
communities like refugees are surveilled
32:30
hugely after 9/11 muslim communities
32:32
were surveilled
32:33
on an incredible scale and communities
32:36
of color
32:37
and people with different sexual
32:39
orientations or gender presentations are
32:41
often surveilled much more
32:43
and they may well want more privacy and
32:46
it’s not because they’re doing anything
32:47
wrong it’s because
32:49
there are elements of the society around
32:51
them that will punish them for the not
32:52
wrong things that they’re doing and that
32:54
they’re observed doing
32:55
and so the the social context around
32:58
that is also hugely important
33:00
just because you don’t think what you’re
33:01
doing is wrong doesn’t mean that the
33:03
society around you won’t react badly to
33:05
it and you should be able to
33:07
exercise your rights that way without
33:09
constant surveillance so
33:12
i think that’s a wonderful answer yeah i
33:14
mean
33:15
it seems like you you know you almost
33:17
could just say well it shows a
33:19
tremendous amount of privilege to be
33:20
able to make the statement yeah
33:22
you have nothing to hide but you gave
33:23
really crisp and clear examples
33:25
of that so i think that’s very helpful
33:28
so but when especially when you were
33:29
talking about facial recognition and
33:31
other biometric technology i think
33:33
what what uh the big concern there is
33:35
you know those are things you just can’t
33:36
change
33:37
so it’s another it’s one thing to be
33:39
able to change your email but you can’t get a new
33:41
face right like like if that gets
33:43
compromised somehow it’s going to cost
33:45
me a lot of surgery to get a new face
33:46
and i like this one
33:47
so i don’t want to and like i can’t
33:50
change my dna i can’t change my
33:51
fingerprints these are
33:53
these are so immutable if it comes to
33:56
like an account being breached i can
33:57
change my password and i can change my
33:59
my username if i need to but these
34:02
things are so much more immutable and
34:04
they’re so much more set and they’re so
34:05
much more
34:06
tied to our actual identities and who we
34:08
consider ourselves to be
34:09
that i think the level of sensitivity
34:12
around them is just so much higher and
34:13
so the level of care should be equally
34:15
as high and one of the big problems with
34:17
facial recognition is that the
34:18
technology
34:19
has developed so quickly that there
34:21
really aren’t any regulations in place
34:24
around
34:24
it so they’ve been able to come up in
34:26
this vacuum of
34:28
of accountability essentially
34:32
and i’m not saying that there’s never
34:34
ever a circumstance where facial
34:35
recognition could be useful or could be
34:37
used for good purposes
34:39
i i don’t know enough to say that yeah
34:41
definitively
34:42
but for sure we’re not in an environment
34:44
where there’s checks on that yet and
34:46
that’s what
34:47
really scares me there’s always at least
34:49
one example that that someone’s willing
34:51
to come up with you know and and i
34:52
yeah when i had the um super viral tweet
34:55
around the
34:56
10-year challenge last year and was
34:57
making the rounds of the news programs i
34:59
was using the example of the
35:00
um the case study of facial recognition
35:03
having been used in
35:05
uh in india to track down missing kids
35:07
but of course we know like there were
35:09
there was a very limited uh experiment
35:12
uh we don’t really know how successful
35:14
it truly was
35:16
you know how successful it was with
35:17
facial recognition and would not have
35:19
been otherwise
35:20
like there’s no kind of control variable
35:22
and and the harms greatly exceed
35:25
the benefits in this and i don’t think i
35:27
i don’t think that’s
35:28
even a controversial statement so yeah
35:32
i it is interesting to see that you know
35:34
some cities have had some luck with
35:36
banning facial recognition uh and as you
35:38
mentioned the illinois
35:39
uh biometric uh regulation so do you
35:43
foresee a lot more sort of city by city
35:45
state by state kind of things coming
35:47
into play that way within the us
35:50
i do i know that there’s there’s a big
35:52
debate raging about whether there’s
35:53
going to be a federal privacy law in the
35:55
next year or two years or whatever
35:57
um i don’t know what’s going to happen
36:01
there frankly i
36:02
don’t know that the federal government’s
36:03
going to get itself together enough to
36:05
pass anything substantial but
36:07
regardless we had a few things on our
36:09
hands right now right
36:10
yeah there’s some other stuff going on
36:12
um regardless we’ve had a sectoral
36:14
system for a long time when it comes to
36:16
privacy and so
36:17
i i’ve been really encouraged by the
36:19
actions i’ve seen statewide
36:21
and the attention i’ve seen statewide to
36:22
these issues i would love for more
36:24
states to adopt something like
36:26
like bipa and uh start enforcing against
36:29
biometric
36:31
privacy violations um i i like what’s
36:34
happening with ccpa in general because
36:36
it moves the conversation forward and it
36:38
moves uh
36:40
thought about what rights people have to
36:42
their information and
36:43
what companies are required to do if
36:45
they want to use that information
36:46
um i think that there will probably be
36:50
more sectoral and statewide regulations
36:53
when it comes to facial recognition you
36:54
know seattle’s got that in place i know
36:56
colorado’s
36:57
been looking into that um but i i wish
37:00
it would move a little quicker
37:03
what makes you optimistic about what you
37:06
see happening around data privacy law is
37:07
there anything
37:08
that that you kind of look at and think
37:10
that’s what needs to happen and i’m glad
37:12
to see it happen
37:13
and maybe you want to encourage it to go
37:15
faster but where are you most optimistic
37:17
around it
37:19
you know one thing that’s made me really
37:21
optimistic is just the level of
37:22
engagement by people that don’t work in
37:25
this space
37:26
um i think that’s surprising to a lot of
37:28
people because this seems
37:30
i feel like privacy can seem like an
37:32
intimidating space because
37:34
it can seem very techy and it can seem
37:36
very
37:37
you know i don’t know how to code or i
37:39
don’t know how to program so i probably
37:41
won’t understand this or
37:42
you know it can be very easy to fall
37:46
into kind of a
37:48
nihilist approach to privacy and say
37:49
well i can’t control everything so why
37:51
bother but i’ve seen a lot of engagement
37:54
with people
37:54
when it comes to privacy rights and when
37:56
it comes to hey what should i be asking
37:58
for and what should i be fighting for
37:59
and what should i be looking for
38:01
and that has been really encouraging
38:03
just on kind of a
38:05
ground level people seem to care more
38:07
about this
38:08
when it comes to regulatory actions you
38:10
know i i do love that there’s been
38:12
multiple proposals for
38:14
a federal privacy law in the u.s i don’t
38:16
necessarily love all the drafts but i
38:18
love that there’s that
38:19
discussion happening and there’s some
38:21
some action there there’s a little
38:22
momentum
38:23
um i think the gdpr was a great step and
38:26
i i
38:27
am happy that it’s had the kind of
38:28
impact it’s had even if it was
38:30
inconvenient for people
38:32
at the jump because it does prompt just
38:35
much more care and much more thought in
38:37
what people are doing with
38:38
personal information um all of those
38:42
i’ve found really encouraging and
38:46
i think the privacy space can feel kind of
38:48
like
38:49
a very david and goliath struggle where
38:51
it’s big companies and big government
38:53
surveillance structures that are
38:55
constantly pushing against privacy and
38:57
you know we’re just these
38:59
attorneys or individual programmers or
39:01
individual people that
39:03
are are bothered by the structure or the
39:05
system but
39:07
the fact that we’ve seen these
39:08
regulations passed and the fact that
39:09
there’s been
39:11
outcry against things like mass facial
39:13
recognition use or
39:15
or mass surveillance or companies
39:19
taking as much data as they’re taking i
39:21
i think that’s a good sign
39:23
and that’s really encouraging for me
39:24
personally
39:26
that’s great and i think you know it
39:28
seems like i can relate to the idea that
39:30
people
39:31
uh seem to struggle with finding privacy
39:34
a topic that they can get into because
39:36
it seems at once abstract and arcane
39:39
and like you have to know a lot of
39:41
really specific things
39:42
but you have to care about these very
39:45
philosophical
39:46
ideas at the same time and it’s a it’s a
39:48
tough thing it’s a tough thing to
39:50
reconcile so it’s
39:51
i think it’s a wonderful thing that
39:52
folks like you are
39:54
focused on that and speaking of folks
39:56
like you
39:57
my uh one of my besties tara aaron
40:00
stelluto
40:01
has commented and said not sure if this
40:03
is the right place to ask a question but
40:05
i’d love to hear
40:06
what calli has to say about the big eu
40:08
court of justice opinion last week
40:10
and what she’s advising clients about
40:12
how to go forward
40:13
so it seems like that sort of fits in
40:15
with both the conversations we were
40:17
having about you know
40:18
the professional approach in terms of
40:19
guiding companies and what you recommend
40:21
they do but also thinking about
40:23
regulations and such so
40:24
what do you have to say uh to tara’s
40:26
question
40:28
oh so much has gone on since since
40:31
the return of schrems i referred to it as
40:33
the restraining in a professional talk
40:35
and i feel like that
40:36
probably shouldn’t stick but uh
40:40
uh beyond just a lot of urgent client
40:42
response to different things
40:44
uh we’ve we’ve had some practical
40:46
guidance that we’ve issued
40:48
i personally have been rereading fisa
40:50
and uh
40:51
the the eo 12333 and
40:54
different
40:55
surveillance acts and all of that uh
40:58
practically speaking you know i
41:01
we’ve been redoing privacy policies that
41:03
list privacy shield as the sole data
41:05
transfer method
41:06
and um we’ve been putting
41:09
sccs in place we’re making sure that
41:11
they are in place and they’re
41:12
enforceable
41:13
uh and then we’ve been doing some
41:14
additional work i i advise clients to
41:18
put some documentation in place where
41:20
they can really track you know this is
41:21
the analysis we’ve done
41:22
of the risks to the data subjects we
41:25
collect information on and this is what
41:27
we
41:28
what we think when it comes to our
41:30
whether we’re included under fisa scope
41:32
and what the likelihood is that
41:34
our information is tracked and these are
41:36
the agreements we put in place with
41:37
vendors and
41:38
uh you know we make sure we encrypt
41:40
information in these ways i
41:42
i think having a paper trail is
41:43
extremely important right now just to
41:45
show
41:46
due diligence and that you’ve been
41:47
paying attention to the issues
41:49
um i i wish
41:52
i’m too lawyerly to say this is for sure
41:54
what you should what you need to do and
41:56
then you’re fine
41:57
uh there’s always an it depends in
41:59
there but
42:01
i do think that showing clear effort is
42:04
going to be appreciated and making sure
42:06
you’re still adhering to the principles
42:07
that were enshrined under privacy shield
42:10
is going to be very important
42:11
um i really hope they put out another
42:15
framework soon or they put out some
42:17
guidance soon because
42:19
right now i feel like all of us are
42:21
doing our best to
42:22
responsibly guide our clients through
42:24
things but it’s
42:26
it’s still a very in-flux unsettled area
42:29
so uh being as responsible as possible
42:32
making sure we’re doing as much as we
42:34
can is kind of i hear a dog
42:36
she really wants to go outside i think
42:38
there’s a squirrel
42:41
oh well
42:47
and i know we had um i’m going to
42:50
go ahead and cue something up here
42:53
because we had before the show started
42:57
we actually had a question that was
42:59
submitted
43:00
uh from someone on twitter and i’ll see
43:03
if i can get this
43:03
added up on here real quick oh you can
43:05
barely see it let me make this bigger
43:09
like that i and
43:12
uh let’s see if you can see it
43:16
got a hi puppy from uh bruce celery
43:19
so the dog is already a hit i don’t know
43:21
about us
43:22
but the dog is a hit i’ll tell lucy
43:25
she’ll be very proud
43:27
layla’s out there fighting with her now
43:29
it’ll be great
43:31
that’s great um so shea swagger
43:35
asked i’d love to hear more about what
43:38
meaningful consent looks like with data
43:39
privacy i wonder if there’s anything
43:41
we can borrow from sex ed about consent
43:44
being affirmative
43:45
revocable or centering care or agency
43:49
what do you have to say about you know
43:50
where we see if there are parallels that
43:53
we can pull from
43:54
you know the conversation about consent
43:56
in terms of our sexual relationships
43:58
over into uh data privacy
44:02
i actually think that’s a really
44:03
interesting way to frame it um i think
44:05
there’s absolutely some really valuable
44:06
things we can take
44:07
from what we consider to be uh correct
44:11
sexual consent to you know consent
44:14
around data i think
44:16
uh the revocable part is important i
44:18
think it’s important that it be clear
44:20
transparency is always clear make sure
44:22
that everyone engaged in the
44:24
conversation knows exactly what they’re
44:25
discussing
44:26
um and the other part that i think is is
44:29
actually
44:30
very important is is consent to various
44:33
steps of the process
44:35
so uh if i’m consenting to let a
44:38
business use my information so i can get
44:40
their services that doesn’t mean i’m
44:42
okay with them sharing it with a bunch
44:43
of other people
44:44
just like if i you know agree to go on a
44:46
date it doesn’t mean that you’re coming
44:47
back to my place later
44:49
so there’s a lot of steps in between and
44:52
there’s a lot of different
44:53
uh uses of information that should be
44:55
considered so full consent i think is
44:57
also really important when it comes to
44:58
data security and privacy and making
45:00
sure that there’s
45:03
the ability to consent to maybe some
45:04
uses of information but not all uses of
45:07
information
45:08
um i think that’s a really important
45:10
aspect that
45:12
can get overlooked sometimes the other
45:14
aspect yeah how the information is used right
45:16
like
45:17
exactly it was an example that i had
45:19
written and i shared with shea with you
45:21
i think on twitter that i had written a
45:23
piece a few years ago
45:24
that was talking about um foursquare
45:27
knowing that i was at a bar
45:29
that i never searched for on foursquare
45:31
or on google or in any way shape or form
45:33
i didn’t even know the name of the bar
45:34
because i was going with a friend we
45:36
were going after an event and we walked
45:37
over to this bar he knew it
45:39
and then in the morning i got an email
45:40
from foursquare saying what did you
45:42
think
45:42
of name of bar it was just like what i
45:46
don’t i
45:46
until this moment i didn’t even know
45:47
what the bar was called and it just
45:49
seemed so
45:50
yeah it seemed so creepy and invasive to
45:53
me and what it what it made me think
45:54
about was that
45:55
um you know rape culture is so much
45:58
about
45:58
the the um where accountability and
46:02
power
46:03
resides and the fact that you know these
46:05
these
46:06
companies like foursquare and others
46:08
that are tracking your location even if
46:10
you’ve
46:10
said i i don’t want you tracking it
46:12
unless i’ve given you
46:14
the incentive or you know the the ask
46:16
right then and there like unless i’ve
46:18
said i want you tracking this i want to
46:19
know
46:20
you know where i am or where i’m going
46:22
or whatever and they’re tracking it in
46:23
the background
46:24
you know at least the common courtesy
46:27
not to tell me that you’ve
46:29
you’ve tracked it and so i just think
46:30
it’s a it’s a
46:32
whole disconnect around the way that
46:35
i think the value that the companies
46:38
that are collecting this data think that
46:39
they’re offering
46:40
versus what people on the other side of
46:42
the interaction believe the value to be
46:44
and then the assertion of power and
46:47
dominance in ways that are
46:48
really weird and wrong and and so yeah i
46:51
think it’s a really important
46:52
uh point that shea has brought up here i
46:55
agree and i think there’s actually some
46:56
broader parallels you can draw
46:58
from that to to privacy in general like
47:00
just because i consent for
47:02
one company to use certain information
47:04
about me and in one way doesn’t mean
47:06
that i can send to all companies that do
47:08
that service
47:09
to do the same thing right you know it’s
47:11
specific it’s to this one company that
47:13
i’m
47:13
purposely interacting with it doesn’t
47:15
mean that i’m okay with it in all other
47:17
contexts
47:18
and it also doesn’t necessarily mean
47:20
that i’m always okay with that company
47:23
using my information over and over you
47:25
know it can like
47:26
if i go on a date with one person that
47:29
doesn’t guarantee that i will go on a
47:30
date with them forever anytime they feel
47:32
like it means that
47:33
you have to keep checking yeah um
47:37
and so i think i think the
47:38
responsibility to follow up and to be
47:40
clear
47:40
on exactly what is being permitted is is
47:44
a carryover
47:45
absolutely so um a follow-up question
47:48
from
47:49
jenna jordan in the comments is is data
47:51
ownership a necessary precondition to
47:53
true
47:54
fully empowered consent though
47:57
that’s a really interesting question i
48:00
don’t know if it’s
48:01
ownership the thing that bothers me well
48:04
the thing i already gave a list of
48:05
things
48:06
one of the things that bothers me a
48:07
little bit about thinking of
48:09
data as a physical tangible good that we
48:12
own
48:13
is that that just doesn’t track with the
48:14
nature of it
48:16
it’s kind of
48:19
a renewable resource that way
48:22
it’s not like this one thing that i
48:23
give this one time it’s this constantly
48:25
regenerating resource there’s always
48:27
more information about me and if i share
48:29
it with one person that doesn’t preclude
48:31
me from sharing it with other people
48:33
and there’s different limitations on it
48:35
than there would be on like a physical
48:36
object and in addition the ownership
48:38
factor i think
48:42
puts different parameters and
48:43
responsibilities on the parties involved
48:45
that i don’t necessarily think
48:47
perfectly correlate i absolutely get why
48:49
we constantly use it as
48:51
you know data as property because that’s
48:54
just a much easier way to
48:55
understand it
48:56
the the difficult thing about
48:58
information
49:00
and part of frankly why privacy law is
49:02
constantly fluctuating and is such a
49:05
kind of squishy area of law in some ways
49:08
is that it
49:08
it just doesn’t perfectly map onto
49:11
property law it
49:12
it intersects constantly but it doesn’t
49:15
perfectly map
49:16
and it doesn’t perfectly map onto issues
49:18
of you know
49:20
contract law and liability and all of
49:22
that it touches all these different
49:24
areas but it doesn’t perfectly form into
49:26
anything
49:28
um
49:33
we’ve been talking for too many years
49:35
about data as the new oil and
49:36
you know there are these kind of
49:38
concepts of data lakes and and that sort
49:40
of thing but
49:40
those metaphors only go so
49:43
far and they only make so much sense
49:46
and i think you know data as property
49:48
that you can own and you can barter and
49:49
you can
49:50
you know commodify i think as we can
49:53
pretty clearly see when you start to
49:54
model it out only really works when
49:56
you’re talking about it
49:57
in the aggregate as it’s monetized by a
50:00
corporation like facebook as opposed to
50:02
by an individual
50:04
in their own interest i agree i do think
50:07
to the core question
50:08
thinking of it as like if someone’s
50:10
getting consent from me for my
50:12
information that implies i own the
50:13
information i think that’s accurate it’s
50:15
just it’s not ownership in the same way
50:17
as you would own a physical good
50:19
like it is mine it comes from me it’s
50:22
about me
50:23
but it’s not the same as saying like my
50:26
house or my dogs or
50:27
or whatever um so i i don’t think that
50:31
assertion’s wrong at all i think that’s
50:33
correct like for consent to work you
50:35
have to go
50:36
from the base assumption that
50:38
because it’s
50:39
of me and connected to me i’m the one
50:42
that gets to say whether you use it or
50:44
not
50:45
but again it just doesn’t perfectly
50:47
correlate when it comes to things like
50:49
sale or or ownership in the traditional
50:52
sense that way
50:53
yeah yeah it makes sense no i think it’s
50:56
an important thing that
50:57
you know this is why we don’t have
50:59
very good metaphors for this because i
51:00
think there aren’t enough discussions
51:02
that are happening across
51:03
you know disciplines and considerations
51:06
so i think it’s
51:07
um it’s really healthy to to keep
51:09
talking about it so that we can find out
51:11
what the nuances are and and dive into
51:13
them so
51:14
i know we’re getting uh kind of close to
51:16
our hour here so i want to make sure
51:18
that we
51:19
um we get a chance to get to what i
51:21
think is a
51:22
really important question which is that in my
51:25
work
51:25
around this tech humanist concept i’m
51:27
always trying to think about
51:29
how we build the best futures for the
51:31
most people and
51:32
i i wonder what you think we can do
51:35
around data privacy law and culture
51:38
to stand a better chance of bringing
51:41
about the best futures
51:45
i think
51:48
i think education and empowerment are
51:50
both really important so
51:51
keeping the discussion about privacy as
51:54
a prominent thing
51:55
and and continually bringing it up and
51:58
kind of keeping it at the forefront of
51:59
people’s minds you know the
52:01
the other tricky thing about privacy is
52:02
that it’s something that’s never really
52:04
done
52:05
i mean you don’t you don’t finally
52:07
figure it out and like
52:08
check a box or send in a form and then
52:11
you’re done with privacy and you don’t
52:12
have to deal with it anymore it’s an
52:13
ongoing thing it’s kind of like
52:16
uh this is also a sad metaphor because
52:19
of
52:19
quarantine but it’s kind of like fitness
52:21
you know i
52:23
if i if i want to stay on top of it i
52:25
have to keep doing it i don’t get to
52:27
just hit a set point where i’m like
52:28
awesome
52:29
done never have to go to the gym
52:31
again
52:32
like you have to maintain it and
52:35
that’s particularly true of privacy
52:37
because it involves tech that’s
52:38
constantly evolving and constantly
52:40
changing
52:40
and and regulations that are constantly
52:43
evolving and changing
52:44
um i think to make and i’m going to
52:46
interject and just say i think it’s a
52:48
challenge to us individually too because
52:50
we’re constantly being bombarded with
52:52
opportunities to participate
52:53
in things and share things and uh
52:56
divulge things about ourselves and make
52:58
different trade-offs and and
53:00
compromises that we have to evaluate on
53:03
the fly
53:03
and most people don’t have the big
53:06
picture to be able to make that decision
53:08
in as sophisticated a way as they need
53:10
to
53:11
yeah i think that’s absolutely true i
53:13
think for a bigger picture
53:14
you know trying to make the future
53:16
better for people when it comes to
53:17
privacy
53:19
you have to keep advocating really hard
53:21
um
53:23
you know i don’t know her last name
53:26
because it’s twitter but eva made a joke
53:27
the other day that
53:29
uh she doesn’t like being referred to as
53:31
a tireless privacy defender because
53:32
she’s like i’m tired i’m very tired i
53:34
would like a nap
53:36
um and it’s true you know it does
53:39
get really tiring because it can feel a
53:40
little bit like you’re kind of pounding
53:42
your head against a wall
53:43
it just keeps on creeping closer to you
53:45
and
53:46
there’s there’s only so much you can do
53:47
but it is valuable and i think it
53:50
it’s important to keep in mind that
53:51
advocacy is what
53:53
keeps what protections we have
53:55
in place
53:56
and it’s what pushes for regulations and
53:58
changes in privacy and it’s what
54:01
helps protect people and their
54:03
information and i think keeping the
54:05
perspective of this isn’t just for me
54:07
because i’m uncomfortable with alexa
54:10
listening to me or whatever it’s
54:12
it’s for people that are in more
54:14
vulnerable population groups it’s for
54:16
kids that are developing a sense of self
54:18
and need to be able to do that safely
54:20
and it’s for
54:21
um you know lgbtq and non-binary and
54:25
trans youth who who would face actual
54:27
danger
54:28
if privacy keeps getting reduced and um
54:31
people in in refugee communities and in
54:34
you know
54:34
uh activists that get tracked it’s
54:37
it’s
54:38
thinking of privacy as something that
54:40
you fight for not just for yourself i
54:41
think really keeps things moving forward
54:43
and keeps the importance of it
54:45
to the forefront of your mind i think
54:47
it’s easy to get hung up in the like
54:48
well
54:49
what do i do i just do paperwork all day
54:51
every day i write
54:53
privacy policies nobody ever reads and i
54:56
i do a bunch of contract renegotiation
54:59
and reread laws that no one cares about
55:01
but but it does matter and it does make
55:03
a difference and i think keeping that in
55:05
the forefront of your mind really
55:07
improves things for the future and and
55:09
provides motivation for us to keep going
55:13
so that’s that’s really my biggest
55:15
takeaway
55:16
that’s a beautiful uh takeaway i think
55:18
privacy as advocacy
55:20
and thinking of it you know in the
55:21
biggest most interconnected sense
55:23
is is a beautiful way to to close this
55:25
out so calli schroeder thank you so much
55:28
for joining us for the tech humanist
55:30
show
55:30
thanks to everyone who watched live and
55:32
offered comments uh
55:34
and questions and uh thanks to all the
55:36
listeners
55:37
out there uh please go ahead and
55:39
subscribe comment like wherever you see
55:41
this
55:42
and um please tell your friends about it
55:44
thanks again calli
55:45
and we’ll see you next time no problem
55:47
thank you so much for having me