The Tech Humanist Show: Episode 13 – Ana Milicevic

About this episode’s guest:

Ana Milicevic is an entrepreneur, media executive, and digital technology innovator. She is the co-founder and principal of Sparrow Advisers, a strategic consultancy helping marketers and C-suite executives navigate the data-driven adtech and martech waters. A pioneer of digital data management in advertising, Ana was responsible for the development of the Demdex platform (now Adobe Audience Manager) from its early days through its successful acquisition and integration into the Adobe Digital Marketing suite. Prior to starting Sparrow, she established Signal’s Global Strategic Consulting group, and at SAS she helped Fortune 500 customers adopt advanced and predictive analytics across their marketing, ad ops, and digital content business units. Her consulting portfolio includes working for the United Nations, executing initiatives in 50+ countries, and advising companies on go-to-market strategies around the globe. Ana is frequently quoted by media powerhouses like The Wall Street Journal and Business Insider (which in 2018 named her one of 23 industry leaders working on fixing advertising) as well as industry trades like AdWeek, AdAge, Digiday, Marketing Magazine, AdExchanger, and ExchangeWire. She is a sought-after speaker on topics of adtech, martech, innovation, customer experience, data management, and new frontiers of technology.

She tweets as @aexm.

This episode streamed live on Thursday, October 8, 2020. Here’s an archive of the show on YouTube:

About the show:

The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.

Subscribe to The Tech Humanist Show channel on YouTube, hosted by Kate O’Neill, for updates.

Transcript

02:20
okay
02:22
all right hello humans
02:25
welcome to the tech humanist show
02:28
uh come on in start gathering around the
02:31
digital campfire
02:33
i am of course your host kate o’neill so
02:36
let me hear from you
02:37
from those of you who are already online
02:38
i see some numbers starting to tick up
02:40
here
02:41
so uh hello say hello tell me where
02:43
you’re joining in from
02:45
and i’ll go ahead and start clueing you
02:48
in that
02:48
so today we’re going to be talking a lot
02:50
about
02:52
ad technology data and privacy and other
02:54
related topics so start thinking of your
02:56
questions
02:57
now about those topics and in fact feel
02:59
free to start asking those
03:01
in the comments even as you keep saying
03:03
hello and checking in which you should
03:04
be doing
03:05
and that will help us answer as many
03:07
questions and discuss as many of those
03:08
comments as possible
03:10
so this as you may know is a multimedia
03:13
format program which means that
03:15
as i’m speaking it’s being broadcast
03:17
live but it’ll also live on as an
03:19
archive on multiple channels so people
03:21
can always find it later
03:22
so hello to those of you in the future
03:24
from those of us in the past
03:26
uh and also each episode gets turned
03:28
into a podcast for the following week
03:30
uh so you can always find it that way as
03:31
well each week we explore different
03:33
facets
03:34
of how data and technology shape the
03:36
human experience
03:37
i hope you’ll subscribe or follow
03:39
wherever you’re catching this so that
03:41
you won’t miss any new episodes because
03:42
they’re all wonderful
03:44
as is today today’s episode is going to
03:46
be fantastic
03:48
so please do note that as a live show we
03:51
will do our best to vet those comments
03:53
and questions in real time but we might
03:54
not get to all of them
03:56
uh very much appreciate you being here
03:58
though and chatting with us
03:59
and just generally participating in the
04:01
show and saying hi
04:03
so with that i am looking for those
04:06
comments bring them in folks
04:08
but meanwhile i’m going to go ahead and
04:10
introduce our guest
04:12
so today we have the great privilege of
04:15
talking with ana milicevic
04:16
who is an entrepreneur media executive
04:19
and digital technology innovator
04:21
she is the co-founder and principal of
04:23
sparrow advisors
04:25
a strategic consultancy helping
04:26
marketers and c-suite executives
04:28
navigate the data-driven ad tech and
04:30
martech waters
04:32
a pioneer of digital data management in
04:34
advertising ana was responsible for the
04:36
development of the demdex platform
04:38
which is now adobe audience manager from
04:40
its early days through successful
04:42
acquisition and integration
04:43
into the adobe digital marketing suite
04:46
prior to starting sparrow
04:48
she established signal’s global strategic
04:50
consulting group and helped fortune 500
04:52
customers adopt advanced and predictive
04:55
analytics across their marketing
04:57
ad ops and digital content business
04:59
units at sas
05:00
her consulting portfolio includes
05:02
working for the united nations
05:04
executing initiatives in 50 plus
05:06
countries and
05:07
advising companies on go to market
05:08
strategies all around the globe
05:11
so ana is frequently quoted by media
05:13
power houses like the wall street
05:14
journal
05:15
and business insider who in 2018 named
05:18
her as one of 23
05:19
industry leaders working on fixing
05:21
advertising thank goodness
05:23
as well as industry trades like ad week
05:25
ad age digiday
05:26
marketing magazine ad exchanger and
05:28
exchange wire she’s a sought-after
05:30
speaker on topics of
05:31
ad tech martech innovation customer
05:33
experience data management and new
05:35
frontiers of technology
05:37
so audience keep getting those questions
05:39
ready for our outstanding guest and with
05:41
that
05:42
please welcome my friend ana milicevic
05:45
ana you are live
05:46
on the tech humanist show thank you so
05:48
much for joining us
05:50
hello hello thank you for having me it’s
05:52
really a pleasure to be a guest here
05:54
yeah it’s a pleasure to have you and we
05:56
um we i think this is kind of like
05:58
this is one of those things i have a few
06:00
friends that immediately popped into
06:01
mind when i started doing this show is
06:03
like
06:03
they’ve got to be guests on the show at
06:05
some point but i’m sort of like
06:07
keeping you in reserve all the wonderful
06:10
friends that i know are just brilliant
06:11
i’m like
06:12
i gotta i gotta use you sparingly
06:14
because i know i’m gonna have so much
06:15
fun with those conversations that i
06:17
don’t want the other ones to pale by
06:18
comparison
06:20
but here we are
06:23
well hey so you know we’ve got a lot of
06:26
stuff that to
06:27
to dig into because i actually went
06:29
through and to prepare for
06:31
some notes for today i had gone through
06:34
your
06:34
um your newsletter that you write weekly
06:38
right uh-oh
06:41
i think we’ve lost ana just a second
06:43
here so we’re gonna we’re gonna
06:44
reconnect
06:46
and hopefully we stay connected here uh
06:49
with the live show but it’s nice that
06:52
ana left us with a frame where she’s
06:54
smiling
06:56
we’ll get her back on in just a second
06:57
though see if we can’t reconnect here
07:04
sorry bear with us folks keep those
07:06
comments and questions coming though
07:08
tell me where you’re tuning in from i
07:10
want to see um
07:11
where folks are isn’t it fun that
07:15
ana’s freeze frame is of a smiling
07:17
face i love it
07:20
we can’t seem to get back here
07:23
let’s try this one more time oh
07:26
darn internet connection
07:39
there we go yay
07:42
that was like really sweating that one
07:45
out because
07:46
but ana i don’t know if you can hear me
07:48
say that the frame
07:49
the uh frame you froze on was you
07:53
smiling
07:53
so it was really nice it was like i
07:56
didn’t even know you were gone for a
07:57
second because i was just like oh she’s
07:58
just smiling
08:00
oh no i
08:03
i hope i’m back to moving and that
08:05
whatever that internet god hiccup
08:08
was is gone but who knows
08:12
well so okay mark barnhart says
08:15
glitch has happened part of the fun of
08:17
live so
08:19
true so true uh it is um it is part of
08:22
the fun of this is that
08:23
you know you never know what’s going to
08:25
happen and you got to keep
08:28
keep things going oh we’ve lost ana
08:30
again
08:32
uh let’s get this reconnected
08:36
oh something horrible is happening with
08:38
our internet connection
08:40
today all right but thanks for
08:43
commenting mark
08:44
we need i know there’s a bunch more
08:46
people out there so we need to get uh
08:48
get folks chatting to keep us going
08:50
while we get ana reconnected here
08:59
wait i saw a movement
09:03
wow okay i don’t know how to describe
09:05
what’s happening
09:06
right now again i’m gonna have to
09:09
actually take a photo with my phone
09:11
and send it over twitter because it
09:13
is just ridiculous
09:15
and if you all can hear me i did
09:18
uh start i did start today by saying
09:21
that
09:22
i’ve experienced every sort of weird
09:24
technical glitch today so obviously
09:26
there’s some kind of crazy cloud
09:29
yeah i think we
09:32
oh you were breaking up there again oh
09:34
gosh it’s so crazy are you
09:36
uh sorry for the administrative stuff
09:38
folks
09:39
are you hardwired in by the way or or is
09:41
there a possibility of getting
09:42
uh ethernet ethernet no unfortunately
09:46
that’s the thing that that i
09:47
i really can’t do i don’t have an
09:49
ethernet
09:50
line um so i’m i’m hoping this improves
09:54
itself pretty shortly but if you can see
09:56
my my
09:57
my my uh what this looks like from my
10:01
perspective
10:02
right now my screen has literally just
10:03
gone all pixelated and dark
10:06
[Laughter]
10:07
well you look fine and you sound fine
10:10
everything’s great as far as
10:11
we’re concerned right now so let’s see
10:13
if we can just plow right ahead
10:15
sorry for your pixelated view of the
10:17
world but um you know that’s kind of a
10:19
fitting
10:20
way to go about this conversation i
10:23
agree
10:23
[Laughter]
10:25
well hey so you know i want to dig right
10:27
in i was say i was saying i think when
10:29
the connection dropped first that
10:31
and to prepare for this conversation
10:32
i’ve actually gone back and looked at
10:34
your
10:35
uh newsletters that you’ve written so
10:36
you have this weekly is it weekly
10:39
uh yes it’s it’s mostly weekly so far
10:41
since we
10:42
started it it’s been
10:46
oh no we’re frozen again all right it’ll
10:49
come back
10:49
i’m sure it will i’ll keep i’ll uh
10:52
just keep saying what ana’s saying
10:55
here which is that
10:56
it’s um it’s a weekly connection
10:59
or sorry weekly newsletter
11:02
called sparrow one everybody should
11:04
subscribe to it because
11:06
if you don’t get a chance to hear
11:08
ana’s brilliance uh
11:09
without this many interruptions then you
11:11
deserve to treat yourself to uh the
11:14
uninterrupted genius
11:16
of ana. georgia o’neill who is my
11:20
mom hi mom
11:20
says technology is not always helpful
11:23
you know it’s kind of funny that that’s
11:24
um
11:25
in a way that’s a little bit the
11:27
underlying theme of the show
11:30
we we want technology to be helpful we
11:32
want it to be
11:33
um helpful to humans and to help us
11:36
connect with each other and uh have
11:39
meaningful experiences but what it’s
11:43
really i think what the the most helpful
11:45
thing that’s happening with technology
11:49
is that it’s forcing us to be very
11:51
clever
11:53
because like right now we just lost ana
11:55
completely
11:56
let’s see if we can bring her back
12:05
mark says i was a guest on a show he’s
12:08
currently unavailable
12:09
please leave a message after mark says i
12:12
was
12:13
a guest on a show some years ago but
12:14
the tech gremlins that day rendered the
12:16
entire interview unusable
12:17
it’s awkward both for the host and for
12:19
me mark sorry you had that experience it
12:22
is
12:22
it’s rough when these things are uh
12:24
really not working this is this is
12:26
episode
12:26
13 of the show so this is
12:29
kind of a bad omen probably
12:34
all right i’m gonna try ana one more
12:36
time here and see if we can’t get her
12:38
reconnected episode 13.
12:41
what do you think guys uh comment and
12:43
tell me if you’re superstitious about
12:45
things like
12:46
the number 13.
12:52
who’s this who’s uh superstitious out
12:54
there
12:56
uh well let me tell you a little bit
12:57
about ana’s work while i’m sure she’s
12:59
trying desperately
13:01
and you know sweating it out trying to
13:02
get back on the show let me tell you a
13:04
little bit about
13:05
what um what she’s doing with this
13:07
sparrow one
13:08
newsletter that i was starting to tell
13:10
you about she’s been covering
13:13
a lot of topics related to um how
13:17
data informs ad models advertising
13:21
and one quote from one of her
13:23
newsletters was
13:24
we’ve previously explored how
13:27
today’s big tech should really be thought
13:29
of
13:29
as big advertising since today’s
13:31
platforms owe
13:32
significant portions of their revenue to
13:34
advertising
13:35
so it’s a really interesting observation
13:37
like the the idea that what we think of
13:39
as big tech
13:40
fun you know functionally is really
13:43
all about advertising and it should be
13:46
approached that way
13:46
hey here come here she comes let’s see
13:48
if we can get this
13:50
this show rocking and rolling one more
13:52
time there she is
13:54
well that was fun
13:59
hi everybody hey
14:03
okay i was just telling everybody while
14:05
you were away i know that this is
14:07
episode 13 so i’m wondering if we
14:11
found the gremlins are hitting us with
14:13
superstition now
14:16
indeed and did all right that explains
14:19
everything i’m not worried about yeah
14:21
yeah my mom says yay
14:26
hi i want to go ahead and move into
14:30
some of the discussions so we can
14:31
actually get some of your genius
14:32
captured here
14:33
you know one question i had for you
14:35
almost like right away is there’s this
14:37
this old chestnut uh by the way i was
14:39
talking about why you were
14:40
gone about the whole big tech should be
14:42
thought of as big advertising
14:44
observation that you made in one of your
14:46
newsletters and it feels to me like one
14:48
of the other
14:50
really commonly cited cliches
14:53
is that if you’re not paying for the
14:56
product you are the product and that
14:57
feels like it’s been around as an
14:59
observation for at least a decade
15:01
maybe even longer um but that it’s not a
15:04
very contemporary observation
15:06
about the relationship between you know
15:08
humans and data and technology
15:11
um feels like things have gotten more
15:13
complicated in the decade since that
15:15
became popular so i wondered if you
15:17
see that that way as well and if you
15:19
have a new mental framework an updated
15:22
mental framework that we can replace
15:23
that with
15:26
yeah i i think that you know for
15:30
the better part of uh certainly the last
15:33
decade but really the last two decades
15:35
since we’ve had
15:35
digital advertising and the internet
15:38
that’s been becoming an
15:39
ever increasing part of our lives uh
15:42
we’ve kind of answered every question on
15:44
well how am i going to monetize
15:46
you know fill in the blank whatever
15:47
service product whatever
15:49
this is that we’re talking about the
15:51
answer that came to mind always was well
15:53
we’ll just you know run some advertising
15:55
against it or we’ll monetize through ads
15:58
and so we’ve we’ve been kind of
15:59
postponing this
16:01
decision on uh what should
16:04
monetization look like in a mature
16:07
internet as opposed to an emerging
16:08
internet and
16:09
you know the that that’s kind of what
16:12
what we’re hitting now
16:14
uh is is that wall of continuously bad
16:17
decisions and postponing the decision of
16:19
what monetization should look like
16:21
i think for the consumer this is
16:23
certainly coalescing now
16:25
as a chat challenge where you know
16:28
we’re being asked to directly support
16:31
some content creators whose content
16:33
we’re used to consuming for free
16:35
so this is every journalism conversation
16:37
that we have today ties back to this
16:40
and uh you know it’s a really silly
16:42
conversation to have
16:43
when we used to consume journalism
16:45
through print journalism
16:47
nobody thought twice about paying for a
16:50
single issue of a newspaper
16:52
but right now if you ask somebody you
16:54
know hey
16:55
i want four dollars for you to be able
16:57
to read all of this amazing reporting
16:59
today
17:00
that seems like uh you know folks will
17:04
look at that and go like oh that’s
17:08
too much so there’s a
17:12
mismatch between where we are with
17:14
monetization which is still to your
17:16
point very
17:16
advertising oriented the perception of
17:19
value on the the user end of this
17:21
advertising supported ecosystem and then
17:23
we’re just starting to nip
17:25
at the bud of what some future
17:27
monetization models may look like
17:29
but we haven’t really settled on one or
17:32
several
17:32
that will look like they’re taking off
17:34
so we’re in that exciting in between
17:36
where
17:37
and any thanks paul
17:42
but at the same time nothing seems to
17:44
work anymore at least the barclays has
17:45
really
17:46
uh fluctuating space so i don’t know
17:49
that there is a good
17:50
mental model to transition to other than
17:53
as consumers we really need to be more
17:55
aware
17:56
of the trade-offs we’re making with data
18:01
and this is where awareness at all
18:04
yeah you know it so one of the phrases i
18:07
use frequently in my work is
18:09
analytics are people and it seems like
18:12
you know
18:12
what i’m trying to get across there is
18:14
that so much of the
18:16
data-driven landscape of decision-making
18:19
is based on
18:20
um the data trail that’s generated from
18:22
human behavior and
18:24
human preferences and uh you know all of
18:26
the human communications and
18:27
relationships
18:28
as they transact in online and
18:30
increasingly in offline spaces too
18:33
and you know one one of the things that
18:34
i wonder about when you’re talking about
18:37
this
18:37
the shift or the the growing awareness
18:40
that you know
18:41
what has been happening with the the
18:44
internet of
18:45
the last 20 years is an increasing
18:47
reliance on ad-based models
18:49
and that is happening in parallel with
18:52
the the diminishing consumer expectation
18:56
that they have to pay
18:58
for uh subscriptions or anything like
19:00
that and
19:01
the uh the lack of viability it seems in
19:04
many other monetization models so
19:07
what are some of the ways that uh that
19:10
we can think about
19:11
some of the emerging opportunities there
19:13
like what what are some of the
19:15
monetization models that we might
19:17
experiment with of course subscriptions
19:19
are one that comes to mind and you know
19:21
freemium sorts of models and things like
19:23
that what are you seeing that seems
19:25
innovative and exciting that that
19:27
anyone’s out there playing with or
19:28
trying to get
19:29
some some consumer awareness of
19:34
i think one of the areas that i’m
19:36
personally really excited about
19:38
is uh premium vod so we have the the
19:41
pandemic to thank for
19:43
a pretty big shift in how we consume
19:46
entertainment and really predominantly
19:49
premium entertainment so we
19:51
largely don’t have live concerts anymore
19:54
we’re definitely not going to movie
19:55
theaters but still there’s a slate of
19:58
really you know content that needs to be
20:01
still monetized that way
20:02
and so i you know i look at releases
20:05
like universal’s trolls or now the
20:07
ever postponing slate of really exciting
20:10
movies that i’m sure a lot of us would
20:11
pay
20:12
a lot of money to be able to see this
20:14
here like the james bond film
20:16
uh christopher nolan’s tenet another
20:18
you know really high
20:20
high ticket item that has a lot of
20:22
built-in fans already
20:24
but at the same time the the studios
20:26
that are running them
20:27
just don’t have the agility to be able
20:29
to shift into this more on-demand
20:31
direct-to-consumer entertainment
20:33
universe
20:34
so i i’m fascinated by the impact that
20:37
the past
20:38
six seven eight months are going to have
20:41
long term
20:42
on sports entertainment esports and
20:45
these
20:45
emerging ways of how we spend our time
20:48
because
20:49
as with most things the trends that
20:52
would have taken
20:53
five to ten years to play out are now
20:55
actually being
20:56
condensed in 18 months to three years
21:00
and it’s just like
21:02
you know being able to to witness almost
21:04
a big bang of sorts uh
21:06
in changes and so i’m super excited
21:08
about that yeah and one of the other
21:10
examples seems like it’s uh and you
21:12
wrote about it is hamilton being
21:14
released on
21:15
disney plus and so that whole enthusiasm
21:18
that that generated around
21:20
a relatively new streaming platform
21:23
uh and you know kind of garnering the
21:26
opportunity for for people to explore
21:28
the
21:28
the catalog that they had which i didn’t
21:31
know
21:32
until we had access to disney plus that
21:34
they had national geographic
21:36
and they had marvel and so all these
21:37
other things had somehow escaped my
21:39
notice so
21:39
there’s a really interesting uh lesson
21:42
there i think about um
21:44
you know kind of the ex exposing the
21:47
opportunity to consumers and making sure
21:49
that
21:50
you know there’s choice and that there’s
21:52
a
21:53
breadth of of um opportunity
21:56
uh what do you see happening there that
21:58
that’s in parallel i guess i want to add
22:00
into that
22:00
that question too this is all happening
22:03
as you say at a time through the
22:04
pandemic where we’re also seeing
22:07
this tremendous growth in e-commerce and
22:09
you know
22:10
uh the downtick in uh the closures of
22:13
brick and mortar
22:14
and things like that so there’s just so
22:17
much upheaval
22:18
in so many industries as a result of the
22:20
pandemic
22:22
um i guess i don’t know exactly how i’m
22:25
formulating that question and
22:27
uh actually i’m not sure that you i
22:30
think you have frozen again
22:32
oh no
22:33
[Laughter]
22:36
are there you’re there you’re moving oh
22:38
no i
22:39
just once you started yes no i just as
22:41
you started saying and i guess i want to
22:42
dovetail the next question so i didn’t
22:44
hear
22:44
any of that part um that’s okay i think
22:46
i i butchered the question
22:48
so we’re just back to hamilton let’s
22:51
just go back
22:54
i i think we
22:58
yeah like what we see a lot of
23:01
traditional
23:06
companies whether they’re traditional
23:08
brands or traditional entertainers
23:12
sorry i i’m so sorry this is awful i
23:14
think it’s my internet that’s uh tanking
23:16
but yeah so you’re saying about uh
23:18
hamilton i think with
23:21
sorry everybody um i hope this is still
23:24
fun
23:25
um yeah so you were saying about
23:28
uh hamilton and what i liked about that
23:31
um
23:33
that approach was that disney uniquely
23:36
understood
23:37
that uh to to be
23:40
relevant in this direct-to-consumer
23:42
world you also have to approach
23:44
uh awareness and consumer acquisition
23:47
differently and you know in a world of
23:50
content abundance
23:52
where there is so much good
23:55
out there it’s usually not about a
23:57
single piece of content that will drive
23:59
somebody to subscribe
24:00
but you can use it the way disney has
24:03
used hamilton
24:04
to really to your point let folks
24:06
experience and kind of
24:07
talk themselves into uh continuing to to
24:11
pay for the service and and
24:12
really exposing the the value of the
24:15
service
24:16
and this is the the number one uh
24:21
mistake we see a lot of traditional
24:24
companies make
24:25
not really understanding how to pitch
24:27
to a digital first mobile first consumer
24:30
or a direct subscriber
24:32
they’re just not wired to do it that way
24:34
and it oftentimes the
24:36
you know technology stacks that they
24:38
have in place just
24:39
aren’t the the types of tools that can
24:42
facilitate this type of direct
24:44
interaction as well
24:45
so they’re they’re stuck in this very
24:48
strange limbo where
24:50
they are committed to continuing to
24:52
acquire customers
24:54
in traditional ways but that’s just not
24:56
how you would go about acquiring
24:58
a direct customer that you you now want
25:00
to go and acquire
25:02
yeah and your point about the the
25:04
difference between
25:06
you know acquisition and retention you
25:08
know these are our concepts that are
25:10
are completely familiar to any marketer
25:12
but i think the more
25:13
that you think about um how that plays
25:17
out in an increasingly
25:18
humanistic way like making sure that
25:21
we’re thinking
25:22
uh holistically about human experience
25:24
as opposed to manipulatively
25:26
like because i think as we’ll get
25:28
further into the discussion about
25:30
the conversation that would have
25:32
happened 20 years ago
25:34
when i was at netflix about retention
25:37
and using
25:37
customer data to you know offer sweeter
25:40
deals and keep people around
25:42
is a potentially very different
25:44
conversation than what could happen
25:46
now 20 years later with um
25:49
not necessarily with netflix although
25:51
with netflix with any company and
25:53
the level of of granularity of data
25:56
that’s available the level of
25:58
um you know connected insights that can
26:01
come from from different data sources
26:03
um so it becomes a very different kind
26:05
of conversation and it’s it’s one that
26:07
leads i think into um into the larger
26:10
conversation
26:11
of of privacy and and you’ve written
26:14
about
26:14
privacy quite a few times you’ve said
26:18
we’re having the wrong conversation
26:19
about privacy what do you mean by that
26:26
yes it means something
26:31
nice to have like i i hear privacy
26:34
and it’s you know fluffy and great but
26:37
ultimately not something that is a
26:39
must-have and that’s my
26:40
ongoing frustration with privacy related
26:45
conversations because
26:47
i like to think of it in terms of data
26:49
usage rights rather than privacy because
26:52
i
26:52
as a consumer and the creator of this
26:56
data stream that everybody but me the
26:59
data-driven ecosystem is actually
27:01
able to monetize i want to have more
27:06
control over
27:09
what is being done with my data and i
27:11
certainly want more transparency
27:13
and when we couch it in the language of
27:15
privacy
27:16
that usually tends to be interpreted in
27:19
a very binary way
27:20
it’s like you know oh leave me alone i
27:22
don’t want to be tracked
27:24
and that’s not where most consumers are
27:26
most consumers understand
27:29
that companies want to market to them
27:31
but then you know why
27:32
isn’t someone like target giving you
27:36
twenty dollars to go and try out a
27:38
service or something like that they
27:40
certainly have the budgets for that
27:42
maybe there’s a shortcut way that we can
27:45
bring consumers closer to companies that
27:48
want to talk to them and we don’t have
27:50
to do this big
27:51
rigmarole and
27:54
track and you know infer and whatnot and
27:58
that’s really what’s missing now
28:00
we have the technology that would make
28:01
that happen but the business processes
28:03
and the thinking hasn’t really caught up
28:05
and i i cringe at privacy
28:09
restrictions being imposed in in this
28:13
ecosystem without actually having the
28:15
conversation around
28:17
value and who is the creator of these
28:20
ultimately very valuable data sets and
28:22
who is actually seeing any value
28:24
extracted from them
28:26
because right now consumers are seeing
28:27
very very little value from the very
28:29
exhaustive data sets that they’re
28:31
generating
28:32
yeah that seems like it requires there
28:34
be an awful lot more education
28:37
and sophistication uh in the consumer
28:39
space
28:40
but also even just among practitioners
28:43
thinking about
28:44
the the broad applications that some of
28:47
these data sets can have
28:48
so you had a great example in one of
28:51
your newsletters about
28:52
if you just look at your your google
28:55
interests
28:56
settings and you went through and did
28:59
kind of an inventory of like
29:01
some of yours were like microsoft
29:03
powerpoint like yeah okay but
29:05
yeah fair play right like yeah i think i
29:09
guess that’s an interest sure yeah yeah
29:11
sure you had a portugal national
29:13
football team
29:15
yes i i’ve no idea i i
29:18
literally could not figure out how that
29:20
made its way there
29:21
but knitting you said made sense you do
29:23
you do yes okay
29:25
i i did yes so that that’s that’s the
29:27
one that was like oh okay
29:28
this is kind of maybe me but but i could
29:31
not
29:31
sherlock holmes my way back from the
29:34
portuguese
29:35
national football team you know i keep
29:38
racking my brain around
29:40
like how what what what was i searching
29:42
for
29:43
that could have triggered that kind of
29:44
conjecture and
29:46
obviously that’s a really ludicrously
29:48
silly example
29:49
but these kinds of assumptions are being
29:52
made
29:53
with uh much more relevant data sets so
29:56
let’s say i
29:56
search i hear about a you know chronic
30:00
condition
30:01
somehow or maybe through you know
30:03
somebody has it in my vicinity and i
30:05
google it and i can very easily
30:08
be classified unbeknownst to me
30:11
in
30:14
some sort of segment of people who
30:17
are likely to suffer from this disease
30:18
and then what happens
30:19
when a health insurer purchases an
30:21
alternative data set
30:23
and uses that to you know determine
30:26
price of my coverage or a life insurer
30:28
uses it to deny me
30:29
life insurance even though this could be
30:32
a completely inaccurate data set
30:34
the power of propagation
30:38
can completely ignore the fact that
30:40
its source
30:42
isn’t true and so things like that
30:45
literally keep me up at night and and
30:47
this again going back to the privacy
30:49
conversation this is at the crux of it
30:51
it’s not oh privacy it’s if you’re using
30:56
if you’re going to be using these kinds
30:57
of data sets then you have to give
30:59
people
31:00
the direct ability to control what’s
31:03
being
31:04
used for all sorts of
31:08
inferences about their lives
31:11
so what kind of control does that look
31:13
like or how you know what kinds of
31:14
interfaces
31:15
or or control sets could we expect or
31:19
could we imagine
31:20
that consumers might have
31:25
so i think i’m going to invite the ire
31:27
of the internet now if i say well this
31:29
is
31:29
one possible application of blockchain
31:32
here but so please you know don’t
31:34
everybody laugh
31:35
at once it is actually but i i do i do
31:39
see uh
31:40
almost like a marketplace where you know
31:43
different companies are saying hey we
31:45
want to
31:45
target you or we want this particular
31:48
data set about you
31:49
and that you as a consumer can say yes
31:51
yes no i you know i never
31:53
want to hear from you people again ever
31:55
don’t ever like mail me or email me or
31:57
anything like i don’t exist for for you
32:00
and obviously it’s it’s easy to imagine
32:02
something like that
32:04
and it’s i think it’s easy to
32:09
understand
32:12
the potential utility of it it’s very
32:15
very much something
32:16
we want to see built in market but it’s
32:20
near impossible to build it now with
32:22
the data infrastructure that we have
32:25
yeah that that was amazing you said that
32:28
and then your your video had uh frozen
32:30
up so it was like your mouth
32:32
was perfectly still but we could still
32:33
hear your audio
32:34
you’re frozen but we’re we’re uh are you
32:36
fro am i frozen for you
32:38
oh my gosh those of you who are hanging
32:41
on and i i do see some folks
32:42
uh hanging on and i really appreciate it
32:45
so sorry that we’ve got all of these
32:47
issues going on
32:48
uh we’re gonna try to make it as as
32:50
useful and
32:51
entertaining a conversation as i can for
32:54
those of you who are hanging in
32:56
uh i want to know texting you
32:59
yeah yeah maybe just text me your
33:01
answers and i’ll just put them i’ll put
33:03
them up
33:03
um you know i’ve been noticing that you
33:06
have been following the whole
33:07
the tiktok uh song and dance is what
33:09
i would call it
33:11
um so first of all what do you make of
33:14
the
33:14
data and privacy issues that that are
33:17
related to
33:18
chinese ownership of an app used so
33:20
widely by american and international
33:21
audiences
33:22
and and what does that even say about
33:25
you know kind of the expectations that
33:28
uh of who’s going to own
33:29
platforms and that that people are going
33:31
to be using
33:34
i think that whole story is particularly
33:38
interesting because it’s really the
33:39
first time
33:40
that we are seeing the
33:43
internet that originated in silicon
33:45
valley mostly
33:46
and the internet that originates in
33:48
china uh
33:50
interacting with one another i i think
33:52
you know for me
33:54
it goes back to we
33:57
need to have scrutiny over what any app
34:00
does
34:00
with our data um and so it can’t just be
34:04
oh you tiktok you are somehow
34:07
foreign-owned and you know you are
34:09
inherently evil
34:10
and i think you know i remember the
34:13
conversations that you initiated around
34:15
all of those
34:16
uh apps that were comparing like
34:19
faces and like aging faces and stuff
34:22
like that many
34:23
of which were not housed in the u.s
34:26
and so you know asking where does what
34:29
happens to this data set
34:30
especially if it’s a potentially
34:32
biometric data set is
34:34
really really important but as we saw
34:37
with
34:37
tiktok it can’t just be in the interest
34:40
of freaking everybody out
34:43
and you know mentioning things like
34:44
national security without
34:46
really the context around it because
34:50
even if you’re making a valid point
34:52
you’re just
34:53
it just sounds stupid to try to make
34:56
that argument without actually expanding
34:58
it to include
35:00
all data operations so that was my my
35:03
most of my reactions around that whole
35:06
spectacular mess were around oh this is
35:08
a great opportunity to talk about data
35:10
rights and usage let’s talk about that
35:12
and instead it just kind of you know
35:15
spiraled into this
35:17
very very strange scenario and much ado
35:20
about nothing
35:20
in the end so yeah you know kind of
35:23
feels like it’s a very
35:24
contemporary uh story that you know we
35:28
always see things become reduced to
35:30
their least sophisticated talking points
35:33
and then it just gets hammered in these
35:36
kind of ideological arguments and
35:38
vacuums but you know i i wonder
35:41
what are you thinking now about you know
35:44
the the
35:45
potential american uh investment in or
35:49
holding of the app like
35:50
i i i know you had included a chart
35:53
that um showed this complicated
35:56
ownership structure
35:58
that some journalist had included in
36:01
his tweet
36:02
and that it lives on actually on the
36:03
bite dance site i believe it very
36:06
explicitly and transparently explains
36:08
the hierarchy of these
36:11
holding companies and the limitation of
36:14
the u.s component of the
36:17
tiktok
36:19
discussion so so are we even talking
36:21
about anything
36:23
of significance anyway no
36:26
not at all and that’s that’s the thing
36:28
it’s a missed opportunity
36:29
and you know a lot of hullabaloo over
36:31
essentially nothing
36:33
you know we don’t get words uh
36:37
like national security involved in in
36:39
regular
36:40
discussions around like who owns cloud
36:42
infrastructure here or there
36:44
and so it’s once again we’re we’re
36:47
supposedly having a conversation but
36:48
we’re not having the conversation that
36:50
we should be having
36:51
and so you know pick any topic from
36:55
digital world and i think that
36:56
generalization will apply
36:58
and i i really mostly see it as a missed
37:00
opportunity to clarify and educate
37:03
consumers who
37:04
are uh you know not thinking twice about
37:06
installing something on their phones
37:09
uh about what can potentially happen
37:12
from that point on and you know what the
37:14
the the data exhaust that they’re
37:16
generating just by using their their
37:18
device
37:19
actually looks like i think that’s
37:20
something that’s incredibly important
37:23
from uh just a technological literacy
37:25
perspective
37:26
yeah i agree i think that’s the the
37:28
three-pronged
37:29
discussion that i inevitably have when
37:32
this kind of subject comes up is that
37:34
you know
37:34
there’s a there’s a corporate
37:35
responsibility piece there’s a
37:37
government responsibility piece and
37:39
there’s an individual responsibility
37:40
piece and
37:41
that piece i think is the easiest to
37:43
overlook but
37:44
each of each of us all people need to be
37:48
becoming more savvy and sophisticated
37:50
all the time
37:51
about the data footprint that we leave
37:54
and the way that we engage with
37:56
with different systems so so we’ve got
37:59
that
37:59
like we know that that needs to happen
38:02
what do you think needs to happen
38:03
on the government responsibility side
38:05
let’s segue a little bit into
38:07
regulations what do you see as a missed
38:10
opportunity or
38:10
or opportunity in the area of of
38:13
regulations around
38:14
around personal data well
38:18
first of all i love that three-pronged
38:19
approach because i would
38:21
venture to say that uh really the only
38:24
ones that
38:25
seem to be uh operating where they
38:28
should be right now are the companies
38:29
that is the corporate prong and that’s
38:31
inherently because
38:33
whoever owns the most data in a big data
38:36
environment
38:37
has a really really big moat over others
38:41
so this is where you know the big tech
38:43
conversation really becomes
38:45
uh particularly impactful and you can
38:48
see
38:48
the the advantage that somebody like a
38:51
google
38:52
has uh with you know
38:55
really understanding where everybody is
38:58
and what they’re doing
38:59
at any time of day essentially uh i
39:02
i think when i look at regulation again
39:05
especially here in the us
39:06
i see a lot of reactiveness and a lot of
39:10
you know skating way behind the puck
39:14
to borrow from wayne gretzky
39:17
and you know solving problems from
39:20
several years ago and the rate of
39:24
advance of big data agglomeration and
39:28
aggressive acquisitions uh you know
39:31
that big tech is adopting
39:33
as a growth strategy it just doesn’t
39:36
leave room for you to
39:37
even be you know six months behind let
39:40
alone
39:40
you know three to five years and so i i
39:44
unless we have this type of
39:48
consumer data protection
39:51
environment
39:52
i look at things like gdpr as an
39:55
inspiration
39:56
and a way to adopt something that can be
39:59
significantly improved upon
40:00
in actual implementation but like a good
40:03
first step
40:04
unless we have that layer that’s really
40:06
consumer friendly we’re
40:08
always always always going to be way
40:10
behind
40:11
big tech on this so it’s no longer a a
40:14
tri-pronged
40:16
approach you have you know one
40:19
prong that’s very strong and then two
40:21
that are kind of
40:23
not not really doing very well and that
40:26
you know that does not
40:27
look very balanced and i hope
40:30
that that is going to change as we
40:34
get more uh regulators and legislators
40:38
who are uh more
40:41
attuned to the tech landscape in general
40:44
and are more tech savvy
40:46
i i don’t think we can afford more
40:48
regulators who are you know endlessly
40:51
fascinated
40:52
by how search works in the year 2020
40:55
uh that that’s just not not acceptable
40:58
and so we we need to to level up
41:00
on that front when i say we i largely
41:02
mean the us
41:04
but this will stand for for most other
41:06
uh countries although
41:08
amongst the oecd group we are doing
41:10
quite badly
41:11
um when it comes to this this type of
41:14
maturity and
41:14
in regulation and really consumer
41:17
protection
41:18
yeah and the the opportunity seems like
41:22
there are so many other kinds of
41:26
entities that are proposing their own or
41:27
coming up with their own types of
41:29
regulations like
41:30
um the amsterdam algorithm
41:34
register or whatever they’re calling it
41:35
the register of
41:37
algorithms in amsterdam uh or is it
41:40
amsterdam or is it
41:41
the netherlands as a whole i can’t
41:43
recall i don’t know if you remember
41:45
um but that seems like you know there’s
41:48
this opportunity to have algorithmic
41:50
transparency and make
41:52
seemingly black box algorithms more
41:54
approachable
41:55
uh to everyone and create accountability
41:58
like uh
42:00
what opportunities do you think exist
42:02
around around that space
42:04
right now but i think we we talk about
42:07
algorithms so much but we talk about
42:10
them
42:10
uh as if there’s some like magical
42:13
entity
42:14
and like mystical power that does
42:16
something
42:17
and whatnot and it’s not it’s
42:20
we human programmers put in inputs and
42:23
put in constraints and reflect
42:25
their perspective in the world and then
42:27
machine language
42:28
uh interprets that and and channels that
42:31
back to us
42:32
and i just want everybody who has said
42:35
something along the lines of well
42:37
the algorithm does this just to
42:39
replace that
42:40
with you know the people who program
42:42
this algorithm
42:43
have done so and so because i think it’s
42:46
it really removes responsibility
42:50
from us as authors of these
42:54
technology solutions and you know
42:56
technology is
42:58
is still very homogeneous
43:01
the people who get to build it
43:04
overwhelmingly across the world
43:07
look a certain way and and come from a
43:10
certain background
43:11
and and think very similarly to one
43:13
another
43:14
and unless we can figure out how to
43:16
democratize this
43:18
so that everyone is truly involved we’re
43:20
always going to have
43:22
an inherent bias reflected in the
43:24
algorithms even if
43:26
the the uh authors of software have the
43:30
most
43:30
noble intentions ever it’s just
43:34
by design that’s how this works uh so
43:37
i’m i’m particularly excited about
43:40
things like the no code movement
43:42
and really removing the level of
43:45
knowledge
43:46
one needs to have to even play
43:49
in the arena of developing and and and
43:52
having a stake in technology
43:55
and i think that that’s going to be the
43:56
next uh
43:58
where i get excited about the
44:00
interaction and intersection of humanity
44:03
and technology
44:04
it’s around things like that like i want
44:07
to have my
44:08
my you know my four-year-old niece
44:10
should be able to
44:11
program a bunch of stuff by drawing or
44:15
you know using the tools that she
44:16
normally has i don’t want to live in a
44:19
world where she needs to
44:20
learn eight specific languages and you
44:23
know go to school for
44:24
20 years to be able to like interact
44:26
with a machine like that
44:28
you know i i can say that because that
44:29
was my path i i have a computer science
44:32
and math degree i’ve done
44:33
all of that and i don’t want that i
44:36
don’t think we
44:37
we have to live in that kind of world we
44:39
can unlock a much much better more
44:42
equitable world for everyone
44:43
and that’s that’s what i’d really like
44:46
to see as the the next phase
44:48
i don’t know what i was gonna say i don’t
44:50
know if that actually answered your
44:52
question oh well
44:53
i think it’s a great answer uh whether
44:56
it answered
44:57
that specific question or not
44:58
so that feels like a very
45:01
optimistic view
45:02
of you know what what can happen i want
45:04
to know what else
45:06
you are optimistic about when it comes
45:08
to how technology
45:09
can can shape human experiences or what
45:12
we can do with
45:13
with data and emerging technology to
45:15
improve the meaningful experiences
45:17
of humans in the future
45:21
yeah i want to say this has been a very
45:24
trying year for
45:25
optimism especially technological
45:28
optimism
45:30
because you know we we we readily see
45:34
that there are solutions available that
45:36
we’re just not
45:37
using or not using well um i’m
45:40
greatly energized by uh
45:43
not having
45:47
silicon valley be the only place in the
45:50
world where
45:50
we innovate in technology
45:53
anymore i think that
45:54
there are so many interesting
45:56
initiatives and and the cost
45:58
of participation has kind of
46:01
lowered itself significantly that you
46:04
know now
46:05
kids in the most remote areas of the
46:07
world can start building cool stuff on
46:10
their mobile phones can start building
46:12
businesses just by using chat apps and
46:14
similar and that’s just been
46:16
really really wonderful to see i i think
46:20
we here in in the west and oecd
46:23
countries can learn a lot
46:24
from these new approaches and and really
46:27
new
46:28
creativity bubbling up so i
46:31
i tend to think a lot about how can we
46:33
elevate
46:35
really cool creative projects that
46:37
aren’t originating here
46:39
how can we find vehicles to fund them
46:41
and to support them commercially and you
46:44
know i like to think that in
46:46
10 years time just looking at how much
46:48
things have changed
46:50
never mind the world but in like the new
46:52
york tech ecosystem
46:54
in a decade and then extrapolating that
46:57
to the entire world like that’s
46:59
genuinely exciting to me
47:01
and i think the more we’re um
47:04
innovating now the closer we are to like
47:07
bleeding edge between
47:08
different spaces so if you’re if you’re
47:11
good with
47:12
data analysis you can do that
47:15
in uh you know you can apply that to
47:18
biotech and you can apply it to
47:20
climate change and you can apply it to
47:22
logistics and you can apply it across
47:24
many different dimensions
47:25
so i think we’re at the cusp of going
47:28
from that
47:30
very specialized look
47:33
at our careers and and our individual
47:36
focus on a very like factory driven
47:39
kind of way like oh i work on this
47:42
particular part
47:43
and i’m very good at this but i don’t
47:45
know anything about any other parts of
47:47
the car that i’m putting together
47:49
to a much more renaissance approach
47:53
to knowledge where you’ll have you know
47:55
the kind of the building blocks and the
47:57
critical
47:58
uh knowledge that that you’ll need to
48:01
command whether that’s
48:02
high math literacy analytical skills
48:05
ability to process
48:07
a lot of data and extract value from it
48:10
similar
48:10
that can be applied across many many
48:12
different things and every time
48:14
in the history of humanity when we’ve
48:16
had that kind of blending event
48:18
something magical has come out on the
48:20
other end
48:21
so that’s my that’s where i catch my
48:25
optimism now is that i think we’re in
48:27
in that kind of acceleration phase and
48:30
that could be really really wonderful in
48:32
the next
48:32
decade that’s a great way to think of it
48:36
i do have a question from the audience
48:37
uh shawna carp
48:39
asks a question does the math background
48:42
of a lot of algorithms
48:43
create problems around abstracting into
48:46
no code
48:49
it shouldn’t we shouldn’t be
48:53
creating barriers we should be finding
48:56
ways to explain
48:58
complex math to people who don’t speak
49:00
complex math
49:02
and so you mentioned knitting this has
49:04
been one of the big revelations for me
49:07
in in the last couple of years and one
49:09
of the reasons why i picked this up as a
49:11
hobby
49:12
so you know i might my grandmother was a
49:15
great crafter
49:16
and she many many years ago tried to
49:18
teach me how to knit and i was like yeah
49:20
this is not interesting whatever
49:22
um until uh literally last year
49:25
when i looked at a knitting pattern
49:28
and i just had this revelation that oh
49:30
my god this
49:31
is code because it is code it’s exactly
49:34
that it’s it’s an abstraction
49:37
you need to visualize and then you need
49:39
to make and translate with what you’re
49:41
seeing
49:41
on a piece of paper or screen or
49:43
something like that and i thought
49:45
you know some of the first programmers
49:48
of computers were women for an obvious
49:52
reason because we had the skill set
49:54
based on generations of crafting and
49:56
telling stories in this
49:58
level of abstraction so somewhere
50:00
between that first
50:02
step and the world we live in today
50:06
somebody inserted this oh no it’s all
50:08
math
50:09
thing in between and didn’t look at
50:13
disciplines like crafting and knitting
50:16
and didn’t have the skills to look at
50:18
that and go
50:19
wait a minute this is exactly what i’m
50:21
looking for
50:22
so we had a completely different set of
50:25
mental models
50:26
shifted and put on top of this world it
50:29
doesn’t have to be that way
50:30
so i think as mathematicians now there
50:33
is
50:34
you know mathematical research and and
50:36
furthering that but i think we
50:38
have a lot of opportunity to really
50:42
demystify
50:43
uh and think about math as you know lego
50:46
building blocks
50:47
and and create that universe so that
50:50
more and more people
50:52
don’t have to use that pathetic excuse
50:55
of oh i’m not good at math
50:56
i don’t know how to do this because
50:58
the you know the way they were
51:00
taught didn’t resonate in their
51:02
brains and it’s not just math i think
51:03
it’s every
51:05
every uh discipline especially in in
51:08
science or in stem fields
51:11
can be interpreted in a much much more
51:14
user-friendly way and that that’s we as
51:17
as professionals and leaders should be
51:19
focusing on that yeah and uh shayna
51:22
sorry shayna that i didn’t pronounce
51:23
your name correctly it’s right there in
51:25
your
51:25
in your username you spelled it out it’s
51:27
shayna
51:29
i said it wrong sorry she said i wish
51:31
someone had said this to me in high
51:33
school
51:34
and i think that is the sort of tragedy
51:36
of how we’ve
51:37
taken technology and computing and
51:40
programming and made it
51:41
such a uh a remote discipline made it
51:44
feel so
51:45
hard to get to that you have to go
51:47
through so many hoops of math and
51:50
code and logic and all these different
51:52
things which are all very valuable
51:54
to know but they aren’t necessarily
51:57
what we need in trying to
52:00
make this a democratized movement right
52:03
like we we need people to be able to
52:04
think about
52:05
technology fundamentals in ways that
52:08
they can actually
52:09
uh bring holistic and integrative
52:12
approaches to
52:13
and that’s not going to happen if people
52:14
are thinking of it as some far-off
52:16
distant thing that they can never reach
52:17
because they aren’t good at
52:19
10 things that aren’t necessarily
52:21
necessary
52:22
to it so yeah exactly and look i you
52:25
know i’m a terrible guitar player
52:27
seriously
52:27
awful but i still play because there’s
52:30
enjoyment in playing and there’s
52:32
enjoyment in
52:33
knowing enough that you can get
52:36
something back from it
52:37
and and i wish we had a very easy way to
52:40
do that across other disciplines
52:41
especially intellectual disciplines
52:43
especially stem disciplines uh
52:46
and and i think that’s a that’s a worthy
52:47
goal for the next generation of
52:49
technology and also why you know i love
52:51
stuff like no code and
52:53
and those kinds of movements because it
52:56
removes
52:56
that impulse to be a gatekeeper
53:00
and to kind of go oh but i went to you
53:03
know
53:03
high school and college and graduate
53:05
school and i’ve learned all this stuff
53:07
and now you must suffer too
53:09
like no that’s the wrong impulse here
53:14
yeah uh i also want to um
53:17
i want to use this this sort of wrapping
53:20
up moment of your you’ve already got
53:21
yourself into this optimistic headspace
53:24
i want to ask you
53:25
you know as i think about how we think
53:27
about the um
53:29
the opportunity to build better futures
53:31
with tech
53:32
and how we can stand a better chance
53:35
in culture uh in maybe in
53:38
the corporate world in the advertising
53:40
world or wherever whatever scope you
53:43
think is most appropriate for this
53:45
question like
53:46
what do you think we could do to stand a
53:48
better chance of
53:49
creating or bringing about the best
53:51
futures with tech
53:53
rather than the worst futures and you’ve
53:55
already spoken to no code and you know
53:57
maybe some
53:57
reforms to education around technology
54:00
but what else what what kind of
54:02
seems like it’s going to make the
54:03
biggest difference in terms of affecting
54:05
people’s lives
54:09
honestly i i think there has to be some
54:12
type of
54:14
uh whether economic incentive or or
54:17
uh similar
54:20
to you know incentivize
54:23
the development of good technologies
54:26
because
54:27
nefarious technologies are very easy to
54:30
monetize
54:31
and by by nefarious you know i do mean
54:34
like literally you know
54:36
dr evil and james bond villains kind of
54:38
thing
54:39
but when you think about how much of the
54:42
development of modern day technology is
54:44
tied back
54:45
to uh you know military investment and
54:49
similar like is
54:50
is there a way that we can funnel those
54:54
same investments and money
54:56
but have a different goal
54:59
in mind for society and i’m greatly
55:02
encouraged by
55:04
uh the approach that new zealand has
55:07
taken here
55:07
where they’ve you know identified not
55:10
just the pursuit
55:11
of uh increasing their their gdp
55:15
as a goal but really measuring and
55:17
quantifying and
55:19
holding themselves accountable for
55:21
improving
55:22
society and citizens satisfaction with
55:25
the society that they live in
55:27
that’s a really really powerful concept
55:29
that i think extends beyond
55:31
just technology i think technology can’t
55:34
operate in a vacuum it always has to
55:36
have
55:38
some connection to the people it’s
55:41
serving and the use cases that it’s
55:43
meeting
55:44
and you know it’s it’s just
55:47
it’s the tragedy of our our day is that
55:50
it’s
55:51
easier and possibly commercially more
55:53
advantageous
55:54
to just you know develop crap
55:59
yeah i don’t have strong opinions about
56:01
this
56:04
but so there’s some economic incentives
56:07
for
56:08
alignment right it sounds like what
56:10
you’re saying is um
56:12
the best opportunity and i i see
56:14
something like this in my work
56:16
too that we need to find ways to align
56:18
business incentives with
56:20
human outcomes and the more we can do
56:23
that the better chance we’re going to
56:25
have
56:25
at scaling you know great technology as
56:29
well as great other
56:30
business-driven experiences so that
56:33
that’s a
56:34
a great thought to to close with hey um
56:37
thank you so much for persisting with me
56:40
through all of the
56:41
internet connectivity issues and thank you to our
56:43
audience for persisting with us
56:45
through all of the connection issues uh
56:47
while we’ve got you still here
56:48
how can people find and follow you and
56:50
your work online
56:53
uh so i’m the principal and co-founder
56:55
of sparrow advisors we’re a management
56:57
consultancy that helps
56:59
brands technology companies investors
57:02
and really everybody
57:03
in the general digital space work better
57:06
with one another and
57:08
and create commercial advantages out of
57:10
some of these newer technologies
57:12
so uh sparrowadvisors.com is uh
57:15
our home on the internet uh we have a
57:18
the weekly
57:19
strategy newsletter that i i you know
57:21
i’m slightly biased but i think it’s a
57:23
fascinating read because
57:24
it’s really good it’s one of those
57:27
things that if if we weren’t developing
57:29
it i wish somebody else was doing it so
57:31
that i could read
57:32
it so that’s you can also find a link to
57:36
subscribe to that on our website and
57:38
then i tweet a lot
57:40
and you can find me as aexm on twitter
57:43
that’s my my playground for testing a
57:45
lot of these larger topics and ideas is
57:48
there and
57:48
uh yeah it’s you know fun so it is
57:52
that’s me all right
57:54
thank you very much uh thanks to
57:56
everyone out there and
57:57
have a wonderful rest of the day thank
57:59
you ana for being here
58:01
thank you
