Episode 26: The Science of Depolarization (Interview with Dr. Lisa Schirch)
(Music) Society builders pave the way to a
better world, to a better day.
A united approach to building a new society.
Join the conversation for social transformation, Society Builders
Society Builders with your host, Duane Varan.
(Duane) Welcome to another exciting episode of
Society Builders, and thanks for joining
the conversation for social transformation.
Today we continue our sequence of episodes
exploring the science of depolarization, how we
can help bring antagonistic groups closer together.
Now, most of our previous episodes on this
theme have explored solutions, and in some ways
we've been talking about solutions before fully exploring
the underlying problems and their causes.
So today we're going to go back a step and
dive in further on why we're seeing this explosion in
polarization globally, what's behind it, what's helping cause it.
And today we're going to explore just
one of these major drivers, an accelerant
that's greatly fanning the flames of polarization.
And here, of course, I'm talking
about social media and how social
media contribute to this polarization disease.
And my guest today is one of the
world's leading authorities on this question, Dr.
Lisa Schirch, who's the Senior Professor of the
Practice of Peace Studies at the University of
Notre Dame and who is also a Senior
Research Fellow at the Toda Peace Institute.
She's the author of 11 books and
countless academic articles exploring this theme, including
her most recent book, Social Media Impacts
on Conflict and Democracy: the Tech-tonic Shift.
So we're incredibly fortunate to have
the benefit of her insights today.
So Lisa, welcome to Society Builders. (Lisa) Thanks so much.
Great to be here.
(Duane) It's such a thrill to have you.
Your research has really been focusing on such a
crucial question in this whole polarization arena, and that
is about the role that social media play as
this accelerant that's feeding this whole polarization machine.
That's what we really want to talk about today.
It's kind of like there was this romantic period where
when we were talking about technology, we talked about it
in terms of how it was going to give voice
to people, how it was going to unite people, how
it was going to improve democracy.
We're not talking about it quite like that anymore.
Now it's like technology has suddenly become
unbridled. And now we're fearing the consequence.
I mean, it's a very different way that we view social
media today than perhaps we did a few years back.
So maybe we can start here and talk about
how it is that social media contribute to polarization.
(Lisa) Sure, great question.
A lot of people will point to the
fact that there was polarization before social media.
So it's not the origin of
why people disagree with each other.
And there are other factors also.
So I want to point out that political polarization through
radio, through legacy media, you know, TV shows like Fox News and
MSNBC, for example, we've had this creation of partisan media,
and that is also part of polarization.
But I think the contribution of social media
to this phenomenon of increasing dehumanization and hatred
of each other, really, it's played a
big role, because the way that social media
is designed rewards bad behavior.
So it actually amplifies the worst aspects of human
behavior. Imagine us driving down the highway and there's
an accident, and everyone slows down to look at the
accident. The same way on social media, when there's
an argument, that's what draws people's attention.
Social media is a lot like an
amphitheater or more like a coliseum, where
there's gladiators fighting in the center.
So most people on social media are just watching.
I think it's often just 1% of people on social
media that are behaving really badly, arguing with each other
in a very dehumanizing way, but it's contagious.
So while many people might not be
arguing online, it's affecting people and what
they think of the world online.
And I would say it's the design of
social media platforms as gladiator arenas where people
come to fight, and the whole design of
how it helps people watch that fighting.
We can design social media in a different way.
It doesn't need to be amplifying those fighting
people by putting them on the stage in
the middle of all of us.
And I think that it's what we
call algorithms on social media that drive
our attention to see who's fighting today.
So the first thing when you open your computer
in the morning, or you turn it on and
you look at your Twitter or your Facebook, the
algorithm is going to show you who is fighting.
And so that's how it's a gladiator arena.
(Duane) So, Lisa, one of the themes we want
to explore today is how intentional the social
media push to polarization really is.
I think people might assume that polarized content is
naturally floating to the top, so to speak.
People are clicking on content, and that's
naturally resulting in polarized content being viewed.
But actually, that's not the whole story.
It's not happening naturally or by popular selection.
There are actually very deliberate forces that
are biased to amplifying the polarized content,
giving it a disproportionate voice.
And there are a variety of reasons
for this, which we want to explore.
So, first, let's talk about commercial reasons
why this kind of polarized amplification occurs.
What commercial interests drive this amplification?
(Lisa) Great question.
So why do companies want their
algorithms to highlight who's fighting today?
Because, yeah, the profit is related
to how long people stay online.
So they make more money the longer each
of us stays on their platform, because they
show us more ads, the longer we're there.
So they get money from advertisers for the minutes that a
user is on the platform looking at that specific ad.
So they get financially rewarded for keeping us there.
What keeps us there are the fights and the arguments.
But we also have to realize that the
longer we're on the platform, the more information
these companies are gathering from us.
So they're extracting personal data from each person on
any of the platforms about who they're friends with,
what they like, where they are, and all of
that data is then sold to advertisers to be
able to target ads more clearly and precisely.
So there's this twofold profit model where they want to
have us stay there to watch ads, and they want
us to stay there and watch other people's content so
that they can gain more information about what we like
and what ads to feed us in the future.
So that makes sense.
So there's this big money sign over
us staying on the platforms longer.
And emotional content, what we say is
sort of false and deceptive and hateful
content keeps people there longer.
This is the neuroscience part of it.
It's often referred to as the attention economy.
The idea that each of us
has a limited attention every day.
And all those tech companies, Netflix, eBay, Amazon,
Facebook, they're all fighting for a minute of
our day because they make more money the
longer we're on their sites.
Not all of the sites are making money
in the same attention sort of way.
It's mostly the social media companies, but
they are all fighting for our attention.
So when you have a social media company that
is letting users make their own content, users
learn that making outrageous content gets them more attention,
because the algorithm is driving people toward extremist ideas,
angry, hateful ideas. What we know is that politicians
in Europe actually figured this out: if they
just posted a regular campaign ad, they would not
get very much engagement.
But if they used inflammatory language, sort of
very emotionally engaging language, then the algorithm on Facebook
would show that to lots of people.
And so politicians are like, you're making
us more polarized, because you've incentivized us
to be outrageous in our political ads.
And so this is what we call algorithmic
extremism: algorithms that reward extremist content.
It's turning up the heat in all of our conversations.
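Lisa's point about algorithmic extremism, ranking content purely by predicted engagement, can be sketched as a toy example. This is a hypothetical illustration, not any platform's actual code; the posts and engagement scores here are invented for the sketch.

```python
# Toy sketch of an engagement-maximizing feed ranker.
# Hypothetical illustration only: the sole ranking objective is
# predicted engagement, and inflammatory posts tend to score highest.

def rank_feed(posts):
    """Order posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

# Invented example posts with invented engagement predictions.
posts = [
    {"text": "Here is my policy platform.",        "predicted_engagement": 0.02},
    {"text": "Cute photo of my dog.",              "predicted_engagement": 0.05},
    {"text": "You won't BELIEVE what THEY did!!!", "predicted_engagement": 0.30},
]

feed = rank_feed(posts)
# The inflammatory post lands at the top of the feed, even though
# no one explicitly chose to prioritize outrage.
print([p["text"] for p in feed])
```

Because nothing in the objective penalizes inflammatory content, maximizing engagement alone is enough to push it to the top, which is the incentive the politicians in Lisa's example learned to exploit.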
(Duane) So we see how this commercial
imperative contributes to the problem.
But it's actually a lot more sinister and
malicious than this, because there are also other
forces driving this amplification of polarized content.
And perhaps the most malicious is the
cyber warfare that is taking place here.
And this is all the more sinister because
there are often foreign governments acting to destabilize
our countries that are behind this.
Lisa, can you comment on how these
kind of malicious forces drive polarization?
(Lisa) Foreign governments, and especially Russia, learned how Facebook
works a long time ago, and it was
before the 2016 election in the United States.
And we know this because a bipartisan, Republican and Democratic,
Senate Foreign Relations Committee did a big research project
on how Russia interfered in the US election in
2016, and they interfered by gaming this system.
In fact, foreign governments at that point
could actually profit from creating fake ads
about Hillary Clinton going to prison, creating
a meme with Hillary Clinton behind bars.
And this would then spread like wildfire
because extremist content gets more attention.
And there is an affordance, a design feature
on Facebook that lets you share things with
other people, and that feature wouldn't have to be there.
And one of our asks in the 2024
elections is, stop the share button during election
seasons, because this is how false and deceptive
information travels so far and so fast.
But we know that Russia was doing
that to a bunch of different candidates,
so they were playing different sides.
They were also pretending to be
Black activists in the United States.
And in those Black activist chats, Russia was
posting information like, you know, why would
anybody want to vote for Hillary Clinton?
We should all just stay home.
So they were pretending to be different Americans.
And the goal is really to divide Americans,
to undermine democratic processes, to make us doubt
political leaders, to doubt our media.
It's sort of amplifying disinformation, but also leading
to a collapse of what is true.
The collapse of truth and the collapse
of trust, social trust, public trust.
We can't trust who is on our side.
All we sort of see online is chaotic
information that's very, very polarized, and it really
just sorts people into us versus them.
(Duane) There's a certain irony here, because a lot
of this feels like it parallels what was
happening in Iran in the 1953 coup.
And this is well documented.
I mean, it's not a conspiracy theory I'm sharing.
This is now more than 70 years later.
This is all out in the open.
But what would happen is the CIA would pay
these protesters to go march in the streets with
these anti-Mossadegh messages, and then they would
pay the same protesters to walk the other direction
with these anti-Shah messages.
So they would just walk back and forth,
just feeding the passions on both sides.
And now, ironically, you have state actors, including
Iran, including Russia, and many others, who are
now kind of like bringing that same kind
of tactic now into the cybersphere. (Lisa) Absolutely.
And they're called chaos actors.
It's not really clear and easy to see foreign
operatives on social media because they're causing chaos.
They're causing the collapse of information and how we
think and make sense of the world today.
And so I think we need to understand how
chaos contributes to polarization, because before there was toxic
polarization all over the world, we disagreed politically with
many of our neighbors, but we didn't think that
they were evil or hateful.
And if we had a question about why do you believe
this or that, we could actually talk to them.
And so it's sort of this chaotic and hateful
environment now that we see on social media that
is just fueling sort of people disengaging from politics
in many ways and just deciding this is too
chaotic, too angry, too hateful.
And it encourages passivity, actually.
It encourages just opting out of voting and engagement.
(Duane) I'd like to go back to what you were talking
about, the algorithms, and how these algorithms really build on
some underlying passions, again, not in
a disinterested or objective way, but very clearly
with this bias toward accelerating it.
But still, one of the things which is really
interesting about this, and you talk about this in
your book, the limbic neurophysiological human, the reptile human,
what Baha'is would call this conflict between our lower
nature and our higher nature, our animal self and
our transcending of our animal self, our more divine
self, if you will, this battle that exists.
And what seems to be playing out is that
those mechanisms of our lower nature, our passions, are
being fed and fanned in this way.
And that is what gets this fire going so rapidly.
And it just seems like that opportunity for
a more rational, more measured, more thoughtful,
more insightful, that kind of more elevated conversation
gets lost in that traffic somehow.
So that seems to be this underlying
tension that is this engine, this catalyst
that drives these algorithms so radically.
(Lisa) So really, neuroscience is behind
a lot of conflict behavior.
When human beings can sit down calmly,
breathing regularly, we can solve problems together.
We can disagree about issues, but we can maintain
a sense of human dignity, relationship with others.
And we have our prefrontal cortex.
I'm pointing to my forehead right now.
We have this amazing brain, as humans, that
can link up with other people's brains and
figure out creative solutions to problems.
And really, that's what conflict resolution, conflict transformation,
peacebuilding, that's what that's all about: trying
to create a setting, a condition where people
can be their best selves and work with
other people to find creative solutions to problems.
So, as a mediator, for example, I'm used
to sitting in a room with people who
are very angry with each other, disagreeing about something.
And it's my job as a mediator to lead them through
a process of moving from the reptile part of their brain
in the back bottom of the brain stem, and trying to
sort of create enough safety in the room that they can
come up to their forehead, where their thinking brain is, where
they can actually solve problems together.
And so even before all this social media, I
think neuroscience really underlies a lot of the process
of conflict transformation, moving from just an emotional response
to be able to think and be mindful of
our ability to solve problems together.
So I think when you think about neuroscience
and social media, there's a few things happening.
It's showing us the emotional content
which keeps us at that brainstem.
My colleagues at the Center for Humane
Technology call it the race to the bottom
of the brain stem on social media.
So it's this race to show us the most outrageous
emotional content to keep us engaged at an emotional level.
But it's also just sort of this addictive comparison.
Like, there's so many dynamics on social media
that keep us on these platforms, and it's
been shown to really have a negative effect
on most people, especially children.
All this social comparison that happens with, here's
what I'm doing, here's the fun thing, here's
how beautiful I look, and all these selfies.
It's such a weird culture, and it's bringing out
really bad parts of humanity where we don't feel
like we're all one and we're connected, actually.
It makes us feel alone,
lonely, inferior, insecure, depressed, anxious.
The levels of all of these emotional issues among
children using social media are skyrocketing, and it correlates
directly with the start of social media.
So it's polarization that's happening online because of
some of the content, but also just this
dynamic of what it's doing to our brains.
(Duane) Now let's go to a related problem, and I say related
because there are a lot of parts to this other problem
as well that also contribute to the polarization issue.
And that is what is happening with journalism.
Oh, my God, journalism.
I mean, we're all seeing it like it's
happening right in front of our eyes.
The idea of objective journalism is dead
as a doornail. I work with most of
the major news organizations in my professional capacity.
Nobody aspires to objective news anymore.
It's like objective news is seen as a ratings killer.
It's a horrible thing to see.
But let's talk about this.
I often talk about how the idea of objective news
was itself the byproduct of the invention of the telegraph.
Because before the telegraph,
news was inherently partisan.
People subscribed to the newspaper
that reflected their perspective.
And so this kind of perspective to a
story was what differentiated a news outlet.
But the only thing that was
more powerful than perspective was speed.
And so if you were in the Civil War, if you
were at a battle scene and you could report on that,
and if you could get that story back, like near instantly,
that would trump any kind of partisan perspective.
And so the wire services were born because
it was much more cost efficient to have
one version of the story distributed to everybody.
And the telegraph was massively expensive.
And so this need to have scale to the
story really created this idea of objective news.
And really, it's been in the past 20 years
that we've seen the underlying technology that, if you
will, gave rise to objective news get challenged largely
by digital technologies which were, of course, much faster,
very different, but which are dramatically changing the dynamic
of what journalism is.
I don't even understand it: anybody who goes to
a journalism program is still trained to be this
objective journalist, but somehow you graduate and you go
into a professional news organization, or even worse, into
the social media sphere, and something happens and you
become a very different kind of journalist overnight.
So this is a major crisis because this is the
diet that we're being fed to understand the world around
us, and it's just not clear what that is anymore.
(Lisa) Well, I think that's a really interesting set
of observations about the history of media. Sure.
Polarization, again, is a long-term human phenomenon.
I think there are some similarities.
I talked about TV shows being more polarized now,
too, because as technology has advanced, it's also much
cheaper to create your own radio station, your own
podcast, your own TV station, the newspaper.
And so as the costs of technology
decrease, more people create these things.
And then actually all media becomes
more polarized as people start listening
to things that reconfirm their beliefs.
So there's part of that that's happening.
But I think there's some real differences between traditional
media and social media, and that is with traditional
media, you still have editors such as yourself.
As an editor of this podcast, you get to
decide what part of what I'm saying you keep.
But an editor at the New York Times or at NBC
News is deciding what are the stories, what are the priorities
to say, and how are we going to tell these stories
where on social media there are no editors, right?
So it's everybody talking to everybody.
There's no journalist or editor or editorial board
who's just making these decisions, kind of trying
to follow some semblance of professional journalistic ethics
or thinking about the public interest.
So I think what we've seen is sort
of the weaponization of the democratization of media.
When social media first started, we thought of it
as the democratization where everybody can now publish.
It can be citizen journalism, and we can
have human rights defenders all over the planet
posting their stories and sharing and starting social
movements like we saw with the Arab Spring.
But yes, that still is happening.
And actually, social media has really democratized
access to tell your own story.
At the same time, we have millions
of cyber armies around the world posting
false, deceptive, divisive, polarizing information with the
goal of dividing and undermining, splitting humanity.
And so it's actually the same thing that's happened with
weapon systems, because weapons are now also democratized, where it's
really easy to get a machine gun now.
And democratization of access to powerful tools like weapons
or media has an upside and a real downside.
I'm not sure there's an upside to weapons, but
anyway, with media there are consequences to this.
(Duane) In your book, you have a number of examples, a
number of studies that you cite that are really fascinating.
I'd like to share some of these
with the audience because they're so amazing.
There was a study that you talk about that
the Wall Street Journal did, where they found that
the YouTube algorithm is more likely to
recommend misleading content over reputable news.
I mean, that's scary, right?
Other things being equal, you would think
that an algorithm would pick the more
reputable, the more reliable news.
But here's the study saying, no, it's the opposite.
You have these rumor cascades, and there's this
MIT study that you talk about where false
news will spread faster than true news.
Again, what a shocker.
How scary is that?
You have this whole problem with people relying
increasingly on social media for their news.
I mean, half of the world's
population subscribe to Meta or Facebook.
The people who get their news from
social media are far more likely to
have an inaccurate picture of the world.
They're much more likely to
buy into conspiracy theories.
You have the problem with the whole echo chamber
phenomenon, where because of the algorithm, looking at what
you're clicking on, it's giving you more of what
you're clicking on, and that accelerates.
And so it's easier to believe what
you already believe is what everybody believes,
because that's the reality that you see.
And it becomes harder and harder for you to
even understand that there can be an alternative view,
because you never see the alternative view.
All you see is the people who have the view
that you already have reinforced more and more and more.
So it just seems so consequential in terms
of how people understand the world around them.
(Lisa) Absolutely.
I think the other part of this journalism piece
that I wanted to mention is also how the
profit model of social media, which is making money
on showing people ads, has undermined the profit model
of public interest news sources.
The more professional journalistic programs, on TV or
in newspapers, some of them also rely on ads,
and their subscriptions have vastly declined.
So many local newspapers now have gone
out of business because of social media.
So we actually have fewer professional journalists doing the
work while we have more people getting fake news
and reading what China, Russia or Iran are planting
in the US social media sphere.
Hiding as Americans, basically.
(Duane) And it's not only that, it's also that
it changes the way that journalists practice journalism
because they have to respond to the pressure.
So just in my own interaction with journalists, just
to give an example, I remember a decade ago
when I would be called by a journalist for
an interview, what I said wasn't good enough.
Like, if I said, this company is my
client, they didn't go on what I said.
They had to verify that.
So they would call the company and they would say,
Duane says that you're his client.
Are you really his client?
Like, there was this need for verifying the accuracy
of the story, and there was a lot of
effort that was put there. But that takes time.
And when you're competing with the speed of social media,
you just don't have the ability to do that anymore.
As a professional journalist, by the time you do
it, the story will have come and gone almost.
So what you find is that
professional journalists now are cutting corners.
Suddenly, they don't do the verifying.
Now they get the story, they run with it, and
you get all these incredible, ridiculous instances of major errors
that journalists do because they're not the same journalists that
they were a decade ago because of these influences that
have come into their game, so to speak.
(Lisa) I think you're absolutely right.
And there's sort of an infantilization, actually,
of all journalism, because on social media,
highly emotional content is keeping people so
emotionally aroused all the time.
And then they expect also their
regular news to be emotionally engaging.
In the same way, what we learn
about cognitive development is that for children and young
people, things that are emotional are overwhelming
more often than for adults.
The whole process of becoming an adult is learning
how to acknowledge your emotions, manage your emotions, and
to be able to reflect on your emotional state
and control it in some states, or manage it.
Not that emotions are bad.
Emotions are good.
They indicate to us when we feel strongly.
But as an adult, you're expected to be able to reflect.
When I'm feeling angry or outraged, I'm going to
moderate my behavior and how I interact with others.
But what media now does to us is
say that emotional engagement is profiting me.
So I'm going to keep you emotionally
engaged as much as I possibly can.
And real journalists and news organizations are
finding that sort of this infantilization of
keeping people outraged is profiting them too.
And so I guess this is the question then, for humanity.
We cannot solve problems when
we're all completely emotionally engaged.
How do we move people to the frontal
cortex, to the front of their brain, where
they can solve problems, where they can think
rationally and make sense of complex information?
This is degrading the IQ of humanity, and it's making
it harder for us all to solve other problems.
Migration, climate crisis, pollution, water shortages,
all the many things that are
facing societies all over the world. Yeah.
(Duane) This idea of shifting from, as you say, the
lizard self to the rational self, it seems like
that's the dilemma of our age, right?
I mean, when you think about the ultimate remedy
to these problems, it's not a particular social policy.
It's about this much larger problem that we have
of feeding this base self and trying to get
that to transcend to the higher version of ourselves.
I mean, that just seems like
that's what it's really all about.
At the end of the day, what
I'd love to do is just to
help illustrate how incredibly consequential this becomes.
You have a number of case studies.
I love the case studies looking at different nations.
And one of the ones that really stood out
for me was the story of what happened in
Myanmar, the country formerly known as Burma.
I'm sure everybody is really acquainted with
this whole plight of the, you know, Rohingya Muslims
and their forced migration into Bangladesh.
I mean, it's such a very sad story, but I
think what people don't know and the story that they
don't understand is the role that social media played in
the events that led ultimately to this great human catastrophe.
So maybe you could tell us that story so
we can better understand how that all happened. (Lisa) Sure.
So, around 2012 or 2013, the Myanmar military
began using Facebook to motivate and mobilize the
public to be outraged at the Muslim population.
And they did this by posting fake
and inflammatory stories on Facebook, accusing Muslims
of killing Buddhists in the country.
It's a primarily Buddhist country, and
this is one form of Buddhism.
Just like there are forms of Christianity and
Islam, there are some types of religious expression
that become very extreme and violent.
And this group of Buddhists is very violent. Yeah.
So they posted fake videos and photos of what
they said was Muslims killing Buddhists and basically told
people to go out and kill Muslims.
And that's exactly what happened.
There was a genocide against the Muslim population
in Myanmar, and now there's been a lawsuit
against Facebook by allies of the Rohingya
Muslims, because, you know, Facebook was the communication tool
that allowed this genocide to happen.
And the civil society in Myanmar, they sent representatives
to Facebook's headquarters out in the Bay Area, in Palo
Alto, and said, please stop, turn off your algorithm,
dismantle or deplatform these accounts that the military is
using to spread this false information.
And Facebook did not respond.
And it was actually years before Facebook kind of
recognized that this was not just happening in Myanmar,
this was happening all over the world.
And so the authors who write the chapters in
this book, you know, they're detailing how Facebook
is fundamentally changing their societies and often polarizing people
along existing divisions in different countries.
Ethnic groups already are different.
They have a different history.
They sometimes don't like each other.
They have different political views.
But then what Facebook does is just throw fuel on
that and light a match and just let it go.
So from Kenya to Zimbabwe to Colombia and
Venezuela, this is happening all over the planet,
where violence and hate speech starts online and
then quickly slips off into the real world.
Into real world violence. (Duane) Yeah.
The beauty of the chapter as well, it's so
well written, is that it paints the picture
not so much of Facebook being like an
evil party trying to make this happen.
That's not the story.
The story really is more that, you know,
I think the term was an absentee landlord.
I mean, it's not happening accidentally.
Most people's understanding of the Internet in
Myanmar at the time was Facebook.
Like, there was no other Internet.
I mean, that's what they understood as the Internet.
They had gone through this telecommunication liberalization
program, which made mobile phones very accessible.
Facebook was on the phone for a period of time.
Facebook had this program where access
to its platform was free.
And so that was what people
could experience as the Internet.
It was in their local language, so it
wasn't foreign content that they couldn't read.
So all of this was the reason why
Facebook was in the position it did.
But the part that makes Facebook so culpable
in the story was the negligent kind of,
like, approach they took to their platform.
You know, you could count literally on one hand,
it was fewer than five people throughout this
entire period who had any knowledge of Myanmar,
most of whom didn't even speak Burmese,
who were the people responsible for all the
decisions about what Facebook in Myanmar was going to be.
And so there was really no capacity to moderate.
And once you had bad actors who understood
how to capitalize on that system, I mean,
they could just basically use it for anything.
It's not the first time that we've
seen these studies that look at the
relationship between media and genocide.
Rwanda with radio, of course, is a classic example,
but there are dimensions to social media which
are so dramatically different from the more traditional media
instruments that we've seen historically.
(Lisa) Yeah, absolutely.
That's why the title of my book is
Social Media Impacts on Conflict and Democracy.
The subtitle is the Tech-tonic shift.
It's a play on the word tectonic, spelled
T E C H, because technology is fundamentally shifting
the way societies hold together or fall apart.
And I think that technology has
the potential to hold us together.
And a lot of my work now
is on peace tech and democracy tech.
But technology as it stands today, in 2024,
it is having a negative impact on societies.
It is polarizing and amplifying the
problems rather than solving problems.
(Duane) Oh, great point. Very true. So true.
So, Lisa, let's talk now about what
we can do about this problem.
So I think you've done a great job.
You've made the case that social media play a role,
however we choose to understand what that role is.
It's consequential, for sure.
And in your book, you talk a lot
about social policy, those kinds of things that
are high level solutions to the problem.
But I want to bring you down to the ground level.
I want us to talk about what we can do.
One of the thoughts that I had as I was
reflecting on your book was we have become very conscious
as societies about our diets and what we eat.
It wasn't always the case.
But I mean, now you go to McDonald's or
you go anywhere, you look at what the calories
are, you go to the store, you buy something.
I mean, we have become very conscious
as a society of our diet.
We may still eat bad things, but we're
very aware that that's what we're doing.
We eat a potato chip, we feel a little bit guilty.
We've just become very aware of diet.
And in the same way, I think what we click
on, what we do, that is our diet as well.
I mean, it's our intellectual
diet, our communication diet.
But what is it that we can do in our
own lives, in the things that we ultimately control in
the limited circles of people that we interact with?
What can we do at that level to
help address this particular problem with social media?
(Lisa) Thanks so much for asking that question, because many
of the solutions are not in our hands, but
we do have power in this situation.
I would start off by just saying: be aware
of our emotional state, and recognize when content that
we see online is emotionally engaging but maybe
pushes us a little bit too far.
What is our level of emotion, on a scale of one to
five, when we're reading something?
Is it allowing us to think through the issue?
Or is it a five, keeping us
hyperemotionally engaged and wanting to respond out
of that emotion rather than out of an ability
to think complex thoughts, or to think about the
ambiguity or the complexity of the situation?
So that's the first thing.
And I would say the main lesson here is if
you're feeling really emotional when you read something online, do
not share it and really don't even comment on it.
Come back later if you want to, but
just really note that you're being manipulated online.
The design of these
platforms is emotionally manipulative.
And the more we understand that as individuals, the
more we can get a hold of this manipulation
and push through it into some sort of sanity.
(Duane) Again, I love that, because definitely a big
part of it is creating a moment of
separation between reading it and reacting to it.
And if you can create that moment, it just gives you
the opportunity to reflect instead of just react, which is so
much a part of how those algorithms really work.
(Lisa) Yeah. The second choice is to really look for
spaces online where people are actually engaging and
learning from each other, where there's some kind
of norms within the group that promote learning.
And so you go online to exchange views,
to actually have a real exploration of how
different experiences that people have had in their
life have led them to different conclusions.
I think, for me, I'm getting less and less
interested in the dramatic arenas where people are fighting,
and I'm more interested in really learning how people
of the other political party, in my group here,
think about things.
What led them to think that way?
I'm truly curious, and I want to
be in places where everyone's curious and
they're willing to learn from each other.
I think making that choice as an individual is a huge
change, that not every space on the web is equally bad.
There are places where people talk kindly
and respectfully to each other, and they
engage with issues in good ways.
I'm part of the group Braver Angels.
For example, Braver Angels is a
U.S. group that works on political polarization.
You know, they host meetings where Republicans
and Democrats can talk to each other.
They have moderators that help the
conversation stay civil and constructive.
And people are asked to come and learn.
And I think that instead of battling
people online, we should be thinking of
it as a learning opportunity to understand
how people in our communities think differently.
(Duane) Wow, that's great, Lisa.
So pick the forums that you go
into, because not all are equal.
Some are going to be more respectful,
and some are going to be less
respectful, and let's find those environments where
the conversation is more dignified, more elevated.
And then we've talked so far about kind of
like how we read the social media universe.
But let's talk about how you communicate, how you speak,
how you talk in the social media world, and the
need for some kind of new etiquette for how it
is that people should communicate in this space.
What do you think we should be doing in
terms of building this new etiquette for
expression in the social media sphere?
(Lisa) Yes, absolutely.
So when I am mediating between two people who disagree,
the very first step is asking each person to share
the experiences that they've had that led them to the
current conflict, so they each tell their own story.
And I think that on social media we have sort
of always skipped straight to, this is what I believe
now, rather than backing up and explaining.
So telling stories, sharing personal experiences, is a
great way to start talking about an issue,
because it allows people to humanize you, to
see that we do change our minds.
And some of us have had really important
experiences that have led us to shift beliefs.
And this kind of humility too, of saying
that I don't have all the answers.
I used to have a different position.
This starts creating a situation where there's a
little vulnerability, but there's also just a sense
of putting humanity in the center of the
conversation of people telling their own stories.
I think about it this way: a norm on a
platform is like the middle of the road,
while a guideline or a rule on social
media is the edge of the road.
And what we don't have right now are the
norms, the middle of the road, what you should aim for.
And really, social media platforms should be reminding
us not only of the rules, the edge
of the road, but of the middle.
What do we aim for?
What do we aspire to here?
Like, to learn from each other, to share
experiences, to understand different points of view.
If we had pop-ups that were reminding us of
these things, I think that it would be better.
But we can make our own pop-ups:
just write it down on a Post-it note and put it
on your desk to remind yourself to tell a story,
share your experiences, and ask other people, what are
the experiences that led you to that belief?
Can you share a little bit more of your story?
I use that line all the time
on social media where nobody has maybe
offered their story, but I'm inviting them.
So I'm asking questions.
I'm showing curiosity about why someone else
believes something differently than I do.
And asking questions can also really
take down the tone of a hostile
disagreement. (Duane) And to just be respectful, with
dignified, elevated, human personal interaction.
Here I'm visualizing life at the neighborhood level, and how
it is that we can reengage in interpersonal dialogue at
that level, with the people you bump into at the school,
in your work environment, in your work circles.
How is it that we can not become
overly dependent upon social media for our human interaction,
but make sure that we have a real diet
of in-person human interaction as well?
(Lisa) Well, I think that that's really tied to
this emotional, addictive design of social media platforms.
They are not a replacement for real-world relationships,
and there are all kinds of actual chemical things
that happen between people when they are sitting
in the same room or talking in the park
that don't happen on social media.
So we have to be aware it's not a replacement.
It's not the same thing.
And I think I pay a lot more attention to when
I meet one of my neighbors or somebody in the park.
What are the social graces that we all learn?
We say hello to each other, how are you doing?
And my writing more recently has been
like, how do we bring that online?
So when somebody comments on something that I've
posted online and it's a little bit hostile,
I will say to them, oh, good morning.
Nice to see you here.
Almost like I would do if I saw them at the park.
I want to start bringing on these social graces to
social media to say, we have to greet each other.
We don't just dive into arguing on something.
It's really weird, actually, when you think about it.
For all of our lives, we've learned that you
say hello to somebody if you meet them.
You don't just walk up to a
stranger and start arguing with them.
And that's across cultures, too.
It's not just an American thing.
Many, many people in many places have
normal ways that human beings
interact with each other.
And so paying more attention to that... (Duane) I think,
like you're saying, when there is this chemical interaction,
when you're talking to a person and you say
something, you're getting a layer of feedback that you
don't get in the same way when you're
interacting in the social media sphere.
So you say something and you look and you see that
what you said has hurt the other person, and you're human,
and that's not what you really want to do.
And so that has a disarming effect on you, in
terms of getting you to cool down a little bit
and think about how to say it maybe a little
bit differently so that they won't be as hurt.
So your message is constantly adapting on the
basis of how the other person is reacting.
But the minute you go into the social media
universe, you don't have that same dimension anymore.
You're not seeing a person's eyes, and you're
not seeing their facial expressions, and you don't
understand how they're reacting when you're saying something.
And so it becomes easier for you to
just become insensitive to all of that.
And that helps the dialogue gravitate
toward becoming a little bit more dehumanized,
a little bit more degrading, as it feeds
the passion that's driving whatever
it is that motivated your comments.
(Lisa) It's why I really think social media platforms need to
enable us all to signal our intentions with what we post.
It's not currently something that's offered anywhere, but in
my sort of design code for how we should
design social media to improve social cohesion,
this is one of the points, actually: that
when I'm feeling sad or when I'm feeling lonely
or upset, I should be able to indicate that
as I'm posting, that I'm truly curious about this
issue, or I'm coming here with an authentic question.
I'm not angry at you.
And that we should be allowed to sort of state
our intentions for the conversation and communicate what we would
normally communicate on our faces with a smile or with
raised eyebrows. We have all kinds of ways in
person to signal, I'm not here to attack you.
I really care about you.
Yeah, and we're missing all that on social
media, so we should find other ways to
send that same signal of our intention.
(Duane) So we've been a little bit critical today because
we've been looking at understanding the problem, really.
But where do you think this all goes?
What do you see as the future for social media?
(Lisa) Great question.
I think that there's actually an exciting future,
but we all need to be working together.
So if we just let tech companies go off
on their own, it will continue to just be
for profit and centered on amplifying our divisions.
We need to be involved, and we really
need to be engaged with the tech companies
to express, we want something different.
We want a different product that
serves humanity and serves our societies.
And we need to let our governments
know they need to be involved, too.
There need to be tax credits for companies
that produce social cohesion, help societies hold together,
and there needs to be taxation of companies
that are divisive so that we actually have
then a whole incentive structure, creating new tech
tools that help society make decisions together.
And I've already seen some of these platforms, Remesh
and Polis; I'm writing about them a lot.
They help people make decisions together.
They incentivize where there is common ground.
They help people see each other's point of
view and really listen to each other.
At scale, we could be building the kinds of
design affordances that are in Polis and Remesh into
all of our social media to ensure that every
conversation brings out the best of humanity, so that
we're learning from each other.
We don't have to have perfect harmony, but we have
to be able to appreciate the humanity of the other.
So that's what I'm working for, designing a social
media that will enable us to humanize each other,
to continue enjoying all the good things about social
media, but then adding to it sort of a
benefit to society, a benefit to holding us together.
(Duane) Wow, that is such an exciting future.
Lisa, thank you so much for sharing that.
And thank you for all the insights that you shared
in helping educate us on how it is that the
social media universe is kind of like contributing to this
polarization issue that we're all suffering from.
Thanks again, Lisa.
Thanks for joining us today.
(Lisa) Thanks so much for your wonderful questions.
(Duane) Wow, so many interesting insights there.
First and foremost, I think Lisa did
an amazing job in demonstrating how social
media greatly accelerate the polarization issue.
And she's helped us understand, I think, how this
is not a natural or an accidental byproduct, but
it's something that directly results from both commercial and
malicious efforts to feed and capitalize on our lower
nature, on our animal or our reptile brain.
I loved her explanations of the
neuroscience behind all of this.
And she leaves us reflecting, I think, on what it
is we can do individually and in the circles in
which we move to become more conscious and aware of
our own social media diets and of our responses, highlighting
really what the Universal House of Justice calls for as
the need for an etiquette of expression, something we'll explore
further in a future episode.
Now, we continue our journey into depolarization
in our next episode, where I'll interview
yet another global authority in this discourse.
This time, I'll be interviewing Nealin Parker,
who's the Executive Director of the US
office of Search for Common Ground.
That's one of the world's
largest nongovernmental peacebuilding organizations with
offices in some 30 countries.
Now, Nealin is also part of an initiative
engaged in tackling polarization, an effort that
has already identified, wait for it, over
6,700 organizations in the United States alone.
I mean, wow.
So Nealin will help us get a better sense of
the kinds of like-minded organizations we might want to
explore collaborating with as we engage in depolarization.
So don't miss my interview with Nealin Parker.
That's next time on Society Builders.
(Music) Society builders pave the way to a
better world, to a better day, a
united approach to building a new society.
There's a crisis facing
humanity, people suffer from a lack of unity.
It's time for a better path to a new society.
Join the conversation for social transformation. Society Builders.
Join the conversation for social transformation. Society Builders
So engage with your local communities and
explore all the exciting possibilities.
We can elevate the atmosphere in which we move.
The paradigm is shifting.
It's so very uplifting.
It's a new beat, a new song, a brand new groove.
Join the conversation for social transformation. Society Builders.
Join the conversation for social transformation. Society Builders.
The Baha'i Faith has a lot to say,
helping people discover a better way,
with discourse and social action framed by unity.
Now the time has come to lift our game
and apply the teachings of the Greatest Name
and rise to meet the glory of our destiny.
Join the conversation for social transformation. Society Builders.
Join the conversation for social transformation. Society Builders.