Episode 14 | How to Change a Misinformed Mind | November 28, 2021


GUEST: Dr. Nika Kabiri, professor, author, and decision science specialist

It’s easy to discount conspiracy theories as harmless or simply bizarre. But, in recent years, they’ve bred the kind of misinformation that has given rise to dangerous calls to action and fatal violence – as well as deep rifts between relatives, friends, and colleagues. Is there a way out of this? In this episode, Dr. Nika Kabiri talks about why people believe these theories and how we might counter this misinformation, as well as what strategies we can use to heal the divisions it causes in our closest relationships.

GUEST BIO:

Dr. Nika Kabiri is a faculty member at the University of Washington, where she teaches decision science. This line of scientific study looks at how humans make decisions – from gathering information to processing that information, and, ultimately, determining what they are going to believe. In the case of conspiracy theories, it offers an understanding of what leads some people to believe something that isn’t true. She is also the founder and owner of Kabiri Consulting, where she uses decision science principles to help institutions and companies better understand their consumers and clients. She is a contributor to major publications and the author of the bestselling book Money Off the Table: Decision Science and the Secret to Smarter Investing.

LINKS:

Kabiri Consulting

Your Next Decision – Access to Dr. Nika Kabiri’s newsletter and online decision science resources and tips for personal and professional growth.

“How to Change a Misinformed Mind” is available for free here.


Full Transcript

BRAD PHILLIPS, HOST, THE SPEAK GOOD PODCAST:

Paul McCartney died on November 9th, 1966. Since then, we haven’t been seeing the real Paul, but rather a man named William Campbell. Campbell won a look-alike contest shortly after McCartney’s death, and he quietly replaced the deceased musician so that the Beatles could continue on as the world’s most successful band — and avoid upsetting their young fan base. Campbell has played the part of the mop-topped, cute Beatle convincingly ever since. Despite the band’s official silence back then, if you knew where to look, you would have been able to find clues everywhere. In the photograph on the back cover of the Sergeant Pepper’s album, John, George, and Ringo are facing the camera. Paul is facing away. If you play the track Revolution 9 backwards, you’ll hear the words, “Turn me on, dead man,” a reference to Paul. And a couple of years later, John Lennon referenced the Beatles hit song I Am the Walrus with a cryptic line in the song Glass Onion that said, “The walrus was Paul.”

When asked back then about the rumors, McCartney insisted that he was still alive. But, let’s be real. Isn’t that exactly what the fake Paul McCartney would say? All right, let’s put all of that nonsense to rest. Paul is not dead. William Campbell has not been playing the same role for the past 50 years. Those alleged clues all had benign, even banal, explanations that didn’t hint at something more sinister. But despite the seemingly obvious conclusion that Paul has been with us this entire time, the conspiracy lingered. In fact, in 2009 Time magazine named the “Paul is Dead” rumor one of the world’s 10 most enduring conspiracy theories. A few years ago, the real Paul addressed those rumors during an unaired outtake from a Carpool Karaoke segment on CBS’s The Late Late Show with James Corden.

(SOUNDBITE OF PAUL MCCARTNEY WITH JAMES CORDEN)

JAMES CORDEN: “Now, you must be aware of this. For a long time, there was a huge rumor, at one point, a conspiracy theory if you like, that Paul McCartney was dead, and you are not Paul McCartney. You’re an imposter pretending to be Paul McCartney. How did this come about?”

PAUL MCCARTNEY: “You found me out.”

CORDEN: (Laughter)

MCCARTNEY: “For years, I’ve been able to fool people. Now, in the car, you finally …”

CORDEN: “But seriously, there’s some people on the Internet who are very passionate about this. And they say there’s been clues, like, this is Sgt. Pepper you play it backwards and you hear the words ‘Paul is dead?’”

MCCARTNEY: “Yeah, I mean the truth is, at that time, everyone was stimulated with way too many stimulants. So, it was a pretty crazy time – ‘this is true, and this is true’ about the Beatles. There were like millions of legends. And so, we just kind of let them go, you know, there wasn’t really any way you could stop it. So, we just, you know.”

CORDEN: “People said because you weren’t wearing shoes on the Abbey Road cover.”

MCCARTNEY: “Yeah. We did the Abbey Road cover, we decided we would make the cover so we’re going to have a photograph on the level crossing. So, it was a very hot day and I arrived in sandals, so I thought, ‘well, I’ll just kick them off, because it’s too hot,’ okay. So, I ended up barefoot. Well, this is a sign that, I don’t know … he’s dead or something.”

So, as with most conspiracy theories, the real explanation is a lot less interesting. It was a hot day. He took off his sandals. He was more comfortable that way.

While I – and almost surely, you – hear the “Paul is Dead” conspiracy as little more than a bit of harmless fun, not everybody does. A few years ago, after landing at London’s Heathrow Airport, I took a taxi to my hotel. The driver – a man who appeared to be somewhere in his 40s – regaled me with the Paul is Dead conspiracy for most of the ride. But he was serious about it, and was even angry that they – the they in this story was never made quite clear – had lied about it for all this time.

So, a kooky, random taxi driver believed that conspiracy. It’s still relatively harmless stuff, though, right?

But then he brought up the 2012 Sandy Hook Elementary School shooting that took the lives of 26 people, 20 of whom were six- and seven-year-olds. That, too, he insisted, was a hoax. I told him it actually happened, that I live only an hour away from Newtown, Connecticut, and know people in the neighboring town. He responded by asking if I knew any of the victims personally, which I didn’t, and he took that as further evidence for his theory: since the kids were, quote, “crisis actors,” it was no surprise that I didn’t know any of them. Given that he was the driver and had my life in his hands, I decided not to press my case further. Suddenly, this wasn’t just a kooky, random taxi driver, but a man with very twisted and even dangerous ideas.

And as we’ve seen in recent years, that type of conspiracy thinking too often leads to real violence. As examples:

In 2016, a man fired an AR-15 inside a pizza restaurant in Washington, D.C. He had been spurred on by an internet conspiracy theory that claimed the pizza joint was home to a Satanic child sex abuse ring involving Hillary Clinton. Fortunately, no one was injured.

Last year, in Canada, a man who shared QAnon content on social media rammed his truck through the gates of the Prime Minister’s Ottawa residence. Fortunately, Mr. Trudeau was not home at the time.

In January, the insurrection at the U.S. Capitol was underpinned by the false belief that massive electoral fraud had denied President Trump a second term. Multiple people, including police officers, died during and shortly after that insurrection.

Oftentimes, those conspiracies don’t just live in headlines. They live in the minds and the hearts of people we know and people we love. Family members. Longtime friends. Cherished colleagues. And, as this clip from a January 2021 segment on NPR’s All Things Considered shows, conspiracy-thinking can add tension to – and even destroy – those once-close relationships.

(SOUNDBITE OF NPR CLIP)

KELLY: You know, my father’s calling me a stupid liberal [expletive] and telling me I can’t be trusted.

UNIDENTIFIED PERSON: She hits us with the – I used to think you all were smart.

KELLY: I’m embarrassed about where my father’s mental state is, but I also am devastated because I feel like I’ve lost him.

ANNIE: And when I am arguing with my mother about it, it feels like I’ve lost my parent.

In this episode, we’ll talk about how to deal with people in our lives who believe sincerely and deeply in conspiracies and in misinformation. My guest is Dr. Nika Kabiri, a faculty member at the University of Washington, where she teaches Decision Science. She’s also the founder and owner of Kabiri Consulting, where she employs Decision Science principles to help businesses start, grow, and thrive. And she’s the author of a new guide – which you can download for free on her website – called How to Change a Misinformed Mind.

(MUSIC PLAYS)

Nika, thank you very much for joining me. I found your guide, How to Change a Misinformed Mind, really helpful. And I guess my first question is: are there types of people who are more likely to be susceptible to being misinformed or to believing in conspiracy theories, or is it an equal opportunity offender?

NIKA KABIRI:

That’s a great question. To some extent, we’re all susceptible simply because we all have the same kind of brain. Like we’re human. We have the human brain, and the human brain is highly efficient. It’s efficient in order to conserve energy, because it takes up a lot of energy, and that efficiency comes with some drawbacks, which is that we tend to jump to conclusions, or we make connections between things that may not necessarily be accurate. And that leads us all down this kind of path towards this pitfall of believing things that aren’t true. Now, conspiracy theories are a completely different class of things that aren’t true. There’s some research recently – I don’t remember what university it came out of – but the findings suggest that people who believe in conspiracy theories are less likely to be altruistic. They’re less likely to be inquisitive. They’re more likely to be impulsive. They’re also more likely to kind of break the rules to get ahead or to get what they feel they deserve. So, there is some indication that there’s a type. I don’t know if this has been verified with further research yet, but there is some indication of that. Also, there is definitely a sense that people who believe in conspiracy theories have a conspiratorial worldview that undergirds their belief system.

PHILLIPS:

It’s really interesting to hear you say that they tend to be less inquisitive. Because then one of the remedies might be, and we’re going to get into this, how to help people become more inquisitive, even if that’s not the place they naturally go to, to try to take some of the impulsivity away from their decision-making. One of the things I think is very discouraging in your guide, not because you wrote it, but because it just happens to be the thing that’s effective, is that you can’t go into a conversation with somebody with a misinformed mind with the expectation that you’re going to be able to change it in a single conversation. You have to play the long game. And that requires, I think more than anything, a lot of discipline and commitment, because this is something that could take months or even years. So, could you talk about why it takes so long and why it’s not a short-term effort?

KABIRI:

Yes, people hold firmly to their beliefs to varying degrees, but they invest in them. And so, when you challenge their belief, you’re really kind of generating in them some sort of a threat response. Like you’re challenging their worldview and you’re also challenging their perspective on what they should do to have a good life. And that threat is very real, and that’s why people react the way they do when you try to tell them the truth. They argue. They need a sense of safety. They need to feel like it’s okay to believe what they believe, in order for you to be there to look for those cracks, to peel them open with doubt, which is another issue. But that safety doesn’t come from arguing or presenting facts that show them you disagree or that they’re wrong. And also, people believe who they trust. Like I believe Dr. Fauci because I trust him, not because I’ve seen the raw data he’s seen. We all believe certain people we trust. In order to be trustworthy, we have to have a relationship. We have to develop that relationship over time and be there. That’s pretty much it, trust and safety.

PHILLIPS:

Right. And you brought up a moment ago that you can’t just kind of stubbornly say, “but these are the facts,” and look up a document online somewhere and then show them, “here’s what the CDC says about vaccine safety.” Because just giving people fact, fact, fact, fact isn’t going to be the thing that changes their mind. You say in the guide that you change minds by expressing your point of view in a specific way and at a specific time. So, if you’re not there to debate facts, what are you there to do?

KABIRI:

I believe, and this goes back to being inquisitive, you’re there to look for openings. Their minds aren’t open. You’re there to look for little hopes, glimmers of openings, and just kind of pry them open a little bit. That’s really the goal. It’s not to change the mind overall, right away, it’s to create potential. Because a lot of times all that you need to do is create that potential. Circumstances in their lives can change. And once that potential is there, if that doubt is there, those circumstances could generate a changed mind on their own. You’re just trying to create an environment in their heads. So, yeah, that’s the strategy. Look for those potential openings and maybe create them by being inquisitive yourself, by forcing them to be inquisitive. And since they’re not naturally inquisitive …

PHILLIPS:

Can you give me an example? Because I think in your guide you give an example, maybe it’s about vaccines or something else, but what’s an example of how you go about trying to create that slight opening that might suddenly welcome new information?

KABIRI:

Yeah. There’s a general curiosity that you kind of have to adopt, and we believe what we believe too, so we don’t feel like we need to be curious. But you have to be okay with suspending your own disbelief and being authentically curious about where they came up with the information that they have or why they understand things to be the way they do. So, for instance, there are people out there who believe that Biden is dead, and he’s been replaced by a doppelganger. So, my immediate question would be, and I’ve put myself in this perspective, like, wow, okay, what if this was true? What if this was true? What would have to be the case for him to pull this off?

PHILLIPS:

(LAUGHS)

KABIRI:

Why isn’t Mitch McConnell on this? Like why aren’t the Republicans in the Senate? And so I would ask, like, oh, wow, Biden’s dead, that’s a doppelganger? Well, when is Mitch McConnell going to do something about this? Why hasn’t Marjorie Taylor Greene spoken up about this? Something should happen. Do you know why nothing’s been done about this? And it kind of can create this dissonance in them, like, wow, okay, if this is true, then why isn’t this happening? These things should be, you know, coinciding. That sort of thing is what creates that dynamic. That opens that door.

PHILLIPS:

Yes, it’s really smart to be inquisitive yourself. Although in that specific example, I fear that between the time we’re recording this and the time we release this episode, Marjorie Taylor Greene might in fact come out and say, Biden is dead, we should be doing more about this, because that seems to be the direction that she likes to go.

KABIRI:

This is why it’s so hard to change minds that are misinformed because those who are spreading this information will go there. They will go there to those dark, dirty places. They’ll lie. They’ll do whatever it takes. And the reason why it’s so hard for us to change their minds is because we won’t go there.

PHILLIPS:

Right.

KABIRI:

I wish I could make up these fantastical lies to say, well, do you know that the Democrats own most shares of companies that manufacture ivermectin and that they’re just kind of pretending that they want you to get vaccinated, but really they don’t? If only you could just make up stories like that. Marjorie Taylor Greene will make up stories, and these people will make up stories, and it’s hard to combat that. That’s why it’s really, really hard.

PHILLIPS:

Obviously, people have to own their decisions. If they feel like the decisions are being pushed onto them, they’re going to resist them. I know for myself, when somebody says, you better listen to me, with that finger wag, I’m immediately looking for reasons why I shouldn’t. So how do you, in a way that doesn’t feel manipulative, and that perhaps isn’t manipulative, go about helping somebody draw that conclusion for themselves? In other words, what role can I play to allow somebody to feel that they’re owning their perhaps changed mind?

KABIRI:

Right. And this is where the long game happens, because every time that you interact with this person, you are creating a little bit more doubt. You know, you’re creating a little bit of doubt, and I think it’s also important to close that doubt. I kind of think about doubt opening up this kind of, like, what? What? This questioning. And then you kind of fill that gap with information that you can provide. And over time, over the long game, with little baby steps, and I call them small wins, you are creating a situation. You’re not trying to change the mind. I’m not sure if this is answering your question directly.

PHILLIPS:

Yeah, it is. You’re reminding me, there’s an example I recently heard that was kind of mind-blowing for me, and I think it’s along the lines of what you’re talking about. This isn’t misinformation or conspiracies, but I would imagine that the same logic applies. If somebody is opposed to, let’s say, reparations for slavery, what you might say is: did you know that there have already been reparations? For example, when France gave Haiti its independence, or allowed Haiti to become independent, it was with the strings attached that the formerly enslaved people in Haiti had to pay reparations to their former enslavers. So there have been reparations in history, but they’ve gone in the opposite direction. Now that’s one of those things that I would think is along the lines of what you’re speaking about, of cracking open a window, like, whoa, I didn’t know that.

KABIRI:

Yes, it is. And that’s a great example because that creates doubt. I would add to that: if you could somehow convey that that story, that fact, came from a source that they trust, that matters too. Because, again, people will believe who they trust. If you tell them, I learned this in college, they’ll say, oh, well, you know, college … they just feed you lies. I had a conversation not long ago where somebody told me just that. Oh, college, it’s all lies. You have to kind of have that behind you too.

PHILLIPS:

Yes. So, let’s get into that, because obviously that single conversation that you might have with a friend or family member that you really care about can only go so far if at nine o’clock they turn on Tucker Carlson or OAN or Newsmax and they hear misinformation from those channels. One of the things I thought about in preparing for our conversation was that Tucker Carlson was recently involved in a lawsuit, and, I’m going to quote here, Fox’s lawyers said, “the general tenor of the show should then inform a viewer that Carlson is not stating actual facts about the topics he discusses and is instead engaging in exaggeration.” So that comes directly from that lawsuit. So, if … that family member believes Tucker, would it make sense to try to undermine that host and ask a question like, hey, isn’t it weird that his lawyers argued in court that he’s entertainment only? What’s that all about?

KABIRI:

That is creating doubt, right? That creates doubt. I think the challenge is setting your own expectations around that though. When you say something like that to someone who absolutely loves Tucker Carlson, you’re not going to change their minds with that one question. Again, just as long as you set those expectations that it’s a long game, that you can mention that one day and it’s not going to change minds, but it might make them think a little bit. You just kind of want to shake them up a little bit. Then, you’re kind of creating a small win. That’s a small win to me. Again, if the source of that is trustworthy as well, then that’s even more important. Yeah, people love … they’re fans, you know?

PHILLIPS:

Exactly. And so that also gets to, I think you referred to it earlier, that the person has to trust you too in order for you to be able to have any impact. But I also think it goes beyond that. You have to sit there in that conversation with a complete and total lack of judgment, because the moment they feel judged negatively by you, I think they’re just going to shut down. Could you just talk a little bit about that? Because I’m sure that you, like I do, have this internal visceral reaction when somebody’s spouting nonsense. How do you train your brain, I guess, to sit there patiently in that moment and not project negative judgment?

KABIRI:

My brain hack is to basically look at them in the most human way possible. They are somebody’s kid. They had a childhood. They’ve had pains in their lives. They are somebody’s spouse. They are somebody’s parent. They’re worth, and this sounds so kind of savior-ish, but they’re kind of worth saving. You have to believe that. And if you’re in a conversation like this, with somebody, you know, you care about a friend, then you probably already feel that way. You feel like they’re this person who really deserves that love. It kind of sounds touchy-feely, but that’s where I come from. This is my friend, this is my parent. This is someone who is a person, and their ideas are not the sum total of who they are. I think that’s also really important to remember that their perspectives on the big lie or whatever, COVID, it’s not the sum total of who they are as a person. If you can remember that, it makes it a lot easier. But, I’ll tell you, I find it hard. It’s so hard because I am a human, too. We’re all humans. And we feel a threat response when somebody tells us we’re wrong.

PHILLIPS:

So, you’re sitting there telling yourself my goal here is not to be right. My goal is to see you as a full human. My goal is to hopefully open a window a little bit so that you’re able to take in better information.

KABIRI:

Yes. And to be uncomfortable with your own uncertainty, to feel okay feeling not okay. It’s a skill that’s so valuable.

PHILLIPS:

But it also suggests that this is not something you’re going to do with casual acquaintances. The amount of emotional energy that you would need to remain patient and really see people that way, I would imagine, is exhausting. So, are you talking primarily about either the family members you really love or the ones that you just have to see every holiday anyway? Who are you talking about?

KABIRI:

The guide was designed for people who love someone who believes in misinformation. The impetus for it was friends, clients, people coming to me saying, I don’t know what to do. I have this family member who won’t get vaccinated because they believe whatever, and I need them to vaccinate their kids. What do I do? So, it’s very much designed for that. It’s not that it couldn’t help in other conversations, but you’re right. It takes a lot of time and energy. And also, you have to really develop trust first. And that takes a lot longer with somebody you don’t know. Like they’re just not going to know you.

PHILLIPS:

And the thing about something like COVID is, the brain science tells us that you have to play the long game in order to be effective. Of course, when it comes to a vaccine that might be lifesaving, you don’t have months or years to put this stuff into practice. So, when something is an imminent danger, is there any accelerant to this process?

KABIRI:

I think the accelerant, unfortunately, and I have thought about this, and I think somebody with a bigger brain might be able to come up with a better answer, is just time. Like events will happen. And this is, I mean, life sucks. This is why being human can be so sucky; it can be so painful sometimes. Because we come around to understanding what we need to do after it’s too late. Climate change is another example of that. I feel like oftentimes there’s very little we can do, except just keep picking away and just chipping away.

PHILLIPS:

But sometimes that’s the answer.

KABIRI:

It’s the answer.

PHILLIPS:

Because it seems like, and I’m going to mangle a Billy Joel lyric here, something along the lines of: mistakes are the only things you can truly call your own. The idea being, no matter how much history there is behind us, we still have to trip on that same wire in order to learn the lesson. It sounds like what you’re describing here. I’d like to walk through a specific example with you, so I can really get concrete and understand what the conversation sounds like, what the process looks like. So, if we could maybe use vaccine hesitancy as an example. Let’s say that I’m somebody, and I do, I have an eight-year-old and a six-year-old, and at the time we’re recording this, the vaccines are not available to that age group yet. But they’re coming shortly. So, let’s say that I’m someone who is very skeptical of the vaccine. I did not get one, and my intent when it becomes available for my kids is also not to get one. And I cite as my reason my fear of whatever thing I’ve heard spread around, that it’s not that bad, that there’s 5G in the vaccine, whatever the thing is that’s gotten into my brain. What’s your first step in having that conversation with me?

KABIRI:

My first step, honestly, is just to take a big breath. Setting your own expectations is so important. Because the first thing I want to say is, are you freaking kidding me? That’s the first thing, impulsively, that’s what I want to say. It’s the human part of me. So, I have to take a breath and just, okay, that’s interesting. That’s cool. All right. Whatever, you know, that’s interesting. And, from there, just set yourself up. Okay, this is going to take many conversations. I’m going to have to spend a lot of time with this person. And I’m going to have to commit to developing this relationship with them. But that’s the first thing, you’re creating a safe space. But then I would ask, really ask, oh, tell me more about this. You say that there’s stuff in the vaccine. What’s wrong with the vaccine? Honestly, the problem with misinformation, the weakness behind it, is that it just isn’t consistent. It doesn’t make sense. And you just need to follow somebody through the conversation to where they see that on their own. It’s kind of Socratic.

PHILLIPS:

So, you ask me those questions. You say to me, okay, where am I getting that information from? What have I heard about the vaccines being dangerous or 5g or tracking devices or whatever it is? And then where do you go next?

KABIRI:

And when I ask you that, it’s not to set you up. I’m not trying to get you. What’s next depends on what you say.

PHILLIPS:

Yeah.

KABIRI:

So, it’s just following that line of questioning. It’s almost like wanting to believe it yourself, to get to the point where you can show them that it’s hard to believe.

PHILLIPS:

And that they can own it. So, the first step, it sounds like, really is diagnostic. I want to let you talk so I can understand what’s in your head, because I can’t really try to create that crack in the window until I understand what your thoughts and arguments are.

KABIRI:

This is a classic sales technique. You just have to listen. You just listen and get information, get information, get information. Why do they think that way? What is their fear? What are they really trying to avoid? And I think another thing that we don’t really pay attention to is who is in their lives. Like I have a good friend, one of the people who motivated me to write this guide, who has a friend who doesn’t want to get her kids vaccinated. And that friend’s husband is very much an anti-vaxxer, just not into that. So, for her to change her mind, she would have to actually live with this person and be married to this person who is that different from her. I think when you realize who’s in their social worlds, that is something you have to pay attention to. What are they saying? What are they going home to? Reading the room that way.

PHILLIPS:

Yes. What you just made me think of, too, is that, no, you can’t open the window there by slamming the husband, because that can only go so far. But I do wonder if there’s another story, of, oh yeah, you know, I’ve heard these kinds of cases before where the two married partners have a disagreement about this. One gets it and vaccinates the kids, the other disagrees. And even though there’s been some tension around that, they end up being okay with the fact that they’ve made different choices. So, at least, maybe there’s a way of cracking the window by invoking a third-party surrogate who is familiar to them.

KABIRI:

Totally, totally. I think, for instance, one thing that a lot of people believe, and this gets back to the original question, is that they don’t know if the vaccine is safe, right? They want to wait and see what happens. You can ask them about that and try to get a sense of that. But in response to that, a way of creating doubt is to say, oh, you know, you’re right that you don’t really know until you wait and see, but how long? Because I’ve had the vaccine now since April and nothing’s happened to me. So how long should I wait until I should worry? Like, is it a year? Is it two years? And, oh, you know what, the Spanish flu, there were flu vaccines developed then, and believe it or not nothing’s happened to anyone who took them right away. I thought that was fascinating. Just leave it at that. Let them draw that conclusion. That’s really, really important. That’s what conspiracy entrepreneurs do when they try to exploit other people and get them to believe in conspiracy theories. They pose a question and say, isn’t this interesting, like 5G happened at the same time as COVID, isn’t that interesting? They don’t make the connection. They allow the audience to make that connection and own that idea.

PHILLIPS:

Yes.

KABIRI:

And the endowment effect kicks in: when you own something, it holds much more value for you. You want to do the same.

PHILLIPS:

You’re also making me think, with that diagnostic part, the not judging the other person part, that the reasons somebody might not get vaccinated are so varied. One person might be very ideological, listening to a lot of right-wing radio, and their viewpoint on this has hardened. They’re not going to change their mind. There might be somebody else, maybe an African American person, who has heard about the Tuskegee experiment their entire life and has also experienced, through just life experience, disparities in medicine. I think in many cases that person is rightfully skeptical. But that is a person who might be more persuadable with good information, especially if it’s delivered by somebody they deem credible, than the person who is the ideological hardliner.

KABIRI:

Absolutely. And the longer that you hold onto this conspiratorial worldview, like if you’re born into a family that conditions you, socializes you into believing that’s the way the world works, it’s black and white, the Illuminati, and all of that, it’s very, very hard. But if somebody has come to this more recently or come to this for different reasons, it could be easier to change their mind, for sure.

PHILLIPS:

I want to be careful in this conversation not to hold myself up as somebody who is never susceptible to misinformation, because, as I mentioned in a recent episode, I was sucked in like five years ago. I told the story about Pepe the Frog, this meme that was very active on social media and that appeared to come from antisemites and racists and misogynists who were supporters of Donald Trump. Maybe some of those tweets really were from those people, but it turns out that there is at least some suggestion that a lot of those were actually misinformation bots, possibly Russian. And so, I took it at face value, and as much attention as I try to pay to these things, I was susceptible too. And you wrote something in one of your most recent newsletters that I loved. If I could just ask you to talk about percentages: rather than treating believing in something or not believing in something as a binary, why do you talk about using percentages instead?

KABIRI:

Because the world is not black and white. The world does not work in black and white. There are a million variables that shape any sort of outcome in our lives. You can plan, you can feel like you have control. A lot of it is an illusion. And because the world works that way, because there’s so much variability in so many different things influencing outcomes, you can’t see the world in terms of if X happens, Y will always happen. That is black-and-white thinking. But, unfortunately, the hyper-efficient brain does that. We jump to those conclusions, like the Tuskegee experiments creating this doubt about the vaccine today. That’s a great example. Well, that happened then. Yeah, that happened. That was real. That was effed up. That was messed up. That was not right. But what is the likelihood that that is happening now? That’s the real question. I think a lot of people draw that definite conclusion and see the world in black and white. But I like to ask, what are the chances? I always like to ask that question: what are the chances that Pfizer would manufacture a vaccine, that the FDA would approve it, and that it wouldn’t be okay? Maybe there’s a 10 percent chance. Maybe there’s a 20 percent chance. Like, at what chance, at what proportion, is it worth taking that chance? That is just a more realistic way of thinking about things. And that prevents you from making choices or making decisions on bad information.

PHILLIPS:

I’m thinking, I forget who said this, and I’m probably gonna mangle this expression as well, but it’s something along the lines of: a sign of a first-rate intelligence is the ability to keep two conflicting points of view in mind at the same time and retain the ability to function. And so, if I’m understanding what you’re saying, rather than me saying, oh, come on, the vaccine is safe, it might be smarter for me to say, you know, there’s probably a 96 percent chance that this vaccine is safe. I can’t say in absolute terms that there won’t be long-term consequences 40 years from now. I don’t know that we know that or can know that now. But to be able to sit with that uncertainty and say, okay, even if that is true, what is the smartest choice for me to make today? And so, did I understand your percentages?

KABIRI:

Yes, yes. And you can just ballpark it. Like you don’t even have to have an exact number in mind. Like, are the chances of falling ill from COVID without the vaccine greater than the chances of falling ill from the vaccine itself? What are the chances relative to one another? Like getting on an airplane, you know, oh my gosh, I could get in a crash. What are the chances that you’re going to get in a crash? Like, do you really need to be afraid? It’s the same sort of thinking with everything.
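To make the kind of ballpark comparison Dr. Kabiri describes here concrete, below is a minimal sketch in Python; the two probabilities are purely hypothetical placeholders for illustration, not figures cited in this episode.

    # A rough sketch of the "ballpark the chances" idea discussed above.
    # The probabilities below are hypothetical placeholders, not real estimates.

    def relative_risk(p_harm_without_vaccine: float, p_harm_from_vaccine: float) -> float:
        """How many times riskier the 'skip the vaccine' choice is than the vaccine itself."""
        return p_harm_without_vaccine / p_harm_from_vaccine

    # Made-up ballpark figures, just to show the comparison.
    p_serious_illness_unvaccinated = 0.05
    p_serious_harm_from_vaccine = 0.001

    ratio = relative_risk(p_serious_illness_unvaccinated, p_serious_harm_from_vaccine)
    print(f"Under these made-up numbers, skipping the vaccine is roughly {ratio:.0f}x riskier.")

The point, as in the conversation, is not the specific numbers but weighing the two chances relative to one another instead of treating safety as a yes-or-no question.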

PHILLIPS:

Well, in your guide, you offer a total of 10 different ideas for how you can change a misinformed mind. We’ve touched on many of them. We haven’t touched on all of them. Before we wrap this up, I just want to know, is there anything in your guide that you think is really important that we didn’t get to?

KABIRI:

Yes. We are living in a polarized society right now and polarization can be very dangerous. We’ve seen it historically across the globe and the more that we allow that to happen, the less safe we are as a society. Like we’re just kind of creating a circumstance for disorder. So, if you do know someone, care about someone who believes in misinformation, I really do believe you should not give up, not just for the sake of the person you care about or for yourself, but because if all of us independently make these choices to really work at it, we can combat this polarization that, I mean, that is really breaking us apart. And I think it’s really important to see the bigger picture, to see the bigger purpose, in not giving up.

PHILLIPS:

Where can people download this guide?

KABIRI:

My website is yournextdecision.com, and you can just go to the guides there and download it.

PHILLIPS:

And I will just make a pitch here that, as I said to you before we started recording, Nika, your newsletter is sincerely one of the best ones that I get.

KABIRI:

Thank you.

PHILLIPS:

I think I can speak for most listeners when I say we all receive a lot of newsletters. For many of them, the first thing you probably do is hit the delete key when you see them come into your inbox. Nika’s is one of those that you not only don’t want to delete, but want to read right away. You know that you’re going to be kind of happier or smarter or something better after reading it. So go to her website and download How to Change a Misinformed Mind, but while you’re there, make sure that you’re also signing up for her newsletter, which is just terrific. Nika, thanks so much for joining me.

KABIRI:

Thank you for having me.
