
S03 E04 – Misinformation

This episode we chat about misinformation and disinformation. We explore some useful frameworks about information and critical thinking too.

Support us!

Books

Find all of our guests’ reading recommendations at The Tao of WAO book club.

Links

Transcript

Important note: this is a lightly-edited AI transcription of the conversation. If you require verbatim quotations, please double-check against the audio!

Doug Belshaw: [00:00:23] Welcome to the Tao of WAO, a podcast about the intersection of technology, society and internet culture with a dash of philosophy and art for good measure. I’m Doug Belshaw.

Laura Hilliger: [00:00:33] And I’m Laura Hilliger. This podcast season is currently unfunded. You can support this podcast and other We Are Open projects and products at opencollective.com/weareopen.

Doug Belshaw: [00:00:44] So, last week when we were talking to Kerry Lemoie, I said thank goodness for vaccines and that we would be talking about misinformation and disinformation in the next podcast. And this is the next podcast episode. So here we are.

Laura Hilliger: [00:00:57] Yes, good that we have a little bit of a plan. I think it’s a timely topic because in the past couple of years I really feel like misinformation and disinformation have been on the rise. And we’ve most recently seen the Spotify situation, which is kind of a combination of Covid misinformation and reactions to misinformation and disinformation. Have you read anything about what’s going on with Spotify at the moment?

Doug Belshaw: [00:01:28] Oh yeah, for sure. Yeah. I’m very keen to get into that. And I’ll just mention, in terms of, you know, skin in the game and being interested in this, I see this as a kind of continuation, a continuing interest and outgrowth, whatever you want to call it, from my work on digital literacies, the Web Literacy work at Mozilla, etcetera. And I want to get back to that at some point, but I’m also currently involved in doing some user research for the Zappa project. The Zappa project is a project of Bonfire Networks, and Bonfire Networks is kind of a fork of MoodleNet, which I’ve talked about on this podcast before. So the Zappa project, really quickly, involves me talking to really cool organisations and people about why federated social networks might be good for their organisation and how to counter misinformation and disinformation on those networks.

Laura Hilliger: [00:02:24] And you just started this project, right?

Doug Belshaw: [00:02:26] Yeah, literally last week. So I’ve spoken to Witness in Mexico and also Nebo, who are based in Spain but deal with Spanish speakers all over the world, especially in South America and Latin America.

Laura Hilliger: [00:02:41] And with the user research that you’re doing on that project, is it sort of specific to figuring out how decentralised networks can deal with misinformation, or how does misinformation come in?

Doug Belshaw: [00:02:53] Yeah. So there are advantages and disadvantages from a misinformation and disinformation point of view. Maybe we should define our terms in a moment as well. But with centralised social networks like Twitter and Instagram, that kind of thing, the responsibility is on the users to report stuff, and on the platforms, like Twitter, to block things, I guess, and to remove things. And in the UK at the moment there’s a thing going through Parliament to make social networks take it down before users even flag it, which is interesting. But from a federated social network point of view, you can imagine there being, and we can maybe talk about Gab and how Gab was on federated social networks and stuff, advantages and disadvantages. You know, there are potentially echo chambers, but also ways in which you can block things, and all different kinds of stuff which maybe we can get into.

Laura Hilliger: [00:03:49] So where should we start? I think we should define terms, as you just said.

Doug Belshaw: [00:03:53] Yeah. So I was talking to my son about this this morning because I think sometimes misinformation and disinformation are used interchangeably. I don’t know how you choose to use them.

Laura Hilliger: [00:04:03] Um, yeah, no, actually, when we were getting set up for the podcast and we were talking a little bit about what our flow should be, you said we should make it clear what the difference is. And I kind of realised that I’ve absorbed the difference between these two things, but I’ve never really specifically thought about it. So why don’t we start there? When we were setting up, you kind of said that the difference between misinformation and disinformation is that one of them might be accidental.

Doug Belshaw: [00:04:38] Yeah, sure. So if you said, oh, they’re just misinformed, it’s not like they’ve deliberately gone out of their way to be misinformed. It’s an accidental thing. Whereas if, I don’t know, Russia or a state actor is running a disinformation campaign, that’s entirely on purpose. So I would say that there’s room for people to disagree around this, but usually misinformation would be accidental, like you’re passing on stuff that you didn’t know was wrong, and disinformation would be where you know that what you’re passing on is wrong and you’re trying to sow fear, uncertainty, doubt or whatever.

Laura Hilliger: [00:05:20] So I think this is really interesting, because there are a lot of different kinds of information and a lot of different kinds of content. And if we’re talking about these two, I think it’s interesting to talk about where they butt up against other kinds of content. So in the show notes, we’re going to share an article that you found, which is called Fake news. It’s complicated. Do you want to talk a little bit about the framework that’s in that article?

Doug Belshaw: [00:05:48] Yeah, sure. Just before we get into that, really quickly on the Spotify thing: I’ve read something about this, and maybe I’ll dig the article out, but I think it’s really disingenuous for Spotify to try and apply the same moderation approach as Facebook, because Facebook is almost entirely user-generated content and Spotify is entirely not user-generated content. So Facebook has a moderation problem because they’re allowing people to post anything they want and then have to deal with the consequences of that. Spotify isn’t doing that. Spotify has paid Joe Rogan millions of dollars to get his content and his audience, to have exclusive content on their platform. So to try and use the same approach feels weird.

Laura Hilliger: [00:06:37] And it’s very hand-wavy, you know. Spotify said that they are going to put a content advisory on any podcast episode that discusses Covid-19, whether the podcast has interviews with internationally recognised public health experts or disseminates potential misinformation. And the way that I read that, and I’ll link to the source in the show notes, the way that I understand what they’ve announced, is that it doesn’t matter what somebody says about Covid-19, it’s going to have the same content advisory as any other thing that says something about Covid-19. So that means that Joe Rogan, if he is spreading misinformation or disinformation, is going to have the same kind of content warning as, I don’t know, a world-leading virologist who is actually talking about the science of Covid-19.

Doug Belshaw: [00:07:30] And I think that’s only going forward, isn’t it? They’ve actually taken down historical episodes, like 100 episodes or something.

Laura Hilliger: [00:07:37] Episodes of what?

Doug Belshaw: [00:07:39] They’ve taken down about 100 episodes of Joe Rogan’s podcast. Oh, really? Yeah, I didn’t read that. And I know this because on Hacker News, which I visit every day, there’s a website which just tracks all of the publicly available Joe Rogan podcasts versus those available on Spotify, because before he went exclusive on Spotify, it was available like any other podcast on the open web. So, yeah, in future they’re saying, oh look, we’ll just put a content warning on. But historically, from what I understand, they are acting as some kind of moderator or censor or whatever you want to call it.

Laura Hilliger: [00:08:15] Yeah, this morning I was listening to a little bit of NPR, as I sometimes do, and it was NPR and tech, and they were talking about the situation with Spotify and Joe Rogan, and about how podcasts actually don’t fall under any sort of regulatory guidelines like radio or television does. And I thought that was really interesting, because podcasts kind of belong with the Wild West content of the Internet. And the Internet is historically hard to moderate content on because it’s an open platform. I think that’s a very interesting angle as well.

Doug Belshaw: [00:08:58] That is a really interesting point, actually. You’re right. It’s the only kind of frontier left, as it were, because most videos kind of go on YouTube, and YouTube has some moderation policies, for good or for bad. Most centralised social networks will take things down. So there’s an attempt to put a walled garden around podcasts at the moment. Our podcast is available through Spotify, but not exclusively. Um, and that’s for good and bad, I guess, in terms of user experience, but also moderation. Yeah, yeah.

Doug Belshaw: [00:09:31] Mm.

Laura Hilliger: [00:09:34] I feel like this is like a heavy topic for both of us. Like we have so many different directions that we can go in when we’re starting to.

Doug Belshaw: [00:09:43] I think I’m a bit wary because I know that I could talk about this all day.

Laura Hilliger: [00:09:49] Yeah, well, and also because it ties into so many of the other things that we’re interested in. You know, we’re both experts in open source and, you know, are open culture advocates. And when you start talking about content on the Internet, there’s a crossover around censorship and, you know, safety, but also openness and free culture and all of these things. And it’s a big mesh of complexity. And I think that I personally am a bit wary of putting my foot somewhere on a podcast that I don’t actually want my foot to be, if that makes sense.

Doug Belshaw: [00:10:29] Okay. Well, what might be interesting is to just kind of probe and see if there are areas of difference for us, because we do believe similar things in lots of different areas. So let’s just see if there are things that we disagree on. But first of all, you mentioned earlier this First Draft News article. The author is Claire Wardle, and she has got this really useful misinformation matrix. And there are some links in this article. There’s one to Global Voices, who’ve got a slightly different version, but there are seven different types of mis- and disinformation. Maybe we can just read one each. I’ll start with satire or parody. So it’s almost a spectrum, the way it’s being presented. So over on the left, satire or parody: no intention to cause harm, but has potential to fool. That’s number one.

Laura Hilliger: [00:11:24] Do you want me to go down or to the right?

Doug Belshaw: [00:11:26] However you’re feeling.

Laura Hilliger: [00:11:27] All right. Well, I’m going to go to false connection, which is when headlines, visuals or captions don’t support the content.

Doug Belshaw: [00:11:35] Yeah, and some of these are quite close together and there’s a bit of nuance. Number three, misleading content is when it’s trying to frame an issue or an individual in a particular light or a particular way.

Laura Hilliger: [00:11:49] Yep. And then number four, false context is when genuine content is shared with false contextual information. This one is very interesting from a vaccination anti-vax perspective.

Doug Belshaw: [00:12:03] Number five Imposter content when genuine sources are impersonated.

Laura Hilliger: [00:12:08] Number six manipulated content. When genuine information or imagery is manipulated to be deceptive.

Doug Belshaw: [00:12:16] And then the last one, number seven, right over on the right-hand side of the spectrum, is fabricated content. So new content, 100% false and designed to deceive and do harm. So the idea is that all different types of mis- and disinformation fall on that spectrum somewhere. So on the left-hand side, satire or parody, someone’s just doing something for a laugh, but someone takes it to be real and so it becomes misinformation, all the way through to the other side where you’ve got people creating false things from whole cloth and trying to make sure that people are fearful or uncertain or doubtful, or believe things which are wrong.

Laura Hilliger: [00:12:59] Yeah, I’m going to encourage our listeners to go and have a look at this resource because it is really informational, I was going to say, but I meant interesting. I think it’s really interesting the way that this resource sort of picks apart the difference between what propaganda looks like versus partisanship, for example. And there’s a whole matrix here that kind of shows how each of these content types plays into words and phrases that we’re quite familiar with. And it also talks a lot about what happens when we just passively accept information, when somebody sends us something and we don’t double-check, and we don’t look, and we get these ideas in our head that something is true. And, you know, I think this happens most of the time because you trust the person that sent it to you and you expect that people around you have done some fact-checking.

Doug Belshaw: [00:14:01] Yeah. Or it’s in line with your pre-existing beliefs, and so it slips through the net. It has that cognitive ease that Bryan Mathers talks about. Yeah. Um, now, just to bring this back, because that’s quite abstract, just to give a homely example the people listening might be familiar with. So I’m not using Twitter anymore, but I’m familiar with how it works, having been on it for 15 years, and more recently, in the last couple of years, especially since Trump, there’s been this thing where if you try and share or retweet a link without having read it first, or it doesn’t think you’ve read it, it’ll pop up with a little notification saying, would you like to read this first before you share it? Whereas my understanding, having not used Facebook for what, 12 years now, is that that doesn’t happen on Facebook. So you get really weird things. And, you know, the guy I was talking to this morning was explaining this, and I was talking about some of this in the talk I gave in New York a few years ago too. What happens is you get truthy-looking websites, you know, ones which sound a bit like, I don’t know, The Daily Telegraph, but are one letter out, or they’ve bought the URL of a now-defunct newspaper or something. And a link is shared. And, you know, when you share a link it unfurls, like on Slack or on Twitter or whatever, and it’s designed to unfurl in a way which is really clickbaity. Not even clickbaity, share-baity, I don’t even know what the word is for that, but the image is appealing in some way, appealing to your political views, and so is the headline, like it might be attacking your political opponents or whatever. So people just reshare these things without actually clicking through. And the guy I was talking to was saying, sometimes when you click through these things, there’s hardly any content there. The whole point is to kind of sow this particular meme, really.

Laura Hilliger: [00:16:11] Yeah. I mean, this starts to get into training the algorithm and why online privacy and data security are so important. Because I’m of the habit that any time I order something from the Internet, I’m really quite careful, even if I’ve been to the site before. I take a look at the URL, you know, I take a look at where I am, like I want to be sure, because I’ve had the experience where I’ve stumbled onto the very, very well done, but not quite right, URL and almost given up data that I shouldn’t give up. And I think that for people who are not on the Internet every day, and who are not wrapping their heads around, you know, digital literacies, critical literacies, critical thinking, the kind of educational philosophy stuff that you and I have quite a bit of experience in, today’s Internet must be so hard to navigate. I mean, it’s so easy to pop up onto a site that isn’t what you think it is, and then to interact in a way that could be potentially dangerous.

Doug Belshaw: [00:17:26] Yeah. I mean, this is going to be a not perfect metaphor, but for those of us who have grown up with the Web, you’ve seen how it’s built and you know how it works inside, or have a good idea, because you’ve seen it and you’ve grown up with it. For people, you know, like my kids or other people who are younger, they haven’t seen it being built, so they don’t know what’s necessarily going on inside unless they kind of scratch the surface, everything from, you know, building your own computer through to having to plug in cables before Wi-Fi existed and was everywhere. And just the kind of slips you sometimes see in language between people talking about Wi-Fi versus, you know, your data connection on your mobile phone, like just not really understanding what’s going on, really. Um, the other thing I wanted to mention on this misinformation matrix, and let’s just point out that this is now almost five years old and seems to have been created specifically at the time of maximum panic about Trump, which is fair enough. The misinformation matrix, and there’s no way we can read this out in audio form in a way that’s going to make you able to understand it, dear listener, so you’ll have to go and have a look at it. But those seven kinds of elements of misinformation, or those seven types of misinformation, are kind of plotted against lots of different P’s. And this is Eliot Higgins. He listed four P’s, and then the author, Claire Wardle, has added a few more, so four additional ones. So there are eight different things mapped onto satire or parody, false connection, misleading content, etcetera. Should we just go through quickly what those different P’s are, the reasons why people might want to spread misinformation or disinformation?

Laura Hilliger: [00:19:20] Yeah, let’s do that.

Doug Belshaw: [00:19:21] Okay.

Laura Hilliger: [00:19:22] The first one is not really a reason why, right?

Doug Belshaw: [00:19:25] Oh, well, I guess. Yeah. So if it’s misinformation, it can be accidental. I guess so. The first one is poor journalism. Just not being good at checking your sources and stuff.

Laura Hilliger: [00:19:35] Yeah, the second one is parody. So creating misinformation specifically to parody real information.

Doug Belshaw: [00:19:45] Yeah. The third one is to provoke or to punk someone.

Laura Hilliger: [00:19:49] The fourth one is passion.

Doug Belshaw: [00:19:52] So yeah, just being really into something and therefore being so swept along that you kind of miss other people’s context. The fifth one is partisanship. So just being so biased, I guess.

Laura Hilliger: [00:20:04] Yeah. I can’t believe the sixth one is only listed as number six, but it’s profit. It’s quite clear that profit is also a motivator.

Doug Belshaw: [00:20:15] Seven is political influence. So you’re not necessarily doing it because you’re just so biased; you’re doing it because you know the effect it’s going to have, and then the impact it will have on your status, I guess.

Laura Hilliger: [00:20:25] Yeah. And the last one is propaganda, which is political influence times actual societal manipulation, perhaps.

Doug Belshaw: [00:20:37] So I find this really interesting, and I want to connect this back to something I did in 2012, which was giving that TED talk, um, which was about my thesis on digital literacies. And I’ve been kind of beating myself up a bit recently, thinking that I should have seen this coming. Not that I personally could have stopped it, but I should have seen this coming, because I was talking about memes and their effect and all this kind of stuff, and I should have seen that they were going to be weaponised and used for political gain and profit and all that kind of stuff. And there’s a really good example in this First Draft article, which I would highly recommend anyone goes and has a look at.

Laura Hilliger: [00:21:19] Let me react to that, because if I recall correctly, one of your eight essential elements was civic. And okay, perhaps you didn’t go quite down into the depths of what’s happening in today’s world around misinformation or disinformation, but you certainly wrote about how civic participation and civic duty are a part of these essential elements, and how cognitive bias might play into critical thinking. So I feel like your work at that time was a bit meta, and this is like a specific application of the meta.

Doug Belshaw: [00:22:02] Yeah, you’re probably right. And actually I wrote a post for DML Central that I’ve quoted a couple of times in this podcast, called Curate or Be Curated, which was talking about algorithms. I suppose that was specifically around the change that Twitter was making to its algorithm, so that instead of seeing the most recent posts from people you’re following, you were getting algorithmically served up content, which improved engagement and was good for shareholders, but not necessarily good for us, because it became a rage machine from 2014 onwards. Um, but the reason I wanted to use that as a bit of a segue is because of just how easily things can be weaponised. And I want to read just one short paragraph from this article, and it links to a BuzzFeed article. It says: as this BuzzFeed article highlights, a group of US Trump-supporting teenagers have connected online to influence the French election in April. This was April 2017. They have shared folders of shareable meme shells, so even those who can’t speak French can drop visuals into hashtag streams. It’s now incredibly easy for loosely connected groups to mobilise, using free tools to coordinate private messaging. End of quote.

Laura Hilliger: [00:23:20] Yeah. I wanted to connect back to something that we were talking about a few minutes ago, which is about how people who grew up with the Internet and can see inside the Internet might be interacting with content slightly differently. And I think that, you know, when we think about, quote unquote, digital natives, which is not a real thing, a really interesting difference between how we interact with the Internet, those of us who grew up and kind of looked under the hood of it, and the newer generations, is that they’re interacting pretty predominantly through a handful of platforms. So Twitter, Instagram, lots of Instagram. And the way that they’re interacting with content is really through very specific social media platforms that mean specific types of content become popular or viral. So with that quote that you just read, there’s something here about how that kind of content, that you can easily make a meme whether or not you speak the language using these sort of free tools, how that is then displayed, and where it’s displayed, and who owns the platform that it’s being displayed on.

Doug Belshaw: [00:24:42] So listening to this, you might be thinking, all right, well, great, Doug and Laura, you’re diagnosing what the problem is, like everyone knows what the problem is. What can you do about this? Well, I think the point that we’re kind of implicitly making, or certainly what I’m making, is that the more control you have over your information environment, the less likely you are to be subject to misinformation and disinformation. The more that you outsource your news reading to an algorithm on a social network, the less likely you are to be informed about what’s going on in the world.

Laura Hilliger: [00:25:19] I would certainly agree with that, so I think you can make that point for both of us. And I think that it’s not just through technology that it’s important to get outside of your own bubble, right? Like in the real world, we also need to talk to people that don’t see eye to eye with us, or that are strangers in some way, because this is part of what it means to create an information environment that doesn’t actually shell you into one certain way of thinking. And I think that’s partially, you know, from my perspective, that’s kind of what it means to be educated: you know that you don’t know anything, and talking to different kinds of people is going to open your world and send you down new paths.

Doug Belshaw: [00:26:07] I mean, the great thing about the Web for, you know, slightly introverted teenagers in the 90s and early 2000s was that you might be into something quite niche, and the chances of anyone in your home town or village being into that were quite small, and the Internet all of a sudden connected you to people who were interested in the same things. And this was amazing. Um, but the flip side of that is that there’s always been, especially in the UK, that weird guy in the corner of the pub who had crazy conspiracy theories, but he or she or whoever is now connected to crazy people as well, and it gets amplified. So I was reading this morning about New Zealand anti-vaxxers basically being inspired by what’s happening in Canada, i.e. truckers going on some kind of convoy drive, anti-vaxxer liberation, freedom march drive, whatever you want to call it. And literally in New Zealand they’re planning to camp outside the Parliament until they relax Covid lockdown restrictions. Now, that’s not the kind of thing that would have happened even ten, 15 years ago, because to get the critical mass would have been extremely difficult.

Laura Hilliger: [00:27:31] So I think that it’s important to say, how do I want to frame this, I think that it’s important to say out loud that not everyone who falls for misinformation is crazy. I think the misinformation and disinformation movement has become coordinated to a level where a lot of people are behaving in ways and beginning to believe things really because of that misinformation and disinformation. I mean, I really think that there’s something else going on around how people are being targeted and what they’re being targeted for. Yeah, I think that you’re right, 10 to 15 years ago it wouldn’t have occurred to somebody to organise a similar protest to something that they are seeing across the world, especially with the intent of stopping the government from being able to govern. But I do think that it’s important to call out that a lot of this misinformation and disinformation is so real to people, it’s so nuanced, it’s so almost the truth, that when you start looking at almost the truth and you believe that little bit of sleight of hand, it’s easier to believe something that’s a little bit further away, and a little bit further away, until you’re caught in this very real world where you believe some of the most insane conspiracy theories I’ve ever heard.

Doug Belshaw: [00:29:18] When I was doing my philosophy degree, there was this philosopher, W.V. Quine, who was quoted a lot, especially in my stuff around American pragmatism. So this kind of theory is that we have a web of beliefs, and some things are closer to the centre of the web. And if the centre of the web is us, however you want to think about that as an individual, some beliefs are core to who you are and some are more peripheral. So, for example, if I think that the capital of a particular country I’ve never been to, a tiny country somewhere in the world I’ve only really ever heard of once, is a certain place, but actually it’s somewhere different, that’s a peripheral thing to me. It’s not a core thing to who I am. But if I found out, like, I don’t know, that my father wasn’t really my father or something like that, or that the religion I’d been following all my life was false.

Doug Belshaw: [00:30:25] Like, that’s core to who I am. Yeah, that’s a core belief. And what I think is really interesting with all of this is that I feel like some of the conspiracy theory stuff starts off quite peripheral and therefore falsifiable. But when something becomes a core belief, your whole belief system is organised around these things being true, and therefore you’re always willing to make exceptions. It’s a bit like, you know, Galileo’s theories, or Copernican kind of theories of the sun being the centre of the solar system versus the earth. When I was doing the philosophy of science, again, what used to happen was people would add what are called epicycles, so cycles within cycles, to explain the orbit of the planets around the earth, as they thought it was. Whereas actually what they were doing was very complicated calculations because their entire mental model was wrong. Now, I don’t know if that makes any sense, but the thing I’m trying to get my head around is why some people are willing to defend stuff which surely they know is wrong.

Laura Hilliger: [00:31:37] Well, let me know when you get your head around that and then you can explain it to me. I mean, I think what that theory sort of indicates is that you don’t change your core beliefs overnight, right? Very rarely do you learn one small piece of information that completely changes who you are. But I think that, you know, every once in a while that happens, I think that’s called trauma. But I think that you can have your core beliefs picked away at little by little, day after day, until suddenly you actually realise that your core belief is completely different. But maybe you wouldn’t realise it because, you know, I mean, we’re talking about conspiracy theory here.

Doug Belshaw: [00:32:31] Yeah, but what usually happens, the way that’s usually conceptualised, is that someone is brainwashed by their parent as a kid or whatever, and then they slowly come to realise that it was all a lie. Whereas what we’re talking about with conspiracy theories is almost the opposite of that: being fine as a child, maybe, I don’t know, becoming an adult and then getting sucked down such a rabbit hole that your entire belief system changes and you end up understanding the world through a very narrow theory.

Laura Hilliger: [00:33:07] I don’t think it’s so different, though. I think that in the first example, where from childhood you’re being brainwashed, that is a day-after-day chipping away, chipping away, until you are essentially brainwashed. And I think it’s the same with conspiracy theories. I really don’t think that it happens overnight. I think it starts with a share from your, you know, your great Aunt Sally, who shared a thing, and you love her and trust her, and so you don’t fact-check and you just believe that it’s real. And that’s one chip, and then day by day and over time, when your bubble is intact and when your information landscape stays the kind of information landscape that it is, I think little by little you can, you know, fall into the beginning of the rabbit hole and just keep going. And then at some point, you know, not be able to find your way out, or not know that there is a way out, and have friends and family telling you that you’ve lost your way. So I don’t think it’s all that different. I think it’s just that with some of the conspiracies that are floating around in today’s world, and some of the stuff that we’re aware of, for us it seems incredible that somebody would fall that far down. But if you start reading about some of the people who have found their way back out, the way that they describe it is really that day-by-day chipping away until suddenly you’re just in so deep.

Doug Belshaw: [00:34:40] Yeah, I suppose it’s an addiction, like an addiction to an alternate reality game, an ARG. It’s almost like that. Um, when you were talking there, it reminded me of, and I don’t know how true this is, but sometimes people say that, you know, you need to be careful about who you associate with because you end up being like the average of the five people closest to you. So, for example, there have been studies showing that if your friends put on weight, you’re more likely to put on weight, that kind of thing. You know, like if your friends are very left-leaning, you’re likely to be more left-leaning, that kind of thing. Um, and it reminded me of another kind of pragmatist philosopher, William James. This is my kind of way of organising the world, I guess, pragmatism. But he talked, in The Varieties of Religious Experience, which was, I guess, an agnostic take on people’s religious beliefs, about how our belief is belief in other people’s belief. So if someone has a strong belief in something and expresses that, we start off by having a belief in their belief, because we want to have that depth of feeling, that kind of thing. So yeah, it’s interesting.

Laura Hilliger: [00:35:55] I think that might actually be a really good segue into talking a little bit about what you can actually do about misinformation on the Internet. And the reason that I think it’s a good segue is because, with the example that you gave, if someone is really passionate about a particular belief, then you want to believe that, because passion attracts passion. Um, I think there’s something really interesting here that revolves around, um, how do I put it, how people are able to respond to other people in the world. And so let me just give a little bit of context here. We’re doing some work with Greenpeace, working on a project that’s about implementing a web strategy that we helped write a while ago. And we’re just doing a little bit of research and collecting some best practices for speaking with our audience, and some stuff about how Greenpeace as an international organisation actually deals with user-generated content and moderation and these kinds of things. And so I was kind of digging around, wrapping my head around some of the challenges that go along with being an international organisation and, you know, being on the Internet. And I came across a term called CVE, countering violent extremism, and I’ll put the link in the show notes. I came across a paper that was specifically about being able to do this on social media and why it’s challenging, and it led me to start thinking about what the differences in responsibility are for dealing with misinformation and disinformation, and, from a personal perspective, what I should expect of myself in dealing with this rather heavy topic, and one that can be in some ways dangerous.

Doug Belshaw: [00:38:12] Yeah, it’s difficult. I mean, I’ve only skimmed that link that you sent, but it’s talking about where the line is, you know, between very, very right-wing, extreme alt-right commentators politically, and, like, jihadists and stuff, and different strategies and things. Um, but I mean, we get people commenting on our stuff occasionally and sending emails and stuff. And I think, back in the day, pre-pandemic, I probably would have entered into some kind of extended dialogue, whereas now I would not only block them, but block them at a fundamental level where I don’t even see their stuff anymore, because I just don’t see it as my job. Well, I do see it as my job in some regards with the people within my bubble, so, um, as a parent, as an educator, as a son, that kind of thing, but not in terms of a random person on the internet. They can believe whatever they want about climate change and stuff, and I’m only interested in the impact it has in the aggregate, really, unless someone’s life is in danger.

Laura Hilliger: [00:39:32] Yeah, exactly. So I mean, I think this is a really interesting thing to think around. Like, there’s a societal responsibility around misinformation and disinformation, around the platforms that allow disinformation to spread, from a regulatory perspective. I mean, it shouldn’t be that it’s always an individual’s responsibility. I think that when it comes to public health, then there should be some regulatory response to the disinformation that’s going around, when it comes to people’s safety. Right? From an individual responsibility perspective, I definitely think that when I was younger, I was more interested, or, I don’t know, I think maybe I felt like I had more power, and so I put it upon myself to change hearts and minds around some of the kinds of misinformation. And I definitely felt more of an individual responsibility in communities that I feel responsible for. I still have that individual responsibility. So I’m the moderator of a variety of communities, and I do feel a responsibility to counter, um, well, countering violent extremism, which I’m using as a shorthand, because in my communities I wouldn’t term it violent extremism. But some of the things that happen in online communities are dangerous for people in some way. And I guess there’s a scale there.

Doug Belshaw: [00:41:18] For sure yeah. Um.

Laura Hilliger: [00:41:20] And, I mean, I think it’s more countering arseholeism, you know, from my perspective. It’s not really extremism.

Doug Belshaw: [00:41:31] Let’s not spend too long on this, but people can be extremely wrong and dangerous intellectually whilst being absolutely charming on a personal level. Mm. You know, if you’ve ever met politicians, then you’ll know what I mean. Um, but so we have to counter it. There’s a physical threat, there’s the abuse of language, but also there’s the kind of, you’re spreading misinformation and disinformation here. Now, as soon as you take a stand, then someone’s going to say you’re impacting my freedom of speech or freedom of expression. But freedom of speech and freedom of expression does not extend to me having to listen to that, or this community of people to which I have responsibility having to listen to it. You can go and rant about it as much as you want in the glass cage over there; I don’t have to listen to it. Now, I’ve experienced this with a parent at one of my son’s football matches. Every time I started talking to him, he would bring the conversation around to anti-vax stuff or, you know, immigrants or something. And it got to the stage where I didn’t say I didn’t want to talk to him, I said, I don’t really want to talk about this stuff when I’m just here watching football. And he said, well, I don’t want to talk to you then, and walked off. Um, which is interesting. And he’s kind of avoided me since then, which is.

Laura Hilliger: [00:43:00] That’s interesting, because there’s a similar occurrence in my circle, where someone I know, their, um, what’s the word, their line manager, I guess I would say, was constantly talking about anti-vax stuff and Covid theories and stuff. And my friend didn’t really know how to deal with this, because it was a line manager, and so they didn’t want to, like, piss them off, you know, because their job was on the line, or they felt like their job was on the line. Whether or not it actually was is debatable. But after having to deal with this kind of conversation from their line manager over and over, they finally were asking me and we were talking about it, and they were like, do I quit, or should I say something, because I can’t listen to this anymore? And I said, well, if you get fired for saying something, then that’s a) the same result as quitting, and b) actually illegal. So, you know, in the end they decided to say something. They said something about it, and the line manager just stopped talking about it, went back to only work, just let it go completely. And I thought that was actually a rarity. Like, I was really impressed. Yeah, they just.

Doug Belshaw: [00:44:28] I think too often, and that is a perfect example of someone being in a position of power and using that power, even if they didn’t realise it, to be the one talking, or setting the boundaries for the conversation, information environment, whatever you want to call it. And I think in general we don’t challenge that enough and say, I don’t want to talk about that, or that doesn’t interest me, or whatever. And I’m a bit like you, Laura, from experience. I’m not really bothered if I piss people off, to be quite honest. I’d rather say something and just, you know, be right up front with people. So I’ve really annoyed family members, I’ve annoyed that guy, whatever, but I’m not going to spend my short time on earth having to pour energy into refuting bullshit.

Laura Hilliger: [00:45:16] Mm, yeah, I definitely think that we don’t see enough of that kind of example, where people just say what they need and stand up for themselves. But I also think that it is difficult from a resilience perspective, from a personal worth perspective, because if you’ve done that, as I have, stood up for myself and said something, and then been, you know, brutally rejected, slammed down or otherwise marginalised in the response, then the next time that you want to stand up for yourself it’s a little bit more scary. And over time, I can certainly understand why people kind of put up with crap that they don’t want to put up with.

Doug Belshaw: [00:46:07] Yeah, no, I totally get that. And I’m speaking as, you know, someone playing life on the easiest difficulty level. I am a white, straight, middle-aged man living in the West. I totally get that. Um, in which case I should be doing it even more. I should be saying, that is bollocks, we’re not having that, thank you very much. The difficulty, of course, is if you’re in a room and you feel you’re willing to say that, but other people aren’t, what gives you the right to speak on other people’s behalf? You can only speak on your own behalf, and it becomes fraught with difficulty.

Laura Hilliger: [00:46:39] Yeah, I’ve definitely gotten myself into trouble with that one, for sure.

Doug Belshaw: [00:46:44] Right. We should probably wrap this thing up, but I think we would be remiss if we didn’t talk about your favourite conspiracy theory.

Laura Hilliger: [00:46:52] Oh.

Laura Hilliger: [00:46:53] I don’t have a favourite conspiracy theory, but I am going to include a link to birdsarentreal.com, which is a group of young activists who are trying to convince the world that birds aren’t real. No, actually, what it is, is they’re using conspiracy theory to deal with disinformation. And I think it’s really funny and cool. Basically, they started this accidentally. One of the founders was at a women’s march or something, and there was just a lot of stupid bullshit on some people’s posters, and he got really frustrated. And so he decided to just flip his poster over and write Birds Aren’t Real. And people started to come up to him and ask him what he meant, and he just kind of bullshitted: oh yeah, you know, there’s no birds in the United States anymore, they’re all surveillance drones, blah, blah, blah. And people believed him, because that was the atmosphere of the event. And so he started this organisation that is designed to help family members of people who are affected by misinformation, or who have fallen down conspiracy rabbit holes, find their way out. Which I think is a really cool way to, well, I didn’t mean for this to become a Birds Aren’t Real promo. It’s just that I think it’s a cool project. Unfortunately it’s not a not-for-profit as far as I can see, so I don’t know what the real motivations are, but I think it’s cool that people are trying to help other people get their family members out of some of the conspiracy theory stuff.

Doug Belshaw: [00:48:33] Yeah, it’s funny. And like, I don’t want to be that middle-aged guy kind of going, oh, well, you know, this could have serious consequences or whatever. But if we go back to that misinformation matrix and, what was it, the parody or satire, this is firmly in parody and satire. Yeah, but it’s the kind of thing which could easily be thought of as actually being true. Yeah. I mean, I go back to my son, and he’s now 15. About five years ago he was in the back of the car and said, Dad, did you know that there’s seven places on Earth where there’s no gravity? And I was like, what? Yeah, it said so on YouTube. And I was like, right, okay, I’m going to have to sit you down and explain that not everything on YouTube is real and some people are malevolent. And I use that as a shorthand for, like, yeah, you haven’t always got the critical thinking skills to be able to operate in the adult world, son. I use that as a kind of shorthand for that these days. But it’s interesting, isn’t it, when you see stuff like that, things which were obviously meant as a parody being taken as true by children or less informed people.

Laura Hilliger: [00:49:51] Yeah. I don’t feel like we should end on that note, though.

Doug Belshaw: [00:49:54] No, we shouldn’t. We shouldn’t. What else should people look at if they are interested in helping themselves or other people around this? I feel like we haven’t talked about one thing in particular that we wanted to talk about, which was the kind of triangle between personal responsibility, societal responsibility, and kind of educational responsibility, duty of care, parental responsibility or something.

Laura Hilliger: [00:50:22] Well, we talked a little bit about it, but in a sort of roundabout way. I mean, I think that we could end by giving people the advice that they should pay attention to harm and harm reduction, and be open-eyed, clear-eyed and conscious about their own resilience, as well as other people’s resilience, when dealing with misinformation. It’s also kind of a roundabout way to say, just try not to be an arsehole.

Doug Belshaw: [00:50:56] Interestingly, I had to mute myself there, because right past my office, on the road which goes up to the school, what sounded like two fire engines went flying up with their sirens blaring. Now, I’m assuming that they’re doing a fire alarm. But when I was at school, the kids didn’t want to be in lessons, and I can remember, over a two-week period, us being on fire drill more than we were in lessons, it felt like. Now, the reason I mention that is because it’s a bit like the boy who cried wolf, isn’t it? You can think, well, you know, misinformation, satire, a bit of disinformation, whatever, it doesn’t matter. But it pollutes the environment, it reduces trust, it erodes the normal functioning of society. So I don’t know if that’s a bit of a tenuous link. It probably is, but it literally just happened, so I thought I’d mention it.

Laura Hilliger: [00:51:45] Yeah. No, I don’t think it’s a tenuous link at all, especially because you just used the Boy Who Cried Wolf, which is a very old aphorism that is about exactly this topic, you know, about what happens when somebody does spread misinformation. And that’s a very simple example: nobody believes the boy anymore and he gets mauled by a wolf, which is dark and twisty. But, you know, maybe we can just end by reminding people that this is complex stuff, but wrapping your brain around it is probably a good thing to do at this point in time, because there is a lot of misinformation and there is a lot of disinformation going around in the world today. So here’s a reminder to double-check your sources.

Doug Belshaw: [00:52:34] And yeah, my final advice, for what it’s worth, is: if you’re feeling the burden of trying to correct people who are wrong on the Internet, maybe just think, do I need to do that? Do I have the kind of relationship with this person, do I have any relationship with this person, where I feel like facts might actually make any difference to their emotional response to certain topics? And if not, press the block button. Yeah, because it’s probably the best thing to do.

Laura Hilliger: [00:53:07] I feel like that was very good advice. So thanks for listening. We will talk to you later. I think we’ve said in previous episodes we’d love to have some listener feedback. So if you got all the way through this episode and you want to tell us what your thoughts on misinformation or anything else are, do send us a note. Get in touch. We’re all over the internet. Bye for now!

Doug Belshaw: [00:53:31] Cheers!

"