Rebroadcast of Mis- and Dis- Information with Nasma Ahmed
Season: 4 Episode: 1 [Rebroadcast of S01 E08]
Nasma Ahmed is the Director of the Digital Justice Lab and she helps Elizabeth unpack what exactly mis- and dis- information are, why we need to question content we see online, and how a lack of trust in larger political systems plays in. Additional Resources:
Marwick and Lewis with Data & Society also have a helpful report: Media Manipulation and Disinformation Online.
Episode Transcript: Rebroadcast of Mis- and dis- information with Nasma Ahmed
Read the transcript below or download a copy in the language of your choice:
Elizabeth: [00:00:05] Welcome to Wonks and War Rooms, where political communication theory meets on-the-ground strategy. I'm your host Elizabeth Dubois. I'm an associate professor at the University of Ottawa, and my pronouns are she/her. Today, I'm recording from the traditional and unceded territory of the Algonquin people.
[00:00:19] Alright, today we're kicking off our season on mis- and disinformation, and we're actually going to start things off with a throwback to season one, where I spoke with Nasma about how we actually define and identify mis- and dis-info. The conversation was in the context of a recent election in 2019, and we talked a lot about how mis- and disinformation flows or doesn't flow, and the kinds of different actors that might be involved in mis- and disinformation campaigns. And while a lot of that was actually contextual to the time back in 2019, a lot of it does actually still hold true today. In the rest of this season we'll be talking about and complicating this idea of mis- and dis-info by thinking of things like political polarization, post-truth politics and a variety of other concepts and theories. But, I figured it'd be great to get us all on the same page with a rebroadcast of that initial mis- and dis-info episode. So, I hope you enjoy it and check back next week for a brand new episode on information disorder with Claire Wardle of First Draft. Alright. Take it away, Nasma.
Nasma: [00:01:27] My name is Nasma Ahmed, I am based in Toronto. I run something called the Digital Justice Lab, which is focused on building towards a more just and equitable digital future. Very broad work, but really just trying to find ways that we can build capacity together, reimagine the possibilities around the technology that we use and recognize that our fights around technology are actually interconnected through the fight for economic justice, racial justice, et cetera. So that's the work that I do.
Elizabeth: [00:02:00] Awesome. Thank you so much for being here today. I want to talk about disinformation, misinformation, the dreaded fake news term, which I, as an academic, find incredibly frustrating. I'm going to guess that you have heard of these terms before. Yeah?
Nasma: [00:02:16] Yes, I have.
Elizabeth: [00:02:17] Awesome. Okay, so I'm going to do the real quick rundown of the definitions we use in academia, and then we'll see if that tracks with your understanding. So disinformation is essentially information that is fabricated or deliberately manipulated. It might be audio, it might be visual, it might be text based, and it's intentionally created to kind of make people believe something that is not true.
[00:02:43] This idea of disinformation has sometimes been called fake news, but fake news became a bit of a catchall term when this became an issue that the general public was concerned with. And so fake news is typically not used very often in academia because it's not very precise.
[00:03:03] Misinformation, in contrast to disinformation, is unintentional mistakes. And so that might be, you know, inaccuracies in a photo caption, dates or statistics that have been misprinted. Often we think about misinformation in the context of a journalist who has accidentally reported something that was untrue, and then they have to kind of report a retraction later or a modification later.
[00:03:27] We also have the notion of mal-information, which I think is relevant here: that is information that is intended to harm. Some disinformation has an intent to harm, but not all of it, [whereas] with mal-information, no matter what it looks like, the whole goal is to harm or hurt somebody else - deliberately changing the context or date of genuine content in order to confuse people, for example.
[00:03:55] There is an example from the last election where there were some people tweeting, "Oh, remember to go vote on Election Day!" And then they said the wrong date, because they were intentionally trying to get people not to go cast their ballots. So do these different definitions track with your understanding of those terms?
Nasma: [00:04:13] Yes, it does. And I think there's also a really interesting in-between when we think of, for example, disinformation, and then someone sharing something where they didn't intend to mislead. Like, was it disinformation that they shared? But it's not that they viewed it in any way, shape or form as false information. So I think there's a weird in-between there as well.
Elizabeth: [00:04:39] Yeah, for sure. I mean, there's a few different ways we can see in-betweens and gray areas. One is the one that you've just described: there's the person or entity that posts the thing in the first place, and they might have intentionality behind it. They might be intending to spread disinformation, intentionally saying information that is incorrect. And then we - using social media largely, but other formats too - can share that information. And we might have just been duped, right? We might not have realized that it was in fact incorrect. We thought that it was correct and therefore shared it, or we didn't really think at all about whether or not it was correct. So we don't have our own intentionality behind it, but, nevertheless, information that is false spreads through our systems.
Nasma: [00:05:30] Yeah, exactly. And I think that's been extremely hard, especially during COVID-19.
Elizabeth: [00:05:34] Yeah, completely. It's really challenging because of the ways that we have gone about, as a society, trying to deal with disinformation, particularly in the context of a global health pandemic, where we need people to have accurate information - and often it changes, and it changes quickly. In those contexts, a lot of the focus has been put on "Let's curb the spread of disinformation rather than getting rid of the information in the first place." And so a lot of the focus has been on, "OK, well, let's just make sure that the people who are saying masks don't do anything, or COVID-19 isn't real, don't get as much airtime. They don't get as much push. They're not going to be at the top of all of our Facebook timelines, or in the trending topics on Twitter." Like, the goal is to keep them out of the public eye, make their messaging less salient, less important. But then to be able to do that, you have to have figured out what that content is.
Nasma: [00:06:40] And essentially what we're trying to do in many cases - or I guess what a lot of the practices are - is "Let's drown it out," essentially, right? Like, how do we share the quote/unquote - and I say quote/unquote because it's like - the "right" information, for example, from public health, right? And so, you know, a lot of us do that even in our own personal circles by just constantly sharing the information that we deem as accurate, whether it be from public health or et cetera. Just constantly sharing it as a means of being able to kind of curb the other stuff that people might see.
Elizabeth: [00:07:16] Yeah, totally. The idea behind it is this pretty common argument that "The only antidote to disinformation is high quality information," right? And that's problematic for a few reasons. It's problematic because we know that once disinformation starts to spread, it often goes like wildfire. And so we end up in a situation where disinformation can reach a really wide audience really quickly, and more authoritative information - or information we trust - doesn't spread as quickly or as far. And then we also have a problem because disinformation often gets shared because it strikes some sort of emotional chord, or it strikes us as probably being true. It fits conveniently into our existing world views. But the other kind of information we have might not be quite so convenient or comfortable to incorporate into our lives, and so we're less likely to pay attention to it and want to bring it on board from the get go. Even if we have all of our friends doing their best to say, "You need to wear a mask, we now know that masks are helpful. Please put a mask on," you know?
Nasma: [00:08:37] Yeah. And I also think that it allows us - it's a humbling opportunity, right? Because we often want to think that it's other people, as you said - other people who are experiencing this, not us. And I think that when it comes to the misinformation and disinformation side of things, especially around COVID and the pressing issues that are occurring in our world right now, it is important for us to all acknowledge that there's a reason why we can get caught up in this. People always think it's like common sense - how would you not know this? And I think that that's where we have to remove that kind of judgment that occurs as well. Because people are trying to understand the world that they exist in, and a lot of the things that are happening often seem unfathomable and so far fetched. So you're trying to find a semblance of balance with whatever that looks like for you, right? Or a semblance of understanding how the world works. And so I think that that's been a very humbling experience across the board of like, hey, I had also been caught up multiple times in things that I thought were real and were absolutely not real. And that could be maybe because I just wanted to think that it was real, which is also totally true. I think part of how we're able to deal with the issues at hand is acknowledging that all of us can be trapped in that very quickly, right? Whether we're just too tired to look it up, whether we just don't want to read the article, whether all the things. Not everyone is going to be fact checking all the time - like, who has the time to do that all the time?
Elizabeth: [00:10:09] Yeah, absolutely. There's a limited amount of time that we each have. You also mentioned at one point, you know, wanting something to be true and then believing it, and that plays into this idea of confirmation bias that has been studied in psychology for a very long time. We know that people pay more attention to the things that confirm their existing beliefs, and they kind of push away things that don't. We also know that people seek out information to support their hunches and their inklings, and they don't seek out information that doesn't support those things. We know that there's this idea of cognitive dissonance, where when we are faced with information that doesn't fit with our beliefs, we kind of ignore that what we think doesn't match with reality, because that's really uncomfortable and mentally difficult to manage, right? Like, it takes a lot of energy and time for us to manage a mismatch between what we believe and what we're seeing in the world. And so often it's just easier for us to go about our day ignoring that mismatch.
Nasma: [00:11:20] Yeah. And I also think that this is where the interesting piece comes in around trusted sources. We are in a place right now where I actually think that people have the right to not trust certain entities, given the political climate or whatever the case might be. I understand where the cynicism or the lack of trust comes from, because how these entities were actually deemed trustworthy to us is another conversation. But, you know, I think that trust component also makes it very hard for us to believe that we are getting the two sides of a story, for example - or even to understand that sometimes there aren't two sides to the story. Sometimes there's also right and wrong, and that's, once again, based off of our own ideas of what is right or wrong, which makes it very difficult.
[00:12:08] And so I think that the reckoning we're dealing with right now is not only a deliberate campaign across the board that is meant to heighten emotions and make us question absolutely everything - and there's a piece of that that is important, right? We should question where we get our information; we always should. But there are also the limitations that now exist with what we're deeming as trustworthy sources - sources that we may not always view as trustworthy, right? Like, we may not feel the same way about CBC that we feel about all the other media entities.
[00:12:44] And so I think that's what's making it very hard, even now, to be able to discern what information you're receiving. Because even our ideas of trust in the entities that we used to deem trustworthy are shifting, right? And that could be through, obviously, the disinformation side of things, which is deliberate, but also due to the lack of accountability measures or transparency - all those other pieces - which is actually, I think, a very valid critique that can be posed against what we deemed trustworthy institutions.
Elizabeth: [00:13:19] Yeah, absolutely. I think this idea of trust is a really crucial one when we're trying to understand what misinformation is, what disinformation is, and whatever the contrary to that would be. And, you know, trustworthy is one way that we phrase it. Another argument has been made that it shouldn't actually be about trust; it should be about whether sources are authoritative or not. And so you might have noticed that Facebook has started talking about using authoritative sources to label content which may have been flagged as misinformation or disinformation on their platform. And the idea there is: we're not telling you who to trust or not. We're telling you that there is some authority that exists that has contrary information to the thing that just showed up on your screen. What do you think about that, in terms of this idea of trust versus authority?
Nasma: [00:14:13] Yeah, I was actually looking - I wanted to do a deeper dive on that, because what is the difference between authoritative and verified? I mean, it's very connected to verified information, right? The idea that there is a source that allows us to verify, which then gets very messy. You know, Facebook is trying to figure out, OK, what is deemed authoritative? I don't know. Does that mean it has to be peer reviewed? Does that mean that you have to have a very specific political source? What type of source is it? Right? And so I think that this is where obviously the information battles continue to exist, right? And I think that one actual feature that I know Twitter's been testing out, which is more to help with the spread - which I actually think is really, really good, because I've caught myself in that sometimes - is asking whether you've actually read the article. That's also another thing that, once again, puts a lot of the onus on the user, which I think is important, right? We have a responsibility to ensure that we're not sharing information that we have not consumed in some way, shape or form.
[00:15:18] And I think that's been a really interesting way of, like, OK, have you read the thing? And if you read it, will you still share it? I'm really curious to see what the stats will be as they move forward with those features. Does that mean, even with Facebook, right, that people will still share it?
[00:15:33] Yeah. You know, I'm just really curious to see if that actually psychologically impacts anything. Because, for example, when you're having a feeling, right, and you're like, Oh, whoa, something is being triggered in me - therapists often say, give yourself a couple of deep breaths before you take an action, or count to 10. There are all these different tactics. I'm curious to see whether flagging it, or asking "Oh, have you read this?" - those kinds of moments - actually provide that same kind of trigger that we already know in psychology comes from stopping for a second before you move and take an action.
Elizabeth: [00:16:18] Yeah, I think that analogy works - you know, take some deep breaths so that you can calm yourself, recenter yourself before you say the thing or react in whatever way you're going to react. These cues - we sometimes talk about that as creating friction in the system: slowing things down, creating a bit of a "Oh, you should go check. Hey, have you done the thing that everyone tells you you should do? Did you read that article?" That's got potential for sure. But we also know from academic studies that people are really bad at remembering what the source of information was on social media - they don't pay attention to who actually posted a thing, or what outlet it came from, when it's in their Facebook feed. And so I worry that people might just skim over those labels and not pay attention to them at all.
Nasma: [00:17:15] Yes, and I think that's part of - once again - a lot of onus on the user, given that these platforms are meant for mass consumption and consistent, fast consumption. And I understand where your concerns come from, because even I probably will ignore it, right? And so I think that's where I try my best to constantly remind myself that no matter how much we know, we still get trapped in these tools. And the reason why is because they were designed that way.
Elizabeth: [00:17:49] And these platforms serve a purpose in our lives. They curate information, so we do end up having at least less information to deal with. But as you were saying, there's a constant stream of it. There's always more that we could consume, could gather. And so we have to make these decisions about how to spend our time, and the platforms are designed in ways that really structure how we interact with this information. So these platforms have some power over the extent to which misinformation or disinformation shows up in our feeds, and the extent to which they repeatedly show up.
[00:18:26] We also know from psych research that when we see the same thing multiple times, from multiple different people or sources, we are more likely to believe it. And these platforms are structuring the information that they feed back to us so that we stay on their platforms longer, because we think their information is good or entertaining or interesting or whatever. The more they can provide that service to us, the more money they can make off of us through advertising. I have a question for you, though, about zooming back out to the idea of misinformation and disinformation, even beyond our social media uses. Do you think that misinformation and disinformation impact people equally? Are there divides within society of who is most or least impacted by it?
Nasma: [00:19:16] Think about if you're worried about misinformation and disinformation campaigns - essentially disinformation campaigns that are moving to, like, stop you from using masks, for example, right? And you were from a poor community - because we've seen with COVID, right, the people who are most impacted by it [are] folks who are Black and brown, people who are from long-term care facilities. And so I'm saying it's obvious in the context of COVID, but also with other factors, that it does negatively impact certain people - it's meant to do that, right? It's meant to be disinformation. It's meant to further marginalize, or further radicalize, or it's meant to create massive fractures across the board. And the fascinating thing is, I'm not going to lie, I kind of avoided the misinformation and disinformation space for a while. I just kind of felt like it was already normal in my life. Like, I remember the WhatsApp videos I would get, and I was like, Oh yeah, yeah.
[00:20:20] Like, we know that people use propaganda - you know, we're kind of taught that very young. And so I was like, Oh, it can't be that big of a deal. But what I think is important is that we know disinformation campaigns have existed obviously prior to the platforms, right? And I think that is something I remind myself often, because it is a tool, right? Media is a tool. Information is a tool, right? I think people are still seeing it as kind of new to them, right? As if this hasn't happened before. And for me, that grounding of like, Oh, these are tactics, right? These are tactics meant to dissuade us. These are tactics meant to anger us. These are tactics meant to further separate us or make it harder for us to see each other. It makes it easier for me to even compartmentalize what's happening now, if that makes any sense - the understanding that these are old tactics that have only increased. And how do we then fight back against them?
Elizabeth: [00:21:19] Yeah, that makes a lot of sense, and I really like that you've brought up the point that these are tactics. I think that at first, when we were starting to deal with misinformation and disinformation on social media, the conversations were really about, "Oh my gosh, people have so much access to information that's not true." And the conversation was around fact vs. opinion. Are we losing the fact-based nature of the core of our arguments? And how does the legal system function if there are no facts, right? And then we switched gears a little bit and are now starting to recognize this is not a new problem. In fact, trying to manipulate the view of what we commonly agree on as fact - whether or not that fits with your idea of "this is always true" or "this is the true thing in the moment" - the sort of manipulation of that base understanding has been a common tactic in politics for decades and decades and decades. We've seen it all over the place.
[00:22:29] You also mentioned propaganda. And I think it goes back to those conversations too: creating communication products and pieces of information very specifically to try and change people's opinions on a thing, to create cleavages within society, to create moments of disagreement and discord that kind of put people off their center of gravity and make them recalibrate. That kind of propaganda has been used for good and for bad, if we need to categorize between good and bad. And it continually leads people to confront the reality that their view of the world is impacted immensely by the content they see in media, that gets shared to them through social media, and that their friends talk to them about.
Nasma: [00:23:22] Exactly. And I think that the difficulty we're experiencing now - and in all honesty, on my end too, right now, even with the academic texts and the articles that have come out about this - what I've struggled with, even in my understanding of digital literacy and media literacy, is that I think the questioning is actually important. And we also say this in all digital literacy: it's important to question your sources, right, even if it's a trusted source. It is important to question.
[00:23:50] And I think that as we're unraveling, more people are getting access to information they might have never had before. You're hearing sides of the story that have never been shared, because of popular media, because of who's heard and who is not. I understand the questioning that's occurring as well, and I feel the same way when we're seeing a certain article, or the type of titles that are tied to it, and you're like, Wait, what's going on here? You know what I mean? And so I don't think that we should ever stop the questioning per se. I think that it is always important for us to navigate and understand where things are coming from, and also an acknowledgment that what we have deemed as trustworthy institutions were deemed trustworthy by a certain population, which is a hard one.
[00:24:46] It's also hard to acknowledge that, because it's the powers that be - it's like the white upper middle class that would say "this is fact". And so I think that that is also the difficulty, where people are in this space where they want to question, which I think is important, right? Like, "Hey, what is the other side of the story here?" And then also acknowledging that there are institutions that are hopefully meant to have neutrality, or whatever the case might be, to be able to share the information that they need to share. And so I think we're just dealing with a lot of factors, like a lot of different moving wheels at the same time. And I think that's where it's become so overwhelming - that overwhelming feeling of, wow, I can't even trust. Like, for example, not everyone has trust in the government. So that's another piece. But like, I can't trust the government, but I also can't trust this.
[00:25:39] And I also think, you know, people are quite overwhelmed across the board, even if you are, quote/unquote, educated or, you know, asking all the right questions. Even I feel like - I remember there are so many times where I listen to The Daily from the New York Times, and every single time I listen to it, I'm like, "You know, I probably should read more about this topic," but I'm like, "I'll just understand the topic based off of what they've shared in the twenty-five minutes." But every time there's an episode, I'm like, I probably should read more about this and actually have a better understanding of this issue. But I'm like, "I do not have the time or the energy." And I think that that is the part that sometimes overwhelms me, right? Because I'm like, I'm probably not getting the full story. I'm not saying that they're giving me the wrong story, but I'm probably not understanding a bunch of other pieces that are moving at the same time. But I also think - do I need to know? Sometimes I think about that. I'm like - do I need to know? Maybe I just won't ever repeat this information. That's honestly where I've been, where I'm like, I'll consume something, and sometimes I just don't repeat it. I just let it be, you know - it's something that I've consumed that might impact me in some way, shape or form moving forward. But I haven't even done the research to understand the issue at hand. So I'm just going to let it be in my brain.
Elizabeth: [00:26:56] So, like, you don't want to be that node that shares it more widely. It'll, you know, stick there in your brain, but you just aren't going to be the thing that continues the expansion of it.
Nasma: [00:27:06] Yeah, because I also just don't know, right? There's so many things we just don't know about the world. There's so many things that I consume where I'm like, "Hey, I have not read this tax policy from, like, 1962 - I have no idea." Like, I haven't actually read the primary source document, so I have no clue; I was reading someone else's assessment, you know, of the primary source documents. And so I think that that's the difficulty that we're all in. And so I think that that's the struggle. No matter how much we read, we will not know everything, obviously. And the great thing to say, and my favorite thing to say, is: I don't know. I really don't. Yeah. And that's okay.
Elizabeth: [00:27:50] Exactly. That is OK. We have to be more OK with saying, I don't know. Because a lot of these structures put the onus on the public to think for themselves, be informed, do their own fact checking, notice when there's a new label on some piece of content - that is a lot of work to put on individuals who all have their own jobs and lives to live, right?
[00:28:13] Like, we use the idea of a free press in democracy as a really important tool to help citizens become aware enough of their political system that they can make informed decisions on voting day, but also so that they can go ahead and live their lives fully and not have to be political experts. And that's just one example of why we rely on information intermediaries to help us discern what is useful information and what isn't.
[00:28:44] You know, beyond voting, there's lots of other examples, but I think it is pretty crucial and something that we lose track of a little bit in these conversations about how to deal with disinformation and misinformation, because we want to focus on something that's actionable, and media literacy feels actionable. But it really does put a lot of expectation on people who have not necessarily been trained to be that critical of information. They've got lots of other things in their lives, and they might not have the tools. And we might also be putting people who are already marginalized in a system at more risk, because we're asking them to go above and beyond when they're already not being catered to in the information environment.
Nasma: [00:29:33] But that's actually the really important part, right? They might not be catered to. And I think that that's part of the difficulty right now, when we're talking about, hey, what are these trusted institutions - trusted by whom, right? And what are we deeming as trustworthy or verified or fact? We know that the way information has been shared in the past is by the more powerful, right, or the one who has access to the academy, or the one who has access to XYZ. And that is what gets deemed fact.
[00:30:00] And so, you know, I think it's important to note, with the questioning that is occurring - whether it be of democracy, the bigger questions at hand, the societal issues that we're experiencing today - that I don't think we have to put the questioning of these systems down to just the mass disinformation that's occurring. I think there's a reason why we're now questioning that's not tied to the obviously direct campaigning around disinformation, but is the questioning of: why is this deemed fact, right? And who gets to deem something as fact, when we know the systems that have kind of built these things - and I know I'm talking in very broad generalizations here. But if we're talking about health care, right, we know that if you are a white person experiencing the universal health care system, it is a different experience compared to if you are a Black person, right? And so there are all these things that we do know, and we've been able to understand, and we've been able to build facts on.
[00:30:59] And I think it's important to recognize that this questioning right now is not just because of disinformation. It's also because people now know more, right, and are actually getting valid information - I would say what we call quantitative data, right, about the qualitative experiences that they've already been talking about, right? And, you know, it has still not been listened to.
[00:31:22] And so I think that there is a piece of the questioning that, you know, I think is actually more about the power part, right? And that is something that we also have to reckon with. And I think we all are, all across the board - whether it be our relationship to media, our relationships with government entities, or our relationships with the academy - it's not just the disinformation side of things, right? It's actually, hey, are you even serving us across the board? And we don't think you are, right? And so I think that's also the very fascinating part being tied into the disinformation and misinformation piece: everyone struggling, like, "Hey, why aren't you believing us?" And everyone's like, "You haven't served us before, so why do you think we're going to believe you now?" You know, so it's a very interesting space to be in as well.
Elizabeth: [00:32:07] Yeah, absolutely, absolutely. I could talk for ages about this, but we are coming up on time. And so I wanted to finish off with one last question for you, and it's a quick little quiz. Can you tell me the difference between misinformation and disinformation, and some version of, I guess, trustworthy information? We've used the word trust a bunch.
Nasma: [00:32:33] Yeah. So the big difference between misinformation and disinformation is that disinformation is about the intent - it's deliberate. It's deliberately shared, whether it be to make you feel a certain way, validate something in your brain, all of that stuff. It's deliberate. And misinformation is shared without the actual intent to mislead or to sway you a certain way. I think the intent piece is the big part of the difference between disinformation and misinformation.
[00:33:07] And then if you're thinking about, for example - you know, I always struggle with trustworthy or verified, especially. I feel like this is why there are communication scholars to really think about this, because even I could go into a rabbit hole. But when we're thinking about trustworthy or verified information, that's kind of like the facts that are being shared - and once again, what we deem fact becomes a bit of a question - but it's accurate information that in most cases has sources, right? Whether it be primary sources, oral sources, historic sources, peer-reviewed sources. And obviously we can go down the rabbit hole about that. But it is information that's being shared that has a direct source - that's kind of how I view trustworthy information or verified information.
Elizabeth: [00:33:59] That's awesome. Thank you. Those are really great definitions. And you're right, the idea of trustworthy or verified or authoritative is difficult. It's something that a bunch of communication scholars in academia are all trying to figure out - what's a good definition of it - and that's evolving over time. We don't have one clear definition that everyone agrees on at this point.
[00:34:28] All right, thanks for tuning in. That was our episode on mis- and disinformation. If you want to learn more about any of the concepts or theories we talked about today, go ahead and check the show notes or head over to polcommtech.ca. This special season on mis- and disinformation is brought to you, in part by a grant from the Social Sciences and Humanities Research Council of Canada and the Digital Citizen Initiative.