Digital literacy and the mastery of evaluating information online

[Image: a Dungeons & Dragons-style mimic sitting next to some papers and a screen]

The mimic is one of Dungeons & Dragons’ most iconic monsters: a protean creature that can take on the appearance of an inanimate object such as a chest. If an unwary adventurer attempts to open the chest, it springs to life and attacks. Savvy adventurers learn that not everything is as it seems and that they need to pay careful attention to their surroundings if they hope to avoid an ambush. Alas, the Internet can feel like a dungeon filled to the brim with mimics. Anyone can be taken in by misinformation or disinformation, which is why it’s imperative that we carefully evaluate the information we find online. This is particularly true if you’re an authoritative content creator, as drawing on false information will inevitably erode your credibility.

What is digital literacy?

Before delving into the topic at hand, let’s recall some key facts about digital literacy. Digital literacy has three dimensions: Cognitive, Technical, and Social-Emotional. It can also be broken down into skills such as Finding, Understanding, Evaluating, Creating, and Communicating. Today, we’ll be focusing on the third of these skills: Evaluating. For more information on digital literacy, check out this post or this one.

The digital age: a double-edged sword

The Internet offers a vast array of information on almost any topic under the sun, from animated videos on the origins of Nichiren Buddhism to dictionaries of dead languages. This is a blessing and a curse. On the one hand, it has undoubtedly democratized knowledge. Take the Georgian Papers Programme (GPP), for example. Its goal is to digitize 425,000 pages of material from Britain’s Royal Archives and the Royal Library relating to the Georgian period (1714-1837). Traditionally, this material was only accessible to a rarefied group of professional scholars, but the GPP is making it available to anyone with an Internet connection. In a similar vein, many books that were once only found in the collections of major research libraries can now be downloaded by anyone thanks to Google Books.

Of course, this does present challenges as well as opportunities. While something like the GPP is obviously authoritative, the Web makes it easy for false information to be given a patina of authority. Wikipedia is a treasure trove of information, but anyone can contribute to it, meaning it’s susceptible to many different types of manipulation. For example, in 2011, former American vice-presidential candidate Sarah Palin mistakenly claimed that Paul Revere rode through Boston ringing bells to inform the British that the colonists were willing to fight. In reality, Revere’s famous ride was a clandestine affair meant to warn the colonists of the movements of the British army. Palin’s supporters then repeatedly edited the relevant Wikipedia page to make it reflect her comments. Wikipedia has mechanisms to deal with this kind of issue, but the sheer volume of information on the site makes it impossible to police every edit. High-profile pages such as those about celebrities or politicians already have lots of eyes on them, making it easier to root out false information (especially if the information in question is an obvious attempt at trolling), but pages that cover esoteric subjects don’t attract the same kind of vigilance. The average reader probably doesn’t think about this, though, and so assumes that everything they see on Wikipedia has received the same level of scrutiny.
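As an aside for the technically inclined, one quick way to gauge how much attention a given article actually receives is to peek at its edit history. Here’s a minimal Python sketch, assuming the third-party requests package, that pulls a page’s recent revisions from the public MediaWiki API; a page touched by many distinct editors recently generally has far more eyes on it than one that hasn’t been edited in years.

```python
# Minimal sketch: count recent edits (and distinct editors) on a Wikipedia page.
# Assumes the third-party `requests` package; the MediaWiki API itself is public.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def recent_edits(title: str, limit: int = 50):
    """Return (timestamp, user) pairs for a page's most recent revisions."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user",
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API_URL, params=params, timeout=10).json()
    page = next(iter(data["query"]["pages"].values()))
    return [(rev["timestamp"], rev["user"]) for rev in page.get("revisions", [])]

if __name__ == "__main__":
    edits = recent_edits("Paul Revere")
    editors = {user for _, user in edits}
    print(f"{len(edits)} recent edits by {len(editors)} distinct editors")
```

The numbers alone don’t tell you whether a page is accurate, of course, but they’re a rough proxy for how much vigilance a given article attracts.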

Bear in mind that there’s a distinction between misinformation and disinformation. With the former, the person making the statement genuinely believes it to be true. Perhaps they’ve simply misremembered something they read, or they could have learned the material from an outdated source. With disinformation, on the other hand, there is a deliberate intent to deceive. The person responsible knows they’re relaying false information. They might be motivated by a simple desire to troll, but they could also have far more malign intentions. Both misinformation and disinformation are a growing problem, as they make it more difficult for people to know what’s true and what’s false. 

The emergence of generative AI has added another variable to the equation, as it offers both new opportunities and new challenges. If you’ve ever played around with something like Imagine.art or Craiyon, you know that these tools can be used to create realistic-looking images of impossible things. Before Donald Trump was actually arrested in 2023, a plethora of AI-generated images circulated that purported to show him being taken into custody. At first glance, this might seem like nothing more than wish fulfillment or catharsis, but it has a more sinister side. If someone discovers that a genuine-looking image is actually a fake, it can become harder for them to accept images that are real, since the seed of doubt has been sown.

How to evaluate digital content

Ultimately, you are your own best defense against falling for inaccurate information. It’s vital that you approach content with a skeptical eye, even if it appears to align with your own views. Consider the following:

  • Is it original?
  • Where did it come from?
  • How old is it?
  • What’s the motivation behind it?

Luckily, there are a number of tools you can use to help answer these questions. Let’s say you’re trying to decide if something is original. Grammarly or Originality.AI can help you tell whether something has been plagiarized (the latter also checks to see if content is the work of generative AI), while a reverse image search on TinEye can show if an image has been repurposed from another source. The Wayback Machine can help you determine how old something is, and an answer engine such as WolframAlpha draws its results from a curated database of authoritative sources.
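To make the “How old is it?” question concrete, here’s a minimal Python sketch, assuming the third-party requests package, that asks the Wayback Machine’s public CDX API when a URL was first captured. It’s only a rough signal (not every page gets crawled promptly), but if a supposedly breaking story was archived years ago, that’s a strong hint it’s being recycled.

```python
# Minimal sketch: ask the Wayback Machine's CDX API for a URL's earliest capture.
# Assumes the third-party `requests` package.
import requests

CDX_URL = "https://web.archive.org/cdx/search/cdx"

def earliest_capture(url: str):
    """Return the timestamp (YYYYMMDDhhmmss) of the oldest capture, or None."""
    params = {
        "url": url,
        "output": "json",
        "fl": "timestamp",  # we only need the capture timestamp
        "limit": 1,         # results are oldest-first, so one row is enough
    }
    response = requests.get(CDX_URL, params=params, timeout=10)
    rows = response.json() if response.text.strip() else []
    # The first row is a header; the second (if present) is the oldest capture.
    return rows[1][0] if len(rows) > 1 else None

if __name__ == "__main__":
    print(earliest_capture("example.com"))  # e.g. a timestamp from the early 2000s
```

A missing result doesn’t prove something is new, and an old capture doesn’t prove it’s stale, but it’s one more data point to weigh before you hit share.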

To see what this means in practice, let’s think about our hypothetical journalist who writes about video games. Imagine someone purporting to work for a major studio provides them with leaked information about the studio’s plans for the next few years, including detailed release schedules as well as screenshots of works in progress. Among other things, it reveals that one of the studio’s biggest IPs will be getting a new release within the next two years. While this is potentially a great scoop for our journalist, they’re sensible enough to do their due diligence before filing a story. After checking to make sure the content hasn’t appeared elsewhere and hasn’t been created by an AI, they run a reverse image search on the screenshots to rule out the possibility that they’re forgeries crafted from existing images. The journalist also uses databases of vetted content to see what other industry observers have been saying about the studio.
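Services like TinEye run that reverse image search at web scale, but you can do a cruder, local version of the same check with perceptual hashing. The sketch below is purely illustrative: it assumes the third-party Pillow and ImageHash packages, plus a hypothetical folder of previously published screenshots to compare against. A small hash distance suggests a “leaked” screenshot may be a lightly edited copy of existing art.

```python
# Illustrative sketch: flag "leaked" images that closely resemble known, public art.
# Assumes the third-party Pillow and ImageHash packages (pip install Pillow ImageHash).
from pathlib import Path

import imagehash
from PIL import Image

def find_near_duplicates(candidate: str, known_dir: str, max_distance: int = 8):
    """Return (filename, distance) pairs for known images close to the candidate."""
    candidate_hash = imagehash.phash(Image.open(candidate))
    matches = []
    for path in Path(known_dir).glob("*.png"):  # assumes PNG screenshots
        distance = candidate_hash - imagehash.phash(Image.open(path))  # Hamming distance
        if distance <= max_distance:
            matches.append((path.name, distance))
    return sorted(matches, key=lambda item: item[1])

if __name__ == "__main__":
    # Hypothetical paths, purely for illustration.
    print(find_near_duplicates("leaked_screenshot.png", "published_assets/"))
```

This won’t catch a wholly original fake, which is why it complements, rather than replaces, the other checks described above.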

For more information on assessing online material, check out “5 steps to verify social media source accuracy for publishers,” “How to fact-check online information,” and “How to spot AI-generated text and imagery.”  

The social/emotional aspect of evaluation

It’s also a good idea to bear in mind how our emotions can affect our view of the material we encounter online. It’s easy to surround ourselves with information that aligns with our beliefs. It starts when we choose to consume certain media or decide to associate with like-minded individuals. This tells the algorithms that we like a particular type of content, and so we start seeing more and more of it in our feeds. The more we watch, the more we see: it becomes a vicious circle. Research has also shown that social media can have an enormous influence on our emotional state.

Ultimately, the best solution is to look at everything through a critical lens, even if it feels legitimate to you. Even Garfield knows you are not immune to propaganda. While that meme is often used in discussions about the role of advertising in social media, its relevance extends well beyond those conversations. Many of us probably associate the word ‘propaganda’ with Nazis or the Soviet Union, but propaganda is everywhere, whether it’s a company trying to get you to buy their product or a politician’s spokesperson looking to rile up their base. It can be tempting for content creators to adopt similar tactics, but ethical creators know to tread carefully lest they unleash forces beyond their control.
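For the curious, here’s a toy Python sketch of the algorithmic feedback loop described above. It isn’t a model of any real platform’s recommender system; it simply illustrates how a naive “show more of whatever got watched” rule can narrow a feed over time.

```python
# Toy illustration of a filter-bubble feedback loop (not any real recommender).
import random

def simulate_feed(rounds: int = 200, topics=("games", "politics", "cooking")):
    """Each view reinforces the topic that was shown, so the feed narrows."""
    weights = {topic: 1.0 for topic in topics}  # the algorithm's guess at our tastes
    for _ in range(rounds):
        shown = random.choices(list(weights), weights=list(weights.values()))[0]
        weights[shown] += 1.0  # watching it signals "show me more of this"
    total = sum(weights.values())
    return {topic: round(weight / total, 2) for topic, weight in weights.items()}

if __name__ == "__main__":
    random.seed(0)
    print(simulate_feed())  # the feed typically ends up skewed toward one topic
```

Real recommendation systems are vastly more sophisticated, but the underlying dynamic (engagement begets more of the same) is why a feed can come to feel like an echo chamber.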

Conclusion

In a world rife with false information and algorithms that spoon-feed us a curated cocktail of content that appeals to our preferences and prejudices, it’s all too easy to either throw up our hands in despair or adopt a nihilistic cynicism. Luckily, there is a remedy, and it’s a heaping helping of critical thinking. Don’t take things at face value. Ask questions about the creator and their motivations, and don’t be afraid to do some amateur sleuthing to verify the information. Don’t assume that something is legitimate just because it comports with your worldview (there are charlatans on both sides of the aisle). It can be frustrating to have to do all this work, but it will be better for you (and for society) in the long run. 

[Illustration: colorful books on a shelf against a dark background]