When I was researching and writing my new book, “The Gist of Reading,” I wanted to explore long-held assumptions about reading and how we process what we read.
Some of these assumptions have changed through time. For example, as novels became popular in the 18th century, many warned that they were dangerous and had the potential to cultivate ignorance and immorality in readers, especially female ones.
Today, many would consider that view antiquated. Most people would grant that reading a narrative – fiction or otherwise – might influence a reader’s opinions or personal beliefs. But readers’ prior knowledge of real-world facts, they assume, should be safe.
For example, readers might read a story in which a character mentions in passing that Hillary Clinton, rather than Donald Trump, won the 2016 election. This shouldn’t influence readers’ ability to quickly respond that Trump was the real winner, right?
And yet I came across a substantial body of psychological research demonstrating that reading stories – both nonfiction and fiction – can powerfully distort readers’ prior knowledge.
In his 1989 study “Suspense in the Absence of Uncertainty,” psychologist Richard Gerrig developed short, nonfictional narratives about well-known events, such as the election of George Washington as president of the United States, that he gave to participants.
Some participants read a version of the narrative that foregrounded facts that made it doubtful Washington would become the president; others read a narrative that made his presidency seem likely.
Readers who read the doubtful version took longer to verify that he had indeed become president (or to recognize that a sentence denying that he had become president was not true).
Even though they knew Washington eventually became president, simply reading a very short narrative had enough power to make readers significantly less sure of what they already knew.
While Gerrig’s experiment presented readers with nonfictional stories about real events, another study demonstrated that reading a short fictional story containing falsehoods presented as facts can make readers more likely to treat those falsehoods as true – even when readers have previously demonstrated that they know the truth.
In the study, participants took an online survey that quizzed them on their world knowledge – for example, identifying the world’s largest ocean (the Pacific) – and then had them rate how confident they were in their answer.
Two weeks later, the same participants read two fictional stories and were warned that these stories might contain some false information. The stories actually contained inaccurate versions of the very facts that the readers had been tested on two weeks earlier. For example, in one story, a character (incorrectly) mentioned, in passing, that the Indian Ocean was the world’s largest.
After reading the stories, the participants took the same world knowledge test they had taken two weeks earlier. The inaccurate information turned out to have a serious effect: Readers did worse on the world knowledge test than they had done two weeks before. In particular, they now got wrong questions they had previously gotten right – even questions they had answered most confidently on the earlier test.
And remember: All of this happened despite the fact that readers had been explicitly told that the stories would contain inaccurate information.
Given how readily misinformation embedded in stories takes hold, psychologists have been interested in exploring how to combat it. It seems especially vital to develop strategies that make people smarter about what they glean from what they read, and that encourage them to read more skeptically.
In a 2016 article, psychologist David N. Rapp outlines how to defeat, or at least reduce, the misinformation effect.
Rapp describes four key strategies that have proven especially effective.
First, when readers actively tag information as accurate or inaccurate while they read, inaccuracies lose much of their effect. It’s not enough to know that something you read is incorrect: Unless you actively tag it as wrong while reading it, you may suffer the misinformation effect.
Second, the further removed fiction is from everyday reality, the less vulnerable readers are to believing false facts that may be embedded in it. Rapp and his colleagues found that misinformation in fantasy stories had much less effect on readers’ knowledge than misinformation in more realistic stories. Rapp argues that this could mean readers are able to compartmentalize their response to fiction. Fantasy stories like “The Hobbit” probably have less of an ability to alter real-world knowledge than, say, a piece of historical fiction, like Philippa Gregory’s “The Other Boleyn Girl,” which is grounded in historical events but nonetheless riddled with historical inaccuracies.
Third, Rapp found that some inaccuracies are so flagrant that readers do notice them. They may be persuaded that St. Petersburg, rather than Moscow, is the capital of Russia. But it’s much harder to persuade them that Russia’s capital is Brasilia. Brasilia is just too different from anything that readers associate with Russia to make it a convincing capital.
Finally – and perhaps most importantly in today’s climate of “fake news” – readers may be sensitive to the authority of a source. False facts from a generally credible source seem to have more effect than false facts from a disreputable one. The challenge, of course, is that what counts as a credible source to one reader may count as the opposite to another reader.
I find all these psychological experiments telling precisely because they generally avoid having participants read about hot-button issues that may make them feel defensive or partisan.
The traditional suspicion of fiction arose from its ability to excite and engage. Yet the materials in these experiments are comparatively dry – and the fictional information was nonetheless able to cast a spell on the reader.
In other words, even without emotional appeals, a story that warps the most neutral of facts can easily persuade readers to question or even reverse what they already know.
Such work underscores more than ever that suspicion of reading is not entirely ungrounded. Today, not only is the internet filled with dubious information but there are also deliberate attempts to spread misinformation via social media channels. In this era of “fake news,” scrutinizing the sources of our knowledge has become more critical than ever.
Andrew Elfenbein is Professor of English at the University of Minnesota. This article was originally published on The Conversation.
Andrew Elfenbein researches 18th- and 19th-century British literature, the history of authorship, queer theory, linguistics, and cognitive approaches to literacy. His current work focuses on historical linguistics as a theoretical and practical challenge to governing paradigms in literary critical study. In connection with cognitive scientists, he has also begun extensive empirical work on reading in order to provide criticism with better and more sensitive models for the reading process.