Technology is advancing far faster than our psychology, our education systems, and our policy-making can adapt. And this will create problems for our societies if we don't work harder to solve the challenges we already have and those that lie ahead.
By Carlos Carrasco-Farré
How can we avoid being deceived by fake news? Our social networks bombard us with information, and it is almost impossible to distinguish what is real and what is not. The problem is not so much the false content itself as the fact that malicious actors try to turn our own psychology against us.
This also helps explain why misinformation spreads 6 times faster on social media than reliable information. As if that weren't enough, misinformation is created in greater quantities (the volume problem), in more varieties (the diversity problem), and faster (the velocity problem) than our ability to counter it.
Despite this, there are some clues that can tell us whether the author of an article is trying to deceive us by taking advantage of our psychology. Drawing on theories of human cognition and the behavioral sciences, we can define quantitative measures and apply natural language processing techniques that help us distinguish fake news.
For example, the limited capacity model of motivated mediated message processing proposes that different structural and functional features of a text require different amounts of cognitive effort from our brains. This means that not all texts are equally complex.
A book for young children, for instance, has a simpler structure than a scientific article. The problem is that on social media we tend to minimize the effort we put into processing information. This explains why simpler content is also more viral.
Our psychology is also highly sensitive to emotions. Studies show that highly emotional content is more viral, which explains why social networks are considered sources of "large-scale emotional contagion."
When we are exposed to information with a strong emotional charge, we become less rational, and this reduces our ability to distinguish what is true from what is false. Knowing this, creators of fake content try to exploit our emotions against us.
In a recent study, we tried to uncover the strategies followed by different types of people involved in spreading misinformation, what we call “misinformation fingerprints.”
To do this, we used more than 92,000 pieces of content divided into 7 categories: clickbait, conspiracy theories, hoaxes, hate speech, fake science news, credible sources, and rumors. We then compared these with real news in terms of the emotions they provoke and the cognitive effort they require.
"Fingerprints" of misinformation
Using natural language processing, we calculated the extent to which different categories of misinformation differ from real news in terms of emotional appeal (sentiment analysis and appeals to moral values) and the cognitive effort required to process the content (grammatical complexity and lexical diversity). The results show significant differences between fake and real content. Almost all types of misinformation have a simpler grammatical structure than credible news, which makes them easier to process because they demand less cognitive effort from the user. All of them are also 15 percent less lexically diverse, further reducing the cognitive effort required to process them.
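To make these measures concrete, here is a minimal sketch in Python. It assumes simple proxies rather than the tools actually used in the study: a type-token ratio for lexical diversity, average sentence length for grammatical complexity, and tiny hand-made word lists standing in for the sentiment and moral-values lexicons. The word lists and names below are illustrative only.

```python
# Illustrative proxies for the four signals discussed above; not the study's pipeline.
import re
from dataclasses import dataclass

# Hypothetical mini-lexicons, for demonstration only.
NEGATIVE_WORDS = {"danger", "threat", "destroy", "fear", "attack", "crisis"}
MORAL_WORDS = {"betray", "loyal", "unfair", "duty", "traitor", "purity"}


@dataclass
class TextSignals:
    lexical_diversity: float    # unique words / total words (type-token ratio)
    avg_sentence_length: float  # words per sentence, a rough complexity proxy
    negativity: float           # share of words found in the negative lexicon
    moral_appeal: float         # share of words found in the moral lexicon


def analyze(text: str) -> TextSignals:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    n = max(len(words), 1)
    return TextSignals(
        lexical_diversity=len(set(words)) / n,
        avg_sentence_length=n / max(len(sentences), 1),
        negativity=sum(w in NEGATIVE_WORDS for w in words) / n,
        moral_appeal=sum(w in MORAL_WORDS for w in words) / n,
    )


if __name__ == "__main__":
    sample = ("They will destroy everything you hold dear. "
              "Only a traitor would ignore this threat!")
    print(analyze(sample))
```

In practice, scores like these only become meaningful when compared across large samples of fake and credible news, which is how the differences reported above were obtained.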
Looking at the emotions evoked, the results show that misinformation is much more emotional than real news. In fact, fake news is 10 times more negative in the sentiment it conveys. In addition, and this is important, it appeals to the reader's moral values 37 percent more, trying to influence the user psychologically through ideas that attack the reader's social identity (gender, religion, nationality, and so on).
Be suspicious of content that has poor grammatical and lexical structure
Misinformation creators know that when you're browsing social media, you're not going to put in the effort you would in other situations. After all, you're there to be informed or entertained, not to work. So they spare you the effort of processing the content. This makes it more appealing to your brain, and you're more likely to believe the report and share it with other members of the network.
Don't trust news that tries to generate certain emotions or appeals to your moral values
All the research in behavioral sciences shows that people are not as rational as we think. Our emotions strongly influence our mental processes and our decision-making. This is why creators of fake content try to exploit them so that we are not able to distinguish rationally whether the content we are reading is true or not.
They appeal to our emotions by trying to generate anger, fear, or sadness to cloud our rationality. They also use strategies that make us feel that our social identity – our nationality, our gender, our opinions – is at risk, creating the sense of enemies outside our group who threaten our very existence.
And this feeling of being attacked activates mechanisms in our psychology that make us behave less rationally and reduce our ability to distinguish what is true from what is false. The lesson of the study is that it is important to be vigilant when browsing the internet, especially on social networks, and to recognize when someone is trying to use our psychology against us.
There is individual work to do, but also work to do as a society. First of all, social media companies need to recognize that their platforms provide access to information on a scale never seen before in human history, which is incredibly useful, but also that they are being used by malicious actors who exploit the psychology of users for economic or political purposes.
On the other hand, the results are an appeal to public decision-makers to take more seriously the alarmingly low levels of media literacy among the population. Technology is advancing far faster than our psychology, our education systems, and our policy-making can adapt. And this will create problems for our societies if we don't work harder to solve the challenges we already have and those that lie ahead.
Note: Carlos Carrasco-Farré is a researcher at the Department of Operations, Innovation and Data Science, ESADE Business School. / World Economic Forum - Bota.al