(0:10) When I was a little girl, I used to struggle with the burning question: What will I do when I grow up? A garbage girl? A cotton candy saleswoman? A princess? The President? Maybe a presenter!
(0:30) My dad used to host one of the town fetes every year. There were singers, jugglers and other performers at the festival. But to me, the greatest artist of them all was the emcee. I found it charming how he could capture people’s attention, speak fluently and react to sudden situations, such as the band being late or someone losing their keys. Every year I looked forward to this one particular presenter, and every year I watched him from afar and silently admired him.
(1:14) I even set up a Facebook account at a young age just to add this presenter as a friend. I wanted to observe his inspirational magic year-round. In 2015, this colossal role model and idol of mine shared a post claiming that each of us would be forced to house Syrian refugees in our homes. I had found this man to be a source of huge inspiration, intelligence, and insight. I don’t even know where I got the courage at the time, but I wrote him a comment saying, “Doesn’t that seem a bit stupid to you?”
(1:58) My comment got overwhelmingly more likes than the original post. The author even deleted his post after a few hours. I considered it such a nice little victory – at least for those few minutes before he unfriended me and then merrily went on sharing posts like this. How many people might believe such messages when they are shared by someone so successful? This is the problem with authority. For example, have you ever bought a particular product because your favourite celebrity recommended it? Or similarly adopted an opinion?
(2:48) For example, my grandmother contributed to an organization fighting disease in Africa. As I later found out, the organization was quite solid; still, my grandmother admitted that she decided to contribute because the poster featured a gentleman in a white coat. And that’s saying something. So maybe when I grow up, I’ll be an expert on health topics. A white coat is enough for that. And perhaps not even that anymore.
(3:25) You can be considered an expert on covid not because you wear a white coat, but because you can sing a song well or because you appear on a favourite TV show. We associate such people with authority – they have accomplished something in their lives, so it is hard to believe they could be wrong.
(3:50) The presenter mentioned above, who unfriended me, was definitely such an authority figure for me. But at the same time, I was also very interested in current events and in debunking conspiracy theories, so I didn’t fall for his take on the refugees.
(4:06) A little while later, Zvol si info (Choose Your Information) was created – a student society primarily concerned with educating people about fake news. After less than a year of operation, they even wrote a successful book on the subject. As soon as it was possible, I joined them. I no longer wanted to be a princess or a presenter, but a fearless fighter against fake news. I read the book, attended several lectures, and felt that I understood the topic and was an expert on it.
(4:45) But it wasn’t enough. I wanted a better and more accurate world, so my classmates and I started another project. We tried to tackle fake news playfully, which took us to America through a course at school and a competition. We didn’t know all that much about the subject. Still, less than three months after founding the project, I was suddenly sitting in Washington D.C., explaining to the tie-wearing staff at the State Department how extremely dangerous fake news is to national security. A few days later, I was on DVTV (a successful Czech television programme), confidently presenting my views on didactic considerations in teaching media literacy. In neither case did I feel out of place. And now that media education has been my daily bread for several years, here I stand, feeling very out of place.
(5:45) I feel the Dunning-Kruger effect, a kind of scientific version of the Socratic “I know that I know nothing”. The Dunning-Kruger effect basically says that the more we know about a certain subject, the lower our confidence in that area. When we learn something new, we tend to overestimate our knowledge and skills, feeling more competent or more proficient than others. If we stop at this bit of knowledge, we can continue to live with high confidence in how well we understand the topic. But if we continue to learn about the phenomenon, we find that it is more complex, and we begin to doubt our knowledge. And the more we know, the less confident we become. Only when we deepen our understanding sufficiently does this tendency break down, and we slowly start to regain our confidence. Can we then be surprised at people who spread, say, nonsense about covid because they know so little about it that their self-confidence is at an all-time high? And that, on the other hand, someone who is erudite in the field prefers to keep their distance and be more restrained with clear-cut statements?
(7:10) But the Dunning-Kruger effect is just one of many cognitive distortions, biases, shortcuts or, if you like, errors for which our brains are responsible. These distortions do not help us much in the fight against misinformation. On the contrary: although anyone can do the most rudimentary fact-checking, verification may not happen at all because of so-called confirmation bias. Confirmation bias is another brain phenomenon – we tend to seek out information that supports our opinion. We easily believe this information and are not interested in further verification. And we avoid evidence that speaks against our view. Cognitive dissonance also plays a role here. When we encounter a statement that contradicts our beliefs, the same centres in the brain are activated as when we experience physical pain. Changing one’s mind is, therefore, a painful process. And there is a whole spectrum of other similar phenomena. But just knowing about them is not enough.
(8:26) Last summer, the measures against the spread of the coronavirus ceased for a while, but the virus continued to fly through the air. Slowly, a second wave of the pandemic was born. And I attended an outdoor birthday party. Some boy there reached for my glass and asked to drink from it. I clutched the cup and said, “If you want, I’ll mix one for you too.” But he didn’t give in. He took the glass from me and drank. I said, “Keep the glass; I’ll make myself another.” It didn’t take long for him to see what the problem was. “Oh, you surely believe in covid, don’t you?” Then he started presenting one conspiracy theory after another. Government conspiracies, Bill Gates, microchipping. For a moment, I tried to have a rational discussion. Then I figured I didn’t have to be a fake news warrior at all costs, and the princess thing didn’t sound so bad, did it?
(9:36) Verifying information, being aware of my brain’s flaws and other influences, looking critically at my own claims – none of that is nearly as hard as trying to convince someone else.
(9:53) A survey by STEM (a Czech polling agency) earlier this year found that 40% of Czech internet users believe misinformation about covid. 40%! These are not stupid people. These are not people who can’t verify information. Behind the monitor, there is often a person driven by fear: fear for their health, fear for their business, fear for their children. Distrust of the system motivates them to seek out information that supports their opinion in the first place – the belief that the truth is hidden and kept from them. And this person may not just be a stranger’s profile in the comments on Facebook, or a random guy at a party you’ll probably never meet again. It could be our mother or grandfather, a childhood friend. Would you damn someone close to you for having a different opinion? Or for believing misinformation, for whatever reason?
(11:14) No, we shouldn’t damn these people. But neither should we cast ourselves as more intelligent than our opponent. We can all be victims of our cognitive errors, we can all act on emotions like fear or resentment. We can all succumb to well-targeted manipulation.
(11:33) Let’s act helpfully. After all, in such a debate we may be partially disrupting the other person’s world. I didn’t follow this advice a few years ago. At Christmas, in a discussion with my grandmother, I challenged her worldview so sharply that she didn’t speak to me until Easter. Therefore, despite differing views, we should approach the other person with care, humanity, and above all, a good deal of empathy and understanding. Let’s not be haughty; instead, let’s find common ground, something that unites us, something we can agree on. We can show the other person that we are not here to lecture them like a little child, but that we want a dialogue.
(12:28) In finding common ground, it’s good to agree on where our dispute arises. Are we each entering the debate with different information? Do we trust different sources? Or is our disagreement more about the values we hold? If we run into the problem of the other person believing misleading or false information, let’s show them that it’s okay to be wrong – that they are certainly not alone, that many influences are at play, and that we can perfectly well understand why they believed it.
(13:10) But let’s not stop at pointing out that the thing is misinformation, a half-truth or nonsense. If we have already made a hole in their mental model, we must fill it with something. So let us offer an alternative: what else can explain the ambiguity, if not what they have believed so far? But let’s try not to overwhelm the other person with our arguments. Instead, let us ask them well-measured questions.
(13:44) It won’t happen overnight. Changing one’s perspective is painful and can take a long time. Sometimes it may not happen at all. Sometimes it’s better to give up when you see that the debate is going nowhere and is instead turning into an argument. As Jan Werich (a Czech writer) said, “If a man argues with an idiot for more than half a minute, two idiots are already arguing”. If our debate becomes heated and personal attacks begin to appear, we had better abandon it. But let’s not let that discourage us. Let’s reflect on where we went wrong in the debate and try to do better next time.
(14:30) There is no detailed guide on how to do this most efficiently. It’s perhaps a question worthy of a Nobel Peace Prize. The important thing is to realize that we cannot turn around a person’s entire perception of the world with one debate. But we can plant a tiny seed that, with luck, will germinate into a bit of doubt. It is not easy to change the world for the better every day. But with practice, we can become masters at it.
(15:08) So what will I do when I grow up? I still don’t know. But whatever it is, I want to do it the same way I try to debunk misinformation – with respect for myself and others, understanding and patience.
The translated transcript of my TEDx talk which I gave in May 2021 in Brno, Czech Republic.
Veronika Batelková – a master’s student of Information and Library Studies, a graduate of bachelor’s studies in journalism and political science. I run an NGO concerned with media literacy education. I love an international environment.