Thanks to technology and the internet, today’s children and young people have unprecedented access to information. While much online content is valuable and informative, there is also a great deal of compromised, biased, and untruthful information. Juliane von Reppert-Bismarck is the director of Lie Detectors, a journalist-led organisation seeking to help teenagers and pre-teens use the internet to enhance their learning, while avoiding the dangers of becoming ensnared in conspiracy theories or manipulation. She argues that media literacy should be an urgent priority for educators and policymakers alike.
Beatrice White: What kind of information landscapes are school-children navigating today? How does this impact the way they learn?
Juliane von Reppert-Bismarck: The pandemic caused a lot of upheaval, not just because classes were suddenly taking place via video conferencing, but also because children were exposed to a lot of rumours and false information – as we all were – and so were unsure of what to believe. We also see this with the war in Ukraine, and previously we saw a lot of curiosity among children about the environment. Children have questions and concerns and want to understand what is happening. These worries are compounded because often they are not helped by their teachers to find their way in this information universe.
Disinformation affects children in a very different way to adults because they use different platforms. These days, children seek out information about the world from sources such as TikTok, Snapchat, and increasingly Twitch and Discord. These aren’t generally seen by adults as information sources: they’re gaming platforms and live video platforms, and they are largely unmoderated, either because they are encrypted – with exchanges happening in small private chatrooms – or because the content is visual. Images and video are among the types of content that have always been the most difficult to moderate. So there is a generational dimension, in that children inhabit a different online world to their teachers. This can make it very difficult for teachers to approach the subject of disinformation.
So the issue goes beyond whether particular pieces of information are factually correct or incorrect; it’s a question of the environment shaping our whole world view. Is this why Lie Detectors works specifically with young people?
Our organisation works with professional journalists to strengthen democracy using the tools of journalism. We currently work with more than 200 professional journalists from all kinds of media: broadcast, print, and online. The most visible part of our work is what we do in classrooms, speaking to children aged 10 to 15.
Those at the younger end of this age range in particular are incredibly open and enthusiastic about looking things up online; they are keen detectives who want to have the freedom to do research online, which they tell us doesn’t happen often at school. They are also very enthusiastic about meeting real live journalists. By the time they’re 14 or 15, they can sometimes be more difficult to work with because they’re shyer and more self-conscious. But secondary school is a very important time because this is when they are forming lasting friendships and social groups, and also their world views. It is when they start thinking deeply and making decisions about where they stand on particular issues.
In our view, it is really important to intervene early, and there are initiatives that start with children as young as four years old. On average, 10- to 11-year-olds regularly use three platforms, and the older ones are on up to five platforms. So they are much more adept at using these technologies than their teachers, but they often don’t understand the full extent of their engagement and can’t see that they may be trapped in an information silo. Also, very often they don’t use reliable sources of information. If you ask a schoolchild where they get information from, very often the answer will be “Instagram” or “WhatsApp”. You then have to tell them that these are photo or messaging apps and explain the difference between how they operate and how journalism works. It’s about understanding the difference between the content on these platforms – which is entirely subjective and sometimes manipulative – and that produced by journalists, which might not always get it right but is certainly more reliable.
The way we talk about this issue is very important. Yes, there are dangers we need to be wary of, but it’s important to emphasise that so many precious treasures and so much valuable information can be found online. We need to seek out what is good. It’s a bit of a yin-and-yang approach. Nonetheless, not all the journalists in the world will be able to solve this problem on their own, so what we are also doing, increasingly, is training the teachers.
Critical media literacy needs to be recognised as a core literacy alongside reading, writing, and counting.
What are the main shortcomings in how the current education system deals with this issue? Is it simply a question of catching up with the technology or is there a need to instil a more critical approach?
Teachers’ confidence in their own ability to do something about media literacy is not at the level it should be. Almost 100 per cent of teachers say that this subject is relevant to their class, but only 30 per cent have actually addressed it in the classroom. That is a really significant gap. So there needs to be training for teachers regardless of subject area, and we need to provide them with incentives. It should become part of all teachers’ approaches to their subjects, whether they teach biology, politics, art, or maths.
In addition, some schools in more deprived areas have very poor connectivity and limited access to the facilities and equipment needed to get online, so practical access is an important barrier. There is also a lack of suitable materials. It can’t always be a conversation about refugee rights or religious tolerance. You have to catch the kids where they are, on the platforms they use, and actually talk about things that they care about. It’s not about immediately tackling the hardest issues; it’s about teaching kids to flex their critical-thinking muscles so they can use them when they need to. It doesn’t matter if those muscles are trained on crazy stories circulating online, like the one about the man who allegedly married his pet cobra – a discussion that tends to generate great enthusiasm in most classrooms!
Because they are unfamiliar with the information universe that children inhabit, teachers actually need to learn to ask questions. This process cannot rely solely on lecture-style, didactic teaching. It’s a universe that is developing incredibly rapidly, so we need to meet children where they are and be able to engage them regardless of their backgrounds.
We also need to make sure these discussions are conducted in a responsible way that keeps teachers safe. Bringing up controversial subjects can be a risk, as we saw with the tragic case of the teacher killed in France in relation to a discussion about cartoons depicting the Prophet Muhammad. An extreme example but, especially when you start looking at conspiracy theories, you have to keep teachers safe and make sure they know how to approach these questions in a light-hearted way.
What structural changes would you like to see to bring media literacy into schools?
If you’re going to be piling things onto the curriculum, you really have to make sure that teachers are being incentivised in the right way. One of the big grading systems for schools is the PISA rankings. They are currently thinking about adding critical media literacy as one of the indicators used to gauge a school’s performance. What we say is that critical media literacy needs to be recognised as a core literacy alongside reading, writing, and counting. Because without it, you cannot make sense of the world. It doesn’t matter how well you can read words if they don’t make sense to you.
We also really need to be able to measure the effects of media literacy programmes so that politicians can be persuaded to put their political and financial capital behind them.
In recent years social media platforms have stepped up their efforts to stem disinformation, for example through content warnings and fact-checking. How do you view these efforts?
We think the “information disorder” problem needs to be tackled on the demand side, because the problem persists due to demand at a human level. If we can curb this demand, then it might not be as rewarding to put out the manipulative information we see at the moment. However, there’s also the supply side. And by supply side, I don’t mean foreign actors such as Russian or Chinese trolls; I actually mean the drivers of disinformation. Platforms have to become accountable for their algorithms and the way in which they prioritise polarising content.
It’s about teaching kids to flex their critical-thinking muscles so they can use them when they need to.
There is good reason to be sceptical of the large internet platforms and the solutions they offer. We have seen this for many years: Facebook hires 1000 new fact-checkers, and it still doesn’t solve the problem. There has been a focus on content moderation, fact-checking, and using artificial intelligence to move or remove harmful content. The problem is that when you start deleting content, this often leads to accusations of censorship – real or perceived – which is harmful, both for democracy and for the credibility of politicians. You only have to look at the German Facebook Act to see what kind of backlash can arise. We cannot delete our way out of this.
At the EU level, the Digital Services and Digital Markets Acts will give the EU very powerful antitrust tools, so it’s important to get that right. Correct application, privacy, limits on behavioural data collection: these are essential questions over the long term for curbing disinformation. But there have been intensive lobbying efforts, and the process has been very lengthy. Commitment and resilience will be needed on the side of policymakers to see it through to the end.
Increasing numbers of people distrust established media. Should we be seeking to restore trust in certain sources? It has also been argued that the impulse to “deconstruct” information and narratives can lead to cynicism and a loss of respect for facts and objectivity. Is this a risk?
Yes, of course. We’ve heard teenagers say, “I don’t believe anything except for what my friends tell me.” We’ve got deepfake fatigue, which makes people just give up. And that’s the worst outcome, when people don’t believe anything anymore. So yes, there is a lot of disinformation out there, but the good news is that there are practical tools that we can give children and young people – and also adults – so they do not feel helpless. All you need is a basic journalistic approach. It’s not magic; we simply ask questions. “Who wrote this?” “What is the source?” “What do we know about the person writing this?” “Why might they be writing this?” And there are also questions to ask yourself: “Why am I reading this?” “Why do I want to believe this?” “Is this confirming something I want to believe because it makes the world more understandable to me?”
This is something children have no problem understanding. When you get to the ultimate outcomes of manipulation, propaganda, or lying, they can understand it, because they know all about cyberbullying. Because they all follow YouTubers, memes, and popstars, they have an intuitive understanding of the online world that a lot of adults lack. Even with just an image, a catchy headline, or a video, you can make a lot of money and influence people’s opinions, positively or negatively. When you move that gaze from their online world into the world of information, you can do a lot.
What are the risks for democracy if we don’t take this issue seriously?
The threats to democracy are incredibly clear. Just look at the questions millions of people were asking during the pandemic: about whether masks really work, about the risks of vaccines, about whether remedies like gargling garlic water would cure Covid-19. That shows that disinformation can really interfere with a person’s ability to make informed decisions. If you cannot tell true from false online, you cannot make an informed decision. And informed decision-making is the basis of the democratic process. If that is undermined, then the entire democracy is undermined.
What is the best thing that could happen? From our point of view, media literacy needs to be seen as a right for children, and must become engrained in the thought processes of both teachers and students. And teachers need to lose their fear. We’ve seen it happen; we’ve seen the debates that can happen in classrooms. We’ve also seen children correcting teachers in the classroom, or heard about them questioning their parents at home, as a result of this work. These tools can be incredibly empowering.
Young people today face great uncertainty about the future and are exposed to alarming and confusing information about issues such as war and the climate crisis. It can often be tempting to look away from reality. Can media literacy empower young people in the face of such challenges?
In the past, you would go to a newsstand or a newsagent and there would be a choice between broadsheets and tabloids. You’d buy a broadsheet for reliable information on serious topics, or sometimes you might choose a tabloid for its sensational stories about UFOs and alien landings because you wanted some light entertainment. After all, not everything has to be a matter of life or death. What matters for children today is being able to tell the difference, to make a conscious choice. We have to train our eyes and those of the next generation to see and understand that difference and to know when it really matters to use that knowledge. We are ultimately telling children and teachers to slow down; to consume information more deliberately; to share more sparingly, and to stop and think before they do; and to resist the temptation to see everything in black and white, realising instead that there is so much to be found in the grey middle. It might take longer to explore this messy middle where all the information overlaps – it’s difficult, it’s nuanced – but it’s a really promising and interesting space where you can learn so many important things.
Note: Germany’s Network Enforcement Act, or “NetzDG” law, aims to combat hate speech and disinformation online by forcing platforms to rapidly remove illegal content or face heavy fines. (Source: Centre for European Policy Studies)