In mid-December 2020, the European Commission laid out a package of new legislation comprising the Digital Services Act (DSA) and the Digital Markets Act (DMA), with the aim of regulating tech companies, increasing accountability and user safety online, clarifying rules around illegal activity, and harmonising policies across member states. EU officials have said the two acts will prohibit unfair conditions imposed by online platforms and are at the core of the Commission’s ambition “to make this Europe’s Digital Decade.”

For the Greens/EFA members of the European Parliament, however, the proposals fall short of what MEPs had been calling for: a clear legal framework for online platforms, guidelines for handling reports of illegal content, and an end to arbitrary blocking and deletion of content by online platforms. Dutch Green MEP Kim Van Sparrentak, a member of the European Parliament’s Internal Market and Consumer Protection committee, will be among those scrutinising the new legislation in negotiations between the European Parliament, Council, and Commission. We spoke to her about the challenges of regulating and reorganising the digital space in a context of rife disinformation, threats to democratic institutions, and a global pandemic.

Green European Journal: The recent banning of Trump and other right-wing figures from many social media platforms, in the wake of the insurrection at the Capitol, was welcomed by many progressives – why is it problematic in your view?

Kim Van Sparrentak: One of the main things that we are saying about platforms is that they have become too big. They have too much power. Platforms decide what you and I see online, and in this case they just decided that they could ban the President of the United States. Of course, you could say that on the one hand, he didn’t comply with their terms and conditions, so it was okay to ban him; however, we can’t see Facebook and Twitter as just platforms anymore. They have become an important forum where public debate happens. That’s why I think it’s very problematic that platforms can decide what is and is not allowed online. We have to take it back into the hands of democratic institutions and, especially when it’s about illegal content, into the hands of judges.

If it was not a decision for the platforms themselves – due to their status as private companies – but rather one for lawmakers, what happens when it’s the lawmakers themselves who are communicating in a dangerous way, as in the case of Trump?

We have to make sure that the democratic institutions set the boundaries and then judges can decide whether it is okay or not. It shouldn’t be in the hands of private companies. They act based on what is most profitable for them and how it makes them look; they listen to their shareholders.

How would you describe the current state of the digital or information landscape? Why is the current legal framework insufficient?

You can separate these into two different issues. First of all, these companies are just too big – they’re monopolies, and they act as gateways to the internet. For many people, the internet is what is on Facebook and Twitter and Google, and there’s not much more. So we have to make sure that these companies become smaller and that we have a more pluriform internet in general. We also need to create opportunities for smaller competitors to grow, though never to the same size.

For many people, the internet is what is on Facebook and Twitter and Google, and there’s not much more.

Unfortunately, we see that European competition law, and competition law in general, isn’t able to tackle this issue because of the way these companies operate. In the same week that the European Commission proposed new legislation to tackle big tech, it also approved Google’s takeover of Fitbit, which made the company even bigger and allowed it to enter the health market.

So there is no control over these companies and no boundaries have been set as to what they can actually do online. That’s why they are now the ones deciding what you can and can’t see on the internet. But this needs to be regulated by democratic institutions. For example, why are the terms and conditions of very big platforms not checked against human rights or fundamental rights legislation, at the very least, to see whether or not they are actively racist or discriminatory? This is one of the things that we would like to see changed.

What is your assessment then of the new legislation proposed by the European Commission?

We have two proposals. The first is the Digital Services Act. I think it’s a good first step – it’s very important for the European Commission to show ambition; that they want to do something about the total freedom of these platforms to decide what happens on the internet. There are some useful proposals to improve transparency – making sure there’s much more transparency about the algorithms, for example, but also about advertisements. Also, when it comes to the responsibility of platforms such as Airbnb, there is a good proposal obliging platforms to hand over information when governments or local authorities want to tackle illegal holiday rentals, for example.

So there are some positive aspects, but it definitely does not go far enough. They’re not tackling the root causes of the big issues that we see on these big platforms. Besides more transparency about advertisements, we would like to see a total ban on personalised advertisements in the EU. Platforms want to keep us on their platforms for as long as possible, and they also try to follow us when we go to other websites, in order to collect as much data about us as possible and make a lot of profit by selling us advertisements. If we make sure that this is no longer their main source of income or their business model, there is also less of a need to show us more and more extreme content to keep us on their platforms – something that can also have real effects on people offline, as we’ve seen with events unfolding in recent weeks and months.

In addition, the Digital Services Act calls for self-assessment by platforms to check whether they comply with fundamental rights; we would like to see more scrutiny from the government side.

The Digital Markets Act is designed to close the gap in competition policy and to curtail these companies’ monopoly power and what they are allowed to do. Some of the proposals are really positive: for example, making sure Amazon can no longer use the data of third-party sellers on its website, and can’t force people who sell on Amazon to stick to certain prices. But there’s still so much more that could be done. For example, the Google-Fitbit deal would probably still be possible under the current proposal. The problem with these big platforms is that they try to expand into every possible field they can. Amazon, for example, has now taken over Whole Foods to get into the food sector.

The Digital Markets Act also contains provisions to improve interoperability throughout the whole digital market. This would mean that, as a consumer and user of the internet, you could actually choose which platform you use. In practice, you could use WhatsApp and still be able to communicate with people on Signal or Facebook Messenger – you decide for yourself, just as you do with email clients. This is another good way to break the monopolies.

Is there a risk that the new regulation will create some additional controls but still allow the big tech companies to continue to develop their business model and largely control the internet space? It’s estimated that Google and Facebook control 80 per cent of new online advertising revenue. Is the new legislation likely to change that?

As the proposals stand, we see a number of very good steps, but no radical changes or solutions to the problems we currently face. We need more ambition in both proposals to tackle root causes. Under the DSA, the Greens will fight to tackle the root problems of current toxic business models, by demanding more transparency and control of algorithms and a ban on microtargeted advertising. But to break the power of Big Tech in the DMA, the key is curbing Big Tech’s unlimited data power. We can do this by ensuring Alphabet cannot use the data it collects in one service, such as Google Search, for its other services, such as Google Ads or YouTube. Gatekeepers should also not be able to use their enormous data power and nearly unlimited financial resources to conquer new markets, most notably our public services, such as healthcare or education.

Under the DSA, the Greens will fight to tackle the root problems of current toxic business models, by demanding more transparency and control of algorithms and a ban on microtargeted advertising.

You’ve called for breaking the power of big tech. Does the legislation provide the EU with any real means to forcibly break up monopolies that don’t comply with the rules?

In the proposal for the Digital Markets Act, it is mentioned that – as a last resort – it would be possible to break up companies when they are too big. As it’s written right now, I don’t think that will ever happen. I think it is very clear that the way platforms like Google and YouTube, or Facebook and WhatsApp, are connected and merged makes them too big and too powerful. Of course, it’s always a nice idea that the data wouldn’t be shared, but we all know that it does happen, so being able to at least break these kinds of companies apart and make sure that they can’t be a data monopoly on their own is very important. There should be better provisions for that in upcoming legislation.

How influential and active are the tech lobbies in Brussels, in your experience as an MEP?

Very active and influential. They have the resources to put armies of lobbyists to work on the European Parliament, Commission, and Council. And they have carefully planned strategies. The Big Tech lobby has also weakened the EU’s privacy regulations, such as the GDPR and ePrivacy. Google recently had to apologise to the Commission when internal documents were leaked revealing plans to divide the Commission over the new internet regulation. We really need more transparency about lobbying in the EU, especially now that officials have been working digitally from home. This makes lobbyists’ work more challenging, but it also makes it more difficult to monitor who speaks with whom.

GDPR was seen as a pioneering step by the EU, with a global impact. But to what extent can we really have a European model when it comes to the digital space?

I think we can actually set an example again, like with GDPR, as the European Union. We see that, in other areas of the world, they are trying to have stricter rules on platforms. Because we have such a big market share, when we introduce legislation, we know it has a big effect on the rest of the world. Secondly, the way that we are looking at this legislation is much more from a consumer point of view. This is important to make the internet free, democratic, and user-friendly again.

Is there a tension between the steps Member States are taking nationally, and the legislation at the European level? Countries such as France, Austria, the Netherlands, and Poland all have their own proposals in this area. Does this risk fragmenting and weakening the EU’s response?

It’s positive that there is an appetite among Member States to actually do something about the problems with platforms. Many of these proposals were also related to national elections in these countries – addressing issues like political spending in online campaigns, especially after what we saw with Cambridge Analytica. But in general, it is really important that we keep European legislation on this topic. Firstly, because the internet is cross-border, so we need European rules. Secondly, we are much stronger against these big corporations when we stand together as the EU, because we already know how much these big tech companies are trying to lobby us. So it’s very important that we stick together and make sure that the European Digital Single Market is free and democratic for everyone.

In the context of the pandemic, we’ve seen huge growth in telehealth, remote learning, work teleconferences, and so on. Now more than ever we socialise, exercise, shop, learn, and complete administrative tasks online. This has led to schools, hospitals, doctor’s offices, local authorities, and police all outsourcing many of their core functions to private tech companies, giving them a massive boost. Naomi Klein has referred to this as pandemic shock doctrine. Has the crisis of the pandemic inoculated us to the concerns we may have otherwise had about this trend, and what it means for our rights and privacy?

I think it has just shown even more how much we rely on these big companies and how they basically control our digital infrastructure. When you look at the Covid-tracker apps that have been rolled out, many countries tried to create their own app, but in the end they all had to go with the app made by Google and Apple together, because those two own most of the knowledge on how to create such apps. This meant that if you wanted to download or install the Covid app, you needed to be an Apple or Google user, as it was distributed through their app stores. So it just shows how we are locked in by these companies and how much of our basic infrastructure is in their hands. I really hope this is a wake-up call.

There are a lot of schools that are now highly dependent on the Google system, which means that kids as young as six or eight are already locked into using Google. In the Netherlands, more than 60 per cent of all primary schools are now official ‘Google schools’. People are proud of this, rather than worried by it. That is something we should be much more aware of. It’s also another sign that we need a much more pluralistic internet, with many more competitors that we can invest in.

The internet started as a very activist, revolutionary place where people from social movements found each other. This space has been taken over by big corporations who only want to make a profit.

There is a trade-off between privacy and convenience, and people are often not aware of the real cost. This is the argument made by Shoshana Zuboff who talks of surveillance capitalism – the model whereby big tech companies collect and monetise data pertaining to every aspect of our online lives and behaviour. She argues we need a public conversation about whether these mechanisms are compatible with individual and democratic sovereignty – and if not find ways to outlaw and interrupt these processes. Would you agree with this and if so how can such a public conversation be organised?

I think it is crucial that we have this debate. We all know how easy it is to ignore what is actually happening: many of us still use all these apps even though we know we are being traced by big corporations that collect our data and try to influence our behaviour, but we just keep on using them because everyone does and it’s easy. So it’s very important, when we are teaching children and young adults digital skills, that we teach them to use these tools consciously so they understand what is actually happening.

There should be much more transparency, because right now no one reads the complete terms and conditions. And when it comes to cookies, you arrive on a website and a box pops up blocking your view, and then you have to do 10 000 clicks to make sure you’re not being tracked! Or you click “Agree to all” just once and then you are being tracked. There are a lot of things we have tried to implement to make sure that people have more choices to opt out of online surveillance, but there needs to be a lot of improvement in how they are implemented.

In the Digital Services Act, there’s a proposal that you can opt out of an algorithm if you really want to, but we think that proposal will probably be quite useless in practice. That’s why we really hope that we can tackle the root cause: the reasons why we are being tracked. It is a battle to change how these companies work.

So it seems that, in order to make people aware of the invisible cost of using these platforms, we need a communications and educational campaign targeting the public alongside the legislation.

It is very important that people understand how these personalised, targeted advertisements work and what is happening. It’s not just the advertisements that you see on Facebook; it’s every advertisement you see on every website, all the time. The clickbait model also means that advertising revenue is being taken away from other publishers such as newspapers. So I think it’s very important that we make people more aware of how this system works. The Netflix documentary “The Social Dilemma” explains very effectively how these algorithms target you – the way they try to lure you into staying locked in on the platform and fire advertisements at you when they think you’re vulnerable.

I think with what happened in the United States at the Capitol – and in the Netherlands we’ve had a lot of riots, which were also stirred up by social media – we have a window of opportunity now to start this conversation about the role of social media and what we can do about it. Unfortunately, it had to go this far, but now we can really have this conversation.

Is it still possible to move away from the current profit-driven model of data collection and monetisation towards a different conception of the internet, one that treats it more as a kind of public utility or a commons?

The internet started as a very activist, revolutionary place where people from social movements found each other. This space has been taken over by big corporations who only want to make a profit. With the upcoming regulations, we’re trying to take control of these companies back into the hands of democratic institutions and society. I think it’s going to be a very long time before the internet works as a commons again. But this is a very important first step away from a hyper-capitalist, data-surveillance internet and back towards a more democratic and public space that we could all benefit from.
