When technology serves people

Margrethe Vestager is the European Commissioner for Competition

Next year is supposed to be a tough one for tech. 2019 is when – if you believe the film Blade Runner – renegade androids should be roaming the Earth. It’s when – according to The Island – we should all have clones created just to provide us spare organs.

Technology and human values

Those dystopian fantasies make unforgettable films. They’re a powerful reminder of how technology can go wrong. But there’s one important thing which those filmmakers have forgotten. They’ve forgotten that technology is our future, not our fate. They’ve forgotten that our societies are about much more than technology. They’re built on a far deeper foundation of values – values like freedom, and fairness, and democracy.

And that’s why those dystopias don’t quite ring true. Because in fact, we don’t just accept what technology gives us. We can shape it to fit our society’s values. So technology serves people – not the other way round.

Meeting the challenge of digital technology

More than a century ago, when the first cars appeared, their supporters insisted they were much better than horses. They weren’t simply faster – though they could reach terrifying speeds of 15 kilometres an hour. They were also much better for people’s environment – they made city streets cleaner, safer, and quieter. And it was only when cars became part of everyday life that some of the problems with them emerged. The accidents, the noise, the pollution. And as those problems became clear, rules started to develop.

And the same sort of change is happening today, in the way we look at digital technology. After the first thrill, when we discovered what these technologies could do, we’ve started to see that there’s a dark side as well. A side that can challenge our most basic values – our privacy, our freedom to choose, even our democracy. And we’ve started to see that it’s time for people to take control.
Of essential things, like what happens to their data.

Data and privacy

Until recently, there’s been a sense that companies could treat our data as something just lying around for anyone to pick up, like prospectors in a gold rush collecting nuggets from the ground. But those days are over. People understand that handing over data has a cost. Because each time we share our data, we give up something very valuable. Something that could be used against us. That might just be annoying, the way it is when we’re spammed with advertising. Or it could be much worse. Fraudsters might break in and use our data to steal from us. It might even be used to swing an election.

GDPR and the confidence to share data

Of course, services that use our data could bring us huge benefits too. But people won’t accept that, unless they know they’re in control of their data.

Last year, a survey asked Europeans if they’d be willing to share medical data for research – anonymously. You might think medical research was clearly a good thing. But only one in five people said they’d share their data for medical research in the public sector – and only one in seven would share it with private companies. It’s not that people aren’t willing to share data. But they won’t do it, if they don’t trust that they’re being told the whole truth about what it’s being used for. And I don’t think that feeling is unique to Europe.

That’s why it seems to me that Europe’s new rules on data protection, which started to apply last month, are only the first step in a change that will spread. Already, big businesses like Microsoft have said they’ll apply those principles beyond Europe.

That new law – it’s known as the GDPR, which you have to admit isn’t the sort of punchy name you’d find in a science fiction script – is all about putting the control of data where it belongs – with the people whose data it is. So they know who has access to their data, and what they’re going to use it for.
So they can be confident that the companies they deal with won’t siphon off their data overseas, to ‘data havens’ where the rules don’t apply.

Google

Without trust, we won’t get the most from technology. And our antitrust case about Google Shopping has shown how hard it can be for consumers to know what’s really going on.

People tend to believe that Google’s search algorithm will show them the most relevant results at the top. So when users found that, though the algorithm demoted Google’s rivals to – on average – page four, its own service was shown prominently right at the top of the first page – well, it was natural to assume Google Shopping must be the best. But it wasn’t the algorithm that put Google at the top. Google Shopping appeared first because of a conscious decision by Google not to apply the algorithm to that service. And yet consumers just saw Google Shopping, which seemed the most relevant – and not its rivals.

Algorithms and democracy

Algorithms can help us find our way through the huge amount of information on the Internet. But the risk is that we only see what these algorithms – and the companies that use them – choose to show us. And the things that they hide might as well not exist. That can be a serious threat to our democracy. Because democracy’s not just about voting. It’s about discussing ideas in public. So everyone has a chance to be heard.

Not long ago, a sound file was going round the internet – a short clip of a voice saying a name. Some people heard it as “Yanny” – others as “Laurel”. That clip was a sensation. And I think the reason is simple. It challenged the most basic belief we need to live together in society. That we all hear and see the same things.

And when politics moves to our social media timelines, we can’t be sure of that. I can’t tell if you hear “Yanny” where I’m hearing “Laurel”. I can’t debate the ideas that you’re hearing.
Because no one but you – and possibly the social media company – knows which ads and which news you can see.

Making technology work for us

In the past, there’s been a temptation to expect the tech industry to solve its own problems. Perhaps that’s because we fear innovation will suffer, if we regulate the digital world more. But that makes no sense. Supporting innovation doesn’t mean accepting every new thing, just because it’s new. The job of an innovator isn’t just to come up with new ideas. They also need to convince us that their product is worth the price. And when we insist that technology doesn’t harm our values, that doesn’t mean we’re rejecting innovation. It just means we need to be sure the price is not too high.

Conclusion

It means putting people, not technology, back in control. We’re dealing with businesses that are big and powerful. But we, as a society, are powerful too. In Europe, for instance, we have a single market of more than 500 million people. And that’s definitely big enough to make companies pay attention.

The same companies that, not long ago, transformed our world with new ideas, have become the establishment. They have the power to protect their position, by holding back the next generation of innovators. But our competition rules allow us to protect innovation – as we’ve recently done with Amazon, and as we keep doing when we check that mergers don’t give companies so much data that no one else can compete.

And Europe also has the size to put strong rules in place, like the new rules on data protection. Like the law we proposed last month, to help make sure online platforms and search engines treat their business customers fairly. Or the ethical guidelines for artificial intelligence, which we plan to present by the end of this year. Because our fundamental values are at stake here – our freedom, our democracy, our equality. And it’s up to us all to stand up and protect them.
So they won’t be lost – as they say in Blade Runner – like tears in rain.

This article is based on a speech delivered at Brain Bar Budapest, 1 June 2018