Recently, I have been reading Yuval Noah Harari’s latest book, 21 Lessons for the 21st Century. Harari became famous for a previous book, Sapiens, which retells the history of our species from a completely fresh perspective. The book became a worldwide success, contains a host of mind-blowing ideas (no spoilers here, don’t worry), and can be considered, without exaggeration, one of the most influential books of this century.

In fact, I would argue that 21 Lessons succeeds in its difficult mission of improving upon the outstanding Sapiens. This time, Harari focuses on the recent past and present, all the while keeping an eye on the future. The book collects and summarises, in a simple and coherent way, many of the themes which are already part of our life: themes which are influencing our present and could determine our future. First of all, he focuses on Artificial Intelligence (AI). I find Harari’s book extremely effective in demolishing some commonplace stereotypes associated with AI, and it gives a realistic, though sometimes coloured, portrait of possible futures. In particular, Harari supports the idea that algorithms will soon be an even more fundamental part of our life than they currently are. Decisions, at both the national and international level, are increasingly taken by algorithms in numerous fields. Finance is just one example, where complex calculations can determine the price of stocks, while their growing complexity will continue to make them more difficult for humans to understand.

The discussion about AI, machine learning and data analysis is certainly not new, and Harari is certainly not the foremost expert on these topics. In this book, however, he achieves two very important things. First, he is able to discuss these topics in a simple way which, contrary to most discussions on the subject, is clear to the general public. Second, and even more importantly, he paints a picture which is balanced and free of judgement. Harari is particularly effective in describing the potential positive and negative outcomes of the current process. On the one hand, these technologies give us unprecedented resources to improve our society and far better instruments of decision-making than ever before. He stresses, for example, how medical assistance could become more advanced and, crucially, accessible to everyone regardless of where they are. On the other hand, he emphasises that the outcome will depend massively on how we use these technologies. This is one of the truths of history: you can use nuclear energy to build a power plant or, indeed, to build a bomb. In the same way, Harari underlines how information can be used to develop democracies, by improving how they function, or to consolidate dictatorships, which would gain unprecedented, unrivalled and unrelenting instruments of repression.

He is particularly direct in his warning on this point. Democracies have not yet found answers to the new technological questions we find ourselves facing, and their survival depends on finding such answers. He goes into detail about the Trump and Brexit campaigns and the role played by Cambridge Analytica in influencing the outcome of those votes, citing them as examples of the beginning of this new phase of politics. For those who are not familiar with these topics, Cambridge Analytica was a corporation that acquired Facebook users’ data using unethical and downright illegal methods. The goal was to create a credible psychological profile of every user. Through these profiles, they were able to influence elections by targeting people and bombarding them with content supporting the views of their candidates, often driven by fake news. For those who are interested in going deeper into this topic, I recommend two eye-opening documentaries available on Netflix: The Great Hack and The Social Dilemma. Here is the trailer of The Great Hack.

Last week, at a conference on data analysis organised by Wayco in Valencia, I spoke together with Dermot Kavanagh about the relationship between politics and data analysis, a video of which you can find at the beginning of this article. During this conference, we presented Arbury Road’s position on the topic. As we know, the question of data protection is not new and has been a topic of huge debate over the last 15 years. Many of us have likely seen the famous video of Alexandria Ocasio-Cortez questioning Mark Zuckerberg about Facebook ads, making direct reference to the Cambridge Analytica scandal. Yet my feeling is that there is still not enough attention in public discourse on the questions of privacy and data protection. The amount of power that data-collecting platforms hold is unprecedented for private companies, and very little regulation has been enacted. Google and Facebook have far more information on you as an individual than any government, including the most authoritarian, had on individuals only 30 years ago. It must be stressed: these are private companies. How are they using our data? What kind of data do they use? Who will limit their power? All of these questions need answers, and their importance is second only to the climate crisis in determining the future of our society. As Harari said, either we find an effective answer or the foundations of our democracies will collapse.

Arbury Road supports the idea that an effective, pan-European and binding law on data protection is necessary, and so we were optimistic when the European Union approved the General Data Protection Regulation (GDPR), which came into force in 2018. A groundbreaking piece of legislation in its own right, the GDPR saw Europe take the lead in data protection, and it was used as a model by many nations outside the EU, such as China, Japan and Brazil. The GDPR’s shortcomings, however, become clear at a practical level. It creates difficulties for businesses that may, for example, record calls as a matter of practice. More importantly from our point of view, the practical problems for the individual are clear: people are inclined to accept the terms and conditions simply in order to access whatever webpage they are trying to reach. This defeats the main purpose of the GDPR: the granting of informed consent. If people are liable to consent without reading, which, I assure you, they are, then companies could take advantage of this by misrepresenting exactly what consumers are being asked to allow. Even if companies do not misrepresent themselves, the GDPR is merely a step on the road to putting personal data back in the hands of the individual.

Data collection and data analysis are, and will continue to be, among the most important political questions facing us in the years to come. They are already used commercially by Big Data companies like Amazon, Facebook and Google to connect businesses with the right clients. Of course, privacy is reduced. On the other hand, if properly regulated, these are new tools which provide a better service to both businesses and clients. If the question is already complicated in the business sector, it is twice as complex when it comes to politics. Again, the spread of fake news over the last 20 years, which has been rife during the coronavirus pandemic, is only a small example of the negative consequences targeted tools can have if not properly regulated. We need to find a solution. If it is possible to find an ethical, legal way of collecting data for politics, it should be implemented as soon as possible. If it is not, such collection should simply become illegal.

However, if we decide that it is legal, progressives should also consider using these tools to secure good scientific data and put it in the hands of the population. It is time, in my opinion, for progressives to counterattack and to use these techniques as a tool against fake news. One side of the traditional political battle has lagged behind for far too long, and ignoring a problem just makes it bigger. We have to admit that the right wing, and in particular the so-called “sovereignist” and populist parties, has found better ways of communicating its message over the last 10 years, as well as utilising more effective techniques in spreading its vision. It is time to fight back. If used in the right way, after the implementation of airtight regulation, consensual and transparent data collection and targeting could be an instrument for putting reliable and verifiable scientific data in the hands of the people.

The question of data is one of the fundamental political issues of today, and its importance will only grow as our technical capabilities expand. A widespread public awareness campaign is necessary, as the problem is still largely absent from public discourse. People still do not know how much data multinationals hold on them. This data needs to be transparent: people need to be able to see it and to ask for it back. If an ethical solution to this dilemma can be found, data analysis cannot be left in the hands of the populist movement. The progressive front needs to use this technique to strengthen its message. The future of our democracy, and maybe even of our planet, depends on it.