What have you done, Zuckerberg? Roundtable on data protection

The General Data Protection Regulation (GDPR) became applicable across the European Union on 25 May, following a two-year transition period. Besides unifying the previously fragmented data protection rules that existed in various forms, the regulation gives users more say over their data, and with it more responsibility, while at the same time affording their data greater protection.

Owing partly to the growing number of scandals in the past decade about large amounts of personal data becoming accessible, news of the regulation's introduction reached the broader public. The latest scandal to receive wide media attention involved Cambridge Analytica, a British data analytics firm that became notorious for claiming to have brought Donald Trump's campaign to a successful conclusion using big data, that is, by drawing on the data of 87 million people. The case also implicates Facebook, and thus Mark Zuckerberg, as many believe the company did not handle users' personal data carefully enough. Since then, Zuckerberg has appeared before the European Parliament, where he spoke about the measures Facebook was planning and already implementing against manipulation and data abuse, and assured the sceptical MEPs of Facebook's compliance with the GDPR.

At the beginning of May, the Institute of Sociology and Social Policy hosted a roundtable discussion on data protection and the risks associated with big data in the light of the Cambridge Analytica scandal. The invited experts, Karolina Mojzesowicz and Anna Vancsó, addressed what can happen to our personal data when we use social media sites, online platforms and applications, what we can do to protect our data, and how Hungarian and EU legislation protects users. The discussion was moderated by the journalist Ágnes Győr. Karolina Mojzesowicz is Deputy Head of the European Commission's unit in charge of data protection (DG Justice and Consumers); Anna Vancsó is a PhD student at the Institute and a senior analyst at Neticle Labs.

The roundtable organized by the Institute was opened by István Vilmos Kovács, Director of International Relations and Innovation. Recalling his university years, he said that data science had not yet come to the fore in those days; big data had not even been in its infancy. Today we create a vast amount of data every minute, and it is better not to dwell on our data footprint: if we are conscious and mindful users, we need not constantly worry about what might come to light about us. In Mr Kovács's view, data cannot only be misused but can also be put to good use, for instance in the university environment in the form of instant feedback during presentations or the real-time display of questions from the audience.

In her presentation Karolina Mojzesowicz summarized the contents of the GDPR, the new data protection regulation. By reinforcing data protection, the European Union wishes to deal with shocking cases like that of Cambridge Analytica and to prevent the spread of disinformation. To achieve this, the differences between the member states' data protection rules must be eliminated and the rules harmonized, thereby clarifying and modernizing legislation that previously existed at the member-state level. The GDPR contains directly applicable rules that must be met by every entity operating in any EU member state that handles the data of European Union citizens. It also applies to companies outside the European Union whose goods and services target the member states.

Mojzesowicz also stressed that the entities and companies themselves must ensure their compliance with the rules of the GDPR. The handling of sensitive data, such as sexual orientation, religion, political views and health data, imposes greater responsibility on them: such data may only be used and stored under strict conditions. One of the important benefits of the regulation is that users gain greater control over exactly who may use or store the data they provide, and how, for what purpose and for how long. The information given to users must be transparent, comprehensive and easy to understand.

Anna Vancsó pointed out that we use numerous freely accessible services, and although we do not pay money for their use, we provide our data. In fact, these personal data are what is valuable to service providers like Facebook and Google, which offer their services free of charge. Neticle Labs, a company focusing on media monitoring, media analysis and the analysis of online comments, encounters many cases in which it is hard to determine the right course of action. If, for instance, someone appears in the analysed content under their full name, should that name be protected as personal data? We cannot be sure whether the full name was included intentionally. We live in an age of visibility and sharing; one could even say that "what is not on Facebook has not even happened". The question is whether others can be restricted or controlled with the sole aim of protecting them, when this is not our responsibility.

With respect to the use of data, Anna Vancsó added that many consider big data dangerous, as all our data have become more visible and more accessible; yet big data can be just as useful: the huge databases now available can serve good purposes, among others in cancer research and similar areas of social interest. In most cases individuals are not reluctant to provide their data: with the spread of smart devices, more and more applications are appearing whose download implies automatic consent to the handling of one's data. In the US in 2016, the time spent on phone applications was shown to decrease with age (source: Statista), but these statistics do not cover under-18s, who are known to spend even more time on phone applications than older generations. Under-18s cannot be expected to act as responsible consumers, that is, to consciously read the terms of use and privacy policy of the application or website they are using.

During the roundtable discussion Anna Vancsó stressed that the key issue in data protection is trust: how does one decide whether to trust a company, entity or service provider? Until now, several different institutional data protection rules were in force across the European Union, and the degree of trust within society also varies from country to country. Data theft unfortunately occurs quite often: databases containing personal data are frequently leaked in such a way that they become accessible even to those with little technical experience. Vancsó said that the Facebook scandal became so widely known because it was linked to a political campaign.

Karolina Mojzesowicz said that the generations differ in how they protect their data: older generations are often unaware that using a pseudonym is in many cases not sufficient, since when comments are analysed it is not the name but rather a few characteristic phrases that make a person easily identifiable.

The contributors closed the roundtable by underlining the responsibility of private individuals. For example, with the spread of cloud-based storage, many people keep data about others in the cloud. Each of us is responsible for preventing unauthorized access to such data.

Borbála Szczuka