
Interview led by Leonie Krone from the University of Amsterdam

Presentation of the interviewer

My name is Leonie Krone and I am a third-year PPLE student (Politics, Psychology, Law and Economics) at the University of Amsterdam (UvA), with a special focus on political science. As part of my current course, “Introduction to Public Policy”, I am exploring the issues associated with implementing the GDPR in an effort to protect consumer data.

 

The policy issue I seek to address is how the GDPR, serving as the primary standard for data security, can be improved, (re)phrased and enforced to better protect consumers’ fundamental right to control their personal data. This issue leads us to the dilemma between the extent of state intervention to protect data and a free market economy that thrives on innovation; a dilemma I am looking to research and understand in more depth.

 

In the hope of gaining a more in-depth understanding of and expert opinion on these issues, I would like to engage you as a stakeholder in an interview for my research project. The following is an outline of the preliminary questions I am interested in: How effective has the GDPR been in protecting private users from illegitimate data analysis practices? Is there a dangerous lack of education about and understanding of data protection regulation among the general public? If so, who should be responsible for educating the public: the EU, nation-states or data companies? Would increased state or EU interference improve the data security of private users? If so, at what cost to the economy and to democracy?

Presentation of the interviewee

My name is Bruno Sivanandan Roques de Borda. I am a consultant in business transformation, which nowadays involves major strategic decisions being taken amid the rise of digital regulation led by the GDPR. My goal is to help executives understand the key concepts of their now ubiquitous IT systems and make the right decisions with regard to their business strategy.

 

Before answering your questions, I would like to make some key statements about the GDPR that will help us throughout the interview. I will try to break the problem of data protection into smaller problems so that we can approach it in a constructive way. Take a given information system as an example: we can separate it into two layers, what the user sees and what is happening under the hood. Although these two aspects of the system are correlated, they should be distinguished. The point is that the “technical” part, where all the data processing takes place, needs to be protected from cyber-attacks.

 

Is my system secure? Is my data encrypted? Is the service provider’s staff adequately trained …?

 

This is a necessary condition for data and privacy protection, but it is NOT sufficient. The “functional” part, which is what the user experiences, is where the legislation frames what we can legally do with the data.

 

Is the user provided with relevant information about data collection and its purpose? Is it convenient for them to exercise their rights? Are they able to lodge a complaint with a data protection authority? …

What are the biggest problems in the field of data protection regulation currently?

One of the major issues is the cost of reaching compliance. The implications of a compliance program are so vast that it is unlikely many businesses could finance such a transformation. To minimize this cost, the GDPR could be improved by issuing more precise guidelines on how to implement the user-facing experience mentioned above. The regulation is clear on the concepts: consent, minimization, users’ rights … But there is room for interpretation in how one should implement those constraints.

 

For example, service providers, acting as data controllers, must keep a log of users’ consents. They must show proof of this consent when audited by a data protection authority, but it is up to the service provider to choose the technical solution. This leads to a disparity in the solutions adopted and makes it harder to audit and certify the ecosystem of service providers.
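
To make this concrete, here is a minimal sketch in Python of what one entry in such a consent log might look like. The field names and structure are purely illustrative assumptions on my part, not an official GDPR schema or any particular provider’s solution; the point is simply that each consent (or withdrawal) is recorded with enough context to show an auditor.

```python
# Illustrative sketch only: hypothetical consent-log entry, not an official GDPR schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass
class ConsentRecord:
    """One entry in a data controller's consent log (invented fields)."""
    user_id: str          # pseudonymous identifier of the data subject
    purpose: str          # e.g. "newsletter", "behavioural analytics"
    granted: bool         # True = consent given, False = consent withdrawn
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    policy_version: str = "v1"   # which privacy notice the user actually saw
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))

# An append-only log is one common design choice: a withdrawal is a new record,
# not a deletion, so the controller can show the full consent history to an auditor.
log: list[ConsentRecord] = []
log.append(ConsentRecord(user_id="u-123", purpose="newsletter", granted=True))
log.append(ConsentRecord(user_id="u-123", purpose="newsletter", granted=False))
```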

 

On the technical side, the guidelines are more mature for big corporations. Think of ITIL or ISO 27001, for example, which describe the state of the art in operating a system and managing its evolution over the years while ensuring adequate quality controls are in place. Those frameworks are extremely heavy and are hardly transposable to smaller systems such as those of SMEs, so there is room for guidelines here too. The trend is going in this direction, with more and more SaaS being used, shifting security onto big players; but for small companies whose strategy is to keep their IT in-house, it can be hard to comply.

What is the efficacy of the GDPR?

The authorities entitled to enforce the GDPR are issuing sanctions to an array of companies ranging from SMEs to worldwide corporations. Recently, H&M was fined €35 million for misusing employees’ data. That starts to sting, even for a big corporation, especially amid a pandemic.

I would say the trend is positive: the GDPR regulatory ecosystem is becoming more and more powerful and effective. However, the political power of huge companies such as the GAFAM is so strong that even the US or the EU has difficulty making them submit to its legislation. Even the coalition of big corporations that recently boycotted Facebook did not do the Silicon Valley giant much harm. The GDPR definitely took a step in the right direction, as it has brought the topic at least partially into the spotlight.

 

However, it will take time for the business environment to adapt to this new legal framework. Whether with a carrot or a stick, businesses will have to reach the point where they are transparent about their data processing if they want to comply with the law. GDPR compliance is akin to a financial audit: companies spend a lot of resources on the latter, to know where every dollar is going, and they will have to commit to the former, to know where every bit of data flows in their systems.

How to solve this dilemma: state interference vs. free market economy?

I think this is a societal issue; there is no perfect answer about what our future should look like. If we consider the extremes: without any regulation, people would have access to a lot of free services at the price of behemoth companies accessing every bit of their data. At the other extreme, the regulation would be so strict that companies would have to spend so much on compliant systems that economic development would be hindered and data-driven business opportunities limited.

 

This business vs. privacy axis should be supplemented by a surveillance vs. privacy one.

 

Indeed, states have a clear interest in maintaining as much information about their populations as possible. The recent breakdown of the Privacy Shield is a clear illustration of this problem. To wrap this up, it is up to the public to draw the line between privacy and the economy, as well as between privacy and security. What, as a people, do we value? This should, in my opinion, drive our society and our economy. There is also innovation to be made if we want to come up with economic models that include privacy protection. Some companies already take a pro-privacy stance; a major one is Apple, whose CEO, Tim Cook, clearly opted for a strategy protective of its users’ personal data. While we can consider the state to represent the interest of the people, it is also interesting to see how the market of consumers reacts to such brand positioning.

 

Less data collection means more privacy, but it also implies a loss of quality in the user experience.

 

Another example of a pro-privacy company is Qwant, a French search engine that protects its users’ privacy. I first heard of it in a training session given by the French intelligence services, so it did draw attention from people interested in secrecy. But the point is that it did not get much traction, because the user experience is mediocre compared to that of Google. Qwant’s positioning was essentially “We are not as good as Google, but we respect your privacy,” and people clearly answered: “We’d rather give up our privacy for better search results!”

Is there a lack of education among the public, and who should bear the cost?

There is a need for massive information campaigns so that people know what is at stake, can form opinions, and can transmit those opinions to the government through the democratic process. This shortfall is partly due to the speed at which digital technologies have become ubiquitous in our lives, which left educational institutions little time to adapt and fill this gap in understanding.

 

This gap prevents people from formulating enlightened opinions about what they want as a res publica (republic), a term that refers to everything we commonly share as a community. For example, we often hear individuals say that they do not care about sharing their data in exchange for a free service: “I have nothing to hide.” But even if our lives are trivial taken individually, the aggregation of this data with new technologies wields enormous power. The Cambridge Analytica scandal is a turning point in this matter: millions of trivial lives fed into machine learning models allowed a single entity to change the course of an election. A deeper understanding of the implications of freely sharing data could make people change their minds about what future they would advocate for.

 

As for who should bear the cost of educating people: I think that private companies, since their goal is to make a profit, should not be entrusted with it, though they should be free to dispense knowledge under the guidance of the state. The EU could issue guidelines on how to shape the education system, but there is no federated effort on education among the member states, which leaves it to the states to implement educational programs. This has already started in numerous places, where children are given basic programming courses whose concepts will be fundamental in the future for grasping the machines we will see everywhere.

 

Let us take the example of the Cambridge Analytica scandal again:

 

The notions necessary to grasp what really happened are not particularly complicated, but they are not yet part of our common knowledge. The necessary notions are: algorithms feeding on historical data to build a statistical model of human behaviour, and other algorithms labelling us based on what we do and using the model to infer what we want or how we would react to given inputs. This is not much harder than most European languages’ grammar!
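
As a purely hypothetical illustration of those two notions, the sketch below (toy data, invented feature names, nothing to do with the actual systems involved) shows the general shape of such a pipeline: a statistical model is fitted on historical behaviour, then used to infer how a new user might react.

```python
# Toy illustration of behavioural profiling; the data and features are invented.
from sklearn.linear_model import LogisticRegression

# Step 1: historical data. Each row summarises one user's past behaviour
# (e.g. "likes" per category: sports, cooking, politics), and each label records
# something already known about them (did they engage with a given political
# message, 1 = yes, 0 = no).
past_behaviour = [
    [12, 0, 3],
    [1, 8, 0],
    [0, 2, 15],
    [3, 1, 9],
]
engaged = [0, 0, 1, 1]

# Build a statistical model of behaviour from that history.
model = LogisticRegression()
model.fit(past_behaviour, engaged)

# Step 2: a new user is described by what they do, and the model infers
# how likely they are to react to the same kind of message.
new_user = [[2, 1, 11]]
print(model.predict_proba(new_user)[0][1])  # estimated probability of engaging
```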

What is the future of privacy regulations?

For this, we need to consider the EU within the international scene, because in practice a lot of data flows out of the EU toward the US or Asia, and this is a major issue in terms of privacy. Not only is the reach of the data protection authorities much weaker in those areas, but digital sovereignty in those countries has interests that somewhat conflict with privacy regulation. So increased EU interference could mean greater influence over companies’ systems. For instance, restricting web hosting to GDPR-certified providers would be a disaster economically, potentially making digital services much more expensive than they are now.

 

The current system relies on standard contractual clauses or binding corporate rules that EU entities have to sign with their third parties abroad. This works on a legal level, but in practice it is quite hard to audit a system that claims to be compliant through a contract. There is currently a worldwide problem with data regulation and international data transfers. Because nowadays any business relies on digital data transfers, regulating them has taken on a diplomatic dimension. As mentioned earlier, there is a dichotomy between users’ privacy and data sovereignty at the governmental level. It is not easy to find the right compromise within a nation, and finding it between nations is even harder.

 

To conclude, the EU interfering too much could have a diplomatic impact on the world scene, tied to an economic downturn if restrictions go too far or if some business partners are proscribed for one reason or another. This is why I think that, although the legislation is issued at an institutional level and sets the boundaries within which economic actors thrive, real change in terms of privacy has to come from market demand. As scandals about poor cyber security or blatant disregard for privacy grow more frequent, the impact on brand image will start to be felt in companies’ revenues.

 

 

When users choose a respectful solution over a standard one … we will naturally see more and more of the former. Technologically speaking, the audit process will become more common and will be facilitated by specific tooling. For instance, you cannot claim to respect your users’ privacy if your algorithms are not transparent. Enabling audit processes while respecting the intellectual property rights bound to those algorithms is technically possible, but getting all the stakeholders involved to agree on a framework at the international level will take time!
