Analysis of GDPR Article 28

From Dan Shearer CV

This analysis examines the text of the GDPR and the computer science and cybersecurity principles it implies, in order to derive a lawful solution for sharing secrets.

Executive Summary

The GDPR sets up a conflict in trust between companies in particular circumstances, which can only be resolved by automating a cryptographic audit trail with the particular properties described below.

Problem Statement

Under the text of the EU's GDPR law virtually every company is a Controller, and virtually all Controllers use at least one Processor. When a Processor is engaged, the GDPR requires that a contract be signed with the very specific contents spelled out in paragraph 3 of Article 28. The GDPR requires that Controllers and Processors cooperate in order to deliver data protection, and this cooperation needs to be very carefully managed to maintain the security and other guarantees that the GDPR also requires. That is what this mandatory contract is intended to achieve.

In other words, the GDPR simultaneously requires strong cooperation and strong security between companies that neither trust each other nor share common goals. This is difficult to resolve, but it must be resolved, because the parties are legally required to sign a contract saying it is resolved.

What Does Article 28 Say?

Here are a few highlights extracted from the full text of Article 28.

The processor:

3(a) ... processes the personal data only on documented instructions from the controller

3(c) ... takes all measures required pursuant to Article 32 [about security]

3(f) ... assists the controller in ensuring compliance with the obligations pursuant to Articles 32 to 36 taking into account the nature of processing and the information available to the processor;

3(h) ... makes available to the controller all information necessary to demonstrate compliance with the obligations laid down in this Article and allow for and contribute to audits, including inspections, conducted by the controller or another auditor mandated by the controller

About Controllers and Processors

If you are familiar with the European Union's GDPR and the roles of the Controller and Processor, then you will be aware of the need for a formal agreement (usually a contract) between a Controller and every Processor it uses.

Effectively, every company is at least a Controller of personal data, certainly if it has employees or customers. Most companies use at least one Processor for data under their control, from an IT support company, to online/cloud storage providers, to companies that consult and outsource in many ways. A contract between a Processor and a Controller is very carefully specified in legal terms in the GDPR, but the technology implications are not mentioned. This is all in GDPR Article 28.

About Sharing Access to Data

The Key Problem

Access to Data is Impractical to Share: Controllers and Processors could be almost any company using almost any system, so sharing access to the personal data across organisations just wouldn't work. Personal data is stored in a different way in every organisation - at a minimum, in different choices from among the 200-odd mainstream databases on the market, besides all the non-database systems in use, and the policies and habits unique to every company.

Access to Keys can be Practically Shared: No matter how diverse the storage mechanism for the personal data, the secret keys will always be one of a few types. Most often passwords, but also multifactor authentication codes, or cryptographic key files, or one of a small list of other means of authentication.

Nobody Wants to Share Keys: Article 28 says that these passwords or other keys need to be available for sharing between Controllers and Processors at any time. And yet no company is happy handing out passwords to its internal systems to unknown people, and in any case this could easily become a breach of the GDPR and the forthcoming ePrivacy legislation.

Where Computer Science Comes In

When a Controller engages a Processor, there are many circumstances under the GDPR when these secret keys need to be shared between these parties, parties who should not trust each other. These circumstances include:

  • conducting audits of privacy and security, as the GDPR anticipates the controller will need to do
  • allowing the controller to know precisely who has access to the data residing on the processor's systems
  • allowing the controller to know precisely who *has* accessed the data on the processor's systems
  • allowing the controller to see when the list of users with access changes, and how it has changed

Therefore the handling of the keys to the personal data is of crucial importance. The law requires that you give some degree of access, perhaps a lot of access, to a company you have never met and have no reason to trust. Computer Science has given us many ways to think about interacting with parties we do not trust, so this is a problem that can be solved.

All About Keys

Taken together, a Computer Science view of the various laws, regulations and regulatory bodies strongly implies that some kind of cryptographically guaranteed auditing process is needed to control the keys required to access personal data held by a processor.

The inputs to consider are:

  • the post-trilogue texts for the forthcoming EU ePrivacy Regulation
  • the EU Cybersecurity Act
  • the European Communications Code
  • the EU NIS Directive
  • EU-level security and privacy bodies ENISA and the European Data Protection Board.

All of the above is human rights-based legal pressure, so how should this be implemented in software?

Article 28(3)(c) says "takes all measures required pursuant to Article 32", so Article 32 is part of the mandatory contract. Article 32 is about security, and security has some fairly clear definitions - or at least unambiguous definitions of what is not secure, along with some well-agreed minimum standards for what good security requires. There will always be some degree of ambiguity about whether something is secure, but something need only be competently shown to be insecure once for experts to agree that it is indeed insecure.

To discover exactly what kind of cryptographic solution will work, we need to look at the information flows mandated by the GDPR.

GDPR Article 28 Information Flow

A close reading of the mandatory instruments (normally contracts, but not necessarily) in GDPR Article 28 shows that the required flow of information between Controllers and Processors is almost entirely one way, from the Processor to the Controller. The Processor has to make numerous undertakings and promises to the Controller, stated in a legally binding manner.

  graph LR
      controller((Data Controller .)) == "sends audit requests to ." ==> processor((Data Processor .))
      processor -- "replies with audit answers ." --> controller
      processor -- "informs of breach ." --> controller
      classDef green fill:#9f6,stroke:#333,stroke-width:2px;
      classDef orange fill:#f96,stroke:#333,stroke-width:4px;
      classDef blue fill:#99f,stroke:#333,stroke-width:2px;
      class controller green
      class processor blue
      class i1 orange

(Diagram source here: GDPR_Information_Flows)

There is a great deal of mandated potential communication from the Processor to the Controller, meaning that in various circumstances there will be communication that the Controller can initiate at will and that the Processor has no right to refuse. At any time the Controller can demand that the Processor produce information proving that processing is compliant, or require the Processor to assist the Controller in certain activities. The Controller is bound by the GDPR to be able to prove at all times that processing is compliant, whether or not a Processor has been engaged.
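These mandated flows could be modelled as explicit message types exchanged between the parties. The following Python sketch is purely illustrative (the class and field names are my own, not drawn from any standard); it shows the three flows in the diagram above, with the breach flow corresponding to the processor's duty to notify the controller under Article 33(2):

```python
from dataclasses import dataclass

@dataclass
class AuditRequest:
    """Controller -> Processor: a demand the Processor cannot refuse."""
    controller: str
    scope: str  # e.g. "who has access to keys for system X"

@dataclass
class AuditAnswer:
    """Processor -> Controller: the mandatory reply, with evidence."""
    processor: str
    request: AuditRequest
    evidence: str

@dataclass
class BreachNotification:
    """Processor -> Controller: breach notice per Article 33(2)."""
    processor: str
    description: str

# The asymmetry is visible in the types: only the Controller originates
# requests; everything else flows from the Processor back to the Controller.
req = AuditRequest(controller="SmallCo", scope="key access list")
ans = AuditAnswer(processor="GiantCloud", request=req, evidence="signed log extract")
```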

When the Controller is a small company of 20 employees and the Processor is a giant cloud provider with thousands of employees, it is clear that this power dynamic is not currently set up for the small company to require the giant cloud to comply with audit requests. Giant clouds are required to respond, but most of the time (as of 2023) it's going to take a lot of effort to force practical compliance.

Relationship of the Parties to Article 28 Contracts

Basic security practice is that the parties to such information flows should not trust each other; they are independent entities who in many cases will have no other dealings. In addition, each is under the very strict legal requirements of the GDPR, the (imminent) ePrivacy Regulation, and the (imminent) EU Electronic Communications Code.

Article 28(1) says "the controller shall use only processors providing sufficient guarantees". According to the computer science referred to in this article, it is possible to define a minimum value of "sufficient guarantee" under the GDPR, but even without that analysis, the Controller must seek some guarantees from the Processor, and they need to be not merely good guarantees but guarantees sufficient to back up the rest of Article 28.

This means that parties to Article 28 contracts are required to meet a particular standard, but also that the parties should not trust each other to meet this standard or any other good behaviour.

Article 28 is All About Processors

Article 28 is all about the Processor being bound to the Controller, with the Controller saying and doing nothing outside what is already said in the rest of the GDPR text. The only reference to a Controller in Article 28 is that the contract must "set out the obligations and rights of the controller" (Art 28(3)) which appears to mean effectively stating "Yes I acknowledge I am a Controller and I am acting according to the GDPR".

There are just two references in the entire GDPR requiring the Controller to take action with respect to using a Processor. The first is ensuring that there is a contract in place that complies with the GDPR. The second is in Article 32(4), which says "the controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller".

Technical Comments

Article 32 emphasises the phrase "state of the art", an English expression that has caused much confusion. The phrase is only ambiguous within the confines of English, and since the GDPR is authoritative in multiple languages we can easily compare the German, French and Dutch versions and see that they all agree with one of the English meanings. "State of the art" therefore means the art as it is generally practiced today - as widely practiced by people who can be described as technical peers, and as defined by standards bodies and the like. It does not mean the best, most efficient or most advanced technology in existence, nor the most recent. The technologies considered in this discussion were mostly developed decades ago and are very widely recommended and deployed today, which makes them definitely "state of the art".

Technical Analysis About Audit Records: A log file (Unix) or an Event Log (Windows) is not a quality audit record. It has often been held to be a sufficient audit record in courts worldwide, but in that context it is a matter of balance of probabilities, taking into account other log entries created on other systems at the same time - basically a detective hunt by an expert witness. That is an audit process, but not a good one, and it typically involves only one logging party. The GDPR Article 28 contract requires that there be at least two parties to the audit trail whose actions are logged, which has not been the case in any law previously. The new EU security and privacy laws use the words "appropriate", "should" and "state of the art" so much that I think it is non-controversial that the audit standard required is much higher. There needs to be a cryptographically guaranteed, non-repudiable audit trail for activities where none of the actors involved (including auditors) need to trust each other, and where no special expertise or context is required to interpret the record.
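The core mechanism behind such an audit trail is a hash chain: each entry commits to the previous one, so any party can independently verify that the record has not been rewritten after the fact. A minimal Python sketch follows (illustrative only; genuine non-repudiation would additionally require each party to sign its entries with an asymmetric key, which is omitted here):

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # fixed value anchoring the start of the chain

def entry_hash(body: dict) -> str:
    """Canonical SHA-256 of an entry body (sorted keys for determinism)."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_entry(log: list, actor: str, action: str) -> dict:
    """Append an entry whose hash commits to the previous entry."""
    prev = log[-1]["hash"] if log else GENESIS
    body = {"actor": actor, "action": action, "ts": time.time(), "prev": prev}
    entry = {**body, "hash": entry_hash(body)}
    log.append(entry)
    return entry

def verify_chain(log: list) -> bool:
    """Any party can recompute the chain; tampering with any entry breaks it."""
    prev = GENESIS
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        if e["prev"] != prev or entry_hash(body) != e["hash"]:
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, "controller", "audit-request")
append_entry(log, "processor", "key-released")
assert verify_chain(log)
log[0]["actor"] = "someone-else"  # retroactive tampering...
assert not verify_chain(log)      # ...is detected by any verifier
```

Because verification needs only the log itself and a hash function, neither party has to trust the other's infrastructure to check the record, which is exactly the zero-trust property the analysis above calls for.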

Technical Analysis About Keys: A key of some sort is always required to get access to personal data, be it a password, a passphrase, a physical door pinpad code, a two-factor authentication code, or whatever else guards access to the systems holding personal data. The contract mandated by Article 28 specifies that under many circumstances a Controller and a Processor release keys to each other, and therefore to natural persons in the employ of each other. By auditing the use of the keys, we are auditing access to the personal data. To remain in compliance with Article 32, we can change passwords/keys at any time and reset the list of authorised persons, thereby also resetting the audit trail. A cryptographically secured audit facility can detect the first time that someone accesses a key.
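The two operations described here - tracking who is authorised for a key and detecting first access, and resetting everything when the key rotates - can be sketched as follows. This is a hypothetical illustration of the bookkeeping only (the class and method names are my own); in a real system every method would also append to the tamper-evident audit trail:

```python
import time

class KeyAuditRegistry:
    """Tracks which natural persons are authorised for each key,
    and records the first time each person actually accesses it."""

    def __init__(self):
        self.authorised = {}    # key_id -> set of person ids who *can* access
        self.first_access = {}  # (key_id, person) -> timestamp of first use

    def authorise(self, key_id, person):
        self.authorised.setdefault(key_id, set()).add(person)

    def rotate(self, key_id):
        """Changing the key resets the authorised list and access history."""
        self.authorised[key_id] = set()
        self.first_access = {k: v for k, v in self.first_access.items()
                             if k[0] != key_id}

    def access(self, key_id, person):
        """Record an access; returns True if this is the person's first use."""
        if person not in self.authorised.get(key_id, set()):
            raise PermissionError(f"{person} not authorised for {key_id}")
        is_first = (key_id, person) not in self.first_access
        if is_first:
            self.first_access[(key_id, person)] = time.time()
        return is_first

reg = KeyAuditRegistry()
reg.authorise("db-password", "alice")
assert reg.access("db-password", "alice") is True    # first use is flagged
assert reg.access("db-password", "alice") is False   # later uses are not
```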

Technical Analysis About the ePrivacy Regulation: I have tracked down the different versions presented for Trilogue, which has now finished. ePrivacy following Trilogue appears to include the EU Parliament LIBE Committee amendments from October 2017, including Article 26(a): "In order to safeguard the security and integrity of networks and services, the use of end-to-end encryption should be promoted and, where necessary, be mandatory. Member States should not impose... backdoors". If we are to have an audit facility for keys to personal data, then it should be end-to-end. Like all end-to-end solutions it will upset government spy agencies and any other party that might want to falsify the record through government-imposed backdoors, because such backdoors cannot work according to mathematics.

Technical Analysis About the EU Code of Communications: The Code is broader than ePrivacy (which, it can be argued, is limited by its lex specialis relationship to the GDPR). The Code says: "In order to safeguard security of networks and services, and without prejudice to the Member States' powers to ensure the protection of their essential security interests and public security, and to permit the investigation, detection and prosecution of criminal offences, the use of encryption for example, end-to-end where appropriate should be promoted and, where necessary, encryption should be mandatory in accordance with the principles of security and privacy by default and design." We know from Snowden and others that the "without prejudice" phrase is just being polite, because there is no technical means to implement no-backdoors end-to-end crypto that does not also make government spy agencies upset.

Minimum Audit Records Required by Article 28

Detail of Required Audit Records, with their basis in law:

Requirement 1 - Audit records that list all natural persons who have access to keys to the personal data, and the changes to that list over time:

Article 28(2) "shall not engage another processor", so everyone can see whether or not an unexpected person was authorised for access to keys
Article 32(4) "any natural person acting under the authority of the controller or the processor who has access to personal data", so we need an audit log of who *can* have access to keys
Article 32(4) "any natural person acting under the authority of the controller or the processor ... does not process them except on instructions", so we need an audit log of who actually *did* access the keys at least once.

Requirement 2 - Audit records for who has accessed the audit records in Requirement 1:

Article 28(3) "obligations and rights of the controller", shows the controller is watching the processor

These audit records can be technically implemented regardless of what IT systems the Controller and the Processor have, because they are only about dealing with the keys. Whoever has the keys has the personal data, and the keys themselves are protected by the GDPR. These audit records are about storing passwords (or other access means.)
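Because the records concern only key handling, their shape is the same whatever IT systems the parties run. A minimal sketch of the record types implied by Requirements 1 and 2 (field and class names are my own, for illustration):

```python
from dataclasses import dataclass

# Requirement 1: who *can* access keys, how that list changes over time,
# and who actually *did* access them.
@dataclass
class AuthorisationChange:
    key_id: str
    person: str
    change: str       # "granted" or "revoked"
    timestamp: float

@dataclass
class KeyAccess:
    key_id: str
    person: str
    timestamp: float

# Requirement 2: who has read the Requirement 1 records themselves -
# e.g. the controller, or an auditor it mandates under Article 28(3)(h).
@dataclass
class AuditTrailAccess:
    reader: str
    record_type: str  # which class of Requirement 1 record was read
    timestamp: float
```

In a deployment, every instance of these records would be an entry in the cryptographically guaranteed audit trail discussed above, so that neither party can deny or rewrite what was granted, accessed, or inspected.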

Computer Science doesn't seem to allow any way of meeting Article 28 "sufficient guarantee" without a zero-trust encrypted audit model, which these types of audit records enable.

Conclusions About Implementing GDPR Article 28 in Software

Conclusion 1:

the above minimum audit records are required to fulfill an Article 28 contract between Processor and Controller

Conclusion 2:

if implemented, these records rise to an Article 28(1) "sufficient guarantee" of a Processor being acceptable, and therefore of the contract being acceptable

Conclusion 3:

there does not seem to be any alternative way of achieving a "sufficient guarantee".

Conclusion 4:

The GDPR requires cryptographic audit facilities to exist and therefore, there is a market for companies to provide these facilities.