Security breach notification and automated individual decision-making under the General Data Protection Regulation
Published on 20th Nov 2017
In an effort to help controllers and processors apply certain obligations under the General Data Protection Regulation, the Article 29 Working Party has published guidelines on the obligation to notify security breaches and on automated individual decision-making, including profiling, clarifying concepts that raise interpretational doubts and providing recommendations and examples applicable in practice.
The Article 29 Working Party (the "WP 29"), a collegiate body comprising representatives of the data protection authorities of the European Union Member States, adopted on 3 October 2017 guidelines on the obligation to notify security breaches and on automated individual decision-making, including the profiling of individuals.
In terms of security breach notification, the WP 29 structures its recommendations around a threefold classification of security incidents, depending on whether they affect the confidentiality, availability or integrity of the data. The body provides examples in which a security breach could simultaneously affect the confidentiality and availability of personal data, stressing the importance of analysing case by case whether a security breach involving a temporary loss of availability of personal data requires notification to the supervisory authority and to the individuals concerned.
The WP 29 considers that the 72-hour period for notifying a security breach to the supervisory authority starts from the moment the controller has a reasonable degree of certainty that an incident affecting personal data has taken place. Notably, and without prejudice to the above, controllers must have internal processes that allow them to promptly detect and investigate whether a specific incident affects personal data and, therefore, whether they must notify the supervisory authority and the individuals concerned. These internal processes should also include an analysis of which supervisory authority must be notified. Such processes, together with the implementation of other technical measures and, where appropriate, the inclusion of breach notification obligations in arrangements with processors, should be understood as appropriate security measures to detect and notify security breaches without undue delay.
The guidelines provide examples in which controllers must notify the security breach to the affected individuals, such as when the usernames, passwords and purchase history of customers of an online shopping platform are published on the internet as a result of a cyberattack, or when patients' clinical records in a hospital are unavailable for a period of 30 hours as a result of a cyberattack. Other examples are also provided in which notification to the individuals concerned will depend on the nature of the personal data affected and on the severity of the consequences that the security breach may have for those individuals.
Finally, the WP 29 guidelines emphasise the need to analyse the risks that a specific security breach entails for individuals, taking into account both the probability and the severity of that risk. Notably, the collegiate body considers that the risk analysis must take into account criteria such as the type of breach (whether it affects the confidentiality, availability or integrity of the data), the nature, sensitivity and volume of the personal data affected, the ease of identifying the individuals concerned, the severity of the consequences for those individuals, the special characteristics of particular groups of individuals concerned (such as children), the number of individuals affected and the characteristics of the controller.
On the other hand, the WP 29 highlights the benefits that profiling (which allows aspects as diverse as work performance, economic situation, health or personal preferences, among others, to be analysed and predicted) and automated individual decision-making bring to the economy and to society in general. However, it also highlights that such activities pose risks in the field of data protection and warrant the adoption of appropriate safeguards.
The WP 29 guidelines include definitions of the concepts of profiling and automated individual decision-making, indicating that, although the two techniques can overlap, the distinguishing feature of automated individual decision-making is that no human is involved in reaching the decision.
The WP 29 recognises a general prohibition on automated individual decision-making that produces legal effects on the individuals concerned (for example, an automated decision on whether someone is granted or denied a particular social benefit) or that affects them in a similarly significant way (where the decision significantly influences their behaviour or choices). It notes that controllers cannot circumvent this prohibition by including human involvement in the decision-making when that human intervention does not play a meaningful role and lacks the authority and competence to alter or change the decisions.
In addition, the body stresses that controllers must make an effort to provide meaningful information about the logic involved in profiling and automated individual decision-making (where these are not prohibited) in a way that is understandable to the individuals concerned, and to inform them of the criteria and reasons relied on to reach such decisions and the effects those decisions may have on them. This information can be conveyed through real and tangible examples, avoiding complex technical explanations that could have the opposite effect on the individuals concerned.
Finally, the guidelines set out a series of principles and obligations common to profiling and automated individual decision-making, and also note the complexity and particularities that automated individual decision-making affecting children can entail.
Although these guidelines are not definitive, as they are subject to a public consultation process, they constitute a first step towards clearing up interpretative doubts about some ambiguous concepts in the law itself, in particular through the inclusion of practical examples and specific recommendations for particular situations.