Data law | UK Regulatory Outlook July 2024
Published on 25th Jul 2024
UK King's Speech 2024: new Digital Information and Smart Data Bill | ICO publishes annual report 2023/24 | TechUK and others call on the new government to modernise UK's data protection framework
Summer call to action
Data-oriented businesses and enterprises developing new and innovative products will be encouraged by the mention of a new Digital Information and Smart Data Bill in the King's Speech and the potential for the setting up of smart data schemes. Although the priorities set out in the King's Speech do not represent actions that the government is obliged to take during the next Parliament, businesses should keep an eye out for any announcement of a publication date for the draft bill. Once published, it will need careful scrutiny to understand its full implications, not only in respect of smart data and digital identity verification, but also in relation to the "targeted" reforms to some data laws that the government mentioned, as these have the potential to affect businesses across the board. At the moment, it is unclear what the new Labour government means by "targeted" reforms as there was no mention of them in Labour's manifesto. In any event the devil will, as always, be in the detail. Businesses should be aware, however, that if there is nothing too controversial included that warrants lengthy debate, the new legislation could proceed through to enactment relatively quickly.
UK King's Speech 2024: new Digital Information and Smart Data Bill
The King's Speech 2024 took place on 17 July 2024 and included a surprise: the introduction of a new, standalone Digital Information and Smart Data Bill (DISD bill). It was expected that the new Labour government would re-introduce the smart data and digital identity verification provisions from the old Data Protection and Digital Information bill (DPDI bill), which fell during the "wash-up" period, but it was thought that the provisions would be included in a new AI bill. In the event, there was no AI bill; instead, a standalone DISD bill was announced.
The government says that the bill will put digital verification services, smart data schemes and the national underground asset register on a statutory footing. It will modernise and strengthen the Information Commissioner's Office (including giving it new, stronger powers), allow scientists to use data on the basis of "broad consent" for scientific research, and clarify that scientists undertaking research in commercial settings have equal rights when it comes to using data for research. It will also allow coroners to preserve and access online information when investigating a child's death.
Notably, the DISD bill will not introduce more wide-scale reforms to the UK data protection regime, as was proposed under the Conservative government's DPDI bill (such as reforming requirements originating from the EU's General Data Protection Regulation). However, the government did say that there will be "targeted" reforms to some data laws where there is currently a "lack of clarity" that is blocking technological advances and adoption.
The government promises that these reforms will "maintain high standards of protection" and that "standards for digital identities around privacy, security and inclusion" will be promoted, but privacy groups have raised concerns, fearing a lowering of regulatory standards. While the UK's data protection law regime will continue to stay closely aligned to the EU's for now, the UK government's intentions will only be properly understood once the new bill is published.
Please see our Insight giving an overview of the King's Speech for more.
ICO publishes annual report 2023/24
The ICO has published its annual report and financial statements 2023/24, setting out a review of its performance over the last year, as well as an accountability report and financial statements.
The report notes that the emergence and take-up of AI had a huge impact on the ICO over the year. This meant that the ICO's plans were disrupted as it grappled with the data protection and privacy issues linked to the development and use of AI, as well as the question of AI regulation.
It also meant that the regulator had to divide its "finite resources" between thinking about the data protection and privacy implications of AI and other emerging technologies, and continuing its traditional role of regulating and enforcing. The report says that this is an "ongoing process", but one which still has "empowering people through information" at its heart.
In terms of meeting its four core objectives over the last year, the report points to (among other things):
Safeguarding and empowering people:
- the development of a subject access request (SAR) tool to help people submit good-quality SARs, which is intended to assist both sides;
- its work on protecting children's privacy by conducting research, adopting new policy positions, providing guidance on "likely to be accessed by children" in the Children's code and revising the Commissioner's Opinion on age assurance for the Children's code;
- the publication of new guidance for providers of biometrics recognition systems, which is due to be followed up later this year with a second phase, focusing on biometric classification and including a call for evidence; and
- its work around cookie compliance, which involved warning the top 100 websites that they would face enforcement action if they did not make it as easy for users to reject all advertising cookies as it is to accept them.
Empowering responsible innovation and sustainable economic growth:
- the publication of its views on emerging technologies, such as its work on the privacy implications of neuro-technologies, its reports on immersive technologies and quantum computing, and its Tech Horizons report;
- its work to reduce the cost of compliance by developing a new Regulatory Action Framework to ensure consistent decision-making, hosting its free annual conference, and launching its "Innovation Advice" service, which aims to help with compliance when developing a new product or service;
- the publication of sector-specific guidance (together with representative groups), providing more targeted compliance advice;
- the production of a child safeguarding resource tool and a 10-step guide to sharing information to safeguard children;
- its work supporting the government in undertaking adequacy assessments and making regulations to enable the free flow of personal data to trusted partners around the world; and
- the development of new tools for approving UK binding corporate rules (BCRs), including the new UK BCR addendum, which has accelerated the approval process.
Promoting openness and transparency:
- the increase in its handling of complaints in relation to freedom of information (FOI) requests, focusing its "limited resources" on areas where it can have the most impact, and implementing its new prioritisation approach to FOI cases, which allowed it to clear the post-Covid backlog of cases; and
- its work on revising its approach to enforcement in respect of public authorities, increasing its use of warnings, enforcement notices and reprimands.
Continuously developing the ICO's culture, capacity and capability:
- the provision of internal training, updated strategies (such as its Enterprise Data Strategy) and process reviews to ensure a better focus on its work, to be more inclusive, to improve transparency and to provide regulatory certainty.
In terms of forward-looking statements, the report notes that the ICO continues its work in the following priority areas: children's privacy, AI and biometrics, and online tracking.
Overall, the report points to the ICO being stretched, due to its finite resources and the very fast development and uptake of AI across the economy, as well as the emergence of other new technologies. It has clearly had to streamline some of its regulation and enforcement operations and find solutions, such as the recently concluded two-year pilot of a revised approach to regulating public sector organisations, to ensure that it can effectively cover everything in its remit while reassuring the public that the protection of their rights remains front of mind.
TechUK and others call on the new government to modernise UK's data protection framework
Just before the general election, techUK, alongside several other business groups, signed a letter urging the next government to modernise the UK's data protection framework through reforms that foster innovation, while maintaining high data protection standards.
The letter said that a progressive regulatory framework that maintains strong privacy standards will encourage economic growth, improve public services, ensure easier use of data for research and provide greater flexibility for data transfers.
The letter urged the government to prioritise data reform in various ways including (among other things) providing legal certainty around the use of legitimate interests as a lawful basis for processing, creating a more flexible international data transfers regime, and reforming the UK's data regulatory framework in such a way that the UK retains its EU adequacy decision, "which remains of utmost importance to businesses across all sectors".
It also urged the government to introduce smart data schemes, clarify how data can be better used to support scientific research, and to modernise the Information Commissioner's Office to align its organisational structure with other UK regulators, all of which will be included in the new DISD bill, as announced by the government in the King's Speech 2024. In fact, following the King's Speech, techUK welcomed the DISD bill, "particularly given techUK's open letter on the need to modernise the UK's data protection legislation".
EDPB launches AI Auditing and Standardised Messenger Audit projects to develop pilot tools to assess GDPR compliance
The AI Auditing project from the European Data Protection Board (EDPB) aims to map, develop and pilot tools to assist with evaluating the GDPR compliance of AI systems and applications. The objective of the initiative is to assist with understanding and assessing data protection safeguards in the context of the EU AI Act. It also aims to help Data Protection Authorities (DPAs) inspect AI systems by providing a methodology in the form of a checklist to use when conducting an audit of algorithms.
The EDPB's Standardised Messenger Audit project aims to provide tools for both DPAs and companies to assess the GDPR compliance of messenger services used by businesses. The project provides a test catalogue of mandatory, recommended and optional requirements that a GDPR-compliant messenger frontend will need to meet.
EDPB adopts statement on role of DPAs in supervising, applying and enforcing EU AI Act
At its latest plenary on 16 July 2024, the EDPB adopted a statement on the role of Data Protection Authorities (DPAs) in relation to the impact of AI on the fundamental right to privacy and the protection of personal data. As the EDPB says, many AI systems involve the processing of personal data.
The EU AI Act states that the legislation aims to ensure a high level of protection to fundamental rights, which include privacy and the protection of personal data. Therefore, in the EDPB's view, data protection and privacy legislation should be considered as complementary to the EU AI Act and, given that DPAs already have experience of the impact of AI on fundamental rights, they should be designated as Market Surveillance Authorities (MSAs) under the EU AI Act in respect of certain high-risk AI systems (namely, AI systems used for law enforcement, border management, administration of justice and democratic processes). In addition, Member States should consider appointing DPAs as MSAs for other high-risk AI systems that impact individuals' rights and freedoms, in particular their right to privacy.
The EDPB also says that DPAs designated as MSAs should be made the single point of contact for people and the different regulatory bodies involved in both the EU AI Act and in EU data protection law. Additionally, it would like to see the establishment of clear procedures for cooperation between MSAs and other regulatory authorities involved in the supervision of AI systems, including DPAs, as well as between the EU AI Office and DPAs/the EDPB.
ICO publishes response to CMA's consultation on new digital markets competition guidance
Noting in its response that competition and data protection law have overlapping objectives (that is, driving better outcomes for consumers) that are strongly aligned in the context of digital markets, the ICO sees the Digital Markets, Competition and Consumers Act (DMCCA) as an opportunity to expand on its cooperation with the CMA through the Digital Regulation Cooperation Forum where their remits intersect. In the ICO's view, this is key to ensuring effective regulation and to providing a coherent regulatory landscape for digital services businesses that process personal data.
In the ICO's view, the CMA's new powers to impose conduct requirements and make pro-competitive interventions are likely to have a significant impact on the way in which businesses designated as having strategic market status (SMS) process personal data. For example, any CMA intervention requiring the fair use of data or data sharing will need to be designed with privacy and data protection in mind. Consequently, the ICO says that it is "vital" that it is consulted at an early stage.
The ICO says that it should also be involved in the designation of SMS businesses, as firms likely to be given this status will be those that already collect and process very large amounts of personal data. The ICO supports the CMA's proposal to adopt bilateral memorandums of understanding with the regulators, including the ICO itself, that the CMA has a statutory duty to consult.
Finally, the ICO says that further work is needed on the section in the CMA's guidance on accepting compliance with data protection laws as a "reasonable excuse" for non-compliance with the DMCCA to ensure that firms are not forced to choose between breaching data protection law and having a penalty imposed by the CMA.
ICO publishes blog post and video series encouraging app users to read privacy notices
In a blog post on its website, the ICO warns users that signing up to an app often involves "handing over large amounts of your sensitive information" and encourages them to read the privacy notice properly.
The ICO refers to last year's review of period and fertility apps, which was undertaken to understand whether the operators were processing people's personal data responsibly and which resulted in the regulator reminding all app developers of the importance of protecting users' personal data.
The ICO has also produced a series of short videos for users, encouraging them to ask themselves these key questions when signing up to an app:
- Is the privacy notice clearly written and easy to understand?
- Will they delete your data when you do not want to use the app any more?
- What measures do they have in place to prevent hackers from accessing your personal information?
- Who are they sharing your information with?
- Are you happy with where your personal information could end up?
The blog post is a useful reminder for app operators of the issues that must be addressed in the privacy notice to ensure that users can respond positively to the key questions the ICO encourages them to ask.