Artificial Intelligence (AI) is top of mind for organizations worldwide as machine learning replaces manual, human-based computation with greater efficiency and better security at scale. Over forty-seven percent of digitally mature organizations, that is, those with advanced digital practices, report having a well-defined AI strategy. Now that the European General Data Protection Regulation (GDPR) is in full effect, how will AI handle the processing of personal data? With many organizations wanting to correlate large data sets using AI, what unexpected correlations could put your organization in harm's way? Will the logic behind the AI that analyzes personal data comply with the GDPR? Let's take a look at some of the most frequently discussed topics on using AI under the GDPR.
Processing Personal Data Has Restrictions
Among the many obligations facing organizations that deploy AI to process customer data is compliance with GDPR rules governing the processing of personal data for EU data subjects. Note that "EU data subjects" can include anyone, even non-EU citizens, who logs in and provides personal information on a platform while in the EU. This bears repeating: the GDPR applies to the personal data of all individuals in the EU, regardless of whether those individuals are EU citizens. For organizations deploying artificial intelligence platforms that interact with personal data inside the EU, attempting to restrict or segment data based solely on location or nationality is a formidable challenge, given the ease with which data crosses national boundaries. As we will see, there is a way around this conundrum.
Unexpected Data Correlation
Organizations frequently use machine learning to find correlations in data sets too large, complex, and varied for humans to efficiently recognize patterns of value. These algorithms continually learn by processing massive amounts of data, and this dependence on large data sets creates potential conflicts with the GDPR. The GDPR requires organizations to minimize the amount of personal data they process and to have a legal basis for processing that data, such as explicit consent from data subjects. Organizations may not know in advance what correlations they will find, but the unanticipated correlations themselves are not the primary issue; rather, the GDPR requires that data be collected and processed for a specific purpose. An organization can't simply decide to process personal data sets to look for interesting patterns because it chooses to do so. It must have a lawful, business-necessary reason to process personal data, and it must clearly communicate that reason to the affected individuals. A simple example of a problematic unanticipated correlation: a retailer analyzing the shopping habits of its customers inadvertently reveals details of their religious beliefs or sexual orientation.
Logic Behind AI Decisions Is Subject to GDPR Compliance
AI algorithms are complex by nature, and the logic behind how decisions are reached (often not completely understood even by the AI's designers) is subject to GDPR compliance and enforcement. The risk can be minimized by ensuring that, when an organization makes automated decisions based on personal information, the logic involved can be explained to individuals who request it. Where AI complexity makes this difficult, a human should make the ultimate decision, for example by reviewing AI-suggested outcomes.
Privacy Notice Front and Center
U.S. organizations deploying AI must also meet GDPR requirements, such as the obligation to transparently communicate the rights of the data subject in multiple ways. A privacy notice that clearly articulates what personal data an organization collects and how that data will be used is essential for complying not only with the GDPR but also with U.S. privacy regulations, such as California's new privacy rules. On platforms with limited visual space, such as smartphones, a layered privacy notice can present key privacy information up front, with links to more detailed information for those individuals who want it.
Pseudonymize Personal Data and Outsource Your Risk with Tokenization
The 99 articles of the GDPR create a number of obligations for organizations, but the core of the legislation is designed to protect personal data. The GDPR repeatedly cites pseudonymization as an appropriate technical measure for protecting personal data. Tokenization is a form of pseudonymization, and it is a data protection measure TokenEx has been providing to organizations for almost a decade. Relying on our industry-leading data protection platform, TokenEx clients pseudonymize the identifying elements of personal data in their environments through cloud-based tokenization, enabling them to demonstrate GDPR compliance through data protection by design and by default, as well as through appropriate technical measures such as encryption and cloud data vaulting. In effect, clients outsource their GDPR privacy risks to TokenEx while lowering their data protection compliance obligations. Remember: No Data, No Risk, Lower Compliance Costs.
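At its core, tokenization replaces an identifying value with a random surrogate and stores the real value in a separate, secured vault, so data sets used for analytics no longer contain direct identifiers. The sketch below illustrates the concept in Python; it is a simplified, hypothetical example (the `TokenVault` class and field names are inventions for illustration, not the TokenEx API), and a production token vault would add encryption, access controls, and audit logging.

```python
import secrets

class TokenVault:
    """Minimal illustration of tokenization as pseudonymization:
    each identifying value is replaced by a random surrogate token,
    and the real value is held only in a separate vault."""

    def __init__(self):
        self._vault = {}    # token -> original value
        self._reverse = {}  # original value -> token (keeps tokenization stable)

    def tokenize(self, value: str) -> str:
        """Return a random, non-derivable token for the given value."""
        if value in self._reverse:
            return self._reverse[value]
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only the vault holder can do this."""
        return self._vault[token]

vault = TokenVault()
record = {"name": "Ada Lovelace", "email": "ada@example.com", "basket": ["novel", "tea"]}

# The working data set keeps only tokens for identifying fields.
pseudonymized = {
    "name": vault.tokenize(record["name"]),
    "email": vault.tokenize(record["email"]),
    "basket": record["basket"],  # non-identifying fields remain usable for analysis
}
```

Because the tokens are random rather than derived from the data, the pseudonymized record is useless to anyone without access to the vault, which is what allows the vault operator to take on the protection burden.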
John Noltensmeyer, CIPP/E/US, CIPM, CISSP, ISA is the Privacy and Compliance Solutions Architect for TokenEx. TokenEx is the industry leader for tokenization, encryption, and data vaulting. Follow us on Twitter and LinkedIn.