Before using AI for personal data - check your A-I
A mnemonic for data protection and AI
I love mnemonics – words or phrases built from the first letters of a list of points, spelling out something that would otherwise be difficult to remember.
I relied on them throughout university and law school and have used them ever since – whether that’s checking the technical and organisational measures used by a business (BATFIST*) or ensuring that, when acting as a processor, you have the right measures in place (CRAPNOW*).
So I’m delighted to have another one – specifically for looking at the data protection issues around AI. The questions are based on those suggested by the ICO in its white paper on AI, but (hopefully) made more memorable!
Before you AI, you should ask yourself (and your organisation) these questions, from A to I…
A is for “Automated Decision Making”
Will the AI be used to subject individuals to automated decisions (e.g. job applications or eligibility for insurance)? If so – and those decisions have legal or similarly significant effects – then individuals have rights under Article 22 of the UK GDPR to object to that automated decision making.
B is for “Basis” – what is your lawful basis for processing personal data?
If you are processing personal data, you must identify an appropriate lawful basis, such as consent or legitimate interests. On what basis are you processing data using AI? Do you have consent? Is the processing pursuant to a contract? Do you have a legitimate interest, or another lawful basis?
C is for “Controller” – are you a controller or a processor?
If you are developing generative AI using personal data, you have obligations as the data controller. If you are using or adapting models developed by others, you may be a controller, joint controller, independent controller or a processor. You need to work out what your relationship is with the data subjects in order to identify your obligations and make sure you have the right contractual arrangements in place.
D is for “Data Protection Impact Assessment” (DPIA)
You must assess and mitigate any data protection risks using a DPIA before you start processing personal data. Your DPIA will need to be kept up to date as things change.
E is for “Explicit” – have you told data subjects?
You must be explicit about the processing, making information about it publicly accessible unless an exemption applies. Unless it would take disproportionate effort, you must communicate this information directly to the individuals the data relates to.
F is for “Fulfil” – does your use of data fulfil the purpose you stated?
You must collect only the data that is adequate to fulfil your stated purpose. The data should be relevant and limited to what is necessary.
G and H are for “Guard” against “Harmful” security risks
This is about data security. Consider the various risks of cyber-attack and the use of (for instance) ChatGPT plug-ins.
(You might think I cheated with the G & H referring to the same question, but it’s my mnemonic – so I’m sticking with it!)
I is for “Individual rights”
You must be able to respond to people’s requests to exercise their rights of access, rectification and other individual rights. How will you respond to data subject access requests (DSARs) that involve the AI system you are using?
Do you find this useful?
I have reduced much of my professional (and personal) life to a series of mnemonics.
Do others do the same – or just me?