Data protection: Guidance for employees
Data protection by Design and by Default
Data protection by design aims to define and evaluate risks to the personal data to be processed, and to determine and implement appropriate controls to reduce those risks to an appropriate level; Data protection by default aims to ensure that data protection is a natural consideration embodied within the process, not bolted-on as an afterthought.
The GDPR requires that the data controller (and/or data processor) implements appropriate technical and organisational measures designed to implement the data protection principles in an effective manner. These measures should balance the risks posed by the processing against the cost of implementation and what is technically possible.
In support of this the University has developed a Data Protection Impact Assessment template to capture key information about the personal data being processed, the risks to that data and the controls to address them. Whilst this is intended to be completed as a first step for a new solution or data processing activity, it can also be used to understand and document risks to an existing activity. Implementing controls retrospectively may be more costly and difficult than doing so for a new activity, but the benefit should outweigh the impact of a personal data breach or loss.
When assessing activities that process personal data, it is essential that the end-to-end processing is considered and that all relevant supply chain activities are included.
The ICO has not yet developed any specific guidance in this area and intends to publish something in the future, but does have existing guidance on privacy by design.
- Personal Data in Training and IT Development
Using personal data in training materials (e.g. user manuals, PowerPoint presentations) may constitute a breach of data protection legislation. You must ensure that any data used in training materials is fictional data created for the purpose. If this is not practical for any reason, please consult the Data Protection Department for advice prior to developing materials.
Wherever possible, the same precautions should be taken when using IT system development and test environments for both in-house and third party solutions.
In addition, care should be taken when considering reproducing any data that could be commercially sensitive, e.g. financial information.
- Direct Marketing
Direct marketing activities are covered by several pieces of UK legislation in relation to personal data, technologies and means, supported by guidance from the Advertising Standards Authority. Direct marketing covers the promotion of aims and ideals as well as the sale of products and services.
The rules for postal mail marketing are less strict than those for electronic marketing. The Privacy and Electronic Communications Regulations (PECR) provide rules about sending marketing and advertising by electronic means, including phone, fax, email and messaging. PECR rules also cover the use and management of website cookies and telephone directories.
Use of personal data for direct marketing purposes falls within the general principles of data protection legislation, including the General Data Protection Regulation (GDPR). In addition, these laws place particular conditions on processing personal data for direct marketing purposes, whether electronic or otherwise.
For example, under the GDPR the use of personal data for direct marketing relates to the lawful basis under which the data was collected (so potentially whether consent is required), and to individual rights, including the right to be informed of the purposes for which data will be processed and the right to object to specific processing (so processing for one purpose may be acceptable but use for direct marketing may not be). The nature of the data subject may also bring particular requirements to bear: children are subject to special conditions.

Under PECR, organisations must not send electronic direct marketing to individuals without their prior consent. PECR includes an exception for previous customers, known as the soft opt-in. However, the GDPR has tightened the rules relating to consent: where different processing activities take place, consent options should be unbundled so that an individual has more choice over what their data is processed for. Opt-in must be explicit and cannot be pre-selected, i.e. option boxes should be unchecked by default and require user action to demonstrate consent and understanding of the data processing.
Where an individual objects or removes consent to direct marketing, those activities must stop within a reasonable time. However, rather than remove an individual from a contact list, it may be necessary to retain the contact's details and recorded opt-out in order to prevent them from re-appearing in the contact list at a later date.
Refer also to the ICO's published guidance on direct marketing.
Mailing Lists
If you send newsletters or communications to subscribers on a mailing list you should be satisfied that you have established a lawful basis to process this information. Document the reasoning for the selection of the lawful basis, for example within a record of the processing activities, covering areas such as processing purposes, data sharing and retention. This should be retained to demonstrate compliance with GDPR.
GDPR also increases the data controller's obligations with regard to bought-in lists. The data controller must verify that contacts on the list have consented to use of their personal data for such purposes.
- Filming and photography
This guidance relates to images of people, including photos and videos, for the University's own purposes. It applies to images that are planned to be taken in the future as well as those currently stored by the University.
If you are engaging a third party to take pictures for publicity or marketing, you need to give extra consideration to the purpose and to the information that will be recorded.
Do: Consider whether there is an alternative to using people in images. If not, follow the guidance below.
Do: Consider the purpose of capturing the images, and any supporting data that you are recording regarding the content of the images. In general, if an image can be used to identify a living individual, it is likely to be personal data, and by taking and using such images you will be processing that personal data.
Do: Identify whether images you plan to capture are personal data.

Examples of images not likely to be personal data:
- A picture of a face
- A video of someone in the street
- Images of a crowd or a group of individuals
- Images taken for personal use only, e.g. photos taken by family members at a graduation ceremony. This should be possible unless there are other local conditions that prevent it, such as a ban on photography and filming

Examples of images likely to be personal data:
- A picture of a face with information that identifies the individual, e.g. a name badge, a caption specifying a year and degree subject, annotated names, or metadata (a set of data that describes and gives information about other data) in the file
- A video of someone in the street used when investigating an incident; whether the subject of the video is involved or not, the image becomes personal data if they can be identified
- Body Worn Video. This is typically used for safety and security reasons and to provide evidence of activities, so the images processed are likely to be personal data. Please refer to the operational procedure for assistance in the selection and use of Body Worn Video

What should I do if images are not personal data?
Do: Respect the preferences of individuals who request not to be recorded
Do: Inform those present that images are being taken
Do: Consider making announcements or putting up signs to inform event attendees that images will be captured by the University
Do: Consider issuing coloured badges or lanyards that can be used to identify those individuals who did not want to be included in any images

What should I do if images are personal data?
Do: Use the University's guidance to ensure that the appropriate lawful basis is identified to enable you to process the personal data.
Do: Remember that if consent is the lawful basis, this consent can be withdrawn at any time and the participants will need to be informed of this right.
Do not: Continue to use existing images of individuals, or take further images of them, where consent was the lawful basis and this has been withdrawn. To do so would be in breach of data protection legislation.
Do: Refer to the University's guidance to ensure that privacy information is given to those being photographed / filmed.
Do: Contact dataprotection@sheffield.ac.uk for advice if needed
Do not: Use images for a purpose other than the one stated

- Individual Rights
Data subjects have a range of specific rights that they can exercise under the GDPR. The University must be able to comply with these rights in order to meet the GDPR requirements. We must respond without undue delay and within one calendar month, although this can be extended by a further two months in certain circumstances.
Individual rights are provided for under GDPR as follows:
- Right to be informed: Individuals have the right to be made aware of how their personal data is being used. This should be documented and communicated in a Privacy Notice available at the point of data collection.
- Right of access: Individuals have the right to access their personal data so that they are aware of, and can verify the lawfulness of, the data processing, as well as correct any inaccuracies in that data. There are some circumstances under which the University will consider a request for access to personal data on behalf of another individual, or a request for access to personal data of another individual without their consent.
- Right to rectification: Individuals have the right to have personal data rectified where it is inaccurate or incomplete.
- Right to erasure: Individuals have the right to request the deletion or removal of personal data where there is no compelling reason for its continued processing. This is often called the 'right to be forgotten'. This right is not absolute and only applies in specific circumstances.
- Right to restrict processing: Individuals have the right to ask us to temporarily stop processing their personal data in certain circumstances whilst such processing is reviewed.
- Right to data portability: Individuals have the right to obtain and reuse their personal data for their own purposes across different services. It allows them to move, copy or transfer personal data easily from one IT environment to another in a safe and secure way, without hindrance to usability. Note that this only applies:
  - To personal data an individual has provided to a controller,
  - Where the processing is based on the individual's consent or for the performance of a contract, and
  - When processing is carried out by automated means.
- Right to object: Individuals have the right to object to:
  - Processing based on legitimate interests or the performance of a task in the public interest/exercise of official authority (including profiling)
  - Direct marketing (including profiling)
  - Processing for purposes of scientific/historical research and statistics.
- Right in relation to automated decision making and profiling: Individuals have the right not to be subject to a decision made solely by automated means and to profiling (automated processing of personal data to evaluate certain things about an individual).
Lawful basis
The General Data Protection Regulation (GDPR) requires that a data controller establishes a lawful basis for each and every personal data processing activity it performs directly, or indirectly via any data processors.
There are six lawful bases available and each requires that processing is 'necessary'. If the same outcome can be achieved without processing the personal data then to process it would be unlawful. The lawful bases are:
- Consent
- Contract
- Legal obligation
- Legitimate interests
- Public task
- Vital interests
- Consent
Consent is one of the six lawful bases under which personal data may be processed in compliance with the General Data Protection Regulation. The guidance below should help determine whether consent is an appropriate legal basis for a proposed personal data processing activity and if so, help to document consent adequately.
Is consent always required?
No. The GDPR requires that personal data processing activities are all underpinned by a legal basis for the processing. There are six lawful bases, of which consent is one. The most appropriate legal basis should be established according to the particular data processing activity to which it relates, including the nature of the relationship of the University to the data subject(s) and the purpose of the processing.
Explicit consent is one of the ways in which to legitimise the processing of special category data (sensitive personal data), but is not the only way this can be done.
When is consent necessary?
Consent is necessary when none of the five other lawful bases can be relied upon.
When is consent inappropriate?
It is inappropriate to seek consent as a lawful basis for the processing of personal data when the data subject has no real choice regarding the proposed processing activity.
Consent should not be chosen as a lawful basis if:
- The University would still process the personal data under a different lawful basis if consent were refused or withdrawn. Seeking consent in these circumstances would be misleading and unfair; the other lawful basis should be used instead of consent.
- Consent is being sought as a precondition of using a service.
- The University is in a position of power over a data subject, meaning that the individual does not feel that they have a genuine choice with regard to providing or refusing consent.

One example of good practice within the University would be to hold a copy of a signed and dated consent form within a student file. Similarly, consent records relating to a research project should be maintained within the project documentation.
What do I need to do if an individual withdraws consent?
An individual has the right to withdraw consent at any time. A data controller must ensure that it is as easy for an individual to withdraw consent as it was for them to provide it. Where possible, an individual should be able to withdraw consent in the same way that they provided it.
For example, if the University were to provide the public with an online form to provide consent to receiving a newsletter, the University should also provide an online form to allow the public to withdraw consent, made easily available via a link within the newsletter itself.
Individuals must be able to withdraw consent without suffering any detriment. If an individual withdraws consent, the lawfulness of the processing of their personal data up to that point is not affected. However, consent can no longer be relied upon as the lawful basis for continuing to process it. An alternative lawful basis will need to be established or the processing must stop as soon as possible. Where processing must stop, in some cases the University will be able to action this immediately, whilst in others a short delay may be justified.
When seeking consent to process personal data for a number of purposes or types of processing, a separate opt in should be provided for each, unless valid consent can be sought collectively. Individuals should not be asked to consent to 'all or nothing' in these circumstances as they may wish to consent to some of the purposes or types of processing but not others.
How should consent be recorded?
An audit trail must be created to show how and when consent was given so that evidence can be produced in the event of a challenge. Good record keeping will also help support the monitoring and refreshing of consent. Records must demonstrate the following:
- Who consented (e.g. the name(s) of the individual(s) or some other identifier)
- When the consent was provided (e.g. a copy of a dated document, an electronic time stamp or a note of the time and date that verbal consent was given)
- What the data subject(s) was/were told at the time they provided consent (e.g. a copy of a document or data capture form, or a copy of a script used to gain verbal consent)
- How the data subject(s) consented (e.g. online, on paper or verbally)
- Whether consent has been withdrawn and, if so, when.

A simple illustration of such a record is sketched below.
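As an illustration only, and not a prescribed University format, a record covering the points above might be structured like this minimal Python sketch (the field names are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative audit-trail entry for one data subject's consent."""
    subject_id: str                          # who consented (name or other identifier)
    given_at: datetime                       # when consent was provided
    information_shown: str                   # what the data subject was told (e.g. form text or script reference)
    method: str                              # how consent was given: "online", "paper" or "verbal"
    withdrawn_at: Optional[datetime] = None  # when consent was withdrawn, if ever

    @property
    def is_active(self) -> bool:
        """Consent can only be relied upon while it has not been withdrawn."""
        return self.withdrawn_at is None

# Example: recording consent and a later withdrawal
record = ConsentRecord(
    subject_id="subscriber-0042",
    given_at=datetime(2018, 9, 1, 10, 30),
    information_shown="Newsletter sign-up form v2 (copy retained)",
    method="online",
)
record.withdrawn_at = datetime(2019, 3, 15, 9, 0)
assert not record.is_active
```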
Which methods can be used to obtain consent?
The method used to obtain consent must result in an unambiguous indication given by clear affirmative action. This means that individuals must be asked to opt in; neither an opt-out approach nor a lack of response is valid. Examples of active opt-in mechanisms are:
- Signing a consent statement on a paper form
- Ticking an opt-in box on a paper or electronic form
- Clicking an opt-in button or link online
- Selecting a 'yes' button where equally prominent yes/no options are available
- Choosing preference settings
- Answering yes to a clear oral consent request
- Volunteering optional information for a specific purpose, e.g. completing optional fields in a form which includes a notice explaining how the personal data will be processed.

A balance needs to be struck between ensuring that individuals are provided with enough information to make an informed decision as to whether to give consent, and keeping the request itself concise and easy to understand. A request for consent should be prominently displayed and kept separate from general terms and conditions.
What information should be included in a request for consent?
Consent must be specific and informed, so a request for consent should include as a minimum:
- The name of the University plus the department, college or support service that will process the personal data
- The name(s) of any third party that will also rely upon the consent (it is not acceptable to provide only broad categories of third party)
- The reason why the personal data is being requested
- The activities for which the personal data will be used
- Instructions on how individuals can withdraw consent at any time.

For example, it would be inappropriate for the University to seek consent from students in order to process their personal data for the purpose of assessing academic performance within their academic programme. This is because the University would process the personal data anyway on a different lawful basis if consent was withheld or withdrawn, so students would not have a genuine choice. The appropriate lawful basis upon which to rely in this instance is 'Contract', since the processing is necessary to fulfil the contract between the University and its students.
Similarly, if the processing of personal data is a condition of receiving a service, consent will not be an appropriate lawful basis for the processing and an alternative lawful basis should be established.
Consent is also unlikely to be an appropriate lawful basis where there is a clear imbalance of power between the University and the data subject(s). Individuals who depend upon the University's services, or fear adverse consequences of failing to agree to the processing of their personal data, might feel that they have no choice but to provide consent. In such circumstances, consent cannot be said to be freely given and therefore another lawful basis must be established. For example, it would be unfair to ask employees to provide consent for the University to monitor their work email accounts as it is likely that some employees would feel compelled to agree to the monitoring for fear of otherwise being labelled as 'difficult' or 'hiding something'. An activity such as email monitoring should only take place if an alternative legal basis for it can be established and employees have been provided with a privacy notice.
What constitutes valid consent?
For consent to be valid it must be freely given, specific, informed and provide an unambiguous indication of the individual's wishes.
Freely given
Individuals must be able to withhold consent without detriment and must be able to withdraw consent at any time. Consent would not be regarded as freely given if an individual neither has a genuine choice nor the ability to refuse or withdraw consent without detriment.
Specific and informed
When consent is requested, an individual must be provided with a clear explanation of the data processing activity that is to take place. A data controller should neither request that consent is provided for 'potential' activities, nor 'just in case' it might be useful in future, nor seek non-specific 'blanket' consent to cover different activities.
Unambiguous indication
It must be obvious that an individual has provided consent, and obvious what data processing activity is covered by the consent. The individual must take a positive action to opt in. An opt-out does not constitute valid consent, so silence, inactivity, default settings or pre-ticked boxes on forms cannot be relied upon.
Consent for some personal data processing activities can be implied by the actions of an individual, but would not extend beyond what was obvious and necessary. For example, an individual who drops their business card into a prize draw box at a conference is implying that they consent to their personal data being processed for the purpose of the prize draw. The consent does not, however, extend to the processing of the personal data for any other purpose, such as marketing.
What is explicit consent?
Explicit consent is not specifically defined by data protection legislation. It differs from other forms of consent in that there is an expectation that it is affirmed in a clear statement. Consent inferred from an individual's actions constitutes implied consent rather than explicit consent, however obvious the individual's expectations.
For example, the University wishes to seek consent from individuals to use their email addresses to send them information about future events.
Using the following method would result in valid consent:
Please provide your email address and indicate that you are willing to receive emails from the University about future events by ticking the declaration box below:
Email address (optional)
☐ I consent to receiving emails from the University about future events
If the individual ticks the box, they will have explicitly consented to the processing activity.
How long does consent last?
There is no specific time limit on consent. A data controller will need to consider what information was provided to the data subject at the time that the consent was sought and the individual's expectations. For example, the University requests that students consent to receiving emails from the University containing tips on making healthy meals within a student budget. It would be appropriate to rely on consent as the lawful basis for carrying out this activity for the duration of time that each student is enrolled at the University. A student would have a legitimate expectation that the emails (and related data processing activity) would cease once their relationship with the University ended, normally at Graduation. Consent would therefore expire when a student left the University. Consent must be managed separately for each data subject.
Do we need to consider capacity to consent?
Data protection legislation does not contain specific provisions regarding an individual's capacity to consent but capacity is bound up in the issue of 'informed' consent. Generally, the University can assume that an adult has the capacity to consent unless there is reason to believe the contrary. Where there is doubt, please contact the University's Data Protection Department.
Do we need to consider children's capacity to consent?
Data protection legislation does not contain specific provisions regarding a child's capacity to consent but the GDPR does contain provisions relating to 'information society services' (services requested and delivered over the internet).
Where a data controller wishes to offer information society services directly to children and wants to rely on consent rather than another lawful basis for personal data processing, the data controller must gain parental consent on behalf of any children under the age of 16. Mechanisms must be put in place to verify the ages of the children and to make reasonable efforts to check parental responsibility for those below the age of 16.
For other types of personal data processing, a data controller should consider on a case by case basis whether the children have the capacity to understand and consent to the activity for themselves. Where the data controller does not believe that a child has this capacity, an adult with parental responsibility should be asked to provide consent on behalf of the child.
Parental consent will always expire once a child reaches the age at which they can provide consent for themselves. Consent will therefore need to be refreshed at appropriate milestones.
What do I need to consider when seeking consent to process personal data for scientific research?
It may not be possible for the University to establish and communicate the precise purpose(s) for which personal data will be processed at the point at which the data is collected. Data protection legislation acknowledges this by allowing the purpose(s) to be specified less precisely than would be required for the processing of personal data in any other context. The University must, however, identify the general area(s) of research and, where possible, give research participants the option to consent to the processing of their personal data only within limited areas of the research or within limited research projects.
How should a request for consent be set out?
A request for consent should meet the following requirements:
- Be separate from any general terms and conditions
- Be prominent
- Be written in clear, straightforward language
- Avoid technical and legal jargon and any potentially confusing terminology
- Be easy for the intended audience to understand
- Be concise and specific, avoiding vague or blanket wording.
Contracts
It is essential that the role of the University in any data processing arrangement is understood by those responsible for the selection, placement, management and termination of a contract. The University might perform the role of Data Controller or Data Processor depending on factors such as how much control it has over the data processing and how it obtained the data. In simple terms, a Data Controller owns the data and determines how it should be processed, whereas a Data Processor only acts on the instructions of a Data Controller. If a Data Processor determines the purpose and means of processing then it will be considered to be a Data Controller and will have the same liability. By way of example:
- The University processes personal data about students who have enrolled on its programmes - University is a Data Controller
- The University engages a third party to provide a cloud-based solution that will contain employee or student data but which the third party will not use themselves - University is a Data Controller; Third party is a Data Processor
- The University is engaged by an external body to conduct analysis of a dataset that the external body provides - University is a Data Processor; External body is a Data Controller.
The GDPR enhances the obligations and direct responsibilities of a Data Processor over those that were in place under the Data Protection Act.
- Key Points
- The GDPR makes written contracts between Data Controllers and Data Processors a general requirement, rather than treating them as a general security and compliance control. Written contracts can be in electronic form.
- Contracts set out responsibilities and liabilities of both parties.
- Contracts must include specific terms in relation to data protection, which are designed to ensure that any processing carried out by the Data Processor on behalf of the Data Controller meets all the requirements of the GDPR.
- Data Processors must only process personal data on and in accordance with the written instructions of the Data Controller.
- Sub-processors can be engaged by a Data Processor, but the initial Data Processor remains liable to the Data Controller for the performance and compliance of such sub-processors.
- If a Data Processor determines the purposes and means of processing, they shall be considered a Data Controller with respect to that processing.
- The European Commission may develop standard contractual clauses for implementation between Data Controllers and Data Processors, but has not done so in advance of the GDPR applicability date.
- A certification scheme for organisations to demonstrate compliance with GDPR has not yet been developed. Existence of such a scheme would greatly help supplier selection. Until such a scheme exists the University will need to make its own assessment of potential suppliers and manage the performance of existing suppliers. The University must only appoint Data Processors who can provide 'sufficient guarantees' to implement appropriate technical and organisational measures to meet the requirements of the GDPR and to protect the rights of the data subjects.
- Notwithstanding any contractual agreements, the ICO can hold Data Processors directly responsible for non-compliance with the GDPR and impose sanctions (including warnings, fines and even a ban on data processing).
- Differences Between Data Controllers and Data Processors
The recognised definitions of the two roles are:
- Data Controller - The natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data. Where two or more controllers jointly determine the purposes and means of processing, they shall be Joint Data Controllers
- Data Processor - A natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.
Decisions that would be made by a Data Controller include:
- Collecting the personal data in the first place and the legal basis for doing so;
- Which items of personal data to collect, i.e. the content of the data;
- The purpose or purposes the data are to be used for;
- Which individuals to collect data about;
- Whether to disclose the data, and if so, who to;
- Whether subject access and other individuals' rights apply, i.e. the application of exemptions; and
- How long to retain the data or whether to make non-routine amendments to the data.
A Data Processor is bound by its agreement with the Data Controller in assisting them to fulfil their obligations under the GDPR. The agreement may be very prescriptive about how the Data Processor carries out the tasks above, but the Data Processor may have some flexibility in determining how to deliver its obligations, for example by using its technical and specialist knowledge. The Data Processor may therefore decide, for example:
- What IT systems or other methods to use to collect personal data;
- How to store the personal data;
- The detail of the security surrounding the personal data;
- The means used to transfer the personal data from one organisation to another;
- The means used to retrieve personal data about certain individuals;
- The method for ensuring a retention schedule is adhered to; and
- The means used to delete or dispose of the data.
- What needs to happen with a contract you receive
University contracts must be sent through to the University Secretary's Office. Data protection may be covered within the body of the contract or, preferably, be addressed within an appropriate addendum, such as a Data Sharing Agreement or Data Processing Agreement.
Only in exceptional circumstances should a contract or Data Processing Agreement provided by a supplier be used.
Contracts (including Data Sharing Agreements and Data Processing Agreements) are not required for sharing data with colleagues or other departments within the University. You should still consider the risks associated with sharing data with others within the organisation and the privacy rights of the individuals concerned.
Contracts must set out the:
- Subject matter and duration of the processing
- Nature and purpose of the processing
- Type of personal data and categories of data subject
- Obligations and rights of the Data Controller.
Contracts must also include as a minimum the following terms, requiring the Data Processor to:
- Act only on the written instructions of the Data Controller.
- Ensure that people processing the data are subject to a duty of confidence
- Take appropriate measures to ensure the security of processing
- Only engage sub-processors with the prior consent of the controller and under a written contract
- Assist the Data Controller in providing subject access and allowing data subjects to exercise their rights under the GDPR
- Assist the Data Controller in meeting its GDPR obligations in relation to the security of processing, the notification of personal data breaches and data protection impact assessments
- Delete or return all personal data to the Data Controller as requested at the end of the contract
- Submit to audits and inspections, and provide the Data Controller with whatever information it needs to ensure that they are both meeting their Article 28 (Processor) obligations
- Tell the Data Controller immediately if it is asked to do something infringing the GDPR or other data protection law of the EU or a Member State.
In the exceptional circumstances where a Data Processing Agreement has been provided by a Data Processor, it should ideally be declined in favour of an Agreement prepared by the University.
Note:
Templates for Data Sharing Agreements and Data Processing Agreements are being created for University use that embed the above requirements. Once completed these will need to be returned to the Data Protection Team. Until these are available, please submit your request to dataprotection@sheffield.ac.uk and they will coordinate an appropriate response.
- The University as a Data Controller
When the University owns the data and determines the purposes and means of processing, it is a Data Controller. The Data Controller is responsible for implementing appropriate technical and organisational measures to manage data processing risks, including assessing the probability and impact of those risks on the rights and freedoms of data subjects.
As a Data Controller the University may wish to use another organisation to process personal data on its behalf; that organisation would be a Data Processor. The arrangement between the two parties should be subject to a contract. Where the data protection clauses are not embedded directly within the contract conditions, a Data Processing Agreement should be used.
The University remains ultimately responsible for the data processing. Engagement of a Data Processor does not remove the obligation to carry out this risk assessment or absolve the University of its liabilities. Therefore, the University is subject to any penalties or sanctions that can be imposed for a failure to comply with the GDPR. This can include:
- Improvement notices to bring processing into compliance
- Administrative fines, of up to 2% of turnover / €10m for infringements of obligations relating to Data Controllers or Data Processors, or 4% of turnover / €20m for infringements of the basic principles or data subjects' rights.
- Restriction of data processing, or a temporary or permanent ban on data processing
When setting up a contract or Data Processing Agreement, you need to understand and give due regard to the requirements described above. You must be clear from the outset what you are contracting for and what the extent of the processing is.
The standard terms then describe responsibilities and liabilities such that:
- Written instructions – the University, as the Data Controller, tells the Data Processor what they are required to do before they do it
- Duty of confidence – the Data Processor's employees and temporary or agency staff are bound by a commitment to confidentiality, such as a contractual obligation or non-disclosure agreement
- Security of processing – the Data Processor must take appropriate measures to protect the security of the data, having taken account of the risks. The assessment of risk should consider the impact on the data subject, e.g. in the event of a data breach. The University should have its own risk assessment and should be able to share relevant information with the Data Processor
- Sub-processors – must not be engaged without the prior consent of the University and must then be under a written contract that fulfils the same obligations under the GDPR regarding Data Processors. The original Data Processor will remain liable to the University for the performance of the sub-processor(s) they engage. The University must be informed in writing of any proposed changes to or replacement of those sub-processors. Where sub-processors are in place at the outset of the contract, these must be identified and the Data Processor must be able to confirm that appropriate contracts are in place with those sub-processors
- Individual rights – the Data Processor must support the University in allowing data subjects to exercise their rights under the GDPR, such as supporting subject access requests and requests for rectification or erasure of personal data. The Data Processor will be handling the data on a day-to-day basis, so it is important that the processes and touchpoints are understood, even if these are not captured within the contract. Under the first term ('Written instructions', above), supporting documentation could be provided that addresses these processes and interactions. A Service Level Agreement, within or supporting the contract, may need to be set up to ensure that statutory timelines for fulfilment of requests are met.
- Assist the Data Controller – the Data Processor must help the University to meet its GDPR obligations to keep personal data secure, to notify data subjects and the ICO in the event of a data breach, and to carry out relevant data protection impact assessments
- End of contract – the University shall determine what happens to the data at the end of the contract and stipulate this to the Data Processor. Electronic files may be stored in backup solutions and these need to be considered, particularly with regard to the time taken for normal processes to remove the data from backups.
- Audits and inspections – the Data Processor may be required to demonstrate compliance with its GDPR obligations as a Data Processor. This may be through an audit carried out by or on behalf of the University, or through the provision of other information. The Data Processor should maintain its own records of data processing (such as an Information Asset Register) and be able to support this with, for example, evidence of performance, internal/external audits, penetration testing, and incident and breach investigations
- Infringements – the Data Processor must inform the Data Controller immediately if it is asked to do something infringing the GDPR or other relevant data protection law.
The University can be a Joint Data Controller with another organisation where both parties jointly determine the purposes and means of processing. In this situation a Data Sharing Agreement shall be used.
- The University as a Data Processor
Where the University processes data on the instructions of another organisation it is a Data Processor. The other organisation should ensure it has its own Data Processing Agreement in place with the University.
The University may, subject to prior written authorisation from the Data Controller, contract a sub-processor. This arrangement with a sub-processor must also be covered by a contract with equivalent conditions. In this case the University will, as the original processor, remain directly liable to the controller for the performance of the sub-processor's obligations. Any intended change to that sub-processor, including addition to or replacement by other sub-processors, must be notified in writing to the Data Controller to allow them the opportunity to object. This should also apply in the event of any notified or intended sale/takeover of the supplier organisation.
As a Data Processor the University shall implement technical and organisational measures to ensure the security of the data appropriate to the risks (a brief illustrative sketch follows this list), including:
- Pseudonymisation and/or encryption of data
- Confidentiality, integrity, availability and resilience (business continuity) controls
- Disaster recovery and backup controls and processes
- Testing, assessing and evaluating the effectiveness of controls.
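As an illustration only, and assuming a Python environment with the widely used cryptography package (this is not a statement of the University's actual tooling), the first two measures above might look like this:

```python
# pip install cryptography
import hashlib
import hmac

from cryptography.fernet import Fernet

# Pseudonymisation: replace an identifier with a keyed hash. The secret key must be
# stored separately from the pseudonymised data, under its own access controls.
PSEUDONYM_KEY = b"keep-this-secret-and-separate"  # hypothetical key for illustration

def pseudonymise(identifier: str) -> str:
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Encryption: symmetric encryption of a record before it is stored or transferred.
encryption_key = Fernet.generate_key()  # in practice, keep this in a key management system
fernet = Fernet(encryption_key)

record = b"2024-01-31,example.student@university.ac.uk,disability support notes"
ciphertext = fernet.encrypt(record)
assert fernet.decrypt(ciphertext) == record

print(pseudonymise("registration-0012345"))
```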
The University must ensure that its staff only process the data in accordance with the instructions of the Data Controller, i.e. as per the Data Processing Agreement.
In the event of a data breach the University must inform the Data Controller as soon as it becomes aware. This may require some internal work to identify the nature and scale of the breach prior to informing the Data Controller. Follow the University's process to report a data breach in the first instance. Breach reporting to the ICO has a limited time for compliance and is subject to penalties for non-compliance. The ICO can directly penalise a Data Processor and a Data Processor may also be liable to pay compensation to data subjects.

The University must co-operate fully, on request, with the ICO in the performance of its tasks. All notifications to the ICO are undertaken by the Data Protection Officer.
Anonymisation and pseudonymisation
Obscuring or removing personal data from datasets allows information to be used more widely and may be particularly useful to the University in the areas of research and management information and reporting. Obscuring or hiding the personal data elements can be achieved in a number of different ways depending on the nature of the data and the need to use or share it. The common terms are anonymisation and pseudonymisation, which are described as:
- Anonymisation is the "process of rendering data into a form which does not identify individuals and where identification is not likely to take place"
- Pseudonymisation is the "processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person".
Data protection legislation does not apply to data that has been rendered anonymous in such a way that the data subjects are no longer identifiable. However, complete anonymisation is not always possible and the potential for that anonymity to be undone needs to be considered when using or disclosing the information. If the risk of identification is reasonably likely, the information should be regarded as personal data. The ICO uses the term 're-identification' (sometimes known as 'de-anonymisation'), which is described as the "process of analysing data or combining it with other data with the result that individuals become identifiable".
Anonymisation is most effective when considered early in the data lifecycle, as it supports the data protection principles and the requirement for 'Data Protection by Design and Default'.
Anonymisation is particularly useful in research and when disclosing data outside the institution, but can also be used internally to protect personal data and reduce the risks of inappropriate use or disclosure.
- Anonymisation Techniques
Several common anonymisation techniques exist, each with attributes that make it more or less suitable for a given purpose; a short illustrative sketch follows this list. Common techniques include:
- Data masking, where personal data elements are removed to create a dataset where personal identifiers are not present.
- Partial data removal is a typical example of this but carries higher risks. Depending on the information removed, it may be possible to re-identify individuals in the dataset by combining it with other data, for example by using the Electoral Register against date of birth and postcode data in the dataset, or where postcode data relates to a very low number of dwellings in a postcode area.
- Data quarantining is another variation, whereby data is supplied to a person who is unlikely or unable to have access to the additional data required to aid re-identification. This is less reliable in a connected world, as it is difficult to know with any certainty what information the recipient may be able to access.
- Aggregation is a common example of anonymisation used for data analysis where data is displayed as totals rather than individual values. Low value totals are often excluded completely or may be grouped together to produce a larger group, for example a survey that results in a group of less than five data subjects would not be reported. This is a common approach with staff surveys where other data from the survey responses could identify a single person within a department. Within aggregation, there are different ways of achieving anonymisation, partly depending on the size of dataset and the degree of accuracy required for the results. Typically this is not a useful approach for research at an individual-level, but is good for large scale analysis, such as modelling people movements or social trends. It is generally low risk from a disclosure point-of-view (obviously source data is still high-risk) as it is intentionally difficult or impossible to relate any results to a particular individual.
- Derived data uses values of a less granular nature, typically through 'banding', to hide the exact values, for example using a year of birth instead of a date of birth. This is a lower-risk technique because data matching is more difficult. This means that the data can be relatively rich and still useful for individual-level research but with lower risks of re-identification.
- Pseudonymisation is the practice of extracting and replacing actual data with a coded reference (a 'key'), where there is a means of using that key to re-identify an individual. This approach is typically used where the use of the data needs to relate to individual records, but also needs to retain security and privacy for that individual. Pseudonymised data carries a higher privacy risk and security of the key is essential. Because the data is not truly anonymised, personal data that has been pseudonymised can fall within the scope of data protection legislation depending on how difficult it is to attribute the pseudonym to a particular individual.
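A minimal Python sketch of three of the techniques above (banding, aggregation with small-group suppression, and pseudonymisation), using entirely fictional records; the threshold of five reflects the survey example given for aggregation:

```python
import secrets
from collections import Counter

# Fictional source records: (name, date_of_birth, department, survey_response)
records = [
    ("Alice Smith", "1990-05-04", "Physics", "agree"),
    ("Bilal Khan", "1991-11-23", "Physics", "disagree"),
    ("Cara Jones", "1989-02-14", "History", "agree"),
]

# Derived data / banding: keep only the year of birth, which is harder to match to other sources.
banded = [(dob[:4], dept, resp) for _, dob, dept, resp in records]

# Aggregation with small-group suppression: report totals only, withholding groups of fewer than five.
THRESHOLD = 5
counts = Counter(dept for _, _, dept, _ in records)
aggregated = {dept: n for dept, n in counts.items() if n >= THRESHOLD}

# Pseudonymisation: replace the name with a random token and keep the key table separately,
# subject to its own technical and organisational controls.
key_table = {}  # enables authorised re-identification; must be stored securely and separately
pseudonymised = []
for name, dob, dept, resp in records:
    token = secrets.token_hex(8)
    key_table[token] = name
    pseudonymised.append((token, dob, dept, resp))

print(banded)
print(aggregated)  # empty here because every group is below the threshold
print(pseudonymised)
```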
- Identification and Re-identification
Identification is achieved in one of two basic ways:
- Direct identification, where only a single data source is necessary to identify an individual;
- Indirect identification, where two or more data sources need to be combined to allow an individual to be identified.
The difficulty in determining whether the anonymised data you hold or wish to publish is personal data lies in not knowing what other information is available to a third party that might allow re-identification to take place. This requires a case-by-case judgement of the data. Absolute certainty that no individual would ever be identifiable from the anonymised data cannot be guaranteed, but this does not necessarily mean that personal data would be disclosed. UK case law has determined that the risk of identification must be greater than remote, and the legislation defines personal data by reference to whether identification is reasonably likely.
Re-identification may be possible from information held by other organisations or that is publicly available, such as via an internet search. This can be achieved by trying to match a record from an anonymised dataset with other information, or by trying to match personal data already held with a record in an anonymised dataset. Whilst the former is often seen as the more likely scenario, both have a similar result. Re-identification risk can be mitigated by employing data minimisation principles and only disclosing the anonymised data necessary for the purpose.

The risks of re-identification can change over time, particularly as more information becomes available online, computing power increases and data analysis tools become available to the consumer market. Data that is currently anonymous may not remain that way, so it is important that anonymisation policies, procedures and techniques are regularly reviewed. It is often difficult to remove publicly available information entirely, so once anonymised data is published it may not be possible to recall it in order to prevent re-identification.
Identification requires more than making an educated guess that information is about someone in particular. An educated guess as to someone's identity may present a privacy risk but not a data protection one, where no personal data has actually been disclosed to the person making the guess. Likewise, making an accurate guess based on anonymised data does not mean that personal data has been disclosed. However, the impact of guesswork should be considered throughout the anonymisation and publication process. There have been many instances of wrong identification through guesswork, with individuals blamed for things they did not do. This risk is particularly acute when combined with 'prior knowledge'.
- Prior Knowledge
If an individual knows a lot about another individual, re-identification may well be possible even where it would not be for an ordinary member of the public. Consider family members, work colleagues and professionals such as doctors. Such people might be able to learn something new about the data subject from the anonymised data, perhaps through confirmation of existing suspicions.
When considering prior knowledge, assumptions should not be made of what individuals may already know, even among family members. Professionals are likely to be covered by confidentiality and ethical conduct rules and are less likely to fit the role of 'motivated intruder' with some gain to be made from use of the knowledge.
When considering releasing anonymised data, assess:
- The likelihood of individuals having and using prior knowledge to aid re-identification. This may not be possible for individual data subjects in any dataset, so a more general assessment may be required. Consider even whether those individuals would see or seek out the published information.
- The impact of re-identification on the data subjects. Again this may not be possible at an individual level, but could be inferred from the data sensitivity.
- The 鈥楳otivated Intruder鈥
The 'motivated intruder' forms the basis of a test used by the ICO and the Tribunal that hears DPA and FOI appeals.
The 'motivated intruder' is a person who:
- Starts without any prior knowledge but who wishes to identify an individual from an anonymised dataset
- Is reasonably competent
- Has access to resources such as the internet, libraries and all public documents
- Employs investigative techniques, including questioning people who may have additional knowledge of the individual
- Is not assumed to have any specialist knowledge such as computer hacking skills, or to have access to specialist equipment, or to resort to criminality such as burglary in order to gain access to data that is kept securely.
The 'motivated intruder' is likely to be more interested in some types of information that would support their 'cause', whether for financial gain, political reasons or newsworthiness, activism or 'hacktivism', causing embarrassment to individuals, or even just curiosity about local events. Data with the potential to have a high impact on individuals is likely to attract a 'motivated intruder'.
The test therefore goes beyond considering whether an inexpert member of the public can achieve re-identification, but not as far as a knowledgeable determined attacker with specialist expertise, equipment and potentially prior knowledge.
It is possible to replicate a 'motivated intruder' attempt on your anonymised data to test its adequacy; a small illustrative sketch follows this list. Consider:
- Using the edited or full Electoral Register to try to link anonymised data to someone's identity
- Using social media to try to link anonymised data to a user鈥檚 profile
- Conducting an internet search to use combinations of data, such as date of birth and postcode, to identify an individual.
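Purely as an illustration of the linkage risk described above (all names and data below are fictional), a few lines of Python show how quasi-identifiers such as date of birth and postcode can join an anonymised dataset back to a named source:

```python
# Fictional anonymised survey data: direct identifiers removed, quasi-identifiers retained.
anonymised_survey = [
    {"dob": "1985-07-12", "postcode": "S10 2TN", "diagnosis": "condition A"},
    {"dob": "1992-03-30", "postcode": "S3 7RH", "diagnosis": "condition B"},
]

# A fictional stand-in for a public source such as an edited Electoral Register extract.
public_register = [
    {"name": "J. Bloggs", "dob": "1985-07-12", "postcode": "S10 2TN"},
    {"name": "A. N. Other", "dob": "1970-01-01", "postcode": "S1 1AA"},
]

# Matching on the quasi-identifiers re-identifies the first survey record.
matches = [
    (person["name"], record["diagnosis"])
    for record in anonymised_survey
    for person in public_register
    if (person["dob"], person["postcode"]) == (record["dob"], record["postcode"])
]
print(matches)  # [('J. Bloggs', 'condition A')]
```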
Third party organisations exist that are able to do this (subject to the necessary contractual controls, of course). They may have knowledge of and access to data resources, techniques or vulnerabilities that you are not aware of.
- Creating Personal Data From Anonymised Data
With a range of research and other activities in similar or overlapping areas it is possible that personal data can be created through the combination, analysis or matching of information, or for the information to be linked to existing personal data within the University.
This would require the University to fulfil its responsibilities under data protection legislation for that personal data, potentially starting with informing the individuals of the data processing, which may then become problematic where that processing was not expected by the individuals – more so if they then object.
Where an organisation collects personal data through re-identification without individuals鈥 knowledge or consent, it will be obtaining personal data unlawfully and could be subject to enforcement action.
- Spatial Information
Information about a place can easily constitute personal data as it can be associated with an individual. This relates to electronic devices as much as buildings. Mobile devices such as smartphones contain and generate large amounts of spatial information and have been used effectively in traffic and travel surveys to map origins and destinations as well as through times. Unique identifiers, such as IP addresses, are now considered personal data in data protection legislation.
If trying to anonymise datasets using UK postcode information, the following may be useful to determine potential data groups of anonymised data:
- Full postcode – approximately 15 households, although some postcodes relate to a single property
- Postcode minus the last digit – approx. 120-200 households
- Postal sector (4 outbound digits + 1 inbound) – approx. 2,600 households
- Postal district (4 outbound digits only) – approx. 8,600 households
- Postal area (2 outbound digits) – approx. 194,000 households
A digital equivalent is removing the final 'octet' of an IP address to degrade the location data it contains; a short sketch of both approaches follows.
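As a rough illustration only (the postcode parsing is simplified and the household figures come from the list above), the two degradation approaches might be implemented as follows:

```python
def degrade_postcode(postcode: str, keep: str = "sector") -> str:
    """Reduce a UK postcode to a coarser grouping (simplified parsing for illustration)."""
    outward, inward = postcode.upper().split()
    if keep == "sector":    # e.g. "S10 2TN" -> "S10 2" (approx. 2,600 households)
        return f"{outward} {inward[0]}"
    if keep == "district":  # e.g. "S10 2TN" -> "S10" (approx. 8,600 households)
        return outward
    raise ValueError("keep must be 'sector' or 'district'")

def degrade_ip(ip_address: str) -> str:
    """Remove the final octet of an IPv4 address, e.g. '143.167.2.101' -> '143.167.2.0'."""
    octets = ip_address.split(".")
    return ".".join(octets[:3] + ["0"])

print(degrade_postcode("S10 2TN"))              # "S10 2"
print(degrade_postcode("S10 2TN", "district"))  # "S10"
print(degrade_ip("143.167.2.101"))              # "143.167.2.0"
```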
- Publication vs Limited Access
The more that anonymised data is aggregated and non-linkable, the more feasible it is to publish it. Of course, not everything is intended for public disclosure and access may be intended for a smaller group of people. Pseudonymised data is often valuable to researchers because of the granularity it affords, but carries a higher risk of re-identification. Release of this data to a closed community is possible where there is intended to be a finite number of researchers or institutions with access to the data. Typically these arrangements are controlled by restrictions in contracts and non-disclosure agreements. This allows more data to be disclosed than is possible with wider or public disclosure. Information security controls still need to be in place and managed.
- Freedom of Information
The Freedom of Information (FOI) Act, under Section 40, includes a test for determining whether disclosure of personal data to a member of the public would breach the data protection principles. The University would need to consider the additional information that a particular member of the public might have or be able to access in order to combine the data to produce personal data – something that relates to and identifies a particular individual. It can be difficult to assess what other information may be available, so this is where the 'motivated intruder' test is useful.
Where an FOI request is refused under Section 40, it may be possible to release some information in an anonymised form that would satisfy the requestor and avoid an appeal and review process. The data protection team can advise in this regard.