DRAFT Mental Health and Wellbeing Analytics Code of Practice

Draft 0-95, 3rd March 2020
Comments welcome to [email protected]

Summary

Whereas learning analytics uses data to inform decisions – from individual to curriculum level – on how to support students’ learning, data may also be used to inform decisions on how to support their mental health and wellbeing. Possible applications cover a very wide range: from screen-break reminders to alerts when a student appears to be at risk of suicide. Clearly such uses of data can involve both significant benefits and high risks.

This Code of Practice suggests how universities, colleges and other tertiary education providers can ensure that their use of data to support wellbeing does not create risks for students or staff, taking responsibility and demonstrating accountability for their actions in selecting, developing, implementing, operating and reviewing data-informed wellbeing processes. As the headings in the Code indicate, this will involve working with groups and individuals across the institution: Stewardship, Transparency, Responsibility, Validity, Positive Interventions, Privacy, and Access need to be developed with students, staff, data owners, IT services and university governance, as well as student support services and data protection officers. Universities UK refers to this as a “whole-university approach”; Student Minds’ University Mental Health Charter calls it a “cohesive ethos”.

To support these discussions, this Code also includes practical tools – for Data Protection Impact Assessments and purpose compatibility assessment for data sources – that should help to ensure the institution’s activities are, and can be shown to be, both safe for individuals and compliant with the law.

Introduction

The approach taken by Jisc’s Code of Practice for Learning Analytics provides a good starting point for mental health and wellbeing applications.
This Mental Health and Wellbeing Code provides a detailed discussion of additional issues raised by the use of data for wellbeing purposes. Here we concentrate on the use of data in delivering wellbeing and mental health support: broader issues such as duty of care, healthcare treatment, human rights, equality and discrimination are not covered, though we have referenced relevant guidance on those issues where we are aware of it.

When delivering wellbeing and mental health support, institutions are likely to be processing personal data concerning health; some forms of analytics may aim to infer such data from other, behavioural, indicators, such as the student’s engagement with learning systems and processes. Thus, as well as meeting the legal standards that apply to all processing of personal data, wellbeing and mental health applications must satisfy the additional conditions and safeguards that apply to Special Category Data. This Code of Practice therefore includes safeguards from several areas of the General Data Protection Regulation (GDPR) and the UK Data Protection Act 2018 that may be relevant when addressing mental health and wellbeing. In particular:

• Voluntary wellbeing apps – where each individual makes a positive choice to report or be monitored – could be provided on the basis of “consent”, though this requires both that clear and detailed information be given to users and that their consent be freely given, informed, unambiguous, specific, explicit and recorded;
Since the likely legal justification for proactive mental health and wellbeing analytics is to provide support to
individuals, institutions must ensure that adequate services to provide such support will actually be available to
individuals when data, algorithms or other signals indicate that they may be needed (see under Enabling Positive
Interventions/Minimising Adverse Impacts below).
Health and wellbeing applications will require a formal Data Protection Impact Assessment (DPIA), involving
stakeholders and the organisation’s data protection team (see under Responsibility below and Annex A: Data
Protection Impact Assessment template for Wellbeing/Mental Health Analytics).
Jisc’s Code of Practice for Learning Analytics provides a baseline for supplementary uses of student data. This
Code uses the same headings: for each it highlights key common areas (for which detail can be found in the
Learning Analytics Code) before a detailed discussion of additional issues raised by the use of data for mental
health and wellbeing purposes.
Responsibility
From the Learning Analytics Code:
“Institutions must decide who has overall responsibility for the legal, ethical and effective use of analytics”
“Student representatives and key staff groups at institutions should be consulted about the objectives, design,
development, roll-out and monitoring of analytics”
Confidence and trust among students, staff and wider stakeholders is essential if wellbeing activities are to be
beneficial, rather than harmful. To achieve this, institutions will need to show that they are taking responsibility:
consulting and planning carefully before implementing any policies, processes, systems or data gathering, and
checking to ensure they deliver the expected results. The GDPR’s principle of Accountability addresses many of
these issues: designing processes and systems to ensure they protect personal data and the rights of individuals,
monitoring those processes to ensure they are followed, and reviewing them to see where they can be improved.
This Code suggests various documents and records – assessments of Data Protection Impact and Purpose
Compatibility; Records of Processing Activity, mapping of data flows, and policies on use of Special Category Data
– that the institution can use to demonstrate Accountability and reassure students, staff and stakeholders.
Applications that aim to derive information about an individual’s health are likely to represent a high risk to privacy,
and thus require a formal Data Protection Impact Assessment (DPIA). This includes identifying the relevant legal
basis or bases for processing and ensuring that their specific requirements are satisfied. Several organisations
have published processes for conducting DPIAs, including UCISA and the Information Commissioner’s Office;
Annex A below gives specific guidance on using these processes to assess proposed wellbeing activities.
Where a high risk cannot be mitigated – though a successful DPIA process should normally do this – the institution
should consider whether to continue with the proposal. If it decides to do so, the law requires prior consultation with
the national Data Protection Regulator: in the UK, the Information Commissioner’s Office.
The law requires that processing for preventive medicine must be done “under the responsibility of a professional
subject to the obligation of professional secrecy” (Data Protection Act 2018 s.11(1)(a)). For mental health and
wellbeing applications, UUK suggests that such regulated professionals should be found in Student Support
Directorates; both Jisc and UUK recommend “extensive consultation with mental health and student counselling
specialists”. Provided policies and processes remain “under the responsibility” of such professionals, day-to-day
operations can be assigned to appropriately trained and resourced tutors and other staff in accordance with
appropriate confidentiality rules. Student Minds’ University Mental Health Charter stresses that “it is vital that staff
in these roles are properly equipped, qualified, registered and supervised. This need for quality assurance extends
to other interventions, such as the provision of digitally based services”.
have a policy document setting out how the processing is in accordance with the data
protection principles, and how it ensures that data are held in accordance with its
retention schedule (see the ICO guidance on Schedule 1 para 40); or
• Art 9(2)(j)/Section 19 “scientific research”, for the specific purpose of developing or
testing statistical models against historic data. This condition cannot be used in a
blanket manner: for each development or test the institution must ensure (and
document) that the activity is not likely to cause substantial damage or distress to
individuals, in particular that models will not retain personal data and that test data
are not used for measures or decisions about individual students; and
• Art 9(2)(a) “consent” where the institution wishes to invite students to provide
additional information on a voluntary basis, e.g. reporting state of wellbeing through
an app or disclosing previous mental health issues in a face-to-face meeting. In this
case the institution will additionally need to obtain (and record) active, free, informed
and explicit consent. The nature of the institution/student relationship is likely to mean
that consent is presumed not to be free unless the institution can demonstrate that
there was no direct or indirect pressure on the student to agree.
For any basis other than consent, the institution must also ensure that the processing is
“necessary” to achieve the purpose. This does not mean that processing has to be absolutely
essential. However, it must be more than just useful, and more than just standard practice. It
must be a targeted and proportionate way of achieving a specific purpose. The lawful basis
will not apply if the purpose can reasonably be achieved by some other less intrusive means,
or by processing less data. It is not enough to argue that processing is necessary because you
have chosen to operate your business in a particular way. The question is whether the
processing is objectively necessary for the stated purpose, not whether it is a necessary part
of your chosen methods. Given the potentially intrusive nature of processing of student data
for wellbeing and mental health, the institution must examine the processing closely, and
satisfy themselves that the approach to processing is, and continues to be, both necessary
and proportionate to the benefits it can provide.
In particular, if relying on Schedule 1 paragraph 18 to justify processing of a whole group or
cohort of students, the processing must be necessary to identify those individuals among the
group who need care and support, are at risk of mental or emotional harm, and are unable to
protect themselves from that harm. Processing under this basis must be intended, designed
and resourced to support more than just those students who request it (i.e. those who
consent).
As well as documenting the legal basis/bases for processing, organisations should assess how the processing will satisfy each of the Data Protection principles (in GDPR Article 5), and how the relevant individual rights (in Articles 13 to 22) will be provided. This information will form part of the Policy Document for Special Category Data that is required for most legal bases.
Step 5: Identify and assess risks
Describe source of risk and nature of potential impact on individuals. Include associated
compliance and corporate risks as necessary, but the focus must be on the impact to individuals.
The following table suggests some risks likely to arise when processing data for wellbeing and
mental health. Individual organisations may identify others that result from their particular
situations. You should assess the likelihood and severity of these inherent risks, and score them
as low, medium or high.
Step 6: Identify measures to reduce risk
Identify additional measures you could take to reduce or eliminate risks identified
as medium or high risk in step 5. The following table suggests some measures that may be
used to mitigate risks. Individual organisations may identify others that are appropriate to
their particular situations. Assess what the likelihood and severity will be after you apply the
mitigations, and score the residual risk as low, medium, or high. Where there is a residual
‘high’ risk, you should seriously consider whether continuing with the processing is
appropriate. If you consider that the processing should continue and there is no way of
mitigating or reducing the risk, you must consult with the ICO.
Principle/Right | Example Risk Description / Impact on Individual | Inherent Risk (Likelihood, Severity, Risk Score (L×S)) | Example Mitigation | Residual Risk (Likelihood, Severity, Risk Score (L×S))
Lawful, Fair, Transparent
Students not provided with privacy notices resulting in confusion, mistrust
▪ Privacy Notice provided at the point of data collection and/or appropriate points before and during processing.
Privacy notice is unclear / complex and not understood by students
▪ Privacy Notice is written in plain, intelligible language, with consideration of the audience using a combination of techniques.
▪ Feedback from students is solicited when creating Privacy Notices.
▪ A governance process exists to review and approve Privacy Notices periodically.
If basis is Public Task, processing not necessary to protecting the physical, mental or emotional well-being of an individual (DPA2018 Sch1 para 18)
▪ A necessity and proportionality test is carried out and documented.
If basis is Legitimate Interests, balancing test not performed, and the rights and freedoms of the individuals are not properly assessed.
▪ Legitimate Interests Assessment performed
Except where basis is Consent, the processing is not necessary to achieve the purpose, resulting in unnecessary privacy risks to individuals.
▪ A necessity and proportionality test is carried out and documented.
If basis is Consent, obtained in invalid ways (e.g. uninformed, not opt-in, or not freely given, not separate to other terms), meaning individuals are not aware of the processing of their personal data.
▪ Consent, if used, is appropriately informed, granular, fair and explicit
▪ Consent is separate to all other terms and conditions.
▪ Students who refuse to give consent are not excluded from signposting / services offered.
Unclear or unsafe sharing of data to/from third parties, increasing the risk that personal data could be inappropriately accessed, lost, altered or destroyed. Individuals may be unclear on how / who to submit an individual rights request to, or may be the victim of impersonation fraud etc.
▪ Privacy notices make clear the identity and relationships of all data controller(s)
▪ Privacy notices include third party data sources, any third parties with whom data may be shared, and legal basis
▪ Data Controller/Data Processor contracts where appropriate
▪ Data Sharing Agreements & Contracts for controller-controller sharing
Limited to the specified and legitimate purpose
Failure to identify and document the purpose for processing increases the risk that the personal data is inappropriately used or re-purposed.
▪ Documented legal basis for processing; policies and processes for retention and disposal of personal data.
▪ Comprehensive record of the processing activity exists.
Data used for purpose incompatible with original purpose without gaining individual’s consent / Data used for undeclared purpose (e.g. special examination circumstances request used for wellbeing alert; third-party data incorporated into models without notice), meaning students are unaware that their personal data is being processed and the purpose is not within their reasonable expectations.
▪ Record of processing activity defines the personal data being processed and the purpose, any changes must be subject to a defined change management Policy or process.
▪ Where research/statistics is used as basis for model building, technical and organisational processes required of that legal basis (see DPA2018 s.19)
▪ Data is stored in structured databases with restricted access to prevent further use. Policies, training, access controls & monitored audit logs for staff with authorized access to data and alerts.
▪ Students must (re-)consent to incompatible / re-purposing of data and the record of processing is updated to reflect the new purpose.
Adequate, relevant and limited to what is necessary
Information processed (whether collected or observed) is in excess of what is required for the processing, potentially creating further copies of (unnecessary) personal data and increasing the privacy risks associated with a security breach.
▪ Consultation with stakeholders (including students and specialists) to identify most appropriate data sources & fields to include in processes
▪ Where existing / third party data is used this is limited to what is absolutely necessary. Irrelevant data is deleted.
▪ Where forms are used to collect data from students these are subject to specific governance and approval and designed around the specific requirements of wellbeing/mental health processes.
Information processed (whether collected or observed) is not sufficient or relevant to fulfil the purpose, potentially resulting in inaccurate or misleading outputs.
Predictive models infer information beyond intended scope, creating unnecessary (potentially special category) data outputs.
▪ A model governance policy exists to provide a framework for ensuring the integrity of models, including: a designated model owner; regular performance reviews; independent validation.
Accurate and up to date
Data collected / used is inaccurate / out of date resulting in counter-productive (non) intervention
▪ Where data sets are matched, controls exist to ensure accuracy of the matching.
▪ Quality assurance checks are performed to ensure the accuracy of manually entered data.
▪ Students are required to validate that the information they have provided is accurate.
Lack of context resulting in inappropriate conclusions being drawn (e.g. lack of VLE use due to student preferring books), and inaccurate inferences about individuals.
▪ Conversations with students are conducted by staff with appropriate training and knowledge of context to identify these problems
Cause/effect impact – predictive model generates a high number of false positives, which affects student behaviour/wellbeing (e.g. model indicates a mental health problem, student starts to exhibit signs of deteriorating mental health as a result)
▪ Prior consultation with health professionals to ensure models and processes avoid this risk.
▪ Students who are offered interventions based on model output (especially those who appear to be “false positives”) are offered ongoing support to detect and avoid adverse consequences
Data collected from third parties is inaccurate / out of date, resulting in false positives / false negatives.
▪ Data is only collected from trusted sources.
▪ Due Diligence is completed on all third parties supplying personal data.
Storage Limitation – kept for no longer than necessary
Personal data is held for longer than necessary, increasing the privacy risks associated with a security breach
▪ Where possible, synthetic, anonymized or pseudonymized data is used.
▪ Record Retention Policy defines the periods of time for which the personal data can be retained (note that different purposes may require different datasets to be retained for different periods, e.g. model/process review may require pseudonymised historic data).
▪ Quality assurance is performed to ensure no records are held outside of the Policy requirements.
▪ Processes exist to delete data when students make an individual rights request.
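Where the mitigations above call for pseudonymised data, one common approach is a keyed hash, which lets historic records be linked for model review without exposing identities, provided the key is held separately under restricted access. The sketch below is illustrative only and not part of this Code; the function name and key-handling arrangements are assumptions:

```python
import hmac
import hashlib

def pseudonymise(student_id: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The same student ID and key always yield the same pseudonym, so
    records can still be linked; without the key the pseudonym cannot
    be reversed to identify the student.
    """
    return hmac.new(secret_key, student_id.encode("utf-8"), hashlib.sha256).hexdigest()
```

The key must be stored outside the analytics environment (e.g. with the data protection or security team), so that analysts working with pseudonymised data cannot re-identify individuals.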
Integrity and Confidentiality – Protected against unauthorised or unlawful processing, accidental loss, destruction or damage
Student’s personal data, including special categories of personal data is unlawfully accessed, deleted or modified, resulting in the risk of impersonation fraud or other distress caused by sensitive personal data becoming public.
▪ Where possible, synthetic, anonymized or pseudonymized data is used.
▪ Data is encrypted in transit and at rest.
▪ Appropriate controls exist to prevent unauthorised access to personal data. Where potential unauthorised access is suspected, this is alerted.
▪ Patch & vulnerability management processes are in place including vulnerability scanning and penetration testing.
▪ Policies and processes are in place for incident detection, response and notification.
▪ Appropriate physical and people security Policy is in place.
Access to necessary personal data is unlawfully / unexpectedly restricted, preventing the identification of a support need or critical intervention
▪ Patch & vulnerability management processes are in place including vulnerability scanning and penetration testing.
▪ Back-up
Individual Data Protection Rights
Students cannot exercise their rights in relation to the processing of their personal data (note that the subject access right is modified for health data)
▪ The Privacy Notice clearly explains how requests can be made and where they should be directed
▪ The institution has a clearly defined process for dealing with Individual Rights requests; all relevant colleagues receive training on how to recognise a request
▪ [where the basis is Public Task or Legitimate Interests] Students are able to object to their data being processed and a process exists for assessing such objections
▪ [where predictive models are used] Students are provided with a means to request manual intervention and the model owner can explain how the model works
▪ [if automated decision making is used] Students can object to being subject to individual automated decision making or predictions and a process exists to implement these objections.
Other Individual Rights and Freedoms
‘Surveillance’ perception leads to change of behaviour (e.g. student avoids using institutional services, or hides true feelings)
▪ Transparency (e.g. through intranet page and opportunity to ask questions) about data, purpose and interventions.
▪ Regular consultation and feedback opportunities with students
▪ Monitoring of use of services/responses to surveys from which data are gathered
Intervention is perceived as an infringement of privacy (despite the mitigation(s) in place) resulting in students ‘dropping out’ rather than accepting support
▪ Prior consultation with student representatives to avoid actions that might trigger these perceptions.
▪ Student behaviour (e.g. use of support services) is monitored over time to detect such responses.
Intervention by insufficiently trained/wrong staff, or in wrong setting, causes harm to student and / or staff
▪ Clear, comprehensive processes and training to ensure signals are acted on promptly by appropriate staff/organisations
Students are refused access to services / subject to discrimination as a result of the model output
▪ Models and the processes based on them are monitored for signs of discrimination. Process exists for making appropriate interventions and corrections
Critical support need is identified but student refuses assistance
▪ Alarms at critical level should be handled by health professionals, trained and supported to make decisions on further treatment in these circumstances
▪ Review of processes/procedures following any such event
Support need is identified but the university does not have the resources to deal with it
▪ Prior consultation with relevant parties (e.g. health professionals supporting the university) to assess likely level and nature of support needs
▪ Clear signposting of other support provision, where appropriate
▪ Review group established to identify trends and recommend priorities
Assessing the risk
In considering the risk score it may be useful to use the matrix below as a guide.
Severity

  High impact        High      High      High
  Moderate impact    Low       Medium    High
  Little/No impact   Low       Low       High

                     Unlikely  Possible  Very likely

                            Likelihood
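The matrix is a simple lookup of severity against likelihood. As an illustration only (the function and value names are assumptions, not part of this Code), it could be expressed as:

```python
# Risk matrix from the guide above: severity (rows) by likelihood (columns).
MATRIX = {
    "little/no impact": {"unlikely": "low",  "possible": "low",    "very likely": "high"},
    "moderate impact":  {"unlikely": "low",  "possible": "medium", "very likely": "high"},
    "high impact":      {"unlikely": "high", "possible": "high",   "very likely": "high"},
}

def risk_score(severity: str, likelihood: str) -> str:
    """Return the low/medium/high score for an inherent or residual risk."""
    return MATRIX[severity.lower()][likelihood.lower()]
```

An assessor would record the inherent score before mitigations and the residual score after them, consulting the ICO if any residual score remains high.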
Step 7: Sign off and record outcomes
Item | Name/position/date | Notes
Measures approved by: | | Integrate actions back into project plan, with date and responsibility for completion
Residual risks approved by: | | If accepting any residual high risk, consult the ICO before going ahead
DPO advice provided: | | DPO should advise on compliance, step 6 measures and whether processing can proceed
Summary of DPO advice: | |
DPO advice accepted or overruled by: | | If overruled, you must explain your reasons
Comments: | |
Consultation responses reviewed by: | | If your decision departs from individuals’ views, you must explain your reasons
Comments: | |
This DPIA will be kept under review by: | | The DPO should also review ongoing compliance with DPIA
Annex B: Purpose and Transparency for Wellbeing/Mental Health Analytics

Many of the data sources likely to be used for wellbeing and mental health analytics will have been collected for other purposes. Some may have been collected by other organisations acting as data controllers. This raises various issues under data protection law, including:

• Whether the wellbeing and mental health analytics purpose is compatible with that original purpose; and, deriving from that
• What processes and privacy notices may be needed to inform students of the new purpose, to ensure that the new purpose is fair and transparent and, in some cases, to get the individual’s consent to it.
Article 6(4) of the General Data Protection Regulation covers compatible and incompatible purposes;
Articles 13 and 14 cover privacy notices where the information is obtained, respectively, directly from the
individual data subject or from a third party (for example through a data sharing agreement). The
following questions aim to help institutions to gather and record the information they will need to
assess purpose compatibility and transparency requirements. Two examples show how this can be
used to update the organisation’s Record of Processing Activities, and how to identify which data
sources are likely to be more or less challenging to reuse for the purpose of wellbeing and mental
health analytics.
Purpose/Transparency questions
For each data source being considered the institution should answer the following questions. Much of
this information should already be available in the institution’s Record of Processing Activities.
• What was the original source of the data? (e.g. collected by the institution itself, or provided
by a third party. If the latter, which third party?)
• How was the data originally obtained? (e.g. from the student directly; from the student by
observation, for example VLE or swipe card activity)
• For what purpose was the data originally collected? (e.g. to provide individual healthcare
support, to provide individual tutorial support, to provide other individual support, for
operational purposes, statistical purposes, or something else)
• Under what lawful basis was the data originally provided? (under Article 6 GDPR: necessary
for performance of a contract with the individual, necessary to comply with a legal obligation,
necessary to protect the vital interests of the individual, necessary for a public interest task (if
so, what task), necessary for a legitimate interest of the institution or a third party (if so, what
interest), consent; for Special Category Data an Article 9 condition will also be required)
• What was the individual told about how their data would be used?
• Is the new use for wellbeing and mental health compatible with the original purpose (see
below)?
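The answers to the questions above can be captured in a structured record per data source, which in turn feeds the Record of Processing Activities. The sketch below is illustrative only; the class and field names are assumptions, not part of this Code:

```python
from dataclasses import dataclass

@dataclass
class DataSourceAssessment:
    """One record per data source considered for wellbeing analytics."""
    source: str                     # collected by the institution, or a named third party
    how_obtained: str               # e.g. "directly from student", "observed (VLE, swipe card)"
    original_purpose: str           # e.g. "individual tutorial support"
    lawful_basis: str               # Article 6 basis (plus Article 9 condition if special category)
    what_individual_was_told: str   # summary of the original privacy notice
    compatible_with_wellbeing: bool # conclusion of the compatibility assessment
```

Keeping these records alongside the Record of Processing Activities makes it straightforward to show, source by source, why reuse is (or is not) compatible and what notice or consent is needed.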
The institution should then document the following conclusions:
• Is a new privacy notice needed? If so, how will it be provided to individuals?
• Is new consent needed from individuals because of purpose incompatibility?
• Is new consent needed from individuals because the original legal basis was consent and the