

Mental Health Bulletin 2020-21 Annual report

Official statistics, Experimental statistics

Correction: Unknown Restrictive Intervention Types

A correction has been made to this publication. The numbers of restraints and of people subject to restraint where the restrictive intervention type was unknown were incorrect. This error affects the following cells:


  • England inpatients Excel file, Table 17 Column T and Cells AK27 to AK37
  • Commissioning geographies inpatients Excel file, Table 20 Cells S22 and AK22
  • Provider inpatients Excel file, Table 17 Cells S22 and AK22
  • LA Excel file, Table 5 Cell S22


This error has now been rectified and the files have been replaced. NHS Digital apologises for any inconvenience caused.

1 December 2021 09:00 AM

Additional Measures

As part of the Mental Health Bulletin 2021-22, NHS Digital published additional measures. Where possible these have been backdated to cover 2020-21. These additional measures have been published in the separate CSV file found below. This data also feeds into the new Power BI tool, which allows users to visualise data across Bulletins. The Power BI tool is available via the data hub or through the 2021-22 Mental Health Bulletin publication page.

24 November 2022 09:30 AM

STP Populations

An issue was identified in the STP ethnicity populations. This affected the calculation of crude rate measures.

This error has now been rectified and the files have been replaced. NHS Digital apologises for any inconvenience caused.

1 February 2023 09:30 AM

Data Quality

Information about issues of coverage and accuracy that are relevant to specific statistics in this release is included earlier in this report and within the relevant reference tables.

Data quality files are produced as part of each monthly MHSDS publication, providing detailed information on the coverage, validity and integrity of the data. These reports can be accessed alongside each monthly publication.


Background

The Mental Health Services Dataset (MHSDS) is a regular return of data generated by providers of Community and Mental Health services in the course of delivering mental health, learning disabilities and autism services to people of all ages in England. The original version of the dataset was first mandated in April 2003 and is acknowledged as the national source of administrative data about NHS funded secondary mental health services for secondary uses. Submission of the dataset is a requirement of the NHS Contract for mental health, learning disability and autism services.

The dataset has gone through several version changes since April 2003 in response to changes to legislation, service models and payment mechanisms. The scope has also been expanded to include independent sector providers of NHS funded mental health services (from April 2010), learning disabilities and autism services (from September 2014) and, since January 2016, mental health services for children and young people. The current version was approved by the Standardisation Committee for Care Information (SCCI) in October 2018 for implementation from 1 April 2019 (DCB0011 Information Standard). Further information on the MHSDS is available.

This section aims to provide users with an evidence-based assessment of the quality of the statistical output of this publication by reporting against the nine European Statistical System (ESS) quality dimensions and principles.

The original quality dimensions are: relevance; accuracy and reliability; timeliness and punctuality; accessibility and clarity; and coherence and comparability. These are set out in Eurostat Statistical Law. However, more recent quality guidance from Eurostat includes some additional quality principles on: output quality trade-offs; user needs and perceptions; performance, cost and respondent burden; and confidentiality, transparency and security.

In addition to being appropriate to this output, these dimensions and principles are also consistent with the UK Statistics Authority (UKSA) Code of Practice for Statistics.

For each dimension, this section briefly describes how it applies to the publication. We will continue to provide clear and comprehensive information about the methods used in our analysis and the quality of the data to assist users in interpreting our reports. More detailed background information will be presented once the quality of the data has been investigated further.

In addition, as the publication is compiled from administrative data, this document also references and signposts further information about the assurance of the MHSDS. Quality assurance of administrative data is an ongoing, iterative process to assess the data's fitness for purpose.


Relevance

This dimension covers the degree to which the statistical product meets user need in both coverage and content.

This publication is the source of official statistics about uses of mental health, learning disabilities and autism services in England during 2020-21. It presents counts of people in contact with these services, inpatient activity, care contacts and uses of restrictive interventions in inpatient services during the annual reporting period.

Content of this publication

This publication includes the following statistical outputs:

  • A summary report containing key findings, a data quality statement and signposts to further information
  • Excel reference tables containing aggregate counts of people in contact with these services, inpatient activity, care contacts and uses of restrictive interventions in inpatient services, in a comparable format to previous years. They also include referrals on Early Intervention for Psychosis (EIP) pathways, the number of people in contact with Specialist Perinatal Mental Health Community Services, discharges followed up within 72 hours of discharge, and referrals and contacts with memory services teams for people with dementia. For 2020-21, these have been split into 7 different Excel reference tables, based on the geographic level being reported (National, CCG/STP/Region, Provider and Local Authority) and the type of patient activity being reported (inpatient or outpatient activity)
  • Interactive data visualisations using Microsoft Power BI which illustrate variations in key statistics across different groups and information about coverage. Further interactive data visualisations are available in the Mental Health Data Hub
  • Machine-readable CSV data file which includes the data in the Excel tables and additional breakdowns relating to 2020-21
  • Metadata and data quality tables. These describe the measures in the publication, explain how they have been derived from record-level administrative data, and provide more detailed information on how they can be interpreted given their level of coverage and completeness


The statistics in this publication are marked as experimental and may be subject to further change as we develop our statistics. Experimental statistics are a subset of newly developed or innovative official statistics undergoing evaluation. The classification of experimental statistics is in keeping with the UK Statistics Authority’s Code of Practice. The Value pillar in the Code of Practice advocates that, as suppliers of information, it is important that we continue “improving existing statistics and creating new ones through discussion and collaboration with stakeholders”. Accordingly, these statistics are published in order to involve users and stakeholders in their development, and as a means to build quality.

Read the UK Statistics Authority Code of Practice for Statistics.

Feedback is welcome; email [email protected].

Completeness

We have assessed completeness at two levels:

  1. Have all eligible organisations submitted data to MHSDS?
  2. Is the data submitted by participating organisations complete? For example, does it cover all relevant sites and services? Have monthly data been submitted consistently and at the expected volumes?

Completeness: Have all eligible organisations submitted data to the MHSDS?

Information on this can be found in the ‘Is this information complete?’ section of this report, with detailed supporting information in the Metadata and Data Quality Tables accompanying this publication. The number of organisations that have provided at least one submission to the MHSDS during 2020-21 is summarised in Table 1.


Completeness: Is the data submitted by participating organisations complete?

Not all organisations are yet submitting data to the MHSDS. Amongst those organisations that did submit, some of the data is not complete.

Users interested in reviewing monthly data volumes can explore organisation level statistics and record counts in Data Quality Table 1 and Data Quality Table 2 of the Metadata and Data Quality Tables which accompany this publication. Please note that there is no direct relationship between the record counts for any one, or all, of the submitted tables and the measures presented in this publication, which are derived from a combination of these items. The impact of these combined issues can be explored using the people in contact with services statistics presented in Data Quality Table 2.

Users interested in reviewing changes in the number of people known to be in contact with mental health or learning disabilities and autism services can explore Data Quality Table 3. As the number of submitters to the MHSDS has increased year-on-year, this methodology analyses a consistent cohort of providers: those that submitted 12 months of data to the MHSDS in both 2019-20 and 2020-21. It also includes any providers that submitted for the first time in 2020-21 where the activity was confirmed to be new.
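As a rough illustration of how such a consistent cohort could be selected, the Python sketch below keeps providers that submitted all 12 months in both years and then adds first-time 2020-21 submitters whose activity is confirmed as new. The input layout, field names and the ‘confirmed new’ set are assumptions made for this example; they do not reflect the structure of the MHSDS or of the Data Quality Tables.

# Illustrative sketch only: selecting a consistent cohort of providers for
# year-on-year comparison. The input layout is assumed for this example.
from collections import defaultdict

def consistent_cohort(submissions, confirmed_new=frozenset()):
    # submissions: iterable of (provider_code, financial_year, month) tuples,
    # e.g. ("RXX", "2020-21", "April")
    # confirmed_new: codes of first-time 2020-21 submitters whose activity
    # has been confirmed as new
    months = defaultdict(set)
    for provider, year, month in submissions:
        months[(provider, year)].add(month)
    providers = {provider for provider, _ in months}
    cohort = {
        p for p in providers
        if len(months[(p, "2019-20")]) == 12 and len(months[(p, "2020-21")]) == 12
    }
    return cohort | set(confirmed_new)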

As part of monthly MHSDS publications, national and organisation level data quality measures are shown that validate a selection of key data items by provider. For the 2020-21 year, the data quality CSV files in each monthly publication contain a variety of data quality information relevant to these statistics:

  • Provider Feedback: This worksheet includes a qualitative summary of any data quality issues, by provider, resulting from validation and investigation of each month’s data by NHS Digital and the provider.
  • Coverage: This worksheet shows the number of records submitted each month by organisation and by MHSDS table. This information has been replicated in Data Quality Table 1 in this publication.
  • Validity Count / Percentage: These worksheets include, for each organisation, the numbers and proportions of records which have ‘valid’, ‘other’, ‘default’, ‘invalid’ and ‘missing’ values for selected key data items.

To support providers in making accurate monthly submissions to the MHSDS, NHS Digital provides feedback to providers at different stages in the data flow (including reports at the point of submission). These include detailed Data Summary Reports about coverage, volume, code validity and data consistency.

In April 2020 a new submission model, the Multiple Submission Window Model (MSWM), was introduced. For the 2019-20 reporting period this acted as a “Refresh” of the data for the year, allowing providers to submit any periods they had missed during the year or to resubmit any period of data within 2019-20. This new approach is intended to give providers some flexibility to improve their submissions by amending incorrect data or adding previously missing data.

For the 2020-21 period, this submission model has allowed providers to update data for any reporting period that has passed in the financial year. There will be 12 months during which April and May data (the start of the financial year) may be submitted, but only two months during which March data (the end of the financial year) may be submitted. The provisional submission window replaces the current primary window, and the performance window replaces the current refresh window. Data can then be resubmitted in each update window. The final window will be the last chance to amend data for the financial year. The ‘last good file’ submitted for each month will be used to generate the final statistics at the end of the year.
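As a simplified illustration of the ‘last good file’ rule, the sketch below keeps, for each reporting month, the most recent submission that passed validation. The record layout and the validation flag are assumptions made for this example; they do not reflect the actual MHSDS processing specification.

# Illustrative sketch only: keep the latest valid submission ("last good
# file") for each reporting month. Field names are assumed for this example.
def last_good_files(submissions):
    # submissions: iterable of dicts with keys 'reporting_month',
    # 'submitted_at' (a sortable timestamp) and 'passed_validation' (bool)
    latest = {}
    for sub in submissions:
        if not sub["passed_validation"]:
            continue
        month = sub["reporting_month"]
        if month not in latest or sub["submitted_at"] > latest[month]["submitted_at"]:
            latest[month] = sub
    return latest  # one 'last good file' per reporting month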

NHS Digital is working together with NHS England, NHS Improvement and the Care Quality Commission (CQC) to drive up organisational coverage and data quality in the MHSDS. As well as ongoing efforts with individual providers, NHS Digital has also published detailed implementation planning guidance.


Further information about the Multiple Submission Window Model is available.


Accuracy & Reliability

This dimension covers the closeness between an estimated value and the unknown true value of the statistic.

The MHSDS is a rich, referral-level dataset that records packages of care received by individuals as part of referrals into treatment within NHS funded specialist mental health, learning disabilities and autism services. These packages of care vary widely, which means that each record contains different elements of the dataset. Therefore, no single approach can measure the completeness and accuracy of the data collected and reported nationally. However, NHS Digital provides several different reports at different stages in the data flow to ensure that the submitted data reflect the services that have been provided. These reports are summarised in the previous section.

NHS Digital has developed methods for accurately producing counts of activity within these services. Whilst these methods can produce counts of events from the data source with a high degree of accuracy, issues with completeness affect the accuracy of the outputs produced using them.

The Metadata and Data Quality Tables accompanying this publication contain further details including definitions for measures included in this publication and technical descriptions of the constructions used to produce these measures from the data sources.

Sources of error and bias

The main source of error and bias in these statistics is the completeness of the MHSDS data (see ‘Relevance’ for a summary of these issues). At national level, this results in a downward bias to counts and any derived rates.

Some of these issues will have been present in statistics for previous years. As such, while all coverage issues in 2020-21 will result in a downward bias in comparison to the true figure, not all will make the statistics for 2020-21 incomparable with previous years.

The impact of coverage issues on the reliability of individual figures varies according to what is being counted and at what level the data is being used. General guidance on bias in this publication is provided in the Further information about this publication section of this report, along with guidance in the notes accompanying each reference table.


Coherence and Comparability

Coherence is the degree to which data which have been derived from different sources or methods but refer to the same topic are similar. Comparability is the degree to which data can be compared over time and domain.

The number of providers of adult mental health or learning disability and autism services submitting data is not consistent with the number that previously made monthly MHLDDS submissions. The total number of providers submitting data has increased from 178 in 2018-19 to 335 in 2020-21. The MHSDS also contains information about mental health services for children and young people.

We are working closely with providers who have not yet submitted data and expect coverage and data quality to increase for this area over the coming months.

The impact of COVID-19 must also be considered. The COVID-19 pandemic has had a large impact on society and health services around the world. This disruption is now starting to affect the quality and coverage of some of our statistics, such as an increase in non-submissions for some datasets. We are also starting to see some different patterns in the submitted data: for example, fewer patients are being referred to hospital and more appointments are being carried out via phone, telemedicine or email. Therefore, data should be interpreted with care over the COVID-19 period.


Geographical comparisons

Included in these statistics are geographical analyses including breakdowns by:

  • Service provider organisation the person is in contact with
  • Clinical Commissioning Group (CCG) which contains the General Practitioner (GP) practice the person in contact with services is registered with. If information on the GP registration of the person is not available, then this is based on the boundaries of the CCG in which the person lives
  • NHS Commissioning Region. This is based on the CCG breakdown above
  • Sustainability and Transformation Partnership (STP) footprint areas. This is based on the CCG breakdown above
  • Local Authority District or Unitary Authority in which the person in contact with services lives

Comparison between different areas may be affected by coverage and completeness issues to a varying extent. The statistics presented for each area can be used to understand possible differences that exist across areas; however, any differences identified should be considered against other available sources of information and may be subject to further investigation.

Comparisons using population-based rates

We have presented population-based rates in this publication, which allow comparisons to be made between different groups of people. We have presented crude rates for gender and age groups and standardised rates for ethnicity. These ethnicity rates use direct standardisation to adjust for the different gender and age profile of each ethnic group, allowing us to make better comparisons on a more like-for-like basis. We have also calculated 95% confidence intervals which are presented alongside the rates.
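As a simplified illustration of these calculations, the Python sketch below computes a crude rate, a directly standardised rate and an approximate 95% confidence interval per 100,000 population. The figures are invented; for brevity the standardisation is over a single dimension (age band), whereas the publication adjusts for both gender and age; and the confidence interval uses a simple normal approximation to the Poisson error, which may differ from the method used for the published intervals.

# Illustrative sketch only: crude and directly standardised rates per
# 100,000 population, with an approximate 95% confidence interval.
def crude_rate(events, population, per=100_000):
    # events divided by population, scaled per 100,000
    return events / population * per

def standardised_rate(events_by_band, pop_by_band, std_pop_by_band, per=100_000):
    # weight each band's rate by that band's share of the standard population
    total_std = sum(std_pop_by_band)
    return per * sum(
        (e / n) * (s / total_std)
        for e, n, s in zip(events_by_band, pop_by_band, std_pop_by_band)
    )

def approx_ci(events, population, per=100_000, z=1.96):
    # approximate 95% CI for a crude rate, treating the count as Poisson
    # and using a normal approximation (simplified for illustration)
    rate = crude_rate(events, population, per)
    se = (events ** 0.5) / population * per
    return rate - z * se, rate + z * se

# Invented example with three age bands
events = [120, 340, 210]                    # people in contact with services
population = [50_000, 80_000, 60_000]       # local population in each band
std_population = [60_000, 70_000, 70_000]   # standard population in each band

print(crude_rate(sum(events), sum(population)))
print(standardised_rate(events, population, std_population))
print(approx_ci(sum(events), sum(population)))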

Population figures do not match across the reference tables as we have used the latest figures available in each case at the time of production. For gender and age, we have used ONS mid-year population estimates from 2020. Ethnicity data uses the 2011 Census, as mid-year estimates are not produced for this breakdown. This Census data will not reflect any changes in the ethnic structure of the population in England which have occurred since 2011. When deriving rates, this creates a mismatch as the numerator is based on 2020-21 data but the denominator uses 2011 data; this mismatch will increase until the next Census data is published.


Timeliness and punctuality

Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.

This report has been produced within ten months of the end of the reporting period, in line with the planned timeliness for the 2020-21 edition of this publication series. It was produced within eight months of the May deadline for providers to submit final data for March 2021, the last month of the annual reporting period.


Accessibility and Clarity

Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.

These statistics have been provided in both human readable (MS Excel) and machine readable (Comma Separated Values - CSV) form. Interactive data visualisations are provided using Microsoft Power BI reports. These will be of interest to all users and can be used to further explore local data quality issues.

Re-use of our data is subject to the conditions outlined in our terms and conditions.

Definitions for measures included in this publication are available in the accompanying metadata file. Terminology is defined where appropriate. These definitions will be developed, and further guidance provided in future editions of this publication series.

Full details of the way that MHSDS returns are processed, which will be of use to analysts and other users of these data, are provided in the MHSDS user guidance, available on the NHS Digital website.


Trade-offs between output quality components

This dimension describes the extent to which different aspects of quality are balanced against each other.

The format of this publication has been determined by the need to summarise key annual measures while adjusting the scope of analysis to what is achievable within NHS Digital resources and production time. Nevertheless, we have widened the scope of this annual publication to increase the usefulness and usability of these statistics for different users. This includes new methods of presenting data (including additional Microsoft Power BI reports) and reporting on: access and waiting times for Early Intervention for Psychosis (EIP); people in the perinatal period with an open mental health referral; discharges from adult acute mental health service beds followed up within 72 hours of discharge; and referrals and contacts with memory services teams for people with dementia.

By publishing this information, we hope to promote a cycle of improving data and service quality through more frequent monitoring, feedback on data quality and completeness, and use.


Assessment of User Needs and Perceptions

This dimension covers the processes for finding out about users and uses and their views on the statistical products.

The purpose of the Mental Health Bulletin is to be the most comprehensive picture available of people who used secondary mental health, learning disabilities and autism services in England during the financial year. This is intended to provide policy makers, service commissioners, patients and citizens with an understanding of users of NHS funded secondary mental health, learning disabilities and autism services nationally and locally.

Regular consultation with customers and stakeholders is undertaken to ensure that developments introduced to the publication meet their requirements. We undertook a consultation on our adult mental health statistics during 2015 and published the results in November 2015. A further public consultation was undertaken in August 2020 on the removal of Care Programme Approach (CPA) measures, as these measures were no longer considered a priority and the monitoring of individual elements of good care was considered to deliver greater benefit.


Balance between performance, cost and respondent burden

This dimension describes the effectiveness, efficiency and economy of the statistical output.

The dataset preceding the MHSDS (the MHLDDS) was identified, as part of the Fundamental Review of Returns programme designed to reduce burden on the NHS, as the data source to replace others. As a secondary uses dataset, it is intended to re-use clinical and operational data from administrative sources, reducing the burden on data providers of having to submit information through other primary collections.

As part of the agenda to reduce burden of data collections, it is anticipated that the Assuring Transformation collection for LDA inpatients will be retired when the data in MHSDS is of sufficient quality and completeness to provide equivalent data measures. The KP90 collection for people detained in hospital or subject to Community Treatment Orders under the Mental Health Act has now been retired and MHSDS is the official source of these statistics from 2016-17 onwards.


Confidentiality, Transparency and Security

This dimension covers the procedures and policies used to ensure sound confidentiality, security and transparent practices.

Submissions have been processed in line with the rules described in the Technical Output Specification for the dataset, using a fully assured system that pseudonymises individual identifiers. As for all NHS Digital publications, the risk of disclosing an individual’s identity in this publication series has been assessed, and the data is published in line with a Disclosure Control Method for the dataset approved by NHS Digital’s Disclosure Control Panel.

Please see links below to relevant NHS Digital information:

How we look after information

Freedom of Information Process

Privacy and Data Protection



Last edited: 1 February 2023 9:20 am