Learning Disability Services Monthly Statistics (AT: August 2020, MHSDS: June 2020 Final)

Official statistics, Experimental statistics

Data Quality

MHSDS LDA Background

The LDA data is sourced from MHSDS, which is a regular return of data generated by providers of Community and Mental Health services in the course of delivering mental health, learning disability and autism services to people of all ages in England. The original version of the dataset was first mandated in April 2003 and is acknowledged as the national source of administrative data about NHS funded secondary mental health services for secondary uses. Submission of the dataset is a requirement of the NHS Contract for mental health, learning disability and autism services.

The dataset has gone through a number of version changes since April 2003 in response to changes to legislation, service models and payment mechanisms. The scope has also been expanded to include independent sector providers of NHS funded mental health services (from April 2010), learning disabilities and autism services (from September 2014) and, since January 2016, mental health services for children and young people. The current version was approved by the Standardisation Committee for Care Information (SCCI) in October 2018 for implementation from 1 April 2019. Further information on the MHSDS and on the Information Standard is available.

This section aims to provide users with an evidence based assessment of the quality of the statistical output of the MHSDS Monthly Reports publication by reporting against those of the nine European Statistical System (ESS) quality dimensions and principles¹ appropriate to this output.

In doing so, this meets our obligation to comply with the UK Statistics Authority (UKSA) Code of Practice for Official Statistics, particularly Principle 4, Practice 2, which states:

“Ensure that official statistics are produced to a level of quality that meets users’ needs, and that users are informed about the quality of statistical outputs, including estimates of the main sources of bias and other errors and other aspects of the European Statistical System definition of quality”.

For each dimension this section briefly describes how this applies to the publication. We will continue to provide clear and comprehensive information about the methods used in our analysis and the quality of the data to assist users in interpreting our reports. More detailed background information will be presented once the quality of the data has been investigated further.

¹ The original quality dimensions are: relevance, accuracy and reliability, timeliness and punctuality, accessibility and clarity, and coherence and comparability; these are set out in Eurostat Statistical Law. However more recent quality guidance from Eurostat includes some additional quality principles on: output quality trade-offs, user needs and perceptions, performance cost and respondent burden, and confidentiality, transparency and security.


Relevance

This dimension covers the degree to which the statistical product meets user need in both coverage and content.

This publication comprises a report which has been produced from learning disabilities and autism service providers’ MHSDS final data for the reporting period covered by this publication page. The information provided in this publication series is the most timely available from providers for LDA services in England.

The statistics in this publication are marked as experimental and may be subject to further change as we develop our statistics. The classification of experimental statistics is in keeping with the UK Statistics Authority’s Code of Practice. Experimental statistics are new official statistics that are undergoing evaluation. They are published in order to involve users and stakeholders in their development, and as a means to build quality at an early stage. The UK Statistics Code of Practice states that “effective user engagement is fundamental to both trust in statistics and securing maximum public value…” and that as suppliers of information, it is important that we involve users in the evaluation of experimental statistics.

Read the UK Statistics Code of Practice.

Feedback is very welcome via  (please quote ‘MHSDS LDA Monthly’ in the subject line).

Accuracy and reliability

This dimension covers the proximity between an estimate and the unknown true value.

The MHSDS is a rich, referral level dataset that records packages of care received by individuals as part of referrals into treatment within NHS funded learning disabilities and autism services, and these packages of care vary widely. This means that each record contains different elements of the dataset. Therefore, no single approach can measure the completeness and accuracy of the data collected and reported nationally. However, NHS Digital provides a number of different reports at different stages in the data flow to ensure that the submitted data reflect the services that have been provided.

For data suppliers only

At the point of submission:

  • Providers receive immediate feedback on the quality of their submission, including detailed Data Summary Reports about coverage, volume, code validity and data consistency. Providers have the opportunity to re-submit data up to the deadline and to send a refresh submission one month later.

On receipt of processed data by NHS Digital:

  • Where there are concerns about data quality we contact providers directly so that any issues with local data extraction processes can be addressed for a future submission. These checks are currently limited to key elements of the dataset; additional checks will be developed for future submissions until they offer the same level of coverage as those previously available for MHLDDS submissions. We also issue individual monthly Data Quality Notices to all providers highlighting key data quality issues.

Interpreting uses of restrictive interventions in inpatient services

The MHSDS is derived from administrative systems rather than being a specific purposeful collection. The quality and completeness of particular data items from particular data providers is dependent on the use to which the data has so far been put. Due to this, data items which are used regularly for analysis, or which have been the focus of particular attention, will be of better quality than less frequently used items, or those for which in depth analysis has not yet taken place. Monthly figures on the use of restrictive interventions in inpatient learning disabilities and autism services were first published for January 2019. As such, the data used to derive these figures fall into the latter group and may be of lower quality than other data used in this publication.

Further assessment of the quality and completeness of these experimental statistics will follow this publication. This will use the statistics presented here as the basis of discussions with service providers to understand any issues, and to guide the development of the methodologies used to create statistics on the use of restrictive interventions in these services.

The data used to derive these figures may contain duplicates. Multiple interventions with identical dates and details (intervention type and duration) for the same individual have been identified. It is currently unknown whether these values are duplicates, recording errors or genuine separate incidences, and therefore no data has been excluded. These potential issues may lead to the number of restrictive interventions shown in this publication being unreliable. As such these figures should be used with caution.
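As an illustration, potential duplicates of this kind can be flagged (without excluding them) by grouping records on the fields that should identify a single incident. The sketch below is a minimal example in Python; the record layout and field values are hypothetical, not the actual MHSDS specification:

```python
from collections import Counter

# Hypothetical restraint records: (person ID, intervention type, start date, duration in minutes).
records = [
    ("P001", "Physical", "2020-06-03", 5),
    ("P001", "Physical", "2020-06-03", 5),   # identical details: possible duplicate
    ("P002", "Seclusion", "2020-06-10", 120),
]

# Count occurrences of each identical combination of details.
key_counts = Counter(records)

# Flag combinations appearing more than once; nothing is excluded, because
# these may be genuine separate incidences rather than duplicates.
flagged = {key: n for key, n in key_counts.items() if n > 1}

# Counts of interventions are affected by any duplication; counts of distinct
# people subject to interventions are not.
total_interventions = len(records)
people_count = len({rec[0] for rec in records})
```

This also illustrates why counts of people are more reliable than counts of interventions: the distinct-person count is unchanged however many times an identical record is repeated.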

Statistics showing the number of people subject to restrictive interventions will not be affected by duplication and so are more reliable than statistics on the number of restrictive interventions. However, statistics on the number of people subject to restrictive interventions are still subject to other potential quality and completeness limitations. These statistics should be used in light of these limitations.

There has been a general increase in the quality and completeness of restraints information submitted by providers over time. Over the COVID period, there has been a notable increase in the number of restraints for some providers because more, shorter episodes of restraint per patient have been recorded separately (as per guidance). Caution should therefore be taken when interpreting the data.

For all users

As part of the main MHSDS publication, national and organisation level data quality measures are shown that validate a selection of key data items by provider. These show the counts and percentages of records which have ‘valid’, ‘other’, ‘default’, ‘invalid’ and ‘missing’ values for key elements of the dataset, such as Team Type and Primary Reason for Referral. A coverage report shows the number of providers submitting data each month and the number of records by provider and by table. These elements will be expanded upon in future submissions.
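The validity breakdown described above can be sketched as follows. The code sets here are hypothetical stand-ins for the actual MHSDS value sets, which are defined in the Technical Output Specification:

```python
# Hypothetical value sets for a key data item such as Team Type.
VALID_CODES = {"A01", "A02", "A03"}
OTHER_CODES = {"X01"}      # recognised but non-specific codes
DEFAULT_CODES = {"ZZ"}     # 'not known' style default codes

def categorise(value):
    """Classify a submitted value as valid/other/default/invalid/missing."""
    if value is None or value == "":
        return "missing"
    if value in VALID_CODES:
        return "valid"
    if value in OTHER_CODES:
        return "other"
    if value in DEFAULT_CODES:
        return "default"
    return "invalid"

submitted = ["A01", "A02", "", "ZZ", "Q99", "A01"]
counts = {}
for v in submitted:
    cat = categorise(v)
    counts[cat] = counts.get(cat, 0) + 1

# Percentages of all records submitted for the item.
percentages = {cat: 100 * n / len(submitted) for cat, n in counts.items()}
```

The published measures report both forms, so users can judge a provider's absolute record volume alongside its relative validity.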

All providers of NHS funded specialist mental health and learning disability services should submit to the MHSDS. However, at present not all independent sector providers are making submissions, and this has an impact on the completeness of the data, particularly in areas such as inpatient care and forensic services, where the independent sector provides much of the NHS funded care. A coverage report is included within the main MHSDS publication showing the number of providers submitting each month and the number of records submitted. When an organisation starts or ceases to submit data this can affect overall record numbers.

The main MHSDS publication and associated files can be found here:

For LDA, the number of providers submitting data is lower than the number recorded as providing services in the Assuring Transformation publication. To reflect this low coverage, England/national totals are not displayed for LDA measures; instead the ‘Total of submitted data’ is presented. Caution is advised when interpreting these data as they may under-represent LDA services at a national level.

To support NHS England’s change in Care and Treatment Review (CTR) policy from December 2017, we have included some data quality analysis on the CTR data within the LDA reference tables. We have also included a data quality tab which includes the counts below on the use of the three SNOMED codes that relate to CTRs.

  • The number of providers who are submitting the CTR SNOMED codes
  • The number of each CTR SNOMED code being used within the month

Care and Treatment Reviews (CTRs) were developed to improve the care of people with learning disabilities, autism or both in England with the aim of reducing admissions and unnecessarily lengthy stays in hospitals and reducing health inequalities.

Timeliness and punctuality

Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.

The MHSDS LDA reports have been produced within three months of the end of the reporting period and five weeks of the submission deadline.

The submission deadlines for MHSDS are published here:

Accessibility and clarity

Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.

Re-use of our data is subject to conditions outlined here:

Definitions for measures included in this publication are available in the accompanying metadata file. Terminology is defined where appropriate. These definitions will be developed, and further guidance provided in future editions of this publication series when needed.

Full details of the way that MHSDS returns are processed, which will be of use to analysts and other users of these data, are provided in the MHSDS User Guidance, available on the NHS Digital website:

Coherence and comparability

Coherence is the degree to which data which have been derived from different sources or methods but refer to the same topic are similar. Comparability is the degree to which data can be compared over time and domain.

From October 2016, the MHSDS LDA inpatient data has been compared to the Assuring Transformation collection. There is a slight difference in scope between these two data collections: the MHSDS data are submitted by providers based in England and cover care provided in England, which may be commissioned outside England, whereas the Assuring Transformation data are provided by English commissioners and cover care commissioned in England, which will typically be provided in England but may be provided elsewhere in the UK. These comparators will be used to reconcile and understand the difference between the inpatient counts in the two collections, as it is anticipated that the Assuring Transformation collection will be retired and MHSDS will be the long-term vehicle for reporting on LDA inpatients.

Many of the measures in the monthly comparator data file were previously part of the MHLDS Monthly Reports. There are notes in the metadata file about which measures previously published in MHLDS Monthly Reports can be compared to measures in the monthly comparator data file, although it may take more time for these measures to support direct like-for-like comparisons.

MHSDS and Assuring Transformation comparison data file

This file has been created to compare data from MHSDS to the Assuring Transformation (AT) publications. The measures in this data file include comparisons of LDA hospital inpatient counts at the start and end of the month along with inpatient flows. Provider counts from MHSDS and AT are then compared. Additional comparators may be added in the future. For further details about the Assuring Transformation collection please visit:
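The inpatient counts and flows described above can be illustrated with a short sketch. The hospital spells and dates below are hypothetical; they simply show how a start-of-month count, end-of-month count, and in-month admission and discharge flows relate to each other:

```python
from datetime import date

# Hypothetical hospital spells: (admission date, discharge date or None if still in hospital).
spells = [
    (date(2020, 4, 10), None),               # admitted before June, still an inpatient
    (date(2020, 6, 5), date(2020, 6, 20)),   # admitted and discharged within June
    (date(2020, 5, 1), date(2020, 6, 2)),    # inpatient at start of June, discharged in June
    (date(2020, 6, 25), None),               # admitted in June, inpatient at month end
]

month_start, month_end = date(2020, 6, 1), date(2020, 6, 30)

def inpatients_on(day):
    """Count spells that are open at the end of the given day."""
    return sum(1 for adm, dis in spells
               if adm <= day and (dis is None or dis > day))

at_start = inpatients_on(month_start)
at_end = inpatients_on(month_end)
admissions = sum(1 for adm, _ in spells if month_start <= adm <= month_end)
discharges = sum(1 for _, dis in spells if dis and month_start <= dis <= month_end)

# Consistency check: end count = start count + admissions - discharges.
assert at_end == at_start + admissions - discharges
```

Comparing these four quantities between MHSDS and AT, provider by provider, is the kind of reconciliation the comparison data file supports.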

Trade-offs between output quality components

This dimension describes the extent to which different aspects of quality are balanced against each other.

Although the collection of LDA data via MHSDS commenced in January 2016, some providers continue to experience issues making a comprehensive submission within the permitted timescales. We expect a more complete and accurate picture to emerge over time. This analysis presents an early view and is subject to caveats both in terms of the completeness of the submission, particularly for services that have only come within scope of the dataset since 1 January 2016, and the limits of the data that could be provided about pathways into services to support monitoring of waiting times.

The format of this publication has been determined to enable timely reporting of key measures while adjusting the scope of analysis to be achievable within NHS Digital resources and production time. Further work on data quality issues with providers is planned to help increase the usefulness and usability of these statistics for different users. Through this work, we hope to support discussions with and between providers and commissioners about caseload and activity to help narrow the differences between the two data sources, so we can look to move to one official source of LDA information in the long term.

Assessment of user needs and perceptions

This dimension covers the processes for finding out about users and uses and their views on the statistical products.

The purpose of the MHSDS LDA monthly reports is to provide learning disability and autism service providers, commissioners and other stakeholders with timely information about caseload and activity. This is intended to support changes in commissioning arrangements as services move from block commissioning to commissioning based on activity, caseload and outcomes for patients.

We undertook a consultation on our adult mental health statistics during 2015 and published the results in November 2015. Changes to the MHLDS Monthly Reports that were previously published from MHLDDS are described in a Methodological Change Paper. The introduction of statistics to support the monitoring of waiting times is in line with the ambitions set out in the NHS England’s Five Year Forward View for Mental Health and we will introduce further waiting time measurements in line with priorities identified with interested parties.

Regular consultation with customers and stakeholders is undertaken to ensure that developments introduced to the publication meet their requirements.

Performance, cost and respondent burden

This dimension describes the effectiveness, efficiency and economy of the statistical output.

The dataset preceding MHSDS (MHLDDS) was identified as the data source to replace others in the Fundamental Review of Returns programme designed to reduce burden on the NHS. As a secondary uses dataset, it is intended to re-use clinical and operational data from administrative sources, reducing the burden on data providers of having to submit information through other primary collections. As part of the agenda to reduce the burden of data collections, it is anticipated that the Assuring Transformation collection for LDA inpatients will be retired when the data in MHSDS is of sufficient quality and completeness to provide equivalent data measures.

Confidentiality, transparency and security

This dimension covers the procedures and policies used to ensure sound confidentiality, security and transparent practices.

Submissions have been processed in line with the rules described in the Technical Output Specification for the dataset² using a fully assured system that pseudonymises individual identifiers. As for all NHS Digital publications, the risk of disclosing an individual’s identity in this publication series has been assessed and the data are published in line with a Disclosure Control Method for the dataset approved by NHS Digital’s Disclosure Control Panel.

Please see links below to relevant NHS Digital policies:

Statistical Governance Policy

Freedom of Information Process

A Guide to Confidentiality in Health and Social Care

Privacy and Data Protection


² Technical output specification -

Further Information

Please see below for links to statistics and information that you may find useful:

Main MHSDS publication:

NHS Workforce Statistics including experimental report on Mental Health and Learning Disability workforce:

Health and Care of People with Learning Disabilities:

Learning Disabilities Census (2013, 2014 and 2015):

Primary care (GP data) - See link for the file “QOF 2017-18: Prevalence, achievements and exceptions at regional and national level”:

NHS England Assuring Transformation (prior to March 2015) info:

Social care publications:

Learning Disabilities Mortality review:

Adult Psychiatric Morbidity Survey:

Mental Health of Children and Young People in England Survey

Last edited: 9 November 2021 6:07 pm