
Learning Disability Services Monthly Statistics, AT: November 2021, MHSDS: September 2021 Final

Official statistics, Experimental statistics


AT data tables - total length of stay

The Assuring Transformation (AT) data tables were updated due to an issue with historic length of stay data being incorrectly calculated in table 2.7 (now removed). This has been replaced with table 3.4 which contains the correct data. Figures for November 2021 were unaffected.

20 December 2021 10:30 AM

Data quality statement - MHSDS

Purpose of document

This data quality statement aims to provide users with an evidence-based assessment of the quality of the statistical output included in this publication.

It reports against those of the nine European Statistical System (ESS) quality dimensions and principles that are appropriate to this output. The original quality dimensions are: relevance; accuracy and reliability; timeliness and punctuality; accessibility and clarity; and coherence and comparability. These are set out in European statistical law. However, more recent quality guidance from Eurostat includes some additional quality principles on: output quality trade-offs; user needs and perceptions; performance, cost and respondent burden; and confidentiality, transparency and security.

In doing so, this meets NHS Digital’s obligation to comply with the UK Statistics Authority (UKSA) code of practice for statistics and the following principles in particular:

  • Trustworthiness pillar, principle 6 (Data governance) which states “Organisations should look after people’s information securely and manage data in ways that are consistent with relevant legislation and serve the public good.”
  • Quality pillar, principle 3 (Assured Quality) which states “Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely.”
  • Value pillar, principle 1 (Relevance to Users) which states “Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported.”
  • Value pillar, principle 2 (Accessibility) which states “Statistics and data should be equally available to all, not given to some people before others. They should be published at a sufficient level of detail and remain publicly available.”


Relevance

This dimension covers the degree to which the statistical product meets user need in both coverage and content.

This publication comprises a report produced from learning disabilities and autism (LDA) service providers’ MHSDS final data for the reporting period covered by this publication page. The information provided in this publication series is the most timely available from providers of LDA services in England.

All providers of NHS funded specialist mental health and learning disability services should submit to the MHSDS. However, at present not all independent sector providers are making submissions, and this has an impact on the completeness of the data, particularly in areas such as inpatient care and forensic services, where the independent sector provides much of the NHS funded care. A coverage report is included within the main MHSDS publication showing the number of providers submitting each month and the number of records submitted. When an organisation starts or ceases to submit data this can affect overall record numbers.

The main MHSDS publication and associated files can be found here:

For LDA, the number of providers submitting data is lower than the number recorded as providing services in the Assuring Transformation publication. To reflect this low coverage, England/national totals are not displayed for LDA measures; instead a ‘Total of submitted data’ is presented. Caution is advised when interpreting these data as they may under-represent LDA services at a national level.

Accuracy and reliability

This dimension covers the closeness between an estimated value and the unknown true value.

Experimental statistics

The statistics in this publication are marked as experimental and may be subject to further change as we develop our statistics. The classification of experimental statistics is in keeping with the UK Statistics Authority’s Code of Practice for Statistics. Experimental statistics are new official statistics that are undergoing evaluation. They are published in order to involve users and stakeholders in their development, and as a means to build quality at an early stage. The Code of Practice states that “effective user engagement is fundamental to both trust in statistics and securing maximum public value…” and, as suppliers of information, it is important that we involve users in the evaluation of experimental statistics.


Submission validations

The MHSDS is a rich, referral level dataset that records packages of care received by individuals as part of referrals into treatment within NHS funded learning disabilities and autism services, and these packages of care vary widely. This means that each record contains different elements of the dataset. Therefore, no single approach can measure the completeness and accuracy of the data collected and reported nationally. However, NHS Digital provides a number of different reports at different stages in the data flow to ensure that the submitted data reflect the services that have been provided.

At the point of submission:

  • Providers receive immediate feedback on the quality of their submission, including detailed Data Summary Reports about coverage, volume, code validity and data consistency. Providers have the opportunity to re-submit data up to the deadline and to send a refresh submission one month later.

On receipt of processed data by NHS Digital:

  • Where there are concerns about data quality we contact providers directly so that any issues with local data extraction processes can be addressed for a future submission. These checks are currently limited to key elements of the dataset. Additional checks will be developed for future submissions until they offer the same level of coverage as those previously available for MHLDDS submissions. We also issue individual monthly Data Quality Notices to all providers highlighting key data quality issues.


Data quality reporting

As part of the main MHSDS publication national and organisation level data quality measures are shown that validate a selection of key data items by provider. These show the proportion of records as counts and percentages which have ‘valid’, ‘other’, ‘default’, ‘invalid’ and ‘missing’ values for key elements of the dataset, such as Team Type and Primary Reason for Referral. A coverage report shows the number of providers submitting data each month and the number of records by provider and by table. These elements will be expanded upon in future submissions.
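The valid/other/default/invalid/missing breakdown described above can be sketched as a simple classification over submitted values. This is an illustrative sketch only, not NHS Digital's production method: the category code lists below (`VALID`, `DEFAULT`) are made up, as the real lists for each data item are defined in the MHSDS specification, and the 'other' category is omitted for brevity.

```python
from collections import Counter

# Hypothetical code lists for a single data item (e.g. Team Type);
# the real valid and default code lists are defined per data item in
# the MHSDS specification and are not reproduced here.
VALID = {"C02", "C03"}
DEFAULT = {"ZZ"}

def quality_breakdown(values):
    """Classify each submitted value and return counts and percentages,
    mirroring the style of the published data quality measures."""
    counts = Counter()
    for v in values:
        if v is None or v == "":
            counts["missing"] += 1
        elif v in VALID:
            counts["valid"] += 1
        elif v in DEFAULT:
            counts["default"] += 1
        else:
            counts["invalid"] += 1
    total = sum(counts.values())
    # Report each category as (count, percentage of all records)
    return {k: (n, round(100 * n / total, 1)) for k, n in counts.items()}

print(quality_breakdown(["C02", "ZZ", "", "X9", "C03"]))
```

Reporting counts alongside percentages, as the published measures do, lets users judge both the scale and the rate of a data quality issue for each provider.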

To support NHS England’s change in Care and Treatment Review (CTR) policy from December 2017, we have included some data quality analysis on the CTR data within the LDA reference tables. We have also included a data quality tab which includes the counts below on the use of the three SNOMED codes that relate to CTRs.

  • The number of providers who are submitting the CTR SNOMED codes
  • The number of each CTR SNOMED code being used within the month

Care and Treatment Reviews (CTRs) were developed to improve the care of people with learning disabilities, autism or both in England with the aim of reducing admissions and unnecessarily lengthy stays in hospitals and reducing health inequalities.
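The two counts in the data quality tab can be sketched as follows. This is an illustrative sketch under stated assumptions: the provider codes and SNOMED code labels below are made up, since the three real CTR-related SNOMED codes are defined in the MHSDS guidance and are not reproduced here.

```python
from collections import Counter

# Hypothetical submission rows: one row per CTR SNOMED code recorded
# by a provider within the month.
submissions = [
    {"provider": "RA1", "snomed": "CTR_CODE_A"},
    {"provider": "RA1", "snomed": "CTR_CODE_B"},
    {"provider": "RB2", "snomed": "CTR_CODE_A"},
]

# Count 1: the number of providers submitting any CTR SNOMED code
providers_submitting = len({row["provider"] for row in submissions})

# Count 2: the number of times each CTR SNOMED code is used in the month
code_usage = Counter(row["snomed"] for row in submissions)

print(providers_submitting)
print(dict(code_usage))
```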


Comparing MHSDS and Assuring Transformation data

The MHSDS and AT comparators file has been created to compare data from MHSDS to the AT publications. The measures in this data file include comparisons of LDA hospital inpatient counts at the start and end of the month along with inpatient flows.

Many of the measures in the monthly comparator data file were previously part of the MHLDS Monthly Reports. There are notes in the metadata file about which measures previously published in MHLDS Monthly Reports can be compared to measures in the monthly comparator data file, although it may take more time for these measures to support direct like-for-like comparisons.

For further details about the Assuring Transformation collection please visit:


Interpreting uses of restrictive interventions in inpatient services

The MHSDS is derived from administrative systems rather than being a specific, purpose-built collection. The quality and completeness of particular data items from particular data providers depends on the uses to which the data have so far been put. Data items which are used regularly for analysis, or have been the focus of particular attention, will therefore be of better quality than less frequently used items or those for which in-depth analysis has not yet taken place. Monthly figures on the use of restrictive interventions in inpatient learning disabilities and autism services were first published for January 2019; the data used to derive them fall into this latter group and may be of lower quality than other data used in this publication.

Further assessment of the quality and completeness of these experimental statistics will follow this publication. This will use the statistics presented here as the basis of discussions with service providers to understand any issues, and to guide the development of the methodologies used to create statistics on the use of restrictive interventions in these services.

The data used to derive these figures may contain duplicates. Multiple interventions with identical dates and details (intervention type and duration) for the same individual have been identified. Currently it is unknown if these values are duplicates, record errors or genuine separate incidences therefore no data has been excluded. These potential issues may lead to the number of restrictive interventions shown in this publication being unreliable. As such these figures should be used with caution.

Statistics showing the number of people subject to restrictive interventions will not be affected by duplication and so are more reliable than statistics on the number of restrictive interventions. However, statistics on the number of people subject to restrictive interventions are still subject to other potential quality and completeness limitations. These statistics should be used in light of these limitations.
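The flagging of potential duplicates, and the reason counts of people are more robust than counts of interventions, can be sketched as below. This is an illustrative sketch only, not the production method; the record fields and values are made up. Records sharing identical identifying details are flagged but not excluded, matching the approach described above.

```python
from collections import Counter

def flag_potential_duplicates(records):
    """Flag records that share the same person, date, intervention type
    and duration. Such records may be duplicates, recording errors or
    genuine separate incidents, so nothing is excluded - only flagged."""
    key = lambda r: (r["person_id"], r["date"], r["type"], r["duration_mins"])
    freq = Counter(key(r) for r in records)
    return [(r, freq[key(r)] > 1) for r in records]

# Hypothetical intervention records (field names are illustrative)
records = [
    {"person_id": 1, "date": "2021-09-03", "type": "physical", "duration_mins": 5},
    {"person_id": 1, "date": "2021-09-03", "type": "physical", "duration_mins": 5},
    {"person_id": 2, "date": "2021-09-10", "type": "seclusion", "duration_mins": 30},
]

flags = flag_potential_duplicates(records)

# The intervention count is inflated by any duplication...
interventions = len(records)
# ...but the count of distinct people is unaffected by it.
people = len({r["person_id"] for r in records})
```

Here two records are identical, so the intervention count may overstate activity, while the distinct-person count is the same whether or not those records are true duplicates.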

There has been a general increase in the quality and completeness of restraints information submitted by providers over time. Over the COVID period, there has been a notable increase in the number of restraints for some providers because more, shorter episodes of restraint per patient have been recorded separately (as per guidance). Therefore, caution should be taken when interpreting the data.

Timeliness and punctuality

Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.

The MHSDS LDA reports have been produced within three months of the end of the reporting period and five weeks of the submission deadline.

The submission deadlines for MHSDS are published here:

Accessibility and clarity

Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.

Re-use of our data is subject to conditions outlined here:

Definitions for measures included in this publication are available in the accompanying metadata file. Terminology is defined where appropriate. These definitions will be developed, and further guidance provided in future editions of this publication series when needed.

Full details of the way that MHSDS returns are processed, which will be of use to analysts and other users of these data, are provided in the MHSDS User Guidance, available on the NHS Digital website:

Coherence and comparability

Coherence is the degree to which data derived from different sources or methods, but referring to the same topic, are similar. Comparability is the degree to which data can be compared over time and across domains.

From October 2016, the MHSDS LDA inpatient data has been compared to the Assuring Transformation collection. There is a slight difference in scope between the two collections: MHSDS data come from providers based in England and cover care provided in England, which may be commissioned from outside England; Assuring Transformation data are provided by English commissioners and typically cover care provided in England, but also include care commissioned in England and provided elsewhere in the UK. These comparators will be used to reconcile and understand the difference between the inpatient counts in the two collections.

Trade-offs between output quality components

This dimension describes the extent to which different aspects of quality are balanced against each other.

Although the collection of LDA data via MHSDS commenced in January 2016, some providers continue to experience issues making a comprehensive submission within the permitted timescales. We expect a more complete and accurate picture to emerge over time. This analysis presents an early view and is subject to caveats, both in terms of the completeness of submissions, particularly for services that have only come within scope of the dataset since 1 January 2016, and the limits of the data that could be provided about pathways into services to support monitoring of waiting times.

The format of this publication has been determined to enable timely reporting of key measures while adjusting the scope of analysis to be achievable within NHS Digital resources and production time. Further work on data quality issues with providers is planned to help increase the usefulness and usability of these statistics for different users. Through this work, we hope to support discussions with and between providers and commissioners about caseload and activity, helping to narrow the differences between the two data sources so that we can move to one official source of LDA information in the long term.

Assessment of user needs and perceptions

This dimension covers the processes for finding out about users and uses and their views on the statistical products.

The purpose of the MHSDS LDA monthly reports is to provide learning disability and autism service providers, commissioners and other stakeholders with timely information about caseload and activity. This is intended to support changes in commissioning arrangements as services move from block commissioning to commissioning based on activity, caseload and outcomes for patients.

We undertook a consultation on our adult mental health statistics during 2015 and published the results in November 2015. Changes to the MHLDS Monthly Reports that were previously published from MHLDDS are described in a Methodological Change Paper. The introduction of statistics to support the monitoring of waiting times is in line with the ambitions set out in the NHS England’s Five Year Forward View for Mental Health and we will introduce further waiting time measurements in line with priorities identified with interested parties.

Regular consultation with customers and stakeholders is undertaken to ensure that developments introduced to the publication meet their requirements.

Performance, cost and respondent burden

This dimension describes the effectiveness, efficiency and economy of the statistical output.

The dataset preceding MHSDS (MHLDDS) was identified as the data source to replace others in the Fundamental Review of Returns programme, which was designed to reduce burden on the NHS. As a secondary uses dataset, it re-uses clinical and operational data from administrative sources, reducing the burden on data providers of having to submit information through other primary collections.

Confidentiality, transparency and security

This dimension covers the procedures and policies used to ensure sound confidentiality, security and transparent practices.

Submissions have been processed in line with the rules described in the Technical Output Specification for the dataset using a fully assured system that pseudonymises individual identifiers. As for all NHS Digital publications, the risk of disclosing an individual’s identity in this publication series has been assessed and the data are published in line with a Disclosure Control Method for the dataset approved by NHS Digital’s Disclosure Control Panel.
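To illustrate the general idea of pseudonymising an identifier, a keyed hash can replace an identifier with a stable pseudonym. This is a minimal sketch only and is not the assured pseudonymisation system MHSDS actually uses; the key and function name below are made up for illustration.

```python
import hashlib
import hmac

# Illustrative only: HMAC-SHA256 maps an identifier to a fixed-length
# pseudonym. The same input always yields the same pseudonym (so records
# can be linked across submissions), but the raw identifier does not
# appear in the output and cannot be recovered without the secret key.
SECRET_KEY = b"example-key-not-real"  # hypothetical key, never hard-code in practice

def pseudonymise(identifier: str) -> str:
    """Return a stable 64-character hex pseudonym for an identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()
```

A keyed hash, rather than a plain hash, means an attacker cannot simply hash a list of candidate identifiers to reverse the pseudonyms without also holding the key.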

Please see links below to relevant NHS Digital policies:

Statistical Governance Policy

Freedom of Information Process

A Guide to Confidentiality in Health and Social Care

Privacy and Data Protection


Last edited: 20 December 2021 9:33 am