This dimension covers the proximity between an estimate and the unknown true value.
Experimental statistics
The statistics in this publication are marked as experimental and may be subject to further change as we develop them. The classification of experimental statistics is in keeping with the UK Statistics Authority’s Code of Practice. Experimental statistics are new official statistics that are undergoing evaluation. They are published in order to involve users and stakeholders in their development, and as a means to build quality at an early stage. The Code of Practice states that “effective user engagement is fundamental to both trust in statistics and securing maximum public value…” and, as suppliers of information, it is important that we involve users in the evaluation of experimental statistics.
Submission validations
The MHSDS is a rich, referral-level dataset that records the packages of care received by individuals as part of referrals into treatment within NHS-funded learning disabilities and autism services. These packages of care vary widely, so each record contains different elements of the dataset. Therefore, no single approach can measure the completeness and accuracy of the data collected and reported nationally. However, NHS Digital provides a number of reports at different stages in the data flow to ensure that the submitted data reflect the services that have been provided.
At the point of submission:
- Providers receive immediate feedback on the quality of their submission, including detailed Data Summary Reports about coverage, volume, code validity and data consistency. Providers have the opportunity to re-submit data up to the deadline and to send a refresh submission one month later.
On receipt of processed data by NHS Digital:
- Where there are concerns about data quality, we contact providers directly so that any issues with local data extraction processes can be addressed for a future submission. These checks are currently limited to key elements of the dataset. Additional checks will be developed for future submissions until they offer the same level of coverage as those previously available for MHLDDS submissions. We also issue individual monthly Data Quality Notices to all providers, highlighting key data quality issues.
Data quality reporting
As part of the main MHSDS publication, national and organisation-level data quality measures are shown that validate a selection of key data items by provider. These show the number and percentage of records which have ‘valid’, ‘other’, ‘default’, ‘invalid’ and ‘missing’ values for key elements of the dataset, such as Team Type and Primary Reason for Referral. A coverage report shows the number of providers submitting data each month and the number of records by provider and by table. These elements will be expanded upon in future submissions.
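As an illustration, the sketch below shows how submitted values might be assigned to these five categories and aggregated by provider. The code lists, column names and example records are placeholders for this sketch, not the actual MHSDS code lists or field names.

```python
import pandas as pd

# Placeholder national code lists for a data item such as Team Type; these
# are illustrative values, not the actual MHSDS code lists.
VALID_CODES = {"A01", "A02", "B01"}
OTHER_CODES = {"X98"}
DEFAULT_CODES = {"Z99"}

def classify(value):
    """Assign a submitted value to one of the five data quality categories."""
    if pd.isna(value) or value == "":
        return "Missing"
    if value in VALID_CODES:
        return "Valid"
    if value in OTHER_CODES:
        return "Other"
    if value in DEFAULT_CODES:
        return "Default"
    return "Invalid"

# Illustrative submitted records; 'Provider' and 'TeamType' are assumed names.
records = pd.DataFrame({
    "Provider": ["P1", "P1", "P1", "P2", "P2"],
    "TeamType": ["A01", "Z99", None, "A02", "QQQ"],
})
records["Category"] = records["TeamType"].map(classify)

# Counts and percentages of records in each category, by provider.
counts = records.groupby(["Provider", "Category"]).size().unstack(fill_value=0)
percentages = counts.div(counts.sum(axis=1), axis=0).mul(100).round(1)
print(counts)
print(percentages)
```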
To support NHS England’s change in Care and Treatment Review (CTR) policy, from December 2017 we have included some data quality analysis of the CTR data within the LDA reference tables. We have also included a data quality tab with the counts below on the use of the three SNOMED codes that relate to CTRs (a sketch of these counts follows the list):
- The number of providers submitting the CTR SNOMED codes
- The number of times each CTR SNOMED code was used within the month
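The following sketch shows, in general terms, how these two counts could be derived from submitted records. The SNOMED code values and column names are placeholders, not the real CTR SNOMED codes or MHSDS field names.

```python
import pandas as pd

# Illustrative one-month extract of CTR SNOMED activity. The code values
# ("CTR1" etc.) and column names are assumptions for this sketch.
ctr = pd.DataFrame({
    "Provider": ["P1", "P1", "P2", "P3", "P3", "P3"],
    "SnomedCode": ["CTR1", "CTR2", "CTR1", "CTR1", "CTR3", "CTR3"],
})

# Count 1: the number of providers submitting any CTR SNOMED code this month.
providers_submitting = ctr["Provider"].nunique()

# Count 2: the number of times each CTR SNOMED code was used within the month.
uses_per_code = ctr["SnomedCode"].value_counts()

print(providers_submitting)  # 3
print(uses_per_code)
```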
Care and Treatment Reviews (CTRs) were developed to improve the care of people with learning disabilities, autism or both in England, with the aim of reducing admissions, unnecessarily lengthy hospital stays and health inequalities.
Comparing MHSDS and Assuring Transformation data
The MHSDS and Assuring Transformation (AT) comparators file has been created to compare data from the MHSDS with the AT publications. The measures in this data file include comparisons of LDA hospital inpatient counts at the start and end of the month, along with inpatient flows.
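As an illustration only, the sketch below joins hypothetical month-level summaries from the two sources and takes the difference for each measure; the figures, column names and measures are assumptions, not the actual comparator file layout.

```python
import pandas as pd

# Hypothetical month-level summaries from each source; all figures and column
# names are illustrative assumptions for this sketch.
mhsds = pd.DataFrame({"Month": ["2019-01"], "StartCount": [2100],
                      "EndCount": [2080], "Admissions": [120], "Discharges": [140]})
at = pd.DataFrame({"Month": ["2019-01"], "StartCount": [2260],
                   "EndCount": [2230], "Admissions": [115], "Discharges": [145]})

# Join the two sources on month and compare each measure.
comparison = mhsds.merge(at, on="Month", suffixes=("_MHSDS", "_AT"))
for measure in ["StartCount", "EndCount", "Admissions", "Discharges"]:
    comparison[f"{measure}_Diff"] = (
        comparison[f"{measure}_MHSDS"] - comparison[f"{measure}_AT"]
    )
print(comparison)
```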
Many of the measures in the monthly comparator data file were previously part of the MHLDS Monthly Reports. There are notes in the metadata file about which measures previously published in MHLDS Monthly Reports can be compared to measures in the monthly comparator data file, although it may take more time for these measures to support direct like-for-like comparisons.
For further details about the Assuring Transformation collection please visit:
https://digital.nhs.uk/data-and-information/data-collections-and-data-sets/data-collections/assuring-transformation/reports-from-assuring-transformation-collection
Interpreting uses of restrictive interventions in inpatient services
The MHSDS is derived from administrative systems rather than being a purpose-specific collection. The quality and completeness of particular data items from particular data providers depend on the uses to which the data have so far been put. Because of this, data items which are used regularly for analysis, or have been the focus of particular attention, will be of better quality than less frequently used items or those for which in-depth analysis has not yet taken place. Monthly figures on the use of restrictive interventions in inpatient learning disabilities and autism services were first published for January 2019, so the data used to derive them fall into this latter group and may be of lower quality than other data used in this publication.
Further assessment of the quality and completeness of these experimental statistics will follow this publication. This will use the statistics presented here as the basis of discussions with service providers to understand any issues, and to guide the development of the methodologies used to create statistics on the use of restrictive interventions in these services.
The data used to derive these figures may contain duplicates. Multiple interventions with identical dates and details (intervention type and duration) for the same individual have been identified. It is currently unknown whether these are duplicates, recording errors or genuine separate incidents; therefore, no data have been excluded. These potential issues may make the number of restrictive interventions shown in this publication unreliable, so these figures should be used with caution.
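A minimal sketch of how such potential duplicates might be flagged is shown below; the column names and example records are illustrative assumptions, not the actual MHSDS fields. It also shows why counts of distinct people are unaffected by duplication, as noted in the next paragraph.

```python
import pandas as pd

# Illustrative restrictive intervention records; column names and values are
# assumptions for this sketch, not the actual MHSDS fields.
interventions = pd.DataFrame({
    "PersonID": [1, 1, 2, 3, 3],
    "Date": ["2019-01-05", "2019-01-05", "2019-01-07", "2019-01-09", "2019-01-10"],
    "InterventionType": ["Physical", "Physical", "Chemical", "Physical", "Physical"],
    "DurationMinutes": [10, 10, 5, 15, 15],
})

key = ["PersonID", "Date", "InterventionType", "DurationMinutes"]

# Flag rows where an identical intervention appears more than once for the
# same individual; these may be duplicates, recording errors or genuine
# separate incidents, so they are flagged rather than excluded.
interventions["PotentialDuplicate"] = interventions.duplicated(subset=key, keep=False)

# The count of interventions is sensitive to any duplication...
n_interventions = len(interventions)
# ...whereas the count of distinct people subject to an intervention is not.
n_people = interventions["PersonID"].nunique()

print(interventions)
print(n_interventions, n_people)  # 5 interventions, 3 people
```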
Statistics showing the number of people subject to restrictive interventions will not be affected by duplication, and so are more reliable than statistics on the number of restrictive interventions. However, they are still subject to other potential quality and completeness limitations and should be used in light of them.
There has been a general improvement over time in the quality and completeness of restraints information submitted by providers. Over the COVID-19 period, there has been a notable increase in the number of restraints for some providers because more, shorter incidents of restraint per patient have been recorded separately (in line with guidance). Caution should therefore be taken when interpreting the data.