

Mental Health of Children and Young People in England 2022 - wave 3 follow up to the 2017 survey


Data quality statement

Introduction

This report highlights aspects of quality and methodology that users may consider when interpreting results of the Mental Health of Children and Young People (MHCYP) 2022, wave 3 follow-up to the 2017 survey. Information contained within this report should be considered alongside the Survey Design and Methods Report.


Background

This is the third in a series of follow-up reports to the Mental Health of Children and Young People (MHCYP) 2017 survey, exploring the mental health of children and young people in April/May 2022, following the coronavirus (COVID-19) pandemic, and changes since 2017. Experiences of family life, education and services are also examined.

The sample for the Mental Health Survey for Children and Young People, 2022 (MHCYP 2022), wave 3 follow-up was based on children and young people who took part in the MHCYP 2017 survey. Information about the sampling strategy adopted in 2017 can be found in the MHCYP 2017 Survey Design and Methods Report. Consent for follow-up studies was originally sought during the MHCYP 2017 fieldwork period and has been sought again at each follow-up survey.

For 7 to 10 year olds, parents or legal guardians (referred to as parents throughout this report) were able to complete an online questionnaire about their child. For 11 to 16 year olds, children were able to complete an online questionnaire, with a separate questionnaire completed by the parent. Young people aged 17 to 24 were asked to complete an online questionnaire. As in 2021, all respondents in 2022 also had the option to complete the questionnaire by telephone.

In 2017, the detailed and comprehensive Development and Well-Being Assessment (DAWBA) (Goodman et al., 2000) was used to assess a range of mental health conditions, including emotional, hyperactivity, behavioural and less common disorders, like autism. After interviews had been completed, trained clinical raters reviewed the data collected to assess each participant for a range of mental disorders. The questionnaire also covered many aspects of people’s lives that are linked to mental health, and this information can be used to profile the circumstances of children and young people with mental disorders. The Strengths and Difficulties Questionnaire (SDQ) was also included in 2017. For the 2020, 2021 and 2022 follow-up surveys, the DAWBA was not included and was replaced with the SDQ. This was mainly due to the shift to an online survey during the pandemic: a shorter, simpler questionnaire was deemed more suitable, while still producing robust estimates across the time series.

This report provides users with an evidence-based assessment of the quality of the statistical output of the Mental Health of Children and Young People in England, 2022 publication by reporting against the nine European Statistical System (ESS) quality dimensions and principles. In addition to being appropriate to this output, these dimensions and principles are also consistent with the UK Statistics Authority (UKSA) Code of Practice for Official Statistics.

This section briefly describes how each of the nine quality dimensions applies to the publication. We will continue to provide clear and comprehensive information about the methods used in our analysis and the quality of the data to assist users in interpreting our reports.

Relevance

This dimension covers the degree to which the statistical product meets user needs in both coverage and content.

From our engagement with customers, we know that there are many users of these statistics. They are used by the Department of Health and Social Care (DHSC), the Office for Health Improvement and Disparities (OHID), the Department for Education (DfE), NHS organisations, charities, academics, educators, the public and the media. Uses of the data include: informing and monitoring policy; monitoring the prevalence of health or illness and changes in health or health related behaviours in children and young people; informing the planning of services for this age group; and writing media articles. Universities, charities and the commercial sector use the data for health and social research.

User needs have been gathered and considered at all points in the collection and publication of this information. This has been guided by a steering group consisting of representatives from NHS Digital, DHSC, the Office for Health Improvement and Disparities, DfE, NHS England, the Children’s Society, the Children’s Commissioner for England, the Royal College of Paediatrics and Child Health, the Children and Young People’s Mental Health Coalition, the Royal College of Psychiatrists, a Young Service User Representative, academic leads in Child and Adolescent Mental Health, and academic leads in Contemporary Psychoanalysis and Developmental Science.

All information used in this publication is taken from a sub-sample of MHCYP respondents in the 2017 survey. The 2017 survey had a sample of 9,117 respondents and the 2022 follow-up survey had a sub-sample of 2,866 respondents. All surveys are subject to bias. Some people, for example those who live in an institution, could not have been selected to take part. Non-response means that some selected households or individuals could either not be contacted or refused to take part. Others may not have been well enough or may have lacked the cognitive capabilities to complete an online or telephone questionnaire. Social desirability biases may mean some people did not answer fully or honestly. These limitations should be acknowledged, although they are ameliorated to some extent by the use of validated measures and weights, and by an understanding of the population the data relate to and of how the data should appropriately be applied. The strengths and limitations of this information, detailed information on the survey methodology, and how the statistics in this report should be interpreted in light of this methodology can be found in more detail in the Survey Design and Methods Report for wave 2 and the wave 3 Technical Appendix.

Descriptions of concepts and terms used in this publication can be found at the start of each topic chapter or at the end in the Glossary. More detailed definitions can be found in the Survey Design and Methods Report.

The purposes of the survey, agreed following user engagement and guided by the steering group, were:

  • To collect recent data on the state of children and young people’s mental health during the pandemic, in order to compare with 2017, 2020 and 2021.
  • To estimate what proportion of children and young people in England are living with a mental disorder in 2022.
  • To produce trends in disorders through comparisons with the MHCYP 2017, 2020 and 2021 surveys on a cross-sectional basis.
  • To enable the circumstances of children and young people with different mental disorders to be compared with those of children and young people without a disorder.
  • To improve understanding of the state of children and young people’s mental health and wellbeing since the COVID-19 pandemic.
  • To inform the design of mental health services for children and young people.

We engage with people to gain a better understanding of the uses and users of these statistics and to ensure they remain relevant and informative. For this follow-up survey and reporting, this was done via the MHCYP Steering Group. An extensive public consultation was carried out for the main MHCYP 2017 survey, which received 183 responses, all of which were considered.

Feedback is very welcome at any time via [email protected].

Accuracy and reliability

This dimension covers the closeness between an estimate and the unknown true value.

As the data are based on a sample (rather than a census) of the population, the estimates are subject to sampling error. For the 2017 survey, a stratified multistage random probability sample of 18,029 children was drawn from the NHS Patient Register in October 2016. Children and young people were eligible to take part if they were aged 2 to 19, lived in England, and were registered with a GP. The sample was designed to be representative of the population of children and young people aged 2 to 19 living in England. A sub-sample was followed up for the 2022 survey via consent that was sought for such future research during the fieldwork for the MHCYP 2017 survey, and refreshed in the 2020 and 2021 surveys. The survey was introduced via a letter sent in advance of the survey going live online. The main characteristics of the sample are provided in Table A of the MHCYP 2022 data tables in Excel, alongside previous survey waves. Any differences in characteristics were accounted for in the MHCYP weighting methodology. Details of the weighting methodology can be found in the Weighting and non-response section of the Survey Design and Methods Report.

One of the effects of using the complex design and weighting is that standard errors for survey estimates are generally higher than the standard errors that would be derived from an unweighted simple random sample of the same size. The calculation of standard errors and comments on statistical significance have been included in the report, all of which take into account the clustering, stratification and weighting of the data.

Details of the sample design, response rates, survey methods, design effects, sampling errors, measurement errors and activities undertaken to understand and address sources of error, including piloting and cognitive testing of individual survey elements, are available in the Survey Design and Methods Report.

The uncertainty in how well the estimates reflect the resident population depends on the size of the sample or sub-sample achieved for each survey wave. Each prevalence estimate should be understood to sit inside a confidence interval, giving a range of likely true prevalence values for the population. Confidence intervals are included for all estimates and are available on the right-hand side of each data table in Excel. Reported estimates which may be less precise are noted within each topic chapter. See the Survey Design and Methods Report for relative standard errors (RSEs) for key estimates, which provide further detail about the likely precision of these results.
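As a general point of reference (this is the standard definition rather than anything specific to this publication), the relative standard error expresses the standard error as a percentage of the estimate itself:

\[ \mathrm{RSE} = \frac{\text{standard error}}{\text{estimate}} \times 100 \]

Larger RSEs therefore indicate less precise estimates.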

Given the small number of respondents screening positive for a probable or possible disorder from certain groups within the population (for example, young people not in education, employment or training), the results of the survey may not find significant differences between estimates for these groups if the true differences are relatively small.

Throughout the report, comparisons between estimates are commented on where there appears to be a statistically significant difference between two estimates. Where estimates show no statistically significant difference, this has been made clear in the text and worded appropriately (such as ‘no change’ or ‘findings are similar’).

Methods used to limit the likelihood of errors being introduced in the data capture, preparation and analysis stages of the production of this publication are detailed in the Survey Design and Methods Report. Further information regarding NatCen’s quality assurance policies can be found at: https://natcen.ac.uk/about-us/commitment-to-quality/

As with all NHS Digital publications, this publication has been subject to the NHS Digital Statistical Governance Policy, section 8 of which (‘Policy on Principle 4: Sound Methods and Assured Quality’) details NHS Digital’s roles and responsibilities in quality assuring the methods and results of this publication. Quality assurance of methods and production standards began at the procurement stage, with detailed quality standards and requirements being included in a detailed specification. Following the award of the contract to NatCen and the Office for National Statistics, with support from researchers from the University of Cambridge and the University of Exeter, ongoing quality assurance of methods was incorporated into regular contract management meetings and engagement with NHS Digital. Methods were also informed by the MHCYP Steering Group and drew upon a range of statistical and clinical expertise. Further details on the quality assurance of the methods used in this publication can be found in the Survey Design and Methods Report.

Prior to publication, results were subject to further quality assurance by NHS Digital, for example checking for internal consistency, spotting any errors, and ensuring that the report is clear and understandable and describes the data appropriately and objectively. This involved NHS Digital being provided with four draft versions of each topic chapter within the publication, alongside associated reference tables, and having the opportunity to comment on all aspects of the report at each stage. All aspects of the publication were reviewed and approved by the Lead Analyst for this publication prior to release.


Timeliness and punctuality

Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.

Data collection for the follow-up survey was carried out via online or telephone questionnaires in April and May 2022, with results made available within six months of data collection.


Accessibility and clarity

Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.

The report is published online and is available free of charge. Excel tables are included with associated reference data.

Approved researchers seeking to undertake secondary analysis of the MHCYP 2017 follow-up series will be able to apply for access via NHS Digital’s Data Access Request Service (DARS) and the UK Data Service; more information is available on the Population Health Data Access webpage. Users interested in accessing data should contact [email protected].

Information on how users should interpret the results within this publication can be found in the Survey Design and Methods Report, the wave 3 Technical Appendix and in the individual topic reports (for topic-specific guidance). Also, within the individual topic reports, there are references to other sources of information where users can find more information on the topics covered.

Coherence and comparability

Coherence is the degree to which data, which have been derived from different sources or methods but refer to the same topic, are similar. Comparability is the degree to which data can be compared over time and domain.

This is the third in a series of follow-up reports to the 2017 MHCYP survey. One of the key aims of the follow-up surveys is to be able to compare children and young people’s mental health between 2017, 2020, 2021 and 2022, overall and by sub-group (age and sex).

It is important to note that while the mental disorder prevalence estimates presented in this report are based on the Strengths and Difficulties Questionnaire (SDQ), the initial MHCYP 2017 survey used the Development and Well-being Assessment (DAWBA) and drew on a larger sample (9,117 children and young people, aged 2 to 19 years old) for reporting in MHCYP 2017, even though the SDQ was also administered.

It was decided that the SDQ was better suited to the shorter surveys conducted online in 2020, 2021 and 2022, while still giving robust estimates of children and young people’s mental health. When the 2022 report was produced, comparable SDQ-based mental disorder prevalence estimates were produced for 2017, 2020, 2021 and 2022.

Estimates in the MHCYP 2020 report were based on the 3,570 children and young people who took part in both the 2017 and 2020 surveys, with estimates for both 2017 and 2020 based on this group to allow like-for-like comparisons. This was seen as a strength of the survey methodology at the time. This methodology was reviewed for the 2021 report.

For the MHCYP 2021 and 2022 reports, the full 2017 sample of 9,117 children and young people is used for the 2017 estimates. The decision to use the full 2017 sample for 2017 estimates in the MHCYP 2021 report was taken after assessing alternative options, whilst considering the sample size to be used for 2021 estimates. Further details can be found in the Methodological Change Notice, with some sensitivity analysis available in the Survey Design and Methods Report. The change in method also has the added benefit that, if future follow-up surveys of the 2017 cohort are conducted, no further changes to the MHCYP 2017 estimates will be required.

Therefore, any comparisons between 2017, 2020, 2021 and 2022 must draw on the results presented in this report, which are based on a comparable measure of the SDQ using children and young people who were aged between 7 and 19 at the time of each survey.

Furthermore, direct comparisons with the originally published MHCYP 2017 estimates are not advised due to changes in survey design: the 2017 survey was conducted face to face, while the 2020, 2021 and 2022 follow-ups were completed online (or by telephone in 2021 and 2022).

Trade-offs between output quality components

This dimension describes the extent to which different aspects of quality are balanced against each other.

Within this publication, different aspects of quality have been balanced against each other in order to best meet the aims of the survey. For example, in order to best understand the prevalence of mental health disorders in the population, a survey of the resident population has been used. This allows the results of this survey to be used to examine the ‘treatment gap’. That is, the survey data can be used to explore what proportion of children and young people with a condition are not in contact with services nor in receipt of any treatment, or who are in receipt of inappropriate treatment. This information would not be available if other methods of collection were used, such as sampling from lists of patients in contact with mental health services.

As this is a follow-up to the 2017 survey, it is advised to look at the MHCYP 2017 Background Data Quality Statement for further background.

The 2022 survey was designed to represent the whole population of English children and young people. The statistics based on this survey are not the actual rates; instead they are estimates subject to a margin of error, which is presented in the form of a 95% confidence interval.

Confidence intervals are used to make inferences about the values of a variable within a population (such as the prevalence of mental disorders in children and young people). They aid interpretation of data by identifying the range within which the true population percentage (or another summary statistic) most likely lies. Typically, 95 per cent confidence intervals are calculated. These indicate that if several random samples were drawn from the population, the true percentage for a variable would lie within the calculated range in 95 per cent of the samples. Confidence intervals are influenced by the size of the sample on which the estimate is based. Larger sample sizes typically result in smaller confidence intervals and, therefore, more precise estimates.
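For illustration, under the standard construction (offered here as a general sketch rather than the exact procedure used for every published estimate), a 95 per cent confidence interval around an estimated percentage \( \hat{p} \) takes the form:

\[ \hat{p} \pm 1.96 \times \mathrm{SE}(\hat{p}) \]

where \( \mathrm{SE}(\hat{p}) \) is the standard error of the estimate after allowing for the survey design and weighting.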

The 2022 survey utilised a complex sample design. Furthermore, weights were applied when obtaining survey estimates. Using complex designs and weighting can increase standard errors and confidence intervals for survey estimates compared with those that would be derived from an unweighted simple random sample of the same size. Standard errors have been calculated taking the sample design complexity and weighting into account.

The design factor (Deft) estimates the effects of design complexity on the precision of estimates. Specifically, it represents the ratio of the standard error under a complex design to the standard error that would have resulted from a simple random sample. For example, a design factor of 3 indicates that standard errors are three times as large as they would have been in the case of a simple random sample.
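Expressed as a formula (a general definition rather than a value calculated for this survey):

\[ \mathrm{Deft} = \frac{\mathrm{SE}_{\text{complex design}}}{\mathrm{SE}_{\text{simple random sample}}} \]

Equivalently, a design with design factor Deft yields roughly the precision of a simple random sample of size \( n / \mathrm{Deft}^2 \), where \( n \) is the achieved sample size.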

Confidence intervals have been presented for all key estimates in the data tables published on the NHS Digital website. The calculations were carried out using the statistical software R, with an additional procedure written to account for the effects of weighting and the complex sample design.
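As an indication of how design-adjusted estimates and confidence intervals can be produced in R, the sketch below uses the survey package on a small made-up dataset. All object and variable names (mhcyp, psu, stratum, wt, probable_disorder) are hypothetical; this is a minimal illustration of the general approach, not the procedure used to produce the published tables.

# Illustrative sketch only: design-adjusted prevalence estimate and 95% CI in R.
# The data frame and variable names below are hypothetical.
library(survey)

# A small made-up data frame standing in for the survey data
mhcyp <- data.frame(
  psu = rep(1:10, each = 20),               # primary sampling unit (cluster)
  stratum = rep(1:2, each = 100),           # stratum identifier (5 PSUs per stratum)
  wt = runif(200, 0.5, 2),                  # survey weight
  probable_disorder = rbinom(200, 1, 0.18)  # 0/1 indicator of a probable disorder
)

# Declare the complex design: clusters, strata and weights
des <- svydesign(ids = ~psu, strata = ~stratum, weights = ~wt,
                 data = mhcyp, nest = TRUE)

est <- svymean(~probable_disorder, design = des)  # weighted prevalence and its standard error
confint(est, level = 0.95)                        # design-adjusted 95% confidence interval

The design-adjusted standard error from such a procedure can then be compared with the standard error from a simple random sample of the same size to obtain the design factor described above.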

The strengths and limitations of the results of this survey are detailed in full in the Survey Design and Methods Report and the wave 3 Technical Appendix.

Assessment of user needs and perceptions

This dimension covers the processes for finding out about users and uses and their views on the statistical products.

From our engagement with customers, we know that there are many users and uses of these statistics. Details of these uses and users have been included in the ‘Relevance’ section of this Data Quality Statement, along with details of previous consultations on this survey series. Regular consultation with customers and stakeholders is undertaken before each follow up survey to ensure that developments introduced to this publication and other NHS Digital publications meet their requirements.

Performance, cost and respondent burden

This dimension describes the effectiveness, efficiency and economy of the statistical output.

The MHCYP survey was conducted with 5 to 15 year olds living in Britain in 1999 and 5 to 16 year olds living in Britain in 2004. The 1999 and 2004 surveys sampled from Child Benefit records. For the 2017 survey, a stratified multistage random probability sample of 18,029 children was drawn from the NHS Patient Register in October 2016. Children and young people were eligible to take part if they were aged 2 to 19, lived in England, and were registered with a GP. Children, young people and their parents were interviewed face-to-face at home using a combination of Computer Assisted Personal Interview (CAPI) and Computer Assisted Self Interview (CASI), between January and October 2017. A short paper or online questionnaire was completed by a nominated teacher for children aged 5 to 16 years old. Data collection varied with the selected child’s age. For the MHCYP 2022 follow-up survey, data collection was conducted via online questionnaires, with a telephone option (based on consent for future research originally obtained during MHCYP 2017 fieldwork and refreshed during 2020 and 2021), and questionnaires were completed as follows:

Online

  • 7 to 10 year olds: parent online questionnaire (17.36 minutes)
  • 11 to 16 year olds: child online questionnaire (11.20 minutes) and parent online questionnaire (19.58 minutes)
  • 17 to 19 year olds: young person online questionnaire (18.96 minutes)
  • 20 to 24 year olds: young person online questionnaire (17.58 minutes)

Telephone

  • 7 to 10 year olds: parent telephone questionnaire (37.08 minutes)
  • 11 to 16 year olds: child telephone questionnaire (23.42 minutes) and parent telephone questionnaire (39.62 minutes)
  • 17 to 19 year olds: young person telephone questionnaire (38.66 minutes)
  • 20 to 24 year olds: young person telephone questionnaire (39.99 minutes)

More details of the survey methodology and associated burden can be found in the Survey Design and Methods Report.

Confidentiality, transparency and security

This dimension covers the procedures and policies used to ensure sound confidentiality, security and transparent practices.

No personal or individual level information is contained in the report. Information is presented at a high level of aggregation. As for all NHS Digital publications, the risk of disclosing an individual’s identity in this publication series has been assessed, and the data are published in line with a Disclosure Control Method for the dataset.

Please see links below to relevant NHS Digital policies:

Statistical Governance Policy

https://digital.nhs.uk/data-and-information/find-data-and-publications/statement-of-administrative-sources/a-z-of-nhs-digital-official-and-national-statistics-publications#user-documents

Freedom of Information Process

https://digital.nhs.uk/about-nhs-digital/contact-us/freedom-of-information

A Guide to Confidentiality in Health and Social Care

https://digital.nhs.uk/about-nhs-digital/our-work/keeping-patient-data-safe

Privacy and Data Protection

https://digital.nhs.uk/about-nhs-digital/privacy-and-cookies


