Methodology

Childcare and early years provider survey

1. Introduction

The Survey of Childcare and Early Years Providers (SCEYP) is important for monitoring and evaluating childcare and early years provision. It covers childminders, school-based childcare providers (both maintained nursery schools and other schools offering nursery classes) and group-based childcare providers (other childcare providers operating on non-domestic premises, such as playgroups and day nurseries). It helps the Government, including the Department for Education (DfE), understand the issues that providers face, informing development of early years and childcare policy. 

The survey has been running since 1998, with the latest 2024 iteration carried out by IFF Research and London Economics. There are two versions:  

  • The main SCEYP: a large mixed-mode survey, which can be completed either online or over the telephone. Topics covered include the number of children registered at different settings, staff-to-child ratios, Government funding for childcare including funded entitlements and Tax-free Childcare (TFC), children with Special Educational Needs and Disabilities (SEND), staff qualifications and pay, and the financial side of providing childcare such as costs, incomes and fees. 
  • The short SCEYP: delivered online or via postal entry, this covers a set of core questions from the main SCEYP, asking about fees charged by providers to parents, funding received by providers from their local authorities, and the number of children registered at providers. The purpose of this short survey is to allow fee and funding estimates to be calculated at a local authority (LA) level by aggregating data collected on the short and main versions of the survey. 

1.1 The SCEYP timeseries 

The SCEYP has undergone some changes in scope and design over time, which mean comparisons between different survey years should be approached with caution: 

  • The questionnaire was fully redesigned in 2016 to reflect changes in the childcare market. Data from the 2016 survey were intended to provide a new and enhanced baseline. Changes made to the design of the survey that year mean that 2016 data are not directly comparable with those from previous years of the survey. There were some further changes to the 2018 survey to improve data quality and to reflect new policy priorities, but otherwise it was designed to be comparable with the 2016 survey. 
  • From 2019 – and following a user consultation on the frequency, sampling approach, mode and topic coverage of the SCEYP[1] – DfE decided to intensify the early years focus of the survey by only sampling and recruiting providers delivering childcare to pre-school children aged 0-4 years. Previous surveys had covered reception provision as well as wraparound and holiday care for school-aged children aged up to seven. This means that figures published before and after 2019 are not comparable for “all school-based providers” and “all providers”. Adjustments have been made, however, to figures collected on surveys prior to 2019 to ensure that where these are shown in reports published since 2019 they are comparable with the most recent data. 
  • In more recent years, following consultation with DfE and through cognitive testing with providers, further refinements have been made to some of the survey questions. Such changes have been made with a view to improving respondent experience as well as collecting more reliable data. In 2023, for instance, the cost and income element of the survey was re-designed to make answering these questions easier. In 2024, further clarification and validation checks were added, in particular to questions about the number of booked places at providers, number of paid staff employed, and staff salaries and contracted hours of work.

The SCEYP sample includes some providers that are registered as early years providers but who told us on the survey that they do not currently look after any pre-school children aged 0 to 4. These providers are excluded from all analysis reported in the DfE publications because the focus of the survey is on childcare for pre-school children.  

[1] Department for Education (2018) Surveys on childcare and early years in England: Government consultation response. London: DfE 

1.2 Temporary closed settings

Fieldwork for the 2024 survey took place between May and July 2024. To try to obtain as accurate a picture as possible of available childcare provision in 2024, providers were invited to complete the survey if they had been open and offering relevant childcare in the last 30 days, even if their setting was temporarily closed at the time of interview. Settings that were open were asked to think about their current situation, whilst settings temporarily closed at the time of interview were asked to answer with reference to the last term-time day on which they were open.

Of the 12,166 providers who answered the survey, 21 had been closed for less than 30 days at the time of interview and were retained, while 181 had been closed for more than 30 days and were screened out of the survey.

2. Sampling

As in previous waves of the Survey of Childcare and Early Years Providers (SCEYP), the study comprised a representative sample of three distinct provider populations in England, each of which was analysed separately:

  • Group-based providers (GBP): childcare providers registered with Ofsted and operating on non-domestic premises;
  • School-based providers (SBP): maintained nursery schools and other schools offering nursery classes;
  • Childminders (CM): Childminders on Ofsted’s Early Years Register of childcare providers.

A target number of responses for each of the provider types was set for the main SCEYP. As in previous years, this was around 2,500 responses for SBPs, 6,200 for GBPs and 2,000 for CMs. 

The number of providers invited to take part in the main SCEYP was then decided using historic response rate data to estimate likely response rates for 2024. The response rates assumed for the 2024 survey were slightly lower than those achieved in 2023 to reflect a general trend of declining response rates.
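The arithmetic behind converting a response target and an assumed response rate into an issued sample size is straightforward. This minimal sketch uses figures from the childminder sample (Section 2.3); the function name and rounding rule are our own, as the report does not state an exact rounding convention:

```python
def issued_sample_size(target_responses: int, assumed_response_rate: float) -> int:
    """Number of providers to invite so that, at the assumed response
    rate, the expected number of completed surveys meets the target."""
    return round(target_responses / assumed_response_rate)

# Illustrative: a target of ~2,000 childminder responses at an assumed
# 24% response rate implies roughly 8,333 invitations (see Section 2.3).
print(issued_sample_size(2000, 0.24))  # 8333
```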

All remaining providers were sent the short SCEYP. This represented a slight change in the process for CMs, with the 2024 survey the first year in which all eligible CM records were invited to take part. This was due to the steady decline in CM numbers seen with each survey year.[1] In previous years the CMs invited to take part in the short SCEYP represented a subset of all available CMs. This was because the number of eligible CMs was greater than the number of invitations required to meet the target number of responses. 

There was also a slight change in the sampling criteria this year, related to Ofsted registers. GBPs and CMs can appear on one or more of the following Ofsted registers: 

  • Early Years Register (EYR): a register of all providers looking after children aged from birth to 31 August after their fifth birthday. 
  • Childcare Register:
    • Compulsory Childcare Register (CCR): providers caring for children from 1 September after the child’s fifth birthday up until their eighth birthday.
    • Voluntary Childcare Register (VCR): providers looking after children aged 8 and over, or those choosing to register voluntarily. 

Historically, GBPs and CMs were sampled where they appeared on either the EYR or the CCR. Ahead of this year’s survey, it was agreed to rescope the population so that only those on the Early Years Register were sampled. This reduced the number of providers approached for the survey who, in practice, did not look after any children aged 0-4.

In the main Official Statistics publication for SCEYP 2024, figures were reported for five sub-groups:

  • Private GBPs;
  • Voluntary GBPs;
  • Nursery class childcare settings;
  • Maintained nursery schools (MNS); and
  • Childminders.

This chapter describes the sampling approaches adopted for both the main SCEYP and the short SCEYP surveys. 

[1] The CM population comprised 33,800 records in 2022 and 29,990 in 2023, for example, compared with 25,580 in 2024. 

2.1 Group-based providers sample

GBPs are childcare providers registered with Ofsted and operating on non-domestic premises.[1] The survey collected further details about the ownership and type of childcare offered.

A request was made to Ofsted to provide information for all records in the “childcare on non-domestic premises” category that appeared on the EYR in their July 2023 snapshot database. The sample was based on records from July 2023 to be consistent with the timing of sampling for previous years’ surveys. The data were provided in two stages: stage 1 provided the variables necessary for drawing the sample, and stage 2 contained provider contact details (address, telephone numbers and e-mail address). At stage 1, 22,975 records were provided.

Prior to distributing cases between the main and short surveys, records were checked for duplicates, settings that were known to have closed, or settings identified as only offering holiday care or leisure activities. In total, 1,080 records were removed from the sample.

The remaining 21,895 records were used to draw a sample for the main SCEYP. As in previous years, a stratified random sampling approach was used. Records were selected in proportion to the regional distribution of records and stratified by local authority (LA), register type, ownership status, deprivation band based on the Income Deprivation Affecting Children Index (IDACI), and postcode. As in previous years, GBPs in the North East were oversampled to achieve sufficient responses for this region to be analysed in conjunction with others. A total of 20,059 records were selected for the main SCEYP, based on the previous year’s response rate and a target of 6,200 responses. 

The selected sample was then allocated systematically (using the same stratification variables) to one of three questionnaire variants (16/42/42% split[2]) and, within each variant, to a specific day of the week about which to answer certain questions. This was done to get precise answers that also allowed for analysis of trends in attendance and capacity that vary by weekday. 
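The systematic allocation step can be illustrated with a small sketch. The deficit-based assignment rule below is our own stand-in for IFF's unpublished procedure; the point it shows is that, because records are pre-sorted by the stratification variables, assigning variants proportionally as we walk the sorted list spreads each variant evenly across the strata:

```python
from collections import Counter

def allocate_variants(records, split):
    """Systematic proportional allocation (illustrative sketch).
    `records` are assumed to be pre-sorted by the stratification
    variables (LA, register type, ownership status, IDACI band,
    postcode); each record goes to the variant currently furthest
    below its target share."""
    counts = Counter({v: 0 for v in split})
    allocation = []
    for i, rec in enumerate(records, start=1):
        # Largest-deficit rule: compare target (i * share) with actual count.
        variant = max(split, key=lambda v: i * split[v] - counts[v])
        counts[variant] += 1
        allocation.append((rec, variant))
    return allocation

# The GBP main sample used a 16/42/42% split across three variants:
sample = [f"provider_{k}" for k in range(100)]
alloc = allocate_variants(sample, {"V1": 0.16, "V2": 0.42, "V3": 0.42})
```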

All remaining GBPs not selected for the main SCEYP were sent the short SCEYP (1,858).

Table 2.1 Number of group-based providers issued to each survey element

Survey element                        N
Main SCEYP                            20,046[3]
Short SCEYP                           1,849
Not issued to mainstage fieldwork     1,080
Total                                 22,975


[1] This does not include provision in schools.

[2] Those assigned to variants 1 and 3 were shown questions related to child ratios, Special Educational Needs and Disability and provider finances. By contrast, variant 2 focused on staffing. 

[3] 100 records were assigned to ‘pilot/cognitive testing’ status. They were re-sampled for the main SCEYP.

2.2 School-based providers sample

The sample of SBPs consisted of maintained and independent primary schools with nursery and reception provision, including maintained nursery schools (MNS). Although schools offering reception provision alongside nursery provision were eligible for the study, they were only asked about their nursery provision. This sample was drawn from the Schools Census collected in November 2023 from all schools in England. The sample was complemented with further information from Get Information About Schools (GIAS)[1]. The types of schools eligible for the study were identified using the following criteria:

  • Independent schools with reception and nursery provision: open independent schools with statutory lowest age equal to 1, 2 or 3 years (according to School Census) and offering nursery classes (according to GIAS);
  • Maintained nursery schools: open state-funded nursery schools (according to School Census);
  • Maintained schools with reception and nursery provision: open state-funded mainstream and special schools which offer nursery classes (according to GIAS) or which have at least one pupil in nursery first or second year classes.

A total of 10,265 eligible school-based providers were identified. After the sampling frame was checked for duplicate records and known or planned closures, 9,980 records were included in the sample.

To decide the required issued sample sizes, IFF took the response rates achieved in the SCEYP 2023 main sample as a starting point and, as noted earlier, set the assumptions slightly lower to reflect the general trend of declining response rates: 

Table 2.2 Response rate assumptions for school-based providers by type

School type                                                 Assumed response rate (main SCEYP)
Independent schools with reception and nursery provision    27%
Maintained nursery schools                                  37%
Maintained schools with reception and nursery provision     30%

A stratified random sampling approach was used to select the samples of school-based providers. Table 2.3 presents the number of cases sampled for the main SCEYP and the short SCEYP. As in previous years, it was necessary to include all maintained nursery schools in the main SCEYP sample to ensure sufficient interviews were achieved for regional analysis. The samples were stratified by school type (disproportionate sampling), region and IDACI score to ensure geographical representativeness of the sample.
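The take-all treatment of maintained nursery schools alongside a random draw for the other school types can be sketched as follows. Field names, the seed and the simplified (type-only) stratification are illustrative assumptions; the real design also stratified by region and IDACI score:

```python
import random

def sample_sbp(frame, issued_sizes, seed=2024):
    """Sketch of the SBP draw: maintained nursery schools (MNS) form a
    take-all stratum, while other school types are drawn at random
    within type. Illustrative only, not IFF's actual procedure."""
    rng = random.Random(seed)
    sample = [s for s in frame if s["type"] == "MNS"]  # take-all stratum
    for school_type, n in issued_sizes.items():
        stratum = [s for s in frame if s["type"] == school_type]
        sample.extend(rng.sample(stratum, n))  # random draw within stratum
    return sample

# Toy frame: every MNS is kept; only 2 of the 10 independents are drawn.
frame = [{"id": k, "type": "MNS"} for k in range(5)] + \
        [{"id": 100 + k, "type": "Independent"} for k in range(10)]
drawn = sample_sbp(frame, {"Independent": 2})
```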

After stratification, 8,100 schools were selected for the main SCEYP. The samples were then allocated systematically (using the same stratification variables) to one of two questionnaire variants (50/50% split) and, within each variant, to a specific day of the week about which to answer questions. This was done to get precise answers that also allowed for analysis of trends in attendance and capacity that vary by weekday.

The remaining maintained and independent schools not selected for the main SCEYP were sent the short SCEYP (1,880).

Table 2.3. Number of school-based providers issued to each survey element

Survey element    Issued (n)
Main SCEYP        8,100
Short SCEYP       1,880
Total             9,980

[1] Get Information About Schools (GIAS) is a register of educational establishments in England and Wales. It is maintained by the Department for Education.

2.3 Childminders sample

The sample of childminders, those offering childcare on domestic premises, was drawn from the July 2023 monthly snapshot of the Ofsted register of childminders.[1] Only those who confirmed they were still practising as childminders at the time of interview were eligible to be surveyed.

A request was made to Ofsted to provide information for all records of active childminders on the EYR from their July 2023 monthly snapshot database. The data were provided in two stages: stage 1 provided the variables necessary for drawing the sample, and stage 2 contained the provider name, address and contact details (telephone numbers, e-mail address) for sampled childminders. After cleaning to remove childminders known to have closed, the stage 1 database included 25,580 active childminders. This database was used to select samples for all survey components.

A stratified random sampling approach was used to select both samples. The sample was stratified by LA, time elapsed since registration and level of deprivation,[2] and 8,333 childminders were then drawn for the main SCEYP sample (based on a target response rate of 24%). The selected sample was then allocated to one of two questionnaire variants (50/50% split) and, within each variant, to a specific day of the week about which to answer questions. This was done to get precise answers that also allowed for analysis of trends in attendance and capacity that vary by weekday. 

The remaining childminders were sampled for the short SCEYP (17,247 records in total). 

Table 2.4 Number of childminders issued to each survey element

Survey element    Issued (n)
Main SCEYP        8,333
Short SCEYP       17,247
Total             25,580

[1] Childminders must register with Ofsted or a childminder agency. Only those registered with Ofsted were included in the SCEYP sample frame.

[2] Level of deprivation was determined using the 2019 Income Deprivation Affecting Children Index.

3. Questionnaire

3.1 Main SCEYP: Questionnaire development 

Much of the content of the 2024 questionnaire mirrored that of the 2023 survey to maintain the timeseries. However, some new questions were introduced this year to cover salient topics within the sector, and some of the questions asked in 2023 were removed to accommodate these. Moreover, some changes were introduced, particularly to the staff salary questions, to improve the overall quality of the data collection as well as the respondent experience. 

3.1.1 Cognitive testing 

Cognitive testing is an interviewing technique employed to understand the way in which respondents react to, respond to and understand survey questions. As in previous years, IFF Research cognitively tested potential changes to the questionnaire ahead of the launch of the 2024 survey. This was to determine the comprehension, relevance and appropriateness of new or amended questions. The full questionnaire was not cognitively tested, as most of it had not changed since 2023. 

A total of 15 cognitive interviews were conducted via teleconferencing between 14th and 29th March 2024. Respondents included headteachers and early years managers from 5 SBPs and 5 GBPs, as well as 5 childminders. Each interview lasted approximately one hour. Respondents were asked whether they experienced any difficulties with certain questions (and why), how they arrived at each of their answers, and what the specific wording had meant to them. Specific areas tested included:   

  • Clarification text added to questions around attendance and capacity.
  • Relative ease of answering questions about government funded hours.
  • Interpretations of a more detailed definition of SEND and the impact adding such detail might have.
  • A definition of Education, Health and Care (EHC) plans.
  • Relative ease of answering questions about staff joining and leaving.
  • A new approach to asking about staff contracted hours and salary.
  • How easy it was for settings to answer questions about hourly fees and additional charges.
  • The income and cost questions and how easy it would be for SBPs to provide costs specifically for their nursery provision.

A handful of amendments were agreed with DfE following the cognitive testing report. The majority of these were small adjustments, including the addition of clarification text in places. For example, clarification text was updated for questions about the time childcare provision started on the reference day, to specify that this meant the earliest time children were able to attend the first session under this childcare provision. For SBPs, this question asked specifically about their nursery provision, to ensure respondents were not answering with their wraparound provision in mind.

More substantial changes made to the questionnaire in 2024 included:

  • Removing questions about school-aged children and wraparound childcare to reflect the early years focus of the survey on children aged 0-4 years.
  • Addition of a question about the ages of the children booked to attend the setting’s full-day care on a particular day.
  • Amendment to questions about whether children receive various government-funded hours, to include an indication of the number that received each.
  • Amendment to the questions about Education, Health and Care plans to include a definition. 
  • Addition of questions asking whether reported numbers of ‘total staff’ include agency and other temporary staff and how many agency and other temporary staff are involved in the delivery of provision. This change was made to ensure agency staff were not being included in ‘total paid staff’ figures. 
  • Addition of questions about where new staff recruits had joined from and their highest qualification level.
  • Amendments to the staff contracted hours and salary questions to ask about more staff members with lower levels of qualifications.
  • Amendments to the staff contracted hours and salary questions to ask about weeks worked per year, to assist with the calculation of hourly pay. Specifically, this included a new question asking how many weeks per year staff members were contracted to work. This was then used to derive a weekly salary for those who preferred to answer the salary questions in terms of an annual salary, rather than simply dividing the annual figure by 52.
  • Deletion of questions relating to online payments for Tax-Free Childcare, the childcare element of Universal Credit and childminders’ Self-Employment Income Support Scheme (SEISS) payments.
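The weekly-salary derivation described in the bullets above can be expressed directly; the figures in the example are hypothetical:

```python
def weekly_salary(annual_salary: float, weeks_contracted: float) -> float:
    """Weekly salary derived from the number of weeks per year the staff
    member is contracted to work, rather than a flat 52-week divisor."""
    return annual_salary / weeks_contracted

def hourly_pay(weekly: float, contracted_hours_per_week: float) -> float:
    return weekly / contracted_hours_per_week

# Hypothetical term-time-only contract: £19,000 a year over 38 contracted
# weeks is £500 per week; dividing by 52 would understate it as ~£365.38.
print(weekly_salary(19_000, 38))                   # 500.0
print(hourly_pay(weekly_salary(19_000, 38), 40))   # 12.5
```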

3.2 Main SCEYP: Questionnaire sections 

Table 3.1 sets out the questionnaire structure used in the main SCEYP 2024. As in previous years, the questionnaire was split into variants to reduce the time burden of completing it, while still ensuring a large enough response to enable detailed analysis. 

The core questionnaire for each provider type asked a series of contextual questions as well as covering capacity and attendance. 

GBPs answered the core questionnaire and one of three provider-specific variants, while SBPs and childminders answered the core questionnaire and one of two provider-specific variants. A full breakdown of which questionnaire sections were asked of which variants is included in Table 3.1.  

Table 3.1. Main SCEYP questionnaire sections

Questionnaire sections                                    SBPs               GBPs               Childminders
Provider Background                                       Core               Core               Core
Attendance and Capacity                                   Core               Core               Core
Funded hours                                              Core               Core               Core
Staff-to-child ratios                                     Variant 2          Variant 1+3        n/a
Children with SEND                                        Core               Variant 1+3        Core
Access to SENCO (Special Educational Needs Coordinator)   Variant 1          Variant 1          Variant 1
Provider level staff questions*                           Core / Variant 1   Core / Variant 2   Core
Looped staff questions                                    Variant 1          Variant 2          Core
Fees and Free Entitlements                                Core               Variant 1+3        Variant 2
TFC and Universal Credit                                  Core               Core               Core
Costs of childcare                                        Core               Core               Variant 2
Income from childcare                                     Core               Core               Variant 2

 *The majority of the questions in these sections were core and asked of all providers, although some follow-up questions were only allocated to a specific variant. 

4. Survey fieldwork

The main SCEYP was conducted as a mixed-mode survey, which could be completed either online (via Computer Assisted Web Interviewing, also known as “CAWI”) or over the telephone (Computer Assisted Telephone Interviewing, also known as “CATI”). Both surveys were conducted by IFF Research. Data for the mixed-mode short SCEYP were collected online and via a paper questionnaire. 

4.1 Main SCEYP: Interviewer training and project briefings 

Before starting work on the survey, all interviewers were required to attend a project briefing on the survey. Briefings were carried out using video teleconferencing and covered: 

  • Purpose of the study and how the data are used. 
  • Questionnaire content.
  • Study procedures, including initial contact with respondents and reminder communications.
  • Motivating respondents to take part.
  • Practice interviews. 

4.2 Main SCEYP: Fieldwork dates 

The main SCEYP fieldwork took place between 8th May and 19th July 2024. A “web-first” design was followed, with the online survey launched first and a communication strategy designed to maximise the number of online completions (see Table 4.1 for communication dates). Telephone fieldwork ran from 20th May to 5th July 2024. 

4.3 Main SCEYP: Making contact 

As this survey is voluntary, potential respondents were sent a range of survey materials to motivate participation, increase the completeness of the data collected and reduce participant burden by helping respondents to prepare in advance. Building on previous years, IFF Research worked with the Department for Education (DfE) to make these materials as accessible, appealing and persuasive as possible. Key features of the mailings intended to encourage response included: 

  • Content of materials. Emphasis on the benefits of survey participation was increased by stressing what the results will be used for. Materials were also simplified this year, and their length reduced, to encourage engagement. 
  • Using different motivational statements across the different contact attempts. Research evidence[1],[2] shows that varying the messaging across different communications increases response rates by attracting participants with different motivations to take part. Initial communications focussed on the most popular motive for survey participation, altruism (by emphasising the importance of the research and making ‘your voice heard by Government’). The later reminder letter highlighted other potential motivations, such as self-interest, i.e. ‘taking part will help you, your staff and the families and children you support’.  
  • Design of mailings. Efforts were made to ensure that all mailings looked professional and legitimate and would not be mistaken for junk mail or commercial marketing. The layout was engaging and concise, clearly spelling out what was being asked of participants and why. 
  • ‘Nudges’ for participants without email addresses. A postcard nudge was sent out a couple of weeks after the invitation letter. This was to remind those who may not have acted immediately upon receipt of the invitation letter. 

The full range of communications sent to potential respondents was as follows: 

Ministerial letter (from the Parliamentary Under-Secretary of State for Children, Families and Wellbeing) 

To encourage school-based providers to take part, a letter was sent from the Parliamentary Under-Secretary of State for Children, Families and Wellbeing to this participant group at the outset of fieldwork. This letter was sent alongside the survey invitation and highlighted the importance of the survey in providing robust and up to date evidence on the childcare and early years sector. 

Invitation letter 

All providers were sent an invitation letter or email inviting them to take part. These invitations gave full details about the study, including information security. The letters also included details on how to take part online and informed providers that they would hear from an interviewer via telephone if they did not complete the survey online. Both letters and emails included either a physical preparation sheet or a link to one online to help providers prepare to take part. Providers were given an email address and a contact number that they could use to opt-out of the survey and receive no future contact. See Appendix A for the survey invitation letter sent to each provider type. 

Preparation sheet 

The preparation sheet helped providers prepare for the telephone interview by listing information that they might need to look up in advance, such as financial figures that they were unlikely to know by heart or the qualifications of staff members. These preparation sheets were tailored according to the provider type and the variant of the survey that they would be asked to complete. Providers were encouraged to complete the preparation sheet before their interview. 

Qualifications list 

A list of qualifications was enclosed with the invitation letter and email to help providers identify ‘full and relevant’ qualifications (i.e., all those accepted by the DfE for childcare providers to work in an early years setting) and assist them in answering questions relating to the levels of early years or teaching qualifications their staff members held.[3] Telephone interviewers were able to resend the invitation email with a link to the preparation sheet and qualifications list to providers who wanted to prepare before completing the survey. 

Reminder emails and texts 

IFF Research sent multiple reminders across modes to ensure the response rate to the survey was as high as possible. Up to three reminder texts, three reminder emails and a reminder letter were sent to all non-responding providers for whom IFF Research had the relevant contact details. Each reminder stressed a different potential motivation for taking part. A final text reminder was sent only to those who had begun the questionnaire but not completed it. The email reminders contained a link to an online version of the preparation sheet. 

Table 4.1. Main SCEYP fieldwork and communication dates

Communication                         Sent out
Ministerial letter (for SBPs only)    8th May 2024
Invitation letter                     8th May 2024
Email invitation                      9th May 2024
Reminder letter                       28th May 2024
Reminder email and SMS 1              4th June 2024
Reminder email and SMS 2              13th June 2024
Reminder email and SMS 3              10th July 2024


 

[1] Dillman, D. A., Smyth, J. D. & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition. Wiley. 

[2] Nicolaas, G., Smith, P., Pickering, K. & Branson, C. (2015). Increasing response rates in postal surveys while controlling costs: an experimental investigation. Social Research Practice, Issue 1, Winter 2015. http://the-sra.org.uk/wp-content/uploads/social-research-practice-journal-issue-01-winter-2015.pdf

[3] https://assets.publishing.service.gov.uk/media/65844707ed3c34000d3bfd40/Early_years_qualification_requirements_and_standards_-_Jan_24.pdf

4.4 Short SCEYP: Fieldwork dates 

The short SCEYP fieldwork ran from 9th May until 12th July 2024.[1] As with the main SCEYP, a “web-first” design was followed, with the online survey launched first and a communication strategy designed to maximise the number of online completions (see Table 4.2 for communication dates). Paper questionnaires were sent out with reminder letters over a month into fieldwork and were supplemented with email reminders.

[1] This is the date when the invitation letters were first mailed out.

4.5 Short SCEYP: Making contact 

The communication strategy for the short SCEYP mirrored that for the main SCEYP (see Section 4.3). Potential respondents were sent a range of survey materials to motivate participation, increase the completeness of the data collected and reduce participant burden by helping respondents to prepare in advance. Building on previous years, IFF Research worked with DfE to make these materials as accessible, appealing and persuasive as possible. 

The full range of communications sent to potential respondents are outlined below.

Ministerial letter (from the Parliamentary Under-Secretary of State for Children and Families) 

In order to encourage school-based providers to take part, a letter was sent from the Parliamentary Under-Secretary of State for Children and Families, to this participant group around a month into fieldwork. This letter highlighted the importance of the survey in providing robust and up to date evidence on the childcare and early years sector, and encouraged schools to take part. 

Invitation letter and email 

All providers were sent an invitation letter or email inviting them to take part. These invitations gave full details about the study, including information security, and details on how to take part online. Providers were given an email address and a contact number that they could use to opt out of the survey and receive no future contact. See Appendix B for the survey invitation letter sent to each provider type. 

Reminder emails 

Up to three reminder emails were sent to all non-responding providers for whom IFF had the relevant contact details. Each reminder stressed a different potential motivation for taking part. 

Reminder letter and paper questionnaire 

A reminder letter was sent to all providers over a month into fieldwork. A paper copy of the questionnaire, along with a return envelope, was also included for those providers who preferred to complete the survey on paper rather than online. 

Table 4.2. Short SCEYP fieldwork and communication dates

| Communication | Sent out |
| --- | --- |
| Invitation letter | 8th May 2024 |
| Ministerial letter (for school-based providers) | 8th May 2024 |
| Invitation email | 9th May 2024 |
| Copy of paper questionnaire and reminder letter | 28th May 2024 |
| Reminder email 1 | 4th June 2024 |
| Reminder email 2 | 13th June 2024 |
| Reminder email 3 | 10th July 2024 |

5. Validation process

A data validation stage first introduced for the 2022 survey continued in 2023 and 2024. This involved identifying improbable responses using validation checks agreed in advance of fieldwork, then re-contacting flagged providers by telephone to confirm the correct response. 

The aim of this exercise was to reduce the number of records that needed to be edited after survey closure. IFF programmed a series of ‘validation flags’ on key survey variables that signalled where findings were outside of expected limits or seemed implausible. IFF then recontacted providers to flag the issue and confirm correct responses. A summary of the checks undertaken, and the number of records confirmed/edited as a result, is outlined in Table 5.1 below. 

Table 5.1. Validation checks completed for the 2024 survey

| Validation check | No. applied |
| --- | --- |
| Number of 2-year-olds receiving Government-funded hours under the 15 hours Free Entitlement is higher than the total number of registered 2-year-olds | 382 |
| Number of 3–4-year-olds receiving Government-funded hours under the 30 hours Free Entitlement is higher than the total number of registered 3–4-year-olds | 284 |
| Number of 3–4-year-olds receiving Government-funded hours under the 15 hours Free Entitlement is higher than the total number of registered 3–4-year-olds | 179 |
| Individual costs of delivering provision do not sum to the total cost of delivering provision, with a difference of 10% or greater | 3 |
| Individual sources of income do not sum to total income, with a difference of 10% or greater | 89 |
| The overall cost of delivering provision is given, but a response of “don’t know” is collected for at least one of the individual costs | 39 |
| Overall income is given, but a response of “don’t know” is collected for at least one of the individual income sources | 84 |
| TOTAL | 1,060 |
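The checks in Table 5.1 are simple logical comparisons between related survey variables. A minimal sketch of how such flags might be raised is shown below; the field and flag names are illustrative only, not those used in the actual survey data:

```python
def validation_flags(record):
    """Return a list of flag names for implausible combinations of answers.

    `record` is a dict of survey responses; None represents "don't know".
    Field names are illustrative, not the real survey variable names.
    """
    flags = []
    # Funded hours cannot be claimed for more children than are registered.
    if record["funded_2yo_15h"] > record["registered_2yo"]:
        flags.append("funded_2yo_exceeds_registered")
    # Itemised costs should sum to the reported total (within 10%),
    # and a total alongside "don't know" items is itself suspect.
    items = record["cost_items"]
    total = record["total_cost"]
    if None in items and total is not None:
        flags.append("total_cost_with_dk_items")
    elif total and abs(sum(items) - total) / total >= 0.10:
        flags.append("cost_items_mismatch")
    return flags
```

Records raising one or more flags would then be queued for a telephone callback.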

The validation callback stage ran from 9th July to 19th July 2024. IFF Research attempted to contact 1,781 providers in total, securing 1,241 responses:

  • Childminders: 118 completes
  • School-based providers: 169 completes
  • Group-based providers: 954 completes

6. Weighting

6.1 Weighting the main SCEYP

Weighting was used to ensure that the final achieved samples were representative of early years and childcare providers in England. Survey weights were designed separately for the three provider types.[1] These were applied to correct for unequal selection probabilities and non-response bias. Grossing weights were then created to ensure that the weighted achieved samples gross up to the population of early years and childcare providers in England.

[1] Survey weights were designed for the three overall provider types (group-based, school-based and childminders). In the official statistics report, five provider types are reported on separately. However, private and voluntary providers are sub-groups of the overall group-based provider type (and were sampled in the same way), and maintained nursery schools and school-based providers offering nursery provision are sub-groups of the overall school-based provider type (and were sampled in the same way).

6.1.1 Main SCEYP: Weighting for group-based providers

Population totals

The achieved sample of group-based providers was weighted to be representative of all active group-based childcare providers (GBPs) in England that were eligible for the study. During the interviews, some institutions were found to be ineligible to take part in the survey: those that did not offer nursery provision, those that had closed, and those coded as “duplicates”. The eligible population size and profile were estimated by removing the first group (those that did not offer nursery provision). Establishments that had shut down and those coded as “duplicates” were treated as eligible, on the assumption that providers that had opened in the intervening period (and were therefore not included in the sampling frame) would be roughly analogous to those that had subsequently closed. 

The total eligible population was estimated at 21,717 establishments – 99.2% of the sample frame.

Weighting process

Design weights

Design weights were first calculated to correct for unequal selection probabilities arising from the fact that establishments in the North East were oversampled (with all available records invited to take part) to allow robust figures for the region to be calculated. 
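A design weight is the inverse of a record's selection probability, so the North East (where all available records were invited, i.e. sampled with certainty) receives the smallest possible weight of 1, while subsampled regions receive larger weights. A minimal sketch with made-up sampling fractions:

```python
def design_weight(pop_size, sample_size):
    """Inverse probability of selection for a stratum.

    pop_size: establishments in the stratum on the sampling frame.
    sample_size: establishments actually sampled from it.
    """
    return pop_size / sample_size

# Hypothetical strata: the North East is taken as a census (probability 1),
# so its design weight is 1; other regions are subsampled at 1 in 3.
w_north_east = design_weight(800, 800)
w_other = design_weight(21000, 7000)
```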

Calibration weighting 

Calibration weighting was used to remove the (measurable) bias introduced through non-response to the main SCEYP and align the profile of the achieved sample to the profile of the eligible population defined by region, register type, ownership type, and deprivation band based on Income Deprivation Affecting Children Index (IDACI).
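Calibration of this kind (often implemented as raking, or iterative proportional fitting) repeatedly rescales the weights so that the weighted margins match population totals on each calibration variable in turn. A toy sketch with two margin variables follows; the data and margins are illustrative only:

```python
def rake(weights, categories, targets, iterations=50):
    """Iterative proportional fitting over one or more margin variables.

    weights: initial (design) weight per respondent.
    categories: {variable: [category per respondent]}.
    targets: {variable: {category: population total}}.
    """
    w = list(weights)
    for _ in range(iterations):
        for var, target in targets.items():
            cats = categories[var]
            # Current weighted total per category of this variable.
            totals = {}
            for wi, c in zip(w, cats):
                totals[c] = totals.get(c, 0.0) + wi
            # Scale each respondent's weight to hit this margin exactly.
            w = [wi * target[c] / totals[c] for wi, c in zip(w, cats)]
    return w

# Four respondents, calibrated to made-up region and ownership margins.
w = rake([1, 1, 1, 1],
         {"region": ["NE", "NE", "SE", "SE"],
          "ownership": ["private", "voluntary", "private", "voluntary"]},
         {"region": {"NE": 10, "SE": 30},
          "ownership": {"private": 24, "voluntary": 16}})
```

The final weights sum to the eligible population total, which is what makes them grossing weights as well.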

Questionnaire variant weights

Questionnaire variant weights were calculated for the analysis of providers allocated to each of the three GBP variants of the questionnaire. The final overall weight served as a pre-weight for these adjustments; the responding cases and their profiles were grossed up to the eligible population profile. 

Day of the week weights 

The selected sample was allocated systematically within each variant of the questionnaire to a specific day of the week (Monday to Friday), about which providers were asked certain questions. However, providers were able to complete the survey about a different day. The questionnaire included questions ascertaining which days of the week each setting was open. The profile and the overall population size for each day were estimated from the main SCEYP (weighted by the overall interview weight). Subsamples for each day were then calibrated to the population profile.

Staff weights 

In addition to calculating weights to make the sample representative of GBPs in England, weights for analysis of information on staff in such settings were also calculated. Staff weights were needed to account for the fact that settings employing more than a certain number of staff at a given qualification level were only asked to give information about one or two members of their team at different levels, chosen at random. A design weight to take account of this selection was calculated and then multiplied by the provider-level interview weights to obtain staff level weights for each case. Additional calibration was applied to ensure that the weighted staff profile (by level of education) was in line with the data collected about the number of staff working.
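Because only one or two staff members per qualification level were sampled within each setting, each sampled staff record stands in for several colleagues. A sketch of the resulting staff-level weight (function and argument names are illustrative):

```python
def staff_weight(provider_weight, n_staff_at_level, n_sampled_at_level):
    """Staff-level weight = within-setting design weight x provider weight.

    A setting with 6 staff at a qualification level, of whom 2 were sampled
    at random, gives each sampled staff member a within-setting design
    weight of 3; multiplying by the provider's interview weight yields the
    staff-level weight before calibration.
    """
    return (n_staff_at_level / n_sampled_at_level) * provider_weight

# A provider with interview weight 2.5 reporting on 2 of its 6 Level 3 staff:
w = staff_weight(2.5, 6, 2)
```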

Some data were excluded from the staff-level datasets prior to weighting. This included cases where information on the staff member was not provided or the respondent reported an overseas qualification. Providers were also excluded if the number of paid staff with recorded qualifications differed by more than ±2 from the total number of paid staff reported.

6.1.2 Main SCEYP: Weighting for school-based providers

Population totals

To create a sampling frame of school-based providers, the Schools’ Census database from June 2023 was used, enhanced with further information from a Get Information About Schools (GIAS) extract. As with group-based providers, the eligible population size and profile were estimated by removing providers that had confirmed during the interviews that they did not offer nursery provision. Establishments coded as “duplicates” and those that had shut down were treated as eligible, on the assumption that schools that had opened in the intervening period (and were therefore not included in the sampling frame) would be roughly analogous to those that had subsequently closed. 

The total eligible population was estimated at 9,876 – 99.0% of the initial sample frame.

Weighting process

Design weights

Design weights were calculated to correct for disproportionate sampling of different types of schools. 

Calibration weighting

Calibration weighting was used to remove the (measurable) bias introduced through non-response to the main SCEYP and align the profile of the achieved sample to the profile of the eligible population on the following variables: school type (main stratum), region, type of establishment, quintile of number of places registered.

Questionnaire variant weights

Questionnaire variant weights were calculated for the analysis of school-based providers allocated to each of the two variants of the questionnaire. The responding cases and their profiles were grossed-up to the eligible population profile.

Day of the week weights

As with GBPs, although respondents were allocated a random day of the week (Monday – Friday) to answer certain questions about, they were able to complete the survey about a different day. As in previous waves, an assumption was made that all schools were open every weekday and the sample of schools which answered about each day of the week was weighted to the overall profile of the eligible population. 

Staff weights

In addition to calculating weights to make the sample representative of SBPs in England, weights for analysis of information on staff in such settings were also calculated. Staff weights were needed to account for the fact that settings employing more than a certain number of staff at a given qualification level were only asked to give information about one or two members of their team at different levels, chosen at random. A design weight to take account of this selection was calculated and then multiplied by the provider-level interview weights to obtain staff level weights for each case. Additional calibration was applied to ensure that the weighted staff profile (by level of education) was in line with the data collected about the number of staff working. 

Some data were excluded from the staff-level datasets prior to weighting. This included cases where information on the staff member was not provided or the respondent reported an overseas qualification. Providers were also excluded if the number of paid staff with recorded qualifications differed by more than ±2 from the total number of paid staff reported.

6.1.3 Main SCEYP: Weighting for childminders

Population totals

The childminders’ data were weighted to be representative of the eligible population of childminders in England as of July 2023. Some childminders sampled for the main SCEYP were found to be ineligible in that they were no longer offering provision. As in previous waves, it was decided not to redefine the population and to assume that the population was stable in its characteristics.

As in previous waves, providers with zero registered places were included in the frame, on the assumption that the information about the number of registered places might already have been outdated by the time of the interview. However, cases that reported zero registered places during the interview were re-coded as ineligible, and responding childminders were weighted to the profile of the population excluding cases with zero registered places in the sample frame. 

Following data cleaning, the total eligible population was 25,414 – 99.4% of the initial sample frame.

Weighting process

Calibration weights 

Calibration weighting was used to remove measurable bias introduced through non-response to the main SCEYP and to align the profile of the achieved sample to the profile of the population. The population targets used for calibration weighting were: region, register type, registration year, and deprivation band based on IDACI. 

Questionnaire variant weights

Questionnaire variant weights were calculated for the analysis of CMs allocated to each of the two variants of the questionnaire. The responding cases and their profiles were grossed-up to the eligible population profile.

Day of the week weights 

The selected sample was allocated systematically within each variant of the questionnaire to a specific day of the week (Monday to Friday), about which they were asked certain questions. The overall population size for each day was estimated from the main SCEYP (weighted by the overall interview weight), using 2023 data in which childminders were asked which days of the week they were open. Subsamples for each day were then calibrated to the population estimates. 

Staff weights 

CMs were also asked several follow-up questions about the assistants they employ. The questionnaire included up to two loops for assistants, so for respondents with more than two assistants design weighting was required to compensate for this selection. The final staff-level weight was re-scaled so that the total weighted number of assistants was in line with an estimate of the total number of assistants.

6.2 Weighting the short SCEYP

Data from the short SCEYP were combined with data from the financial variant of the main SCEYP where variables were available in both, i.e. the fees and funding questions.

The weighting strategy was designed to create a weight to be used in analysis of the combined dataset comprising data from the short SCEYP and the financial variants of the main SCEYP. Three separate weights were created for SBPs, GBPs and childminders as set out below. 

The weighting strategy was the same for all provider types taking part in the short SCEYP or the financial variant of the main SCEYP. Data from the two surveys were combined for use in financial analysis and calibrated to the same variables and population totals as used for the main SCEYP. 

A combined weight was also created for analysis of all groups together. Combined weights for SBPs, GBPs and childminders were scaled so that the weighted profile by provider type was aligned with the percentage distribution of provider types in the population (modelled estimates).
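This scaling step can be done with one multiplicative factor per provider type, chosen so that each type's weighted share of the combined total equals its modelled population share. An illustrative sketch (data and shares are made up):

```python
def scale_to_shares(weights, types, target_shares):
    """Rescale weights so each provider type's weighted share hits a target.

    weights: combined weight per responding provider.
    types: provider type per respondent (e.g. "GBP", "SBP", "CM").
    target_shares: modelled population share per type, summing to 1.
    The overall weighted total is preserved.
    """
    grand_total = sum(weights)
    type_totals = {}
    for w, t in zip(weights, types):
        type_totals[t] = type_totals.get(t, 0.0) + w
    # Factor that moves each type's current share onto its target share.
    factors = {t: target_shares[t] * grand_total / tot
               for t, tot in type_totals.items()}
    return [w * factors[t] for w, t in zip(weights, types)]

# Three respondents whose weighted mix (20%/20%/60%) is moved to 40%/20%/40%.
scaled = scale_to_shares([2, 2, 6], ["GBP", "SBP", "CM"],
                         {"GBP": 0.4, "SBP": 0.2, "CM": 0.4})
```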

7. Response rates

7.1 Main SCEYP: Response

The main SCEYP saw 8,111 productive interviews in total. This included:

  • 5,042 interviews with group-based providers (GBP);
    • Private (3,360 interviews, 67%)
    • Voluntary (1,429 interviews, 28%)
    • School/college/LA/unclassified GBPs (238 interviews, 5%)[1]
  • 1,702 completes with school-based providers (SBP);
    • Those offering nursery provision (1,609 interviews, 95%)
    • Maintained nursery schools (MNS) (93 interviews, 5%)
  • 1,367 interviews with childminders.

Table 7.1 shows the outcomes achieved for each provider type. Table 7.2 shows the response rates achieved for each provider type. Table 7.3 shows the split of responses by mode.

Table 7.1 Main SCEYP: Outcomes by provider type

| Outcome | Group-based providers | School-based providers | Childminders | Total |
| --- | --- | --- | --- | --- |
| Issued sample | 20,046 | 8,100 | 8,333 | 36,479 |
| Completes | 5,042 | 1,702 | 1,367 | 8,111 |
| Bad number[2] | 992 | 64 | 1,029 | 2,085 |
| Other non-contact[3] | 6,166 | 672 | 3,403 | 10,241 |
| Total non-contact | 7,158 | 736 | 4,432 | 12,326 |
| Unavailable[4] | 6,125 | 5,204 | 1,900 | 13,229 |
| Refused[5] | 1,118 | 337 | 219 | 1,674 |
| Total contacted but unproductive | 7,243 | 5,541 | 2,119 | 14,903 |
| Permanently closed | 141 | 27 | 271 | 439 |
| Temporarily closed (for longer than 30 days) | 18 | 12 | 78 | 108 |
| Open but not offering relevant childcare | 78 | 32 | 34 | 144 |
| Other ineligible | 365 | 50 | 33 | 448 |
| Total ineligible | 602 | 121 | 416 | 1,139 |

As shown in Table 7.2, response rates have continued to decline with each survey year. A possible reason is that the intensity of research, particularly in schools, has increased in recent years, potentially leading to over-saturation of the sample. The pre-election period in May may also have had an adverse effect: the final reminder email and SMS messages were not sent as a result. Additionally, a postcard nudge was removed from the overall communications strategy this year due to a significant increase in postal costs.  

Table 7.2 Main SCEYP: Provider response rates

| Year | Response rates | Group-based providers | School-based providers | Childminders |
| --- | --- | --- | --- | --- |
| 2022 | Achieved (n) | 6,338 | 2,476 | 1,604 |
| 2022 | Overall response rate | 42% | 36% | 32% |
| 2023 | Achieved (n) | 5,529 | 2,079 | 1,482 |
| 2023 | Overall response rate | 38% | 32% | 26% |
| 2024 | Achieved (n) | 5,042 | 1,702 | 1,366 |
| 2024 | Overall response rate | 25% | 21% | 16% |

Table 7.3 Main SCEYP: Mode response rates

| Mode | Group-based providers | School-based providers | Childminders | Total |
| --- | --- | --- | --- | --- |
| Telephone | 2,991 (59%) | 1,056 (62%) | 737 (54%) | 4,784 (59%) |
| Web | 2,051 (41%) | 646 (38%) | 630 (46%) | 3,327 (41%) |

Overall, interviews were skewed slightly towards telephone (59% vs. 41% web), and this was broadly consistent across provider types, though the split was more even among childminders (54% telephone vs. 46% web).


 

[1] This excludes the 17 providers who answered ‘Don’t know’ when asked about their provider type.

[2] ‘Bad number’ includes all those who had an invalid number or who informed us we had the wrong number when we called.

[3] ‘Other non-contact’ includes all those with whom we had no telephone contact, including where there was no answer, our call was rejected, or the call went to an answering machine. These providers will have received an invitation to complete the survey by email or post. 

[4] ‘Unavailable’ includes all those who were busy when we called, including those who said they would be unavailable throughout the whole fieldwork period, and instances where we spoke to someone at the setting but the person we asked for was not available. 

[5] ‘Refused’ includes all those who opted out of the research or who said they did not want to participate when we called.

7.2 Main SCEYP: Interview length

Average telephone interview lengths and web interview lengths for different provider types and variants are provided below in Tables 7.4 and 7.5. These are based on interviews that were completed within an hour on the same day; any interviews longer than one hour were excluded from the analysis of interview lengths as the timings for these outlier interviews are likely to be inaccurate. 

Interview lengths were shorter on average for childminders than for school-based or group-based providers. This is because childminders were asked fewer questions than other providers; for example, they were not asked about staff-to-child ratios.

Table 7.4 Main SCEYP: Average (mean) telephone interview length by provider type and questionnaire variant

| Provider type | Variant | 2024 average telephone interview length (minutes) |
| --- | --- | --- |
| Group-based providers | 1 | 34 |
| Group-based providers | 2 | 38 |
| Group-based providers | 3 | 34 |
| School-based providers | 1 | 37 |
| School-based providers | 2 | 32 |
| Childminders | 1 | 17 |
| Childminders | 2 | 27 |

 

Table 7.5 Main SCEYP: Average (mean) web interview length by provider type and questionnaire variant

| Provider type | Variant | 2024 average web interview length (minutes)[1] |
| --- | --- | --- |
| Group-based providers | 1 | 36 |
| Group-based providers | 2 | 35 |
| Group-based providers | 3 | 35 |
| School-based providers | 1 | 33 |
| School-based providers | 2 | 30 |
| Childminders | 1 | 15 |
| Childminders | 2 | 24 |


 

[1] The average online survey time is calculated using the times at which a respondent enters and leaves the survey. It cannot account for periods when the respondent may have taken a short break; however, outliers were accounted for by removing anyone who took longer than 1 hour to complete the survey.

7.3 Short SCEYP: Response

The total issued sample for the short SCEYP was 20,976: 1,849 GBPs, 1,880 SBPs, and 17,247 childminders. 

There were 4,055 achieved interviews: 315 from GBPs, 225 from SBPs, and 3,515 from childminders. Table 7.6 displays the outcomes and response rates achieved for the whole sample and by provider type. It was not possible to estimate eligibility rates given the online and postal modes of data collection, so response rates were calculated based on the whole issued sample (i.e. they assume that all sampled providers were eligible). The breakdown of completions by mode (web vs. postal) is shown in Table 7.7.
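With eligibility unknown, the overall response rate is simply completes divided by issued sample. The figures in Table 7.6 can be reproduced as:

```python
def response_rate(achieved, issued):
    """Overall response rate (%) assuming all sampled providers were eligible."""
    return round(100 * achieved / issued)

# Achieved and issued counts from Table 7.6.
rates = {
    "GBP": response_rate(315, 1849),
    "SBP": response_rate(225, 1880),
    "CM": response_rate(3515, 17247),
    "Total": response_rate(4055, 20976),
}
```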

Table 7.6 Short SCEYP response rates: Whole sample by provider type

| | Group-based providers | School-based providers | Childminders | Total |
| --- | --- | --- | --- | --- |
| Issued sample | 1,849 | 1,880 | 17,247 | 20,976 |
| Achieved (n) | 315 | 225 | 3,515 | 4,055 |
| Overall response rate | 17% | 12% | 20% | 19% |

Table 7.7 Short SCEYP: Response mode

| Response mode | Group-based providers | School-based providers | Childminders | Total |
| --- | --- | --- | --- | --- |
| Post | 110 (35%) | 53 (24%) | 1,679 (48%) | 1,842 (45%) |
| Web | 205 (65%) | 172 (76%) | 1,836 (52%) | 2,213 (55%) |

7.4 Short SCEYP: Interview length

The average web interview lengths for the different provider types are provided below in Table 7.8. These are based on interviews that were completed within an hour on the same day. Analysis of interview length excluded those longer than one hour, as the timings for these outlier interviews are likely to be inaccurate.

As Table 7.8 shows, the length of the short SCEYP ranged from 9 minutes on average for childminders to 13 minutes on average for SBPs and 15 minutes for GBPs.

Table 7.8 Short SCEYP: Average web interview length by provider type

| Provider type | Average web interview length |
| --- | --- |
| School-based providers | 13 minutes |
| Group-based providers | 15 minutes |
| Childminders | 9 minutes |

 

8. Data processing

8.1 Coding 

Both the main and short surveys included a small number of questions which offered the respondent the option to choose “other” as their response code and to enter free text into the designated field. These were subsequently coded by trained coders at IFF Research. Code frames from previous iterations of the survey were used for existing questions. Code frames for new questions were devised by the same coding team and signed off by the Research Manager. 

8.2 Main SCEYP: Data cleaning 

Numeric information collected as part of the main SCEYP, such as childcare fees or opening hours, was typed in by respondents themselves (online), keyed in by interviewers (telephone) or recorded on paper and keyed in by IFF’s Data Unit (paper).

It was possible for mis-keying to occur and, on inspection, a small number of answers appeared either implausibly large, implausibly small, or nonsensical given other survey answers. It was therefore decided to implement rules whereby certain values would be removed from the data. Decisions around these rules were informed by the distribution of answers at specific questions alongside other contextual information. Key checks included instances where:

  • The number of parents paying fees using Tax-Free Childcare was higher than the total number of registered children
  • The total number of registered children across all age bands was zero
  • The reported number of spare places was the same as the total number of registered places, but the number of children booked was not zero (in which case spare capacity was recoded to missing)
  • The reported number of children with Special Educational Needs or Disabilities was greater than the total number of registered children
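Rules like these can be expressed as row-level checks that recode offending values to missing. A minimal sketch covering two of the checks above (field names are illustrative, not the real survey variables):

```python
MISSING = None  # missing/removed values are represented as None here

def clean_record(r):
    """Recode implausible values to missing, following rules like those above."""
    r = dict(r)  # work on a copy rather than mutating the caller's data
    # TFC-paying parents cannot exceed the number of registered children.
    if r["tfc_parents"] is not None and r["tfc_parents"] > r["registered_children"]:
        r["tfc_parents"] = MISSING
    # Spare capacity equal to all registered places is implausible
    # when some children are booked in.
    if (r["spare_places"] == r["registered_places"]
            and r["children_booked"] != 0):
        r["spare_places"] = MISSING
    return r

cleaned = clean_record({"tfc_parents": 40, "registered_children": 30,
                        "spare_places": 25, "registered_places": 25,
                        "children_booked": 12})
```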

8.3 Main SCEYP: Calculating hourly pay 

Respondents were asked a variety of questions in the staff loop section to help collect data on hourly pay. First, respondents were asked how many hours per week and how many weeks per year each staff member was contracted to work. They were then given the choice to provide pay for each member of staff as an annual, weekly or hourly amount. 

Hourly pay was then calculated for each member of staff, based on the contracted hours per week and weeks per year and whichever pay period was given. Where the reported rate of pay was zero, the derived variable was set to missing (as the analysis was interested in paid staff). 
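The derivation converts whichever pay period was reported onto a common hourly basis. A sketch under the assumption that annual pay is spread over the contracted hours and weeks, and weekly pay over the contracted hours (function and parameter names are illustrative):

```python
def hourly_pay(amount, period, hours_per_week, weeks_per_year):
    """Derive hourly pay from an annual, weekly or hourly amount.

    Returns None for zero pay, since the analysis covers paid staff only.
    """
    if amount == 0:
        return None
    if period == "hourly":
        return amount
    if period == "weekly":
        return amount / hours_per_week
    if period == "annual":
        return amount / (hours_per_week * weeks_per_year)
    raise ValueError(f"unknown pay period: {period}")

# A staff member on 19,760 a year, contracted 38 hours/week for 52 weeks:
rate = hourly_pay(19760, "annual", 38, 52)  # 19760 / (38 * 52) = 10.0
```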

Annexes

Appendix A: Main SCEYP: Survey invitation letter

Appendix B: Short SCEYP: Survey invitation letter

Help and support

Contact us

If you have a specific enquiry about Childcare and early years provider survey statistics and data:

Early years and childcare research and analysis

Email: EY.AnalysisANDResearch@education.gov.uk
Contact name: John Canlin

Press office

If you have a media enquiry:

Telephone: 020 7783 8300

Public enquiries

If you have a general enquiry about the Department for Education (DfE) or education:

Telephone: 037 0000 2288

Opening times:
Monday to Friday from 9.30am to 5pm (excluding bank holidays)