Methodology

Childcare and early years provider survey

1. Introduction

The Survey of Childcare and Early Years Providers (SCEYP) is important for monitoring and evaluating childcare and early years provision. It covers childminders, school-based childcare providers (both maintained nursery schools and other schools offering nursery classes) and group-based childcare providers (other childcare providers operating on non-domestic premises, such as playgroups and day nurseries). It helps the Government, including the Department for Education (DfE), understand the issues that providers face, informing development of early years and childcare policy. 

The survey has been running since 1998, with the latest 2025 iteration carried out by IFF Research and London Economics. The SCEYP comprises a large mixed-mode survey, which can be completed either online or over the telephone. Topics covered include the number of children registered at different settings, attendance and capacity, staff-to-child ratios, Government funding for childcare including funded entitlements and Tax-free Childcare (TFC), Special Educational Needs and Disabilities (SEND), staff qualifications and pay, and the financial side of providing childcare such as costs, incomes and fees.

1.1 The SCEYP timeseries 

The SCEYP has undergone some changes in scope and design over time, which mean comparisons between different survey years should be approached with caution: 

  • The questionnaire was fully redesigned in 2016 to reflect changes in the childcare market. Data from the 2016 survey were intended to provide a new and enhanced baseline. Changes made to the design of the survey that year mean that 2016 data are not directly comparable with those from previous years of the survey. There were some further changes to the 2018 survey to improve data quality and to reflect new policy priorities, but otherwise it was designed to be comparable with the 2016 survey. 
  • From 2019 – and following a user consultation on the frequency, sampling approach, mode and topic coverage of the SCEYP[1] – DfE decided to intensify the early years focus of the survey by only sampling and recruiting providers delivering childcare to pre-school children aged 0-4 years. Previous surveys had covered reception provision as well as wraparound and holiday care for school-aged children aged up to seven. This means that figures published before and after 2019 are not comparable for “all school-based providers” and “all providers”. Adjustments have been made, however, to figures collected on surveys prior to 2019 to ensure that where these are shown in reports published since 2019 they are comparable with the most recent data. 
  • In more recent years, following consultation with DfE and through cognitive testing with providers, further refinements have been made to some of the survey questions. Such changes have been made with a view to improving respondent experience as well as collecting more reliable data. They include:
    • In 2023 the cost and income element of the survey was re-designed to make answering these questions easier. 
    • In 2024, various clarifications and validation checks were added to questions about the number of booked places at providers, number of paid staff employed, and staff salaries and contracted hours of work.
    • In 2025, the ‘short’ version of the survey (which asked providers about the fees that they charged to parents and the funding rates they received from their local authorities) was dropped and all providers were sent the longer ‘main’ survey, in order to maximise the number of responses it received.

[1] Department for Education (2018): Surveys on childcare and early years in England: Government consultation response

1.2 Temporary closed settings

Fieldwork for the 2025 survey took place between May and July 2025. To try to obtain as accurate a picture as possible of available childcare provision in 2025, providers were invited to complete the survey if they had been open and offering relevant childcare in the last 30 days, even if their setting was temporarily closed at the time of interview. Settings that were open were asked to think about their current situation, whilst settings temporarily closed at the time of interview were asked to answer with reference to the last term-time day on which they were open.

Of the 9,358 respondents who completed the survey, 11 answered that they had been temporarily closed for fewer than 30 days prior to interview. A further 181 providers we contacted had been closed for more than 30 days; these providers were screened out as ineligible.

2. Sampling

As in previous waves of the SCEYP, the study comprised a representative sample of three distinct provider populations in England, each of which was analysed separately:

  • School-based providers (SBP): maintained nursery schools and other maintained and non-maintained schools offering nursery classes;
  • Group-based providers (GBP): childcare providers on Ofsted’s Early Years Register[1] of childcare providers and operating on non-domestic premises;
  • Childminders (CM): Childminders on Ofsted’s Early Years Register[2] of childcare providers.

Prior to 2025 there were two versions of the survey: the main SCEYP, which covered all topics, and a ‘short’ version of the survey which asked specifically about fees and funding rates. Historically, targets for the main SCEYP were set, and the amount of sample drawn was calculated, based on response rates to the main survey in previous waves. Any eligible sample not drawn for the main SCEYP was allocated to the short survey to boost responses to the fees and funding questions, where analysis is presented at the local authority level.

Due to falling response rates in 2024, achieving the SBP and GBP response targets for the main survey in 2025 required sending it to all eligible SBPs and GBPs. Therefore, this year the short survey was dropped, and all eligible providers (including all eligible CMs) were sent the main survey. Targets for 2025 were set based on response rates to previous waves and adjusted to reflect response rates in 2024. They were: 2,000 SBPs, 5,000 GBPs and 4,000 CMs.

This chapter describes the sampling approaches adopted for each of the three main types of provider.

[1] Historically GBPs and CMs were sampled where they appeared on either the Early Years Register (EYR) and / or the Compulsory Childcare Register (CCR). Ahead of the 2024 survey, it was agreed to rescope the population so that only those on the EYR were sampled. This reduced the number of providers who were sent the survey unnecessarily because they did not look after any children aged 0-4. This approach was retained for the 2025 survey.

[2] As above.

2.1 School-based providers sample

The sample of SBPs was drawn from the January (spring) School Census, the School Level Annual School Census and the General Hospital School Census, published in June 2024 and relating to the academic year 2023-24.[1] The sample was complemented with further information from a Get Information About Schools (GIAS)[2] snapshot downloaded on 7th January 2025 and data on children registered in nursery provision from the School Census for the academic year 2023-24.

Three types of eligible SBPs were identified for the study using the following criteria:

  • Independent schools with nursery provision: independent schools (according to the School Level Annual School Census) that were ‘open’ or ‘open, but proposed to close’, offered nursery classes and had a statutory lowest age of 0, 1, 2 or 3 years (according to GIAS);
  • Maintained nursery schools: state-funded nursery schools (according to the School Census) that were ‘open’ or ‘open, but proposed to close’ (according to GIAS);
  • Maintained schools with nursery provision: state-funded primary, secondary or special schools (according to the School Census) that were ‘open’ or ‘open, but proposed to close’, and either had nursery classes (according to GIAS) or had at least one pupil in nursery first or second year classes (according to the School Census).
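To illustrate, the eligibility rules above amount to a filter over the combined census and GIAS data. The sketch below is illustrative only: the column names are hypothetical stand-ins for the actual census and GIAS fields, not the variables used in practice.

```python
import pandas as pd

# Statuses treated as eligible under the 2025 approach
OPEN_STATUSES = {"Open", "Open, but proposed to close"}

def eligible_sbps(schools: pd.DataFrame) -> pd.DataFrame:
    """Apply the three SBP eligibility rules (illustrative column names)."""
    open_ok = schools["gias_status"].isin(OPEN_STATUSES)
    independent = (
        (schools["phase"] == "independent") & open_ok
        & schools["has_nursery_classes"]
        & schools["statutory_lowest_age"].isin([0, 1, 2, 3])
    )
    maintained_nursery = (schools["phase"] == "nursery") & open_ok
    maintained_with_nursery = (
        schools["phase"].isin(["primary", "secondary", "special"]) & open_ok
        & (schools["has_nursery_classes"] | (schools["nursery_pupils"] > 0))
    )
    return schools[independent | maintained_nursery | maintained_with_nursery]
```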

In 2025, two changes were made to the SBP sampling approach. Firstly, providers whose establishment status on GIAS was ‘Open, but proposed to close’ were included as eligible providers. These records were excluded from the sample in previous years, but the decision was made to include them in this wave. This was because in most cases this status refers to schools that were proposed to close but then re-open as an academy school and, therefore, were still eligible to take part in the survey.[3] Secondly, hospital schools were identified and removed from the sample for the first time in 2025 as their nursery provision was assumed to run differently to provision offered by other types of providers.[4]

The sample was then shared with DfE, who appended school email addresses and the latest information regarding each school’s closure status. In previous waves, any schools DfE identified as ‘Closed’ or ‘Open, but proposed to close’ were removed from the sample at this stage. However, in line with the updated approach to closure status outlined above, in 2025 only those marked as ‘Closed’ were removed at this stage.[5] The remaining sample was then randomly allocated to one of two questionnaire variants[6] and, within each variant, to a specific day of the week about which to answer certain questions. This was done to allow for analysis of trends in attendance and capacity that vary by weekday.
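As an illustration of this two-stage allocation, the sketch below assigns each record a variant and then spreads reference days evenly within each variant. It is a minimal sketch only; the actual allocation procedure and any balancing used in practice are assumptions.

```python
import numpy as np
import pandas as pd

WEEKDAYS = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]

def allocate(sample: pd.DataFrame, seed: int = 20250101) -> pd.DataFrame:
    """Assign each record a questionnaire variant, then a reference weekday
    within each variant (illustrative, not the production allocation)."""
    rng = np.random.default_rng(seed)
    out = sample.copy()
    out["variant"] = rng.integers(1, 3, size=len(out))  # variant 1 or 2
    for variant, grp in out.groupby("variant"):
        # cycle through weekdays within each variant for an even spread
        days = [WEEKDAYS[i % 5] for i in range(len(grp))]
        out.loc[grp.index, "reference_day"] = days
    return out
```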

The total remaining sample of 10,089 eligible SBP records was then invited to take part in the survey. This population comprised 836 independent schools with nursery provision, 379 maintained nursery schools and 8,874 maintained schools with nursery provision.

 

[1] Schools and local authorities are required to provide the Department for Education (DfE) with a school census return (using a DfE-designed survey instrument) covering a wide range of information on the characteristics of schools and the pupils within them in January each year. Additionally, independent schools, general hospital schools and alternative provision provide (via the local authority) details on the number and characteristics of their pupils at this time. 

[2] Get Information About Schools (GIAS) is a register of educational establishments in England and Wales. It is maintained by the Department for Education.

[3] This included 162 schools. These schools were checked again in August and it was confirmed that none had permanently closed as they had re-opened as an academy school immediately afterwards. 

[4] This only impacted 2 schools that were removed from the sample.

[5] Only 1 school was identified as ‘Closed’ at this stage.

[6] As in previous years, the questionnaire was split into variants to reduce the time burden of completing it, while still ensuring a large enough response to enable detailed analysis.

2.2 Group-based providers sample

GBPs are childcare providers registered on the Ofsted Early Years Register (EYR) and operating on non-domestic premises.[1] The survey collected further details about the ownership and type of childcare offered by these providers.

Ofsted provided information for all 22,749 records in the ‘childcare on non-domestic premises’ category that appeared on the EYR. In the 2024 SCEYP, records were taken from the July 2023 snapshot database. However, in 2025, records were instead taken from the December 2024 snapshot database to secure the most up-to-date snapshot possible.

To avoid burdening potentially ineligible providers, we performed additional checks to remove providers suspected of being primarily holiday or activity clubs, and therefore less likely to be providing the type of provision relevant to the survey (i.e. nursery provision for children under school age). We did this by searching provider names for the following key terms: ‘holiday’, ‘summer’, ‘leisure’, ‘sport’, ‘football’, ‘tennis’ and ‘activ’. A total of 857 providers were identified and removed on this basis.
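To illustrate, this screen amounts to a simple case-insensitive match of provider names against the key terms. The exact matching logic used in practice (for example, whole-word versus substring matching) is not documented here, so the substring approach below is an assumption.

```python
KEY_TERMS = ("holiday", "summer", "leisure", "sport", "football", "tennis", "activ")

def is_suspected_activity_club(name: str) -> bool:
    """True if the provider name contains any screening key term;
    the truncated 'activ' deliberately also catches 'active' and 'activity'."""
    lowered = name.lower()
    return any(term in lowered for term in KEY_TERMS)

# Usage: keep only providers that pass the screen
# sample = [p for p in providers if not is_suspected_activity_club(p["name"])]
```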

The remaining population of 21,892 records was drawn for inclusion in the main SCEYP this wave, following the removal of the short survey. The sample was then randomly allocated to one of two questionnaire variants and, within each variant, to a specific day of the week about which to answer certain questions. This was done to allow for analysis of trends in attendance and capacity that vary by weekday.

[1] This does not include provision in schools.

2.3 Childminders sample

As with GBPs, details of all childminders who appeared on the EYR in the December 2024 snapshot database were provided by Ofsted.

The total population of 23,968 eligible records was drawn for inclusion in the main SCEYP. The selected sample was then randomly allocated to one of two questionnaire variants and, within each variant, to a specific day of the week about which to answer certain questions.

3. Questionnaire

3.1 Questionnaire development 

Much of the content of the 2025 questionnaire mirrored that of the 2024 survey to maintain the timeseries. However, some new questions were introduced this year to cover salient topics within the sector, and some of the questions asked in 2024 were removed to accommodate these. Moreover, the short survey was removed this year to maximise responses to the core survey and thereby collect more data, improving the analysis potential.

3.2 Cognitive testing 

Cognitive testing is an interviewing technique employed to understand the way in which respondents react to, respond to and understand survey questions. As in previous years, IFF Research cognitively tested potential changes to the questionnaire ahead of the launch of the 2025 survey. This was to determine the comprehension, relevance, and appropriateness of new or amended questions. The full questionnaire was not cognitively tested as most of it had not changed since 2024.

A total of 15 cognitive interviews were conducted via teleconferencing between 11th and 21st March 2025. Respondents included Early Years Managers, Finance Administrators, and Early Years and Foundation Stage Leads from 5 SBPs, 5 GBPs and 5 childminders. Each interview lasted approximately one hour. Respondents were asked whether they experienced any difficulties with certain questions (and why), how they arrived at each of their answers, and what the specific wording had meant to them.

Examples of areas tested and decisions made included: 

  • How SBPs approached answering questions on the number of registered places that they have. We explored what they understood the questions to be asking and what they included in their answer. The purpose of this was to understand whether and how the questionnaire could be amended to elicit the most accurate responses. The testing found that while SBPs did not necessarily use the term “registered places”, they understood it and answered according to the number of children they were able to take given their existing staffing levels and space. While they did use other terms, such as “offered places”, “advertised places” and “capacity”, it was decided their understanding of the current questionnaire terminology was sufficient, so there was no need to change the wording of this question.
  • Exploring the possibility of a question about the primary category of need for children with SEND, and the extent to which providers felt able to answer such a question. While providers were generally aware of and felt confident identifying a child’s primary category of need, they reported they would often need to do so by referring to another member of staff (e.g. SENCO) or information held about the child. They also felt less able to answer this for children awaiting diagnosis. Due to the time it would take for providers to answer this question (especially for those with 10+ children with SEND) and the uncertainty for children without a formal diagnosis, it was decided not to add a question on primary category of SEND to the survey.
  • Exploring the possibility of a question about the number of registered two-year-olds accessing government-funded hours that split out children accessing the ‘new entitlement for working parents’ from those accessing the ‘disadvantaged entitlement’, to see whether providers would be able to split these out and what the best terminology to use in the question would be. The testing found that providers thought about these two streams distinctly, so it was easy for them to answer a question which split these out. As such, it was decided to add this question to the survey.

3.3 Questionnaire changes for 2025

The most substantial amendments to the questionnaire in 2025 were the removal of the short survey and the third variant of the GBP survey. These were made in consultation with DfE to maximise responses to the core survey and improve the analysis potential. 

Another focus of the questionnaire design phase for 2025 was on improving the accuracy of the data collected. As such, a number of small adjustments were made throughout the survey in consultation with DfE, mainly small wording changes or the addition, in places, of clarification text. For example, clarification text for questions about the time childcare provision started on the reference day was updated to specify that this meant the earliest time children were able to attend the first session under this childcare provision. For SBPs this was also updated to ask specifically about their nursery provision, to ensure respondents were not answering with their wraparound provision[1] in mind.

Additionally, new logic checks were built into the questionnaire which asked providers to confirm or revisit their answers at key questions if the answer they had given appeared inconsistent with answers given at previous questions. For example, if the number of SEND children reported by a provider was higher than the total number of registered children they had previously reported, they were asked to go back and check their answers.

There were similar logic checks added that related to other questions about booked places, funded hours, SEND, number of staff and hourly fees.
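To illustrate, the SEND example above amounts to a simple consistency test that triggers a confirmation prompt during the interview. The sketch below is illustrative only; the prompt wording and function are not taken from the actual questionnaire script.

```python
def send_soft_check(n_send_children: int, n_registered_children: int) -> str | None:
    """Return a prompt asking the respondent to confirm or revisit their
    answers when the SEND count exceeds total registered children."""
    if n_send_children > n_registered_children:
        return ("You have reported more children with SEND than registered "
                "children. Please check and amend your answers, or confirm "
                "that they are correct.")
    return None  # answers are consistent; no prompt needed
```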

Other substantial amendments agreed with DfE following the review of the questionnaire and cognitive testing report included:

  • Amendment to the ‘registered places’ question for SBPs to ask specifically about school-aged children aged 4 and over that are registered under their nursery provision. This change was made to reduce the data edits needed at this question as, in previous waves, some SBPs were including children in Key Stage 1 and Key Stage 2 when asked about school-aged children attending “nursery”.
  • Replacement of questions asking providers how many spare morning and spare afternoon places they had with a question asking how many extra children they would have been willing and able to take for “less than a full day” at any point in the day. This amendment was made following feedback from cognitive testing that many providers did not think about spare places in terms of “morning” or “afternoon” sessions and, instead, took a more flexible approach to capacity.
  • Addition of questions about the government-funded hours for under-two-year-olds introduced in April 2025
  • Updates to the question about the ethnicity of staff and apprentices to align with the 2021 Census ethnicity major codes as per other Official Statistics
  • Addition of questions asking about the hourly funding rate received from their local authority for the government funded hours for under-2-year-olds introduced in April 2025
  • Amendment to the childminding personal income question to specify that they should only include income that they personally take from the business in the form of a salary, before deductions for tax and national insurance. This amendment was made following cognitive testing feedback that childminders were unclear whether to respond with their pre- or post-tax income when reading the previous question wording.
  • Amendment to the income breakdown question to ask specifically about income received from under two entitlements to reflect the introduction of funded hours for under-two-year-olds in April 2025
  • Deletion of questions relating to the following topics primarily to reduce the overall length of the questionnaire and reduce the burden on providers, although each was selected based on agreed rationale:
    • Whether the number of registered places they offered had increased since last year – also removed because analysis of previous years’ data suggested these responses were not always accurate, for example providers reporting that their registered places had “increased” despite the number of registered places that they reported on the survey being lower than the number they had reported on the previous year’s survey
    • How many staff whose highest relevant qualification is Level 3 or above also have a Level 2 qualification in Maths – also removed because this qualification is no longer necessary to count as a Level 3 member of staff in providers’ staff:child ratios
    • The destination, background and qualification level of staff joiners and leavers – also removed due to concerns over the accuracy of these responses in previous years (responses showed, for instance, a much larger number of staff joining from another childcare provider than leaving to join another one).
    • Why the relevant providers are not offering government-funded hours – also removed due to the number of providers answering these questions in previous years being too low for analysis
    • Reasons why relevant GBPs do not have rent or mortgage payments – also removed due to limited use of the data collected in previous years

[1] Wraparound provision refers to care for school children provided outside standard school hours, such as breakfast and after-school clubs.

3.4 Questionnaire sections 

As in previous years, the questionnaire was split into variants to reduce the time burden of completing it, while still ensuring a large enough response to enable detailed analysis. 

All providers were asked questions about their characteristics, about capacity and attendance, about the average hourly fees that they charged to parents, about children receiving government-funded entitlements and about Tax-Free Childcare.

Other sections of the questionnaire, however, were only asked on one of two provider-specific variants of the survey. In previous years the GBP questionnaire had been split into three variants, but other questionnaire changes meant the third variant was no longer sufficiently different to be worthwhile, so variants 2 and 3 were combined. A full breakdown of which question sections were included on which variants of the survey is shown in Table 3.1.

Table 3.1. Questionnaire sections and which variants of the survey they were included on

| Questionnaire sections | SBPs | GBPs | Childminders |
|---|---|---|---|
| Provider characteristics | All | All | All |
| Attendance and capacity | All | All | All |
| Children receiving government-funded entitlements | All | All | All |
| Staff-to-child ratios | Variant 2 | Variant 1 | Not asked |
| Children with SEND | All | Variant 1 | All |
| Access to SENCO (Special Educational Needs Coordinator) | Variant 1 | Variant 1 | Variant 1 |
| Provider level staff questions* | All / Variant 1 | All / Variant 2 | All |
| Staff-level salary questions | Variant 1 | Variant 2 | Variant 1 |
| Average hourly fees | All | All | All |
| Additional charges and Funded Early Education Entitlement | All | Variant 1 | Variant 2 |
| Tax-Free Childcare | All | All | All |
| Costs of delivering childcare | All | All | Variant 2 |
| Income from childcare | All | All | Variant 2 |

*The majority of the questions in these sections were asked of all providers, although some follow-up questions were only allocated to a specific variant of the survey.

4. Survey fieldwork

The SCEYP was conducted as a mixed-mode survey, which could be completed either online (via Computer Assisted Web Interviewing, also known as “CAWI”) or on the telephone (Computer Assisted Telephone Interviewing, also known as “CATI”). Both elements of fieldwork were conducted by IFF Research.

4.1 Interviewer training and project briefings 

Before starting work on the survey, all interviewers were required to attend a project briefing on the survey. Briefings were carried out using video teleconferencing and covered: 

  • Purpose of the study and how the data are used. 
  • Questionnaire content.
  • Study procedures, including initial contact with respondents and reminder communications.
  • Motivating respondents to take part.
  • Practice interviews. 

4.2 Fieldwork dates 

The SCEYP fieldwork took place between 2nd May and 15th July 2025. A “webfirst” design was followed, with the online survey launched first and a communication strategy designed to maximise the number of online completions (see Table 4.1 for communication dates). Telephone fieldwork ran from 13th May to 11th July 2025. 

4.3 Making contact 

As the SCEYP is voluntary, potential respondents were sent a range of survey materials to motivate participation, increase the completeness of the data collected and reduce participant burden by helping respondents to prepare in advance. Building on previous years, these materials were updated to be as accessible, appealing and persuasive as possible. Key features of the mailings intended to encourage response included: 

  • Design of mailings. In 2025 a new SCEYP logo and email header were created and used in all communications to increase brand recognition and ensure that all mailings looked professional and legitimate and would not be mistaken for junk mail or commercial marketing. The layout was engaging and concise, clearly spelling out what was being asked of participants and why.
  • Content of materials. Communications emphasised the benefits of survey participation by stressing what results would be used for. They were also simplified in the 2025 survey, and the length reduced, to encourage engagement.
  • Using different motivational statements and forms of contact across the different contact attempts. Initial communications focussed on altruism by emphasising the importance of the research (e.g. ‘your opportunity to directly inform government childcare policy’). Different reminder communications emphasised this altruistic motive as well as personal validation (e.g. “your answers matter” and recognising the time input of respondents) and general adherence (e.g. highlighting the survey as “critical” for policymakers).  
  • ‘Nudges’ for participants without email addresses. A postcard nudge was sent out a couple of weeks after the invitation letter. This was to remind those who may not have acted immediately upon receipt of the invitation letter.

The full range of communications sent to potential respondents was as follows: 

Ministerial letter (from the Minister for Early Education) 

To encourage providers to take part, a letter was sent from the government’s Minister for Early Education to all school- and group-based providers, and to all childminders for whom we did not have an email address. This letter was sent alongside the survey invitation. It highlighted the importance of the survey in providing robust and up-to-date evidence on the childcare and early years sector and emphasised the role of the survey in informing government childcare policy.

Invitation letter 

All providers were sent an invitation letter or email inviting them to take part. These invitations gave full details about the study, including information security. The letters also included details on how to take part online and informed providers that they would hear from an interviewer via telephone if they did not complete the survey online. To help providers prepare to take part, letters enclosed a physical preparation sheet and a physical list of “full and relevant” qualifications relevant to the survey, while emails included links to online versions of both. Providers were given an email address and a contact number that they could use to opt out of the survey and receive no future contact.

See Appendix A for the survey invitation letter sent to providers. 

Preparation sheet 

The preparation sheet helped providers prepare for the telephone interview by listing information that they might need to look up in advance, such as financial details and the qualifications of staff members. These preparation sheets were tailored according to the provider type and the variant of the survey that they would be asked to complete. Providers were encouraged to complete the preparation sheet before their interview. 

Qualifications list 

A list of qualifications was enclosed with the invitation letter and email to help providers identify ‘full and relevant’ qualifications (i.e., all those accepted by the DfE for childcare staff to work in an early years setting and count towards staff:child ratios) and assist them in answering questions relating to the levels of early years or teaching qualifications held by their staff members. [1] 

Reminder emails and texts 

IFF Research sent multiple reminders across modes to ensure the response rate to the survey was as high as possible. Up to four reminder texts, four reminder emails and a reminder postcard were sent to all non-responding providers for whom IFF Research had the relevant contact details. Each reminder stressed a different potential motivation for taking part. The email reminders contained a link to an online version of the preparation sheet and the qualifications list.

Table 4.1. Fieldwork and communication dates

| Communication | Sent out |
|---|---|
| Email invitation | 30th April 2025 |
| Ministerial letter | 1st May 2025 |
| Invitation letter | 1st May 2025 |
| Reminder email and SMS 1 | 21st May 2025 |
| Reminder postcard | 28th May 2025 |
| Reminder email and SMS 2 | 12th June 2025 |
| Reminder email and SMS 3 | 26th June 2025 |
| Reminder email and SMS 4 | 3rd July 2025 |

[1] Early years qualifications achieved in the UK

5. Validation process

A data validation stage, first introduced for the 2022 survey, continued in 2025. This involved identifying improbable responses and re-contacting providers via telephone to confirm the correct response. The aim of this exercise was to reduce the number of records that needed to be edited after survey closure.

A series of ‘validation flags’ on key survey variables were programmed that signalled where findings were outside of expected limits or seemed implausible. Providers were contacted by telephone to flag the issue and confirm correct responses. 

For example, to validate responses to cost and income questions that seemed implausible, a flag was set up to identify providers who had reported total income and cost where their total income was either more than three times or less than half of their reported total cost. IFF then attempted to recontact all 172 providers who met this criterion to ask them to confirm their total income and total costs. Where providers gave new figures, their answers to the survey were overwritten with the newly collected data. Where providers re-confirmed their previous answers, their survey data were not amended.
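Expressed as code, this flag might look like the following sketch. The function name and the handling of unreported totals are illustrative assumptions; only the thresholds come from the description above.

```python
def fails_income_cost_check(total_income: float | None,
                            total_cost: float | None) -> bool:
    """Flag providers whose reported total income is more than three times,
    or less than half, their reported total cost; both must be reported."""
    if not total_income or not total_cost:  # missing or zero totals: handled by other checks
        return False
    ratio = total_income / total_cost
    return ratio > 3 or ratio < 0.5
```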

A summary of the main validation flags and checks undertaken, and the number of records recontacted (whose answers were confirmed or edited as a result), is outlined in Table 5.1 below.

Table 5.1. Main validation checks completed for the 2025 survey

| Validation check | Number failing check | Number of cases recontacted |
|---|---|---|
| “Income to cost ratio” very high or low | 172 | 90 |
| Missing cost component | 310 | 205 |
| More children claiming entitlement than children registered in those age groups | 339 | 199 |
| Illogical booked places vs. registered places | 257 | 152 |
| High numbers of school-aged children attending setting (SBPs) | 114 | 93 |

The validation callback stage ran from 26th June to 14th July 2025. In total the validation flags identified 1,473 potential issues in the data. IFF Research attempted to contact all of the affected providers, and 753 responses were secured: 209 SBPs, 410 GBPs and 134 CMs. In total, 899 implausible question responses were validated (and either edited or confirmed as a result).

6. Weighting

Weighting was used to ensure that the final achieved samples were representative of early years and childcare providers in England by correcting for non-response bias. 

Survey weights were designed separately for the three provider types.[1] The method used for the survey weights was Random Iterative Method (RIM) weighting. This was chosen because it allows for multiple characteristics to be adjusted simultaneously, whilst distorting the overall dataset as little as possible. The technique works by using an iterative process to achieve the ‘best fit’ for the weighted variables. The process starts by applying the weighting factors for the first variable. Once this has been calculated, the weighting is then applied for the second variable. As this second step will likely mean the profile will no longer match the first variable targets, the process then involves iteratively weighting, getting increasingly closer to the targets for each variable with each iteration. This is repeated until the profile is as close as possible to the targets. 

Grossing weights were used to ensure that the weighted data can be used to give estimates relating to the full population of early years and childcare providers in England. In this process each provider represents a certain number of providers in the population, so that calculations will add to the total number of providers in the total population.
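To illustrate the iterative logic described above, the sketch below shows a minimal raking (RIM) loop followed by a grossing step. It assumes a pandas DataFrame with one row per responding provider and target proportions per weighting variable; it illustrates the idea, not the implementation used for the survey (which would include convergence checks and, typically, weight trimming).

```python
import pandas as pd

def rim_weights(df: pd.DataFrame, targets: dict[str, pd.Series],
                n_iter: int = 50) -> pd.Series:
    """Raking / RIM weighting: iteratively scale weights so the weighted
    sample profile matches each variable's target proportions in turn."""
    w = pd.Series(1.0, index=df.index)
    for _ in range(n_iter):
        for var, target in targets.items():
            # current weighted share of each category of this variable
            profile = w.groupby(df[var]).sum() / w.sum()
            adjust = (target / profile).reindex(df[var].to_numpy()).to_numpy()
            w = w * adjust
    return w / w.mean()  # calibration weights scaled to mean 1

def gross_weights(w: pd.Series, population_total: float) -> pd.Series:
    """Grossing: rescale weights to sum to the eligible population total,
    so weighted counts estimate population counts."""
    return w * population_total / w.sum()
```

For example, calibration weights for GBPs would be grossed to the estimated eligible population of 21,718 (see section 6.1).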

[1] Survey weights were designed for the three overall provider types (group-based, school-based and childminders). In the official statistics report there are five provider types reported on separately. However, “private” and “voluntary” are sub-groups of the overall group-based provider type (and were sampled in the same way), and maintained nursery schools and other school-based providers offering nursery provision are sub-groups of the overall school-based provider type (and were sampled in the same way).

6.1 Population totals

The achieved sample of providers was weighted to be representative of all active childcare providers in England that were eligible for the study. During the interviews some group-based providers (GBPs) and school-based providers (SBPs) were found to be ineligible to take part in the survey: those that did not offer nursery provision, those that had closed and those coded as “duplicates”. The total population size used for weighting was calculated by adjusting for those falling into the first group (those that did not offer nursery provision). No adjustment was made for those that had shut down or were coded as duplicates, on the assumption that providers that had opened in the intervening period (and were therefore not included in the sampling frame) would be roughly analogous to those that had subsequently closed.

Some childminders (CMs) were found to be ineligible in that they were no longer offering provision. As in the previous waves, it was decided not to redefine the population and to assume that the population was stable in its characteristics.

As in previous waves, providers with zero registered places according to sample sources were included in the survey sample on the assumption that, at the time of the interview, information about the number of registered places might already have been outdated. However, as in previous waves, responding CMs were weighted to the profile of the population excluding cases with zero registered places in the sample frame.

The total eligible population was estimated at:

  • 21,718 GBPs – 99.2% of the sample frame
  • 10,045 SBPs – 99.1% of the initial sample frame
  • 23,695 CMs – 99.9% of the initial sample frame

6.2 Weighting process

Calibration weighting 

Calibration weighting was used to remove the (measurable) bias introduced through non-response to the SCEYP and align the profile of the achieved sample to the profile of the eligible population. To do this, RIM weight targets were set for:

  • GBPs: region, register type, ownership type, deprivation band based on Income Deprivation Affecting Children Index (IDACI) and quintile of number of places registered.
  • SBPs: school type, region, type of establishment, quintile of number of places registered.
  • CMs: region, register type, registration year, and deprivation band based on IDACI.

Table 6.1 provides an example of the weighting targets compared to the survey results and the weighted profile of the final dataset.

Table 6.1 Example calibration RIM weight targets and final profile (IDACI for GBPs)

| Deprivation band (IDACI) | Weighting targets[1] | Unweighted | Profile after weighting applied |
|---|---|---|---|
| Most deprived | 15.44% | 14.73% | 15.44% |
| Deprived | 18.99% | 18.58% | 18.99% |
| Average / Unknown | 20.35% | 20.51% | 20.35% |
| Less deprived | 21.74% | 23.11% | 21.74% |
| Least deprived | 23.48% | 23.06% | 23.48% |

Questionnaire variant weights

Questionnaire variant weights were calculated for the analysis of providers allocated to each of the two variants of the questionnaire for each provider type. This involved applying the calibration weights to providers asked each variant. It was at this stage that providers were assigned a grossing weight. 

Day of the week weights 

Weights were also calculated for analysis relating to provision on each day of the week. Providers were allocated systematically within each variant of the questionnaire to a specific day of the week (Monday to Friday), about which they were asked certain questions. However, providers were able to complete the survey about a different day if they chose. The questionnaire included questions ascertaining which days of the week each provider was open. The profile and the overall population size for each day were estimated from the SCEYP (weighted by overall interview weight). Subsamples of providers asked about each day were then calibrated to the profile of providers open each day.

Staff weights 

In addition to calculating weights to make the sample representative of providers in England, weights for analysis of information on staff in such settings were also calculated. Staff weights were needed to account for the fact that GBPs and SBPs employing more than a certain number of staff at a given qualification level were only asked to give information about one or two members of their team at different levels, chosen at random. The provider-level interview weights were multiplied by the number of staff that provider had at each qualification level to obtain staff level weights for each case. Additional calibration was applied to ensure that the weighted staff profile (by level of education) was in line with the data collected about the number of staff employed with each qualification level.

CMs were also asked several follow-up questions about the assistants they employ. There were up to two loops in the questionnaire for assistants, so those with more than two assistants were asked to answer about two chosen at random. The provider-level interview weights were multiplied by the number of assistants the CM had to obtain staff-level weights for each case.
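The description above implies that each sampled staff member carries the provider’s interview weight scaled up by the number of staff they represent. The sketch below reflects that reading; the explicit division by the number of staff actually sampled at the level is our assumption rather than a documented formula.

```python
def staff_weight(provider_weight: float,
                 n_staff_at_level: int,
                 n_sampled_at_level: int) -> float:
    """Weight for one sampled staff member, so that the one or two members
    sampled at a qualification level represent all staff at that level."""
    return provider_weight * n_staff_at_level / n_sampled_at_level

# e.g. a provider with weight 2.5 and 6 Level 3 staff, 2 of whom were sampled:
# each sampled member carries weight 2.5 * 6 / 2 = 7.5
```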

Some data were excluded from the staff-level datasets prior to weighting. For GBPs and SBPs this included cases where information on the staff member was not provided or the respondent reported an overseas qualification. GBPs and SBPs were also excluded if the number of paid staff with recorded qualifications differed by more than ±2 from the total number of paid staff reported or if they gave a qualification level in the detailed questions which did not align with the qualification level of staff they were asked to provide information for. 

The final staff level weight was re-scaled so that the total weighted number of staff/assistants was in line with an estimate of the total number of staff/assistants.
 

[1] Based on the population after excluding ineligible providers.

7. Response rates

7.1 SCEYP Response

The SCEYP saw 9,358 productive interviews in total. This included:

  • 3,924 interviews with group-based providers (GBP);
    • Private (for profit) companies (including employer-run childcare for employees) (2,498 interviews, 64%)
    • Voluntary, community group or charity (including church(es) or religious group(s)) (1,220 interviews, 31%)
    • School/college/LA/unclassified GBPs (110 interviews, 3%)
    • Other GBPs (88 interviews, 2%)[1]
  • 1,524 interviews with school-based providers (SBP);
    • Maintained nursery schools (MNS) (92 interviews, 6%)
    • Other schools offering nursery provision (1,432 interviews, 94%)
  • 3,910 interviews with childminders.

Table 7.1 shows the outcomes achieved for each provider type. 

Table 7.1 Outcomes by provider type

| | Group-based providers | School-based providers | Childminders | Total |
|---|---|---|---|---|
| Issued sample | 21,892 | 10,089 | 23,968 | 55,949 |
| Received postal invite[2] | 21,892 | 10,089 | 2,280 | 34,261 |
| Received email invite[3] | 9,633 | 10,063 | 21,688 | 41,384 |
| Attempted contact by telephone[4] | 21,818 | 10,074 | 23,600 | 55,492 |
| Completes | 3,924 | 1,524 | 3,910 | 9,358 |
| Bad number[5] | 1,163 | 52 | 3,452 | 4,667 |
| Others unable to contact by telephone[6] | 10,178 | 4,965 | 13,145 | 28,288 |
| Total unable to contact by telephone | 11,341 | 5,017 | 16,597 | 32,955 |
| Unavailable[7] | 4,599 | 3,147 | 1,820 | 9,566 |
| Refused[8] | 1,524 | 303 | 1,034 | 2,861 |
| Total unproductive telephone contacts | 6,123 | 3,450 | 2,854 | 12,427 |
| Permanently closed[9] | 78 | 28 | 371 | 477 |
| Temporarily closed (for longer than 30 days) | 13 | 10 | 158 | 181 |
| Open but not offering relevant childcare | 413 | 60 | 79 | 552 |
| Total ineligible | 504 | 98 | 608 | 1,210 |



[1] This includes those run by, for instance, “private not-for-profit” organisations but excludes the 8 who answered either ‘Don’t know’ or ‘Prefer not to say’ when asked about their ownership type.

[2] Postal invitations were sent to all SBPs and GBPs, and to all CMs where no email address could be found on sample sources.

[3] Email invitations were sent to all providers where an email address could be found on sample sources.

[4] Postal and email invites were sent out prior to commencing telephone calling, so this includes all providers that had not yet completed the survey before telephone calling started.

[5] ‘Bad number’ includes all those who had an invalid telephone number or informed us we had the wrong telephone number when we called.

[6] This includes other providers we did not have any telephone contact with including where there was no answer, our call was rejected, or the call went to answer machine. 

[7] ‘Unavailable’ includes all those who were busy when we called them including those who said they would be unavailable throughout the whole fieldwork period and instances where we spoke to someone at the setting but, despite repeated contact attempts, the person we asked to speak to was not available. 

[8] ‘Refused’ includes all those who opted out of the research or who said they did not want to participate when we called.

[9] Providers who had closed after the date the sample sources were last updated.

Table 7.2 shows overall response rates by survey year. When comparing between waves it is important to note that there was a change in approach to sampling for the 2025 survey. In previous years, for the core SCEYP, a random sample of GBPs and CMs were selected for the core survey (and the rest were allocated to a short survey). However, in 2025, all providers were asked to complete the core survey. This was to maximise the analysis potential of the survey. 

In 2025, while overall response rates for GBPs and SBPs declined, among childminders the response rate remained consistent with 2024. A possible reason for the lower response rates is an increase in the intensity of research being carried out among providers in 2025 due to the large policy changes impacting the sector. For example, the May SCEYP Pulse survey (part of a series of online surveys designed to complement the main SCEYP by providing more frequent, up-to-date findings on topical issues) was in the field at the same time as the main SCEYP survey, as was fieldwork for the Children of the 2020s study.

Table 7.2 Provider response rates to the SCEYP survey

| Year | Response rates | Group-based providers | School-based providers | Childminders |
|---|---|---|---|---|
| 2022 | Achieved (n) | 6,338 | 2,476 | 1,604 |
| 2022 | Overall response rate | 42% | 36% | 32% |
| 2023 | Achieved (n) | 5,529 | 2,079 | 1,482 |
| 2023 | Overall response rate | 38% | 32% | 26% |
| 2024 | Achieved (n) | 5,042 | 1,702 | 1,366 |
| 2024 | Overall response rate | 25% | 21% | 16% |
| 2025 | Achieved (n) | 3,924 | 1,524 | 3,910 |
| 2025 | Overall response rate | 18% | 15% | 17% |

 

As shown in Table 7.3, interviews were skewed slightly towards telephone (57% vs. 43% web), in line with 2024. This was consistent among group-based providers and school-based providers; however, the split was more even among childminders.

Table 7.3 Response mode

| Mode | Group-based providers (% / n) | School-based providers (% / n) | Childminders (% / n) | Total (% / n) |
|---|---|---|---|---|
| Telephone | 59% / 2,327 | 66% / 1,009 | 50% / 1,973 | 57% / 5,309 |
| Web | 41% / 1,597 | 34% / 515 | 50% / 1,937 | 43% / 4,049 |

7.2 Interview length

Average telephone interview lengths and web interview lengths for different provider types and variants are provided below in Table 7.4 and Table 7.5. These are based on interviews that were completed within an hour on the same day; any interviews longer than one hour were excluded from the analysis of interview lengths as the timings for these outlier interviews are likely to be inaccurate. 

Interview lengths were shorter on average for childminders than for school-based or group-based providers. This is because childminders were asked fewer questions than other providers (e.g. they were not asked questions about staff-to-child ratios).

Table 7.4 Average (mean) telephone interview length by provider type and questionnaire variant

| Provider type | Variant | 2025 average telephone interview length (minutes) |
|---|---|---|
| Group-based providers | 1 | 41 |
| Group-based providers | 2 | 41 |
| School-based providers | 1 | 36 |
| School-based providers | 2 | 35 |
| Childminders | 1 | 20 |
| Childminders | 2 | 29 |

Table 7.5 Average (mean) web interview length by provider type and questionnaire variant

| Provider type | Variant | 2025 average web interview length (minutes)[1] |
|---|---|---|
| Group-based providers | 1 | 35 |
| Group-based providers | 2 | 37 |
| School-based providers | 1 | 33 |
| School-based providers | 2 | 31 |
| Childminders | 1 | 17 |
| Childminders | 2 | 26 |

[1] The average online survey time is calculated using the times that a respondent enters and leaves the survey after completing. It cannot account for periods of time when the respondent may have taken a short break, however, we have accounted for outliers by removing anyone who took longer than 1 hour to complete the survey.

8. Data processing

8.1 Coding 

A small number of questions in the survey offered the respondent the option to choose “other” as their response code and to enter free text into the designated field. These were subsequently coded by trained coders at IFF Research. Code frames from previous iterations of the survey were used for existing questions and no new codes were produced for the 2025 survey.

8.2 Deriving variables

Some variables in the final dataset were derived by combining responses from multiple survey variables. For example, in the staff loop section, respondents were asked how many hours per week and how many weeks per year their staff were contracted to work. They were then given the choice to provide an answer about pay for each member of staff as either an annual, weekly or hourly salary. Hourly pay was then calculated for each member of staff based on these answers. Where the reported rate of pay was zero for staff, the derived variable was set to missing (as the analysis was interested in paid staff).
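To illustrate the derivation described above, a minimal sketch follows. The function name and basis labels are illustrative; the guard conditions reflect the rule that zero pay is set to missing.

```python
def derive_hourly_pay(amount: float, basis: str,
                      hours_per_week: float, weeks_per_year: float) -> float | None:
    """Convert pay reported as annual, weekly or hourly to an hourly rate
    using contracted hours; zero pay becomes missing (paid staff only)."""
    if amount == 0 or hours_per_week == 0:
        return None  # unpaid or no contracted hours: excluded from analysis
    if basis == "hourly":
        return amount
    if basis == "weekly":
        return amount / hours_per_week
    if basis == "annual" and weeks_per_year > 0:
        return amount / (hours_per_week * weeks_per_year)
    return None
```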

Another derived variable produced for analysis was Flag04. The SCEYP sample includes some providers that are registered as early years providers but who told us on the survey that they do not currently look after any pre-school children aged 0 to 4. In 2025, 294 providers (21 SBPs, 49 GBPs and 224 CMs) who were contacted told us they were open but did not offer relevant childcare. These providers were identified using the derived Flag04 variable and excluded from all analysis reported in the DfE publications because the focus of the survey is on childcare for children under school age.

8.3 Data cleaning

Numeric information collected as part of the SCEYP was typed in by respondents themselves (online) or keyed in by interviewers (telephone). It was possible for mis-keying to occur and, on inspection, some answers appeared much too large, much too small, or nonsensical given other survey answers. Therefore, rules were implemented whereby certain values would be removed from, or edited in, the data where doing so improved the quality of the data produced. Decisions around these rules were informed by observing the distribution of answers at specific questions alongside other contextual information.

We reviewed any records that failed the data validation flags (and which we were not able to re-contact to check the suspect data). We also undertook a review of the raw data file to identify all outliers and unlikely responses to key variables (e.g. registered places, booked places, ratios, incomes, costs, salaries). These identified outliers were compared against answers given within the same record (e.g. number of staff) to inform a recommendation for treatment. Some were addressed through ‘editing rules’ that could be applied as a formula (a sketch of such formulaic edits follows the list below), but at this stage most were ‘case-by-case’ edits.

Where an obvious data entry error had been made (e.g. an additional ‘0’ added to a salary figure given), we edited the data to ensure it was still usable in analysis. Where data looked incorrect, but no obvious data entry error was identified, we blanked the variable for that record. Their answers to other questions (unless there were additional data validation issues raised) were retained for the analysis. These edits were agreed with DfE before being implemented in the final data file.

Data were recoded to “missing” in the following scenarios:

  • Extreme differences between total staff and registered places: 14 records
  • Instances where providers recorded 0 staff members: 6 records
  • Number of temporary staff or apprentices are significantly higher than the number of paid staff: 2 records
  • Where the number of staff with a particular Level 6 qualification is greater than the total number of staff with any Level 6 qualification: 17 records
  • Extreme staff:child ratios of greater than 1:10 (for children under 3) or 1:20 (for children aged 3-4), or improbable decimal answers between 1 and 2 to the staff:child ratio questions: 83 records
  • Where the reported number of children with Special Educational Needs or Disabilities was greater than the total number of registered children: 24 records
  • Reported salaries of £0, or extreme salaries (such as annual salaries reported as £13.80, or hourly salaries reported as £1,000): 275 records
  • Number of children paying fees using Tax Free Childcare is higher than total number of children registered: 58 records
  • Number of children booked for the full-day, plus the number of extra children providers could have taken for a full-day is more than the total number of children their Ofsted registration allows them to look after: 143 records
  • Reported income or cost total is 0: 615 records
  • Extreme income-to-cost ratios (e.g., annual income reported as £11, and annual cost reported as £18,000): 28 records
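As a minimal sketch, two of the rules above could be applied as formulaic edits along the following lines. The column names are hypothetical stand-ins for the actual dataset variables.

```python
import numpy as np
import pandas as pd

def apply_editing_rules(df: pd.DataFrame) -> pd.DataFrame:
    """Recode implausible values to missing for two of the rules listed above
    (illustrative column names)."""
    out = df.copy()
    # Rule: more SEND children than registered children -> SEND count to missing
    bad_send = out["n_send_children"] > out["n_registered_children"]
    out.loc[bad_send, "n_send_children"] = np.nan
    # Rule: reported income or cost total of 0 -> treat as not reported
    for col in ("total_income", "total_cost"):
        out.loc[out[col] == 0, col] = np.nan
    return out
```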

8.4 Data imputation

In 2025, a new method was developed for imputing values for missing data at key questions. Prior to 2025, missing responses were excluded from figures, effectively treating them as zero. This implicit assumption led to the underestimation of total figures as providers with missing data had been assigned a weight. If the proportion of missing data varies between years, the extent to which totals had been underestimated also varies, reducing comparability over time. 

To generate imputed values, weighted medians were calculated using all valid responses within each provider type:

  • Nursery class childcare settings 
  • Maintained nursery schools
  • Private group-based providers
  • Voluntary group-based providers
  • Childminders

The weighted median by provider type is a more appropriate value to use than the weighted mean because the distribution of responses to the relevant questions is skewed. In cases where the data are not normally distributed the median provides a more accurate measure of the “typical” provider. The unweighted base for each variable only includes providers that gave a valid response and excludes the providers for which values were imputed.
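A minimal sketch of the weighted-median imputation is shown below. The column names (provider_type, weight) are illustrative stand-ins for the actual dataset variables.

```python
import pandas as pd

def weighted_median(values: pd.Series, weights: pd.Series) -> float:
    """Median where each valid observation counts in proportion to its weight."""
    ordered = values[values.notna()].sort_values()
    cum = weights[ordered.index].cumsum()
    cutoff = weights[ordered.index].sum() / 2
    return float(ordered[cum >= cutoff].iloc[0])

def impute_column(df: pd.DataFrame, col: str) -> pd.Series:
    """Fill missing values of `col` with the weighted median of valid
    responses from providers of the same type."""
    filled = df[col].copy()
    for _, grp in df.groupby("provider_type"):
        med = weighted_median(grp[col], grp["weight"])
        filled.loc[grp.index] = filled.loc[grp.index].fillna(med)
    return filled
```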

The new imputation approach was applied to the following variables:

  • Registered places for full day, before school, and after school provision
  • Booked places for full day, morning, and afternoon provision

The method was also considered for other key variables, such as the number of paid staff. However, where missing data were minimal and similar across years, imputation was not applied. 

Annexes

Appendix A: SCEYP Survey invitation letter
