Methodology

Childcare and early years provider survey

1. Introduction

The Survey of Childcare and Early Years Providers (SCEYP) is important for monitoring and evaluating childcare and early years provision. Covering group-based providers (private, voluntary and school/college/LA/other unclassified[1]), school-based providers (those offering nursery provision and maintained nursery schools) and childminders, it provides a representative snapshot of early years provision in England. It helps the Government, including the Department for Education (DfE), understand the issues that providers face, informing development of early years and childcare policy. 

The survey has been running since 1998, with the latest 2023 iteration carried out by IFF Research and London Economics. There are two versions:  

  • The main SCEYP 2023: a large mixed-mode survey, which can be completed either online or on the telephone. Topics covered include staff-to-child ratios, delivery of Government policies including funded entitlements and Tax-Free Childcare (TFC), children with Special Educational Needs and Disabilities (SEND), staff qualifications and pay, and the financial side of providing childcare such as costs, incomes and fees. This year, new questions were also included on staff leaving the workforce. There are also a number of core questions, such as the number of children registered at the setting. 
  • The short SCEYP: delivered online or by post, this is a set of core questions from the main survey, focusing on fees and Government funding received, as well as areas such as the number of children registered at the setting. The purpose of this short survey is to allow fee and funding estimates to be calculated at a local authority (LA) level by aggregating data collected across this and the main SCEYP. 

[1] All group-based providers that are not private or voluntary have been recorded in this third category; most are school-run, college-run or local authority-run group-based providers. Although this category is included in “all group-based provider” totals, it is not shown separately in most of the tables in the official statistics publication. 

1.1 The SCEYP time series 

SCEYP has undergone some changes in scope and design which mean that comparisons between different survey years should be approached with caution: 

  • The questionnaire was fully redesigned in 2016 to reflect changes in the childcare market. These changes to the design meant that the 2016 data is not directly comparable with previous years of the survey; the 2016 data was intended to provide a new and enhanced baseline. There were some changes to the 2018 survey to improve the quality of data and to reflect new policy priorities, but otherwise it was designed to be comparable with the 2016 survey. 
  • From 2019, and following a user consultation on the frequency, sampling approach, mode and topic coverage of the SCEYP[2], DfE decided to intensify the early years focus of the survey by sampling and recruiting only providers for pre-school children aged 0-4 years. Previous surveys had included reception provision as well as wraparound and holiday care for school-aged children aged up to seven. This means that figures published before and after 2019 are not comparable for “all school-based providers” and “all providers”. Any trend analysis within publications will have made adjustments to ensure figures from different years are comparable.
  • In 2023, following consultation with DfE and through cognitive testing with providers, the cost and income element of the survey was re-designed. This was done to improve respondent experience as well as to collect more reliable data.

The SCEYP sample includes some providers registered as early years providers but not currently looking after any children aged 0 to 4. London Economics excluded these providers from all analysis reported in the DfE publications. 

[2] Department for Education (2018) Surveys on childcare and early years in England: Government consultation response. London: DfE. 

1.2 Temporarily closed settings

Fieldwork for the 2023 survey took place between April and July 2023. 

To try to obtain as accurate a picture as possible of available childcare provision in 2023, providers were invited to complete the survey if they had been open and offering relevant childcare in the last 30 days, even if their setting was temporarily closed at the time of interview. Settings that were open were asked to think about their current situation, whilst settings temporarily closed at the time of interview were asked to answer with reference to the last term-time day on which they were open.

Of the 15,828 providers that responded, 22 had been closed for fewer than 30 days prior to interview and remained eligible, while 88 had been closed for more than 30 days; the latter 88 were screened out.

2. Sampling

As in previous waves of the Survey of Childcare and Early Years Providers (SCEYP), the study comprised a representative sample of three distinct provider populations in England, each of which was analysed separately:

  • Group-based providers (GBP): childcare providers registered with Ofsted and operating in non-domestic premises;
  • School-based providers (SBP): nursery provision in schools and maintained nursery schools;
  • Childminders: Ofsted-registered childminders providing early years care and operating in domestic settings (excluding providers solely on the voluntary register).

The number of records issued for each provider type drew upon the response rates of the 2022 survey to determine how many records would be needed to hit response targets. For GBPs and SBPs, this meant using all of the available sample.[3] For childminders, not all of the available sample was used; once the sample for the main survey was drawn, using the response rates from 2022 and the target responses, a sub-sample of records was drawn for the short survey from 16 larger local authorities to ensure sufficient base sizes for analysis. For the remaining local authorities, all records were sampled. 
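For illustration only, the sketch below (Python, used for all code sketches in this note) shows how a response target and an assumed prior-wave response rate translate into an issued sample size. It is not the contractors' code; the function name is ours, and the worked figures reuse the childminder target (2,000 responses) and assumed 34% response rate quoted in section 2.3.

```python
import math

def issued_sample_size(target_responses, expected_response_rate):
    """Estimate how many records need to be issued to hit a response target,
    assuming the previous wave's response rate holds."""
    return math.ceil(target_responses / expected_response_rate)

# Childminder figures quoted in section 2.3: a target of 2,000 responses at an
# assumed 34% response rate implies roughly 5,883 issued records.
print(issued_sample_size(2000, 0.34))  # -> 5883
```

In practice the issued figures were adjusted further (6,059 childminder records were ultimately selected), so this shows the underlying logic rather than the exact calculation.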

In the main Official Statistics publication for SCEYP 2023, figures were reported for five sub-groups:

  • Private GBPs;
  • Voluntary GBPs;
  • Nursery class childcare settings;
  • Maintained nursery schools (MNS); and
  • Childminders.

This chapter describes the sampling approaches adopted for both the main SCEYP and the short SCEYP surveys.

[3] After the sample for the main survey was compiled, all remaining records were assigned to the short version of the survey.

2.1 Group-based providers sample

GBPs are childcare providers registered with Ofsted and operating in non-domestic premises.[4] The survey collected further details about the ownership and type of childcare offered.

A request was made to Ofsted to provide information for all records in the “childcare on non-domestic premises” category (excluding those classified as inactive and those appearing on the Voluntary Childcare Register (VCR) only) from their July 2022 snapshot database. The sample was based on records from July 2022 to be consistent with the timing of sampling for previous years’ surveys. The data was provided in two stages, with stage 1 providing the variables necessary for drawing the sample and stage 2 containing provider contact details (address, telephone numbers and e-mail address). At stage 1, 24,404 records were provided.

Prior to distributing cases between the main and short surveys, records were checked for duplicates, settings that were known to have closed, or settings identified as only offering holiday care or leisure activities. In total, 1,033 records were removed from the sample.

The remaining 23,371 records were used to draw a sample for the main survey. As in previous years, a stratified random sampling approach was used. Records were selected in proportion to the regional distribution of records and stratified by local authority (LA), register type, ownership status, deprivation band based on the Income Deprivation Affecting Children Index (IDACI), and postcode. As in previous years, it was decided to oversample the North East to achieve sufficient responses for this region to be analysed alongside others. A total of 15,120 records were selected for the main survey (based on the previous year’s response rate and a target of 6,200 responses). 

The selected sample was then allocated systematically (using the same stratification variables) to one of three questionnaire variants (a 16/42/42% split[5]) and, within each variant, to a specific day of the week about which to answer certain questions. This was done to allow analysis of trends in attendance and capacity that vary by weekday. All GBPs that remained in the sample frame after the main survey sample was drawn were treated as eligible for the short survey (8,251 records).
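The sketch below illustrates the general shape of this selection and allocation process: proportionate stratified random sampling with an oversampled region, followed by systematic allocation of the ordered sample to questionnaire variants and reference weekdays. It is a simplified illustration only; the field names, toy sampling fraction and oversampling factor are hypothetical, and the real sample was stratified by the full set of variables listed above.

```python
import random
from itertools import cycle

def stratified_sample(frame, stratum_of, base_fraction, oversample=None):
    """Select records within each stratum in proportion to its size; strata
    named in `oversample` get an inflated sampling fraction."""
    oversample = oversample or {}
    by_stratum = {}
    for rec in frame:
        by_stratum.setdefault(stratum_of(rec), []).append(rec)
    selected = []
    for stratum, records in by_stratum.items():
        frac = min(1.0, base_fraction * oversample.get(stratum, 1.0))
        selected.extend(random.sample(records, round(frac * len(records))))
    return selected

def allocate(sorted_sample, variant_shares, weekdays=("Mon", "Tue", "Wed", "Thu", "Fri")):
    """Deal the stratification-ordered sample out to questionnaire variants in
    fixed shares, cycling reference weekdays within each variant."""
    pattern = [v for v, share in variant_shares.items() for _ in range(round(share * 100))]
    variant_cycle = cycle(pattern)
    day_cycles = {v: cycle(weekdays) for v in variant_shares}
    allocated = []
    for rec in sorted_sample:
        variant = next(variant_cycle)
        allocated.append({**rec, "variant": variant, "ref_day": next(day_cycles[variant])})
    return allocated

# Toy frame with hypothetical fields; the real frame was stratified by LA,
# register type, ownership, IDACI band and postcode as well as region.
frame = [{"id": i, "region": random.choice(["North East", "London", "South West"])}
         for i in range(1000)]
sample = stratified_sample(frame, lambda r: r["region"], base_fraction=0.6,
                           oversample={"North East": 1.3})  # oversample the North East
sample.sort(key=lambda r: (r["region"], r["id"]))            # order by stratification variables
allocated = allocate(sample, {"v1": 0.16, "v2": 0.42, "v3": 0.42})
print(len(allocated), allocated[0])
```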

Table 2.1 Number of group-based providers issued to each survey element

| Survey element | N |
| --- | --- |
| Main SCEYP | 15,120[6] |
| Short SCEYP | 8,251 |
| Not issued to mainstage fieldwork | 1,033 |
| Total | 24,404 |

[4] This does not include provision in schools.

[5] The majority of providers were split between the staffing (v2) and finance (v3) variants which were of most interest analytically. A smaller proportion were allocated to variant 1 which focused on SEND. The same split has been in place since 2016.

[6] 100 records were assigned to ‘pilot/cognitive testing’ status. They were re-sampled for the main SCEYP.

2.2 School-based providers sample

The sample of SBPs consisted of maintained and independent primary schools with nursery and reception provision, including maintained nursery schools (MNS). Although schools offering reception provision alongside nursery provision were eligible for the study, they were only asked about their nursery provision. This sample was drawn from the Schools Census, collected in November 2021 from all schools in England. The sample was complemented with further information from the Get Information About Schools (GIAS)[7] extract. The types of schools eligible for the study were identified using the following criteria:

  • Independent schools with reception and nursery provision: open independent schools with statutory lowest age equal to 0, 1, 2 or 3 years (according to School Census) and offering nursery classes (according to GIAS);
  • Maintained nursery schools: open state-funded nursery schools (according to School Census);
  • Maintained schools with reception and nursery provision: open state-funded mainstream and special schools which offer nursery classes (which have at least one pupil in nursery and at least one pupil in reception).

A total of 10,049 eligible school-based providers were identified.[8] After the sampling frame was checked for duplicate records and known or planned closures, 9,956 records were included in the sample.

To decide the required issued sample sizes, IFF assumed the same response rates as in the SCEYP 2022 main sample (see Table 2.2) and used these to estimate the number of surveys that would need to be issued to achieve the target of 2,542 total responses.

Table 2.2 Response rate assumptions for school-based providers by type

| School type | Main survey | Short survey |
| --- | --- | --- |
| Independent schools with reception and nursery provision | 36% | 25% |
| Maintained nursery schools | 63% | - |
| Maintained schools with reception and nursery provision | 36% | 25% |

A stratified random sampling approach was used to select the samples of school-based providers. Table 2.3 presents the number of cases sampled for the main survey and the short survey. As in previous years, all maintained nursery schools were included in the main survey sample to ensure sufficient interviews for regional analysis. The samples were stratified by school type (disproportionate sampling), region and IDACI score to ensure geographical representativeness of the sample. After stratification, there were 6,836 schools eligible for the main survey. The samples were then allocated systematically (using the same stratification variables) to one of two questionnaire variants (a 50/50% split) and, within each variant, to a specific day of the week about which to answer questions. This was done to allow analysis of trends in attendance and capacity that vary by weekday.

The remaining 3,120[9] schools in the sample frame were allocated to the short survey. These were only maintained and independent schools offering both reception and nursery.

Table 2.3. Number of school-based providers issued to each survey element

| Survey element | Issued (n) |
| --- | --- |
| Main SCEYP | 6,836 |
| Short SCEYP | 3,120 |
| Not issued to mainstage fieldwork | 93 |
| Total | 10,049 |

[7] Get Information About Schools (GIAS) is a register of educational establishments in England and Wales. It is maintained by the Department for Education.

[8] The increase in initial sample since 2022 was caused mainly by schools moving from ‘No Nursery Classes’ classification to ‘Has Nursery Classes’ on Get Information About Schools (GIAS).

[9] 60 schools were assigned ‘pilot/cognitive testing’ status.

2.3 Childminders sample

The sample of childminders was drawn from the July 2022 monthly snapshot of the Ofsted register of childminders.[10] Only those that confirmed they were still practising as childminders at the time of interview were eligible to be surveyed.

A request was made to Ofsted to provide information for all records of active childminders (excluding those appearing on the VCR only) from their July 2022 monthly snapshot database. The data was provided in two stages, with stage 1 providing the variables necessary for drawing the sample, and stage 2 containing the provider name, address and contact details (telephone numbers, e-mail address) for sampled childminders. After cleaning to remove childminders known to have closed, the stage 1 database included 29,990 active childminders.[11] This database was used to select samples for all survey components.

Based on a target response rate of 34% for the main survey, and a target of 2,000 responses, IFF selected 6,059 childminders for the main survey sample. 

As in previous years of SCEYP, the design of the short survey sample was driven by the need to provide estimates at LA level. Therefore, IFF estimated how many interviews including data on fees would be achieved through the main survey for each LA and then ‘topped up’ the sample with the aim of achieving either 80 or 100 interviews in each LA (across the short and main surveys combined). If there were more than 600 childminders in the sample frame for an LA, the target was 100; if there were fewer than 600, the target was 80. In some LAs there were too few childminders in the population to achieve 80 interviews and in these cases all records were drawn. To calculate the volume of top-up records needed per LA, a response rate of 20% was assumed for the short survey in every region except London, where a response rate of 15% was assumed based on SCEYP 2021. A total of 26,320 childminders were sampled. 
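A minimal sketch of this top-up logic follows; the function name and the worked LA figures are hypothetical, while the 80/100 interview targets and the 20% (15% in London) assumed response rates are those quoted in the paragraph above.

```python
import math

def short_survey_topup(la_frame_size, expected_main_fee_interviews, london=False):
    """Top-up records to issue to the short survey for one LA: a target of 100
    interviews if the LA frame holds more than 600 childminders, otherwise 80,
    assuming a 20% short-survey response rate (15% in London)."""
    target = 100 if la_frame_size > 600 else 80
    shortfall = max(0, target - expected_main_fee_interviews)
    assumed_response_rate = 0.15 if london else 0.20
    needed = math.ceil(shortfall / assumed_response_rate)
    # Cannot issue more records than exist in the LA's frame.
    return min(needed, la_frame_size)

# Hypothetical LA: 750 childminders in the frame and 35 fee interviews expected
# from the main survey, so the interview target is 100.
print(short_survey_topup(750, 35))  # -> 325 top-up records
```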

A stratified random sampling approach was used to select both samples. Smaller regions were oversampled for the main survey to enable reporting by region and questionnaire variants. The sample was further stratified by LA, register type, time elapsed since registration and IDACI score, and then checked to ensure that the selection was representative of the entire population of childminders.

The selected sample was then allocated systematically (using the same stratification variables) to one of two questionnaire variants (a 50/50% split) and, within each variant, to a specific day of the week about which to answer questions. This was done to allow analysis of trends in attendance and capacity that vary by weekday. The sample for the short survey was stratified by the same set of variables (explicitly stratifying by LA), and then allocated to a specific day of the week about which to answer certain questions.

Table 2.4 Number of childminders issued to each survey element

| Survey element | Issued (n) |
| --- | --- |
| Main SCEYP | 6,059 |
| Short SCEYP | 20,261 |

[10] Childminders must register with Ofsted or a childminder agency. Only those that registered with Ofsted were included in the SCEYP sample frame.

[11] This continued the trend in reduction of the size of the sampling frame seen in previous years of the survey. In 2022, there were 33,800 active childminders to sample from.

3. Questionnaire

3.1 Main SCEYP: Questionnaire development 

Much of the content of the 2023 questionnaire mirrored that of the 2022 survey in order to maintain the time series. However, some new questions were introduced this year to cover topics of current interest to the Department for Education, and some of the questions asked in 2022, for instance about the impact of Covid-19 on the setting, were removed. Moreover, some changes were introduced, particularly to the cost and income section, in order to improve the overall quality of the data collection as well as the respondent experience. 

3.1.1 Cognitive testing  

The cognitive testing phase was a relatively small-scale exercise focused on questions new to the survey in 2023. These covered:  

  • Clarifying the language around ‘full and relevant’ qualifications when the respondent is asked about the number of paid staff at their setting, and clarifying whether those with Early Years degrees had any other type of Level 6 qualification;
  • Testing changes to the questions around attendance and capacity;
  • The number of paid childcare staff that had left their employment, their tenure and highest qualification;
  • Knowledge of how parents pay for their childcare using Tax-Free Childcare;
  • Knowledge of the childcare element of Universal Credit;
  • Testing the redesign of the finances section of the survey. The new design used a ‘totaliser’ function: after answering questions on the total cost/income of running their setting, respondents were asked to provide a breakdown of the various components of their costs/income. For online respondents, the cumulative total of the component amounts was shown on screen as they answered each one. This was designed to reduce the arithmetical burden on the respondent and improve accuracy (a minimal sketch of this logic follows the list). 
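The console sketch below illustrates the totaliser idea only; the survey used a scripted web/CATI instrument, not this code, and the example cost categories are hypothetical.

```python
def totaliser(total_label, component_labels):
    """Console sketch of the 'totaliser' idea: after the respondent gives an
    overall total, a running sum is shown as each component is entered, so they
    can see how much of the total remains unaccounted for."""
    total = float(input(f"Total {total_label}: "))
    running = 0.0
    for label in component_labels:
        running += float(input(f"  {label}: "))
        print(f"  entered so far: {running:,.2f} of {total:,.2f} "
              f"({total - running:,.2f} remaining)")
    if abs(total - running) > 0.005 * total:
        print("Note: the components differ from the stated total; please check your figures.")

# Hypothetical cost categories, for illustration only:
# totaliser("annual costs (£)", ["Staff costs", "Premises", "Food and materials", "Other"])
```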

A handful of amendments were agreed with DfE following the cognitive testing report, with small adjustments made and the addition of clarification text in places. 

3.2 Main SCEYP: Questionnaire sections 

Table 3.1 sets out the questionnaire structure used in the main SCEYP 2023. As in previous years, the questionnaire was split into variants in order to reduce the time burden of completing it, while still ensuring a large enough response to enable detailed analysis. 

The core questionnaire for each provider type asked a series of contextual questions as well as covering capacity and attendance. 

GBPs answered the core questionnaire and one of three provider-specific variants. The content of these variants consisted of questions on either funded hours and expanding provision (variant 1), staffing (variant 2) or finances (variant 3). 

SBPs and childminders answered the core questionnaire and one of two provider-specific variants. One variant asked about staffing and children with SEND (variant 1), while the other comprised financial questions (variant 2).  

Table 3.1. Main SCEYP questionnaire sections

| Questionnaire sections | School-based providers | Group-based providers | Childminders |
| --- | --- | --- | --- |
| Provider Background | Core | Core | Core |
| Attendance and Capacity | Core | Core | Core |
| Funded hours - delivery | Core | Core | Core |
| Funded hours - flexibility | Variant 2 | Variant 1 | Variant 1 |
| Staff-to-child ratios | Variant 2 | Variant 1+3 | N/A |
| Children with SEND | Core | Variant 1+3 | Core |
| Access to SENCO (Special Educational Needs Coordinator) | Variant 1 | Variant 1 | Variant 1 |
| Provider level staff questions* | Core / Variant 1 | Core / Variant 2 | Core / Variant 1 |
| Looped staff questions | Variant 1 | Variant 2 | Variant 1 |
| Fees and Free Entitlements | Core | Variant 1+3 | Variant 2 |
| TFC and Universal Credit | Core | Core | Core |
| Costs of childcare | Variant 2 | Variant 3 | Variant 2 |
| Income from childcare | Variant 2 | Variant 3 | Variant 2 |
| Additional services | Core | Core | N/A |

         *Some questions in these sections were core and asked of all providers; others were allocated to a specific variant. 

4. Survey fieldwork

The 2023 main Survey of Childcare and Early Years Providers (SCEYP) was conducted as a mixed-mode survey, which could be completed either online (via Computer Assisted Web Interviewing, also known as “CAWI”) or on the telephone (Computer Assisted Telephone Interviewing, also known as “CATI”). Both the main and short surveys were conducted by IFF Research. Data for the mixed-mode short SCEYP was collected online and using a paper questionnaire. 

4.1 Main SCEYP: Interviewer training and project briefings 

Before starting work on the survey, all interviewers were required to attend a project briefing on the survey. Briefings were carried out using video teleconferencing and covered: 

  • Purpose of the study and how the data are used; 
  • Questionnaire content; 
  • Study procedures, including initial contact with respondents and reminder communications; 
  • Motivating respondents to take part; 
  • Practice interviews. 

4.2 Main SCEYP: Fieldwork dates 

The main SCEYP fieldwork took place between 17th April and 25th July 2023. A “web-first” design was followed, with the online survey launched first and a communication strategy designed to maximise the number of online completions (see Table 4.1 for communication dates). Telephone fieldwork ran from 17th May to 3rd July 2023. 

As a result of a routing error, GBP and SBP respondents were initially not asked all of the staff questions as intended in the survey. These questions are set up in a ‘loop’, asking respondents to answer a series of questions about the pay, age and contracted hours of members of staff with different qualification levels. IFF Research recontacted 988 SBP and 2,093 GBP respondents between 31st October and 17th November to ask the questions that had previously been missed, completing the recontact exercise with 474 SBPs and 1,268 GBPs during this time. Respondents were given the option of answering based on the situation in May/June 2023, or the situation at the time of the recontact. Nearly all (96%) were able to answer based on the situation in May or June 2023.

4.3 Main SCEYP: Making contact 

As this survey is voluntary, potential respondents were sent a range of survey materials to motivate participation, increase the completeness of the data collected and reduce participant burden by helping respondents to prepare in advance. Building on previous years, IFF Research worked with the Department for Education (DfE) in order to make these materials as accessible, appealing and persuasive as possible. Key features of the mailings intended to encourage response included: 

  • Content of materials. The materials placed greater emphasis on the benefits of survey participation by stressing what the results would be used for. They were also simplified this year, and shortened, to encourage engagement. 
  • Using different motivational statements across the different contact attempts. Research evidence[12],[13] shows that varying the messaging across different communications increases response rates by attracting participants with different motivations to take part. Initial communications focussed on the most popular motive for survey participation - altruism (by emphasising the importance of the research and making ‘your voice heard by Government’). The later reminder letter highlighted other potential motivations, such as self-interest i.e. ‘taking part will help you, your staff and the families and children you support’.  
  • Design of mailings. Efforts were made to ensure that all mailings looked professional and legitimate and would not be mistaken for junk mail or commercial marketing. The layout was engaging and concise, clearly spelling out what was being asked of participants and why. 
  • ‘Nudges’ for participants without email addresses. A postcard nudge was sent out a couple of weeks after the invitation letter. This was to remind those who may not have acted immediately upon receipt of the invitation letter. 

The full range of communications sent to potential respondents was as follows: 

Ministerial letter (from the Parliamentary Under-Secretary of State for Children, Families and Wellbeing) 

In order to encourage school-based providers to take part, a letter was sent from the Parliamentary Under-Secretary of State for Children, Families and Wellbeing, to this participant group at the outset of fieldwork. This letter was sent alongside the survey invitation and highlighted the importance of the survey in providing robust and up to date evidence on the childcare and early years sector. 

Invitation letter 

All providers were sent an invitation letter. These invitations gave full details about the study, including information security, and invited the provider to take part. They also included details on how to take part online and informed providers that they would hear from an interviewer via telephone if they did not complete the survey online. Both letters and emails included either a physical preparation sheet or a link to one online to help providers prepare to take part. Providers were given an email address and a contact number that they could use to opt-out of the survey and receive no future contact. See Appendix A for the survey invitation letter.

Preparation sheet 

The preparation sheet helped providers prepare for the telephone interview by listing information that they might need to look up in advance, such as financial figures that they were unlikely to know by heart or the qualifications of staff members. These preparation sheets were tailored according to the provider type and the variant of the survey that they would be asked to complete. Providers were encouraged to complete the preparation sheet before their interview. 

Qualifications list 

A list of qualifications was enclosed with the invitation letter and email to help providers identify ‘full and relevant’ qualifications, as well as which levels of Early Years or teaching qualifications their staff members held. Telephone interviewers were able to resend the invitation email with a link to the preparation sheet and qualifications list to providers who wanted to prepare before completing the survey. 

Postcard /email nudge 

A postcard reminder was sent to all providers with no email address or mobile phone number available. The purpose of these postcards was to jog the memories of those who may have intended to take part, but had forgotten to, and to catch the attention of those who may not have seen the first invitation. 

For providers for whom an email address was available, the postcard was replaced with an email nudge drawing attention to the invitation letter. 

Reminder emails and texts 

IFF Research sent multiple reminders across modes to ensure the response rate to the survey was as high as possible. Up to three reminder texts, four reminder emails and a reminder letter were sent to all non-responding providers for whom IFF Research had the relevant contact details. Each reminder stressed a different potential motivation for taking part. A final text reminder was sent only to those who had begun the questionnaire but not completed it. The email reminder contained a link to an online version of the preparation sheet. 

Table 4.1. Main SCEYP fieldwork and communication dates

| Communication | Sent out |
| --- | --- |
| Ministerial letter (for SBPs only) | 17th April 2023 |
| Invitation letter | 17th April 2023 |
| Email invitation | 3rd May 2023 |
| Postcard nudge (to those without an email address) | 3rd May 2023 |
| Reminder email and SMS 1 | 10th May 2023 |
| Reminder letter 1 | 15th May 2023 |
| Reminder email and SMS 2 | 22nd May 2023 |
| Reminder email and SMS 3 | 5th June 2023 |
| Reminder email and SMS 4 | 26th June 2023 |

[12] Dillman, D. A., Smyth, J. D., Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition, Wiley. 

[13] Nicolaas, G., Smith, P., Pickering, K. & Branson, C. (2015). Increasing response rates in postal surveys while controlling costs: an experimental investigation. Social Research Practice Issue 1 Winter 2015. http://the-sra.org.uk/wp-content/uploads/social-research-practice-journal-issue-01-winter-2015.pdf 

4.4 Short SCEYP: Fieldwork dates 

The short SCEYP fieldwork ran from 17th April until 25th July 2023.[14] As with the main survey, a “web-first” design was followed, with the online survey launched first and a communication strategy designed to maximise the number of online completions (see Table 4.2 for communication dates). Paper questionnaires were sent out with reminder letters over a month into fieldwork and were supplemented with email reminders.

[14] This is the date when the invitation letters were first mailed out.

4.5 Short SCEYP: Making contact 

The communication strategy for the short SCEYP mirrored that for the main SCEYP (see section 4.3). Potential respondents were sent a range of survey materials to motivate participation, increase the completeness of the data collected and reduce participant burden by helping respondents to prepare in advance. Building on previous years, IFF Research worked with DfE in order to make these materials as accessible, appealing and persuasive as possible. 

The full range of communications sent to potential respondents was as follows: 

Ministerial letter (from the Parliamentary Under-Secretary of State for Children, Families and Wellbeing) 

In order to encourage school-based providers to take part, a letter was sent from the Parliamentary Under-Secretary of State for Children, Families and Wellbeing, to this participant group around a month into fieldwork. This letter highlighted the importance of the survey in providing robust and up to date evidence on the childcare and early years sector, and encouraged schools to take part. 

Invitation letter and email 

All providers were sent an invitation letter. These invitations gave full details about the study, including information security, and invited the provider to take part. Providers were given an email address and a contact number that they could use to opt-out of the survey and receive no future contact. They also included details on how to take part online. See Appendix B for the survey invitation letter.

Reminder emails 

Up to four reminder emails were sent to all non-responding providers for whom IFF had the relevant contact details. Each reminder stressed a different potential motivation for taking part. 

Reminder letter and paper questionnaire 

A reminder letter was sent to all providers over a month into fieldwork. A paper copy of the questionnaire, along with a return envelope, was also included for those providers who preferred to complete the survey on paper rather than online. 

Table 4.2. Short SCEYP fieldwork and communication dates

| Communication | Sent out |
| --- | --- |
| Invitation letter | 17th April 2023 |
| Ministerial letter (for school-based providers) | 17th April 2023 |
| Invitation email | 3rd May 2023 |
| Reminder email 1 | 10th May 2023 |
| Reminder email 2 | 22nd May 2023 |
| Copy of paper questionnaire and reminder letter | 5th June 2023 |
| Reminder email 3 | 5th June 2023 |
| Reminder email 4 | 26th June 2023 |

5. Validation process

A data validation stage that was first introduced for the 2022 survey continued in 2023. This involved identifying improbable responses using built-in validation checks agreed in advance of fieldwork, and re-contacting the providers flagged by these checks via telephone to confirm the correct response. 

The aim of this exercise was to reduce the number of records that needed to be edited after survey closure. IFF programmed a series of ‘validation flags’ on key survey variables that signalled where findings were outside of expected limits or seemed implausible. IFF then recontacted providers to flag the issue and confirm correct responses. 

In 2022, the validation exercise focussed on those that provided implausible answers in the cost/income section of the survey. With the new survey set-up in place for the cost and income questions, as discussed in section 3.1, fewer respondents provided implausible answers here in 2023. Instead, the validation callbacks targeted those that initially answered ‘don’t know’ to these questions, to increase the amount of usable data for this section.

The validation call back stage ran from 21st June to 13th July 2023 and IFF Research attempted to contact 1,286 providers in total. From these, 1,002 responses were secured.

  • Childminders: 65 completes
  • School-based providers: 300 completes
  • Group-based providers: 637 completes

Figures were provided (in place of the original “don't know”) by 340 providers. 

6. Weighting

6.1 Weighting the main SCEYP

Weighting was used to ensure that the final achieved samples were representative of early years and childcare providers in England. Survey weights were designed separately for the three provider types[15] to correct for unequal selection probabilities and non-response bias. Grossing weights were created in order to ensure that the weighted achieved samples gross up to the population of early years and childcare providers in England.

[15] Survey weights were designed for the three overall provider types (group-based, school-based and childminders). In the official statistics report there are five provider types reported on separately. However, private and voluntary are sub-groups of the overall group-based provider type (and were sampled in the same way), and maintained nursery schools and school-based providers offering nursery provision are sub-groups of the overall school-based provider type (and were sampled in the same way).

6.1.1 Main SCEYP: Weighting for group-based providers

Population totals

The achieved sample of group-based providers was weighted to be representative of all active group-based childcare providers (GBP) in England that were eligible for the study. However, the sampling frame did not allow for the exclusion of all ineligible institutions prior to sampling, so the eligible population size and profile were estimated using information about institutions subsequently found to be ineligible.

The total eligible population was estimated at 22,924 establishments – 98.1% of the sample frame.

Weighting process

Design weights

Design weights were first calculated to correct for unequal selection probabilities arising from the fact that establishments in the North East were oversampled to allow robust figures for the region to be calculated. 

Calibration weighting

Calibration weighting was used to remove the (measurable) bias introduced through non-response to the main Survey of Childcare and Early Years Providers (SCEYP) and align the profile of the achieved sample to the profile of the eligible population defined by: region, register type, ownership type, and deprivation band based on Income Deprivation Affecting Children Index (IDACI).
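Calibration can be implemented in several ways; a common approach, sketched below for illustration only, is iterative proportional fitting (“raking”) of the design weights to marginal population totals. This is not the contractors' implementation, and the toy margins and categories are hypothetical.

```python
def rake(records, margins, max_iter=50, tol=1e-6):
    """Minimal iterative proportional fitting ('raking') sketch. `records` is a
    list of dicts, each carrying a starting 'weight' (e.g. the design weight)
    and the calibration variables; `margins` maps each variable to population
    totals by category. Weights are adjusted until every margin is matched."""
    for _ in range(max_iter):
        max_shift = 0.0
        for var, targets in margins.items():
            # Current weighted total for each category of this variable.
            current = {}
            for rec in records:
                current[rec[var]] = current.get(rec[var], 0.0) + rec["weight"]
            # Scale weights so this variable's weighted margin hits the target.
            factors = {cat: targets[cat] / current[cat]
                       for cat in targets if current.get(cat)}
            for rec in records:
                factor = factors.get(rec[var], 1.0)
                rec["weight"] *= factor
                max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:
            break
    return records

# Toy example with hypothetical categories and population totals.
sample = [{"region": "North East", "ownership": "Private", "weight": 1.0},
          {"region": "London", "ownership": "Private", "weight": 1.0},
          {"region": "London", "ownership": "Voluntary", "weight": 1.0}]
margins = {"region": {"North East": 40, "London": 60},
           "ownership": {"Private": 70, "Voluntary": 30}}
rake(sample, margins)
print([round(r["weight"], 1) for r in sample])  # weighted totals now match both margins
```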

Questionnaire variant weights

Questionnaire variant weights were calculated for the analysis of providers allocated to each of the three GBP variants of the questionnaire. The final overall weight served as a pre-weight for these adjustments, and the responding cases and their profiles were grossed up to the eligible population profile. 

Day of the week weights 

The selected sample was allocated systematically within each variant of the questionnaire to a specific day of the week (Monday to Friday), about which they were asked certain questions. However, providers were able to complete the survey about a different day. The profile and the overall population size for each day were estimated from the main SCEYP (weighted by the overall interview weight), using data from the 2022 survey on which days of the week GBPs were open, as this question was not asked in 2023. Subsamples for each day were then calibrated to the population profile. 

Staff weights 

In addition to calculating weights to make the sample representative of GBPs in England, weights for analysis of information on staff in such settings were also calculated. Staff weights were needed to account for the fact that settings employing more than a certain number of staff at a given qualification level were only asked to give information about one or two members of their team chosen at random. A design weight to take account of this selection was calculated and then multiplied by the provider-level interview weights to obtain staff level weights for each case. Additional calibration was applied to ensure that the weighted staff profile (by level of education) was in line with the data collected about the number of staff working.
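The sketch below illustrates the staff design-weight logic described above. The staff-count threshold at which sub-sampling applied is not stated here, so the number of staff at a qualification level and the number selected are simply parameters, and the figures in the example are hypothetical.

```python
def staff_weight(provider_weight, n_staff_at_level, n_selected):
    """Sketch of the staff-level weight: a design weight reflecting the chance
    of a staff member being one of the (at most) one or two randomly selected
    at their qualification level, multiplied by the provider's interview
    weight. Further calibration to the reported staff profile follows."""
    if n_selected <= 0:
        raise ValueError("at least one staff member must have been selected")
    selection_design_weight = n_staff_at_level / n_selected
    return provider_weight * selection_design_weight

# Hypothetical example: a setting with interview weight 2.4 reports 6 staff at
# a given level, of whom 2 were randomly asked about; each answers for 3 staff.
print(staff_weight(2.4, n_staff_at_level=6, n_selected=2))  # -> 7.2
```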

Some data were excluded from the staff-level datasets prior to weighting. This included cases where information on the staff member was not provided or the respondent reported an overseas qualification. Providers for whom the number of paid staff with recorded qualifications differed by more than two from the total number of paid staff reported were also excluded.

6.1.2 Main SCEYP: Weighting for school-based providers

Population totals

To create a sampling frame of school-based providers, the Schools’ Census database from June 2022 was used, enhanced with further information from a Get Information About Schools (GIAS) extract. During the interviews some institutions were found to be ineligible to take part in the survey: schools that had closed down and schools that did not offer nursery provision. As for group-based providers, establishments coded as “duplicates” and those that had shut down were treated as eligible, on the assumption that any schools that had opened in the intervening period (and would therefore not be included in the sampling frame) would be roughly analogous to those that had subsequently closed. The total eligible population was estimated at 9,943 – 99.9% of the initial sample frame.

Weighting process

Design weights

Design weights were calculated to correct for disproportionate sampling of different types of schools. 

Calibration weighting

Calibration weighting was used to remove the (measurable) bias introduced through non-response to the main SCEYP and align the profile of the achieved sample to the profile of the eligible population on the following variables: school type (main stratum), region, type of establishment, quintile of number of places registered.

Questionnaire variant weights

Questionnaire variant weights were calculated for the analysis of school-based providers allocated to each of the two variants of the questionnaire. The responding cases and their profiles were grossed-up to the eligible population profile.

Day of the week weights

As with GBPs, although respondents were allocated a random day of the week (Monday – Friday) to answer certain questions about, they were able to complete the survey about a different day. As in the previous four waves, an assumption was made that all schools were open every week day and the sample of schools which answered about each day of the week was weighted to the overall profile of the eligible population. 

Staff weights

In addition to calculating weights to make the sample representative of SBPs in England, weights for analysis of information on staff in such settings were also calculated. Staff weights were needed to account for the fact that settings employing more than a certain number of staff at a given qualification level were only asked to give information about one or two members of their team chosen at random. A design weight to take account of this selection was calculated and then multiplied by the provider-level interview weights to obtain staff level weights for each case. Additional calibration was applied to ensure that the weighted staff profile (by level of education) was in line with the data collected about the number of staff working. 

Some data were excluded from the staff-level datasets prior to weighting. This included cases where information on the staff member was not provided (the staff loop was empty) or the respondent reported an overseas qualification. Providers for whom the number of paid staff with recorded qualifications differed by more than two from the total number of paid staff reported were also excluded.

6.1.3 Main SCEYP: Weighting for childminders

Population totals

The childminders’ data was weighted to be representative of the eligible population of childminders in England as of July 2022. Some childminders sampled for the main survey were found to be ineligible in that they were no longer offering provision. As in the previous waves, it was decided not to redefine the population and to assume that the population was stable in its characteristics.

As in the previous four waves, providers with zero registered places were included on the assumption that, at the time of the interview, information about the number of registered places might already have been outdated. It was decided, as in previous waves, that cases that reported zero registered places during the interview would be re-coded as ineligible, and responding childminders would be weighted to the profile of the population excluding cases with zero registered places in the sample frame. Following data cleaning, the total eligible population was 28,262.

Weighting process

Design weights 

Design weights were first calculated to correct for unequal selection probabilities arising from the oversampling of smaller regions. 

Calibration weights 

Calibration weighting was used to remove measurable bias introduced through non-response to the main SCEYP and to align the profile of the achieved sample to the profile of the population. The population targets used for calibration weighting included: region, whether on all three registers (Early Years Register, Compulsory Childcare Register and Voluntary Childcare Register), registration year, and deprivation band based on IDACI. Questionnaire variant weights were calculated for the analysis of childminders allocated to each of the two childminder variants of the questionnaire. The responding cases and their profiles were grossed up to the eligible population profile. 

Day of the week weights 

The selected sample was allocated systematically within each variant of the questionnaire to a specific day of the week (Monday to Friday) about which they were asked certain questions. The overall population size for each day was estimated from the main SCEYP (weighted by the overall interview weight), using data from the 2022 survey, which asked childminders which days of the week they were open. Subsamples for each day were then calibrated to these population estimates. 

Staff weights 

Respondents that answered questionnaire Variant 1 were asked several follow up questions about the assistants they employ. There were up to two loops in the questionnaire for assistants, so for respondents with more than two assistants design weighting was required to compensate for this. The final staff level weight was re-scaled so that the total weighted number of assistants was in line with an estimate of the total number of assistants.

6.2 Weighting the short SCEYP

Data from the short SCEYP was combined with data from the financial variant of the main SCEYP where variables were available in both, i.e., the fees and funding questions.

The weighting strategy was designed to create a weight to be used in analysis of the combined dataset comprising data from the short SCEYP and the financial variants of the main SCEYP. Three separate weights were created for SBPs, GBPs and childminders as set out below. 

The weighting strategy was the same for all provider types taking part in the short SCEYP or the financial variant of the main SCEYP. Data from the two surveys were combined for use in financial analysis and calibrated to the same variables and population totals as used for the main SCEYP. 

A combined weight was also created for analysis of all groups together. Combined weights for SBPs, GBPs and childminders were scaled so that the weighted profile by provider type was aligned with the percentage distribution of provider types in the population (modelled estimates).
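For illustration, a minimal sketch of this final rescaling step: each provider type's combined weights are scaled so that the weighted distribution of provider types matches the modelled population shares. The provider-type labels and shares used below are hypothetical.

```python
def rescale_combined_weights(records, population_shares):
    """Scale each provider type's weights so the weighted share of each type in
    the combined dataset matches the modelled population distribution."""
    total_weight = sum(r["weight"] for r in records)
    weight_by_type = {}
    for r in records:
        weight_by_type[r["provider_type"]] = weight_by_type.get(r["provider_type"], 0.0) + r["weight"]
    for r in records:
        target = population_shares[r["provider_type"]] * total_weight
        r["weight"] *= target / weight_by_type[r["provider_type"]]
    return records

# Hypothetical shares for illustration only.
data = [{"provider_type": "GBP", "weight": 2.0}, {"provider_type": "CM", "weight": 1.0}]
rescale_combined_weights(data, {"GBP": 0.45, "CM": 0.55})
print([round(r["weight"], 2) for r in data])  # -> [1.35, 1.65]
```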

7. Response rates

7.1 Main SCEYP: Response

In total, 9,090 productive interviews were carried out as part of the main Survey of Childcare and Early Years Providers (SCEYP). This included:

  • 5,529 interviews with group-based providers (GBP);
    • Private (3,651 interviews, 66%)
    • Voluntary (1,591 interviews, 29%)
    • School/college/LA/unclassified GBPs (268 interviews, 5%)[16]
  • 2,079 interviews with school-based providers (SBP);
    • Those offering nursery provision (1,938 interviews, 93%)
    • Maintained nursery schools (MNS) (141 interviews, 7%)
  • 1,482 interviews with childminders.

Table 7.1 shows the outcomes achieved for each provider type. Table 7.2 compares overall response rates by provider type with 2022. Tables 7.3, 7.4 and 7.5 show the response rates achieved for each provider type by questionnaire variant. Table 7.6 shows the split of responses by mode.

Table 7.1 Main SCEYP: Outcomes by provider type

| | Group-based providers | School-based providers | Childminders | Total |
| --- | --- | --- | --- | --- |
| Issued sample | 15,120 | 6,836 | 6,059 | 28,015 |
| Completes | 5,529 | 2,079 | 1,482 | 9,090 |
| Bad number[17] | 703 | 36 | 614 | 1,353 |
| Other non-contact[18] | 4,623 | 2,205 | 2,639 | 9,467 |
| Total non-contact | 5,326 | 2,241 | 3,253 | 10,820 |
| Unavailable[19] | 2,654 | 1,969 | 594 | 5,217 |
| Refused[20] | 1,191 | 448 | 294 | 1,933 |
| Total contacted but unproductive | 3,845 | 2,417 | 888 | 7,150 |
| Permanently closed | 241 | 52 | 374 | 666 |
| Temporarily closed (for longer than 30 days) | 30 | 9 | 49 | 88 |
| Open but not offering relevant childcare | 66 | 36 | 13 | 115 |
| Other ineligible | 83 | 2 | 0 | 85 |
| Total ineligible | 420 | 99 | 436 | 955 |

Response rates were lower this year across all provider types compared with 2022, as shown in Table 7.2, despite the survey being set up and carried out following the same method. A possible reason for the lower response rates is that the intensity of research, particularly for schools, has increased in the post-Covid period, potentially leading to over-saturation of the sample. Moreover, there was industrial action in schools in May when the survey first launched, which may have had an adverse effect on response rates from SBPs.

Table 7.2 Main SCEYP: Overall response rates by provider type, 2022 and 2023

| Year | Response rates | Group-based providers | School-based providers | Childminders |
| --- | --- | --- | --- | --- |
| 2022 | Achieved (n) | 6,338 | 2,476 | 1,604 |
| 2022 | Overall response rate | 42% | 36% | 32% |
| 2023 | Achieved (n) | 5,529 | 2,079 | 1,482 |
| 2023 | Overall response rate | 38% | 32% | 26% |

Table 7.3 Main SCEYP: Group-based provider response rates

| Response rates | Variant 1 | Variant 2 | Variant 3 | Total |
| --- | --- | --- | --- | --- |
| Achieved (n) | 917 | 2,470 | 2,142 | 5,529 |
| Overall response rate | 39% | 40% | 35% | 38% |
| Refusal rate | 8% | 8% | 8% | 8% |
| Eligibility rate | 97% | 97% | 97% | 97% |

Table 7.4 Main SCEYP: School-based provider response rates

| Response rates | Variant 1 | Variant 2 | Total |
| --- | --- | --- | --- |
| Achieved (n) | 1,123 | 956 | 2,079 |
| Overall response rate | 34% | 29% | 32% |
| Refusal rate | 7% | 6% | 7% |
| Eligibility rate | 95% | 95% | 95% |

Table 7.5 Main SCEYP: Childminder response rates

| Response rates | Variant 1 | Variant 2 | Total |
| --- | --- | --- | --- |
| Achieved (n) | 871 | 611 | 1,482 |
| Overall response rate | 31% | 22% | 26% |
| Refusal rate | 5% | 5% | 5% |
| Eligibility rate | 93% | 93% | 93% |

Table 7.6 Main SCEYP: Response mode

| Mode | Group-based providers % (n) | School-based providers % (n) | Childminders % (n) | Total % (n) |
| --- | --- | --- | --- | --- |
| Telephone | 58% (3,206) | 66% (1,363) | 39% (585) | 57% (5,154) |
| Web | 42% (2,323) | 34% (716) | 61% (897) | 43% (3,936) |

Overall, interviews were skewed slightly towards telephone (57% vs. 43% web). Childminders were more likely to complete via web (61% vs. 39% by telephone), while SBPs were the most likely to complete via telephone (66% vs. 34% online).

[16] A small number of providers answered ‘Don’t know’ when asked about their provider type.

[17] ‘Bad number’ includes all those who had an invalid telephone number or where interviewers were informed they had the wrong number when they called.

[18] ‘Other non-contact’ includes all those we did not have any telephone contact with including where there was no answer, our call was rejected, or the call went to answer machine. Although no telephone contact was made, these providers will have received an invitation to complete the survey by email or post. 

[19] ‘Unavailable’ includes all those who were busy when we called them including those who said they would be unavailable throughout the whole fieldwork period and instances where we spoke to someone at the setting but the person we asked to speak to was not available. 

[20] ‘Refused’ includes all those who opted out of the research or who said they did not want to participate when we called.

7.2 Main SCEYP: Interview length

Average telephone and web interview lengths for the different provider types and variants are provided below in Tables 7.7 and 7.8. These are based on interviews that were completed within an hour on the same day; any interviews longer than one hour were excluded from the analysis of interview lengths, as the timings for these outlier interviews are likely to be inaccurate. 

Interview lengths were shorter on average for childminders than for school-based or group-based providers. This is because childminders were asked fewer questions than other providers, e.g. they were not asked questions about staff-to-child ratios.

Table 7.7 Main SCEYP: Average (mean) telephone interview length by provider type and questionnaire variant

| Provider type | Variant | 2023 average telephone interview length (minutes) |
| --- | --- | --- |
| Group-based providers | 1 | 29 |
| Group-based providers | 2 | 28 |
| Group-based providers | 3 | 34 |
| School-based providers | 1 | 32 |
| School-based providers | 2 | 36 |
| Childminders | 1 | 17 |
| Childminders | 2 | 27 |

Table 7.8 Main SCEYP: Average (mean) web interview length by provider type and questionnaire variant

| Provider type | Variant | 2023 average web interview length (minutes)[21] |
| --- | --- | --- |
| Group-based providers | 1 | 27 |
| Group-based providers | 2 | 26 |
| Group-based providers | 3 | 32 |
| School-based providers | 1 | 28 |
| School-based providers | 2 | 32 |
| Childminders | 1 | 16 |
| Childminders | 2 | 25 |

[21] The average online survey time is calculated using the times at which a respondent enters and leaves the survey after completing it. It cannot account for periods when the respondent may have taken a short break; however, outliers were accounted for by removing anyone who took longer than one hour to complete the survey.

7.3 Short SCEYP: Response

The total issued sample for the short SCEYP was 31,632; 8,251 GBPs, 3,120 SBPs, and 20,261 childminders.

In total, 6,741 interviews were achieved: 1,798 with GBPs, 641 with SBPs, and 4,302 with childminders. Table 7.9 displays the outcomes and response rates achieved for the whole sample, and by provider type. It was not possible to estimate eligibility rates given the online and postal modes of data collection, so response rates were calculated based on the whole issued sample (i.e. they assume that all sampled providers were eligible). The breakdown of completions by mode (web vs postal) is shown in Table 7.10.

Table 7.9 Short SCEYP response rates: Whole sample by provider type

| | Group-based providers | School-based providers | Childminders | Total |
| --- | --- | --- | --- | --- |
| Issued sample | 8,251 | 3,120 | 20,261 | 31,632 |
| Achieved (n) | 1,798 | 641 | 4,302 | 6,741 |
| Overall response rate | 22% | 21% | 21% | 21% |

Table 7.10 Short SCEYP: Response mode

| Response mode | Group-based providers % (n) | School-based providers % (n) | Childminders % (n) | Total % (n) |
| --- | --- | --- | --- | --- |
| Post | 30% (532) | 22% (140) | 39% (1,668) | 35% (2,340) |
| Web | 70% (1,266) | 78% (501) | 61% (2,634) | 65% (4,401) |

7.4 Short SCEYP: Interview length

The average web interview lengths for the different provider types are provided in Table 7.11. These are based on interviews that were completed within an hour on the same day. Analysis of interview length excluded those longer than one hour, as the timings for these outlier interviews are likely to be inaccurate.

As Table 7.11 shows, the length of the short SCEYP ranged from 5 minutes on average for childminders to 9 minutes on average for SBPs and GBPs.

Table 7.11 Short SCEYP: Average web interview length by provider type

| Provider type | Average web interview length (minutes) |
| --- | --- |
| School-based providers | 9 |
| Group-based providers | 9 |
| Childminders | 5 |

8. Data processing

8.1 Coding 

Both the main and short survey included a small number of questions which offered the respondent the option to choose “other” as their response code and to enter free text into the designated field. These were subsequently coded by trained coders at IFF Research. Code frames from previous iterations of the survey were used for existing questions. Code frames for new questions were devised by the same coding team and signed off by the Research Manager.

8.2 Main SCEYP: Data cleaning 

Numeric information collected as part of the main SCEYP, such as childcare fees or opening hours, was typed in by respondents themselves (online), keyed in by interviewers (telephone) or recorded on paper and keyed in by IFF’s Data Unit (paper). Mis-keying was possible and, on inspection, a small number of answers appeared much too large, much too small, or nonsensical given other survey answers. It was therefore decided to implement rules whereby certain values would be removed from the data. Data were recoded to “missing” in the following scenarios (a sketch of these rules in code follows the list): 

  • Number of parents paying fees using Tax-Free Childcare is higher than the total number of registered children: 60 records 
  • Total number of registered children across all age bands was zero: 57 records 
  • Where the reported number of places that were spare was the same as the total number of registered places, but the number of children booked was not zero, spare capacity was recoded to missing: 58 records 
  • Where the reported number of children with Special Educational Needs or Disabilities was greater than the total number of registered children: 1 record 
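The sketch below restates these recode-to-missing rules in code; all variable names are hypothetical stand-ins for the survey variables.

```python
def apply_cleaning_rules(record):
    """Sketch of the recode-to-missing rules listed above; field names are
    hypothetical stand-ins for the survey variables."""
    cleaned = dict(record)
    registered = cleaned.get("total_registered_children")
    # Rule 1: more parents paying via Tax-Free Childcare than registered children.
    if cleaned.get("parents_using_tfc") is not None and registered is not None \
            and cleaned["parents_using_tfc"] > registered:
        cleaned["parents_using_tfc"] = None
    # Rule 2: zero registered children across all age bands.
    if registered == 0:
        cleaned["total_registered_children"] = None
    # Rule 3: spare places equal to total registered places while bookings are non-zero.
    if cleaned.get("spare_places") == cleaned.get("registered_places") \
            and cleaned.get("children_booked") not in (0, None):
        cleaned["spare_places"] = None
    # Rule 4: more children with SEND than registered children.
    if cleaned.get("children_with_send") is not None and registered is not None \
            and cleaned["children_with_send"] > registered:
        cleaned["children_with_send"] = None
    return cleaned

# Hypothetical record triggering rules 1 and 3.
print(apply_cleaning_rules({"total_registered_children": 20, "parents_using_tfc": 25,
                            "registered_places": 24, "spare_places": 24,
                            "children_booked": 18, "children_with_send": 3}))
```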

8.3 Main SCEYP: Calculating hourly pay 

Respondents were asked a variety of questions in the staff loop section to help collect data on hourly pay. First, respondents were given the choice to provide an answer about pay for each member of staff for any period (i.e. hourly, daily, weekly, monthly or annually). Hourly pay was then calculated for each member of staff, based on their answer to how many hours they worked per week and which pay period was given. To derive weekly pay, if the pay was provided as:

  • An annual figure: this was divided by the reported contracted number of weeks they work a year, or 52 if this information was not provided;
  • A monthly figure: the figure was divided by 4.3.

Following this, the derived weekly pay was divided by the reported number of hours they were contracted to work per week to estimate the hourly pay.
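A minimal sketch of this derivation follows, covering only the conversions described above (annual, monthly and weekly pay); the function name and the worked figures are illustrative.

```python
def hourly_pay(amount, period, contracted_hours_per_week, weeks_per_year=None):
    """Derive hourly pay following the conversions described above: annual pay
    is divided by the reported contracted weeks per year (or 52 if not given),
    monthly pay by 4.3, and the resulting weekly figure by contracted hours per
    week. Daily pay, or an hourly figure given directly, is not covered here."""
    if period == "annual":
        weekly = amount / (weeks_per_year or 52)
    elif period == "monthly":
        weekly = amount / 4.3
    elif period == "weekly":
        weekly = amount
    else:
        raise ValueError(f"conversion for period '{period}' is not sketched here")
    return weekly / contracted_hours_per_week

# Hypothetical example: £22,000 a year, contracted for 38 weeks and 30 hours a week.
print(round(hourly_pay(22_000, "annual", 30, weeks_per_year=38), 2))  # -> 19.3
```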

If they could not provide an exact answer about pay, they were asked to give a banded answer from the list presented to them. 

Where the reported rate of pay was zero for staff, the derived variable was set to missing (as the analysis was interested in paid staff). This impacted 152 records in total.  

Annexes

Appendix A: Main SCEYP: Survey invitation letter

Appendix B: Short SCEYP: Survey invitation letter

Help and support

Contact us

If you have a specific enquiry about Childcare and early years provider survey statistics and data:

Early years and childcare research and analysis

Email: EY.AnalysisANDResearch@education.gov.uk
Contact name: John Canlin

Press office

If you have a media enquiry:

Telephone: 020 7783 8300

Public enquiries

If you have a general enquiry about the Department for Education (DfE) or education:

Telephone: 037 0000 2288

Opening times:
Monday to Friday from 9.30am to 5pm (excluding bank holidays)