For the March, May and June 2022 statistics releases, the Educational Setting Status form was used to estimate course starts by pupils. Following the introduction of school-led tutoring, this data source was the best available way of producing estimates of school-led tutoring course starts at the time. For better estimates of school-led tutoring courses delivered throughout the academic year 2021/22, please use the more recent ‘2021/22 Academic Year’ release.
Data
During the academic year 2021/22, the Department for Education (DfE) collected data on the uptake of school-led tutoring approximately every four weeks through the Educational Setting Status form. The questions asked were:
- Is your school currently using the School-Led Tutoring grant to provide tuition?
- Since the beginning of the Autumn term, how many pupils have received tuition through the School-Led Tutoring grant in total?
- Since the beginning of the Autumn term, how many pupil hours of tuition have been delivered through the School-Led Tutoring grant in total?
The date of the form used to inform our statistical publications is indicated in the release. The survey was not compulsory, which meant that not every school responded.
A small number of outliers were removed, for example where schools had reported an unfeasibly high average number of hours of tutoring per pupil, or an unfeasibly high number of tutored pupils given the number of pupils recorded in the School Census. Such schools were still treated as participating in school-led tutoring.
For the release on 31 March 2022, we tightened our outlier criteria compared with previous ad-hoc releases. This reduced the estimated starts on courses by pupils for school-led tutoring by around 7%.
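As an illustration, the sketch below shows the kind of screen involved. The threshold is an assumption for demonstration only, since the published criteria describe the checks qualitatively rather than giving exact values.

```python
# Illustrative outlier screen. The threshold below is an assumption, not the
# value actually used: the published criteria only describe the checks
# qualitatively ("unfeasibly high").
MAX_AVG_HOURS_PER_PUPIL = 60  # hypothetical ceiling on average hours per pupil

def is_outlier(pupils_tutored: int, hours_delivered: float, census_headcount: int) -> bool:
    """Flag a school's return as implausible.

    Outlier returns are excluded from the pupil and hours estimates, but the
    school is still counted as participating in school-led tutoring.
    """
    if pupils_tutored > census_headcount:
        return True  # more tutored pupils than recorded in the School Census
    if pupils_tutored > 0 and hours_delivered / pupils_tutored > MAX_AVG_HOURS_PER_PUPIL:
        return True  # unfeasibly high average hours per pupil
    return False
```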
We did not find that responding and non-responding schools differed significantly in their characteristics (for example size, phase and location). As such, we did not consider it necessary or appropriate to make any adjustment for non-response bias. This is consistent with how other data from the Educational Setting Status form was reported, for example absence rates. However, this does not mean that the risk of non-response bias in these estimates can be ruled out.
Starts on Courses by Pupils
We reported estimates of the number of starts on courses made by pupils based on schools’ responses to the Educational Setting Status form. We assumed that the average number of pupils receiving tutoring at non-responding schools was equal to the average number of pupils receiving tutoring at responding schools.
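A minimal sketch of this scale-up is given below; the function and variable names are illustrative.

```python
def estimate_total_pupils(pupils_at_responding_schools: list[int],
                          total_schools: int) -> float:
    """Scale up responding schools' figures to all schools.

    Assumes the average number of tutored pupils at non-responding schools
    equals the average at responding schools, as described above.
    """
    responding = len(pupils_at_responding_schools)
    if responding == 0:
        raise ValueError("no responding schools to base an estimate on")
    mean_pupils = sum(pupils_at_responding_schools) / responding
    return mean_pupils * total_schools
```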
For these statistics, we had to infer what non-responding schools were doing from the sample of responding schools, so the estimates are uncertain. This uncertainty increased as the response rate fell and as the number of schools in the group being estimated fell. As such, we have greater confidence in the national estimates, which are based on a relatively large group of schools, than in the Local Authority estimates, which are based on a much smaller group. Local Authority estimates should be treated with caution, particularly when comparing against previous estimates.
Starting with the May 2022 publication, to be transparent and give an indication of the uncertainty, we included the number of schools that responded to the Educational Setting Status form and the response rate next to every School-Led Tutoring estimate. In line with DfE Statistics on “Attendance in education and early years settings during the coronavirus (COVID-19) pandemic”, we also flagged School-Led Tutoring estimates for Regions and Local Authorities where the estimate was:
- Based on 10 or fewer responding schools, or
- Based on a response rate of 50 per cent or less.
The flags are provided in the downloadable data tables. They act as a guide to where the uncertainty of the estimates may be higher, although they should not be interpreted as hard rules on data quality. There will always be some variability in estimates calculated from a sample of schools.
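The flagging rule can be expressed directly; the sketch below mirrors the two published thresholds, with illustrative parameter names.

```python
def flag_estimate(responding_schools: int, schools_in_area: int) -> bool:
    """Flag a Regional or Local Authority estimate as potentially less reliable.

    Mirrors the published criteria: 10 or fewer responding schools, or a
    response rate of 50 per cent or less.
    """
    response_rate = responding_schools / schools_in_area
    return responding_schools <= 10 or response_rate <= 0.5
```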
For school-led tutoring, we received data on the number of pupils undertaking tutoring at each school. To make these figures consistent with those we received for tuition partners and academic mentors (where pupils can undertake multiple courses of tutoring), we additionally estimated how many course starts had been undertaken by these pupils.
To do this, we reviewed evidence on the numbers of hours of tutoring reported by schools. In some cases, pupils received more than 15 hours of tutoring, meaning that they in effect started more than one 15-hour course of tutoring. For example, some schools reported an average of more than 15 hours per pupil being delivered since the start of the academic year, meaning that at least some pupils must have received more than 15 hours of tutoring. This could be because they were receiving tutoring in multiple subjects, or because they were receiving intensive tutoring in one subject which totalled more than 15 hours.
We estimated how many pupils this affected by assuming that tutored pupils at each school received the average number of hours of tutoring for that school. We then found the percentage of schools where the average pupil received more than 15 hours of tutoring. We applied this proportion to the estimated number of pupils receiving school-led tutoring to estimate starts on courses by pupils. This ratio was applied nationally, rather than calculated separately for each Region and Local Authority.
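A sketch of this conversion is given below. Counting each affected pupil as one additional course start is an assumption made for the sketch only; the exact conversion is not spelled out above.

```python
def estimate_course_starts(avg_hours_by_school: list[float],
                           estimated_pupils: float,
                           course_length_hours: float = 15.0) -> float:
    """Convert the estimated number of tutored pupils into course starts.

    The proportion of responding schools whose average tutored pupil received
    more than one course's worth of hours is applied nationally. Treating each
    affected pupil as starting one additional course is an assumption for this
    sketch, not a statement of the published method.
    """
    schools_over = sum(1 for avg in avg_hours_by_school if avg > course_length_hours)
    proportion_over = schools_over / len(avg_hours_by_school)
    return estimated_pupils * (1 + proportion_over)
```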
Not all courses delivered to pupils were 15 hours long. Individual pupils may have undertaken more than one course. Where a tutor delivered a course to multiple pupils, that course was counted once per pupil.
School Participation
Using the Educational Setting Status data, we estimated the total number of schools delivering school-led tutoring and then calculated this as a percentage of all schools. Not all schools received funding for school-led tutoring, but all schools (as defined in ‘Other Notes’ below) are included in the calculation of the percentage. As with the estimate for starts on courses by pupils on school-led tutoring, we have included information about the Educational Setting Status responses, and flags to indicate where Regional and Local Authority estimates may be less reliable. We started doing this in May 2022.
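A minimal sketch of this calculation is given below, assuming that non-responding schools participate at the same rate as responding schools; this is consistent with the assumption used for the pupil estimates, but the exact scale-up is not set out above.

```python
def estimate_school_participation(schools_reporting_use: int,
                                  responding_schools: int,
                                  all_schools: int) -> tuple[float, float]:
    """Estimate the number and percentage of schools using the grant.

    Assumes non-responding schools participate at the same rate as responding
    schools; this mirrors the assumption used for pupil numbers but is an
    assumption of this sketch, not a statement of the published method.
    """
    participation_rate = schools_reporting_use / responding_schools
    estimated_schools = participation_rate * all_schools
    percentage = 100 * estimated_schools / all_schools
    return estimated_schools, percentage
```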