The HTST data collection is being developed to:
- build a clearer picture of local authority provided transport at local and national levels, and
- enable local authorities to benchmark against each other when the quality of the data allows.
115 local authorities (75%) submitted data for this initial collection. Table 1 shows that a good response rate was achieved in every region, though rates for local authorities in the East Midlands and London were relatively lower. Table 2 shows the response rates for London, urban and rural local authorities: rural authorities were more likely to have responded to the data collection than urban and London authorities. Urban and rural have been defined using the percentage of the population living in rural settings as published by the ONS; rural local authorities are those where at least 15% of the population live in rural settings.
Table 1: Overall response rate by region
| Region | Number of responding local authorities | Percentage of local authorities in the region that responded |
|---|---|---|
| East Midlands | 6 | 60% |
| East of England | 11 | 100% |
| London | 21 | 64% |
| North East | 11 | 92% |
| North West | 19 | 79% |
| South East | 15 | 79% |
| South West | 11 | 73% |
| West Midlands | 10 | 71% |
| Yorkshire and the Humber | 11 | 73% |
Table 2: Overall response rate by urban/rural
| Urban/rural | Number of responding local authorities | Percentage of local authorities that responded |
|---|---|---|
| London | 21 | 64% |
| Urban | 48 | 74% |
| Rural | 46 | 84% |
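The response-rate percentages in Tables 1 and 2, and the urban/rural classification described above, can be reproduced with a simple calculation. The sketch below is illustrative only: the record structure, field names (`region`, `pct_rural`, `responded`) and sample values are assumptions rather than part of the published dataset; the 15% rural-population threshold is the one described above.

```python
from collections import defaultdict

# Hypothetical records: one per local authority, with the percentage of its
# population living in rural settings (ONS) and whether it responded.
local_authorities = [
    {"name": "LA A", "region": "East Midlands", "pct_rural": 32.0, "responded": True},
    {"name": "LA B", "region": "London", "pct_rural": 0.0, "responded": False},
    # ... remaining local authorities ...
]

def classification(la):
    """Rural if at least 15% of the population live in rural settings; London kept separate."""
    if la["region"] == "London":
        return "London"
    return "Rural" if la["pct_rural"] >= 15 else "Urban"

def response_rates(las, key):
    """Number of responding local authorities and response percentage for each group."""
    totals, responded = defaultdict(int), defaultdict(int)
    for la in las:
        group = key(la)
        totals[group] += 1
        responded[group] += la["responded"]
    return {g: (responded[g], round(100 * responded[g] / totals[g])) for g in totals}

by_region = response_rates(local_authorities, key=lambda la: la["region"])  # Table 1
by_urban_rural = response_rates(local_authorities, key=classification)      # Table 2
```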
The quality of the data returned by local authorities varied by question. In terms of coverage, all 115 authorities were able to provide data for the highest-level questions: how many pre-16 pupils received HTST and how many post-16 learners received transport. We are satisfied that the data provides, for the first time, a robust assessment of the number of pupils using free HTST, but we also believe that this will improve further as the collection matures.
Response rates varied across data items, particularly where more detailed breakdowns were not readily available. Some local authorities returned complete responses to the lower-level questions, but others were unable to provide full answers, reflecting the fact that there has previously been no standardised data collection. The issues affecting the data returned included:
- Null responses – local authorities did not have the data to answer the question. Several of the post-16 transport questions were left blank, with comments that local authorities did not hold the data requested. Sometimes this was because we asked for the data split between 16 to 18 and 19 to 25 year olds for those using transport as a result of their special educational needs and disabilities (SEND), and they were not able to segment the data in this way; and sometimes it was because they could not separate post-16 data from pre-16 data, as pupils and learners of different ages shared routes and vehicles.
- Questions were interpreted differently – emails received and comments left by local authorities made clear that they had interpreted some questions in different ways. For example, when asked for numbers of pupils and spend by ‘mode of transport’, the distinction between “dedicated school transport” and “any other vehicle” was unclear and local authorities responded differently; and when asked about pupils travelling in single occupancy vehicles, some local authorities excluded pupils who were the only pupil travelling from an area to their school, although we had intended for these pupils to be included. We have confidence in the percentages presented, as only a small number of the local authorities included raised this issue.
- Errors in the data returned – such as typographical errors and inconsistencies between the numbers of pupils reported for different questions. We performed checks on the data, including comparing reported spend with the spend recorded in Section 251 data returns for 2023-24 (Section 251 is part of the Apprenticeships, Skills, Children and Learning Act 2009 – local authorities submit statements of planned and actual expenditure on education and children’s social care) and comparing the numbers of pupils reported in responses to different questions; however, we were unable to contact every local authority whose data appeared to contain errors. Of the 15 local authorities we did contact, five subsequently returned corrected data. We are aware of some reasons for the differences in spend: a small number of local authorities were unable to separate their spend between pre-16 and post-16 as pupils and learners use the same vehicles and routes; and a small number said they had not included all overheads in the mode data, such as the cost of vehicles, passenger assistants and office staff. For the local authorities included in the percentages presented, the difference in the overall number of pupils/learners is small, though there is a slightly larger difference in the spend data. An illustration of the kind of consistency checks applied is sketched after this list.
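As noted in the final bullet above, reported figures were cross-checked against other returns and against each other. The sketch below shows the general shape of such checks only; the field names (`reported_spend`, `s251_spend`, `total_pupils`, `breakdown_pupils`) and the 10% tolerance are illustrative assumptions, not the variables or thresholds used in the actual validation.

```python
def check_return(la_name, reported_spend, s251_spend, total_pupils, breakdown_pupils,
                 tolerance=0.10):
    """Flag a local authority return for follow-up if its figures look inconsistent.

    `tolerance` is an assumed relative difference (10%) allowed between reported
    HTST spend and the spend recorded in the Section 251 return.
    """
    issues = []

    # Check 1: reported spend against the Section 251 expenditure return.
    if s251_spend and abs(reported_spend - s251_spend) / s251_spend > tolerance:
        issues.append(
            f"spend differs from Section 251 by more than {tolerance:.0%} "
            f"({reported_spend:,} vs {s251_spend:,})"
        )

    # Check 2: detailed breakdowns (e.g. by mode of transport) should not sum to
    # more than the headline pupil count for the same question.
    if breakdown_pupils is not None and sum(breakdown_pupils) > total_pupils:
        issues.append("breakdown sums to more pupils than the headline figure")

    return {"local_authority": la_name, "issues": issues, "contact": bool(issues)}

# Illustrative usage with made-up figures.
print(check_return("LA A", reported_spend=5_200_000, s251_spend=4_500_000,
                   total_pupils=1_200, breakdown_pupils=[700, 450, 100]))
```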
As this is a new data collection, we expect the quality of the data returns to continue to improve over time as the collection becomes established. For example, we expect reductions in the number of unknown values for detailed breakdowns, improvements in our guidance and categories, and increased response rates. Caveats for each data item are given in the accompanying footnotes.