Academic year 2020/21

Progression to higher education or training

This is not the latest data. View latest data: Academic year 2022/23

Introduction

These statistics show the percentage of level 3 students (for example, those that studied A levels, tech levels and applied general qualifications) continuing to a sustained education or training destination at level 4 or higher (such as degrees, higher apprenticeships and higher national diplomas) in the two years after completing 16 to 18 study.

The release also provides information on destination outcomes for different groups of pupils and education providers.


Headline facts and figures - 2020/21

Explore data and files used in this release

  • View or create your own tables

    View tables that we have built for you, or create your own tables from open data using our table tool

  • Data catalogue

    Browse and download open data files from this release in our data catalogue

  • Data guidance

    Learn more about the data files used in this release using our online guidance

  • Download all data (ZIP)

    Download all data available in this release as a compressed ZIP file

Additional supporting files

All supporting files from this release are listed for individual download below:

What is progression to higher education or training?

Progression to higher education or training shows the percentage of students that sustain an education course or apprenticeship at level 4 or higher in the two years following their 16 to 18 study. The most recent data reports on students who completed 16 to 18 study in the 2018/19 academic year and identifies their education and/or apprenticeship destinations in the two years following their last attendance at a 16 to 18 institution.

The measure is designed to complement the existing destination measures (Destinations after KS4 and 16-18 study), which provide more information on destinations not featured here, such as employment and further study at level 3 or below. These measures are produced for a number of purposes, including assisting with provider choice and encouraging provider improvement via school performance data, and informing the public and stakeholders for policy development.

This progression measure differs from the existing destination measures in that it uses a two-year destination window (rather than one year) in order to better capture students that take gap years and similar breaks. It also calculates value-added scores, which take both prior attainment at GCSE and main qualification type into account.

Timeliness of data 

There is a time lag between students completing their 16 to 18 study and this measure being published. Two years have to elapse during which young people are participating in their chosen destination, and datasets have to be combined before measuring sustained participation in education or apprenticeships. This publication reports on students that completed their 16 to 18 study in summer 2019, and considers their destination activity up to summer 2021.

What is a ‘sustained’ destination? 

To be counted in a level 4 or higher destination, students have to be recorded as having sustained participation for a six-month period within the two-year destination window. This participation can be in a single destination or a combination of destinations, as long as there are six consecutive months at level 4 or higher.
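
To illustrate the definition, the sketch below (Python, not the production methodology code) checks whether a hypothetical student's two-year window contains six consecutive months of level 4+ participation, assuming we already have one flag per month indicating such participation.

```python
def is_sustained(monthly_level4_flags, required_run=6):
    """Return True if there are at least `required_run` consecutive months
    of level 4+ participation (in one destination or a combination)."""
    run = 0
    for in_level4 in monthly_level4_flags:
        run = run + 1 if in_level4 else 0
        if run >= required_run:
            return True
    return False

# Example: level 4+ participation starting in month 4 of the two-year window
# and lasting nine months counts as a sustained destination.
window = [False] * 3 + [True] * 9 + [False] * 12   # 24 monthly flags
print(is_sustained(window))  # True
```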

Who is included in the cohort? 

This measure is restricted to students that studied level 3 qualifications, as there is less expectation for students studying qualifications at lower levels to progress to level 4 or higher. It includes students that studied A levels and other academic qualifications, applied general qualifications, tech levels, or other qualifications that are not included in performance data but are notionally level 3. State-funded mainstream schools and colleges are included. The cohort includes students who completed their 16 to 18 study in 2018/19, and focuses on activity during the two years after they last attended a 16 to 18 provider.

How does the value-added score (“Progression score”) work? 

The probability of a student progressing to a level 4 or higher destination is strongly correlated with their prior attainment at key stage 4 (GCSE and equivalents) and the qualification type they study at 16 to 18. An institution that starts with an intake of high-prior-attainment pupils will naturally have a higher rate of progression to level 4+ than an institution with an intake of low-prior-attainment pupils. For this reason we calculate a “value-added” score, presented alongside the progression rate, which indicates how the institution has performed once prior attainment and qualification types are taken into account. The score is calculated by comparing each individual student’s outcome (1 if they progress to level 4 or higher, 0 if they do not) against the national average for students with similar prior attainment and qualification type. If, for example, 85% of the highest-prior-attainment academic students progressed to higher education or training nationwide, then an individual student in that group will score 1 – 0.85 = +0.15 if they progress, but 0 – 0.85 = -0.85 if they do not.

These individual student scores are then averaged for the institution and multiplied by 100 to obtain the VA score. A VA score of, for example, +7 means that students at that institution (or in that group) progressed to level 4 or higher destinations at a rate seven percentage points higher than similar students nationally. A VA score of zero shows that progression for that group was in line with the national average.
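
As a rough illustration of this calculation, the following Python sketch computes a VA score for a tiny made-up cohort, assuming we already know the national progression rate for each combination of prior-attainment band and qualification type; the band names, qualification groupings and rates here are hypothetical, not the published specification.

```python
# National progression rates by (prior-attainment band, qualification type).
# These values are made up for illustration only.
national_rates = {
    ("high_attainment", "academic"): 0.85,
    ("mid_attainment", "academic"): 0.60,
    ("mid_attainment", "applied_general"): 0.45,
}

# One record per student at the institution:
# (prior band, qualification type, 1 if they progressed to level 4+, 0 if not).
students = [
    ("high_attainment", "academic", 1),        # scores 1 - 0.85 = +0.15
    ("high_attainment", "academic", 0),        # scores 0 - 0.85 = -0.85
    ("mid_attainment", "applied_general", 1),  # scores 1 - 0.45 = +0.55
]

# Each student's score is their outcome minus the national rate for similar students.
scores = [outcome - national_rates[(band, qual)]
          for band, qual, outcome in students]

# Average across the institution and multiply by 100 to express the score
# in percentage points, as in the publication.
va_score = 100 * sum(scores) / len(scores)
print(round(va_score, 1))  # -5.0 for this tiny made-up cohort
```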

At institution level we also include confidence intervals, because the score is likely to be a more accurate representation of the value added by the institution for larger cohorts than for smaller ones.

Individual student scores have also been averaged at local authority level, parliamentary constituency level, national level, and for various characteristics.

Impact of Covid-19 on destination measures

This publication looks at destination activity across the 2019/20 and 2020/21 academic years, so the results may have been affected by COVID-19 disruption. The majority of sustained progression destinations will have been completed by March 2020, just before the first lockdown began at the end of that month; however, destinations that would normally have been achieved in the second year may have been affected.

Comparison to previous year

Last year's release saw a large increase in overall progression, caused by a shift in the cohort's qualification types: many level 3 vocational qualifications with lower progression rates had their performance data status withdrawn, so some of their students were reclassified as level 2 students and removed from the cohort of this level 3-only measure. This year the results stabilised and are very similar to last year's. Overall progression fell by 0.2 percentage points, with degree destinations falling by 0.4 percentage points while level 4/5 destinations increased by 0.3 percentage points.

Provider type

Students from state-funded mainstream (SFM) schools are much more likely to progress to level 4 or higher education and training (74.6%) than students from SFM colleges (54.9%). 

However, this might be due in part to different intentions among school and college students. For example, the 16 to 18 standard destination measures for this cohort (published last year, as the standard measure only considers one year of activity rather than the two used here) showed that while students from schools and colleges had more similar rates of overall sustained destinations (89.3% and 85.6% respectively), students from colleges were much more likely to sustain an employment destination (27.6%, compared to 19.8% for school students) and less likely to sustain an education destination.

Another factor in the school vs college difference is likely to be that schools tend to have higher prior-attainment intakes, while colleges are more likely to have students opting for vocational qualifications and employment destinations. This can be seen in the value-added progression scores: +0.9 for schools vs -1.2 for colleges, which shows that despite a gap in overall progression of nearly 20 percentage points, the difference reduces to 2.1 percentage points once prior attainment and qualification type are taken into account.

There is wide variation in progression by institution type within the schools grouping and within the colleges grouping.

For example, within SFM colleges, students from Sixth form colleges are much more likely to progress (73.1%) than those from other FE-sector colleges (47.0%). Within SFM schools, students from Free schools had the highest rate of progression (83.1%) while students from Studio schools had the lowest (55.6%). However, once prior attainment and qualification types were taken into account, Free schools continued to perform well with a high progression score of +7.1, and Studio schools were also above the national average with a progression score of +1.2.

Students from non-selective schools in highly-selective areas continue to progress below the national average even once prior attainment is taken into account.

 Students from non-selective schools in highly-selective areas (such as Kent, Lincolnshire and Buckinghamshire) were much less likely to progress to a level 4 or higher destination (62.0%) than students at non-selective schools in other parts of the country (73.5%). While students at Other non-selective schools achieved an average progression score of +1.0, those from non-selective schools in highly-selective areas received a progression score of -4.0, the lowest of all institution types. 

This can be interpreted as showing that for two students with the same GCSE results and studying the same qualification types, both at non-selective schools, the one studying in a highly-selective area is five percentage points less likely to progress to a level 4 or higher destination than the student studying elsewhere.

Students from selective schools continued to progress at a very high rate (88.5%) with an average progression score of +2.4.

Regional results

The gap in progression between London and the South West widens slightly

London continues to have the highest rates of progression to level 4 or higher (77.0%), while the South West continues to have the lowest (59.5%). This difference remains when prior attainment and qualification type are taken into account, with London having a progression score of +9, compared to the South West's -6. This 15 percentage point difference is one ppt larger than the 14 percentage point difference seen in last year's results.

The gap between Inner London and the South West is even larger, such that a student living in the South West is 18 percentage points less likely to progress to a level 4 or higher destination than a similar student in Inner London with the same prior attainment and studying the same types of qualifications.

Part of the reason for this large progression gap is likely to be proximity to Higher Education Institutions (HEIs). Students from Inner London might have the opportunity to sustain degree destinations while living at home, while those from the South West, with far fewer local options, may find the necessary travel and rental costs prohibitive. Another contributing factor to London's high progression score could be that it has a higher-than-average ratio of schools to colleges, which might create a stronger bias towards education destinations over employment. However, demographics of the different regions may also play a role, as is discussed in the Student characteristics section.

The South East and East of England also had low progression, with scores of -4. The North East had a slightly higher score than in last year's results, rising from the national average (0) to +1. This was despite having the second lowest rate of degree destinations (56.4%), after the South West (54.4%), as the North East had by far the highest rate of level 4 or level 5 course destinations (5.7%, around twice the national average of 2.8%).

Local authority results

Urban local authorities show higher rates of progression than those in rural and coastal areas

The most striking feature of the map of progression scores by local authority is that the majority of high-scoring LAs appear to be clustered around the urban centres of large cities such as Manchester, Birmingham, Newcastle, Sheffield and particularly London, while the LAs of coastal and rural areas are more likely to have negative scores.

When the LAs are matched to the Rural Urban Classification lookup tables we see that “Predominantly Rural” local authorities average a score of -5.0, “Urban with Significant Rural” LAs average -4.1, while “Predominantly Urban” LAs average +2.6.
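
For readers working with the open data, a minimal Python sketch of this kind of join-and-average step is shown below; the file and column names are assumptions rather than the actual published data structure.

```python
# Attach each local authority's Rural Urban Classification from a lookup
# table, then average the LA-level progression scores within each class.
import pandas as pd

la_scores = pd.read_csv("la_progression_scores.csv")   # columns: la_code, progression_score
ruc_lookup = pd.read_csv("rural_urban_lookup.csv")     # columns: la_code, classification

merged = la_scores.merge(ruc_lookup, on="la_code", how="left")
print(merged.groupby("classification")["progression_score"].mean().round(1))
```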

As discussed in the context of the differences between London and the South West in the regional section of this publication, reasons for these differences might include demographics and also accessibility of local Higher Education Institutions (HEIs).

Student characteristics

Disadvantaged students (those eligible for pupil premium in year 11) were less likely to sustain a level 4 or higher destination (61.8%) than other students (67.0%)

This gap is marginally smaller than in last year's results, with the disadvantaged figure rising by 0.1 percentage points and the not-disadvantaged figure falling by 0.2 ppts.

However, this gap reverses when looking at the value-added scores, which take prior attainment at GCSE and qualification type into account. Disadvantaged students scored +2.5, compared to -0.6 for other students, showing that a student of given prior attainment and qualification type is actually slightly more likely to progress if part of the disadvantaged cohort (the lower overall rate of sustained level 4+ destinations for disadvantaged students is consistent with this if disadvantaged students begin with lower prior attainment on average, or are more likely to study qualification types with lower rates of progression). This could be because disadvantaged students are more likely to enter level 3 study with the intention of progressing, or because they are boosted by the “London effect”, as London has by far the highest proportion of disadvantaged students and a very high progression rate.

While disadvantaged students were slightly more likely to sustain a level 4 or level 5 destination (3.2%) than other students (2.7%), they were much less likely to sustain a degree destination at a top-third higher education institution (10.0% vs 18.8%).

Female students were more likely to progress to a level 4 or higher destination (69.0%) than male students (62.6%)

This 6.4 percentage point gap is larger than the 5.4 ppt gap in last year's results. The pattern was the same in progression scores (i.e., having taken prior attainment and qualification types into account) but to a smaller extent, with female students scoring 2.8 ppts higher than males, compared to a gap of 1.9 ppts last year. 

Though female students were much more likely to sustain a degree destination, male students were slightly more likely to sustain a level 4 or level 5 destination and more than twice as likely to sustain an apprenticeship.

As has been observed in previous years, while female students of academic, applied general and other level 3 qualifications were more likely to sustain a level 4+ destination than male students of the same qualification types, this pattern reversed for Tech levels, with male Tech level students obtaining a higher progression score (+3.5) than female students (-2.4). This may be due to differences in the particular Tech level subjects chosen by male and female students, as some subjects typically lead to higher rates of education or apprenticeship destinations than others.

There is large variability in the rate of progression by ethnicity group

Students from the Chinese major ethnicity group were the most likely to sustain a level 4 or higher destination (88.7%), more than 27 percentage points ahead of students from the White major ethnicity group, who had the lowest progression rate. Once prior attainment and qualification type were accounted for, students from the Black or Black British major ethnicity group achieved the highest progression scores (+19.1), followed by students from the “Any other ethnic group” (+14.7) and the Asian or Asian British major ethnicity group (+14.4).

Part of the explanation for this result might be demographical. High-progressing London has a higher proportion of students from the Black or Black British major ethnicity group than other regions in England, and so it might be that these students are benefitting from the wealth of HEIs in the city. Alternatively, given that students from these major ethnicity groups also progress at higher-than-average rates outside of London, it might be that the contribution from these high-progressing groups is a factor in London's very high progression rate.

Students from the White major ethnicity group were the only ones to average a negative progression score. However, while they were more than 30 percentage points less likely than students from the Chinese group to sustain a degree destination, they were more likely than students from any other group (besides the very small Unclassified group) to have an apprenticeship or level 4/5 destination.

Help and support

Methodology

Find out how and why we collect, process and publish these statistics.

Official statistics

These are Official Statistics and have been produced in line with the Code of Practice for Statistics.

This can be broadly interpreted to mean that these statistics:

  • are managed impartially and objectively in the public interest
  • meet identified user needs
  • are produced according to sound methods
  • are well explained and readily accessible

Find out more about the standards we follow to produce these statistics through our Standards for official statistics published by DfE guidance.

Our statistical practice is regulated by the Office for Statistics Regulation (OSR).

OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics that all producers of official statistics should adhere to.

You are welcome to contact us directly with any comments about how we meet these standards. Alternatively, you can contact OSR by emailing regulation@statistics.gov.uk or via the OSR website.

Contact us

If you have a specific enquiry about Progression to higher education or training statistics and data:

Destination measures

Email: Destination.MEASURES@education.gov.uk
Contact name: Robin Davis

Press office

If you have a media enquiry:

Telephone: 020 7783 8300

Public enquiries

If you have a general enquiry about the Department for Education (DfE) or education:

Telephone: 0370 000 2288

Opening times:
Monday to Friday from 9.30am to 5pm (excluding bank holidays)