Academic year 2019/20

Progression to higher education or training


Introduction

These statistics show the percentage of level 3 pupils (those who studied A levels, tech levels and applied general qualifications) continuing to a sustained education or training destination at level 4 or higher (such as degrees, higher apprenticeships and higher national diplomas) in the two years after completing 16 to 18 study.

The release also provides information on destination outcomes for different groups of pupils and education providers.



What is progression to higher education or training?

Progression to higher education or training shows the percentage of students who sustain an education course or apprenticeship at level 4 or higher in the two years following their 16 to 18 study. The most recent data reports on students who completed 16 to 18 study in the 2017/18 academic year and identifies their education and/or apprenticeship destinations in the two years following their last attendance at a 16 to 18 institution.

The measure is designed to complement the existing destination measures (Destinations after KS4 and 16-18 study), which provide more information on destinations not featured here, such as employment and further study at level 3 or below. These measures serve a number of purposes: assisting with provider choice, encouraging provider improvement via the school performance tables, and informing the public and stakeholders for policy development.

This progression measure differs from the original measures in that it uses a two-year destination window (rather than one year) in order to better capture students who take gap years and similar breaks. It also calculates value-added scores for state-funded mainstream institutions, which take prior attainment at GCSE and main qualification type into account.

Timeliness of data 

There is a time lag between students completing their 16 to 18 study and this measure being published: two years must elapse while young people participate in their chosen destinations, and datasets must then be combined before sustained participation in education or apprenticeships can be measured.

What is a ‘sustained’ destination? 

To be counted in a level 4 or higher destination, students must be recorded as having sustained participation for six consecutive months within the two-year destination window. This participation can be in a single destination or a combination of destinations, as long as there are six consecutive months at level 4 or higher.
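The consecutive-months rule can be sketched in code. The following is an illustrative Python sketch only, not DfE's actual methodology code; representing the two-year window as 24 month-by-month flags is an assumption made for illustration:

```python
def sustained_level4(monthly_activity, required_run=6):
    """Return True if there is a run of at least `required_run`
    consecutive months of level 4+ activity.

    `monthly_activity` is a list of 24 booleans, one per month of the
    two-year destination window, marking whether the student was
    recorded in ANY level 4 or higher education or apprenticeship
    destination that month (so a combination of destinations counts,
    as long as the run is unbroken).
    """
    run = 0
    for in_level4 in monthly_activity:
        run = run + 1 if in_level4 else 0
        if run >= required_run:
            return True
    return False

# A student who starts a degree after a gap year still counts:
gap_year_then_degree = [False] * 12 + [True] * 12
print(sustained_level4(gap_year_then_degree))  # True

# Repeated short spells broken by gaps do not:
stop_start = ([True] * 3 + [False]) * 6
print(sustained_level4(stop_start))  # False
```

Note that because any qualifying destination counts towards the run, a student who moves directly from one level 4+ course to another still satisfies the six-month rule.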

Who is included in the cohort? 

This measure is restricted to students who studied level 3 qualifications, as there is less expectation for students studying qualifications at lower levels to progress to level 4 or higher. It thus includes students who studied academic qualifications such as A levels, applied general qualifications, tech levels, or other qualifications that are not included in performance tables but are notionally level 3. State-funded mainstream schools and colleges are included. The cohort comprises students who completed their 16 to 18 study in 2017/18, and focuses on activity during the two years after they last attended a 16 to 18 provider.

Underlying cohort changes for this cohort

The cohort for this progression measure comprises students who predominantly studied level 3 qualifications during their 16 to 18 study. However, when deciding at which level to place a student who studied multiple qualification types at different levels, our process gives precedence to the qualifications that are included in the performance tables.

In 2018 the list of requirements a vocational qualification must meet for inclusion in performance tables was extended, with the result that many applied general qualifications (AGQs) and tech levels were removed. Many students studying these qualifications in the 2017/18 cohort featured in this statistical release were therefore reclassified from AGQ/tech level to “Other level 3”, and those who had additionally studied any approved level 2 qualifications were excluded from the measure as “Level 2” students. This shift in cohort composition accounts for most of the large year-on-year change seen in this year's results.

How does the value-added score (or “Progression score”) work? 

The probability of a student progressing to a level 4 or higher destination is strongly correlated with their prior attainment at key stage 4 (GCSE) and the qualification type they study at 16 to 18. An institution with an intake of high-prior-attainment pupils will naturally have a higher rate of progression to level 4+ than one with an intake of low-prior-attainment pupils. For this reason we calculate a “value-added” score, presented alongside the progression rate, which indicates how the institution has performed once prior attainment and qualification types are taken into account. The score is calculated by comparing each individual student’s outcome (1 if they progress to level 4 or higher, 0 if they do not) against the national average for students with similar prior attainment and qualification type. If, for example, 85% of the highest-prior-attainment academic students progressed to higher education or training nationwide, then an individual student in that group will score 1 - 0.85 = +0.15 if they progress, but 0 - 0.85 = -0.85 if they do not.

These individual student scores are then averaged across the institution and multiplied by 100 to obtain the VA score. A VA score of +10 thus means that students at that institution (or group) progressed to level 4 or higher destinations at a rate ten percentage points higher than similar students nationally. A VA score of zero shows that progression for that group was as expected given the national average.
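As a concrete illustration, the calculation can be sketched as follows. This is a simplified sketch of the published method, not DfE's actual code, and the cohort and national rates are hypothetical:

```python
def progression_va_score(students):
    """Value-added (progression) score for a group of students.

    Each student is an (outcome, national_rate) pair: outcome is 1 if
    they sustained a level 4+ destination and 0 otherwise, and
    national_rate is the national progression rate for students with
    similar prior attainment and qualification type. The score is the
    mean of (outcome - national_rate), multiplied by 100 so it reads
    as percentage points above or below similar students nationally.
    """
    diffs = [outcome - rate for outcome, rate in students]
    return 100 * sum(diffs) / len(diffs)

# Hypothetical institution: three students from a group that progresses
# at 85% nationally (two progress, one does not) and one student from a
# 60% group who progresses. Individual scores are +0.15, +0.15, -0.85
# and +0.40, averaging to -0.0375, i.e. a VA score of about -3.75.
cohort = [(1, 0.85), (1, 0.85), (0, 0.85), (1, 0.60)]
print(progression_va_score(cohort))
```

A student who fails to progress pulls the score down far more (-0.85 in the 85% group) than a progressing student lifts it (+0.15), which is exactly what makes the score centre on zero when an institution performs in line with the national average.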

Bands have been determined for providers to help put their scores in context. These take confidence intervals into account, as the score is likely to be a more accurate representation of the value added by the institution for larger cohorts than for small ones.

Individual student scores have also been averaged at local authority level, parliamentary constituency level, national level, and for various characteristics. For these purposes they are referred to as “Progression scores” rather than "Value-added scores".

Impact of Covid-19 on destination measures

As this publication looks at activity in the 2018/19 and 2019/20 academic years, it is mostly unaffected by COVID-19 disruption. The only exception is data for students progressing to apprenticeships in the 2019/20 academic year. This is because the methodology counts an apprenticeship as sustained if it is held for 6 months at any time in the 2019/20 academic year (August 2019 to July 2020), rather than only in the first two terms, as is the case for other destinations.

Comparison to previous year

Overall and degree destinations appear to have risen due to a change in which students are included

The proportion of students progressing to a level 4 or higher destination rose by 2.4 percentage points, with most of this difference seen in degree destinations which increased from 59.5% to 61.9%.

The reported increase in degree destinations can be understood by examining the cohort make-up. Compared to the 2016/17 cohort: 

  • The number of academic/applied general qualification (AGQ) students fell by approximately 53 thousand, with an increase in the academic count partially masking an even greater drop in the number of students doing approved AGQs. This shifted the balance within the academic/AGQ group towards academic students, resulting in a large increase in the proportion progressing to level 4 or higher destinations (from 68.2% to 73.1%) as academic students tend to progress at the highest rate.
  • The number of tech level students fell from 49 thousand to 10 thousand (with a small change in progression rate that is difficult to interpret given the much reduced cohort size).
  • The number of Other level 3 students dramatically increased as it expanded to include the majority of the students studying qualifications that had previously been classed as approved AGQs or tech levels. This addition of former AGQ/tech-level students resulted in a large boost to the progression rate for the Other level 3 group, from 23.6% last year to 44.9%.

Last year's 16-18 destinations publication (which reported on the same cohort due to a reduced time-lag) showed an increase in the number of approved level 2 students, accounting for the remaining drop in AGQ and tech level students. The influx of former AGQ/tech-level students to this group raised its HE destination figure from 0.9% to 4.5%, similar to the effect seen on the Other level 3 group in this measure.

Provider type

Schools outperform colleges in progression rate, but the gap closes for progression score

Further education (FE) sector colleges had a low rate of progression to higher education or training at 48.8% (compared with the national average of 66.2%). Sixth form colleges scored much higher at 70.7%, but this was still below the average progression rate of state-funded mainstream (SFM) schools (72.8%).

Part of the explanation for the different progression rates between schools and colleges might be the different intentions of the students. For example, the 16-18 destinations measure for this 2017/18 cohort's level 3 students showed a high employment rate of 31% for students from FE sector colleges (compared with 20% for SFM schools).

It is also notable that despite a 15.8 percentage point gap in progression rates between all SFM schools and all SFM colleges, the difference shrank to 1.9 percentage points in progression scores (+0.8 for schools vs -1.1 for colleges), demonstrating that progression for the two provider types was much closer once prior attainment and qualification type were taken into account.

Within the group of state-funded mainstream schools there was high variability in progression rate between the different school types, ranging from 75.0% for students at Free schools to 50.7% for those at Studio schools.  

The divide between selective schools and non-selective schools in highly-selective areas remains notable

Students from selective schools had the highest progression rate at 87.3% (well above the national average of 66.2%), as would be expected for those providers that chiefly enrol students of high academic ability. However, selective schools also achieved a positive progression score of +2.2, demonstrating that they progressed more students to level 4 or higher destinations than the national average even after having taken prior attainment and qualification type into account.

By contrast students from non-selective schools in highly-selective areas (such as Kent and Buckinghamshire) not only progressed at a much lower rate (58.5%) but also obtained a negative progression score of -5.0, showing that these students were five percentage points less likely to progress than other students with similar GCSE grades elsewhere in the country. This difference in progression scores could be due to students at non-selective institutions in highly-selective areas seeing fewer of their peers aiming for higher education than in other areas. The 16-18 destinations measure showed a higher rate of employment, level 2 and level 3 apprenticeship destinations than the national average for the 2017/18 non-selective-school-in-highly-selective-area leavers, and correspondingly a lower rate than average in these destinations for selective-school leavers.

Regional results

London continues to produce the highest rates of progression by far

Large variation is seen in rates of progression to higher education and training across the country, with students in Inner London most likely to progress (78.2%), and students in the South West least likely to progress (59.6%). This difference between the two regions remains when prior attainment and qualification type are taken into account, with students from providers in Inner London achieving a progression score of +12, compared to -6 for those from providers in the South West. This can be interpreted as saying that a student of given GCSE results and qualification type is 18 percentage points less likely to progress if they live in the South West than in Inner London.

Part of the reason for this large progression gap is likely to be proximity to Higher Education Institutions (HEIs). Students from Inner London might have the opportunity to sustain degree destinations while living at home, while those from the South West, with far fewer local options, may find the necessary travel and rental costs prohibitive. Another contributing factor to London's high progression score could be that it has a higher-than-average ratio of schools to colleges, which might create a stronger bias towards education destinations over employment. However, demographics of the different regions may also play a role, as is discussed in the Student characteristics section.

Outer London had the second highest progression score, followed by the West Midlands, Yorkshire and The Humber, and the North West, which all achieved positive progression scores. Students from the East of England, South East, and South West were the only ones to average negative progression scores, showing that they were less likely to progress than the national average for students of similar prior attainment and qualification type.

Local authority results

Cities show higher progression than rural and coastal local authorities

The chart of progression score by local authority reveals the cluster of high-progressing London boroughs that would be expected given the region's progression rate. However, the map also shows high progression in local authorities corresponding to urban regions in the North and Midlands. For example, Liverpool, Manchester, Sheffield and Birmingham local authorities all have positive progression scores. In contrast, rural and coastal local authorities predominantly have negative progression scores. As with the regional analysis, two factors producing this outcome might be proximity to HEIs and demographics.

Student characteristics

Disadvantaged students were less likely to progress than their non-disadvantaged peers, but achieved higher average progression scores

Disadvantaged students (those eligible for pupil premium in Year 11) were 5.5 percentage points less likely to progress than other students. However, their positive progression score shows that they were actually nearly 3 percentage points more likely to progress than their peers once prior attainment and qualification type are taken into account. This might be partly explained by the fact that London, with its high progression score, has by far the largest proportion of disadvantaged students of any region. An alternative reason could be that those disadvantaged students that enter level 3 study do so with plans for a particular destination.

Though disadvantaged students were five percentage points less likely to sustain a degree destination than all other students, the difference was even larger for destinations at top third higher education institutions, with disadvantaged students approximately half as likely to attend a top-third institution (10.2% vs 19.1% for all other students). A similar disparity was seen for level 4 apprenticeship destinations.

Female students were more likely to progress than males

Female students were 5.4 percentage points more likely to sustain a higher education or training destination than male students. This remained the case even after prior attainment and qualification type were taken into account, as can be seen by the positive average progression scores of female students (and the negative scores of male students).

Interestingly, this trend was reversed for the small subgroup of tech level students, with male tech level students more than 7 percentage points more likely to progress than female tech level students. This might be due to gender biases in the tech level subjects chosen, of which some are more inclined to lead to higher education and training than others.

Large variations in progression rates and scores by major ethnicity group

Students from the Chinese major ethnicity group were the most likely to sustain a level 4 or higher destination (89.2%), more than 27 percentage points ahead of students from the White major ethnicity group, who had the lowest progression rate (besides Unclassified). Once prior attainment and qualification type were accounted for, students from the Black or Black British major ethnicity group achieved the highest progression scores (+18.6), followed by students from the Asian or Asian British major ethnicity group (+15.6).

Part of the explanation for this result might be demographic. High-progressing London has a higher proportion of students from the Black or Black British major ethnicity group than other regions in England, and so it might be that these students are benefitting from the wealth of HEIs in the city. Alternatively, given that students from these major ethnicity groups progress at higher-than-average rates outside of London, it might be that the contribution from these high-progressing groups is a factor in London's extraordinary progression rate.

Students from the White major ethnicity group were the only ones to average a negative progression score (again with the exception of Unclassified). However, while they were a full 31 percentage points less likely than students from the Chinese group to sustain a degree destination, students from the White major ethnicity group were the most likely to have an apprenticeship or level 4/5 destination.

Help and support

Methodology

Find out how and why we collect, process and publish these statistics.

Official statistics

These are Official Statistics and have been produced in line with the Code of Practice for Statistics.

This can be broadly interpreted to mean that these statistics:

  • are managed impartially and objectively in the public interest
  • meet identified user needs
  • are produced according to sound methods
  • are well explained and readily accessible

Find out more about the standards we follow to produce these statistics through our Standards for official statistics published by DfE guidance.

Our statistical practice is regulated by the Office for Statistics Regulation (OSR).

OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics that all producers of official statistics should adhere to.

You are welcome to contact us directly with any comments about how we meet these standards. Alternatively, you can contact OSR by emailing regulation@statistics.gov.uk or via the OSR website.

Contact us

If you have a specific enquiry about Progression to higher education or training statistics and data:

Destination measures

Email: Destination.MEASURES@education.gov.uk
Contact name: Robin Davis

Press office

If you have a media enquiry:

Telephone: 020 7783 8300

Public enquiries

If you have a general enquiry about the Department for Education (DfE) or education:

Telephone: 037 0000 2288

Opening times:
Monday to Friday from 9.30am to 5pm (excluding bank holidays)