Further education and skills statistics: methodology
The further education (FE) and skills statistics covered in this user guide include all-age apprenticeships and wider adult (19+) FE learning, such as community learning, education and training, and English and maths qualifications.
We publish associated data and statistics on activity in the FE system on a regular basis. Quarterly ‘main’ releases for each academic year include national data for England and further breakdowns by, for example, learner characteristics, subject, provider and geography, with a final release for the academic year based on final end-of-year data for the period.
We also release headline apprenticeship statistics on a monthly basis for transparency purposes. You can contact the team by emailing FE.OFFICIALSTATISTICS@education.gov.uk or using the contact details at the end of this methodology.
This methodology provides a brief overview of the FE system and also covers:
Users and uses of the statistics
Key historic policy changes
The quality of the statistics
Changes made to the statistics and plans for future changes
‘FE’ within the FE and skills and apprenticeship and traineeships publications covers learners who are studying courses in an FE College, with a training provider or within their local community. It also includes employees undertaking an apprenticeship or other qualification in the workplace.
The overall FE and skills participation statistics include four main types of FE provision: apprenticeships, workplace learning, community learning, and education and training provision (which includes offender learning).
FE learning can be undertaken via a range of FE institutions: General FE Colleges (including Tertiary), Sixth Form Colleges, Special Colleges (Agricultural and Horticultural Colleges, and Art and Design Colleges), Specialist Colleges and External Institutions.
The statistics also cover apprenticeships and traineeships for all ages 16 and over, and adult (19 and over) wider FE learning. Statistics on 16-18 activity delivered in FE institutions and schools are generally not covered in these releases; other statistical publications cover this type of post-16 learning, for example Participation in education, training and employment.
‘FE’ in these publications does not include ‘higher education’, such as undergraduate or graduate degrees, unless undertaken as part of an apprenticeship programme (see ‘apprenticeships’ below).
Apprenticeships are paid jobs that incorporate on-the-job and off-the-job training leading to nationally recognised qualifications. As an employee, apprentices earn as they learn and gain practical skills in the workplace.
There are currently two types of apprenticeships: ‘frameworks’ and ‘standards’.
An apprenticeship ‘framework’ typically contains the following separately certified elements:
A knowledge-based element (the theoretical knowledge underpinning a job in a certain occupation and industry, typically certified via a technical certificate).
A competence-based element (the ability to discharge the functions of a certain occupation, typically certified via work-based assessed national vocational qualifications (NVQs)).
As part of government reforms to apprenticeships, new employer-led apprenticeship ‘standards’ were introduced in 2014; these were initially known as ‘trailblazers’.
Apprenticeship standards are designed with employers to help ensure apprentices have the skills businesses need, and outline the skills, knowledge and behaviours (KSBs) required to carry out a certain job role. All apprentices must take an independent assessment at the end of their training to demonstrate the KSBs set out in the occupational standard.
New standards continue to be made available and have been replacing the older apprenticeship frameworks in recent years. All new apprenticeship starts will be on standards by the beginning of the 2020/21 academic year.
For further information on apprenticeships please see here.
Traineeships were introduced in the 2013/14 academic year to provide young people with essential work preparation, English, maths and work experience to secure an apprenticeship or other work, and can last up to 6 months. From August 2014, traineeships were available to young people aged 16-24; prior to this they were available only to young people aged 16-23.
Traineeships support social mobility by providing training for young people who need to develop their skills and experience to enter the labour market. They are aimed at those young people who are motivated to work but lack the necessary skills and work experience to gain an apprenticeship or other job.
Community learning funds a wide range of non-formal courses, from personal development through to older people’s learning, IT courses, employability skills, family learning and activities to promote civic engagement and community development. Courses may be offered by local authorities, colleges, and voluntary and community groups, and include activity targeted at deprived areas and disadvantaged groups.
English and maths courses are also referred to as ‘basic skills’ and cover GCSEs, Functional Skills, Adult Basic Skills Certificates, including English for Speakers of Other Languages (ESOL) Certificates, and Qualifications and Credit Framework Certificates and Awards in English and maths.
Formerly known as ‘Skills for Life’ in the FE and skills statistics, they are designed to give learners the basic reading, writing, maths and communication skills necessary for everyday life, to operate effectively in work and/or to succeed on other training courses. Achieving a level 2 qualification in both English and maths is often required for further study, training and skilled employment.
Education & Training
Education and Training is mainly classroom-based adult further education that is not classed as an apprenticeship, community learning or workplace learning. It can also include distance learning or e-learning. It includes traineeships and offender learning.
Historic FE provision and statistics
The FE and skills publication formerly included statistics on other FE learning and government programmes which are no longer relevant to the releases or no longer exist, such as:
Workplace Learning: this covered a broad range of training including basic skills, level 2, level 3 and higher-level skills. Workplace Learning was mainly delivered through the workplace and did not include apprenticeships, which are classified as a separate programme.
Apprenticeship Grant for Employers (AGE 16 to 24) scheme: this provided apprenticeship grants with a value of £1,500 to employers with up to 1,000 employees to encourage employers to take on new apprentices aged 16 to 24. Eligible employers must not have taken on an apprentice in the previous 12 months. Subject to budget availability and the employer’s commitment to support the apprentice to the end of their programme, any one employer could claim support for up to ten apprentices. AGE 16 to 24 was extended to the 2015-16 financial year.
Access to Apprenticeships pathway: here participants used elements of an apprenticeship framework to brush up skills and workplace experience, with the aim of securing a paid apprenticeship with an employer as quickly as possible, up to a maximum of six months. Participants were not categorised or counted as an ‘apprentice’ until they became employed. This pathway was announced in May 2011 and was closed to new starts at the end of December 2013.
Employer Ownership Pilot (EOP): offered all employers in England direct access to up to £340 million of public investment over the period of the pilot (up to 2015/16) to design and deliver their own training solutions. Data for learners undertaking learning as part of the EOP was collected in separate data collections to ILR learners.
Large Employer Outcome Pilot (LEOP): the LEOP dataset collected information on learners funded through this pilot scheme and included learners employed by 23 employers who were part of the scheme. As LEOP learning was undertaken under the auspices of employers, the employers were effectively learning providers and were treated as such in statistics that report by provider.
Offender learning: aims to ensure offenders have the skills that will enable them to gain worthwhile, sustainable employment and, in so doing, reduce the likelihood of re-offending. The Offender Learning and Skills Service (OLASS) funds a wide range of learning aims for adults (aged 18 and over) in custody in prisons in England, including awards and units, with a focus on addressing English and maths at the start of a sentence and vocational skills in the run-up to release. Offender learning was reported separately in the FE and skills statistics for the last time in November 2018, as both budget and delivery transferred to the Ministry of Justice from 1 October 2016, although associated learners continue to be included in our overall Further Education and Skills national figures.
Contextual data and statistics: the FE and skills release formerly included additional, more contextual data around FE, although users can still obtain related data from alternative sources, for example:
Vocational qualifications achievements data remains available via Ofqual publications.
Level of highest qualification held by economically active adults in the population (based on the Labour Force Survey) can still be obtained from Nomis provided by the Office for National Statistics, which allows users online access to UK labour market statistics.
Note: Nomis statistics will not be directly comparable to the previously published FE and skills data due to some methodological differences. The previously published data used an ONS-agreed methodology to enhance raw Labour Force Survey data by reference to educational administrative data; this corrected for underreporting of levels of qualification.
For previous announcements on the changes to FE and skills releases outlined above please see here:
FE and skills statistical releases do not contain information regarding provision for 16-18-year-olds except for apprenticeships, traineeships and in some cases community learning, regardless of educational institution.
These releases also do not include higher education, for example degrees, in higher education institutions (HEIs), but will include FE learning undertaken at an HEI, degrees undertaken at an FE institution and, as noted above, degrees undertaken as part of an apprenticeship programme.
Additionally, the FE and skills releases do not include privately funded training.
The Education and Skills Funding Agency (ESFA) brings together the responsibilities of two former non-departmental public bodies, the Education Funding Agency (EFA) and Skills Funding Agency (SFA), to create a single agency accountable for funding education and skills for children, young people and adults. The ESFA was formed in April 2017 and is now formally part of the DfE.
As stated in the ESFA business plan for the 2019 to 2020 financial year, they are accountable for almost £60 billion of funding for the education and training sector and regulate academies, FE and sixth form colleges, and training providers, intervening where there is risk of failure or evidence of mismanagement of public funds.
Since May 2017, funding arrangements for apprenticeships changed to give employers greater control over funding for apprenticeship training. The UK wide apprenticeship levy came into force on 6 April 2017 and required all UK public and private sector employers with an annual pay bill of £3 million or more to invest in apprenticeship training. The Apprenticeship Levy is a levy on UK employers to fund new apprenticeships and is charged at a rate of 0.5% of an employer’s pay bill.
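As a simplified sketch, the levy charge described above can be expressed as follows. This illustrates only the 0.5% rate and the £3 million pay bill threshold mentioned in this guide; the real scheme includes a levy allowance and other rules not modelled here:

```python
def apprenticeship_levy(annual_pay_bill: float) -> float:
    """Simplified apprenticeship levy calculation: 0.5% of the pay bill
    for employers whose annual pay bill is £3 million or more.
    Illustrative only: the actual scheme also applies a levy allowance
    and other rules not modelled here."""
    THRESHOLD = 3_000_000  # annual pay bill at which the levy applies
    RATE = 0.005           # 0.5% of the employer's pay bill

    if annual_pay_bill < THRESHOLD:
        return 0.0
    return annual_pay_bill * RATE
```

For example, under this simplification an employer with a £4 million pay bill would be charged £20,000, while an employer with a £2 million pay bill would pay nothing.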
The Apprenticeship Service was introduced in England as an online service to allow levy-paying employers to choose and pay for apprenticeship training more easily. They are able to access their levy funds to spend on apprenticeships, manage apprentices, pay training providers and stop/pause payments to training providers. Apprenticeships are a devolved policy, meaning that authorities in each of the UK nations manage their own apprenticeship programmes, including how funding is spent on apprenticeship training.
Employers register for an Apprenticeship Service Account (ASA) and the amount of funds available in each ASA will depend on how many of their employees live in England and the proportion of their pay bill paid to these employees.
Funds in ASAs can only be used to pay for apprenticeship training and assessment, and not other costs associated with apprentices, such as wages, statutory licences to practise, travel and subsidiary costs, work placement programmes or the setting up of an apprenticeship programme.
In April 2018 it became possible for levy-paying organisations to transfer up to 10 per cent of the annual value of funds entering their apprenticeship service account to other organisations on the apprenticeship service. From April 2019, this increased to 25 per cent.
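The transfer caps above can be sketched as a simple percentage of the annual funds entering an account (an illustration of the published caps only, not the full transfer rules):

```python
def max_transferable(annual_fund_entry: float, from_april_2019: bool) -> float:
    """Maximum share of the annual funds entering an apprenticeship
    service account that a levy-paying organisation may transfer to
    other organisations: 10% from April 2018, rising to 25% from
    April 2019. Illustrative sketch of the caps only."""
    rate = 0.25 if from_april_2019 else 0.10
    return annual_fund_entry * rate
```

So an organisation with £100,000 entering its account annually could transfer up to £10,000 under the 2018 cap and up to £25,000 under the 2019 cap.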
Employers who do not pay the levy can access apprenticeship training via their selected training provider who is given an allocation of funds for apprenticeship training from the ESFA. The provider claims for these funds against their allocation using the Individualised Learner Record (ILR - further details below) as was the case before the introduction of the levy.
From 9 January 2020 the apprenticeship service was extended for all non-levy paying employers to register and use. Employers who do not pay the apprenticeship levy are able to make a funding ‘reservation’ (or have a funding ‘reservation’ made on their behalf) with respect to an apprenticeship planned to start in the following 3 months. This allows non-levy paying employers to access the benefits of the system and reserve funds to support their training.
Until autumn 2020, employers who do not pay the apprenticeship levy were able to access apprenticeship training either through a provider with an existing Government contract or via the apprenticeship service. By autumn 2020, all new apprenticeship starts were to be arranged through the apprenticeship service.
Advanced Learner Loans
Advanced Learner Loans are available from Student Finance England for those aged 19+ on the first day of their course to help cover the costs of a Level 3, 4, 5 or 6 qualification at an approved college or training provider in England. The amount of funding depends on the type of course, the course fees and the maximum loan available for the course. The minimum loan is £300, and loans are paid directly to the provider. Loan balances owed for ‘Access to HE’ courses are written off once the learner has completed a higher education course, presuming the HE course is eligible for student finance. For more information on advanced learner loans, including funding rules and qualifications available for funding, please see here.
FE and skills releases use two different data sources to report information on loans:
Advanced learner loan participation data, which uses the ILR, and
Loan application data based on Student Loans Company (SLC) records.
The number of learners with an advanced loan reported through the ILR differs from data showing loan applications and loans paid based on SLC data.
Loan application information relates to applications received for FE provision funded through advanced loans. The data used to produce the publication are sourced from the SLC application database. The data refer to received applications, including those not yet processed, and will therefore differ from the actual number of learners participating on a course with an advanced loan.
Adult Education Budget
The scope of adult skills funding has changed over the years to reflect funding settlements. For many years there were separate funding lines for offender learning and Community Learning and the previously named Adult Skills Budget, which included funding for apprenticeships (for ages 19+). In addition, a separate funding line was introduced for advanced learner loans. However, more recently both the budget and responsibility for delivery of offender learning transferred to the Ministry of Justice (in October 2016) and since the introduction of the apprenticeship levy (in May 2017), apprenticeships (for ages 19+) are no longer included.
From 2016/17, the Adult Education Budget (AEB) has covered education and training for adults (aged 19+) from pre-entry to Level 3, including Community Learning, the Prince’s Trust Team Programme and 19-24 Traineeships and excluding apprenticeship learning and advanced learner loans.
The AEB provides funding to some providers of education and training with the aim to engage adults and provide the skills and learning they need to equip them for work, an apprenticeship or other learning.
It enables more flexible, tailored programmes of learning to be made available, which may or may not require a qualification, to help eligible learners engage in learning, build confidence and/or enhance their wellbeing.
This includes information on legal entitlements, flexibilities, funding for learners with an Educational Health Care Plan (EHCP) and for the unemployed, some of which is outlined here.
ESFA funded AEB supports three legal entitlements to full funding for eligible adult learners. These are set out in the Apprenticeships, Skills and Children's Learning Act 2009, and enable eligible learners to be fully funded for the following qualifications:
English and maths, up to and including level 2, for individuals aged 19 and over, who have not previously attained a GCSE grade A* - C or grade 4, or higher, and/or
first full qualification at level 2 for individuals aged 19 to 23, and/or
first full qualification at level 3 for individuals aged 19 to 23.
Funding for learners with an EHCP
ESFA has the responsibility for securing the provision of reasonable facilities for education and training suitable to the requirements of persons who are 19 and over. This includes learners with an identified learning difficulty or disability who have previously had an EHCP and have reached the age of 25.
ESFA funded AEB also supports delivery of flexible tailored provision for adults, including qualifications and components of these and/or non-regulated learning, up to level 2 – this is referred to as ‘local flexibility’. Local flexibility provision is either fully or co-funded, depending on the learner’s age, prior attainment and circumstances.
We have two main statistical publications that we release on a quarterly basis:
FE and skills publication: includes quarterly releases and covers overall FE and skills learning (including apprenticeships and traineeships), with breakdowns of non-apprenticeship FE learning.
Apprenticeship and traineeships publication: includes quarterly releases covering apprenticeship and traineeship learning.
Additionally, we publish headline apprenticeship starts data on a monthly basis, along with registrations and commitments data from the Apprenticeship Service for transparency purposes. Some of the statistics we release are published on an annual basis only; this includes statistics showing progress towards the public sector apprenticeship target (please see ‘Key historic policy changes’ below).
Quarterly release schedule:
Quarter 1: Data from August to October published in January
Quarter 2: Data from August to January published in March
Quarter 3: Data from August to April published in July
Full Year: Data from August to July published in November
Note: The academic year in the FE publications covers August to July.
The quarterly releases include national summary ‘headline’ data, along with a wider range of more detailed breakdowns, such as learner characteristics, subject information, geographical breakdowns and provider level data.
Individualised Learner Record (ILR)
The key data source used to produce the FE publications is the Individualised Learner Record (ILR) collection run by the ESFA, which is based on data returns from FE education colleges and training providers, about learners in the system and the learning undertaken by each of them. More background information on the ILR is available on page 30.
Apprenticeship Service data
Data from the Apprenticeship Service is increasingly being used in our statistical releases, including information on levy-paying and non-levy organisations, such as registrations, commitments and transfers (see ‘Apprenticeship Service’ in the ‘Funding’ section above for more information).
Public Sector returns
This data is used to measure progress towards the public sector apprenticeship target. It is based on self-declared information on organisation headcounts and apprentice recruitment that has been submitted to the Department by each public sector body that has determined itself to be in scope (see ‘Key historic policy changes’ in the annexes for more information on the public sector apprenticeship target).
Student Loans Company data
We also publish some information based on data collected by the Student Loans Company (see ‘Advanced learner loans’ in the ‘Funding’ section above for more information).
Historical data sources
Other data sources have previously been included in the published FE statistics, including pilot data (e.g. EOP/LEOP data), Ofqual data and data from the Labour Force Survey, but these sources are no longer included, as set out in the ‘change announcements’ (see ‘Historic FE provision and statistics’ in the ‘What is Further Education?’ section above).
Our statistics aim to adhere closely to the guidance presented in the UK Code of Practice for Statistics and the two-year GSS Quality Strategy, which was released in June 2019 to improve the quality of statistics across the UK. We are engaging with the GSS quality strategy group and have already taken steps to ensure our statistics achieve the strategic goals. Our progress in quality is outlined below under the five strands of quality highlighted in the GSS to ensure consistency: Relevance; Accuracy and Reliability; Timeliness and Punctuality; Accessibility and Clarity; and Comparability and Coherence.
Relevance is defined as the degree to which statistics meet the current and potential needs of users for both coverage and content. We regularly engage with users and ensure they are at the forefront of the iterative improvement to our collections.
FE statistics can be used to give insight into participation and learner outcomes in the FE and Skills sector, including apprenticeships. This includes the ability for users to see how participation in government funded FE qualifications change over time by various learner characteristics (such as gender, age etc.).
Over recent years we have changed what we publish and the frequency of our releases to keep our statistics relevant and to fulfil a broad range of user needs. This has included increasing the granularity of what we publish and new outputs that support more re-use, along with instances where we have ceased to publish statistics that are no longer relevant or have been replaced. Some of these changes have been the result of internal consultations and others the result of user feedback, as outlined here:
In terms of our internal users, we engage regularly with policy colleagues to understand the latest policy requirements and check that we have published the correct statistics to support their policy needs. For example, following the introduction of the apprenticeship levy and creation of the apprenticeship service, we introduced monthly releases of headline apprenticeship data for transparency and to allow policy leads to view the impact of apprenticeship reforms.
We also improve the relevance of our statistics through discussions with policy leads to ensure we are meeting their needs. For example, we moved away from publishing traineeship progression data based on self-reported information, which is inherently inaccurate, to more useful measures on traineeship ‘completion rates’ and ‘conversion rates’ from a traineeship start to an apprenticeship start. This resulted in more reliable measures and also made better use of available administrative data.
Furthermore, we engage early with policy teams to ensure the data we collect will support any future publishing requirements for the department. For example, early discussions with them to understand the potential need for data and statistics to monitor the introduction of T levels may lead to discussions with data collection teams to ensure we are collecting the data required. If not, we can ensure system changes are implemented to collect the data needed well in advance of potential publication.
We also regularly engage with our key external stakeholders. For example, we have been working with Devolved Mayoral Combined Authorities and the Greater London Authority to ensure our statistical publications address their needs. Several new statistical products have been developed as a result of these conversations, including new breakdowns of FE participation funded through the Adult Education Budget and tools that allow users to breakdown non-apprenticeship FE data by each active devolved area.
We also publish a Power BI dashboard of FE statistics to provide more interactive and visual access to our data at sub national levels.
Another result of DfE regularly reviewing and assessing official statistics publications against user need is the development of the new Explore Education Statistics dissemination platform, through which it is planned that all DfE statistics will be published. This platform will allow for quicker refreshes of our statistics and improved navigation for users, allowing them to quickly locate statistics and data of interest to them.
We continually review our departmental policies on statistics to ensure we are meeting user needs consistently. For example, we have recently reviewed our suppression policy and now suppress data only when needed for disclosure control, rather than suppressing by default. This has allowed us to release detailed data products (including new unrounded underlying CSV data files) which will support more detailed user analysis.
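As an illustration of threshold-based disclosure control of the kind described above, a small count can be masked before publication. The threshold of 5 and the masking symbol here are assumptions chosen for the example, not DfE policy:

```python
def suppress(count: int, threshold: int = 5) -> str:
    """Illustrative primary suppression: publish a count only if it is
    zero or at least the disclosure threshold; otherwise mask it.
    The threshold value and the 'c' marker are example conventions,
    not the department's actual suppression rules."""
    if count == 0 or count >= threshold:
        return str(count)
    return "c"  # masked: small non-zero count withheld for confidentiality
```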
We regularly review requests for data via PQs, FOIs and other ad-hoc requests to ensure we are meeting demand for having specific breakdowns in the public domain.
The Code of Practice for Statistics states that information on accuracy should be monitored and reported regularly to users. Here, accuracy means how close the estimated or observed statistics are to the (unknown) true value.
It is important for our users that we provide an early picture of FE performance to allow them to assess the impact of government-funded provision and to hold the ‘system’ to account. However, the key data we publish (the ILR) is from an operational system designed for funding providers. Therefore, we note important limitations for users to be aware of and consider when producing analysis and making comparisons, for example based on provisional in-year data. Please see the ‘Methodological Considerations’ section for issues with using in year data.
Some data items returned by data providers are more reliable than others.
For example, some may be ‘self-reported’ by learners, whereas others are not needed for funding purposes and may therefore be less well completed by the provider.
Additionally, some ILR returns are more robust than others; this may depend on whether they have traditionally been ‘compulsory’ returns. Our main quarterly releases are based on the traditionally more robust data returns.
Our quality is managed by:
Working closely with data collection, specification and data warehousing colleagues within the ESFA on data quality issues. ILR datasets used for FE publications go through a quality assurance process before being made available to internal users for analysis and publication. Known data errors and/or issues are shared with internal users on a monthly basis. Such issues could, for example, relate to problems with the calculation of derived variables within the system or recording of data by providers. These will be taken into account when producing statistics and highlighted to users where appropriate.
Ensuring we have a robust quality assurance process in place from collection through to publication, with it increasingly based on automated checking routines of data production processes and formal quality assurance checklists/logs.
Ensuring that we are transparent with the quality of our data and statistics for users.
Presenting limitations of the data and statistics we produce where this is relevant:
Indicating when figures are provisional and are subject to change.
Indicating in footnotes how we have suppressed and rounded figures where relevant.
Regularly reviewing our data collections and methodology to improve the quality and accuracy of our statistics and data where possible. Linking again with other teams within the ESFA to review and agree methodology for new fields and measurements and any amendments required due to new policies, collection changes etc.
The methodology section describes how our different measures are calculated and any limitation to the measures and data. For example, we have detailed the issues with using provisional in-year data or comparing data across years.
Timeliness and Punctuality
Timeliness refers to the lapse of time between publication and the period to which the data refers. Punctuality refers to the time lag between the actual and planned dates of publication.
Both the FE and skills, and apprenticeship and traineeship releases are updated on a quarterly basis. We also release headline apprenticeship data, and data relating to the levy from the Apprenticeship Service, on a monthly basis as a transparency update. This is an example of a timely release enabling users to have an early view of commitments to future starts.
Our releases are reliant on when and what providers report for funding purposes using the ILR and Apprenticeship Service systems. On a monthly basis, providers make their returns, which are processed and quality assured within the ESFA before being made available in an ESFA data warehouse for internal users.
As providers take time to fully report all their learners for a specific period, it is only the final return made in October that provides an accurate picture of their activity. This is the final, full year data return that is used in the final year FE statistics release for a particular academic year, published in November. All releases for the academic year prior to this point will be based on provisional in-year data that is subject to change.
Underreporting can be significant at the beginning of the year as providers focus on other priorities such as enrolment, but the level of underreporting as a ratio of the total numbers reported will diminish as the year progresses. We do provide some high-level comparisons of in-year data with levels reported at the same point of previous years to offer some meaningful comparison. However, reporting behaviours change between years, so it is not possible to accurately measure the proportion of underreporting for any published period.
For example, Table 1 below shows the data lag for apprenticeship starts for 2016/17 to 2018/19, comparing the data first reported for each quarterly release with the final data for the period (received in the ‘R14’ ILR return). The ILR data for the R04, R06 and R10 returns is the ‘first reported’ data for the first quarter, first two quarters and first three quarters, published in January, March and July respectively. The R14 return is published in November each year and covers data for the full, final year. The table shows that the data first reported for quarter 1 (August 2018-October 2018) was 88.1 per cent of the final figure for the period for the 2018/19 academic year. The equivalent proportion in 2016/17 was 94.8 per cent. Please note that some in-year proportions can be greater than 100 per cent of the final year figure in certain returns. This highlights how providers can make downward revisions and corrections to final year data for earlier periods.
Table 1: Data lag examples
Proportion of final year figure:
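The proportions in Table 1 are simply the in-year ‘first reported’ count expressed as a percentage of the final (R14) full-year figure for the same period. A minimal sketch of that calculation (the figures used below are illustrative, not actual return values):

```python
def proportion_of_final(first_reported: int, final_r14: int) -> float:
    """Express an in-year 'first reported' count as a percentage of the
    final (R14) full-year figure for the same period, to one decimal place.
    Values above 100 indicate downward revisions in the final return."""
    return round(100 * first_reported / final_r14, 1)

# Illustrative only: if 88,100 starts were first reported for a quarter
# whose final R14 figure was 100,000, the proportion is 88.1 per cent.
print(proportion_of_final(88_100, 100_000))  # 88.1
```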
The ESFA publish a schedule of when data will be made available within the Department to help users plan analytical work. This includes setting publication dates to ensure we can publish in a timely manner.
Once data has been quality assured for payment purposes by the ESFA, it is transferred to a data warehouse and other ESFA servers. It is then further quality assured during our data production processes prior to publication.
Our releases are produced as soon as possible after data has been submitted by providers and has passed through quality assurance processes, to ensure users have access to the most relevant and timely statistics on FE. The exact date is announced no later than 28 days in advance, in line with the Code of Practice for Statistics.
In deciding how frequently to publish figures, the FE statistics team takes account of factors such as users’ needs, the nature of the data, the reliability of publishing at certain time points and the resource involved in doing so.
FE statistics are published at 9:30am on a date pre-announced on the gov.uk website.
Accessibility and Clarity
Good quality statistics are presented in a clear and understandable form, released in a suitable and convenient manner, available and accessible on an impartial basis with supporting metadata and guidance.
Statistics on the FE system are currently published via gov.uk. Our statistical release dates are also available on the gov.uk official statistics calendar. Our recent move to the new Explore Education Statistics dissemination platform allows us to use a consistent dissemination approach across all our statistics, and makes more data available for users to interrogate and draw their own insights from. We continue to provide footnotes and technical advice alongside our statistics.
Furthermore, we now present the data in just two releases to make it easier for users to find and access our statistics. We also produce more Power BI dashboards to give users more visual and interactive presentations of our data.
We have been reviewing our releases to see how we can meet the legal accessibility obligations. The new EES platform has been designed in an accessible format.
Comparability and Coherence
Comparability is the degree to which data can be compared over time, by region or other domain. Coherence is the degree to which the statistical processes, by which two or more outputs are generated, use the same concepts and harmonised methods.
The FE releases are produced so they can be compared over time. However, for provisional in-year data, care should be taken in comparing across years, both because of reporting lag and because the timing of different types of programmes and qualifications changes year on year. For example, there is a strong correlation between the level and subjects taken and age group and ethnicity, so comparing in-year data with in-year data from previous years can be difficult to interpret.
Another issue is how well high-level averages give a true comparison across years. Achievement rates (published annually) differ significantly by age and sector subject area, so a change in the national rate will not provide a reasonable comparison if there has been a change to the subjects delivered.
If revisions are made, such as to geography, we ensure there is enough of a time series available for comparison purposes. Geographical classifications are updated once a year. Typically, we use the May postcode mapping file from the Office for National Statistics. When showing a time series, we use the geographical mappings which were relevant to each historical year.
In our current releases we have provided up to six years of full-year data on Explore Education Statistics (EES) as a default. Data for earlier years is available in the FE data library and from our FE & Skills statistics landing page. Our basic skills participation tables have data from 2010/11 to 2019/20. We ensure that where it is not possible to compare over time due to a change in methodology or other reasons, we detail this in the footnote of tables and within our statistical commentary.
For example, we highlight the impact of the introduction of the ‘Single ILR’ in table footnotes and link to a separate document detailing the impact on the numbers and comparability caveats across years:
Figures for 2011/12 onwards are not directly comparable to earlier years as a Single Individualised Learner Record (ILR) data collection system has been introduced. Small technical changes have been made in the way learners from more than one provision type are counted, leading to a removal of duplicate learners and a reduction in overall learner numbers of approximately 2 per cent. More information on the Single ILR is available at:
Another example of how we highlight comparability issues to users relates to the reclassification of a number of full level 2 qualifications to level 2 qualifications in 2016/17:
From 2016/17, some learners who would previously have been included in full level 2 have been reclassified. Between August 2016 and July 2017, 183,200 learners from education and training provision were reclassified to level 2.
Where possible in DfE we are consistent with our measures and naming conventions across collections, such as geographical locations and definitions of qualification levels.
We review comparability within our statistics and strive to make the department’s collections more comparable with each other as well as with external statistics where relevant.
Data sources, process flow of data and statistical outputs
Individualised learner record (ILR) background
The Leitch review of skills, published in 2006, recommended a move towards ‘demand-led’ funding for vocational education and training within the FE sector, with employers being directly involved in deciding training priorities. In 2008/09, the then Learning and Skills Council implemented a new demand-led funding methodology. This resulted in extensive changes to the data collection and funding arrangements for the ILR, including changes to the names, the frequency and the data requested in the ILR collections.
Further changes to the ILR were implemented in 2011/12, when a ‘single individualised learner record’ (SILR) collection was introduced to replace the multiple separate ILR collections used in previous years for different funding streams. Providers now return all information on all their learners in a single file.
ILR returns are required from providers who receive funding directly from the ESFA, or through Advanced Learner Loans.
Providers are grouped into two broad types for ILR collection purposes, these are:
Colleges, covering:
General FE Colleges (including Tertiary), Specialist Colleges, and Academies
Sixth Form Colleges
Higher Education Institutions.
Training organisations, covering:
An organisation whose main area of business is education and/or training, but which is not a college or local authority
A ‘loans only’ training provider with a facility to deliver only Advanced Learner Loans funded provision.
All providers must return ILR data for learners for whom they receive direct funding from the ESFA through any of the following funding models:
16-19 (excluding apprenticeships)
European Social Funding (ESF)
Apprenticeships (from 1 May 2017)
Other Adult funding
Other 16-19 funding
Adult Education Budget Devolution
This includes learners who are subcontracted out to other providers. There are some circumstances in which learners who are not funded by the ESFA must be included in ILR returns. Providers must also return ILR data for learners who are financed by Advanced Learner Loans.
Please see the above link to the provider support manual for more details.
Uses of ILR
The above link outlines some of the key uses of ILR data, which includes:
Supporting funding and commissioning decisions
Informing the work of Ofsted and other agencies.
Presenting progress and position within the FE sector to Government to help inform policy making decisions.
Calculating funding earned by providers.
Providing management information, including performance indicators and modelling relating to different programmes and/or policies.
Information on effectiveness of the learning programmes and outcomes achieved.
Ensuring public money is spent in line with government targets for quality and value for money.
Quality of ILR data
The ILR is the key data source used in our FE publications. It is administrative data collected and held by the ESFA based on data returns from FE providers. One of the key uses of ILR data is to fund learning providers. Providers are informed about changes to the ILR specification on an annual basis and guidance for completing the ILR is provided (see the ‘provider support manual’ above). The ESFA then compiles and quality assures the data from the providers.
Some items on the ILR may be more complete than others, and some may be based on information self-reported by the learner. Data items that underpin funding claims, such as learner activity, are more closely monitored by providers and more complete, being subject to audit and other controls. Other data, such as learner characteristics, may not be subject to the same controls but are nevertheless reliable. Some fields are not mandatory in ILR returns or rely on the processes providers use to gather the information. The ILR specification is sent to providers and their management information (MI) software suppliers, so in most cases data is reliably transferred to the ILR directly from their MI system. Other data use a shared reference library, such as the Learning Aim Reference Service (LARS, see below), for details of the learning aim being delivered, so the data is accurate. Fields such as employment status or learning difficulty or disability rely on the process the provider uses to help the learner report information. Guidance is provided to providers on collecting such information and there is a well-developed support network for MI managers to maximise the chance of robust reporting. However, how learners choose to report can vary.
Learner Record Service
The main purpose of the Learner Record Service (LRS) is for providers to register their learners for a Unique Learner Number (ULN), and view their learners’ achievements via the Personal Learning Record (PLR). Awarding organisations authorise and award learner qualifications in the UK and provide these to ESFA to upload into the PLR. The LRS offers the ULN and PLR to enable qualification data from AOs to be shared securely and efficiently within the education and skills sector.
The Unique Learner Number is used for all learners from the age of 14 at Further Education institutes and schools. It is also optionally used in Higher education. Further information about the Learner Record Service is available here:
The Learning Aim Reference Service (LARS) is used as a reference library and contains all qualifications DfE funds or has funded. Providers, awarding organisations and Ofqual supply information that different users may need. Providers use it to select the aim being delivered to a learner and enter the aim number as part of their ILR return. When the ILR is received via the ESFA collection process, the fields held in LARS are linked using the aim reference number the provider supplied, and the metadata is added to the ESFA FE standard datasets and data warehouse to enable users to perform operational or analytical tasks.
Awarding Organisations are those organisations where learners or providers can register to take an exam. A list of awarding organisations is available here.
Awarding organisations have data on learners and the grades they have achieved for various subjects. Ofqual require this data to be submitted on a regular basis. Information on how Ofqual collect their data is available.
Awarding organisation data from this collection or collected through other routes feeds into other data products such as ILR, LRS and the National Pupil Database.
The diagram below shows the data flow from the data collection to the outputs including our National Statistics, i.e. the various stages of the process before the data is in a useable format in the data warehouse from which FE statistics are created. There are various other data outputs which use the warehouse including a localism dashboard for Local Enterprise Partnerships (LEPs), Mayoral Combined Authorities (MCAs) and Greater London Authority, FOI and PQ responses and external data sharing for research purposes. The various data sources have been detailed above.
ILR data goes through a review every year where variables are removed, changed or added. To change or add variables, ESFA run a process where internal or external users can input their requirements through a business case. ESFA then iteratively discusses this with the user and the decision is made through an ILR change board. Where changes are made, these are reflected in the ILR data specification and fed back to our users so they can effect changes to their management information or data processing systems ahead of the change to the ILR collection system and related downstream processing systems.
This is counted as the number of people who have attended one day or more on a learning aim in the given academic year. Any learner studying more than one aim at the same provider at the same level is counted once at that level in published tables. If studying at another level, they will also appear at that level, but a total count of learner participation will only count learners once at each provider.
It should be noted that this counts learners at each provider separately. Data are presented in a way that tallies with how we fund the system, so all data aggregate appropriately whether looking at provider level or by learner characteristics. It is a count of unique learning spells at each provider, not a count of unique learners.
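The counting rule described above can be sketched as follows. This is a simplified illustration of the rule, not the production process; the records and field names are hypothetical:

```python
from collections import Counter

# Each record is one learning aim: (learner_id, provider, level).
# Hypothetical sample data for illustration only.
aims = [
    ("L1", "ProviderA", "Level 2"),
    ("L1", "ProviderA", "Level 2"),  # second aim, same provider and level: counted once
    ("L1", "ProviderA", "Level 3"),  # same provider, different level: counted again
    ("L1", "ProviderB", "Level 2"),  # different provider: a separate learning spell
    ("L2", "ProviderA", "Level 2"),
]

# A learner is counted once per provider per level.
by_level = Counter(level for (learner, provider, level) in set(aims))
print(by_level["Level 2"], by_level["Level 3"])  # 3 1

# Total participation counts unique learning spells at each provider,
# not unique learners.
spells = {(learner, provider) for (learner, provider, level) in aims}
print(len(spells))  # 3 spells, although there are only 2 unique learners
```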
This is the count of apprenticeship programmes that begin in an academic year, showing the take-up of programmes. An apprentice is counted for each apprenticeship they start at a provider.
This counts starts on each apprenticeship programme and is not a count of unique learners. If a learner changes their programme, level or pathway, they will be treated as a new start. This measure is consistent with how we fund the system and all data will aggregate appropriately whether looking at provider level, subject or learner characteristics within the cohort for that year.
As it is a count of programmes in a given academic year, care should be taken in interpreting change in apprenticeship numbers over years as an apprentice will be counted again where starting a new apprenticeship. For the measure to be consistent with other educational measures, each year is treated as a different cohort so, where learners go on extended breaks and return to the programme they will also be counted again in a different academic year.
Over the period affected by the COVID-19 pandemic, there has been disruption to learning, leading to extended breaks in the 2019/20 academic year. We have changed how we identify new starts in 2020/21 to treat learners as continuing if and when they return in 2020/21. We will monitor this method as the 2020/21 year progresses to ensure the comparability of the starts measures is not affected by any continued disruption.
This is the count of enrolments at aims level (including component aims) for each academic year. Learners will be counted for each aim they are studying and so, can be counted more than once. It is not a count of unique learners, but does provide a total count of activity in that aim in any given year. Care should be taken therefore in interpreting different learner characteristics (such as ethnicity) where provided in tables using this measure, as they could be repeated where a learner does more than one aim.
Achievements in further education and skills are the numbers of learners who successfully complete an individual aim in an academic year. Given learners do multiple aims this is not a count of unique learners. Care should be taken therefore in interpreting breakdowns by learner characteristics, as they will be repeated where a learner does more than one aim.
Apprenticeship achievements are recorded when a learner passes end point assessment, not at the end of learning. Apprenticeships generally take a year or more to complete, so comparisons of participation and achievements in an individual year will be misleading. Achievement rates are published separately, with more detail on their method of collection; they remove duplication of learners and provide a clearer rate of conversion of starts to achievements. Please see the separate National Achievement Rates publication for more detail. It is also worth noting that the Outcome Based Success Measures method, based on the LEO dataset, uses the learner’s highest and latest aim in a given academic year as the basis of its measures. Please see the links in ‘Related statistics’ below.
Starts supported by Apprenticeship Service Account (ASA) levy funds
In our January 2020 Apprenticeships and Traineeships release (which reported the first apprenticeship starts data for the 2019/20 academic year) we changed our terminology from ‘levy-supported starts’ to ‘starts supported by ASA levy funds’ to better reflect what we are actually counting. We also changed our approach to calculating such starts. Prior to the January 2020 release we had published the number of levy-supported starts based on a match between ILR starts data and information in an organisation’s ASA; this is known as the ‘data lock’ and is essential for payment of levy funds.
As this data match did not always occur in a timely way during the early part of the academic year, this approach led to an under-count when used to identify whether a start was funded by the levy. To improve the quality of the count for in-year starts, from January 2020 we started using an alternative approach based on a new ILR field recording the contract type an apprentice is funded through to identify a start as being ‘supported by ASA levy funds’. As the year progresses, providers update their information on the ILR and by the final return the data lock issue should be resolved to allow payments to be made. From January 2020 we used this new approach for all in-year reporting, noting that there would be no back series on the new approach for in-year data and that a time series would only be available for the final end-of-year data.
From 9 January 2020 the apprenticeship service was extended for all non-levy paying employers to register and use. This has led to a further minor methodology change for 2020/21 in how we count starts supported by ASA levy funds, as the new ILR field mentioned above that records contract type now includes both ‘levied’ and ‘non-levied’ starts. To improve the accuracy of the count we now include other ‘markers’ in the apprenticeship service to arrive at a better estimate, as we can no longer use the old method reliably. This particularly affects starts at the end of the year, where it helps better identify potential ‘levied’ starts that have yet to generate payment in the system. This method improves the accuracy of our overall estimate of ‘levied’ starts, but it may include small numbers of ‘non-levied’ starts.
The change to the counting approach makes only a minor difference to the numbers, but should be taken into account when making comparisons between 2020/21 and previous years.
Please note: the apprenticeship programme is largely funded by the apprenticeship levy, although an organisation can choose to fund apprenticeships themselves.
New approach for calculating achievements for apprenticeship standards
A new approach for calculating the date of achievement for apprenticeship standards was implemented in the apprenticeships and traineeships statistical release in January 2020, due to the introduction of a new ILR field to record the end point assessment (EPA) for apprenticeship standards. For apprenticeship standards, passing the EPA represents the achievement date.
Under the previous methodology, the learner’s leave month/year when learning had been successfully completed and achieved was used as the date of achievement, which would normally be the same date. With the introduction of EPAs, the leave date and the EPA achievement date could span two different academic years. For example, a learner could have completed their learning (and left the programme) in the 2018/19 academic year, but the EPA may not occur until the following academic year, i.e. 2019/20.
The new field enables the new method to calculate the month/year the learner successfully passed their apprenticeship programme for standards and applies from the 2019/20 academic year.
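In outline, the change means the achievement date for standards is taken from the EPA pass date rather than the leave date. A hedged sketch of that logic (the field names are illustrative, not the actual ILR field names):

```python
from datetime import date

def achievement_academic_year(record: dict) -> str:
    """Return the academic year (August to July) in which an apprenticeship
    was achieved. For standards, use the end point assessment (EPA) pass
    date where recorded; otherwise fall back to the leave date.
    Field names are illustrative, not actual ILR field names."""
    d = record.get("epa_pass_date") if record["type"] == "standard" else None
    d = d or record["leave_date"]
    start_year = d.year if d.month >= 8 else d.year - 1
    return f"{start_year}/{str(start_year + 1)[-2:]}"

# A learner leaves in July 2019 but passes EPA in October 2019: under the
# new method the achievement falls in 2019/20, not 2018/19.
rec = {"type": "standard",
       "leave_date": date(2019, 7, 15),
       "epa_pass_date": date(2019, 10, 3)}
print(achievement_academic_year(rec))  # 2019/20
```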
Full level 2 and Full level 3 methodology in 2016/17
In 2016/17 the number of qualifications classed as Full Level 2 and Full Level 3 were reclassified by the ESFA for the 19-23 entitlement, and to align with the 16-19 offer and recommendations in the Wolf Review of Vocational Qualifications.
The methodology change involved a number of Level 2 and Level 3 vocational qualifications no longer being classed as Full Level 2 or Full Level 3 for funding purposes, now being reclassified to Level 2 and Level 3, respectively.
Therefore, the number of learning aims (qualifications) designated as ‘full’ for 2016/17 onwards has decreased. The new methodology aligns more closely with the 16 to 19 Performance Tables in terms of the qualifications included.
Apprenticeships expected duration
Apprenticeship expected duration is the expected time period to complete the framework/standard. Before November 2018, the duration was based on the learning start date and actual end date where available. Otherwise the planned end date of the apprenticeship as recorded in the ILR was used.
Therefore, the actual end date was used if the apprenticeship completion status was ‘completed’, and the planned date was used for other reasons (withdrawn, planned break, transferred, continuing).
In the November 2018 release, we changed the methodology slightly to use the planned end date as recorded on the ILR and to exclude restarting learners who may have returned from a planned break. If a learner has an original start date that differs from their learning start date, this indicates that the learner is restarting the learning aim at the same provider, for example after an agreed break in learning. As these learners will have some prior attainment, the duration of their apprenticeship on returning is expected to be shorter than for new starters, and they have therefore been removed from the duration calculation.
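The restart exclusion can be expressed simply: any learner whose original start date differs from their learning start date is treated as a returner and dropped before durations are averaged. A sketch under that description (field names and records are hypothetical):

```python
from datetime import date

def mean_expected_duration_days(records) -> float:
    """Mean expected apprenticeship duration (planned end date minus
    learning start date), excluding restarts: learners whose original
    start date differs from their learning start date are returning from
    a break and are excluded. Field names are hypothetical."""
    durations = [
        (r["planned_end"] - r["learning_start"]).days
        for r in records
        if r["original_start"] == r["learning_start"]  # exclude restarts
    ]
    return sum(durations) / len(durations)

records = [
    # New starter: included in the calculation.
    {"original_start": date(2018, 9, 1), "learning_start": date(2018, 9, 1),
     "planned_end": date(2019, 9, 1)},
    # Returner from a break: original start predates learning start, excluded.
    {"original_start": date(2017, 9, 1), "learning_start": date(2018, 3, 1),
     "planned_end": date(2018, 9, 1)},
]
print(mean_expected_duration_days(records))  # 365.0
```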
Planned length of stay
Minimum durations have been in place for framework-based apprenticeships since August 2012. For learners aged 16 to 18, apprenticeships must last at least 12 months, but for learners aged 19 and over there is more flexibility, as some adults have prior learning/attainment and can complete more quickly (for example if the training provider can evidence prior learning, the minimum duration is reduced to 6 months). For new apprenticeship standards the minimum duration is 12 months, with no exceptions.
From 2015/16, the methodology to calculate planned length of stay has been slightly revised, and we include those learners whose start date is the same as their planned end date. In 2014/15 this would have meant 100 learners included in the total for ’12 months or more’ would also have been included in total for ‘fewer than 12 months’. As this methodology change does not affect many learners, figures for previous years have not been revised.
Employer Ownership Pilot collection
The Employer Ownership Pilot (EOP) 2014/15 collection was affected by the move to an improved collection system between provisional and final return dates. While this move has put future collections on a better footing, issues resulting from the transfer remain. Rather than extend the considerable work with providers to reconcile remaining differences between returns we decided to continue to use the provisional data. The provisional data was complete and passed full quality assurance.
The overall impact on the FE figures is negligible because EOP is very small in relation to the ILR collection, the main data source for this release. In 2014/15, just 1,500 apprenticeship starts of the 499,900 in total were EOP and in 2015/16, just 1,000 apprenticeship starts of the 509,400 in total were EOP. There were no EOP starts since 2016/17.
Motor vehicle service and maintenance technician (light vehicle) apprenticeship standard sector subject area change
The Institute for Apprenticeships have reclassified the Motor Vehicle Service and Maintenance Technician (light vehicle) apprenticeship standard from the Retail and Commercial Enterprise sector subject area tier 1 to Engineering and Manufacturing Technologies. The new sector subject area tier 2 for the standard is Transportation Operations and Management. All tables and tools published from March 2019 containing sector subject area fields have been updated to reflect this change, including historical data.
It is important to provide the earliest picture of FE performance once it becomes available so that users may assess the impact of government-funded provision and hold the system to account. However, in-year data are taken from an operational information system that is designed to support the funding of providers and there are some important limitations users should take into consideration.
In-year information is subject to data lags when providers submit information after the period it relates to. This information is subsequently attributed to the correct time period. Data are subject to data lag until providers make the final returns for the academic year, after the end of the academic year. Final data for each academic year (August to July) are published the following November. The size of revisions to individual estimates arising from data lag can vary greatly: they tend to be around 2 to 3 per cent but have been as much as 20 per cent. Revisions are typically upward, though small downward revisions are possible on occasion.
Data lag from one year to the next is not predictable as provider behaviour changes over time and there is no source of information that would enable a robust estimate of completeness of data that have been returned. We generally recommend using final data from the last complete academic year for analysis of changes over time.
We carry out a quality assessment of the volume of providers that have made their returns. If we consider estimates to be particularly weak, due to data lag or any other factor, we may defer publication of those estimates. In recent years, we have encouraged more timely reporting of data ahead of the final return.
Issues with comparing data across academic years
In the 2011/12 academic year, a Single ILR (SILR) data collection system was introduced (see ‘Individualised learner record (ILR) background’ above for more information). This replaced the multiple separate data collections used in previous years and led to small technical changes in the way learners from more than one funding stream are counted.
Overall, the new collection system led to a removal of duplicate learners and a reduction in total learner participation of approximately 2 per cent. However, apprenticeship participation figures were more significantly affected due to a change in the way learners participating on more than one apprenticeship programme were counted. This has led to a removal of duplicated learners and a reduction in overall apprenticeship learners of approximately 5 per cent; therefore, participation figures for 2011/12 onwards are not comparable to earlier years.
There are several other statistical publications available that can be used to provide contextual information to Further Education statistics. Some examples include:
National Achievement Rate Tables (NARTs), which provide summary indicators of performance in apprenticeships and education and training. Individual providers use the data to benchmark their own targets and actual performance. National figures are now incorporated into our releases on Explore Education Statistics.
Further Education for Benefit Claimants provides information on further education activity funded by the ESFA for adult benefit claimants in England. The statistics are produced using a matched dataset of the ILR from the (former) Department for Business, Innovation and Skills (BIS) and the Work and Pensions Longitudinal Study (WPLS) from the Department for Work and Pensions (DWP). The publication provides statistics on benefit spells with training, learners who were on benefits at the start of training, level of qualification studied (all and highest) and type of learning provision.
Since the latest review by the Office for Statistics Regulation, we have continued to comply with the Code of Practice for Statistics, and have made the following improvements:
We have highlighted the key methodology information in this content, including data quality issues surrounding the in-year data in our releases, and ensured these statistics are now clearly caveated. In particular:
We have improved the commentary with the necessary caveats to aid interpretation and to visually reflect the variability of in-year statistics.
We made a number of changes to our publication structure and publication timings from January 2018, which were outlined in the Proposed Changes document in November 2017.
We responded to various user needs by creating more data tools, with a view to improving the richness of data, and developed a table finder to help users navigate the tools in the further education data library before the move onto Explore Education Statistics (EES).
These changes, particularly those implemented from the 2017/18 academic year, were informed by user feedback, an internal review of further education Official Statistics, and departmental priorities, with a view to improving consistency across our statistical publications.
The FE Official Statistics mailbox has been introduced as the central point of contact for further engagement and to invite feedback.
The very latest data is now fully interactive within EES and accessible using the table builder.
Pre-release access to statistics
Pre-release access is the process by which Ministers and officials have access to official statistics 24 hours before their publication in accordance with the Pre-release Access to Official Statistics Order 2008. It is only given to enable the making of a public response at or shortly after the time of publication of statistics, or to enable the taking of action at that time. The number of people granted pre-release access is kept to a minimum.
A list of officials who have received pre-release access to our Statistical Releases up to 24 hours in advance of publication is available on our release pages.
Rounding is our primary approach to preserving confidentiality in our main tables, where headline volumes are reported rounded to the nearest 100.
In the supplementary tables, volumes are rounded to the nearest 10. An exception is made where a figure is an exact duplicate of a value in the headline figures (e.g. grand totals); such figures are rounded to the nearest 100 to avoid contradictory figures.
Percentages are calculated on unrounded data and given to one decimal place.
We suppress some figures
In the main tables, headline volumes greater than 0 but below 50 are suppressed. In reality, very few figures in our main tables are small enough to be suppressed in this way.
In the supplementary tables volumes below 5 are suppressed.
A percentage of less than 0.5 per cent is suppressed.
We adopt symbols to help identify suppression
Symbols are used in the tables as follows:
‘~’ indicates a volume of greater than 0 but fewer than 50 in the main tables, and greater than 0 but fewer than 5 in the supplementary tables. It also indicates a percentage of less than 0.5 per cent.
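Taken together, the rounding and suppression rules above amount to a simple presentation algorithm. The sketch below illustrates one way they could be applied; the function names and the round-half-up convention are assumptions for illustration, not part of the published methodology.

```python
def present_volume(value: int, headline: bool = True) -> str:
    """Illustrative application of the rounding and suppression rules.

    Headline (main-table) volumes: rounded to the nearest 100,
    suppressed ('~') when greater than 0 but below 50.
    Supplementary-table volumes: rounded to the nearest 10,
    suppressed when greater than 0 but below 5.
    """
    threshold, base = (50, 100) if headline else (5, 10)
    if 0 < value < threshold:
        return "~"  # suppressed to preserve confidentiality
    # Round half up to the nearest base (an assumed convention)
    return str(int(value / base + 0.5) * base)


def present_percentage(numerator: int, denominator: int) -> str:
    """Percentages are calculated on unrounded data, shown to one
    decimal place, and suppressed when below 0.5 per cent."""
    pct = 100 * numerator / denominator
    return "~" if pct < 0.5 else f"{pct:.1f}"
```

For example, `present_volume(12345)` returns `"12300"`, while `present_volume(49)` is suppressed to `"~"`.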
Apprenticeships are available at intermediate, advanced and higher levels – see Table 1 below for examples of equivalent qualifications. Higher apprenticeships are those at level 4 and above. Higher level apprenticeships at levels 6 and 7 may include a degree as a component qualification.
Level 4 apprenticeships were introduced in 2006/07; level 5 apprenticeships were introduced in 2011/12; level 6 apprenticeships were introduced in 2013/14 and level 7 apprenticeships were introduced in 2014/15.
Table 1: Equivalent educational level examples

Intermediate apprenticeship (level 2): 5 GCSE passes at grade A* to C or 9 to 4
Advanced apprenticeship (level 3): 2 A level passes, or a Level 3 Diploma
Higher apprenticeship (levels 4 / 5): Higher National Certificate (HNC) or foundation degree
Higher apprenticeship (levels 6 / 7): bachelor’s or master’s degree
Examples of equivalent level 4 and 5 apprenticeships would be Higher National Certificates (HNCs) and foundation degrees, respectively.
Apprenticeships at levels 6 and 7 are equivalent in level to a bachelor’s or a master’s degree, and can include a degree as a component qualification within the apprenticeship programme, which may or may not be mandatory.
English and maths, up to and including level 2 (must be delivered as part of the legal entitlement)
Level 2, excluding English and maths (first full level 2 must be delivered as part of the legal entitlement): fully funded* (first and full)
Learning to progress to level 2: fully funded^ (up to and including level 1)
Level 3 (first full level 3 must be delivered as part of the legal entitlement): fully funded* (first and full); loan-funded** (previously achieved full level 3 or above)
Fully funded (including 16- to 24-year-olds##)
English for speakers of other languages (ESOL) learning up to and including level 2: fully funded – unemployed
Learning aims up to and including level 2, where the learner has already achieved a first full level 2, or above: fully funded – unemployed
Learning aims up to and including level 2, where the learner has not achieved a first full level 2, or above
* Must be delivered as one of the English and maths, and/or first full level 2 or first full level 3 qualifications required as part of the legal entitlements.
^ Must be delivered as entry or level 1 provision from local flexibility.
# Excludes flexible element where funding depends on age and level.
## 16- to 18-year-old learners must be eligible under the ESFA’s young people’s residency requirements.
** Availability of loans at level 3 does not replace the legal entitlement to full funding for learners aged 19 to 23 undertaking their first full level 3.
+ Low wage flexibility may apply.
FE is a broad and changing topic – the following key policy changes may be useful when using the statistics.
2008/09: Demand-led funding
The ‘Leitch Review of Skills’ published in 2006 recommended a move away from delivering skills via a ‘supply driven’ approach based on central government planning, towards a more ‘demand-led’ system of funding for vocational education and training within the FE sector, with employers being directly involved in deciding what the training and skills priorities should be.
Following this, in 2008/09 the then Learning and Skills Council implemented a new demand-led funding methodology, resulting in extensive changes to the data collection and funding arrangements for the ILR. This included changes to the names of the ILR collections, the frequency of collections and the data requested in each collection.
2013/14: 24+ Advanced Learner Loans
Advanced Learner Loans provide tuition fee support to adults wishing to train and retrain through higher quality learning programmes with strong labour market outcomes.
From 2013/14 to 2015/16, these were known as advanced learning loans and were available for learners aged 24 or older studying full level 3 and level 4 qualifications.
From 2016/17, these loans were extended to learners aged 19 or older on the first day of their course and studying at levels 3 to 6. They are now known as advanced learner loans.
24+ Advanced Learner Loans were introduced in August 2013 and withdrawn for apprenticeships in March 2014. This appears to have affected learner numbers on Level 3 and above courses and apprenticeships for ages 24 and over in 2013/14. This should be noted when comparing the 2014/15 volumes with 2013/14.
2014/15: Introduction of apprenticeship standards
As noted under ‘Apprenticeships’ above, following a review of and consultation on apprenticeships in England during 2012 and 2013, the government announced reforms to apprenticeships, part of which was the introduction of new employer-led apprenticeship ‘standards’ in 2014. Since their introduction, different standards have been made available each year, and these have been replacing the older apprenticeship frameworks. All new apprenticeship starts will be on standards by the beginning of the 2020/21 academic year.
The apprenticeship reforms have focused on this move to standards, improving quality and introducing minimum durations for apprenticeships. This means some learners find it more difficult, or take longer, to complete an apprenticeship, which may be reflected in the statistics.
In recent years more higher-level standards (to levels 6 and 7) have become available.
2016/17: Adult Education Budget
As noted under the ‘AEB’ in the ‘Funding’ section, the scope of adult skills funding has changed over the years to reflect different funding settlements.
The current AEB replaces what had previously been three separate budget lines (adult skills, community learning and discretionary learner support) and has remained consistent since 2016/17. It covers education and training for adults (aged 19+) from pre-entry to level 3, including Community Learning, the Prince’s Trust Team Programme and 19-24 Traineeships, and excluding apprenticeship learning and advanced learner loans. There are additional entitlements and flexibilities as set out in the latest published AEB funding rules.
2016/17: Apprenticeship reforms
Two key changes to the way apprenticeship funding works were introduced in 2017: the apprenticeship levy and the online Apprenticeship Service, as outlined in the ‘Funding’ section above.
From 6 April 2017, employers with pay bills of over £3 million a year have had to pay the apprenticeship levy. Levy-payers were then able to access their levy funds to spend on apprenticeships via the Apprenticeship Service.
The profile of apprenticeship starts changed significantly in the run-up to the introduction of the levy and beyond, and this is likely to have impacted on apprenticeship starts and participation as the new approach beds in.
Other aspects of the reforms included ensuring a minimum of 20 per cent off-the-job training as part of the apprenticeship programme.
2017/18: Levy transfers (1)
From April 2018 employers who pay the apprenticeship levy have been able to transfer funds to another organisation within the Apprenticeship Service up to a maximum amount of 10 per cent of their annual funds.
2017/18: Public Sector Apprenticeship Target
Public sector bodies in England with 250 or more staff have a target to employ an average of at least 2.3% of their staff as new apprentice starts over the period 1 April 2017 to 31 March 2021. Bodies in scope must have regard to the target, meaning that, in making workforce planning decisions, they should actively consider apprenticeships, either for new recruits or as part of career development for existing staff.
This target is for new apprenticeship starts, which includes both existing staff that start an apprenticeship and new recruits, and measures these as a percentage of total headcount of public sector bodies at the beginning of the reporting period. The target is an average over four years, split into individual reporting periods covering financial years (1 April-31 March).
The reporting periods are:
1 April 2017 to 31 March 2018
1 April 2018 to 31 March 2019
1 April 2019 to 31 March 2020
1 April 2020 to 31 March 2021
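As a worked illustration of how the target is measured, the hypothetical figures below show one body’s average rate across the four reporting periods. The headcounts and starts are invented for illustration, and averaging the per-period rates is an assumed reading of the methodology; the statistical release sets out the definitive calculation.

```python
# Hypothetical public sector body: headcount at the start of each
# reporting period, and new apprenticeship starts (existing staff and
# new recruits) within each period. All figures are illustrative.
headcounts = [2000, 2000, 2100, 2100]
starts = [40, 52, 55, 48]

# Per-period rate = starts as a percentage of headcount at the start
# of the period; the target is an average of at least 2.3% over the
# four years.
rates = [100 * s / h for s, h in zip(starts, headcounts)]
average_rate = sum(rates) / len(rates)
print(f"average rate: {average_rate:.2f}% (target: 2.30%)")
```

With these figures the average works out at about 2.38 per cent, just above the 2.3 per cent target.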
Please see the statistical release on progress towards the target in the first year for more methodological information.
2018/19: Levy transfers (2)
From April 2019 the maximum amount a levy-paying organisation could transfer to another organisation within the Apprenticeship Service was increased to 25 per cent of their annual funds.
2019/20: Devolution of Adult Education Budget
From 2019/20 around half of the AEB has been devolved to six Mayoral Combined Authorities (MCAs) and the Greater London Authority (GLA). Delivery in the remaining areas remains the responsibility of the ESFA. The seven active areas for 2019/20 are: Cambridgeshire and Peterborough, Greater Manchester, Liverpool City Region, Tees Valley, West Midlands, West of England and the Greater London Authority.
MCAs/GLA have their own policies and priorities, which will change the nature of delivery from previous provision; volumes will therefore not be directly comparable with previous years. For more information on the policies of the active MCAs/GLA please use the links provided:
In addition to the information we previously published above, there are now a total of ten Combined Authorities along with the Greater London Authority. Please see here for further information.
2019/20: Non-levy payers on the Apprenticeship Service
From 9 January 2020 the Apprenticeship Service was extended for all non-levy paying employers to register and use. Employers who do not pay the apprenticeship levy are able to make a funding ‘reservation’ (or have one made on their behalf) for an apprenticeship planned to start in the following 3 months. This reservation is made in advance of recruitment, or of an offer of an apprenticeship being made to an existing employee, and ensures that employers can plan and that funds will be available to pay for the training from the point the apprenticeship starts. It is anticipated that this new functionality will be utilised further in due course.
Training providers can ‘support’ this process by making reservations on behalf of non-levy employers. Providers are also counted as supporting where a reservation has progressed to a ‘commitment’ (with a training provider assigned).
Prior to 9 January 2020, some non-levy organisations registered accounts on the Apprenticeship Service as part of testing in preparation for the extension of the service to all employers that do not pay the levy.
Achievements
The number of learners who successfully complete the programme. The date recorded is the date of the end-point assessment.
Aim enrolments is a count of enrolments at aim level (including component aims) for each academic year. Learners will be counted for each aim they are studying and so can be counted more than once.
Apprenticeship frameworks
Old-style apprenticeships that are being phased out and will have been completely replaced by the start of the 2020/21 academic year.
Apprenticeship levy
The UK-wide apprenticeship levy that came into force on 6 April 2017, requiring all UK public and private sector employers with an annual pay bill of £3 million or more to invest in apprenticeship training.
Apprenticeship standards
New, high-quality, employer-designed apprenticeships that will have replaced all frameworks by the start of the 2020/21 academic year.
BAME
Black, Asian and Minority Ethnic group.
The time period within which the learner is expected to complete the learning related to a particular learning aim, based on the learning aim start date and expected learning end date, as recorded on the programme record in the Individualised Learner Record (ILR).
Full level 2
Equivalent to a National Vocational Qualification at Level 2, or 5 GCSEs. The widths of all a learner’s Level 2 aims are summed to establish whether a learner is taking a Full Level 2 programme.
Full level 3
Equivalent to a National Vocational Qualification at Level 3, or 2 A-Levels. The widths of all a learner’s Level 3 aims are summed to establish whether a learner is taking a Full Level 3 programme.
Further Education and Skills
Learners who are studying a course in a FE college, training provider or within their local community and employees undertaking an apprenticeship or other qualification in the workplace.
The FE and skills and apprenticeship and traineeships publications generally cover adult (19+) FE and all age apprenticeships.
LLDD
Learners with Learning Difficulty or Disabilities. This data is self-reported for the ILR.
Length of employment
Length of time that an apprentice reports as being with their employer prior to starting their apprenticeship programme, recorded from 2013/14 onwards.
The number of people who have attended one day or more on a learning aim in the given academic year. A learner studying more than one aim at the same provider at the same level is counted once in the row for that level. A learner studying at another level will also appear at that level, but a total count of learner participation counts each learner only once where they study at the same provider.
Apprenticeships where the employer is a public sector body that self-reports its progress towards the public sector target. This does not include public sector bodies below the threshold for reporting.
Starts
The number of apprenticeship programmes that begin in a given academic year, showing the take-up of programmes. An apprentice is counted for each apprenticeship they start in a given academic year.
Starts supported by Apprenticeship Service Account (ASA) levy funds
These were formerly referred to as ‘levy supported starts’ (see the methodology changes section below). An apprenticeship start that is either partially or completely supported by Apprenticeship Service Account levy funds. Levy funds are calculated by HMRC based on returns made by the employer. Employers register with the Apprenticeship Service to draw down the funds they have accumulated to pay for apprenticeship training.
STEM
Science, Technology, Engineering and Maths. Figures include learners in the 1) Construction, Planning and the Built Environment, 2) Engineering and Manufacturing Technologies, 3) Information and Communication Technologies and 4) Science and Mathematics sector subject areas.
Traineeship completion rate
The proportion of learners starting a traineeship that go on to complete it.
Traineeship conversion rate
The proportion of traineeship starts that go on to start an apprenticeship.
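The two traineeship rates above are straightforward proportions of starts. The hypothetical cohort below, with figures invented purely for illustration, shows how they differ:

```python
# Hypothetical traineeship cohort - all figures are illustrative
traineeship_starts = 1200          # learners starting a traineeship
traineeship_completions = 900      # of those starts, learners who complete it
apprenticeship_progressions = 300  # of those starts, learners who go on to start an apprenticeship

# Both rates use traineeship starts as the denominator
completion_rate = 100 * traineeship_completions / traineeship_starts
conversion_rate = 100 * apprenticeship_progressions / traineeship_starts

print(f"completion rate: {completion_rate:.1f}%")  # 75.0%
print(f"conversion rate: {conversion_rate:.1f}%")  # 25.0%
```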
Our commitment to the users of our statistics is set out in the DfE policy on user engagement, which can be found at the link below, along with other documents describing the standards we follow to produce official statistics, which can be summarised as:
To publish our statistics at the pre-announced time and date
To publish our statistics in line with the Code and the Department’s Statistical Policies and Procedures
To give open and easy-to-use routes to allow users to supply us with their views and opinions
To respond quickly and accurately to questions and enquiries from our users
To consult with users on developments and changes to our statistical methodologies, publications or publication processes
To consult with both internal and external users on our statistical work plan
To ensure independent scrutiny of the work plan when necessary
To respond in a timely manner to any complaints from users
To be transparent when we are unable to meet any of the above commitments
The most common individuals, groups and types of organisations that use our statistics can be summarised in the table below:
Department for Education (DfE)
We have regular correspondence with internal customers from policy, analysis and press office teams. They use the statistics to report on ambitions; develop and monitor policies; and answer parliamentary questions.
Education and Skills Funding Agency (ESFA)
They use FE releases for all statistics shared externally, including press releases, for responding to requests from educational bodies, and to respond to requests for participation, achievement and success rate information from external partners such as Ofsted and other organisations.
Local Enterprise Partnerships (LEPS)
LEPs are regular users of our statistics to assess the level of further education training against their local labour market and skills needs.
Mayoral Combined Authorities (MCA) and the Greater London Authority (GLA)
As noted above, from 2019/20 around half of the AEB has been devolved to six MCAs and the GLA, enabling the authorities to take responsibility for delivering adult education in their local area. We have regular engagement activities with these authorities on data and publishing requirements, and as noted above have introduced new supplementary tables as part of the FE statistical release to provide participation activity within these areas.
Update: July 2021
In addition to the information we previously published above, there are now a total of ten Combined Authorities along with the Greater London Authority. Please see here for further information.
FE providers use our data to understand their delivery and impact in comparison to others that are active in their local area, or for benchmarking their performance.
Academics and researchers
We are often contacted by academics requesting information about our statistics for use in their research. Researchers and others can, and do, also apply for controlled access to FE unit level data for research purposes.
Other government departments use our statistics, for example the link between education and employment is crucial for the Department for Work and Pensions. Ofsted and Ofqual are also regular users of achievement rates and other data to support their functions.
Our releases are often picked up by various media outlets, running stories on the headline statistics.
FE statistics are viewed by members of the general public with an interest in education.
Various sector bodies, including the Association of Colleges, Holex and Association of Employment and Learning Providers, use our data to input into their activities.
We work with various internal and external stakeholder groups and boards on data requirements, many of which work with the department to support policy development and operational delivery. They may be provided with internal management information to support their work with DfE, but may also drive requirements for what is published. For example, the Apprenticeship Ambassador Network is supported by DfE officials and works with employers to stimulate growth in apprenticeships.
Future developments for users
We use a wide range of opportunities to develop and strengthen our links with users so their views can drive the production and development of our official statistics. Examples include:
Direct user-interface – We have relationships with Bodies and Associations who have a professional interest in our statistics.
One-to-one engagement – We maintain formal and informal contact with individual users and bodies.
Feedback – Within our publications we invite and encourage user feedback and publish our contact details so users can provide us with feedback on any of our products or services.
Exploiting existing user information – We draw on any information we have from users who have contacted us in the past in order to get a better picture of user experience and need.
Improving presentation and dissemination – Our recent move to the new EES dissemination platform has allowed a much more interactive user experience. Users are able to interrogate the statistics themselves more easily, filtering for specific variables and time periods.
We will seek feedback from users to develop our releases further. Users can access the full range of our data either via the underlying data files or by using the tool to build and save their own tables. A particular area we wish to explore further is the pre-set views we offer. These are aimed at the time-poor user who wants to quickly find and use more detailed breakdowns than are available in the main release. We wish to find the optimal set of such views that users find useful.
DfE has a central development team who lead on the platform. They will continue to check functionality and lead on the IT side of queries; however, we will engage with users on content.
Devolution will also bring some changes to our outputs as we collaborate with Mayoral Combined Authorities (MCAs) and the Greater London Authority (GLA). We will continue to publish detailed data on national and local delivery, but the devolved bodies will publish more data and information themselves to provide a richer picture of their policies. Through our collaboration with MCAs/GLA on this work, along with other local bodies such as LEPs, we hope to gain a stronger insight into local data needs to further improve sub-national data.
Being open – We are open and transparent, explaining limitations and priorities, and changes are detailed in the methodology section of this guide which accompanies releases.
Freedom of Information (FOI) requests
Users who would like access to data and statistics on FE that have not been included in our publications can contact the department in writing by letter, email, social media, online form or fax. A response will be issued within 20 working days of receiving the request.
FOIs (and other ad hoc requests) provide a good insight into other uses and needs we may not be addressing in our publications, and so offer an important source of information on what changes we might make to meet users’ needs better. Such requests often include more detailed breakdowns of published figures and/or different combinations of variables than the ones published.
Taken in conjunction with public and media debate, we seek to regularly review the breadth of data and measures we offer in our releases.