Thursday, 8 December 2022

The Recent Evolution of Apprenticeships

CVER's Chiara Cavaglia, Sandra McNally and Guglielmo Ventura discuss the evolution of apprenticeships in England over the last 20 years.

The number of apprenticeship starts in England has fallen dramatically in recent years following government reforms and the COVID-19 pandemic. The composition has also shifted from almost complete domination by low and intermediate level apprenticeships to one where higher and degree apprenticeships constitute a significant share (26 per cent in 2020). Even though the number of apprenticeships has decreased, policy changes have likely improved their average quality. One might characterise the changes (at least up until the pandemic) as trading quantity for quality. But this may have come at the cost of less equitable access, with, for example, those living in disadvantaged neighbourhoods losing out from these changes.

Our report, ‘The recent evolution of apprenticeship participation and pathways’, documents these trends and assesses how they have affected different groups of people - by age group, socio-economic background, gender and ethnicity. We also look at the prior attainment of individuals undertaking different types of apprenticeship, the extent of progression and drop-out rates across apprenticeship types. Finally, we interpret what these patterns imply for broader concerns about the efficacy of the system and for social mobility. We study the change in apprenticeships between August 2014 and July 2020, using comprehensive national data (the Individualised Learner Record). For younger age groups, we can link this information to school records.

Changes in the number and composition of apprenticeships were strongly influenced by changes in government policy over these years, which included the overhaul of how apprenticeships are funded (with the introduction of the Apprenticeship Levy from 2017), the replacement of apprenticeship frameworks by employer-led standards, and new rules aimed at improving the quality of training (including a minimum duration, a minimum threshold for off-the-job training and a more rigorous final assessment). In line with these efforts, there has been a marked increase in the planned duration of apprenticeships. The net impact on productivity depends on whether the improvement in quality offsets the fall in numbers and the extent to which newer (more expensive) apprenticeships are displacing pre-existing forms of training (which is difficult to evaluate).

Changes in number and type of apprenticeships on offer appear to have had distributional implications. Whereas in 2015 apprenticeship starts were more frequently observed among people living in the most deprived fifth of neighbourhoods of England, by 2020 they were more evenly split across types of neighbourhood. This change is driven by bigger relative falls in lower-level apprenticeships (Levels 2 and 3) in more deprived neighbourhoods, particularly among older individuals. Young individuals from disadvantaged backgrounds are less and less represented at successively higher levels of apprenticeship. In fact, they are more likely to start a university degree than to study for a degree apprenticeship. Cast in this light, it is difficult to see such apprenticeships as being a route to improve social mobility.

Unlike in most other countries, apprenticeships in England are not predominantly used to facilitate the transition from school to work. Individuals over 25 years of age account for 40 per cent of all apprentices. Further, they account for the vast majority of those undertaking higher apprenticeships (at Levels 4 and 5) and over half of those undertaking degree apprenticeships. This matters because returns to apprenticeships are considerably higher for younger individuals (McIntosh and Morris, 2018). Women and ethnic minorities are under-represented among younger apprentices (up to age 25).

Another part of the story is that drop-out rates across apprenticeship types are relatively high: about 11 to 26 per cent of individuals drop out within one year, depending on the level of the apprenticeship and the age of the apprentice. The overall achievement rate varies between 60 and 70 per cent; it is lowest for older individuals (25+) on higher apprenticeships and highest for younger people on Level 3 apprenticeships. The fact that so many individuals fail to complete their apprenticeship is a cause for concern, especially given the high subsidy from the taxpayer.

Overall, our report points to improvements in the quality of apprenticeships on offer but fewer possibilities to access them because of their reduced number and more stringent academic requirements when offered at higher levels. Questions for policy makers include whether there ought to be more explicit targeting of firm-level incentives towards younger people and how opportunities may be made more widely available for those from disadvantaged backgrounds.


CVER Discussion Paper 039, ‘The Recent Evolution of Apprenticeships: Participation and Pathways’, is published on 8 December 2022. Link: https://cver.lse.ac.uk/textonly/cver/pubs/cverdp039.pdf

Friday, 11 November 2022

On Track to Success? Returns to Vocational Education against different alternatives

Sönke Matthewes and Guglielmo Ventura explore the labour market consequences of students’ enrolment in FE colleges in England

In the wake of Brexit and the global pandemic, skills shortages across the UK economy risk hampering efforts to reverse a decade of languishing productivity and festering inequality. Reinvigorating the long-neglected British vocational (or technical) education system is often hailed as a solution to this problem.

Arguments in favour of vocational education are familiar: it caters for more than just academic talents, while equipping the future workforce with essential skills for the well-functioning of the economy. It can aid the transition from school to work and enhance productivity, thanks to closer links between what is taught and the skills employers need. Critics worry this may come at the expense of general skills - reducing future workers’ adaptability to ever-changing patterns of work. In practice, whether students benefit from vocational education will depend on what their alternative is: those who would otherwise leave education altogether might well benefit from gaining extra skills, even if the qualifications gained are at a low level. The picture is less clear for those who would otherwise complete academic schooling and possibly go on to obtain a university degree. Empirical evidence from the UK has not yet provided a convincing answer.

In a recent study we contribute to this debate with new evidence about the payoffs to vocational education in England. A new empirical approach allows us to estimate these payoffs separately for two groups of students facing different alternatives: (1) those who would otherwise enrol in an academic sixth form and (2) those who would otherwise take no post-16 courses.

We follow the education and labour market careers of three cohorts of state-educated students who sat their GCSEs between 2002 and 2004. At the time, the school-leaving age was still 16 and it was not uncommon for students not to take any course after their GCSEs (14 percent). Those 86 percent continuing their studies were evenly split between academic institutions (sixth form schools or sixth form colleges) and more vocational institutions (mostly Further Education colleges). Unsurprisingly, the three groups of students (vocational, academic and no further education) do not look alike in terms of their previous academic achievement or socio-economic status. Any simple comparison of their labour market careers would thus be misleading.

To overcome this problem and ensure we compare the education choices of otherwise similar students, we exploit students’ geographic proximity to academic and vocational providers as a driver of their post-16 education choices. For this, we focus on students from schools without sixth form provision who move institution after GCSEs. Intuitively, students living further away from a sixth form college are more likely to enrol in an FE college. Similarly, living further away from an FE college increases students’ probability of choosing an academic provider or, to a lesser extent, of leaving education entirely. We also take account of a vast range of student-, school- and neighbourhood-level characteristics to make sure students’ proximity to post-16 institutions does not reflect better labour market opportunities or residential sorting. Under this approach, the estimated payoffs relate to students whose education choices are influenced by distance to the different institutional types (‘marginal’ students).
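To make the logic of this distance-based approach concrete, here is a deliberately simplified two-stage least squares sketch. It illustrates the general idea only - it is not the authors’ actual estimator (which allows for heterogeneous effects across students) - and the dataset and column names are hypothetical.

```python
# Simplified 2SLS sketch of the distance-based identification idea described above.
# Illustration only: not the authors' estimator; all file and column names are hypothetical.
import pandas as pd
from linearmodels.iv import IV2SLS  # pip install linearmodels

df = pd.read_csv("post16_choices.csv")  # hypothetical student-level dataset

# Outcome: log annual earnings at age 29-30.
# Endogenous regressor: enrolling in an FE college rather than a sixth form (0/1).
# Instruments: home-to-provider distances (nearest FE college, nearest sixth form).
# Controls: prior attainment plus school- and neighbourhood-level characteristics.
controls = df[["gcse_score", "free_school_meals", "school_avg_attainment",
               "local_unemployment"]].assign(const=1.0)

model = IV2SLS(
    dependent=df["log_earnings_age30"],
    exog=controls,
    endog=df["fe_college"],
    instruments=df[["dist_fe_college", "dist_sixth_form"]],
)
results = model.fit(cov_type="robust")
print(results.summary)
```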

Our analysis paints a rather different picture depending on the group of marginal students considered.  Let’s focus first on those who enrol in FE colleges as an alternative to Sixth Form Colleges. For these students, enrolling in vocational institutions leads to a loss in annual earnings at age 29-30 of £2,900 (or 11 percent) for males and £1,700 (or 8 percent) for females. These gaps open up very early in students’ careers (in their mid-twenties). They are not explained by differences in labour market participation as vocational and academic graduates are equally likely to be employed, but are due to vocational students being more likely to move into lower-paid jobs with worse wage progression.

But what drives this difference? We find that male vocational students are 5 percentage points less likely to achieve qualifications at Level 3 (A-Levels or equivalent) and about 5 percentage points less likely to obtain a university degree than if they had studied in a Sixth Form College. Additionally, vocational education almost halves students’ chances of enrolling in more selective universities. The weaker academic progression is not compensated by a higher probability of starting an apprenticeship. Overall, differences in educational attainment and progression explain at least 20 per cent of the earnings penalty we find.

Our approach also allows us to unpack average payoffs and explore how they vary based on students’ underlying preferences for the academic and vocational options. They do so considerably: students with a stronger motivation to pursue academic education in Sixth Form Colleges (they are willing to travel longer distances to enrol) are penalised to a much greater extent if diverted to FE Colleges. But while payoffs to vocational education are negative for most marginal students, we find some tentative evidence that the least academically inclined students benefit from it. Perhaps unsurprisingly, students who enrol in vocational education as an alternative to leaving education at age 16 seem to increase their annual earnings - although results are less conclusive, as students’ choice to enrol in vocational education rather than dropping out is not strongly driven by whether there is an FE college within reach.

In light of these results, policy efforts should focus on improving the quality of the vocational track by tackling some of its well-documented problems. After all, recent economic studies from Nordic countries support the idea that, under certain conditions, vocational programmes can benefit students even compared to academic ones. First, vocational programmes must have well-signposted progression pathways to tertiary education, together with better career guidance and financial support for students. In this respect, the recent roll-out of Institutes of Technology and the announcement of more comprehensive post-18 funding may improve vocational students’ progression through the system. Second, internationally, vocational programmes appear to work better when they are closely linked with workplace-based training. In the UK, apprenticeships have become less common for 16-19-year-olds than for older people over the years. Without strong incentives for firms’ involvement, recent reforms, such as the introduction of T-Levels with mandatory work placements and the consolidation of employer-designed apprenticeship standards, risk being futile.

Overall, if we are to be serious in this country about promoting growth and reducing inequality by improving and diversifying skills, there needs to be much more policy attention towards the Further Education sector and the challenges its students face.


Matthewes and Ventura (2022). On Track to Success? Returns to vocational education against different alternatives. CVER Discussion Paper 038. London School of Economics.

Tuesday, 8 March 2022

Do Management Practices Matter in Further Education?

Better management practices in FE colleges could help students from disadvantaged backgrounds, say Sandra McNally, Luis Schmidt and Anna Valero.

In the first study to evaluate management practices in Further Education (FE) colleges, published today, we find that well run colleges boost student performance and can help close the gap between poorer pupils and their peers.

The FE sector plays a vital role in helping people acquire education and skills, in improving social mobility (Augar Review, 2019) and in “levelling up” opportunity across and within regions. FE and Sixth Form colleges enrol about half of each cohort completing compulsory full-time education at age 16, including a disproportionate share of students from disadvantaged backgrounds. FE colleges are also important for adults who wish to train and reskill. Despite their importance, we know relatively little about how to improve efficacy in FE colleges.

Our study is the first to evaluate management practices in colleges – and although its findings are inevitably specific to the institutional context of the UK, it also has relevance to institutions with similar aims in other countries (such as community colleges in the US). In addition to looking at overall performance, we also examine whether better management practices help students from disadvantaged backgrounds. This is a very pertinent issue as the share of students from disadvantaged families enrolling straight after GCSEs is about twice what it is in other educational settings.

We investigate whether management practices in FE colleges influence performance in 16-19 education. We collect our own data on management practices using the methodology of the World Management Survey (WMS) (Bloom and Van Reenen, 2007) as applied to the FE sector. These methods were first applied to the manufacturing sector in a handful of countries and have now been carried out across 35 countries worldwide and in a variety of sectors including schools, universities and healthcare (see Scur et al. (2021) for an overview). Across these different settings, good management practices are a key driver of performance.

In our survey, college principals are asked 21 questions about their management practices across college operations, monitoring, target setting and people/talent management, and each is scored between 1 and 5, where 5 indicates the college has fully adopted good practice. We link our survey data to administrative data for educational outcomes, progression and other important characteristics of these institutions and the people who attend them.
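As a simple illustration of the scoring described above, the sketch below averages the 21 question scores into a single college-level management score. It is a minimal stand-in for the aggregation - the paper’s exact procedure (for example, any standardisation of scores) may differ - and the example scores are made up.

```python
# Minimal sketch: average 21 interview question scores (each on a 1-5 scale) into
# one college-level management score. Illustrative only; the paper's exact
# aggregation may differ and the example scores below are made up.
def management_score(question_scores):
    assert len(question_scores) == 21, "the survey has 21 questions"
    assert all(1 <= s <= 5 for s in question_scores), "scores lie on a 1-5 scale"
    return sum(question_scores) / len(question_scores)

example_college = [4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 4, 3, 4, 5, 4, 4, 4, 5, 4, 4, 5]
print(round(management_score(example_college), 2))  # 4.19
```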

Our first key finding is that structured management practices appear to matter for educational achievement and progression to university education. For example, if the management score increases from an average of 4.28 (out of 5) to 4.64, the probability of a young person achieving a ‘level 3’ qualification (e.g. A-levels or BTECs) or going to university increases by 2 percentage points. 

Our second main finding is that good management practices matter more for achieving a level 3 qualification among students from low-income backgrounds. In a hypothetical scenario where a learner is moved from a college with relatively poor management practices (i.e. at the 10th percentile) to one with relatively good practices (i.e. at the 90th percentile), they are eight percentage points more likely to achieve a level 3 qualification. This is nearly half of the educational gap between those from poor and non-poor backgrounds. The labour market return to level 3 qualifications is at least six per cent (Machin et al., 2018). Improving college management practices could therefore reduce inequality and improve social mobility.

The effect on progress to university is driven by students who enter FE colleges with good GCSEs at age 16 and by institutions focused on higher education qualifications (Level 4 or higher, e.g. Foundation Degrees, HNCs, HNDs). Well-managed FE colleges have the potential to be engines of social mobility at this higher level, at least for those students who are already well-prepared when they enter.

What factors lead to good management? We find that spatial competition from nearby colleges may help. Although good leadership is correlated with good management, we find that management practices do not simply reflect the influence of college principals or more effective leadership. Management practices can be thought of as a type of technology (Bloom et al., 2016), evolving slowly as particular leaders come and go. An important area for future research is to further explore the interaction between management practices and leadership styles, as we know that principals do matter for outcomes in this sector (Ruiz-Valenzuela et al. 2017).

This paper suggests good management practices at FE colleges play a big role in the prospects of young people in general, and those from disadvantaged backgrounds in particular. Furthermore, as ‘good management’ is slow-changing, any positive effects apply to new groups of college entrants each year. Improving management practices in colleges across the country could therefore be an important channel for reducing inequalities.

About the authors
Luis Schmidt is a pre-doctoral researcher at the LSE’s Suntory and Toyota International Centres for Economics and Related Disciplines (STICERD).

Anna Valero is a Senior Policy Fellow at the LSE's Centre for Economic Performance, London School of Economics, and Deputy Director of the Programme on Innovation and Diffusion (POID), London School of Economics.

Sandra McNally is a Professor of Economics at the University of Surrey. She is Director of the Centre for Vocational Education Research at the London School of Economics and is also Director of the Education and Skills Programme at the Centre for Economic Performance, London School of Economics.

Tuesday, 23 November 2021

The Effects of College Capital Projects on Student Outcomes

 Stephen Gibbons, Claudia Hupkau, Sandra McNally, Henry Overman

About half of school leavers in England attend colleges of Further Education (FE), though these colleges are often considered the poor relation of schools and universities, enrolling lower achieving students and spending less per student (Britton et al. 2019). Capital expenditure accounts for about 10 per cent of FE College expenditure and in the 2020 budget the UK government committed £1.5 billion over five years to bring college facilities up to a good level.

Will this investment make much of a difference to student outcomes? In a Centre for Vocational Education Research (CVER) paper published today, we suggest that it will help - getting more students to a good upper secondary qualification, increasing enrolment in higher education and even improving employment outcomes.

Our analysis uses information on capital expenditure programmes undertaken between 2006 and 2009, linking this to administrative data on individual educational and labour market outcomes up to 2017 (using the DfE Longitudinal Educational Outcomes data). We focus on large capital projects only. We find that these projects take about three years to complete and that changes in student outcomes take place at that time or in the year after. We find that large capital grants increase the share of students enrolled on upper secondary courses that lead to “good” qualifications such as A-levels or BTECs (i.e. at Level 3). This matters because less than half of young learners entering FE colleges progress to these courses (Hupkau et al. 2016). Level 3 qualifications are associated with higher earnings (McIntosh and Morris, 2016) and are a prerequisite for university. Conditional on enrolment, large capital grants do not affect achievement. This is still a good outcome because it shows that enrolments go up with no (negative) effect on achievement rates. Investment in capital projects also affects the probability of enrolling in higher education. The magnitude of the effect is in the same ballpark as Machin et al. (2020), who consider the effect of marginally achieving a good grade in GCSE English on enrolment in upper secondary education and university degrees. Furthermore, any benefits from capital projects affect multiple cohorts of students.

These effects persist even after we allow for the fact that FE colleges see a marked change in student composition after the completion of capital projects – they attract students with higher prior achievement and a higher proportion of “non-poor” students (i.e. who did not receive free school meals when in school) – although there is no overall increase in the number of students. There is also a higher probability that students will achieve sustained employment. Effects are usually larger for the largest grants.

There are several reasons why capital expenditure may have these effects. First, substantial capital expenditure on new equipment, laboratories or workshops may improve learning on courses that rely on specific and costly assets (for instance, engineering). Second, better buildings may improve the learning experience. Safe, clean, and appealing learning environments – with no overcrowding, good lighting and heating - could improve concentration and lead to greater student and teacher morale and effort.  On the other hand, large capital expenditure projects may be disruptive and positive effects may take time to materialise. As we show, these positive effects also partly reflect changes in the composition of student intake – improving outcomes for colleges that receive grants but not necessarily for the whole system.

Our study takes advantage of two features of the data and of the way the capital expenditure programme was implemented to get closer to the causal impact of expenditure on outcomes. First, we use rich data on student-level outcomes and characteristics, which allows us to show that improved outcomes don’t simply reflect improved intakes. Second, we show that improved outcomes aren’t explained by expenditure being targeted at colleges that were already improving. Results improve at investing colleges even when compared to a control group of colleges that will benefit from investment in the near future.
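The comparison described in this paragraph can be thought of as a difference-in-differences or event-study design in which colleges funded later serve as not-yet-treated controls. The sketch below is a stylised illustration of that design under assumed variable names; it is not the paper’s exact specification.

```python
# Stylised two-way fixed effects sketch of the comparison described above:
# outcomes at colleges after a large capital project completes, relative to
# colleges whose own project is still to come. Illustration only; variable
# names are hypothetical and this is not the paper's exact specification.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("college_year_outcomes.csv")  # hypothetical college-by-year panel

# 'post' = 1 in the years after a college's capital project is completed, 0 before;
# colleges whose project completes later act as not-yet-treated controls while post = 0.
fit = smf.ols(
    "level3_enrolment_share ~ post + C(college_id) + C(year)",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["college_id"]})
print(fit.summary())
```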

Our study is one of very few to evaluate the effect of capital expenditure on students in post-secondary education. Most academic studies evaluate effects in schools, mostly in the US. Our use of micro-data and our careful attention to causality allow us to go beyond the government’s own analysis of FE capital expenditure (Business, Innovation and Skills, 2012), which found only small effects on student numbers and no effect on achievement or retention. Our study suggests that these effects were under-estimated.

Our results show that capital investment in college infrastructure has a visible effect on student outcomes within a reasonable timeframe. Investing in capital infrastructure can benefit many cohorts of students and is best considered a long-term investment. These results are, however, reassuring for policy makers who may be more concerned about short-term returns as they show that for large capital projects, the benefits materialise as soon as the project is complete.

Wednesday, 14 April 2021

Apprenticeships, the Levy and COVID-19

Gavan Conlon, Andy Dickerson, Steven McIntosh and Pietro Patrignani 


Recent policy developments in the apprenticeship system

In the 2015 Queen’s Speech, the UK Government set a target of 3 million new apprenticeship starts in England by 2020, a pledge confirmed by the new Government in 2017. During the same period, the English apprenticeship system experienced a series of major reforms, affecting the duration, training requirements, content, technical level and funding of apprenticeships. In particular, the government introduced an Apprenticeship Levy across the UK to help fund apprenticeship starts for large employers. Since April 2017, all UK employers with an annual pay bill of more than £3 million contribute 0.5% of their pay bill in excess of this threshold to the Apprenticeship Levy, which they can then use to fund apprenticeships.
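As a rough numerical illustration of the rule as described in this paragraph (a sketch based only on the 0.5 per cent-of-excess wording above, not a substitute for official guidance):

```python
# Rough illustration of the Levy rule as described above: 0.5% of the annual
# pay bill in excess of the 3 million pound threshold. Sketch only, based on
# the wording in the text rather than official guidance.
def levy_liability(annual_pay_bill, threshold=3_000_000, rate=0.005):
    return max(0.0, rate * (annual_pay_bill - threshold))

# An employer with a 5m pay bill: 0.5% of the 2m excess = 10,000 per year.
print(levy_liability(5_000_000))   # 10000.0
# An employer at or below the threshold pays nothing.
print(levy_liability(2_500_000))   # 0.0
```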

In this blog we examine the impact of these various changes to the apprenticeship system on recent enrolments, and then briefly describe the initial effects of the COVID-19 pandemic on apprenticeships.

Trends in apprenticeship starts

The 3 million target for new apprenticeship starts for 2015-2020 was not met. There were just over 2 million apprenticeship starts in the period, with a decline in apprenticeship starts from 509,000 in academic year 2015/16 to 393,000 starts in 2018/19 (the last full academic year unaffected by COVID-19)[1]. However, this significant decline in the aggregate annual number of starts masks divergent underlying trends in the nature of the apprenticeships and in the characteristics of learners. There was a substantial decline in the number of starts at Intermediate Level (RQF Level 2[2]) (and to a much lesser extent at Advanced Level - RQF Level 3), which was partly offset by a very rapid increase in the number of starts at Higher Level (RQF Level 4+) [3]. The fastest increase in Higher Level apprenticeships was observed in services activities, in particular in ‘Professional, scientific and technical activities’ and in ‘Financial and insurance activities’.

These trends were starting to emerge at the same time as the Apprenticeship Levy was introduced. Our research shows that employers subject to the Levy (i.e. those above the £3 million pay bill threshold) were generally more likely to engage with apprenticeship training as compared to non-Levy paying employers. In addition, the rise in starts at Higher levels was substantially greater for Levy employers as compared to non-Levy employers with similar characteristics, while the rapid fall in starts at Intermediate and Advanced levels occurred at a faster rate amongst non-Levy employers.

As a consequence, the fall in the overall number of starts cannot be directly attributed to the introduction of the Levy. Rather, the other changes to the apprenticeship system that occurred over the period, including the introduction of Apprenticeship Standards and the removal of Frameworks, the new 12-month minimum duration requirement and the requirement for at least 20% of training to be off-the-job, all contributed to the changing patterns in apprenticeship provision.

Nonetheless, Levy contributions remain under-used (i.e. Levy-paying employers do not spend all of the funds they have available for apprenticeships within the 24 months allowed), while for non-Levy employers (who receive a government contribution of 95% towards apprenticeship costs), sources have reported a shortage of funds restricting their ability to engage in new apprenticeships[4]. Recent trends also show a substantial shift towards Levy-funded apprenticeship starts, which accounted for 57% of new apprenticeships during 2018/19, compared with slightly below 50% in the previous year. Finally, Levy funds seem to be more often directed towards apprenticeship starts for older learners (aged 25+) and those not from socially disadvantaged backgrounds.[5]

Effect of COVID-19 on apprenticeships

The emerging data available for the COVID-19 period (from 23 March 2020, when face-to-face delivery was halted and training moved online where possible) show that the trends described above were exacerbated by the pandemic. The total number of apprenticeship starts declined further to 323,000 during academic year 2019/20 and fell again in the first quarter of 2020/21 (by around 30% compared to the same period in previous years), while the proportion of Levy-funded starts increased to 65% in 2019/20 (although it declined to 57% in the first quarter of 2020/21).

The COVID-19 pandemic also appears to have intensified the shift towards Higher Level apprenticeships and older apprentices (aged 25+), and away from more socially disadvantaged apprentices. For obvious reasons, apprenticeship starts in ‘Leisure, Travel and Tourism’ and ‘Retail and Commercial Enterprise’ were particularly adversely affected by the pandemic.

In response to the COVID-19 crisis, the Government introduced financial incentives for new apprentices in August 2020 – an additional £2,000 for each new start aged 16-24 and £1,500 for those aged 25+. The March 2021 Budget extended the incentive eligibility period to September 2021 and also increased the size of the bonus to £3,000 per new apprentice, irrespective of age. However, take-up of this scheme has been limited so far, with approximately 25,000 employers having submitted claims for the bonus as of 1 February 2021 (compared to a budget allocation of up to 100,000 bonus payments)[6].

What next?

Apprenticeship training is a key component of skills development and has been subject to extensive policy attention both before and during the pandemic. Despite its importance, little is currently known about the effects of the COVID-19 pandemic on new and in-progress apprenticeships and how the funding rules for Levy and non-Levy employers should be reformed. In-depth analyses and discussions in these areas would inform policymakers and stakeholders on how to ensure that the apprenticeship system delivers skill development and training efficiently and effectively for different businesses and learners.



[2] The RQF (Regulated Qualifications Framework) for England and Northern Ireland enables individuals to compare different qualifications according to their level (knowledge and skills), from Entry Level 1 through to Level 8. For example, A-levels are RQF Level 3 and undergraduate (bachelor’s) degrees are RQF Level 6. The RQF replaced the National Qualifications Framework (NQF) and the Qualifications and Credit Framework (QCF) in 2015.

Thursday, 29 October 2020

STARK CONTRAST BETWEEN PRE- AND POST-16 PROVISION IN UTCs

Prime Minister Boris Johnson has put the government’s skills policy agenda in the spotlight. 

In a recent speech denouncing skills shortages in several technical occupations, the Prime Minister vowed to ‘end the pointless, nonsensical gulf … between the so-called academic and the so-called practical varieties of education’.

University Technical Colleges (UTCs) are state-funded 14-19 schools established by the government over the past ten years with a very similar intent. In our new research article we seek to understand whether UTCs have been successful at bridging the gap between academic and technical education in England.

UTCs were conceived as a response to a growing technical skills gap and to the perception that young people lacked adequate and engaging technical skills provision at school. The cornerstone of UTCs is a teaching curriculum that blends core academic subjects with technical subjects that meet regional skill shortages (such as engineering, manufacturing or digital technology). UTCs also benefit from the direct engagement of local universities and employers: industry experts help design and deliver project-based learning that seeks to develop the skills and attributes valued in the workplace.

There are currently 48 UTCs open in England, according to Department for Education figures for September 2020. The first UTC opened in 2010; since then, 58 more have opened across England. Despite this expansion, over the years the UTC model has been dogged by controversy. As new schools with no established record and little publicity, UTCs struggled to recruit enough students: recruitment at age 14 proved particularly challenging, as English students do not typically change school at that age. Low operating capacity has dented the financial viability of a number of UTCs, resulting in 11 closing or changing status. UTCs have also been criticised for their poor performance in national examinations (see e.g. Dominguez-Reig and Robinson, 2018 and Thorley, 2017). Part of this poor performance, however, might be due to the fact that UTC students look very different from typical students. In our study of the effectiveness of UTCs we devise an empirical strategy to take this into account, and are able to evaluate the causal effect of enrolling in a UTC on student outcomes.

Our research focuses on 30 UTCs that opened between 2010 and 2014. Having access to education registry data linked to tax records (LEO data), we were able to follow cohorts of students who enrolled in a UTC either in Year 10 (age 14) or Year 12 (age 16). For Year 10 entrants, we measure academic performance two years later in the high-stakes end-of-secondary-school exams (GCSEs). For Year 12 entrants, we investigate post-16 course choices, achievement at Level 3 (e.g. A-Levels or BTECs) and whether students start an apprenticeship. We are also able to look at students’ post-18 transition into Higher Education or into the labour market. Our research strategy leverages variation in UTCs’ geographical availability across student cohorts, allowing us to compare UTC students with arguably very similar non-UTC students.

Our findings reveal a stark contrast between pre- and post-16 UTC provision. Consider Year 10 enrolment first: we find that enrolling in a UTC has a detrimental effect on GCSE achievement. Students who attend UTCs are 26 percentage points less likely to get 5 or more GCSEs with good grades than similar non-UTC students, a large negative effect equivalent to twice the achievement gap between disadvantaged and non-disadvantaged students. UTCs also significantly reduce students’ chances of achieving grade C (now grade 4) in English and maths. These results are concerning: research at CVER warns about the negative consequences of students missing out on grade C in English, which limits student progression over the longer term (Machin et al. 2020).

We find no such detrimental effect on education outcomes for Year 12 entrants: UTC enrolment does not affect students’ probability of achieving at least one A-Level and makes them much more likely to enter and achieve a technical qualification at Level 3. Perhaps unsurprisingly, we find a strong effect on the probability of entering STEM qualifications (higher by 25 percentage points). Impressively, UTCs increase students’ probability of starting an apprenticeship by 14 percentage points. This is a potentially very positive outcome: evidence from CVER points to substantial apprenticeship payoffs for young people (Cavaglia et al. 2020). Finally, we document positive effects of UTC enrolment up to one year after leaving school. While UTCs do not appear to be better (or worse) at sending students to university, they are very good at propelling students into STEM degrees. They also ease the transition into the labour market: as a result of UTC enrolment in Year 12, students are 3 percentage points less likely to be NEET (not in education, employment or training) one year after leaving school.

What can explain these dramatically different results? One concern is that combining the academically demanding GCSE curriculum with additional technical subjects, at a time when students are also adapting to a new school, may prove too challenging, especially since students moving school in Year 10 are doing so at a non-standard transition point. Furthermore, Year 10 recruits are less academically able than Year 12 recruits (in terms of maths and English test scores), and we find that UTCs are better at teaching more academically able students. As more UTCs move to recruit students at a natural transition point (i.e. at age 11 as well as age 16), their performance might improve, to the extent that they become better able to attract higher-attaining applicants and have longer to teach the broader curriculum before exams at age 16. More generally, we need to bear in mind that UTCs are brand new schools and shouldn’t be judged too hastily. We find some evidence that UTCs improve with time. While the jury is still out on the longer-term effects of this policy, our study gives grounds for hope.

This article was originally published on the TES website on 14 October 2020: https://www.tes.com/news/why-do-over-16s-utcs-perform-under-15s-dont.

Guglielmo Ventura is a research assistant at the Centre for Vocational Education Research at the London School of Economics and Political Science.

Related TES coverage from Kate Parker: https://www.tes.com/news/revealed-utc-attainment-gap-pre-and-post-16s


Thursday, 25 June 2020

What you study vs where you study: how FE choices affect earnings and academic achievement


By Esteban Aucejo, Claudia Hupkau and Jenifer Ruiz-Valenzuela


Following the unprecedented number of job losses and the bleak economic outlook due to the Covid-19 crisis, more people will be considering staying on in, or returning to, education. Vocational education and training (VET) is likely to play a crucial role in providing the skills needed for economic recovery, including retraining workers who have been made redundant. In this context, it is crucial to have good information on the returns to the different fields of study that can be taken at FE colleges, and on whether the institution one attends matters for earnings and employment prospects. Our new research published by the Centre for Vocational Education Research (CVER) finds that when it comes to VET, what you study is very important for future earnings. Where you study also matters for younger people, but less so for adults.

We used data from more than one million students over 13 years to investigate how much value attending an FE college adds in terms of academic achievement, earnings and employment, taking into account learners’ prior achievements and their socio-economic background. Our study considers both young learners, who mostly join FE colleges shortly after compulsory education, as well as adult learners, who have often worked for many years before attending FE college.

Our value-added measure indicates that moving a student from a college ranked in the bottom 15% of the college value-added distribution to one ranked in the top 15% implies fairly modest earnings gains of 3% on average, measured around seven years after leaving FE college. The difference in earnings for adult learners is smaller, at 1.5%. The fact that college quality seems to matter more for young learners is likely due to young learners spending more time in FE colleges (i.e. they enrol in and complete substantially more learning than adults). The results for the likelihood of being employed show even smaller differences across FE colleges.

There is considerably more variation in FE colleges’ contributions to the educational attainment of their young learners. On average, the young people in our sample enrol in just under 600 learning hours but achieve only about 413 of them (69%); around 42% achieve a Level 3 qualification and 38% progress to higher education.

But were we to move a learner from a college ranked in the bottom 15% by value-added to one ranked in the top 15%, they would, on average, achieve 6.5% more learning hours (from 69% to 73.4%). They would be almost 11% more likely to achieve a Level 3 qualification (from 42% to 46.5%) and the likelihood of attending higher education would increase by 10% (from 38% to 42%). These are large effects. As young people are likely to attend their nearest college, the variability in value-added between institutions is a source of unequal opportunity between geographic areas.
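For clarity, the percentage figures quoted above are relative changes computed from the baseline and counterfactual levels reported in the text (as opposed to percentage-point differences). A minimal sketch of that arithmetic, reproducing the quoted numbers up to rounding:

```python
# Relative (percentage) change implied by moving from the baseline level to the
# counterfactual level quoted in the text; contrast with the percentage-point gap.
def relative_change_pct(baseline, counterfactual):
    return 100 * (counterfactual - baseline) / baseline

# Learning hours achieved: 69% -> 73.4% of planned hours, roughly 6.4% more hours.
print(round(relative_change_pct(69.0, 73.4), 1))   # 6.4
# Level 3 achievement: 42% -> 46.5%, i.e. ~10.7% more likely (a 4.5 point rise).
print(round(relative_change_pct(42.0, 46.5), 1))   # 10.7
# Higher education entry: 38% -> 42%, i.e. ~10.5% more likely (a 4 point rise).
print(round(relative_change_pct(38.0, 42.0), 1))   # 10.5
```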

What differentiates high value-added colleges from low value-added ones? Learning characteristics seem to play an important role. Colleges that offer a larger share of their courses in the classroom (as opposed to in the workplace or at a distance) have higher value-added in earnings for young learners. This is particularly relevant in light of the current crisis, where online and distance learning is expected to remain a regular feature, at least in the medium term. We also find significant correlations between the curriculum offer and value-added measures, with colleges offering more exam-assessed qualifications (as opposed to competency-based) showing higher value-added.  

While where you study does not imply large differences in earnings after college, what you study has a much bigger effect, especially for female and young learners. We carried out a separate analysis looking at students’ earnings before and after attending FE college. In this analysis the young people were aged 18-20 and so had been working for up to two years prior to study. Table 1 below shows the three most popular fields (in terms of learners doing most of their guided learning hours in that particular field, i.e. specialising in them) by gender and age group.

The two fields of engineering and manufacturing technology, and business administration and law, show large levels of enrolment among males and lead to large positive returns. For instance, the typical young male learner who chooses engineering and manufacturing technology as his main field of study will earn, on average, almost 7% more five years after finishing education compared with his earnings before attending FE college, after adjusting for inflation. For adult male learners specialising in this field, earnings rise by 1.5% five years after leaving college. In contrast, young male learners specialising in retail and commercial enterprise do not see any increase in earnings five years after attending FE college. These results take into account that earnings increase with experience, irrespective of which field one specialised in. Business administration and law, and health, public services and care are the two fields that show high levels of enrolment and consistently positive returns for women across age groups.

While we find consistently higher returns to fields of study for women than for men, this does not mean that overall, they have higher earnings post FE-college attendance. It means that compared to before enrolment, they experience steeper increases in earnings after completing their education at FE colleges. We also find that many specialisations present negative returns immediately after leaving college that turn positive five years after graduation, indicating that it takes time for positive returns to be reflected in wages. The fact that timing matters suggests that policy makers should be extremely cautious about evaluating colleges in terms of the labour market performance of their students.

Our findings also have relevant practical implications for students since they could help them to get a better understanding of the variation in FE college quality and to compare the returns to different fields of study. This information is likely to be particularly important considering the evidence suggesting that students tend to be misinformed about the labour market returns of VET qualifications.

Table 1. Top 3 fields of study by proportion of learners who specialise in them

Field of study (main field)                  | Mean GLH | Return 1 year post-FE | Return 5 years post-FE | Share specialising
Young male learners
  Engineering and Manufacturing Technology   |      632 |                 0.04  |                 0.068  |             20.60%
  Construction, Planning & Built Environment |      621 |                -0.001 |                 0.023  |             16.60%
  Arts, Media and Publishing                 |      942 |                -0.064 |                -0.003  |             10.70%
Adult male learners
  Health, Public Services and Care           |       77 |                -0.006 |                 0.004  |             19.00%
  Engineering and Manufacturing Technology   |      206 |                -0.008 |                 0.015  |             18.90%
  Business Administration and Law            |      131 |                 0.003 |                 0.009  |             14.20%
Young female learners
  Health, Public Services and Care           |      525 |                -0.002 |                 0.045  |             25.20%
  Retail and Commercial Enterprise           |      597 |                 0.036 |                 0.115  |             23.40%
  Business Administration and Law            |      430 |                 0.040 |                 0.118  |             13.60%
Adult female learners
  Health, Public Services and Care           |      142 |                -0.008 |                 0.020  |             34.30%
  Business Administration and Law            |      189 |                 0.004 |                 0.019  |             14.80%
  Education and Training                     |      143 |                -0.007 |                 0.027  |             12.70%

Note: The estimated returns reported are the marginal effects, one and five years after leaving the college respectively, of choosing the field as the main field of study. This is a summary table; the complete tables can be found in Tables 9 to 12 of CVER DP 030.

This blog post appeared first on TES and is republished here with permission.