Tuesday, 8 March 2022

Do Management Practices Matter in Further Education?

Better management practices in FE colleges could help students from disadvantaged backgrounds, say Sandra McNally, Luis Schmidt and Anna Valero

In the first study to evaluate management practices in Further Education (FE) colleges, published today, we find that well-run colleges boost student performance and can help close the gap between poorer pupils and their peers.

The FE sector plays a vital role in helping people acquire education and skills, in improving social mobility (Augar Review, 2019) and in “levelling up” opportunity across and within regions. FE and Sixth Form colleges enrol about half of each cohort completing compulsory full-time education at age 16, including a disproportionate share of students from disadvantaged backgrounds. FE colleges are also important for adults who wish to train and reskill. Despite their importance, we know relatively little about how to improve effectiveness in FE colleges.

Our study is the first to evaluate management practices in colleges – and although its findings are inevitably specific to the institutional context of the UK, it also has relevance to institutions with similar aims in other countries (such as community colleges in the US). In addition to looking at overall performance, we also examine whether better management practices help students from disadvantaged backgrounds. This is a very pertinent issue as the share of students from disadvantaged families enrolling straight after GCSEs is about twice what it is in other educational settings.

We investigate whether management practices in FE colleges influence performance in 16-19 education. We collect our own data on management practices using the methodology of the World Management Survey (WMS) (Bloom and Van Reenen, 2007) as applied to the FE sector. These methods were first applied to the manufacturing sector in a handful of countries; the survey has since been carried out in 35 countries worldwide and in a variety of sectors including schools, universities and healthcare (see Scur et al. (2021) for an overview). Across these different settings, good management practices have been shown to be a key driver of performance.

In our survey, college principals are asked 21 questions about their management practices across college operations, monitoring, target setting and people/talent management, and each answer is scored from 1 to 5, where 5 indicates that the college has fully adopted good practice. We link our survey data to administrative data on educational outcomes, progression and other important characteristics of these institutions and the people who attend them.

Our first key finding is that structured management practices appear to matter for educational achievement and progression to university education. For example, if the management score increases from an average of 4.28 (out of 5) to 4.64, the probability of a young person achieving a ‘level 3’ qualification (e.g. A-levels or BTECs) or going to university increases by 2 percentage points. 

Our second main finding is that good management practices are more important for achieving a level 3 qualification for students from low-income backgrounds. In a hypothetical scenario where a learner is moved from a college with relatively poor management practices (i.e. at the 10th percentile) to one with relatively good practices (i.e. at the 90th percentile), they are eight percentage points more likely to achieve a level 3 qualification. This is nearly half of the educational gap between those from poor and non-poor backgrounds. The labour market return to level 3 qualifications is at least six per cent (Machin et al., 2018). Improving college management practices could therefore reduce inequality and improve social mobility.
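To put these magnitudes in context, the back-of-the-envelope arithmetic below works through the two comparisons above using only the figures quoted in the text; the "implied" quantities are simple derivations for illustration, not additional results from the paper.

```python
# Effect of the management score (figures quoted in the text above).
score_change = 4.64 - 4.28        # a 0.36-point rise on the 1-5 scale
effect_pp = 2.0                   # associated rise in probability, in percentage points
print(f"~{effect_pp / score_change:.1f} pp per one-point rise in the management score")

# Moving from the 10th to the 90th percentile of management practices raises the
# chance of achieving a level 3 qualification by 8 pp, described as "nearly half"
# of the poor/non-poor achievement gap.
percentile_effect_pp = 8.0
print(f"implied poor/non-poor gap of at least ~{percentile_effect_pp / 0.5:.0f} pp")
```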

The effect on progress to university is driven by students who enter FE colleges with good GCSEs at age 16 and by institutions focused on higher education qualifications (Level 4 or higher, e.g. Foundation Degrees, HNCs, HNDs). Well-managed FE colleges have the potential to be engines of social mobility at this higher level, at least for those students who are already well-prepared when they enter.

What factors lead to good management? We find that spatial competition from nearby colleges may help. Although good leadership is correlated with good management, we find that management practices do not simply reflect the influence of college principals or more effective leadership. Management practices can be thought of as a type of technology (Bloom et al., 2016), evolving slowly as particular leaders come and go. An important area for future research is to further explore the interaction between management practices and leadership styles, as we know that principals do matter for outcomes in this sector (Ruiz-Valenzuela et al. 2017).

This paper suggests that good management practices at FE colleges play an important role in the prospects of young people in general, and of those from disadvantaged backgrounds in particular. Furthermore, as ‘good management’ is slow-changing, any positive effects apply to new groups of college entrants each year. Improving management practices in colleges across the country could therefore be an important channel for reducing inequalities.

About the authors
Luis Schmidt is a pre-doctoral researcher at the LSE’s Suntory and Toyota International Centres for Economics and Related Disciplines (STICERD).

Anna Valero is a Senior Policy Fellow at the Centre for Economic Performance and Deputy Director of the Programme on Innovation and Diffusion (POID), both at the London School of Economics.

Sandra McNally is a Professor of Economics at the University of Surrey. She is Director of the Centre for Vocational Education Research and of the Education and Skills Programme at the Centre for Economic Performance, both based at the London School of Economics.

Tuesday, 23 November 2021

The Effects of College Capital Projects on Student Outcomes

Stephen Gibbons, Claudia Hupkau, Sandra McNally and Henry Overman

About half of school leavers in England attend colleges of Further Education (FE), though these colleges are often considered the poor relation of schools and universities, enrolling lower achieving students and spending less per student (Britton et al. 2019). Capital expenditure accounts for about 10 per cent of FE College expenditure and in the 2020 budget the UK government committed £1.5 billion over five years to bring college facilities up to a good level.

Will this investment make much of a difference to student outcomes? In a Centre for Vocational Education Research (CVER) paper published today, we suggest that it will help – getting more students to a good upper secondary qualification, increasing enrolment in higher education and even improving employment outcomes.

Our analysis uses information on capital expenditure programmes undertaken between 2006 and 2009, linking this to administrative data on individual educational and labour market outcomes up to 2017 (using the DfE Longitudinal Education Outcomes data). We focus on large capital projects only. We find that these projects take about three years to complete and that changes in student outcomes take place at that time or in the year after. We find that large capital grants increase the share of students enrolled on upper secondary courses that lead to “good” qualifications such as A-levels or BTECs (i.e. at Level 3). This matters because less than half of young learners entering FE colleges progress to these courses (Hupkau et al., 2016). Level 3 qualifications are associated with higher earnings (McIntosh and Morris, 2016) and are a pre-requisite for university. Conditional on enrolment, large capital grants do not affect achievement. This is still a good outcome because it shows that enrolments go up with no (negative) effect on achievement rates. Investment in capital projects also affects the probability of enrolling in higher education. The magnitude of the effect is in the same ballpark as in Machin et al. (2020), who consider the effect of marginally achieving a good grade in GCSE English on enrolment in upper secondary education and university degrees. Furthermore, any benefits from capital projects affect multiple cohorts of students.

These effects persist even after we allow for the fact that FE colleges see a marked change in student composition after the completion of capital projects – they attract students with higher prior achievement and a higher proportion of “non-poor” students (i.e. who did not receive free school meals when in school) – although there is no overall increase in the number of students. There is also a higher probability that students will achieve sustained employment. Effects are usually larger for the largest grants.

There are several reasons why capital expenditure may have these effects. First, substantial capital expenditure on new equipment, laboratories or workshops may improve learning on courses that rely on specific and costly assets (for instance, engineering). Second, better buildings may improve the learning experience: safe, clean and appealing learning environments – with no overcrowding, and with good lighting and heating – could improve concentration and lead to greater student and teacher morale and effort. On the other hand, large capital expenditure projects may be disruptive, and positive effects may take time to materialise. As we show, these positive effects also partly reflect changes in the composition of the student intake – improving outcomes for colleges that receive grants but not necessarily for the whole system.

Our study takes advantage of two features of the data and of the way the capital expenditure programme was implemented to get closer to the causal impact of expenditure on outcomes. First, we use rich data on student-level outcomes and characteristics, which allows us to show that improved outcomes don’t simply reflect improved intakes. Second, we show that improved outcomes aren’t explained by expenditure being targeted at colleges that were already improving: results improve at investing colleges even when compared to a control group of colleges that will benefit from investment in the near future.
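To illustrate that second point, here is a stylised sketch of the comparison the design implies: the change in an outcome at a college whose large project has been completed is set against the change over the same years at a college whose own project is still in the pipeline. The data frame and numbers below are hypothetical, purely for illustration, and are not taken from the paper's data.

```python
import pandas as pd

# Hypothetical college-by-year panel: college A completes its capital project
# between 2007 and 2010, while college B's project is still in the pipeline.
df = pd.DataFrame({
    "college": ["A", "A", "B", "B"],
    "year": [2007, 2010, 2007, 2010],
    "level3_share": [0.40, 0.46, 0.41, 0.42],   # e.g. share enrolled on Level 3 courses
})

change = (
    df.sort_values("year")
      .groupby("college")["level3_share"]
      .agg(lambda s: s.iloc[-1] - s.iloc[0])    # change from 2007 to 2010 per college
)

# Difference-in-differences style comparison against the not-yet-treated college.
print(f"treated change: {change['A']:+.2f}, control change: {change['B']:+.2f}, "
      f"difference: {change['A'] - change['B']:+.2f}")
```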

Our study is one of very few to evaluate the effect of capital expenditure on students in post-secondary education. Most academic studies evaluate effects in schools, mostly in the US. Our use of micro-data and our careful attention to causality allow us to go beyond the government’s own analysis of FE capital expenditure (Department for Business, Innovation and Skills, 2012), which found only small effects on student numbers and no effect on achievement or retention. Our study suggests that these effects were under-estimated.

Our results show that capital investment in college infrastructure has a visible effect on student outcomes within a reasonable timeframe. Investing in capital infrastructure can benefit many cohorts of students and is best considered a long-term investment. The results should nevertheless reassure policymakers who are more concerned about short-term returns: for large capital projects, the benefits materialise as soon as the project is complete.

Wednesday, 14 April 2021

Apprenticeships, the Levy and COVID-19

Gavan Conlon, Andy Dickerson, Steven McIntosh and Pietro Patrignani 


Recent policy developments in the apprenticeship system

In the 2015 Queen’s Speech, the UK Government set a target of 3 million new apprenticeship starts in England by 2020, a pledge confirmed by the new Government in 2017. During the same period, the English apprenticeship system experienced a series of major reforms affecting the duration, training requirements, content, technical level and funding of apprenticeships. In particular, the government introduced an Apprenticeship Levy across the UK to help fund apprenticeship starts at large employers. Since April 2017, all UK employers with an annual pay bill of more than £3 million have contributed 0.5% of their pay bill in excess of this threshold to the Apprenticeship Levy, which they can then use to fund apprenticeships.
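As a concrete illustration of the calculation just described, the short function below computes an employer's annual Levy contribution from its pay bill. This is a simplified sketch based only on the 0.5% rate and £3 million threshold quoted here; it ignores details such as the levy allowance mechanics and rules for connected companies.

```python
def apprenticeship_levy(annual_pay_bill: float) -> float:
    """Simplified annual Levy contribution: 0.5% of the pay bill in excess
    of the £3 million threshold (per the description in the text)."""
    threshold, rate = 3_000_000, 0.005
    return rate * max(0.0, annual_pay_bill - threshold)

# An employer with a £5 million pay bill contributes 0.5% of £2 million:
print(apprenticeship_levy(5_000_000))   # 10000.0
# An employer below the threshold contributes nothing:
print(apprenticeship_levy(2_500_000))   # 0.0
```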

In this blog we examine the impact of these various changes to the apprenticeship system on recent enrolments, and then briefly describe the initial effects of the COVID-19 pandemic on apprenticeships.

Trends in apprenticeship starts

The 3 million target for new apprenticeship starts over 2015-2020 was not met. There were just over 2 million apprenticeship starts in the period, with a decline from 509,000 starts in academic year 2015/16 to 393,000 in 2018/19 (the last full academic year unaffected by COVID-19)[1]. However, this significant decline in the aggregate annual number of starts masks divergent underlying trends in the nature of the apprenticeships and in the characteristics of learners. There was a substantial decline in the number of starts at Intermediate Level (RQF Level 2[2]) and, to a much lesser extent, at Advanced Level (RQF Level 3), which was partly offset by a very rapid increase in the number of starts at Higher Level (RQF Level 4+)[3]. The fastest increase in Higher Level apprenticeships was observed in service activities, in particular in ‘Professional, scientific and technical activities’ and in ‘Financial and insurance activities’.

These trends were starting to emerge at the same time as the Apprenticeship Levy was introduced. Our research shows that employers subject to the Levy (i.e. those above the £3 million pay bill threshold) were generally more likely to engage with apprenticeship training as compared to non-Levy paying employers. In addition, the rise in starts at Higher levels was substantially greater for Levy employers as compared to non-Levy employers with similar characteristics, while the rapid fall in starts at Intermediate and Advanced levels occurred at a faster rate amongst non-Levy employers.

As a consequence, the fall in the overall number of starts cannot be directly attributed to the introduction of the Levy. Rather, the other changes to the apprenticeship system that occurred over the period – including the introduction of Apprenticeship Standards and the removal of Frameworks, the new 12-month minimum duration and the requirement that at least 20% of time be spent on off-the-job training – all contributed to the changing patterns in apprenticeship provision.

Nonetheless, Levy contributions remain under-used (i.e. Levy-paying employers do not spend all of the funds they have available for apprenticeships within the 24 months allowed), while for non-Levy employers (who receive a government contribution of 95% towards apprenticeship costs), sources have reported a shortage of funds restricting their ability to take on new apprenticeships[4]. Recent trends also show a substantial shift towards Levy-funded apprenticeship starts, which accounted for 57% of new apprenticeships in 2018/19, compared with slightly below 50% in the previous year. Finally, Levy funds seem to be directed more often towards apprenticeship starts for older learners (aged 25+) and those not from socially disadvantaged backgrounds.[5]

Effect of COVID-19 on apprenticeships

The emerging data available for the COVID-19 period (from 23 March 2020, when face-to-face delivery was halted and training moved online where possible) show that the trends described above were exacerbated by the pandemic. The total number of apprenticeship starts declined further to 323,000 in academic year 2019/20 and fell again in the first quarter of 2020/21 (by around 30% compared with the same period in previous years), while the proportion of Levy-funded starts increased to 65% in 2019/20 (although this declined to 57% in the first quarter of 2020/21).

The COVID-19 pandemic also appears to have intensified the shift towards Higher Level apprenticeships and older apprentices (aged 25+), and away from more socially disadvantaged apprentices. For obvious reasons, apprenticeship starts in ‘Leisure, Travel and Tourism’ and ‘Retail and Commercial Enterprise’ were particularly adversely affected by the pandemic.

In response to the COVID-19 crisis, the Government introduced financial incentives for new apprentices in August 2020 – an additional £2,000 for each new start aged 16-24 and £1,500 for those aged 25+. The March 2021 Budget extended the incentive eligibility period to September 2021 and increased the size of the bonus to £3,000 per new apprentice, irrespective of age. However, take-up of this scheme has been limited so far, with approximately 25,000 employers having submitted claims for the bonus as of 1 February 2021 (compared with a budget allocation of up to 100,000 bonus payments)[6].

What next?

Apprenticeship training is a key component of skills development and has been subject to extensive policy attention both before and during the pandemic. Despite its importance, little is currently known about the effects of the COVID-19 pandemic on new and in-progress apprenticeships and how the funding rules for Levy and non-Levy employers should be reformed. In-depth analyses and discussions in these areas would inform policymakers and stakeholders on how to ensure that the apprenticeship system delivers skill development and training efficiently and effectively for different businesses and learners.



[2] The RQF (Regulated Qualifications Framework) system for England and Northern Ireland enables individuals to compare different qualifications according to their level (knowledge and skills), from Entry Level 1 through to Level 8. For example, A-levels are RQF Level 3 and undergraduate (bachelor’s) degrees are RQF Level 6. The RQF replaced the National Qualifications Framework (NQF) and the Qualifications and Credit Framework (QCF) in 2015.

Thursday, 29 October 2020

Stark contrast between pre- and post-16 provision in UTCs

Prime Minister Boris Johnson has put the government’s skills policy agenda in the spotlight. 

In a recent speech denouncing skills shortages in several technical occupations, the Prime Minister vowed to ‘end the pointless, nonsensical gulf … between the so-called academic and the so-called practical varieties of education’.

University Technical Colleges (UTCs) are state-funded 14-19 schools established by the government over the past ten years with a very similar intent. In our new research article, we seek to understand whether UTCs have been successful at bridging the gap between academic and technical education in England.

UTCs were conceived as a response to a growing technical skills gap and to the perception that young people lacked adequate and engaging technical skills provision at school. The cornerstone of UTCs is a teaching curriculum that blends core academic subjects with technical subjects that meet regional skill shortages (such as engineering, manufacturing or digital technology). UTCs also benefit from the direct engagement of local universities and employers: industry experts help design and deliver project-based learning that seeks to develop the skills and attributes valued in the workplace.

There are currently 48 UTCs open in England, according to Department for Education figures for September 2020. The first UTC opened in 2010; since then, 58 more have opened across England. Despite this expansion, the UTC model has been dogged by controversy over the years. As new schools with no established record and little publicity, UTCs have struggled to recruit enough students: recruitment at age 14 proved particularly challenging because English students do not typically change school at that age. Low operating capacity has dented the financial viability of a number of UTCs, resulting in 11 closing or changing status. UTCs have also been criticised for their poor performance in national examinations (see e.g. Dominguez-Reig and Robinson, 2018 and Thorley, 2017). Part of this poor performance, however, might be due to the fact that UTC students look very different from typical students. In our study of the effectiveness of UTCs, we devise an empirical strategy to take this into account, which allows us to evaluate the causal effect of enrolling in a UTC on student outcomes.

Our research focuses on 30 UTCs that opened between 2010 and 2014. Using education registry data linked to tax records (LEO data), we were able to follow cohorts of students who enrolled in a UTC either in Year 10 (age 14) or Year 12 (age 16). For Year 10 entrants, we measure academic performance two years later in the high-stakes end-of-secondary-school exams (GCSEs). For Year 12 entrants, we investigate post-16 course choices, achievement at Level 3 (e.g. A-Levels or BTECs) and whether students start an apprenticeship. We are also able to look at students’ post-18 transition into higher education or into the labour market. Our research strategy leverages variation in UTCs’ geographical availability across student cohorts, allowing us to compare UTC students with arguably very similar non-UTC students.

Our findings reveal a stark contrast between pre- and post-16 UTC provision. Consider Year 10 entrants first: we find that enrolling in a UTC has a detrimental effect on GCSE achievement. Students who attend UTCs are 26 percentage points less likely to get five or more GCSEs with good grades than similar non-UTC students, a large negative effect equivalent to twice the achievement gap between disadvantaged and non-disadvantaged students. UTCs also significantly reduce students’ chances of achieving grade C (now grade 4) in English and maths. These results are concerning: research at CVER warns about the negative consequences of missing out on grade C in English, which limits student progression over the longer term (Machin et al., 2020).

We find no such detrimental effect on education outcomes for Year 12 entrants: UTC enrolment does not affect students’ probability of achieving at least one A-Level and makes them much more likely to enter and achieve a technical qualification at Level 3. Perhaps unsurprisingly, we find a strong effect on the probability of entering STEM qualifications (higher by 25 percentage points). Impressively, UTCs increase students’ probability of starting an apprenticeship by 14 percentage points. This is a potentially very positive outcome: evidence from CVER points to substantial payoffs to apprenticeships for young people (Cavaglia et al., 2020). Finally, we document positive effects of UTC enrolment up to one year after leaving school. While UTCs do not appear to be better (or worse) at sending students to university, they are very good at propelling students into STEM degrees. They also ease the transition into the labour market: as a result of UTC enrolment in Year 12, students are 3 percentage points less likely to be NEET (not in education, employment or training) one year after leaving school.

What can explain these dramatically different results? One concern is that combining the academically demanding GCSE curriculum with additional technical subjects, at a time when students are also adapting to a new school, may prove too challenging, especially as students moving school in Year 10 are doing so at a non-standard transition point. Furthermore, Year 10 recruits are less academically able than Year 12 recruits (in terms of maths and English test scores), and we find that UTCs are better at teaching more academically able students. As more UTCs move to recruit students at a natural transition point (i.e. at age 11 as well as age 16), their performance may improve to the extent that they become better able to attract a higher-attaining group of applicants and have more time to teach the broader curriculum before exams at age 16. More generally, we need to bear in mind that UTCs are brand new schools and shouldn’t be judged too hastily. We find some evidence that UTCs improve over time. While the jury is still out on the longer-term effects of this policy, our study gives grounds for hope.

This article was originally published on the TES website on 14 October 2020: https://www.tes.com/news/why-do-over-16s-utcs-perform-under-15s-dont.

Guglielmo Ventura is a research assistant at the Centre for Vocational Education Research at the London School of Economics and Political Science.

Related TES coverage from Kate Parker: https://www.tes.com/news/revealed-utc-attainment-gap-pre-and-post-16s


Thursday, 25 June 2020

What you study vs where you study: how FE choices affect earnings and academic achievement


By Esteban Aucejo, Claudia Hupkau and Jenifer Ruiz-Valenzuela


Following the unprecedented number of job losses and the bleak economic outlook due to the Covid-19 crisis, more people will be considering staying on in, or returning to, education. Vocational education and training (VET) is likely to play a crucial role in providing the skills needed for economic recovery, including retraining workers who have been made redundant. In this context, it is crucial to have good information on the returns to the different fields of study that can be taken at FE colleges, and on whether the institution one attends matters for earnings and employment prospects. Our new research published by the Centre for Vocational Education Research (CVER) finds that when it comes to VET, what you study is very important for future earnings, whereas where you study also matters for younger people but less so for adults.

We used data from more than one million students over 13 years to investigate how much value attending an FE college adds in terms of academic achievement, earnings and employment, taking into account learners’ prior achievements and their socio-economic background. Our study considers both young learners, who mostly join FE colleges shortly after compulsory education, as well as adult learners, who have often worked for many years before attending FE college.

Our value-added measure indicates that moving a student from a college ranked in the bottom 15% of the college value-added distribution to one ranked in the top 15% implies fairly modest earnings gains of around 3% on average, measured around seven years after leaving FE college. The difference in earnings for adult learners is smaller, at 1.5%. The fact that college quality seems to matter more for young learners is likely because young learners spend more time in FE colleges (i.e. they enrol in and complete substantially more learning than adults). The results for the likelihood of being employed show even smaller differences across FE colleges.

There is considerably more variation in FE colleges' contributions to the educational attainment of their young learners. On average, the young people in our sample enrol in just under 600 learning hours, but only achieve about 413 (or 69% of them), around 42% achieve a Level 3 qualification, and 38% progress to higher education.

But were we to move a learner from a college ranked in the bottom 15% by value-added to one ranked in the top 15%, the share of their enrolled learning hours that they achieve would rise from 69% to 73.4%, a 6.5% relative increase. They would be almost 11% more likely to achieve a Level 3 qualification (from 42% to 46.5%) and 10% more likely to attend higher education (from 38% to 42%). These are large effects. As young people are likely to attend their nearest college, the variability in value-added between institutions is a source of unequal opportunity between geographic areas.
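The percentage changes quoted above are relative increases over the baseline shares rather than percentage-point changes; the short calculation below, using only the figures given in the text, makes that distinction explicit.

```python
# Baseline outcomes for young learners and the corresponding outcomes after moving
# from a bottom-15% to a top-15% value-added college (figures from the text).
outcomes = {
    "share of enrolled learning hours achieved": (0.690, 0.734),
    "achieving a Level 3 qualification":         (0.420, 0.465),
    "progressing to higher education":           (0.380, 0.420),
}

for name, (baseline, improved) in outcomes.items():
    pp_change = (improved - baseline) * 100              # percentage points
    relative_change = (improved / baseline - 1) * 100    # relative (%) increase
    # (small differences from the rounded figures quoted in the text reflect rounding)
    print(f"{name}: +{pp_change:.1f} pp, i.e. a {relative_change:.1f}% relative increase")
```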

What differentiates high value-added colleges from low value-added ones? Learning characteristics seem to play an important role. Colleges that offer a larger share of their courses in the classroom (as opposed to in the workplace or at a distance) have higher value-added in earnings for young learners. This is particularly relevant in light of the current crisis, where online and distance learning is expected to remain a regular feature, at least in the medium term. We also find significant correlations between the curriculum offer and value-added measures, with colleges offering more exam-assessed qualifications (as opposed to competency-based) showing higher value-added.  

While where you study does not imply large differences in earnings after college, what you study has a much bigger effect, especially for female and young learners. We carried out a separate analysis looking at students’ earnings before and after attending FE college. In this analysis, the young people were aged 18-20 and so had been working for up to two years prior to study. Table 1 below shows the three most popular fields (in terms of learners doing most of their guided learning hours in that particular field, i.e. specialising in it) by gender and age group.

The two fields of engineering and manufacturing technology, and business administration and law, show large levels of enrolment among males and lead to large positive returns. For instance, the typical young male learner who chooses engineering and manufacturing technology as his main field of study will earn, on average, almost 7% more five years after finishing education compared with his earnings before attending FE college, after adjusting for inflation. For adult male learners specialising in this field, earnings rise by 1.5% five years after leaving college. In contrast, young male learners specialising in retail and commercial enterprise do not see any increase in earnings five years after attending FE college. These results take into account that earnings increase with experience, irrespective of which field one specialised in. Business administration and law, and health, public services and care, are the two fields that show high levels of enrolment and consistent positive returns for women across age groups.

While we find consistently higher returns to fields of study for women than for men, this does not mean that women have higher earnings overall after attending FE college. It means that, compared with their earnings before enrolment, they experience steeper increases in earnings after completing their education at FE colleges. We also find that many specialisations show negative returns immediately after leaving college that turn positive five years after graduation, indicating that it takes time for positive returns to be reflected in wages. The fact that timing matters suggests that policymakers should be extremely cautious about evaluating colleges in terms of the labour market performance of their students.

Our findings also have practical implications for students, since they could help them better understand the variation in FE college quality and compare the returns to different fields of study. This information is likely to be particularly important given the evidence suggesting that students tend to be misinformed about the labour market returns to VET qualifications.

Table 1. Top 3 fields of study by proportion of learners who specialise in them

Field | Mean GLH in main field | Estimated return 1 year post-FE | Estimated return 5 years post-FE | Share specialising in the field

Young male learners
Engineering and Manufacturing Technology | 632 | 0.04 | 0.068 | 20.60%
Construction, Planning & Built Environment | 621 | -0.001 | 0.023 | 16.60%
Arts, Media and Publishing | 942 | -0.064 | -0.003 | 10.70%

Adult male learners
Health, Public Services and Care | 77 | -0.006 | 0.004 | 19.00%
Engineering and Manufacturing Technology | 206 | -0.008 | 0.015 | 18.90%
Business Administration and Law | 131 | 0.003 | 0.009 | 14.20%

Young female learners
Health, Public Services and Care | 525 | -0.002 | 0.045 | 25.20%
Retail and Commercial Enterprise | 597 | 0.036 | 0.115 | 23.40%
Business Administration and Law | 430 | 0.040 | 0.118 | 13.60%

Adult female learners
Health, Public Services and Care | 142 | -0.008 | 0.020 | 34.30%
Business Administration and Law | 189 | 0.004 | 0.019 | 14.80%
Education and Training | 143 | -0.007 | 0.027 | 12.70%

Note. The estimated returns reported are the marginal effects, one and five years after leaving college, respectively, of choosing the field as the main field. This is a summary table; the complete tables can be found in Tables 9 to 12 of CVER DP 030.

This blog post appeared first on TES and is republished here with permission.

Friday, 13 March 2020

Training grants: a useful policy to address skills and productivity gaps?

As work changes, firm-provided training may become more relevant for good economic and social outcomes. However, so far there is little or no causal evidence about the effects of training on firms. Pedro Martins looks at the effects of a training grants programme in Portuguese firms.


Like most academics, I am fortunate to be able to update my own skills on a regular basis. For instance, when I attend a research seminar, I learn from colleagues who are pushing the knowledge frontier in their specific fields. Some of their insights will sooner or later also feature in my own teaching and research, thus increasing my performance and that of my institution.

However, workers from other sectors typically have far fewer opportunities to increase their skills on a regular basis. Recent research by the European Investment Bank indicates that, on average, workers in Europe spend less than 0.5% of their working time on training activities. In the current context of major changes in labour markets – including artificial intelligence and automation and perhaps even coronavirus – this training figure seems too low.

Economics has long predicted some degree of under-provision of training in labour markets. First, training is expensive for firms, as it typically entails significant direct and indirect costs. Second, employers know they will lose their investment in training if employees subsequently leave – and it is even worse if workers are poached by competitors. Moreover, even leaving these issues aside, firms may struggle to estimate the effects of training on their performance (sales, profits, etc.), which will again deter them from upskilling their workers.

The context above points to an important market failure in training. It may also explain in part the disappointing economic performance of many European countries in recent years. While labour markets have become more efficient, incentives for on-the-job training may paradoxically have declined, as workers move more easily to other firms. However, public policy can play a role in alleviating the under-provision of training. Specifically, governments can subsidise training in the workplace in order to bring its private net benefit more in line with its social value.

The new working paper featured in this blog and recently presented at a CVER seminar (‘Employee training and firm performance: Quasi-experimental evidence from the European Social Fund’) contributes empirical evidence on this question. The research evaluates the effects of a €200-million training grants scheme supported by the European Union on several dimensions of recipient firms’ performance.

The study draws on a difference-in-differences counterfactual evaluation methodology, comparing the outcomes of about 3,500 firms that applied for and received a training grant (of about €30,000) with those of around 6,000 firms that also applied but had their application rejected. Using rich micro data from Portugal, the country where the scheme was introduced, firms can be compared over several years both before and after their participation in the training grants scheme.
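For readers interested in the mechanics, the sketch below estimates a standard two-way fixed effects difference-in-differences specification of the kind this design implies, on a small simulated firm-year panel. The data, variable names and effect size are invented for illustration; the paper's actual specification and data may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate an illustrative firm-year panel: half the firms receive the grant,
# and their log sales rise by 0.05 after the training year (a made-up effect).
rng = np.random.default_rng(0)
rows = []
for firm in range(200):
    granted = int(firm < 100)
    for year in range(2004, 2014):
        post = int(year >= 2009)
        log_sales = (10 + 0.1 * granted + 0.02 * (year - 2004)
                     + 0.05 * granted * post + rng.normal(0, 0.1))
        rows.append((firm, year, granted, post, log_sales))
df = pd.DataFrame(rows, columns=["firm", "year", "granted", "post", "log_sales"])

# Two-way fixed effects difference-in-differences: firm and year fixed effects,
# with the granted-x-post interaction giving the average effect on log sales.
model = smf.ols("log_sales ~ granted:post + C(firm) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["firm"]}
)
print(round(model.params["granted:post"], 3))   # should recover roughly 0.05
```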

The results indicate that the scheme had significant positive effects on training take-up, both in terms of training hours and expenditure. For instance, training increased by about 50 hours per worker per year in the firms that received the grant, compared with firms that had their applications rejected. Deadweight – funding training that would have been carried out even without the grant – appears to be very limited, in contrast to the findings of an earlier study of a programme in the UK.

Moreover, the additional training conducted by firms led to improvements in a number of important outcomes that the study can trace, including sales, value added, employment, productivity and exports. These effects tend to be of at least 5% and, in some cases, 10% or more.

For instance, the figure below presents the average difference in total sales between firms that received the grant and those that did not. (Periods -9 to -1 refer to the years before the grant was awarded; period 0, the comparison year, is when the firm applied for the grant; period 1 is when the firm conducted the training; and periods 2 to 10 refer to the years after the training was conducted.) The results indicate that total sales are 5% higher in grant-receiving firms in year 2 and 10% higher in year 5. Importantly, there were no differences between firms before the grant was awarded, which is reassuring as to the counterfactual nature of the study.

The employment results are also interesting, as they come from both fewer separations and increased hiring. Firms that increase their training activities in the context of the grant appear to want to expand their workforce but also to retain the workers they already employ. Moreover, the employment effects are stronger when the scheme ran during recessions, suggesting that training grants can also act as an active/passive labour market policy, with a positive ‘lock-in’ impact.

In conclusion, there is a case to be made for workplaces to become a little more similar to universities. On-the-job learning can make firms (much) more productive – but that may require a bigger role for governments. Training grants may be a promising tool in this regard.


Wednesday, 11 December 2019

Can the manifesto pledges plug the skills gap?

The parties are offering plenty of promises on improving technical and vocational skills, but, says Sandra McNally (CVER Director), there are significant gaps in their thinking.


Improving technical and vocational skills is a key aspect of improving productivity and social mobility in Britain. The relatively high number of people with poor basic skills and low number of people with high-level vocational skills are long-standing national challenges and have been highlighted in reports by the OECD, the government and academics. In light of this, key priorities of the incoming government should be:

  • To raise attainment and improve educational trajectories for “the forgotten third” who, year after year, do not get good GCSEs, many of whom never achieve a good upper secondary education.
  • To address the shortage of higher-level technical education (at Levels 4 and 5) that was highlighted by the Augar review. 
  • To increase opportunities for adults to upskill or reskill later in life.

But none of the manifestos acknowledges any problem with a “forgotten third” of young people. The source of the problem is partly structural and partly a question of resources: between 2010–11 and 2018–19, spending per student fell by 12 per cent in real terms in 16–18 colleges and by 23 per cent in school sixth forms. The Conservative manifesto makes no promise to increase baseline funding beyond existing commitments. Both the Labour Party and the Liberal Democrats make large spending commitments to FE in general, with the Labour Party making specific mention of aligning the base rate of per pupil funding in post-16 provision with Key Stage 4. The Conservative manifesto does make a significant commitment to increase capital expenditure in Further Education colleges. While this tackles one of the issues raised in the Augar review, investment in buildings will not improve student outcomes if there isn’t also investment in college teachers (who are paid considerably less than teachers in schools).

The manifestos do not acknowledge that there is a particular problem with the lack of high-level vocational education in England vis-à-vis higher education. In England, only 4 per cent of 25-year-olds hold a Level 4 or 5 qualification as their highest level, compared with nearly 30 per cent for both Level 3 and Level 6. In Germany, by contrast, Levels 4 and 5 make up 20 per cent of all higher education enrolments.

The main Conservative pledge relevant to this is the establishment of 20 Institutes of Technology with a focus on STEM skills. The Liberal Democrats also promise some institutional reform, with the establishment of national centres of expertise for key sectors, such as renewable energy, to develop high-level vocational skills. They go further in explicitly acknowledging a “skills gap” and committing to address it by expanding higher vocational training, without stating how they would go about this.

Labour promises a free lifelong learning entitlement for everyone, including training up to Level 3 and six years of training at Levels 4-6. To the extent that this removes some of the distortions in the financing of the post-18 education system (as well documented in the Augar review), it would help to address the lack of higher-level vocational education. But it would be an expensive way of doing so, with taxpayers (most of whom do not have Level 4-6 education) having to pay the full cost. Moreover, people educated up to Levels 4-6 enjoy a high private return on this investment compared with people with a lower level of education.

The party manifestos all have something to say about apprenticeships, and all acknowledge problems with how the apprenticeship levy is working. The Conservative manifesto states that they would look at how the Levy is working and see how it can be improved. Both Labour and the Liberal Democrats are far more explicit: they commit to expanding the use of the levy to other forms of training. While this seems like a sensible idea, only two per cent of employers actually pay the levy. All political parties could do with a few more ideas on how to incentivise the other 98 per cent of employers to invest in the training of their staff; the apprenticeship levy is not sufficient for this. The Conservative manifesto has ideas about how to expand R&D credits, and it is a pity this thinking does not extend to human capital.

With regard to lifelong learning, Labour and the Liberal Democrats make universal commitments, whereas the Conservatives’ commitment is targeted at specific groups through a National Skills Fund (on which there is little detail). Labour’s commitment is the free lifelong learning entitlement discussed above, whereas the key Liberal Democrat commitment is the introduction of “Skills Wallets” worth £10,000 for every adult to spend on approved education and skills courses, with the first £4,000 at age 25, £3,000 at age 40 and £3,000 at age 55. This idea has similarities to the “individual learning accounts” that were introduced in 2000 but abandoned a year later because of fraud.

Although the idea of investment throughout life is sensible (and does need to be facilitated), it is important to ensure that similar mistakes are not repeated. A more substantive issue is where employer investment fits in this framework. As employers are major beneficiaries of adult training, there needs to be a mechanism for co-investment. This may also help to ensure that the training undertaken meets the needs of the labour market.

Any incoming government needs to be held to account on the extent to which its promises actually address national priorities and on whether we see an improvement. The extent to which this is possible depends on the success of its overall economic strategy as well as on the success of specific measures relating to education and skills.

This article was originally published by King's College London's Policy Institute:  https://www.kcl.ac.uk/news/can-the-manifesto-pledges-plug-the-skills-gap