Thursday 29 October 2020

STARK CONTRAST BETWEEN PRE- AND POST-16 PROVISION IN UTCs

Prime Minister Boris Johnson has put the government’s skills policy agenda in the spotlight. 

In a recent speech highlighting skills shortages in several technical occupations, the Prime Minister vowed to ‘end the pointless, nonsensical gulf … between the so-called academic and the so-called practical varieties of education’.

University Technical Colleges (UTCs) are new state-funded 14-19 schools established by the government in the past ten years with a very similar intent. In our new research article we seek to understand whether UTCs were successful at bridging the gap between academic and technical education in England.

UTCs were conceived as a response to a growing technical skills gap and to the perception that young people lacked adequate and engaging technical skills provision at school. The cornerstone of UTCs is a teaching curriculum that blends core academic subjects with technical subjects that meet regional skill shortages (such as engineering, manufacturing or digital technology). UTCs also benefit from the direct engagement of local universities and employers: industry experts help design and deliver project-based learning that seeks to develop the skills and attributes valued in the workplace.

There are currently 48 UTCs open in England, according to Department for Education figures for September 2020. The first UTC opened in 2010; since then, 58 more have opened across England. Despite this expansion, over the years the UTC model has been dogged by controversy. As new schools with no established record and a lack of publicity, UTCs struggled to recruit enough students: recruitment at age 14 proved particularly challenging, as English students do not typically change school at that age. Low operating capacity has dented the financial viability of a number of UTCs, resulting in 11 closing or changing status. UTCs have also been criticised for their poor performance in national examinations (see e.g. Dominguez-Reig and Robinson, 2018 and Thorley, 2017). Part of this poor performance, however, might be due to the fact that UTC students look very different from typical students. In our study of the effectiveness of UTCs we devise an empirical strategy to take this into account, and are able to evaluate the causal effect of enrolling in a UTC on student outcomes.

Our research focuses on 30 UTCs that opened between 2010 and 2014. Using education registry data linked to tax records (LEO data), we were able to follow cohorts of students who enrolled in a UTC either in Year 10 (age 14) or Year 12 (age 16). For Year 10 entrants, we measure academic performance two years later in the high-stakes end-of-secondary-school exams (GCSEs). For Year 12 entrants, we investigate post-16 course choices, achievement at Level 3 (e.g. A-Levels or BTECs) and whether students start an apprenticeship. We are also able to look at students’ post-18 transition into Higher Education or into the labour market. Our research strategy leverages UTCs’ geographical availability across student cohorts, allowing us to compare UTC students with arguably very similar non-UTC students.
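As a stylised illustration of this strategy (the notation here is ours and is only a sketch; the exact specification is set out in the research article), one can think of a comparison across areas and cohorts of the form

y_{iat} = \alpha_a + \lambda_t + \beta \cdot \text{UTCavailable}_{at} + X_i'\gamma + \varepsilon_{iat}

where y_{iat} is an outcome of student i from area a in cohort t, \alpha_a and \lambda_t are area and cohort effects, X_i contains student characteristics, and \text{UTCavailable}_{at} indicates whether a UTC was within reach of students from area a when cohort t made its enrolment choices. Under this kind of design, earlier cohorts from the same areas, who faced no UTC option, provide the comparison group.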

Our findings reveal a stark contrast between pre- and post-16 UTC provision. Consider first Year 10 enrolment: we find that enrolling in a UTC has a detrimental effect on GCSE achievement. Students who attend UTCs are 26 percentage points less likely to get 5 or more GCSEs with good grades than similar non-UTC students, a large negative effect equivalent to twice the achievement gap between disadvantaged and non-disadvantaged students. UTCs also significantly reduce students’ chances of achieving grade C (now grade 4) in English and maths. These results are concerning: research at CVER warns that missing out on grade C in English limits student progression over the longer term (Machin et al. 2020).

We find no such detrimental effect on education outcomes for Year 12 entrants: UTC enrolment does not affect students’ probability of achieving at least one A-Level and makes them much more likely to enter and achieve a technical qualification at Level 3. Perhaps unsurprisingly, we find a strong effect on the probability of entering STEM qualifications (higher by 25 percentage points). Impressively, UTCs increase students’ probability of starting an apprenticeship by 14 percentage points. This is a potentially very positive outcome: evidence from CVER points to substantial apprenticeship payoffs for young people (Cavaglia et al. 2020). Finally, we document positive effects of UTC enrolment up to one year after leaving school. While UTCs do not appear to be better (or worse) at sending students to university, they are very good at propelling students into STEM degrees. They also ease the transition into the labour market: as a result of UTC enrolment in Year 12, students are 3 percentage points less likely to be NEET (not in education, employment or training) one year after leaving school.

What can explain these dramatically different results? One concern is that combining the academically demanding GCSE curriculum with additional technical subjects at a time when students are also adapting to a new school may prove too challenging, especially since students moving school in Year 10 are doing so at a non-standard transition time. Furthermore, Year 10 recruits are less academically able than Year 12 recruits (in terms of maths and English test scores), and we find that UTCs are better at teaching more academically able students. As more UTCs move to recruit students at a natural transition point (i.e. at age 11 as well as age 16), their performance might improve to the extent that they become better able to attract higher-attaining applicants and have longer to teach the broader curriculum before exams at age 16. More generally, we need to bear in mind that UTCs are brand new schools and shouldn’t be judged too hastily. We find some evidence that UTCs improve with time. While the jury is still out on the longer-term effect of this policy, our study gives grounds for hope.

This article was originally published on the TES website on 14 October 2020: https://www.tes.com/news/why-do-over-16s-utcs-perform-under-15s-dont.

Guglielmo Ventura is a research assistant at the Centre for Vocational Education Research at the London School of Economics and Political Science.

Related TES coverage from Kate Parker: https://www.tes.com/news/revealed-utc-attainment-gap-pre-and-post-16s


Thursday 25 June 2020

What you study vs where you study: how FE choices affect earnings and academic achievement


By Esteban Aucejo, Claudia Hupkau and Jenifer Ruiz-Valenzuela


Following the unprecedented number of job losses and the bleak economic outlook due to the Covid-19 crisis, more people will be considering staying on in, or returning to, education. Vocational education and training (VET) is likely to play a crucial role in providing the skills needed for economic recovery, including retraining workers who have been made redundant. In this context, it is important to have good information on the returns to the different fields of study that can be taken at further education (FE) colleges, and on whether the institution one attends matters for earnings and employment prospects. Our new research published by the Centre for Vocational Education Research (CVER) finds that when it comes to VET, what you study is very important for future earnings, whereas where you study also matters for younger people but less so for adults.

We used data from more than one million students over 13 years to investigate how much value attending an FE college adds in terms of academic achievement, earnings and employment, taking into account learners’ prior achievements and their socio-economic background. Our study considers both young learners, who mostly join FE colleges shortly after compulsory education, and adult learners, who have often worked for many years before attending FE college.
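As a stylised sketch of this approach (our notation, not necessarily the paper’s exact specification), a value-added model of this kind regresses a learner’s outcome on college indicators while controlling for prior attainment and background:

y_{ic} = \mu_c + X_i'\beta + \varepsilon_{ic}

where y_{ic} is an outcome (for example, earnings or qualification achievement) of learner i who attended college c, X_i contains prior attainment and socio-economic controls, and the estimated college effects \mu_c measure each college’s value-added. The comparisons below are between colleges in the bottom and top 15% of the distribution of these estimated effects.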

Our value-added measure indicates that moving a student from a college ranked in the bottom 15% of the college value-added distribution to one ranked in the top 15% implies fairly modest earnings gains of around 3% on average, measured around seven years after leaving FE college. The difference in earnings for adult learners is smaller, at 1.5%. The fact that college quality seems to matter more for young learners is likely due to young learners spending more time in FE colleges (i.e. they enrol in and complete substantially more learning than adults). The results in terms of the likelihood of being employed show even smaller differences across FE colleges.

There is considerably more variation in FE colleges’ contributions to the educational attainment of their young learners. On average, the young people in our sample enrol in just under 600 learning hours but achieve only about 413 of them (69%); around 42% achieve a Level 3 qualification, and 38% progress to higher education.

But were we to move a learner from a college ranked in the bottom 15% by value-added to one ranked in the top 15%, they would, on average, achieve 6.5% more of their learning hours (raising the achievement rate from 69% to 73.4%). They would be almost 11% more likely to achieve a Level 3 qualification (from 42% to 46.5%), and their likelihood of attending higher education would increase by 10% (from 38% to 42%). These are large effects. As young people are likely to attend their nearest college, this variability in value-added between institutions is a source of unequal opportunity between geographic areas.

What differentiates high value-added colleges from low value-added ones? The characteristics of the learning on offer seem to play an important role. Colleges that deliver a larger share of their courses in the classroom (as opposed to in the workplace or at a distance) have higher value-added in earnings for young learners. This is particularly relevant in light of the current crisis, in which online and distance learning is expected to remain a regular feature, at least in the medium term. We also find significant correlations between the curriculum offer and value-added measures, with colleges offering more exam-assessed qualifications (as opposed to competency-based ones) showing higher value-added.

While where you study does not imply large differences in earnings after college, what you study has a much bigger effect, especially for female and young learners. We carried out a separate analysis looking at students’ earnings before and after attending FE college. In this analysis the young people were aged 18-20 and so had been working for up to two years prior to study. Table 1 below shows the three most popular fields (in terms of learners doing most of their guided learning hours in that particular field, i.e. specialising in it) by gender and age group.

The two fields of engineering and manufacturing technology, and business administration and law, show high levels of enrolment among males and lead to large positive returns. For instance, the typical young male learner who chooses engineering and manufacturing technology as his main field of study will earn, on average, almost 7% more five years after finishing education than he did before attending FE college, after adjusting for inflation. For adult male learners specialising in this field, earnings rise by 1.5% five years after leaving college. In contrast, young male learners specialising in retail and commercial enterprise do not see any increase in earnings five years after attending FE college. These results take into account that earnings increase with experience, irrespective of which field one specialised in. Business administration and law, and health, public services and care, are the two fields that show high levels of enrolment and consistently positive returns for women across age groups.

While we find consistently higher returns to fields of study for women than for men, this does not mean that women have higher overall earnings after attending FE college. It means that, compared to before enrolment, they experience steeper increases in earnings after completing their education at FE colleges. We also find that many specialisations present negative returns immediately after leaving college that turn positive five years after graduation, indicating that it takes time for positive returns to be reflected in wages. The fact that timing matters suggests that policy makers should be extremely cautious about evaluating colleges on the early labour market performance of their students.

Our findings also have relevant practical implications for students, since they can help them to get a better understanding of the variation in FE college quality and to compare the returns to different fields of study. This information is likely to be particularly important given the evidence suggesting that students tend to be misinformed about the labour market returns to VET qualifications.

Table 1. Top 3 fields of study by proportion of learners who specialise in them

Main field                                     Mean GLH in    Estimated return    Estimated return    Share specialising
                                               main field     1 year post-FE      5 years post-FE     in the field

Young male learners
  Engineering and Manufacturing Technology        632             0.040               0.068              20.6%
  Construction, Planning & Built Environment      621            -0.001               0.023              16.6%
  Arts, Media and Publishing                      942            -0.064              -0.003              10.7%

Adult male learners
  Health, Public Services and Care                 77            -0.006               0.004              19.0%
  Engineering and Manufacturing Technology        206            -0.008               0.015              18.9%
  Business Administration and Law                 131             0.003               0.009              14.2%

Young female learners
  Health, Public Services and Care                525            -0.002               0.045              25.2%
  Retail and Commercial Enterprise                597             0.036               0.115              23.4%
  Business Administration and Law                 430             0.040               0.118              13.6%

Adult female learners
  Health, Public Services and Care                142            -0.008               0.020              34.3%
  Business Administration and Law                 189             0.004               0.019              14.8%
  Education and Training                          143            -0.007               0.027              12.7%
Note: The estimated returns are the marginal effects of choosing the field as the main field of study, measured one and five years after leaving college respectively. This is a summary table; the complete tables can be found in Tables 9 to 12 of CVER DP 030.

This blog post appeared first on TES and is republished here with permission.

Friday 13 March 2020

Training grants: a useful policy to address skills and productivity gaps?

As work changes, firm-provided training may become more relevant for good economic and social outcomes. However, so far there is little or no causal evidence about the effects of training on firms. Pedro Martins looks at the effects of a training grants programme in Portuguese firms.


Like most academics, I am fortunate to be able to update my own skills on a regular basis. For instance, when I attend a research seminar, I learn from colleagues who are pushing the knowledge frontier in their specific fields. Some of their insights will sooner or later also feature in my own teaching and research, thus increasing my performance and that of my institution.

However, workers from other sectors typically have far fewer opportunities to increase their skills on a regular basis. Recent research by the European Investment Bank indicates that, on average, workers in Europe spend less than 0.5% of their working time on training activities. In the current context of major changes in labour markets – including artificial intelligence and automation and perhaps even coronavirus – this training figure seems too low.

Economics has long predicted some degree of under-provision of training in labour markets. First, training is expensive for firms, as it typically entails significant direct and indirect costs. Second, employers know they will lose their investments in training if employees subsequently leave. It is even worse if workers are poached by competitors. Moreover, even leaving aside the issues above, firms may struggle to estimate the effects of training on their performance (sales, profits, etc.), which again deters them from upskilling their workers.

The context above points to an important market failure in training. It may also explain in part the disappointing economic performance of many European countries in recent years. While labour markets have become more efficient, incentives for on-the-job training may paradoxically have declined, as workers move more easily to other firms. However, public policy may play a role in alleviating the under-provision of training. Specifically, governments can subsidise training in the workplace in order to bring its private net benefit more into line with its social value.

The new working paper featured in this blog and recently presented at a CVER seminar (‘Employee training and firm performance: Quasi-experimental evidence from the European Social Fund’) contributes empirical evidence to this question. The research evaluates the effects of a €200-million training grants scheme supported by the European Union on different dimensions of recipient firms.

The study draws on the difference-in-differences counterfactual evaluation methodology, comparing the outcomes of about 3,500 firms that applied for and received a training grant (of about €30,000) with those of around 6,000 firms that also applied but had their application rejected. Rich micro data from Portugal, the country where the scheme was introduced, allow firms to be compared over several years both before and after their participation in the training grants scheme.
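In stylised two-way fixed-effects form (our notation, not the paper’s exact equation), the difference-in-differences comparison can be written as

y_{ft} = \alpha_f + \lambda_t + \beta \cdot (\text{Grant}_f \times \text{Post}_t) + \varepsilon_{ft}

where y_{ft} is an outcome (say, log sales) of firm f in year t, \alpha_f and \lambda_t are firm and year fixed effects, \text{Grant}_f indicates firms whose application was approved, \text{Post}_t marks the years after the grant, and \beta captures the average effect of the grant on approved firms relative to rejected applicants.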

The results indicate that the scheme had significant positive effects on training take-up, in terms of both training hours and expenditure. For instance, training increased by about 50 hours per worker per year in the firms that received the grant, compared to firms that had their applications rejected. Deadweight – funding training that would have been carried out even without the grant – appears to be very limited, in contrast to the findings of an earlier study of a programme in the UK.

Moreover, the additional training conducted by firms led to improvements in a number of important outcomes that the study can trace, including sales, value added, employment, productivity and exports. These effects tend to be at least 5% and, in some cases, 10% or more.

For instance, the figure below presents the average difference in total sales between firms that received the grant and those that did not. (Periods -9 to -1 refer to the years before the grant was awarded; period 0, the comparison year, is when the firm applied for the grant; period 1 is when the firm conducted the training; and periods 2 to 10 refer to the years after the training was conducted.) The results indicate that total sales are 5% higher in grant-receiving firms in year 2 and 10% higher in year 5. By contrast, there were no differences between firms before the grant was awarded, which is reassuring about the validity of the counterfactual comparison.
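The figure corresponds to a dynamic (event-study) version of the same comparison, which in stylised form (again, our notation rather than the paper’s) replaces the single interaction with a set of period-specific ones:

y_{ft} = \alpha_f + \lambda_t + \sum_{k \neq 0} \beta_k \cdot (\text{Grant}_f \times \mathbf{1}[t - t_f^0 = k]) + \varepsilon_{ft}

where t_f^0 is the year in which firm f applied for the grant (period 0 above), and each \beta_k traces out the sales gap k periods from application. The pre-period coefficients (k < 0) being close to zero corresponds to the absence of pre-grant differences noted above.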

The employment results are also interesting, as they come from both fewer separations and more hiring. Firms that increase their training activities in the context of the grant appear to want to expand their workforce but also to retain the workers they already employ. Moreover, the employment effects are stronger when the scheme ran in periods of recession, suggesting that training grants can also act as an active/passive labour market policy, with a positive ‘lock-in’ impact.

In conclusion, there is a case to be made for workplaces to become a little more like universities. On-the-job learning can make firms (much) more productive – but that may require a bigger role for governments. Training grants may be a promising tool in this regard.