We know how to evaluate job training programmes. It should be done in broadly the same way as evaluating the effectiveness of a drug: by comparing it to a control group that received placebos. It is a bit more complicated with people, because they are hard to "control", but the general approach is the same. We don't evaluate the impact of a new drug by guessing, and we shouldn't guess with training programmes either; they should be carefully evaluated before, during and after implementation.
In the United States, experimental estimates of the impact of youth training funded under the Job Training Partnership Act (JTPA) resulted in large cuts in the programme. The experiment found statistically significant negative impacts on the earnings of male youths 18 months after random assignment, and negligible impacts on the earnings of female youths. Subsequent work confirmed these estimates using non-experimental methods*. Congress cut funding immediately, because the programme acted like the kiss of death: participants were worse off on the programme than receiving no treatment at all.
The recent evidence on the Coalition's decisions to cancel the Future Jobs Fund (FJF) and introduce its Work Programme (WP) was just as devastating. It found, first, that the FJF worked; and second, that, just like the JTPA, the Coalition's vaunted WP appears to have negative rates of return and isn't working. The inference, of course, is that the country would be better off, and the deficit lower, if the WP were scrapped. I am aware of no pilots that were done to suggest that the WP was ever going to work – this is policymaking by guessing. It's the equivalent of introducing a drug that was not fully tested. It is the labour market equivalent of thalidomide. To the details.
The Department for Work and Pensions (DWP) published a damaging critique of Coalition policy entitled Impacts and Costs and Benefits of the Future Jobs Fund, which examined the impact of the FJF, introduced in October 2009. The programme was primarily aimed at 18- to 24-year-olds in receipt of jobseeker's allowance (JSA), with a smaller number of places available to JSA claimants aged over 24 in unemployment hotspots. Each job had to be at least 25 hours per week and had to pay at least the minimum wage. Between October 2009 and March 2011, just over 105,000 jobs were created under the FJF, at a total cost of £680m. The study examined a cohort of participants who started their FJF job between October 2009 and March 2010. The FJF programme worked: per participant, it is estimated to have produced a net benefit of £4,000 to participants, a net benefit of £6,850 to employers, a net cost of £3,100 to the Exchequer, and a net benefit of £7,750 to society.
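For readers who want to check the arithmetic, the DWP's per-participant figures are internally consistent: the benefits to participants and employers, net of the cost to the Exchequer, sum to the stated benefit to society. A quick illustrative sketch (the figures are those reported in the DWP study; nothing here is new data):

```python
# Per-participant figures from the DWP evaluation of the Future Jobs Fund
benefit_to_participants = 4_000   # £ net benefit to participants
benefit_to_employers = 6_850      # £ net benefit to employers
cost_to_exchequer = 3_100         # £ net cost, so it is subtracted

net_benefit_to_society = (benefit_to_participants
                          + benefit_to_employers
                          - cost_to_exchequer)
print(f"Net benefit to society: £{net_benefit_to_society:,}")
# Net benefit to society: £7,750
```

The sum matches the £7,750 figure quoted above, so the four numbers hang together.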
The Prime Minister had claimed the FJF was a waste of money and "has been one of the most ineffective job schemes there's been. The really damning evidence is that it's a six-month programme, but one month after the programme half the people that were on it are back on the dole. It failed".
I defer to my old friend Jonathan Portes of the National Institute of Economic and Social Research, who put it well: "The main point that emerges from this is that taking snap decisions without any evidence is bad for taxpayers, bad for the economy, and bad for society as a whole. The Prime Minister might have been right that the FJF would turn out to be a waste of money. But, until we had proper evaluation and analysis, we didn't know … Well, we know now; the Prime Minister was wrong ... But it's too late: the programme has already been cancelled, so instead of spending money on something we now know works for young people on the dole, for employers, and for society as a whole – we're spending it on other things. And we don't know (yet) if they work or not. That's a real waste." Ideology ruled.
But it got worse for the Government when the DWP then published its latest data on the impact of the WP. The JTPA comes to mind. The WP was launched in June 2011 and pays private and voluntary sector organisations according to their success in helping the long-term unemployed back into work. Of the 836,000 long-term unemployed people who joined the programme, only 31,000, or 3.7 per cent, had found work lasting 13 or more weeks (the measure of a job outcome) by the end of July 2012, against a target of 5.5 per cent.
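The headline rate is easy to reproduce from the two figures above. A short sketch, using only the numbers reported by the DWP:

```python
joined = 836_000       # long-term unemployed who joined the Work Programme
job_outcomes = 31_000  # found work lasting 13+ weeks by end of July 2012
target_pct = 5.5       # the DWP's performance target, in per cent

rate_pct = 100 * job_outcomes / joined
print(f"Job-outcome rate: {rate_pct:.1f}% against a target of {target_pct}%")
# Job-outcome rate: 3.7% against a target of 5.5%
```

That shortfall – 3.7 per cent achieved against 5.5 per cent targeted – is the gap the rest of the figures below turn on.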
None of the 18 WP contractors managed to hit 5.5 per cent, despite the Government having spent £435m so far. The figures the DWP released cover the first 14 months of the WP, and if you take only the figures for a full first year – the 12 months to which the key performance target was tied – the success rate drops back even further, to 2.3 per cent. People who didn't enter the programme did better, so it was less effective than doing nothing.
As the table above shows, the number of people claiming JSA for 12 months or more, which the WP was targeted at reducing, has increased by 211,000 since June 2011, while the number of long-term unemployed youngsters aged 18 to 24 is up nearly fivefold. Unemployment is also rising again; the claimant count increased by 10,000 in the latest data release and is up by 110,000 since the Coalition was formed. Not good signs. Unless the Government can demonstrate quickly by careful impact analysis that the Work Programme works, then it deserves to follow the US Job Training Partnership Act into the dustbin of history. And then the Coalition should bring back the Future Jobs Fund, which did work. Prove it.
*James J Heckman and Jeffrey Smith, 'The Sensitivity of Experimental Impact Estimates (Evidence from the National JTPA Study)', in David G Blanchflower and Richard B Freeman (eds), Youth Employment and Joblessness in Advanced Countries, NBER and University of Chicago Press, 2000