Anthony Painter examines the truth behind the numbers in the Educational Maintenance Allowance debate.
In a tweet last week, the director of Policy Exchange, Neil O’Brien, described Education Maintenance Allowances as ‘one of the least effective policies’ ever. In essence, he was referring to the so-called ‘deadweight loss’ of the policy. The basis for this charge is a survey conducted by the National Foundation for Educational Research on behalf of the Department for Education.
It found that 88 per cent of EMA recipients would have stayed on anyway without the income support. Government ministers and acolytes have leapt on the findings. Game, set and match to the EMA abolitionists?
Not so fast. The ‘deadweight’ argument is actually an exceedingly misleading one. Any major public policy will have a degree of ‘deadweight’, i.e. people who receive a benefit or service but don’t ‘need’ it. Take a simple example: millions of us have GP check-ups every single year.
However, only a tiny proportion of us have a serious illness uncovered in the consultation. By the logic of the Government and its supporters, the huge ‘deadweight’ of GP check-ups means their funding should be discontinued.
Imagine if Coca-Cola discovered that only 12 per cent of its advertising spend led to people buying its product. Would it then cut its advertising budget by 88 per cent in order to eliminate deadweight? Of course not: it would be impossible to target the reduced budget on precisely those people with a propensity to buy a can of Coke only if they saw an ad for it.
So the ‘deadweight’ argument is an utterly nonsensical one, albeit one draped in the language of common sense. If we accept it as a way of evaluating the effectiveness of a policy, then almost all public policy interventions fall apart: common education, national health, universal welfare, public transport, and so on. It is toxic and it is wrong.
There are two genuine questions when it comes to assessing the success of a policy: does it work and is there a cheaper way of securing the same outcomes? On both these counts EMA stacks up well.
The most useful report for assessing the success of EMA has been published by the IFS. It finds that, in areas piloting EMA, participation rates for recipients of the payment increased by up to 8.1 percentage points for females (at age 17) and 5.5 points for males (4.5 points at 17). The report does not state the baseline (pre-EMA) participation rate for EMA recipients, but a not unreasonable assumption is that in the pilot areas it was 40% (for comparison, the IFS report shows that participation in full-time education among 17-year-old females eligible for free school meals was 44%).
This would imply that EMA produced a relative increase in participation of roughly 20% for females and 14% for males.
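The arithmetic here is simple enough to check directly. A short sketch (the 40% baseline is the assumption stated above, not an IFS figure):

```python
# Relative participation increases implied by the IFS pilot figures,
# assuming a pre-EMA baseline participation rate of 40%
# (an assumption, not a number from the IFS report).
baseline = 40.0    # assumed pre-EMA participation rate, per cent

female_rise = 8.1  # percentage-point increase for females (IFS pilots)
male_rise = 5.5    # percentage-point increase for males (IFS pilots)

female_relative = female_rise / baseline * 100  # ~20% relative increase
male_relative = male_rise / baseline * 100      # ~14% relative increase

print(f"Females: {female_relative:.0f}% relative increase")
print(f"Males:   {male_relative:.1f}% relative increase")
```

A point worth noting: the result is sensitive to the assumed baseline, so the 20% and 14% figures should be read as rough magnitudes rather than precise estimates.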
A policy that increases participation amongst the groups most prone to chronic underachievement by somewhere in the range of 12% (according to the DfE survey) to 20% (on the IFS-derived estimate above) is a strong policy. Moreover, with cuts to benefits elsewhere (housing benefit, for example), the participation impact of EMA would increase if it remained in place: families are more likely to be comfortable with a 16-, 17- or 18-year-old (or two!) staying in full-time education when the EMA is there to support them.
Its impact on educational outcomes is similarly significant. The 157 Group of Colleges has published research based on the experience of its constituent members (mainly large, inner-city colleges). For example, students at Lambeth College who receive the EMA are 13% more likely to pass their courses than those who do not.
When you bear in mind that these students are more deprived than non-recipients, this outcome is remarkable. Other colleges report similar impacts, and this is supported by the IFS research, which, for example, shows a 6.2% increased likelihood of black females in EMA pilot areas achieving a full level 3 qualification (equivalent to two A-levels).
Extrapolating from the participation rates calculated above and the total number receiving EMA, somewhere between 72,000 and 120,000 students would not be in education if EMA did not exist. What would these students be doing if they were not in school or college?
Each one who ends up as a NEET (not in education, employment or training; since we already know they would not be in education or training, this means each one who fails to find a job) will cost the public purse £56,300 over their lifetime, according to York University research conducted on behalf of the Audit Commission. If just 18,000 or so end up workless, then EMA pays for itself.
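These figures can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below assumes roughly 600,000 EMA recipients, the figure implied by applying the 12–20% range to reach 72,000–120,000 (the total is not stated in the text), and treats the £500 million as one year’s spending:

```python
# Back-of-the-envelope check of the EMA payback argument.
# Assumed figure (implied, not stated in the text): ~600,000 EMA recipients.
recipients = 600_000

low_effect, high_effect = 0.12, 0.20  # DfE survey vs IFS-derived estimate
extra_in_education = (recipients * low_effect, recipients * high_effect)
print(extra_in_education)  # (72000.0, 120000.0)

neet_lifetime_cost = 56_300    # £, York University / Audit Commission estimate
ema_annual_cost = 500_000_000  # £

# NEETs avoided for one year's EMA spend to break even:
breakeven = ema_annual_cost / neet_lifetime_cost
print(round(breakeven))  # ~8,900 -- far below the low-end estimate of 72,000
```

On these assumptions, one year’s EMA spending breaks even if it prevents fewer than 9,000 students from becoming workless. The text’s figure of roughly 18,000 corresponds to about £1 billion in lifetime NEET costs, i.e. around two years of EMA spending; either way, the order of magnitude comfortably favours the policy paying for itself.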
Finally, let’s consider the alternative policy: investing £50 million in hardship funds instead of £500 million in EMA. It’s important to state that £50 million is better than nothing! And if it were £100 million, that would be even better. But just as Coca-Cola cannot eliminate its ‘deadweight’ advertising, this scheme will still reach those who could do without it as well as those in need.
Colleges and schools will have to decide who the worthy recipients are. How can they? There is no reliable way to identify a genuinely needy case. There is also a moral hazard here: the scheme gives students an incentive to claim that they will drop out, or simply not attend, unless they receive hardship support. So the alternative becomes a bit of a scattergun.
So the policy choice is quite simple. It is not between a wasteful failure riddled with ‘deadweight’ and a targeted, efficient alternative. It is actually the complete opposite: a choice between a policy that works but is more expensive (the expense is precisely what enables it to work!) yet pays for itself, and a policy that, while it does some good, will be nowhere near as effective.
The coalition has chosen the latter, but it should be under no illusions about the significant long-term costs to individuals, educational attainment, social mobility, and the public purse. Let us at least be clear about the real nature of the choice.