ASH GROVE ACADEMY, a state primary which sits in Moss Rose, a poor suburb on the outskirts of Macclesfield, is an excellent school. Recently, its team won a local debating tournament, besting fancier rivals; its pupils are exposed to William Shakespeare and Oscar Wilde; lessons are demanding and there are catch-up sessions for those who fall behind. Most important, teaching is based on up-to-date research into what works in the classroom. It is the sort of school that ministers dream of replicating across the country.

But how to do so? When the Conservative-Liberal Democrat coalition came to power in 2010, it set about freeing schools from local-authority control. International studies have suggested that such freedom improves results. But autonomy does not by itself guarantee that teachers will make good decisions. So in 2011 the government provided a grant of £135m ($218m) to establish the Education Endowment Foundation (EEF), a laboratory for education research which would provide teachers with the information to make smart choices.

In the seven years since its foundation, the EEF reckons it has commissioned 10% of all randomised controlled trials ever carried out in education research. In doing so it has turned the English school system into a giant test-bed, with a third of all state schools involved in at least one of its trials. Its work has been used in other parts of the world, such as Australia and Latin America, and other countries are considering copying England’s example.

But at home, its efforts have raised difficult questions. Does providing teachers with evidence of what works change their behaviour? And if not, what next?

Where the evidence leads

The EEF was given two main jobs. First, it dished out cash to researchers with interesting ideas, becoming, on its creation, by far the biggest funder of schools research in the country. Educationalists tend towards small-scale research projects—the sort of studies, says Stephen Gorard of Durham University, in which “academics would write up three interviews with head teachers and call it research.” The EEF has prodded them in a more rigorous direction.

Some of its results have been influential. On March 19th the government set aside £26m to fund breakfast clubs, after an EEF study found that they boosted attainment. Just as significant, studies have debunked numerous teaching methods, which is important in a field where fads are common. One recent study found that a programme in which 13- and 14-year-olds assisted 11- and 12-year-olds with their reading did not help the younger pupils improve.

Its second job is to disseminate existing research. Its online “teaching and learning toolkit” summarises the findings of more than 13,000 trials from around the world, rating initiatives by their cost, the strength of the evidence behind them and their impact, which is measured as the number of months by which they advance children’s learning. Getting a pupil to repeat a year, for example, is expensive, and there is adequate evidence to suggest that it sets them back by the equivalent of four months. The EEF also provides broader evidence summaries on areas of interest for schools, such as written marking and digital technology.

Teachers claim to pay attention. A report by the National Audit Office, an official spending watchdog, found that two-thirds of head teachers say they turn to EEF evidence for guidance. But the EEF has realised that the “passive presentation of evidence is not enough,” says Sir Kevan Collins, its boss. Naturally, it reached that conclusion by testing its own approach. A study published last year found that providing schools with high-quality evidence about teaching led to no improvement in pupils’ performance. It did not investigate why. One possibility is that teachers did not take up the ideas. Another is that successful strategies are hard to replicate.

Thus the EEF is increasingly focused on working out how to change behaviour. “One thing we know”, says Sir Kevan, “is that teachers really trust other teachers.” The EEF has joined with officials who work with groups of schools, whether in academy chains, local authorities or charities, to spread the evidence-based gospel. It has also increased its meetings with head teachers and has provided extra funding for trials of promising schemes in poorer parts of the country. As ever, all approaches will be scrutinised to see if they work.

The most ambitious shift is the recruitment of 23 “research schools”, of which Ash Grove is one. As a research school, it gets money to help around 150 other local schools, by putting on events to spread the latest research, training teachers and helping them to evaluate the effectiveness of classroom innovations. Jo Ashcroft, the director of education at Ash Grove’s group of academies, notes that the schools “don’t have endless amounts of money”, so every penny has to make a difference.

It is too soon to judge whether such an approach will work. Most educationalists agree that teachers have become more focused on research in recent years. A hard-core minority spend their weekends at conferences debating the merits of star scholars such as John Hattie and Carol Dweck. The challenge for research schools will be reaching beyond these enthusiasts.

It will not be easy. Tellingly, one of the most popular evidence summaries published by the EEF found there was little evidence to support most of the marking schemes employed by schools, schemes whose pernicketiness often infuriates teachers. Teachers “like proof they are right”, says Becky Francis of the UCL Institute of Education; it is more difficult to change behaviour when they are wrong. The EEF hopes that evidence will be more compelling when it comes from a friendly face.