Since being founded in 2018, ImpactEd has grown to help over 150 organisations and 1,000 schools better measure their impact. Having recently relaunched as ImpactEd Evaluation within ImpactEd Group, we're committed to sharing more about what the data we gather can tell us about how best to support young people.
Using evidence to inform decision making is not an academic exercise. We recently published the story of Jamal, a student at Miltoncross Academy whose educational journey has been transformed through support from a 1:1 mentoring initiative. His story is not unique: the best educational interventions really do make a difference. But for time-poor professionals, effective evaluation is easier said than done. We want to change that.
Over the course of this year, under the continuing theme of Making It Count, we will share regular insights from the evaluations we support, signposting effective practices for practitioners and policymakers to consider. Sign up to our newsletter for notifications of future releases.
At ImpactEd Evaluation we’ve recently analysed trends in the data collected through our School Impact Platform, our digital tool that helps schools and those who work with them to better understand their impact (see the methodology below for further details). The data represents over 1,150 schools and organisations, and so provides valuable insight into changing priorities in the education system over the last five years. It includes findings from individual schools and Trusts as well as data from strategic partners such as Challenge Partners.
So what can we learn from this?
With data going back to 2018, we can paint a picture of how the focus areas of schools and organisations have varied over time. In some ways, however, what is most notable is what hasn’t changed.
This data reflects the changing educational landscape over time. For example, wellbeing-focused evaluations constituted 26% of all evaluations on the platform in 2019/20 as schools monitored the impacts of the March 2020 lockdown. Although wellbeing remains a concern for many schools, by 2022/23 the proportion of evaluations focused on it had returned to 11%.
As well as considering what interventions organisations evaluated, we also looked at what measures of impact they used beyond traditional academic outcomes. Wellbeing was the most consistent measure, but beyond this pupils’ motivation, metacognition and self-efficacy were the most popular areas.
With over 70% of evaluations on the School Impact Platform using at least one measure of social and emotional skills, it is clear that schools and organisations value the ability to look at educational impact holistically.
Specific skills of interest do vary year on year. In the 2022/23 academic year, pupil motivation and school engagement saw the largest increases in usage relative to the previous year. This aligns with an increasing focus on attendance and engagement - on which more below.
Excluding areas where we have a number of national partnerships (e.g. outdoor learning), our most notable increases in evaluation focus were in attendance and behaviour. While this will be influenced by developments on our School Impact Platform, nearly five times as many attendance evaluations took place on the platform in 2022/23 as in 2021/22; for behaviour, it was around 2.5 times as many.
Our Understanding Attendance project has been developed to address precisely this need: equipping schools to better understand the drivers of pupil absence in their settings and to develop effective, targeted strategies. But there are no easy solutions - the rise in evaluations of attendance and behaviour, and the increase in popularity of our school engagement measure (used in 21% of all evaluations on the platform last year), illustrate a fundamental challenge in pupils’ engagement with education. We will be sharing more soon on how pupil belonging affects engagement and the implications for educators.
Across all evaluations, one of the lessons of our data is variability: how an intervention is implemented matters at least as much as which intervention is chosen.
This is perhaps most pronounced in our data on the use of education technology. Our Evaluating Edtech paper identifies a number of edtech programmes where evaluation has shown statistically meaningful impacts. However, this generally only occurs where implementation and intended impacts are clearly planned for.
This is backed up by external research: analyses such as the EEF’s review of edtech show positive but small effect sizes, with the most notable finding being the range of impact outcomes recorded. Edtech companies are increasingly aware of this - our Talking Impact research series recently highlighted how organisations can better evaluate their products.
This data provides insights into the priorities for schools and education leaders. Most important, however, is working together to address the challenges our data suggests.
Under the continuing theme of Making It Count, we'll be doing this in two main ways.
First, we'll work closely with our strategic partners to provide practical case studies and solutions. As a first step, we'd recommend consulting Challenge Partners' thematic review of the approaches their 550 partner schools have taken to collaboration in challenging circumstances. Echoing themes similar to those in our data, the review also provides practical, school-level examples of solutions to these enduring concerns.
Second, we'll continue to share further insights from our evolving dataset through events and regular publications. Our next release will focus on lessons we are learning on addressing pupil absence - sign up to our newsletter to be notified when this goes live.
Want to hear more about ImpactEd Evaluation? Get in touch for a conversation with a member of our team.