Can student engagement data help build mid-cycle confidence in the impact of APP activity?
With the focus on access and participation plans, how can engagement data help the higher education sector evidence their impact?
November 15, 2023

The Office for Students (OfS) launched its revised approach to Access and Participation Plans (APPs) last year to redefine the expectations for higher education providers. Within the requirements are details of the interventions providers must make to ensure the improvement of equality of opportunity for underrepresented groups to access, succeed and progress within higher education. 

Making higher education accessible doesn’t stop at getting in. Accessibility encompasses the entire student journey, from getting in to getting on, and going on to complete a course. 

Universities aim to provide consistent support to students throughout their time at an institution. Determining whether the support and outreach initiatives that have been devised and then implemented are working can be challenging. Here at StREAM, we are wondering whether student engagement analytics data can help. 

In particular, we are asking whether mid-cycle insights from student engagement data could help determine progress against targets and objectives, and whether they could provide an opportunity for mid-cycle revisions in response to feedback, evidence of impact (or otherwise) and resource availability. 

The importance of evaluation

In its guidance to universities writing their APP, the OfS explains the role and importance of developing initiatives that are underpinned by a theory of change: 

A theory of change model is a strategic planning tool … to ensure that long-term goals are defined and understood. … [A] theory of change model can provide a clear rationale for how short-term activities are connected to longer term improvements in access and participation, and the evidence practice which underpins them. By identifying the different intermediate outcomes that an intervention may produce, the activity can be designed to meet these outcomes. 

A natural corollary of this approach is robust evaluation that provides high-quality evidence about effective practice in the access and participation activity undertaken by universities. Within UK HE, this work is supported primarily by the Centre for Transforming Access and Student Outcomes in Higher Education (TASO), an OfS-funded independent charity. 

The need for evidence

In his keynote address to the Universities UK Access, Participation and Student Success Conference in November 2023, John Blake, OfS Director for Fair Access and Participation, was clear that in the 20 years since APPs were introduced, it is still unclear exactly ‘what works’, concluding that the evidence simply isn’t there. Blake also spoke of the need to support institutions to take measured risks in the types of interventions they create, even where there is no guarantee that an intervention will improve the relevant metrics. But, he stressed, if improvements in metrics are seen, it is important to know how and why the intervention was a success. 

His comments set us thinking about whether student engagement data has a role to play in helping universities measure the impact of their interventions, particularly over the duration of an APP, so that appropriate enhancements or revisions can be made to an intervention in response to feedback. 

Why student engagement data matters

Student engagement analytics is helping universities create a clearer picture of the ways in which students are engaging with their studies. Our student engagement analytics platform, StREAM, explores how students engage day-to-day with their education, securely and in detail, and expresses this on a scale of engagement categories from low to high. Codifying engagement in this way provides a simple view of the data that is easier to interpret and work with. 
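To make the idea of codifying engagement concrete, here is a minimal illustrative sketch in Python. The score range, thresholds and category names are entirely hypothetical and are not StREAM’s actual scale or methodology; the point is simply how a continuous engagement measure can be mapped to a small set of categories that are easy to read at a glance.

```python
# Illustrative sketch only: bucket a normalised daily engagement score into
# coarse categories. The score range, thresholds and category names here are
# hypothetical and are not StREAM's actual scale or methodology.

def categorise_engagement(score: float) -> str:
    """Map a normalised daily engagement score (0.0-1.0) to a category label."""
    if score >= 0.75:
        return "high"
    if score >= 0.50:
        return "good"
    if score >= 0.25:
        return "partial"
    if score > 0.0:
        return "low"
    return "none"

# Example: one student's scores across a week
daily_scores = [0.82, 0.64, 0.10, 0.0, 0.55, 0.71, 0.30]
print([categorise_engagement(s) for s in daily_scores])
# ['high', 'good', 'low', 'none', 'good', 'good', 'partial']
```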

Impact for individual students

If engagement is consistently low, or a student’s engagement changes significantly – especially towards the lower end – the university is notified of a potentially at-risk student in line with its own operational requirements. Unexplained changes in engagement patterns, or engagement behaviours that are inconsistent with the rest of the cohort, suggest there is value in having an outreach conversation with the student to understand any factors that may be impeding their ability to participate fully in their studies. Identifying and collaboratively seeking to address issues in real time provides the clarity and consistency that allows universities to gain meaningful insight and support progression at the individual student level. A purely illustrative sketch of such a notification rule follows below. 
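As a sketch (again in Python, with hypothetical thresholds rather than StREAM’s actual alerting logic), a notification of this kind might combine a check for sustained low engagement with checks for a sharp drop against the student’s own baseline or the cohort average:

```python
# Illustrative sketch only: flag a student whose recent engagement is
# consistently low, has dropped sharply against their own baseline, or sits
# well below the cohort average. Thresholds and rules are hypothetical and
# are not StREAM's actual alerting logic.
from statistics import mean

def flag_for_outreach(daily_scores: list[float],
                      cohort_mean: float,
                      low_threshold: float = 0.25,
                      drop_ratio: float = 0.5) -> bool:
    """Return True if the last week of engagement suggests an outreach conversation."""
    recent = daily_scores[-7:]                  # most recent week of daily scores
    baseline = daily_scores[:-7] or recent      # earlier history, if any
    sustained_low = mean(recent) < low_threshold
    sharp_drop = mean(recent) < drop_ratio * mean(baseline)
    below_cohort = mean(recent) < drop_ratio * cohort_mean
    return sustained_low or sharp_drop or below_cohort

# Example: engagement tails off after a strong start, so the student is flagged
history = [0.8, 0.7, 0.75, 0.8, 0.6, 0.5, 0.45, 0.2, 0.1, 0.15, 0.1, 0.05, 0.1, 0.2]
print(flag_for_outreach(history, cohort_mean=0.6))  # True
```

In practice, of course, any such rule would be tuned to an institution’s own data and operational requirements rather than fixed thresholds like these.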

Cohort-level impact

The cumulative effect of supporting individual students’ engagement is a positive impact on retention and continuation metrics. Rather than waiting until the end of the APP cycle, we believe that mid-cycle insights can also be gleaned from student engagement data. Insights at cohort, programme, subject, faculty and university level can help universities understand the changing nature of demand on core student services across the academic year, identify fluctuations in demand and thereby enable an agile in-year response that balances demand for a service with supply. 

Evidential standards

Reviewing the interventions implemented for individual students, for cohorts and for groups of students defined by a characteristic or set of characteristics, in conjunction with their outcomes, can provide universities with a level of insight as to ‘what works’. However, these insights, or ‘standards of evidence’, have different values from an OfS/TASO evaluation perspective. Briefly: 

  • Narrative evidence (type 1) uses existing research evidence to justify the selection of particular activities as part of a cohort outreach strategy. 
  • Empirical enquiry (type 2) uses impact data to show that an intervention was associated with improved outcomes, although no direct causal effect is established. 
  • Causal evidence (type 3) is the highest standard of evidence and provides quantitative or qualitative evidence of a change in participants, resulting from an intervention, that is not seen in an appropriate control or comparison group who did not take part in the intervention. 

Seeking to provide causal evidence can be challenging. While the OfS is happy for universities to take risks when planning outreach interventions, we doubt any university would sanction an intervention in which students clearly identified as being ‘at risk’ were randomly divided into two groups, one of which received an outreach intervention and the other did not. Moreover, will it be possible to determine whether an intervention is successful for students with a particular set of personal or demographic characteristics? If a student successfully engages with an initiative to support re-engagement with their studies, can that success be attributed to gender, race, sexual orientation, being a care-experienced student, or something else? Could it simply be that the intervention captured the attention of the student because it provided some level of intrinsic motivation or appeal? 

Why we think student engagement data can help

In spite of these challenges, we believe that student engagement data can nevertheless prove useful to universities by providing mid-cycle insights on the value of different outreach interventions. In recent years, we have worked with a range of clients who are seeking to understand the impact of their interventions on student engagement, continuation and progression, and who have published the findings of their empirical research. Here is a sample of what our clients have found: 

  • The University of the West of England found that student engagement data provided them with new insights into some structural barriers that were disproportionately impacting their ‘hard-to-reach’ students.
  • Indications of risks to continuation were identified for students at Nottingham Trent University after as little as two weeks at university.
  • Research conducted in partnership with Wonkhe to understand student perceptions of engagement analytics concluded that: 
      • Over 70% of students thought this data should be used to help universities ‘understand what additional support individual students need to increase their engagement with learning, become better at learning and boost their academic performance’.
      • Over 65% of respondents also felt that this data should be used to help universities ‘to understand what additional support students from specific demographic groups may need to increase their engagement with learning and boost their academic performance’.
      • A similar figure agreed that the data should be used to identify and support less engaged students who might be experiencing low wellbeing or be at risk of leaving their course. 

So what?

The research findings detailed above show that universities are successfully using student engagement data to help address a range of use cases around student success, retention and wellbeing, among others. It is clear that more work needs to be done to better understand the relationship between the data and the outcomes and to answer the question posed by John Blake at the UUK conference around knowing how and why certain interventions are proving successful. 

A deep dive into student engagement data has the potential to reveal a range of previously unknown insights about the student cohort, including insights on the relationship between personal/demographic characteristics, engagement and student outcomes. The determination of ‘risk’ within StREAM is based solely on what students do, i.e. how they engage with their studies, rather than on who they are, a decision based on our key principles around the ethical and transparent use of data. However, triangulation with those personal and demographic insights provides valuable context when it comes to getting to know your students and agreeing how best to support each and every individual. 

Gathering data throughout the student journey allows universities to build a rich picture of who makes up their student population, and of the individual requirements students have in order to access support and, ultimately, succeed. 

Using this readily available data can help universities ensure that their support services are relevant and accessible to all student cohorts and encourage widespread student participation, thus nurturing a truly accessible culture and learning environment for all. 

 

To share your opinions and ideas, and to explore these questions further with us, book a demo.