Having the conversation with students about learning engagement and data analytics
Students have views on how learning engagement data should be collected, analysed and used. Here are the findings of new research from Wonkhe and StREAM.
October 03, 2023

Original article published 02/10/2023 on Wonkhe.com.

 

In the months ahead we’re expecting the conversation about the uses of learning engagement analytics to ramp up, as the sector grapples with the government’s challenge to demonstrate that it can support students in crisis, following a debate over whether universities should have a statutory duty of care towards students.

Typically students have been closely involved in the conversation about the adoption of learning engagement analytics. Universities have needed to explain to students what data they are collecting, why, and what it means for their success, and engage with student opinions on their level of comfort with the idea. In some universities, students have access to their own data dashboard so they can see their own engagement patterns.

Universities considering whether to adopt, or extend their use of, learning engagement analytics will need to open up a conversation with their student body about it – partly for obvious ethical and practical reasons around data security and consent – but also to give the strategy the best chance of success through designing it with students in mind.

To help inform those conversations, StREAM by Kortext and Wonkhe commissioned Cibyl to undertake a poll of students to explore their views of learning engagement analytics. A nationally representative sample was invited to take part in the survey in June 2023. Once the data had been cleaned, the final sample was 496 respondents.

Download the full deck of our findings.

 

Background

Students generate a lot of data as they go about their daily lives at university, logging into university online platforms, engaging with learning resources, tapping into campus buildings, or signing up for meetings and activities. Learning engagement analytics, at its most straightforward, seeks patterns in that data to better understand students and adjust provision and support accordingly.

While the principle is very straightforward, the practice of implementing learning engagement analytics requires careful thought. Students’ engagement can indicate lots of things: interest and motivation in their course, preferences for one mode of learning or type of learning resource over another, or patterns in personal wellbeing or life challenges that are shaping their capacity to engage. Making sense of the patterns requires deciding what is worth measuring and what is not, and then building a picture of how the patterns link to outcomes such as retention, academic performance, or student wellbeing.

The StREAM platform aggregates the data a university has decided is relevant, applies weightings, and generates an engagement “score” for each student in near real time. The data can be viewed at cohort level or over time periods, and the platform can generate a flag if a student is exhibiting persistently low engagement or if their engagement pattern has changed.
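
To make that mechanism concrete, the sketch below shows one way a weighted engagement score and a persistent-low-engagement flag could be computed. It is illustrative only: the activity names, weights, threshold, and flagging rule are assumptions made for the example, not the weightings or logic StREAM actually uses.

```python
from dataclasses import dataclass

# Illustrative activity weights -- assumptions for this sketch,
# not the weightings StREAM actually applies.
WEIGHTS = {
    "vle_logins": 0.4,
    "scheduled_attendance": 0.3,
    "library_visits": 0.2,
    "resource_access": 0.1,
}

LOW_ENGAGEMENT_THRESHOLD = 0.3  # assumed cut-off for flagging


@dataclass
class WeeklyActivity:
    """A student's activity in one week, normalised to the 0-1 range."""
    vle_logins: float
    scheduled_attendance: float
    library_visits: float
    resource_access: float


def engagement_score(week: WeeklyActivity) -> float:
    """Weighted sum of the normalised activity measures."""
    return sum(weight * getattr(week, name) for name, weight in WEIGHTS.items())


def flag_persistent_low_engagement(history: list[WeeklyActivity], weeks: int = 3) -> bool:
    """Flag a student whose score has stayed below the threshold
    for the most recent `weeks` consecutive weeks."""
    recent = [engagement_score(week) for week in history[-weeks:]]
    return len(recent) == weeks and all(score < LOW_ENGAGEMENT_THRESHOLD for score in recent)
```

In practice a platform would also normalise for timetable differences and cohort context before comparing scores, which is part of why the weighting decisions described above need careful thought.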

It has been really exciting in recent years to see how different universities are using learning engagement analytics – for example, to conduct research on the links between patterns of engagement and student outcomes, to indicate where students may need additional academic support, to flag and proactively support students who might be at risk of leaving their course, or in crisis, and to track and evaluate the impact of academic coaching programmes.

We wanted to test students’ views of what kinds of learning activities are the most meaningful to them, and their opinions about the validity of gathering data about these activities to inform student support and improve the learning environment.

 

Students support the use of learning engagement analytics

A key finding was that the students who responded overwhelmingly supported the use of learning engagement analytics – 80 per cent of respondents said at the conclusion of the survey that it was a good idea, and only 11 per cent said it was not.

The students who were in favour explained their answer by citing the prospect of improving student support by identifying students who might not necessarily seek help themselves. Some also approved of the principle of analysing the available data – as one commented, “The data is there – as an economist it would be a crime not to analyse it!” Others liked the idea of being able to track their own engagement and use the data to inform their understanding of what is expected and track their progress.

Of the few who were not supportive of the use of learning engagement analytics, some explained that they had concerns about intrusion and the risk that universities would sell their data to third parties. Indeed, respondents generally wanted to be assured that their privacy would be protected and that their data would be secure. But a larger proportion of those who were not supportive of the use of engagement analytics were worried about the way that their engagement patterns might differ from the “norm” because of their personal circumstances – disability, or wider life complexity.

We also found that most students – two thirds or more – are supportive of the range of ways that learning engagement analytics could be used. 71 per cent thought engagement data should be used to help universities understand what additional academic support individual students might need. 66 per cent thought it could be used to inform support for specific demographic groups and the same proportion felt it should be used to identify and support less engaged students who might be experiencing low wellbeing or at risk of leaving their course.

70 per cent were supportive of the idea that engagement data could be used to monitor the quality of teaching and learning (though we’d caveat this finding by noting that we’re not necessarily advocating it be used for this purpose, only observing that students would probably be supportive if it were).

Students demonstrated a great deal of nuance in their response to the idea of learning engagement analytics. This suggests that opening up a conversation with students about what data is collected, and why, and how it might most effectively contribute to their success, could really help to inform university thinking about where learning engagement analytics can sit in their wider student academic and wellbeing support strategies.

 

Student academic engagement and success

Students most readily seem to understand the value of learning engagement analytics in relation to support for academic success. Improving academic support for individual students was the most popular use-case for learning engagement analytics, with 71 per cent of respondents supportive of using the data for this purpose.

When offered a range of possible interventions in a hypothetical scenario in which the system had flagged them as exhibiting “low engagement,” “an appointment with a personal tutor to discuss academic progress and refer you for additional support if needed” was the most popular response, chosen by 65 per cent of respondents. This suggests both that students associate “low engagement” with academic progress, and that they see the personal encounter with a tutor as the most appropriate remedy for the problem – above an email, call or text either from a personal tutor or from a central student support team.

We found that students consider most kinds of university activity at least somewhat meaningful in helping them do their best work while at university, but it was clear that students tend to rate activities linked directly to teaching and learning, and those that relate to in-person attendance at scheduled teaching, as the most meaningful.

Two thirds rated attendance at small-group sessions as very meaningful, more than visiting the library (60 per cent), accessing course pages on the VLE (54 per cent), or attending optional teaching sessions (37 per cent). Fewer rated extra-curricular or student service-related activities, like accessing advice or wellbeing services (45 per cent) or attending academic society events (31 per cent), as very meaningful. That said, the nearly one in five students (19 per cent) who said that purchasing items from a campus shop or catering outlet was very meaningful to doing their best work remind us that, for some, the value of a timely coffee in keeping the ideas flowing should not be underestimated.

Generally, most students were comfortable or somewhat comfortable with universities collecting data on the frequency of their engagement in the range of different activities, but the degree of comfort was associated with how meaningful they found an activity in helping them do their best work.

Students were more divided on the question of whether they felt they would benefit from a dashboard that could show them an engagement score based on their learning behaviours. The largest group – 37 per cent – said they would find it motivating to track their progress and that it would encourage them to do more, but 31 per cent said they would find it stressful and worry that they were not doing enough.

This speaks to a need to design a student-facing dashboard carefully in consultation with students – universities might consider, for example, whether cohort comparisons could trigger a stress response, as opposed to students simply tracking their own patterns of engagement. Either way, clear communications with students when introducing a dashboard will be needed, including what insight it can provide that can help them make good decisions about their learning.

 

Student wellbeing and retention

Although students were most supportive of the academic support use-case for engagement analytics, it’s also clear that for a lot of students academic progress, engagement, and wellbeing are intrinsically linked. 66 per cent supported the use of engagement analytics to identify and support students who might be experiencing low wellbeing or at risk of leaving their course.

We asked students to complete the sentences, “when I’m engaged in my studies I feel…” and “when I’m not engaged in my studies I feel…” with the option of listing up to three words in each case. Students said when they are engaged they feel “motivated”, “productive” and “happy” whereas when they are not engaged they feel “bored”, “anxious” and “stressed.”

These are just the top three responses in each case – you can see the full range of answers in the word clouds in the results deck. The most popular answers suggest that changes in engagement pattern could usefully indicate issues with wellbeing, and that any follow-up intervention should be sensitive to this. It also points to a potentially productive and positive link between supportive learning cultures and student wellbeing. However, a few students reported that when they are engaged in their studies they feel “worried”, “lost” or “overwhelmed” – it could be valuable to explore the contexts where students experience an anxiety response to learning that could actually be harming their engagement or academic progress, and to consider what structural or individual factors might be in play in these moments.

The link to wellbeing is potentially especially pertinent when considering the experiences of disabled students. In the study, a larger proportion of disabled students were supportive of the use-cases for engagement analytics relating to academic and wellbeing support, but disabled students were also more likely than non-disabled students to say that using learning engagement analytics is a bad idea. Diving into the qualitative comments, we see some disabled students supportive of the idea of a university reaching out to a student who may be struggling – as one said, “As someone with social anxiety, I’ve had times where I couldn’t bring up the courage to attend lectures and ended up falling behind.”

Others, however, are concerned about their university’s attitude to their disability and whether monitoring their engagement could exacerbate prejudice or less inclusive policies. One student said, “I had an experience where I was having physical health issues…I was called in for a meeting about my absences and was told that I must improve it or may have to discontinue the course, even though I explained myself and was getting consistently good marks on coursework and assignments.”

Increasingly we see universities updating and reviewing attendance and engagement policies to be less punitive and more supportive of students – this may be particularly important in giving disabled students the reassurance that they understandably need if they are going to be able to support the use of engagement data.

There is more complexity to the effective use of engagement data for analytics than we were able to cover in one survey. But it’s clear that students grasp the principle, and are broadly supportive of it. There is also, however, much we can learn from the students who do not feel able to support the use of engagement analytics – whether that’s grounded in politics, personal preference, or experiences of marginalisation. Through building a sustained dialogue with students, not only about data and analytics, but about their engagement and learning behaviours and the feelings associated with learning, universities can be sure their student success and wellbeing strategies are as impactful as they can be. We hope that, where engagement analytics form part of those institutional strategies, these research findings can offer helpful insight and support for that conversation.

This article is published in association with Wonkhe. You can download the full deck of our survey findings.