Navigating the terrain: open AI vs trusted AI in higher education
November 16, 2023

Melissa Bowden

Kortext

Recent advances in generative artificial intelligence (AI) have paved the way for transformative applications across many sectors. From these advances, two branches have emerged – open AI and responsible (or trusted) AI.

In this blog, we’ll explore the differences between them and discuss their implications for teaching and learning in higher education.

Decoding open AI

Core concepts

First, what do we mean by ‘open AI’? For this blog, we mean generative AI tools built on large language models (LLMs), such as ChatGPT and Google Bard.

We’re aware that generative AI tools can also produce audio, images, videos, simulations and code, but our overall focus here will be on text generators. These tools have been trained on vast amounts of data from the Internet, enabling them to respond to prompts with human-like outputs.

It’s worth pointing out that AI tools have existed for some time; we take them for granted in everyday life – from search engines to grammar checkers to navigation apps. However, technological advances have resulted in a proliferation of new generative AI tools.

Creative potential in education

Their impact has been the topic of much debate in higher education over the last few months, with the launch of ChatGPT on 30 November 2022 acting as a catalyst for change.

A Jisc report found that students have been quick to embrace the benefits of these tools, using them to help with many tasks, including planning, writing, translating, revising, and researching.

Meanwhile, it’s suggested that staff could use generative AI tools for tasks like lesson planning, designing quizzes, customising materials and providing personalised feedback to students.

Ethical considerations and pitfalls

But there’s a catch – or rather, several catches.

The data on which LLMs are trained is gathered from a wide range of sources on the Internet, including those that are factually incorrect.

These sources may contain human biases and stereotypes, which could ‘inadvertently be perpetuated by the AI system, leading to biased outputs’, according to research.

The outputs from text generators can seem so human that you’d be forgiven for thinking they actually understand your question. In fact, standard ChatGPT simply predicts the next word in a sequence and, as Jisc notes, ‘it has no concept of whether the text it is producing is correct’.

Consequently, some tools have been known to produce ‘hallucinations’, or plausible untruths, such as creating references to non-existent journal articles.
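To make the point about next-word prediction concrete, here’s a minimal sketch of what a text generator actually computes, using the openly available GPT-2 model via the Hugging Face transformers library as an illustrative stand-in (ChatGPT’s own models aren’t publicly available). The model simply ranks candidate next tokens by probability; nothing in the process checks whether the continuation is true, which is why confident-sounding hallucinations can arise.

```python
# Minimal sketch of next-word (next-token) prediction, the mechanism behind
# text generators. Uses the openly available GPT-2 model for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The author of the 2019 journal article on study skills was"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every token in the vocabulary

# Convert the scores at the final position into probabilities and show the
# five most likely continuations. The model ranks plausible-sounding tokens;
# it has no notion of whether any of them are factually correct.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([token_id.item()])!r}: {prob.item():.3f}")
```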

All of these issues have left universities concerned about the impact of generative AI tools on student assessment, with fears that they could facilitate cheating, encourage plagiarism, and undermine academic integrity.

Meanwhile, students have called for sector-wide guidelines on how these tools can be used legitimately in education, amid worries about false accusations of plagiarism from AI detection software.

Understanding trusted AI

Core concepts

There have been moves – in the education sector and beyond – to regulate AI, enabling a shift from an open model to a more trusted model.

Within higher education, the Russell Group issued principles on the ethical and responsible use of generative AI. These five principles support the effective and appropriate use of AI tools within universities to enhance teaching practices and student learning experiences.

More broadly, Microsoft outlined six key principles for responsible AI: accountability, inclusiveness, reliability and safety, fairness, transparency, and privacy and security. It says these principles ‘are essential to creating responsible and trustworthy AI as it moves into mainstream products and services’.

Significance of trusted AI

Students want trusted AI incorporated into their education to prepare them for an AI-driven world.

A Jisc report found that students were concerned about ‘missing out on developing the AI skills they might require in future employment opportunities’. They want universities to ‘bridge the gap’ between current curricula and the evolving job market.

This includes institutions providing ‘training and guidance on responsible and effective AI use’, so students can gain a deeper understanding of these tools. They also ‘expressed a desire for institution-recommended tools that they can trust’, rather than relying on recommendations from friends.

Applications in higher education

There are positive case studies emerging on the responsible integration of generative AI within higher education.

At the Hong Kong University of Science and Technology (HKUST), students in the business management school ‘now use AI to design, create and then deconstruct a case study’, rather than discussing one prescribed by the school – an approach that develops their critical thinking skills.

In addition, HKUST staff are designing an AI platform that will enable academics ‘to generate lesson plans, quizzes and other course design elements’. The bespoke chatbot will be trained by the university with relevant data, and will ask guiding questions if academics are struggling with a task.

At this point you might be thinking, ‘that’s all very well, and I’d love to do something similar in my institution, but I just don’t have the time’.

So what’s the solution? We have an answer.

Bridging the gap

Introducing Kortext Premium

Kortext Premium enables institutions to equip students with the capabilities of generative AI tools within a safe environment.

Our suite of next-generation AI-powered study tools is seamlessly integrated into the Arcturus platform as an upgrade to Kortext Essential.

Kortext Premium’s trusted AI tools can summarise content, generate insightful study notes and create interactive Q&A in seconds. In addition, students can translate text into over 100 languages, and produce references in all key styles to incorporate into assignments.

Balancing trust and innovation

Kortext has worked extensively with Microsoft, leveraging the latest technologies to embed the principles of responsible AI into the development of our innovative study tools.

Unlike open AI, our GPT-based tools are only applied to trusted content – academic texts that have been provisioned to a student by their institution and made available through the Kortext platform.

Students can’t see or amend the prompts that instruct the AI tools. To address plagiarism concerns, the outputs are reproducible, with different users getting similar results on the same text.
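Purely as an illustration of the general pattern described above – a fixed, hidden prompt applied only to institution-provisioned text, with deterministic settings so that different users see similar results – here’s a minimal sketch using the OpenAI Python SDK. This is not Kortext’s implementation; the model name, prompt wording and function are assumptions made solely for the example.

```python
# Illustrative sketch of a 'trusted AI' study tool: a fixed, hidden prompt
# applied only to provisioned content, with temperature 0 for reproducible
# outputs. NOT Kortext's implementation; model and prompt are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

FIXED_SYSTEM_PROMPT = (
    "Summarise the supplied academic text for a student. "
    "Use only the supplied text and do not add outside information."
)

def summarise_trusted_content(provisioned_text: str) -> str:
    """Summarise a passage from an institution-provisioned eTextbook."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        temperature=0,          # minimal randomness, so users see similar results
        messages=[
            {"role": "system", "content": FIXED_SYSTEM_PROMPT},  # never shown to students
            {"role": "user", "content": provisioned_text},
        ],
    )
    return response.choices[0].message.content
```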

Our quality assurance process is rigorous, with staff checking that the AI-generated content is accurate. Based on these checks, we refine our prompt design to minimise the risk of hallucinations in the outputs.

Kortext Premium is being adopted by Anglia Ruskin University London, Middlesex University and The University of Manchester, amongst others. We’re excited to discover how students will use our trusted AI-powered study tools to enhance their learning.

Conclusion: future perspectives

As the world embraces the capabilities of generative AI, it’s crucial for higher education institutions to prepare their students to navigate this new landscape.

Balancing trust and innovation will be pivotal to harnessing the full potential of these tools safely.

By understanding their nuances, universities can traverse this ever-evolving terrain with confidence, ensuring that AI serves as a valuable asset in all their endeavours.

To find out more about anticipated developments with generative AI in higher education, tap here to sign up for our winter webinar!
