Emerging Technologies

Should universities be worried about the increasing capabilities of AI?


Sarah Elaine Eaton
Educational Leader in Residence, Academic Integrity and Assistant Professor, University of Calgary
Michael Mindzak
Assistant Professor, Faculty of Education, Brock University
  • The use of technology in academic writing is already widespread, with teachers and students using AI-based tools to support the work they are doing.
  • However, as AI becomes increasingly advanced, institutions need to define clearly what counts as AI assistance and what constitutes plagiarism or cheating, the authors write.
  • For example, if a piece of writing was 49% written by AI, with the remaining 51% written by a human, is this considered original work?

The dramatic rise of online learning during the COVID-19 pandemic has spotlit concerns about the role of technology in exam surveillance — and also in student cheating.

Some universities have reported more cheating during the pandemic, and such concerns are unfolding in a climate where technologies that allow for the automation of writing continue to improve.

Over the past two years, the ability of artificial intelligence to generate writing has leapt forward significantly, particularly with the development of the language generator GPT-3. With this and similar models, companies such as Google, Microsoft and NVIDIA can now produce “human-like” text.

AI-generated writing has raised the stakes of how universities and schools will gauge what constitutes academic misconduct, such as plagiarism. As scholars with an interest in academic integrity and the intersections of work, society and educators’ labour, we believe that educators and parents should be, at the very least, paying close attention to these significant developments.

AI & academic writing

The use of technology in academic writing is already widespread. For example, many universities already use text-based plagiarism detectors like Turnitin, while students might use Grammarly, a cloud-based writing assistant. Examples of writing support include automatic text generation, extraction, prediction, mining, form-filling, paraphrasing, translation and transcription.

Advancements in AI technology have led to new tools, products and services being offered to writers to improve content and efficiency. As these improve, soon entire articles or essays might be generated and written entirely by artificial intelligence. In schools, the implications of such developments will undoubtedly shape the future of learning, writing and teaching.
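To give a sense of how accessible automated text generation has already become, here is a minimal sketch in Python using the openly available GPT-2 model via the Hugging Face transformers library. The model choice and prompt are illustrative assumptions for this sketch, not tools discussed by the authors; GPT-3 itself is only available through a commercial API.

    # Illustrative sketch only: generate "human-like" text from a short prompt
    # using GPT-2, an openly available, smaller relative of GPT-3.
    from transformers import pipeline

    # Download and load a small, publicly available language model.
    generator = pipeline("text-generation", model="gpt2")

    # An essay-style prompt; the model continues the text automatically.
    prompt = "The main causes of the First World War were"
    output = generator(prompt, max_length=120, num_return_sequences=1)

    print(output[0]["generated_text"])

A few lines like these are enough to produce fluent-sounding paragraphs, which is precisely why the question of where assistance ends and misconduct begins is becoming so pressing.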

Misconduct concerns already widespread

In Canada, there is little data regarding the rates of misconduct. Research published in 2006, based on data from mostly undergraduate students at 11 higher education institutions, found that 53 per cent reported having engaged in one or more instances of serious cheating on written work. This was defined as copying material without footnoting, copying material almost word for word, submitting work done by someone else, fabricating or falsifying a bibliography, or submitting a paper they either bought or got from someone else for free.

Academic misconduct is in all likelihood under-reported across Canadian higher education institutions.

There are different types of violations of academic integrity, including plagiarism, contract cheating (where students hire other people to write their papers) and exam cheating, among others.

Unfortunately, with technology, students can use their ingenuity and entrepreneurialism to cheat. These concerns are also applicable to faculty members, academics and writers in other fields, bringing new concerns surrounding academic integrity and AI such as:

  • If a piece of writing was 49 per cent written by AI, with the remaining 51 per cent written by a human, is this considered original work?
  • What if an essay was 100 per cent written by AI, but a student did some of the coding themselves?
  • What qualifies as “AI assistance” as opposed to “academic cheating”?
  • Do the same rules apply to students as they would to academics and researchers?

We are asking these questions in our own research, and we know that in the face of all this, educators will be required to consider how writing can be effectively assessed or evaluated as these technologies improve.

Chart: growth forecasts for AI. Image: Statista

Augmenting or diminishing integrity?

At the moment, little guidance, policy or oversight is available regarding technology, AI and academic integrity for teachers and educational leaders.

Over the past year, COVID-19 has pushed more students towards online learning — a sphere where teachers may become less familiar with their own students and thus, potentially, their writing.

While it remains impossible to predict the future of these technologies and their implications in education, we can attempt to discern some of the larger trends and trajectories that will impact teaching, learning and research.


Technology & automation in education

A key concern moving forward is the apparent move towards increased automation of education, in which educational technology companies offer commodities such as writing tools as proposed solutions to the various “problems” within education.

An example of this is automated assessment of student work, such as automated grading of student writing. Numerous commercial products already exist for automated grading, though the ethics of these technologies are yet to be fully explored by scholars and educators.

Overall, the traditional landscape surrounding academic integrity and authorship is being rapidly reshaped by technological developments. Such developments also spark concerns about a shift of professional control away from educators and ever-increasing expectations of digital literacy in precarious working environments.

These complexities, concerns and questions will require further thought and discussion. Educational stakeholders at all levels will be required to respond and rethink definitions as well as values surrounding plagiarism, originality, academic ethics and academic labour in the very near future.
