
Senior academics’ insights shared at the postgraduates and AI workshop


Universities and AI are such a hot topic, generating so many differing views and queries, that a recent webinar on the subject triggered a stream of conversations in the chat facility.

This was at Universities South Africa’s (USAf’s) Community of Practice on Postgraduate Education and Scholarship meeting on 8 May, which featured a workshop titled Future-Proofing Postgraduate Education in the Age of AI.

Led by Professor Sioux McKenna, Professor of Higher Education Research at Rhodes University, the workshop attracted 159 participants.

Discussions ranged far beyond whether AI is good or bad, extending to the contested use of AI detectors and the importance of students declaring how and where they have used AI in their theses.

This is an edited version of the discussion.

QUESTION 1: Dr Louis du Plessis, Quality Specialist in the office of the DVC: Research, Innovation and Engagement at the Durban University of Technology (DUT): My role is primarily one of quality assurance in postgraduate studies. I often deal with people stuck in the old system. Trying to get them to engage with the fundamentals of what we are trying to get out of our students is problematic, because all they think of is ‘my KPIs (key performance indicators) say I have to graduate X number of students’. But the key focus of our engagements with our students, whether it’s on AI or data collection, concerns what we are trying to do.

Professor McKenna: Exactly. But having said that, academics feel they are being managed into oblivion. We’ve structured the university in such an instrumentalist way that we have neglected these big conversations. We have done some of this to ourselves and replicated it in the classroom, where there’s so little opportunity for students to experiment. There’s no time in the syllabus to do that intellectual gameplaying fundamental to knowledge-making. In some ways, perhaps generative AI is just throwing up stuff that has been problematic for a while.

QUESTION 2: Professor Lynn Morris, Deputy Vice-Chancellor of Research and Innovation at the University of the Witwatersrand (Wits): AI is where students outsmart us as academics. This is about postgrads, but we’re also all grappling with the undergrad issues. In the postgrad space, it’s much clearer: we want our postgrad students to use these tools because they’re useful. Those of you who have used AlphaFold (which predicts how proteins interact with each other) know these are phenomenal tools that will help our research.

But I would like some help and advice: do we need brand new policies, or do we adapt our existing policies? If yes, how rigid should they be? We are getting students to sign a declaration to say which AI tools they’ve used, because we also don’t want to give them the impression they can’t use AI. So what tools did they use and what did they use them for? That might become a standard part of a thesis. The other issue is the oral exam. Must we be probing students deeply about what is theirs and what isn’t? 

McKenna: Everything I share today is simply my personal view and can of course be contested. When it comes to policy, in South Africa we treat policies as laws, and so we want to pin down the nitty-gritty details as if they were laws. What we need are principles. They might be written into a policy, but that goes back to the bigger conversation of ‘what do we want?’ Drawing on the HEQSF (Higher Education Qualifications Sub-Framework) and on the doctoral standard: ‘this is who we want you to be’. AI can enhance that process, or it can constrain it, to the extent that you are using AI in ways that prevent you from making personal meaning, and to the extent that you present what you get from AI as text you have written, or as data you have personally analysed when you did not. That is problematic.

We’re a country that has a gazillion policies in our universities that nobody reads, that are nitpicking details and are far too long. We are in an era of flux. It would be much more useful to have a short document, called a policy, which encapsulates fundamental principles underpinning our understanding of AI.

About vivas (the oral defence of theses): they shouldn’t be set up as a space to catch students who’ve used AI in problematic ways. The literature makes very clear that if examiners have any doubts about authenticity and authorship at the stage of examining the written text, these need to be dealt with before the viva takes place.

I want to talk very briefly about the real problem of using AI detection. It’s worth noting that Turnitin (an AI writing detector) is banned in many US colleges because US students are litigious; they sue academics and universities. But in South Africa, we have been less critical of our use of Turnitin. Very significantly, the Turnitin AI detection tool provides its reports to the academic, not to the student. Yet the moment we start surveilling our students to the extent that they don’t even know what we are watching them for, we no longer have an educational relationship. Detection tools are also more likely to scrutinise the writing of second-language English speakers more harshly, flagging more false positives of AI use.

QUESTION 3: Mr Bulumko Mapukata, Internationalisation Officer, University of Fort Hare: Professor Morris raised the point about encouraging students to declare that ‘These are the AI tools I’ve used, and for this particular purpose’. Is that a method they [at Wits] are adopting, or are they proposing it?

Morris: Students are asked. This is just in our Faculty of Science now, and we plan to roll that out to our other faculties. It’s about research integrity in tech and people using it responsibly. 

Professor Brett Bowman, Head of Postgraduate Strategy in the Research and Innovation Office at Wits: We’ve asked for full transparency and have embedded fundamental principles in our Senate Standing Orders. But we defer to the faculties, because biochemists use it differently from, say, literature students. 

QUESTION 4: Laura Dison, Associate Professor in the Curriculum Division at the Wits School of Education and Assistant Dean for Teaching and Learning in the Faculty of Humanities: There’s a lot to discuss about how we need students to reflect on their learning processes alongside intellectual understanding. I just wondered if there’ve been any developments in this area. It would mean we’d have to shift our assessment practices significantly, but it’s very hard given our contextual constraints.

McKenna: I don’t think there’s a simple solution. Ultimately, we will have to be much more innovative, not by making AI-proof assignments, but rather by having conversations with our students about the value of knowledge and giving them opportunities to engage creatively with it.

Professor Stephanie Burton of the University of Pretoria, Chairperson of the hosting Community of Practice on Postgraduate Education and Scholarship (CoP PGES), said she was pleased to see they were now using the word “opportunity” rather than “threat” when referring to AI. She said they were all on the learning curve and thanked Professor McKenna for taking them further along that curve.

Professor Burton alerted delegates to another webinar, on Institutional AI Policies and Guidelines in SA for Learning and Teaching, which USAf’s Community of Practice on Digital Education in Learning and Teaching had planned for 22 May.

Gillian Anstey is a contract writer for Universities South Africa.


