
Tech calls the shots

China Daily Global | Updated: 2025-07-10 00:00

Singapore universities report few cases of AI plagiarism, but experts warn of risks

Editor's note: In this weekly feature, China Daily gives voice to Asia and its people. The stories presented come mainly from the Asia News Network (ANN), of which China Daily is one of the 20 leading titles.

The number of students caught plagiarizing and passing off content generated by artificial intelligence as their own work remains low, Singapore's public universities said following a recent case at the Nanyang Technological University.

But professors are watching closely for signs of misuse, warning that overreliance on AI could undermine learning. Some are calling for more creative forms of assessment.

Their comments follow the Nanyang Technological University's decision to award three students zero marks for an assignment after discovering they had used generative AI tools in their work.

The move drew attention after one of the students posted about it on online forum Reddit, sparking debate about the growing role of AI in education and its impact on academic integrity.

All six universities here generally allow students to use generative AI to varying degrees, depending on the module or coursework. To uphold academic integrity, students are required to declare when and how they use such tools.

In the past three years, the Singapore Management University recorded "less than a handful" of cases of AI-related academic misconduct, it said, without giving specific numbers. Similarly, the Singapore University of Technology and Design encountered a "handful of academic integrity cases, primarily involving plagiarism", during the same period.

At the Singapore University of Social Sciences, confirmed cases of academic dishonesty involving generative AI remain low, but the university has seen a "slight uptick" in such reports, partly due to heightened faculty vigilance and use of detection tools.

The other universities — the National University of Singapore, the Singapore Institute of Technology, and the Nanyang Technological University — did not respond to queries about whether more students have been caught flouting the rules by using AI.

Recognizing that AI technologies are here to stay, universities said they are exploring better ways to integrate such tools meaningfully and critically into learning.

Generative AI refers to technologies that can produce human-like text, images or other content based on prompts. Educational institutions worldwide have been grappling with balancing its challenges and opportunities, while maintaining academic integrity.

Faculty members have flexibility to decide how AI can be used in their courses, as long as their decisions align with university-wide policies.

The National University of Singapore allows AI use for take-home assignments if properly attributed, and instructors design complex tasks to prevent overreliance. For modules focused on core skills, assessments may be done in person or designed to go beyond AI's capabilities.

At the Singapore Management University, instructors inform students which AI tools are allowed, and guide them on their use, typically for idea generation or research-heavy projects outside exams.

The Singapore Institute of Technology has reviewed assessments and trained staff to manage AI use, encouraging it in advanced courses like coding but restricting it in foundational ones, while the Singapore University of Technology and Design has integrated generative AI into its design-thinking curriculum to foster higher-order thinking. The idea is to teach students when AI should be used as a tool or a partner, and when it should be avoided.

The universities said students must ensure originality and credibility in their work.

Students interviewed by The Straits Times, who requested anonymity, said AI usage is widespread among their peers.

"Unfortunately, I think that (using generative AI) is the norm nowadays. It has become so rare to see people think on their own first before sending their assignments into ChatGPT," said a 21-year-old fourth-year law student from the Singapore University of Social Sciences.

Still, most students said they have a sense of when it is appropriate to use AI and when it is not. Many said they use it mainly for brainstorming, collating research, and sometimes while writing.

A 20-year-old fourth-year economics student from the Nanyang Technological University said he does not see AI as anything more than a "really smart study buddy" that helps him clarify difficult concepts, similar to how one would consult a professor.

A third-year political science student at the Singapore Management University, 22, said she uses AI to fix her grammar before submitting her essays, but draws the line at copying essays entirely from ChatGPT.

But some students said they would turn to AI to quickly complete general modules outside their specializations that they feel are not worth their personal effort.

AI may improve efficiency, but there is a "level of wisdom that needs to come with that usage", said a third-year public policy and global affairs student from the Nanyang Technological University.

The 21-year-old said she would not use ChatGPT for tasks that require her personal opinion, but would use it "judiciously" to complete administrative matters.

Other students said they avoid relying too much on AI, as they take pride in their work.

A 23-year-old third-year computer science student from the Singapore University of Technology and Design said he wants to remain "self-disciplined" in his use of AI because he realizes he needs to learn from his mistakes in order to improve academically.

Creativity needed

Academics say universities must bring AI use into the open and rethink assessments to stay ahead.

Seshan Ramaswami, associate professor of marketing education at the Singapore Management University, embraces AI tools, but with caveats.

He has encouraged students to use AI, provided they submit a full account of how tools were used and critique their outputs.

He also uses AI tools to create practice quizzes, and a chatbot that allows students to ask questions about his class materials. But he tells them not to "blindly trust" its responses.

The real danger lies in uncritical AI use, he added, which can weaken students' judgment, clarity in writing or personal integrity.

Ramaswami said he is "going to have to be even more thoughtful about the design of course assessments and pedagogy".

He may explore methods like "hyper-local" assignments based on Singapore-specific contexts, oral examinations to test the depth of understanding, and in-class discussions where devices are put away and ideas are exchanged in real time.

Even long-standing assessment formats like individual essays may need to be reconsidered, he said.

Thijs Willems, a research fellow at the Lee Kuan Yew Centre for Innovative Cities at the Singapore University of Technology and Design, said that while essays, presentations and prototypes still matter, these are no longer the sole markers of achievement.

More attention needs to be paid to the originality of ideas, the sophistication with which AI is prompted and questioned, and the human judgment used to reshape machine output into something unexpected, he said.

These qualities "surface most clearly in reflective journals, prompt logs, design diaries, spontaneous oral critiques, and peer feedback sessions", he added.

Singapore University of Social Sciences Associate Professor Wang Yue, head of the Doctor of Business Administration Programme, said undergraduates should already have basic cognitive skills and foundational knowledge.

"AI frees us to focus on higher-order thinking like developing insights and exercising wisdom," she said, adding that restricting AI would be counterproductive to preparing students for the workplace.

Call for critical thinking

The same speed that makes AI exciting is also its potential hazard, said Willems, warning that learners who treat it as a "one-click answer engine" risk accepting mediocre work and weakening their own understanding.

The key is to focus on the quality of human and AI interaction, he said. "Once learners adopt the stance of investigators of their own practice, their critical engagement with both technology and subject matter deepens."

Jean Liu, director of the Centre for Evidence and Implementation and adjunct assistant professor at the Yong Loo Lin School of Medicine, said that while AI offers major advantages for learning, universities must clearly define the line between acceptable use and academic dishonesty.

"AI can act as a tutor who provides personalized explanations and feedback … or function as an experienced mentor or thought partner for projects," she said.

But the line is drawn when students allow AI to do the work wholesale.

"In an earlier generation, a student might pay a ghost writer to complete an essay," Liu said. "Submitting a ChatGPT essay falls into the same category and should be banned.

"In general, it is best practice to come to an AI platform with ideas on the table, not to have AI do all the work. Helping students find this balance should be a key goal of educators."

Universities must be upfront about what kinds of AI use are acceptable for students, and provide clearer guidance, she added.

Jason Tan, associate professor for policy, curriculum and leadership at the National Institute of Education, said the rise of AI is testing students' integrity and sense of responsibility.

Overreliance on AI tools could also erode critical thinking, he added.

"Students have to decide for themselves what they want to get out of their university education," he said.

The Straits Times, Singapore


From left: Posters from the Fluid Interfaces group are seen at the MIT Media Lab in Cambridge, Massachusetts, on June 25. A researcher attaches electrodes to an electroencephalogram cap used to monitor the brain activity of subjects in a study tracking the cognitive cost of using ChatGPT at the MIT Media Lab on June 25. Researchers found lower brain engagement in the people who used ChatGPT than in those who used Google or no technology to write their essays. THE WASHINGTON POST/GETTY IMAGES


A sign at the entrance to the National University of Singapore reads "To Seek, Strive and Excel". CORBIS/GETTY IMAGES
