Macau University of Science and Technology

Guidelines on Artificial Intelligence Application and Ethical Governance

(Updated in August 2025)


I. Guidance for Students

For all students (both undergraduate and postgraduate)


Important note

 

Please be mindful that employing AI tools to generate an assignment, or any portion thereof, and subsequently presenting it as your own work constitutes academic misconduct. The University, in general, embraces the usage of AI tools in teaching, learning, and research, but strictly forbids academic misconduct (the employment of unfair practices during any form of assessment). Examples of misconduct include, but are not limited to, plagiarism, self-plagiarism (submitting the same work for credit twice, either at the same institution or different institutions), collusion, falsification, cheating (including contract cheating, wherein a student commissions another individual to produce or edit his/her work), deceit, and personation (impersonating another student or allowing someone else to impersonate a student during an assessment or examination).

It is important that you are fully aware of:

Although generative AI tools are powerful and helpful, they must not replace independent and critical thinking in the processes of teaching, learning, and research;

The limitations inherent in the AI tools you are using:

*AI tools make predictions based on patterns learned from large datasets. These datasets may contain flaws, inaccuracies, biases, and limitations, and they provide limited information about the world and events beyond a certain cut-off date.

*AI tools function as language machines rather than comprehensive knowledge databases. Avoid relying solely on AI-generated content as a primary source; it should be used alongside other reliable sources.

*Critical thinking and sound judgement are of utmost importance when using AI-generated content (AIGC).

*AI systems operate without a sense of morality and may generate offensive or misleading content without awareness of its implications.

*Academic integrity and the ethical implications of using AIGC should be considered carefully.

The need to verify the factual accuracy of content generated by AI tools:

*AI-generated text/software/code can contain security vulnerabilities, bugs, or errors; knowledgeable human review and iterative checking are necessary.

*AI-assisted tools may fabricate citations and references.

 

The risks of infringing on intellectual property (IP) rights when using AI tools:

* AI is a double-edged sword. AI-generated software/code may incorporate improperly licensed libraries or calls, potentially infringing copyright.

*Hidden plagiarism can occur, as AI may reproduce words and ideas from human authors without proper referencing, which constitutes plagiarism.

*There is a risk of copyright infringement when using pictures or other copyrighted materials without obtaining consent from the original producer(s).

*Over-reliance on generative AI may impair learning motivation and capability.

 

The risks of leaking sensitive university data or compromising personal privacy:

  *Strengthen data security awareness when using AI tools, ensuring that sensitive university data and personal privacy are better protected by using the University's locally deployed WeMust models.

 

On the other hand, AI tools can be used (whilst recognizing their pitfalls) to enhance learning, research, and teaching assignments. The following are some of the numerous possibilities:

 

(i)  Incorporate generative AI tools into the curriculum:

 

 Generative AI tools can assist students at various stages of the learning process by providing explanations, generating content and references, and facilitating knowledge transfer.

   By exploring and experimenting with generative AI tools, students can gain insights into how AI technologies work and how they can be applied in different contexts.

   Using generative AI tools, students can develop practical skills, such as natural language processing, data analysis, and problem-solving, which are increasingly in demand in the workforce.

   An explicit declaration should be made of how and to what extent the outputs of generative AI have been used in completing class assignments and projects.

▲  Use of AI tools is not allowed, unless permission is granted in advance, in any kind of learning activity or assessment that counts toward the final grade of a course or similar learning assessment.

▲  A student is responsible for any misuse of AI tools in his/her academic work and for its consequences.

 

(ii)  Use of AI tools in Research:

 

MUST adopts the stance that all members of the MUST community can benefit from a culture that promotes the effective and ethical use of AI. At the undergraduate level, basic AI literacy is required and can be acquired by taking specific credit-bearing courses or other training programs. At the postgraduate or research level, AI tools can be incorporated into active learning methods, hands-on activities, and real-world projects:

 

  Real-world projects provide students with opportunities to tackle authentic challenges, develop problem-solving skills, and gain practical experience.

  AI tools can actively facilitate the exploration of new or complex topics through interactive explanations and references to resources that engage learners in the understanding process.

  Postgraduate students should have access to specialized training programs focused on advanced AI concepts and methodologies. AI tools, such as machine learning algorithms and data analysis software, can enhance postgraduate study by enabling advanced data processing, pattern recognition, and predictive modeling.

  Research opportunities, such as collaborative projects, internships, and industry partnerships, provide postgraduate students with hands-on experience and exposure to cutting-edge AI technologies.

  When incorporating AI tools into methodologies and data analysis techniques, it is crucial to avoid entering any personal, proprietary, or otherwise sensitive information into models or prompts.

   Please maintain a cautious attitude towards data analysis results generated by AI tools. When necessary, use multiple verification methods to assess the results and ensure their accuracy.

 

II. Guidance for Staff

 

 The University permits its staff to utilize AI tools in their professional endeavors and resultant outputs, on the condition that they refrain from asserting authorship over AI-generated work as their own original creation. AI and related digital technologies serve as exceptional tools, offering supplementary assistance in the process. In case of any inquiries, you are encouraged to seek guidance. The global community continues to explore myriad applications of emerging AI and other digital technologies. All staff should have AI literacy and keep abreast of the development and application of AI technologies and tools.

 1.      Academics and staff engaged in teaching 

▲  Academics and staff involved in educational instruction should engage in discussions with students to ensure their awareness of the University's policy and guidelines regarding the use of generative AI and other digital technologies. It is essential to communicate clearly to students the acceptable usage of AI tools within their particular academic context.

▲  Academics and staff, teachers in particular, should acquire the knowledge and skills to detect or judge how, and to what extent, AIGC has been used in students' work, by taking regular training programs or workshops organized by the University and their Faculty/School.

▲  Innovative pedagogies and diversified assessment methods, e.g., oral questions, short quizzes, and in-class cross-checking, should be adopted to ensure the originality and acceptability of students' work. It is generally recommended that AI tools for checking grammar not be allowed in courses with a learning outcome related to students' writing skills, and that take-home projects or reports be avoided where they count for a significant part of a course's formal mark. Each Faculty/School determines the acceptable format and percentage mark for course assessment based on its disciplinary characteristics and requirements.


 2.      Academics and staff engaged in research

 Academics and staff engaged in research should understand the capabilities and limitations of AI tools and ensure ethical use, data privacy, and network security. It is critical to maintain academic integrity by distinguishing and crediting AI-generated content; to foster collaboration, transparency, and continuous learning while staying updated on AI advancements; and to engage fully in responsible AI use and explore innovative applications for research and creative work. AI outputs in research results should be carefully evaluated and cross-verified against varied sources of reference, while adhering to institutional policies and contributing to innovative research.

 

Be aware of the reliability and effectiveness of generative AI tools:

     Caution is necessary regarding the authenticity of the output of generative AI tools, and it is appropriate to check the reliability and confirm the originality of AI-assisted work.

  ▲    Despite new data management features, there are no guarantees of privacy or confidentiality in generative AI tools. Treat input data as if it were public and avoid sharing personal, organizational, confidential, or copyrighted information. Strengthen data security awareness when using AI tools, ensuring that sensitive university data and personal privacy are protected by using the University's locally deployed WeMust models. Please consult the WeMust team if there is any doubt.

Examples of "prompts," which refer to the input text provided to the AI, can be found in the key references and Appendices of this document. The following principles and best practices on the use of AI for academic and administrative staff are suggested:

  AI and related digital technologies should be used responsibly and ethically. 

  Academic staff should be provided with training on the use of AI tools to develop new courses that meet educational challenges and to adapt teaching pedagogies and assessment skills to individual student needs. Training should include seminars, practical workshops, tutorials, and online resources to support staff in integrating AI into their teaching and assessment practices.

  Develop practical training programs and operational resources to support staff in teaching, research, and administrative roles.

▲  Academic and administrative staff should have basic training on effective and responsible AI application and ethical governance by taking up at least one training workshop relating to AI and/or digital technology and application organized by the University (e.g., EDC, HRO or ITDO) or Faculty/School per year. 

  AI can be used for formative and summative assessment, providing personalized feedback. 

In teaching and learning assessment, AI tools can be employed to conduct equitable evaluations that consider multiple factors such as classroom performance, oral questioning, and cross-verification, and to mitigate differential student access to AI text-refinement capabilities.

  Staff should keep abreast of developments in areas such as data management, AI tools and technologies, and online learning platforms, and should update course material related to interactive generative AI to keep teaching current and relevant.