Experimental spaces for teaching

Artificial intelligence (AI) is part of everyday university life – including in studies and teaching. “There are still uncertainties, but now we are talking much more frequently about opportunities,” says Professor Pablo Pirnay-Dummer, the Prorector for Studies and Teaching. The initial fears were mainly that students would be able to cheat by having chatbots write their term papers. “The problem is, how can you check whether AI has been used? We need to concentrate more on highlighting its potential and creating more binding framework conditions for using it.”
These framework conditions were first defined in a 2023 guideline and were adapted to current developments in 2025. The working group “AI in Studies and Teaching” developed the foundations for this as part of the Rectorate Commission on the Future of Studies and Teaching, which is led by Pirnay-Dummer. The working group is made up of instructors from every faculty at the university as well as administrative staff members and students. The central message: AI is allowed at the university in principle; however, instructors decide at their own discretion on the extent of its use in courses and in written examinations. “Our position is that AI can provide support for studying and teaching and that we should use it for experimental spaces,” explains Pirnay-Dummer.
According to Pirnay-Dummer, instructors could benefit from generative AI as it can provide inspiration for the didactic preparation of content, assist in the formulation of supplementary exam questions, and provide creative impetus for the development of new teaching materials. And AI could help students create, translate and improve texts, explain complex topics, and provide constructive feedback. “We are finding that students are using AI more and more as a sparring partner in that they discuss their own solutions with the chatbot,” says Pirnay-Dummer. However, AI-generated content should be treated with a healthy dose of scepticism. It is therefore the responsibility of the university and its instructors to educate students in how to use AI.
Appropriately assessing performance when students use AI remains a challenge. The guideline suggests fundamentally rethinking unsupervised forms of testing, such as term papers and theses. “Some forms of assessing basic skills will certainly remain AI-free, for example in medicine or law,” says Pirnay-Dummer. “But it may already be useful to provide targeted support for reflective and productive assignments with AI tools.” For example, consideration could be given to whether additional oral interviews or group discussions would be a viable way to check if students have truly understood the content of their assignments.
This is also echoed by mathematician Professor Rebecca Waldecker. “Judging whether an assignment was done independently or was just copied is not a new phenomenon. A personal conversation was a way to provide clarity even before the rise of ChatGPT,” says the professor of algebra at the Institute of Mathematics who is also a member of the working group. She explicitly invites her students to engage with AI. “We should not underestimate the fact that it takes skills to work with AI. If you do not create the right prompts, you won’t get usable results. And even these have to be scrutinised critically.”
Waldecker regularly experiences this herself and publishes conversations with ChatGPT about mathematics on her own website. There her students can read what AI says, for example, about the ring of integers. “At first glance, there is little to criticise. But if you ask for details, things take a strange turn. This is when the chatbot starts providing answers that are incorrect. If you ask for proofs of certain statements, the AI gets more and more entangled in contradictions.” This can become problematic for inexperienced users because of the confident tone in which the AI delivers incorrect results and lists terms that have no place in that context, says Waldecker.
Despite this, Rebecca Waldecker emphasises the benefits of using AI. “AI provides unexpectedly creative ways to engage with issues. It enables new forms of group work and helps to structure tasks and organise work efficiently,” she says. “That can really enrich education and close the distance between instructors and students.” AI can be a helpful tool for instructors when preparing lectures or seminars and can support their own creativity, says Waldecker. “Sometimes it is the mundane things that I value most about AI, such as being able to create a presentation from bullet points.”
In a survey initiated by the working group “AI in Studies and Teaching” in summer semester 2025, well over 50 per cent of students at MLU reported having regularly used AI-based language models in their studies. “It’s not just a passing fad like some predicted,” says Dr Michael Gerth, spokesperson for the working group and managing director of the “Center for media-enhanced Learning and Teaching” (LLZ) at MLU. As a central institution of the university, the LLZ promotes the use of multimedia tools and formats for studying and teaching. Specifically, this involves supplementary digital methods for traditional in-person classes, such as digital exams or video recordings with AI-supported transcription. When it comes to digitisation in teaching, MLU has its finger on the pulse.
For example, until 2025, the collaborative project “eService Agentur Saxony-Anhalt”, or eSALSA for short, was based at the LLZ where it promoted innovation in university teaching for the entire state. The project is set to continue as a state initiative in line with target agreements.
The LLZ is also responsible for providing further training for instructors and students. “The AI Act, passed by the EU, also obliges the university to equip AI users with the appropriate skills,” explains Michael Gerth. “For two years we have been offering various courses for instructors, students and, in some cases, administrative staff that cover the general use of AI, how to use it in teaching, and special topics such as testing and exams.” The self-study formats were developed by eSALSA; two courses for instructors, however, take place in person. “Last year up to 300 people took part in the online courses. This shows how great the need for information continues to be,” says Gerth.
Further information in the LLZ Wiki:
https://wiki.llz.uni-halle.de/Portal:Künstliche_Intelligenz
AI at MLU
At MLU the LLZ has created an online lexicon on AI: the AI Wiki. It offers basic explanations for beginners, tips for creating successful prompts, links to guidelines and self-study courses, a video series on AI from the perspective of students, and information about legal issues. The AI Wiki also provides answers to frequently asked questions about “MLU-KI” (MLU AI) – the university’s internal AI access point that is still in the testing phase. “At the moment, MLU-KI uses three open-source language models as well as ChatGPT as a commercial provider under a common web interface,” says Michael Gerth. The university opted for this solution as a way to provide students and staff with free access to generative AI that is data protection-compliant.


