Study
General information
Target group
Students of all degree programmes (Bachelor, Master, Magister, State Examination)
Personal responsibility and legal framework conditions for AI use
- Responsibility for the use of AI systems generally rests with the users themselves: they are responsible not only for the data they enter and the content generated when using the AI, but also for its storage, reproduction and distribution.
- On the one hand, this covers the data entered in a prompt or generated by the AI, since it may be stored not only locally but also on the AI operator's servers, possibly used for training purposes and thereby reproduced and distributed.
- On the other hand, it also covers data or content that users create in the course of their interaction with the AI and deliberately disseminate, e.g. the text of an email or information published on a website.
- In order to fulfil this responsibility, users are obliged to inform themselves about applicable regulations (e.g. data protection, copyright, AI regulation), to critically question the results and to use AI in a context-appropriate and ethically reflective manner.
- Further information can be found in the sections on data protection compliance and copyright compliance. These sections help to clarify possible uncertainties at an early stage.
Recommended AI applications with contractual data security
- Using free AI systems such as ChatGPT usually means that the entered data and the generated content are transferred to cloud storage; users therefore cannot control how the content is used by third parties, in particular for training AI models.
- For this reason, the use of the AI applications available at the UR (Microsoft Copilot and DeepL in the web editor) is recommended, as contractual agreements with the manufacturers restrict distribution and reproduction.
- This does not affect the obligation to comply with data protection, copyright and personality rights, nor the user's own responsibility.
Transparency and labelling obligation for AI use
- The use of AI systems must always be transparent. This means that the use of AI in the creation of content - regardless of whether it is text, images, audio, video or another format - must be disclosed, and AI-generated content must be clearly labelled as such.
- This applies both to content created entirely by an AI and to content significantly influenced or supplemented by an AI. The labelling makes clear that the content was not purely human-generated or created independently.
- The specific form of the transparency obligation - for example, the extent to which the use of AI must be indicated - may vary depending on the specifications of the respective lecturer or person responsible and must be taken into account accordingly.
- In order to fulfil this transparency obligation, AI-generated content must be critically reviewed, correctly classified and comprehensibly documented in the respective context before it is passed on or published.
Possible applications of generative AI in studies
Writing and structuring work
- The AI can be used for brainstorming, structuring and linguistic support.
- Example 1: Students have an AI suggest a wording for the transition between two chapters and then revise the content independently.
- Example 2: Students use an AI tool to revise the language of individual sentences from a draft text, for example with regard to readability and adherence to an academically appropriate language style.
- Example 3: Students use an AI tool to have the spelling and grammar of their text checked.
Content summaries
- AI can be used to prepare texts and aid comprehension. Publicly accessible texts (e.g. scientific publications, legal texts, other legally accessible content) may be used. Texts from GRIPS, for example, are not necessarily public; summarising them requires the authors' permission, as they are stored and reproduced in the process.
- Example: An AI is used to summarise a publicly accessible English-language journal article before working through the original text in detail. The summaries must be checked for accuracy.
Translation
- AI can be used to support the reading of foreign-language literature.
- Example: Students use DeepL for an initial translation of a publicly accessible French research article and compare the AI-generated text with the original.
Generation of programme code
- The AI can assist with code generation and debugging.
- Example 1: Students have Copilot write a draft of a Python function, make sure they understand its logic, and then revise the code independently.
- Example 2: Students have unit tests generated for code they have written themselves in order to check it. They extend these tests with relevant test cases and test data to ensure that their code behaves correctly.
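Example 2 above might look like this in practice: a short, student-written Python function together with AI-drafted unit tests that the student then extends with an edge case of their own. The function and all test data are illustrative, not taken from any real course material.

```python
import unittest

def word_frequencies(text):
    """Count how often each word occurs in a text, ignoring case (student-written)."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

class TestWordFrequencies(unittest.TestCase):
    # Tests initially drafted by an AI assistant, then reviewed by the student.
    def test_simple_sentence(self):
        self.assertEqual(word_frequencies("the cat and the dog"),
                         {"the": 2, "cat": 1, "and": 1, "dog": 1})

    def test_empty_string(self):
        self.assertEqual(word_frequencies(""), {})

    # Edge case added by the student after reviewing the AI-generated tests.
    def test_mixed_case(self):
        self.assertEqual(word_frequencies("Cat cat CAT"), {"cat": 3})
```

The tests can be run with `python -m unittest`. The important step is the last one: the student does not accept the generated tests as complete, but checks them and adds cases the AI missed.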
Exam preparation
- The AI can be used to simulate practice questions or to repeat content.
- Example 1: Typical multiple-choice questions on a psychology chapter can be generated and used for self-testing.
- Example 2: Students can load their own content into Copilot and simulate exam-relevant questions.
Study organisation
- AI can support the planning of learning phases, prioritisation and daily structures.
- Example: Students use an AI tool to create a learning plan for the exam phase, which they can customise.
Learning reflection through AI impulses
- AI can help students to reflect on their own argumentation or structure.
- Example: Students have Copilot comment on the structure of a term paper and check whether the comments are comprehensible and appropriate.
Training and operation of own language models
- For learning purposes, students can train and operate language models locally on their own computers, e.g. to generate content summaries or learning aids. This is permitted as long as the authors of the learning content do not expressly prohibit this type of use. It is important that the model runs only locally and that no data is transferred to external servers, as otherwise copyrighted or confidential content could be processed without authorisation (see the section on data protection compliance).
- Example: Students use a locally installed language model (e.g. GPT4All) to summarise lecture content. The generated summaries serve as a basis for revising the material, but are critically reviewed and used only for personal learning purposes.