AI in studies and science – opportunities and problems

Close-up of a futuristic prosthetic robot arm being tested by a professional development engineer.
© Getty Images/anon-tae

AI - A technical milestone or a danger for science?

Ten years from now, when we look back on the winter semester of 2022, it could mark a milestone in the history of university studies – as a semester that fundamentally and rapidly changed the way young people study. The reason for this change is the use of generative artificial intelligence (AI): that is, the use of AI models that are capable of generating content on their own, be it text, images or sound. 

Abanda Pacilia has closely observed the change that began with the launch of the AI tool ChatGPT. “When students started using it, it wasn’t a big issue at first,” says the Cameroonian, who is pursuing a master’s degree in artificial intelligence at Brandenburg University of Technology Cottbus-Senftenberg. But then more and more professors noticed that something wasn’t right. In presentations, for example, it often became apparent that students had not really understood the topic they were presenting. No wonder: the content of their presentations and written assignments had not been compiled by the students themselves but generated by the AI tool.

The professors then declared invalid any assignments and presentations whose content was obviously not produced by the students, Abanda reports. But that doesn’t mean students are now prohibited from using the tool for studying. “The professors encouraged us to use the AI as a tool to do the assignment more effectively – but not to produce finished results,” says the 27-year-old student.

She thinks this is the right approach. “I think the use of ChatGPT made many students a bit lazy.” Many simply used the content the tool delivered to them – without trying to understand the background.

Artificial intelligence in studies?

Yet ChatGPT and similar tools can actually help students understand study content even better, finds Abanda Pacilia. She uses ChatGPT as a way of approaching complex topics when writing term papers, for example. Recently she was able to obtain an explanation of individual algorithms that were important for a statistics term paper.

Abanda says ChatGPT and the like are also helpful in getting suggestions for the structure of term papers. She gets tips on what should basically go into the introduction to a particular topic, and which aspects and sub-aspects of a particular question should be considered later. “You can build on this and then create the content for the individual sections yourself,” says the student.

Overall, Abanda finds AI tools like ChatGPT very helpful. “They allow me to get my tasks done faster,” she says. That’s also what fundamentally excites her about AI. “It’s a technology that can help us a lot in our daily lives: whether it’s in our studies, at home, or even in business,” she finds.

How can AI help advance the natural sciences?

Xuemei Gu also uses AI tools every day. She is an Alexander von Humboldt Postdoctoral Fellow at the Max Planck Institute for the Physics of Light in Erlangen. There, she is a member of a research group investigating how AI can help make conceptual advances in physics, especially quantum physics and quantum optics. For example, she uses ChatGPT to create and improve the programming code for algorithms she is working on. “It can give me a lot of valuable suggestions to speed up my original code and offer solutions for reducing memory usage that I was not aware of before,” she says. In this way, the programming support often saves her a lot of time.
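As a hypothetical illustration of the kind of memory-saving suggestion such a tool might make (the specific code here is invented for this example, not taken from her research): in Python, replacing a list comprehension with a generator expression avoids storing all intermediate values in memory at once.

```python
# Original version: builds a full list in memory just to sum it.
def sum_of_squares_list(n):
    squares = [i * i for i in range(n)]  # stores all n values at once
    return sum(squares)

# Suggested rewrite: a generator expression computes each square
# on the fly, so memory use stays constant regardless of n.
def sum_of_squares_gen(n):
    return sum(i * i for i in range(n))

print(sum_of_squares_gen(1000))  # same result as the list version
```

Both functions return the same value; the second simply trades a large temporary list for a lazily evaluated stream of values.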

The code she creates goes into the AI applications she is working on in Erlangen. For example, she is currently developing a tool that could suggest high-impact cross-disciplinary ideas and collaborations for scientists. “It aims to help inspire researchers and accelerate scientific progress.” At the same time, Xuemei recognizes that the technology might have its downsides. “If researchers only rely on the tool’s suggestions, there is a risk that they might all focus on the same topics,” she says. “However, this isn’t a flaw of the tool; it’s more about how it is used by the researchers,” she explains.

In another project, where she is developing an AI to explore new quantum optical techniques for telescopes and microscopes, she sees a different challenge: how an AI arrives at its results is often a black box. The solutions it provides aren’t always clear for humans to understand. “So you have to ask yourself whether you trust it,” Xuemei says. “In some cases, a certain level of expertise is necessary for an accurate evaluation.”

It’s also important to know what data an AI has been trained on. For example, the freely available version of ChatGPT draws on information that was available up to September 2021. “So if you’re working on a very current topic, it doesn’t make sense to use the tool,” says Abanda Pacilia. “This is because the information can become outdated very quickly.” For this reason, she finds it important to also be able to access current research papers from databases.

Ultimately, then, it depends on what you use AI tools for, Abanda finds. So completely banning their use at university is not a good solution, she believes. “The tools can help find new approaches to solving problems – and also achieve better results.”
