As ChatGPT scores B- in engineering, professors scramble to update courses

(2025/04/23)


Students are increasingly turning to AI to help them with coursework, leaving academics scrambling to adjust their teaching practices or debating how to ban it altogether.

But one professor likens AI to the arrival of the calculator in the classroom, and thinks the trick is to focus on teaching students how to reason through different ways to solve problems, and show them where AI may lead them astray.

Melkior Ornik, assistant professor in the Department of Aerospace Engineering at the University of Illinois Urbana-Champaign, said that over the past two years both colleagues and students have fretted that many students are using AI models to complete homework assignments.

So Ornik and PhD student Gokul Puthumanaillam came up with a plan to see if that was even possible.

Ornik explained, "What we said is, 'Okay let's assume that indeed the students are, or at least some students are, trying to get an amazing grade or trying to get an A without any knowledge whatsoever. Could they do that?'"

The academics ran a pilot study in one of the courses Ornik was teaching – a third-year undergraduate course on the mathematics of autonomous systems – to assess how a generative AI model would fare on course assignments and exams.

The results are documented in a [4]preprint paper, "The Lazy Student's Dream: ChatGPT Passing an Engineering Course on Its Own."

"In line with our concept of modeling the behavior of the 'ultimate lazy student' who wants to pass the course without any effort, we used the simplest free version of ChatGPT," said Ornik. "Overall, it performed very well, receiving a low B in the course."

But the AI model's performance varied with the type of assignment.

Ornik explained, "What I think was interesting in terms of thinking about how to adapt in the future is that while on average it did pretty decently – it got a B – there was still a significant disparity between the different types of problems that it could deal with or it couldn't deal with."

On closed-form problems – multiple-choice questions or straightforward calculations – OpenAI's ChatGPT, specifically GPT-4, did well, scoring almost 100 percent.

But when deeper thought was required, ChatGPT fared poorly.

"Questions that were more like 'hey, do something, try to think of how to solve this problem and then write about the possibilities for solving this problem and then show us some graphs that show whether your method works or doesn't,' it was significantly worse there," said Ornik. "And so in these what we call 'design projects' it got like a D level grade."

As Ornik sees it, the results offer some guidance about how educators should adjust their pedagogy to account for the expected use of AI on coursework.

The situation today, he argues, is analogous to the arrival of calculators in classrooms.

"Before calculators, people would do these trigonometric functions," Ornik explained. "They would have these books for logarithmic and trigonometric functions that would say, 'oh if you're looking for the value of sine of 1.65 turn to page 600 and it'll tell you the number.' Then of course that kind of got out of fashion and people stopped teaching students how to use this tool because now a bigger beast came to town. It was the calculator and it was maybe not perfect but decently competent. So we said, 'okay well I guess we'll trust this machine.'"
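The table lookup Ornik describes – turning to page 600 of a printed book to find the value of sine of 1.65 – is today a single library call. A minimal sketch (assuming, as with most such tables, that the argument is in radians):

```python
import math

# Before calculators, sin(1.65) meant a trip to a printed table of
# trigonometric functions; now it is one function call.
value = math.sin(1.65)
print(round(value, 4))  # 0.9969
```

The shift the professor points to is the same in both eras: the mechanical step (lookup or computation) is delegated to a tool, and what remains worth teaching is knowing when and why to apply it.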

"And so the real question that I want to deal with – and this is not a question that I can claim that I have any specific answer to – is what are things worth teaching? Is it that we should continue teaching the same stuff that we do now, even though it is solvable by AI, just because it is good for the students' cognitive health?

"Or is it that we should give up on some parts of this and we should instead focus on these high-level questions that might not be immediately solvable using AI? And I'm not sure that there's currently a consensus on that question."

Ornik said he's had discussions with colleagues from the University of Illinois' College of Education about why elementary school students are taught to do mental math and to memorize multiplication tables.

"The answer is well this is good for the development of their brain even though we know that they will have phones and calculators," he said. "It is still good for them just in terms of their future learning capability and future cognitive capabilities to teach that.

"So I think that this is a conversation that we should have. What are we teaching and why are we teaching it in this kind of new era of wide AI availability?"

[7]More and more CS students are interested in AI – and there aren't enough lecturers

[8]Professor freezes student grades after ChatGPT claimed AI wrote their papers

[9]Brit teachers are getting AI sidekicks to help with marking and lesson plans

[10]Survey: Over half of undergrads in UK are using AI in university assignments

Ornik said he sees three strategies for dealing with the issue. One is to treat AI as an adversary and conduct classes in a way that attempts to preclude the use of AI. That would require measures like oral exams and assignments designed to be difficult to complete with AI.

Another is to treat AI as a friend and simply teach students how to use AI.

"Then there's the third option which is perhaps the option that I'm kind of closest to, which is AI as a fact," said Ornik. "So it's a thing that's out there that the students will use outside of the bounds of oral exams or whatever. In real life, when they get into employment, they will use it. So what should we do in order to make that use responsible? Can we teach them to critically think about AI instead of either being afraid of it or just swallowing whatever it produces kind of without thinking?

"There's a challenge there. Students tend to over-trust computational tools and we should really be spending our time saying, 'hey you should use AI when it makes sense but you should also be sure that whatever it tells you is correct.'"

It might seem premature to take AI as a given in the absence of a business model that makes it sustainable – AI companies still spend more than they make.

Ornik acknowledged as much, noting that he's not an economist and therefore can't predict how things might go. He said the present feels a lot like the dot-com bubble around the year 2000.

"That's certainly the feeling that we get now where everything has AI," he said. "I was looking at barbecue grills – the barbecue is AI powered. I don't know what that really means. From the best that I could see, it's the same technology that has existed for like 30 years. They just call it AI."

[11]Engineers on the brink of extinction threaten entire tech ecosystems

Ornik also pointed to unresolved concerns related to AI models like data privacy and copyright.

While those issues get sorted, Ornik and a handful of colleagues at the University of Illinois are planning to collect data from a larger number of engineering courses with the assumption that generative AI will be a reality for students.

"We are now planning a larger study covering multiple courses, but also an exploration of how to change the course materials with AI's existence in mind: what are the things still worth learning?"

One of the goals, he explained, is "to develop a kind of critical thinking module, something that instructors could insert into their lectures that spends an hour or two telling students, 'hey there's this great thing it's called ChatGPT. Here are some of its capabilities, but also it can fail quite miserably. Here are some examples where it has failed quite miserably that are related to what we're doing in class.'"

Another goal is to experiment with changes in student assessments and in course material to adapt to the presence of generative AI.

"Quite likely there will be courses that need to be approached in different ways and sometimes the material will be worth saving but we'll just change the assignments," Ornik said. "And sometimes maybe the thinking is, 'hey, should we actually even be teaching this anymore?'" ®



[4] https://gradegpt.github.io/

[7] https://www.theregister.com/2022/07/07/comp_sci_students_ai_lecturers/

[8] https://www.theregister.com/2023/05/17/university_chatgpt_grades/

[9] https://www.theregister.com/2024/08/29/uk_ai_for_teachers/

[10] https://www.theregister.com/2024/02/05/ai_in_brief/

[11] https://www.theregister.com/2022/07/18/electrical_engineers_extinction/



But when deeper thought was required, ChatGPT fared poorly.

Neil Barnes

That might be - just guessing here, you understand - because it doesn't think?

Yes, kill multiple choice tests!

Jou (Mxyzptlk)

If the result of AI is to make multiple choice tests go away - something the USA is more fond of than Europe (where I am), China, India and so on - then yes! Finally a good use for AI! Get rid of that multiple choice crap!

Only make questions that require actual thought! Something Europe, China and India are more fond of. Start teaching knowledge instead of "competence". Oh, and while we are at it, make it free for all, because the USA misses out on quite a number of geniuses from lower-class families...

As for the calculation questions: OK, I have to grant that "Midterm 1", "Midterm 2" and the "Final exam" would have been easy questions for me 30 years ago at my Abitur (the German qualification between High School and University level). Today I needed more than a minute just to recognize: "Oh, wait, all that is stuff I once learned, and I was even good at it! At least I still understand the questions."... Today it would take quite a while to solve those - I might go at it in retirement, just for fun.

The design project is a bit beyond my capabilities, though; I would need more engineering training. Or do it the Adam Savage way: build, iterate, correct, build again, iterate again, correct again until the result is good.

Deus Ex Machina

NapTime ForTruth

"And sometimes maybe the thinking is, 'hey, should we actually even be teaching this anymore?'"

The answer to this question is almost always "Yes", because education is not strictly about getting the right answer but about understanding *why the answer is right* and *what to do with the answer now you have it*. Without that understanding no learning occurs; it's just blind acceptance and recitation.

When we fail to understand how and why the machines - be they mechanical or electronic or digital or quantum - and their results work, we fall victim to the machines; we become their hostages. Much worse than that, though, we become stupid, less knowledgeable and less curious about the world around us, and less able to interact with or change it.

There's the old story about a village built around a machine that has all the answers. The machine was gifted to them by a neighboring village. One day there is unrest, rumblings of discord and even war among the various villages. The local villagers gather around the machine and ask what should be done. The machine replies "5".

"Five!", the villagers exclaim, "We must five!"

Many voices repeat the number, "Five! Five everyone! We must five!" Heads nod, there is much congratulatory back-patting, smiles of relief and agreement. "We need only to five."

And an old person, the village crank, a generally disagreeable pessimist of sorts who routinely complains about change and novelty and the weather, says, "What the hell does five mean, you bumbling nitwits? How do we 'five' something?"

The question is met with equal parts confusion and jeers, "Ah, you always find something to complain about, you old crank! Never happy with the status quo, always doing things the hard way!"

"Look, we'll just ask the machine!"

And they do ask the machine how to five.

The machine replies, "Four".

"Three".

"Two".

...

Re: Deus Ex Machina

Like a badger

"And an old person, the village crank, a generally disagreeable pessimist of sorts who routinely complains about change and novelty and the weather, says..."

Like all of us, you mean?

Maybe

Boris the Cockroach

Course work is not the be-all and end-all that some in the education system seem to think it is.

Results by exam(s) would be better (no smartphones, dumb calculators only).

Icon because students need to learn how to deal with a problem under time pressure and with no backup (debugging non-working code while the boss is screaming about a non-functioning website/database/robot/flight control/nuclear reactor safety system is also a good teacher of how to work under pressure...)

Re: Maybe

Jou (Mxyzptlk)

> while the boss is screaming

You should have read enough "Who, Me?" and "On Call" to know to get away from such a boss as soon as possible - possibly the day he starts screaming like that. Depending on the country you live in, you actually can. As far as Germany is concerned: if the boss misbehaves you can leave instantly via "Außerordentliche Kündigung" (extraordinary termination), based on gross misconduct.
