Using LLMs for papers does not mean your brain is atrophying, though. There are lots of ways to challenge the mind even if you use LLMs to write some papers.
Sure. And there are new pedagogies that educators are trying out that help people learn even in the presence of these tools.
But a huge amount of "ugh I'm too smart for this assignment" complaining that students do is just kids being immature rather than an honest attempt at learning through other means.
> Using LLMs for papers does not mean your brain is atrophying, though.
It means that you are wasting your time. If you are a university student and use LLMs for your classes while "challenging your mind" with stuff outside of class, maybe you should just not be studying there in the first place.
Yes, I too enjoy that universities never send students to pointless filler courses. I really enjoyed writing long essays about ethics in software engineering for people who'd barely even read them. I especially benefited from being told that "tell your boss, tell HR, then make a police report and quit if that didn't fix it" is not an appropriate response to being asked to break the law at work.
While we're talking about things we're grateful for, I am so glad that we've structured the education and employment systems such that not having a degree puts you at significant risk of unemployment, prevents you from ever immigrating anywhere for the first decade of your working life, and generally marks you as a failure.
Do you...not understand that the class on ethics in software development was completely pointless because it was neutered by a university administration that was too afraid to commit to any real ethical position?
I thought I was pretty clear when I told you I got marked down for taking the stance that one should not commit crimes when asked to by one's boss.
The problem isn't that I was asked to produce an essay as part of the process of teaching me about ethics. The problem is that the entire class was pointless busywork taught by lecturers who weren't particularly ethical and who failed to teach any of us about ethics. ChatGPT could have done the entire thing and the class would still have failed to leave me with anything other than lingering resentment.
> Using LLMs for papers does not mean your brain is atrophying, though. There are lots of ways to challenge the mind even if you use LLMs to write some papers.
Writing is hard. Sometimes it means sitting with yourself, for hours, without any progress. Leaning on an LLM to ease through those tough moments is 100% short-circuiting the learning process.
To your point, maybe you're learning something else instead, like when/how to prompt an LLM or something. But you're definitely not learning how to write. Whether that's relevant is a separate discussion.
Lately I've been saying often the phrase "the process is the product." When you outsource the process, then the product will be fundamentally different from what you would have delivered on your own. In my own case of knowledge work, the value of the reports I write is not in the report itself (nobody ever reads them...) but rather the thinking that went into them and the hard-won wisdom and knowledge we created in our heads.
> Leaning on an LLM to ease through those tough moments is 100% short-circuiting the learning process.
Sounds like "back in my days" type of complaining. Do you have any evidence of this "100% reduction" or is it just "AI bad" bandwagoning?
> But you're definitely not learning how to write.
How would you know? You've never tested them. You're making a far-reaching assumption about someone's learning based on their use of an aid. It's the equivalent of saying "you're definitely not learning how to ride a bicycle if you use training wheels".
If they use an LLM for writing papers, they probably use it for other things as well. I have seen so many instances of adults skipping the step of asking "why" and "what" and going straight to "ask the LLM and trim backwards from there".
It's basically adults sending slop messages to each other. It really is atrophying.
You might be in a circle of people who want to know why things work. For example, when there's a bug, we go through a process of asking:
Why does it happen? What were they thinking when they wrote this? How do we prevent this from happening again?
This is true even for simple bugs, but nowadays you can just vibe code your way to a solution, asking the AI to fix it over and over without ever understanding how it works.
Perhaps it's just the way things are. I mean, who does calculations in their head nowadays? Who knows how to create a blurring effect in a physical drawing?
If you used a wheelchair every day, your legs would atrophy.
Regardless of the existence of other ways to exercise your legs, which you also will not do, because you're a person with working legs who chooses to use a wheelchair.
You're describing an extreme, so let me counter with a different "what if" extreme: if, every day, you use a wheelchair for five minutes and run for several hours followed by 200 squats, then your legs will not atrophy. My point is that writing papers is only one way to work your mind, and using LLMs for this does not indicate how you use your mind overall.