
The main issue that is not addressed is that students need points to pass their subjects and get a high school diploma. LLMs are a magical shortcut to these points for many students, and therefore very tempting to use, for a number of understandable reasons (time shortage, laziness, fatigue, lack of comprehension, insecurity, parental pressure, status, etc.). This is the current, urgent problem with ChatGPT in schools, and it is not being addressed well.

Anyone who has spent some time with ChatGPT knows that the 'show your work' (plan, outline, draft, etc.) argument is moot, because AI can retroactively produce all of these earlier drafts and plans.



\devil's advocate: If the augmented-student pair performs at the required level, what's the problem? The test should be how good they are at using LLMs. Tools should be absorbed.

Similar to today, when grades are a proxy for ability: private tutoring puts an average student in the top 2% (Bloom's 2σ problem) for that stage, but doesn't boost the student's general intelligence for the next stage. Likewise, hard work, self-discipline, and focus will increase grades but not general intelligence. (Of course, students do learn that specific stage, necessary for the next stage, so this criticism only applies to grades used as a proxy.)

We might say that what is really being evaluated is the ability to get good grades (duh) - whether through wealth or work.

The same argument can be applied to LLMs. Using them is an important ability... so let's test that. This is the future. Similar to calculators and open-book exams.


I won't speak to upper-division courses, but in the introductory computing courses, I'm dealing with students who still don't know the fundamentals but want to use LLMs to "augment" their skills. But it's like trying to rely on a calculator before you learn how to do addition by hand... and the calculator sometimes misfires. They don't know enough to debug the stuff coming out, because they don't yet have the fundamental problem-solving skills or understand the core programming techniques. In the hands of an expert (or even someone with moderate knowledge), I think these tools can be great. But there needs to be a period of development WITHOUT these tools too. "If you're nothing without the suit, then you shouldn't have it."


> But it's like trying to rely on a calculator before you learn how to do addition by hand

I use salt to season my food but I have no idea how salt is mined. I have used books of log tables in the past to do math work. Back when I first used these lookup books, I had a shaky understanding of logarithms.


> I use salt to season my food but I have no idea how salt is mined.

Culinary counterpoint: I recently came across a very entertaining subreddit [0] which collects screenshots of scathing reviews of online recipes, in which the reviewer admits to having made changes to the recipe that are often clearly the reason for their bad results.

What I'm getting at is that you may not know how salt is mined, but you are certainly aware that sugar and flour are never substitutes for salt, however white and powdery they all may be. You also know that skipping the salt will completely ruin the recipe, but skipping the nutmeg will not. You probably safely assume that you can replace table salt with kosher salt, or table salt with coarse rock salt (provided it will dissolve in cooking). However, you hopefully know that if the recipe calls for kosher salt you may not be able to use iodised table salt instead (if doing a fermentation), and that table salt is not a happy substitute for coarse salt for finishing.

You absolutely do need some sort of basic understanding of a process to carry it out successfully, even when you have the full instructions available or a helper tool, because reality is never fully reflected in theoretical descriptions. That's why you must know the very basics of cooking before following a recipe, the basics of arithmetic before using a calculator, and yes, the basics of critical thinking and background knowledge before productively using an LLM.

[0] https://old.reddit.com/r/ididnthaveeggs/top/?sort=top&t=all


also https://slate.com/human-interest/2012/05/how-to-cook-onions-... (onions take more than 10 minutes to caramelize, but people filter for short cooking time, so the lie propagates)


> You probably safely assume that you can replace table salt with kosher salt, or table salt with coarse rock salt (provided it will dissolve in cooking). However, you hopefully know that if the recipe calls for kosher salt you may not be able to use iodised table salt instead (if doing a fermentation), and that table salt is not a happy substitute for coarse salt for finishing.

Hopefully he'd also know that 99% of recipes that use kosher salt (presumably because it sounds fancier) will work just fine with table salt (iodised or not). For most uses the grain size will not make any difference at all, and for the remaining ones it is still not essential.


The main question IMO is this:

> when I first used these lookup books, I had a shaky understanding of logarithms.

This implies that now you do have an understanding of logarithms. How did you acquire this understanding?

The fear that many here share is that LLMs will let too many people get by without learning anything substantial from the ground up. Take away their LLM access and they're completely useless - at least that's what people are afraid of. Myself, I'm on the fence - it was always the case that a certain type of person seeks to understand how things actually work, while others just want to apply recipes. These trends will likely make the recipe-appliers more powerful at first. I'm afraid, though, that soon enough they will be driven out by further automation coming on top.


> This implies that now you do have an understanding of logarithms. How did you acquire this understanding?

Not knowing logarithms (a) didn't prevent me from successfully using the log table books, and (b) using the log table books didn't prevent me from developing a better understanding of logarithms. I expect something similar will happen with GPT-based education.

I acquired a deeper understanding of logarithms through a bit of reflection and thought.

The math lessons and teachers didn't help illuminate logarithms in any significant way. They simply rattled off the theory of logarithms and moved on to teaching us how to use them. I had to go out of my way to understand it myself.

Is this any different from a GPT giving you a superficial, regurgitated level of knowledge which you then have to deepen through some effort on your part?


It's not any different, but in my opinion neither is good.

Someone should teach you how logarithms work before giving you a log table. Someone at some point should show you a trigonometric circle before telling you to press the cos button in the calculator to calculate a cosine.

That said, some"one" could be a GPT chatbot. The problem is that you might not know the right question to ask.
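For concreteness, the mechanism a log table exploits is the identity log(a*b) = log(a) + log(b): look up two logs, add them, take the antilog. A toy sketch of the idea in Python (my own illustration, approximating a four-figure table by rounding):

    import math

    # Stand-in for looking a value up in a printed four-figure log table.
    def table_log(x: float) -> float:
        return round(math.log10(x), 4)

    # The identity behind the table: log(a*b) = log(a) + log(b).
    # Look up both logs, add them, then take the antilog (10**x).
    def multiply_via_logs(a: float, b: float) -> float:
        return 10 ** (table_log(a) + table_log(b))

    print(multiply_via_logs(12, 55))  # ~660, off slightly due to table rounding

Knowing that identity is what lets you sanity-check a lookup; without it, a misread row is just a wrong answer you can't catch.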


Yes, agreed. Again though, the question is what jobs will be left in a couple of years for anyone besides people like you, who went and sought out the foundational knowledge needed to actually check the work the LLMs are going to do. And from my experience it needs a lot of checking.


These are not at all comparable? This would be more like seasoning with salt in predefined amounts and not knowing how to season to taste, which will fall apart the moment you deviate from the recipe or change up the seasoning mix at all.

ChatGPT might tell you how much salt to use but you'll be screwed when it tells you to use 5 tablespoons for a small steak and you don't have the fundamentals to question it or figure out where it went wrong.


Salt never accidentally turns itself into cyanide. This argument is moot so long as LLMs hallucinate.

LLMs are not a source of reliable knowledge as a result; at best they are augmented with a knowledge base. So really you are arguing students should read the knowledge base directly - the "augment" is just a thin conversational summarizing veneer.


Similar to basic arithmetic?

Perhaps the experience with symbolic calculators (like Mathematica) is informative, since surely, for many years now, students have been able to have them do their homework. How do teachers handle that?


They give an exam where Mathematica is not allowed. The ones that didn’t do the homework are in trouble.


Huh, straightforward. So it just becomes the old problem of getting students to do their homework without cheating.

\aside Some exams allow calculators and textbooks, and are really hard. Consider: what would an exam assess if LLMs were allowed?


25 years ago, in the era after the TI-92 and HP-48, we would sometimes get in-class written assessments without calculators in HS.

University would just ban them for maths.


Writing is as much about organizing your thoughts and learning how to build an argument as it is about putting words on the page. It is about learning how to think. If students rely on an LLM, they will never get a chance to practice this essential skill, and, in my opinion, will be a lot dumber as a result.


My thoughts are along similar lines: People learn to think, to inquire, to persuade, etc. We don't know precisely how. Maybe it's correlated to education, but even the strength of a causal link is debatable.

I certainly don't oppose reforming contemporary education, but at the same time, letting it be replaced by something else by default because that thing has exponentially more engagement power invokes Chesterton's Fence. I'm not sure we even know what we're giving up.


Education has been in dire need of reform for a long time. The writing was on the wall even with the advent of the Internet. We now have supercomputers in our pockets and hyperintelligent assistants at our fingertips.

Chesterton could certainly not have predicted the exponential growth in access to tooling and knowledge. And we're only at the start of the curve.

In my opinion we should go back to absolute basics. Forget grades. Focus on health, language, critical thinking, tool usage, and creativity. Skills that are intrinsic to humans.

Make sure education is fun with lots of play. The main advantage humans have over computers is empathy and creativity. I'm not sure AI will ever "get it".

Provide each student, to the extent possible, a path to follow their own curiosity and talents, with advanced maths, programming, writing, chemistry, physics, etc. available to those interested, even at a young age.

But the baseline education should focus on learning the absolute minimum to survive and otherwise maximize fun, creativity, and empathy.


GPT-4 can already convincingly analyse and explain why a drawn comic is funny or ironic. It's really unbelievable when you see it do that.


For me, and I assume others, the act of writing is an important part of the learning process. Writing on paper even more-so than typing on a keyboard.

Writing forces me to organize my thoughts, summarize, elaborate, reference other parts of the text, iterate until the pure essence is clear, remove redundancies, and this cements the concepts in my mind.

Merely reading, or worse, hearing, or worse still, copy/pasting something is only the first part of learning.

It's similar with programming but I would take it even further. I never really understand complicated code until I've written tests and run the debugger and been very close to it.

An AI chat bot is a powerful tool, but if you just use it to generate assignments you won't learn much. Inevitably it will be used both well and poorly and the results will be mixed.


> Writing on paper even more-so than typing on a keyboard.

How do you get to that conclusion? I find that if I have a text editor, I can write my thoughts down and then visually put them in order or encase them in a more general concept with ease, which I couldn't do when writing on paper.


It's worth noting that they were speaking about their personal experience in that paragraph. So probably for them, the "how do you get to that conclusion" is "trial and error".

But I've noticed that many many many people report the same effect, that there's something about pen-and-paper writing that's more effective for thought-lubrication. I resisted for a long time, but now I too am a convert to this school of thought.

Similarly almost everyone notices the downside: it's easier to reorder, reorganize, cross-link, etc, those thoughts, in a text editor (to say nothing of more sophisticated software tools). Some people have systems for doing complicated things with paper that they say mitigates this downside, but I am not currently one of them.

I guess it's possible that your brain just doesn't have this pattern in it. (That is, the pattern of finding pen-and-paper more effective for getting the thoughts to flow.) I mean, for all I know, maybe the huge silent majority doesn't have this pattern.


> there's something about pen-and-paper writing that's more effective for thought-lubrication

In my personal experience, a helpful feature of pen and paper is that it is less effective than keyboards, it takes me more time and focus to write things down. Maybe this gives the rest of my brain more time to catch up and understand the things that I am writing.

Handwritten text is also less efficient when it comes to searching. This forces me to organize my thoughts better because I know I won't be able to CTRL+F random keywords later.


Another benefit of typed text is search, especially across documents. It's also great for spellchecking and rearranging phrases, sentences, and paragraphs.

But handwriting enables arrows, lines, crossing-out, small-text notes, circling, variable pressure, colour, etc. That's richer than ordering/indenting text. Also higher contrast.


I will go even further. The physical sensation of pen on paper ... including the texture and pressure you apply to key words, capitalization, underlines, etc ... all of it being fed back though your muscles into your brain and getting processed/stored/intertwined with those very ideas and thoughts you are putting on paper ... and the mental faculties you allocate to aligning the text against the margins, ensuring neat spacing, etc ... puts your brain in a zone better tuned to the task at hand IMO.

Some of it may be just ... overloading your brain so it cannot think of anything else ... so you stay focused for lack of choice.


I mostly came to this conclusion in math classes at uni. Handwriting my notes was far superior (in terms of knowledge retained) to using my laptop. My recall was excellent. You could argue that writing math is hard in a text editor (due to the required symbols) but I think it was deeper than that. Writing on paper requires more mental focus than smashing keys, it takes longer and that feels good when you're digesting abstract concepts.

I don't write code on paper for (probably) obvious reasons, and I tend to write essays in a text editor although I also enjoy the act of writing on paper in that situation.


Different people undoubtedly come to different conclusions on this.

Personally, I find while the computer provides powerful writing tools, it also provides powerful distractions.

Maybe you get notifications you just quickly want to check; that slack message from the boss could be urgent. Maybe you decide to just check you're using that obscure word right, or to research a detail for your writing, and an hour later you haven't written the number of pages you set as your goal. Or maybe sitting in your netflix-watching chair looking at your netflix-watching screen just doesn't put you in the right mindset.


My habit for college essays was to write scratch notes on paper, with lots of bullet points, and arrows for re-arranging text, to get the outline of the essay in place, but to actually put the real words together on the computer.

I remember my mother once doing the opposite: she wrote a long letter on the computer, so she could edit and re-write until she was happy with it, then printed it out and copied it by hand, for the personal touch of a hand-written letter (a peacemaking letter to a relative).


> Writing is as much about organizing your thoughts and learning how to build an argument as it is about putting words on the page.

If you cannot organize your thoughts, explain your reasoning, etc., then you're not going to get very far leveraging an LLM. Sure, it'll spit out a book report, but unless you can explain, in well-structured writing, what you're looking for, you're not going to get what you need for the vast majority of writing assignments.


I think writing for yourself is very different than writing for school or for someone else.

I suffered through writing classes for years and it made me hate writing.

It was only when I started journaling that I started liking it, and liking that I got better at it.

Is it worth making students better at writing if it means as adults they'll not want to write again?


Essays are often about organizing others' thoughts, from references and other source material, along with your own thesis based on them.

Organizing and arguing your own thoughts would be a good test. I think an assisting tool could still be reasonable - choosing between different organizations. Though it's unclear how to assess for original thought - the only such "essays" I know of are PhD theses.

People said similar things about log tables and slide rules. And they were right - something was lost (e.g. the sense of nearby solutions). Yet here we are.


ChatGPT and writing are a match made in heaven though. Think of it as a faster, lower-latency, more insightful spell checker.


> If the augmented-student pair performs at the required level, what's the problem?

There is one legal problem with the AI-student pair: the student doesn't own the copyright on what was produced with the AI. Meaning, any work submitted by the student that was at least partially generated by an AI is legally not 100% produced by the student.

So the comparison with an open-book exam or using a calculator doesn't hold: if I search for information in a book, or use a calculator to compute a number, and make my own report in the end, I own the resulting product. I'm the sole author of that product. If I use ChatGPT, ChatGPT is the co-author of my work.

So, using ChatGPT is the equivalent of calling your dad during the exam and asking him to answer the questions for you. Is that really what we want to evaluate?


If you use a calculator, you did not perform the calculation; the calculator did. In a similar vein, an LLM could be seen as a calculator for text. With a calculator, you give it a task like: perform the calculation 12*55. With an LLM, you give it a task like: write an outline of an essay on topic X. In both cases you used a tool to perform the task; the only difference is that the tools are becoming more powerful.

Still, learning to calculate without using a calculator and learning to write without using an LLM are in themselves useful skills that can improve the thinking process, so both should be taught.


That's not what the recent court decisions said. If I'm using my calculator to compute 12*55 and write "the result of 12*55 is 660", I own the copyright on that sentence. I'm the author. If I submit that sentence to my teacher, I submit what is legally my own work.

If I ask ChatGPT "i need to compute 12 * 55. Can you help me?" I get the following result:

"To compute 12 multiplied by 55, you simply multiply the two numbers together:

12 * 55 = 660

So, 12 multiplied by 55 equals 660."

I don't own the copyright on that paragraph. Nobody does. It's public domain. Meaning, I'm legally not the author of that work. If I give that to my teacher, I submit something I'm not the author of. It's legally the same as copy-pasting a block of text from a book or from the web and pretending I wrote it. That's not something teachers want to evaluate. In fact, not only will I fail my exam, but I'm also legally in trouble.

Now, if I take ChatGPT's output and make my own content out of it, for instance rephrasing it as "according to ChatGPT, the result of 12 multiplied by 55 is 660", then it's my own work again, and ChatGPT was just a source.

As a teacher, I cannot accept answers that are not produced by students. Whether they are produced by dad, by a domain expert who wrote a book, by Wikipedia authors, or by ChatGPT. But I can accept personal works that were inspired by those sources. Big, big difference.


You are conflating copyright and plagiarism rules at school. Copyright stems from the copyright act; plagiarism rules stem from the academic code of conduct. Any similarity between the two is coincidental. What a teacher can accept or not accept has nothing to do with copyright.

And nobody is ever in "legal trouble" for a copyright violation in a typical school assignment... because unless the student work is publicly published, the owner of the copyright is never going to know what you turned in to your teacher. It's a tree falling in the forest, with nobody present to hear it.


Different countries go by different laws I guess. Here in France the rule is "no plagiarism", and even if there is no legal definition of "plagiarism", the usual definition is "you must submit your own work and, when using something that is not your own work, you must quote it correctly."

If I'm asking you "write a program that solves this problem", or "write an essay about that topic", you can certainly find a solution online that has a very permissive licence letting you use it any way you want. Good for you, but that's still not your own work and will potentially put you in deep trouble if you use it. Ditto with anything produced by ChatGPT: not your work. You don't own that. You can write "according to ChatGPT, ..." (although you probably won't impress the teacher with that), or you can get inspiration from the output to produce your own work, but not use it as is and pretend you did it.

> And nobody is ever in "legal trouble" for a copyright violation in a typical school assignment... because unless the student work is publically published, the owner of the copyright is never going to know what you turned into your teacher.

Many universities automatically run a plagiarism checker on anything submitted by students. Sometimes one of them gets caught. In France, that's enough, in the worst-case scenario (although that rarely happens), to be banned from taking any public exam or working for the government, for your whole life.
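(Aside: the core of such checkers is often little more than n-gram overlap. A toy sketch in Python - my own illustration, not any vendor's actual method:)

    def ngrams(text: str, n: int = 5) -> set:
        # Word n-grams: the unit most checkers compare.
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap(a: str, b: str, n: int = 5) -> float:
        # Jaccard similarity over word n-grams: near 0 for unrelated
        # texts, climbing quickly when passages are copied verbatim.
        ga, gb = ngrams(a, n), ngrams(b, n)
        return len(ga & gb) / max(len(ga | gb), 1)

Real systems add stemming, fingerprinting, and huge reference corpora, but verbatim n-gram matches are the backbone of the signal.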


I'm not disagreeing with what you just wrote, but I am pointing out plagiarism and copyright violations are not the same thing. Any similarity is coincidental.

Consider a film student who uses a modern pop song without permission in their student film, which they credit in the movie credits. No plagiarism has occurred -- but they did violate the copyright of the band's publisher.

Consider a student who finds an essay written in 1893 and passes the words in the essay off as their own -- plagiarism has occurred, but there is no copyright violation, as works from 1893 are public domain.


> We might say that what is really being evaluated is the ability to get good grades (duh) - whether through wealth or work.

Isn't that a good thing? Grades shouldn't be an IQ test. IQ is pretty much meaningless unless you're significantly below average or doing super-specialized tasks. Getting good grades means you can sit your ass in a seat (be it at a public library or with a private tutor) and learn enough to do well on some task.

Yes, good grades don't necessarily mean you'd be good at some task. But they show you can succeed in something that requires effort.


Ultimately a potential problem lies further downstream in the reduction of creation of original work and knowledge.

If you become an expert in performing via LLM, the LLM's capabilities and underlying data represent the limits of knowledge creation.

To your point about calculators and open-book exams, part of the challenge is for educators to rethink learning objectives and assessments in terms of outcomes that are outside the scope of LLMs.


Knowledge & skill are important, not just general intelligence.

We need kids to know how to write, and how to elaborate their thoughts, for when they will need to do so in life.

The LLMs can't guess what points you want to make to your boss or to your colleagues about the complex professional context you are in.


This. No matter how intelligent you are, you cannot make connections between things you don't know about. If you externalise all knowledge, you are ultimately just an extension of that knowledge source.


> If the augmented-student pair performs at the required level, what's the problem?

The problem is that they have none of the general knowledge themselves, and their brain is exclusively optimised to seek out OpenAI interfaces for guidance. They'll be a drone, an AI zombie.


Current education is a ranking signal to see if we should continue investing in a student. Adding LLMs into things reduces the clarity of that signal, because the purpose is not to assess the quality of the student's work, but the capabilities of the student.


You need the concepts in your head to have intuition. Bouncing each idea all the way out to an external resource is too slow.

Can you fake it with an LLM for a while? Sure. But you hit a point where you need to know the right things to prompt with, and how to evaluate its output.


If you're testing for the skill of using LLMs, then no problem.


Let me try another way: if a teacher requires students to write an article about Nazis, and a generated article passes the class, does this fit the teacher's purpose?


I suspect it’s not being addressed well because it’s one of the fundamental challenges of school in the first place. For many, assessment and grades are the end goal, and any learning that happens is secondary.


The status quo is a miserable mess, but consider that “assessment and grades” are the best apparent evidence of the ultimate goal, “learning”. Is that not reasonable for people who pay for education to ask for?

If it is reasonable, then the problem is likely the form of evidence and not its requirement per se.


What barbarous society makes people pay for education?


Every society where education is available. There are different approaches to who covers the bills and how, but someone must; if no one else is covering the costs, the educators do.


What utopia has teachers who teach for free and buildings that materialize from thin air?


One whose residents value continued education differently from one another, and thus leave the costs to the individual so they can make their own decision. What kind of world makes a poor person working a manual job pay for someone else going to college?


Then tax the rich properly, and they'll bear the lion's share of anything publicly funded.

Ruining public schooling by underfunding it while keeping expensive private education for those who can afford it is just the neoliberal agenda. If there were no private schools, if they were illegal, along with home schooling -- not saying they should be, but if they were -- the wealthy would pump funds into the public schooling system before you can say "fuck you got mine". (Kinda like no private bunkers would mean a sudden interest in mitigating climate change globally and for everyone, hah)


Every society since Adam & Eve when knowledge grew on trees.


Are educators not paid? From whence come those wages?

I only meant to acknowledge the societal investment, not to imply private education.


Not sure what the problem is: just have a test every Friday in class... no computers. Make them 50% of the grade.


That won't work in many national and international systems, due to externally prescribed exam or coursework conditions.


I think there'll be a return of oral tests. Looks like a whole generation is going to get good at handwritten essays, whiteboard math, and coding from day one.


I think your argument is similar to the one we had with calculators and later with the Internet. I think ChatGPT is another tool. For sure there are going to be lazy people who use it and won't learn anything, but it is also sure to be a boost for many people. We will adapt.


Calculators solve problems that have exactly one correct answer. You cannot plagiarize a calculator. They are easy to incorporate into a math curriculum while ensuring that it stays educationally valuable to the students.

LLMs, the internet, even physical books all tend to deal primarily with subjective matters that can be plagiarized. They're not fundamentally different from each other; the more advanced technologies like search engines or LLMs simply make it easier to find relevant content that can be copied. They actually remove the need for students to think for themselves in a way calculators never did. LLMs just make it so easy to commit plagiarism that the system is starting to break down. Plagiarism was always a problem, but it used to be rare enough that the education system could sort of tolerate it.


I argue that calculators are overtly harmful to arithmetic prowess. In summary, they atrophy mental arithmetic ability and discourage practice of basic skills.

It pains me (though that's my problem) to see people pull out a calculator (worse, a phone) to solve e.g., a multiplication of two single digit numbers.


Sure, calculators made people worse at mental arithmetic, but arithmetic is mechanical. It's helpful sometimes, but it's not intellectually stimulating and it doesn't require much intelligence. Mathematicians don't give a shit about arithmetic. They're busy thinking about much more important things.

Synthesizing an original thesis, like what people are supposed to do in writing essays, is totally different. It's a fundamental life skill people will need in all sorts of contexts, and using an LLM to do it for you takes away your intellectual agency in a way that using a calculator doesn't.


Engineers care about arithmetic. Carpenters do too. Any number of other creative endeavors require (or at least are dramatically improved by) the ability to make basic calculations (even if approximate) quickly in your head.

Arithmetic is the "write one sentence" of composition. The ability to think through a series of calculations with real-world context and consequences is the 5-paragraph essay. If you're not competent with the basics, you won't be able to accomplish the more advanced skill. Being tied to a calculator (not merely using, but being unable to not use) takes away intellectual agency in the same way as an LLM-generated essay (though, I'll agree, to a lesser degree).
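To make that concrete with a worked example of my own: the kind of mental move at stake is

    18 * 22 = (20 - 2)(20 + 2) = 20^2 - 2^2 = 400 - 4 = 396

a difference-of-squares shortcut a carpenter might run while estimating material. Someone who reaches for a phone here never builds that fluency.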


> Mathematicians don't give a shit about arithmetic

Sure, once you know how to multiply you don't care about it. But try learning first-year CS math without perfect command of the multiplication table.


Exactly. My wife tutors kids at the high school who never mastered arithmetic and are trying to learn algebra. It's hopeless.


That was true before calculators too. Correlation isn't causation.


I'm not sure what you mean. These kids can't do arithmetic without a calculator. While it was possible to simply not learn arithmetic before calculators, it wasn't possible to hobble onward using the calculator as a crutch.


If those kids were truly applying themselves to the algebra, I think they'd quickly internalize arithmetic too as they used it. But whatever reason led those kids to not do arithmetic without a calculator could well be a reason they don't do well at more advanced math.


My point is failing to learn the basics is a huge hurdle to learning more advanced things. You posit that one could learn the basics and the advanced math at the same time. Maybe, but that would clearly be harder than doing them in order.

Fluency in arithmetic isn't something drilled into kids just to be obnoxious, it's foundational to almost all future math skills.


> They're busy thinking about much more important things.

Generally I agree (because the content of modern mathematics is largely abstract), but to nitpick a bit, number theory is part of mathematics too!

Ramanujan and Euler, for example, certainly cared a lot about 'arithmetic', and historically, many parts of mathematics have been just as 'empirical' in terms of calculating things as they've been based on abstract proof.


Two single-digit numbers is indeed sad, but I pull out a calculator daily to do math I could have done in my head. I don't feel that that is inherently bad.


Not exactly related, but your comment about plagiarism made me think of my days of writing papers and citing APA style. How do you cite a source if it came from ChatGPT and it likely doesn’t fully understand where it got its information?


You don't. You're only supposed to cite primary sources and peer-reviewed secondary sources. ChatGPT is a tertiary source, like dictionaries and encyclopedias. You use tertiary sources to get a quick overview of a topic before you begin delving into primary and secondary sources, but you never include tertiary material in your paper.


Good to know. Thank you for the response!


It'll happily generate sources for you -- just be aware that most of the citations will be bogus. Not sure how many teachers/professors test the validity of citations.
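Spot-checking can at least be semi-automated. A rough sketch of my own using the public Crossref API (a fuzzy match is weak evidence either way, and legitimate works missing from Crossref will look bogus):

    import requests

    def crossref_top_match(citation: str) -> str:
        # Ask Crossref for the closest bibliographic match so a human
        # can eyeball whether the citation points at a real work.
        resp = requests.get(
            "https://api.crossref.org/works",
            params={"query.bibliographic": citation, "rows": 1},
            timeout=10,
        )
        resp.raise_for_status()
        items = resp.json()["message"]["items"]
        if not items:
            return "no match found"
        top = items[0]
        title = top.get("title", ["<untitled>"])[0]
        return f"{title} (DOI: {top.get('DOI', '?')})"

    print(crossref_top_match("Bloom, The 2 Sigma Problem, Educational Researcher, 1984"))

If the returned title bears no resemblance to what the model cited, the reference was very likely hallucinated.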


I’m guessing they will have to start checking. Even if it’s just sampling a few for validity.


Facts cannot be plagiarized.

Copyright protects specific expression, and reproducing specific expression is specifically a non-goal of LLMs.


Plagiarism and copyright violation are subtly different. Plagiarism is just presenting someone (or something) else's work as your own. It may or may not be a copyright violation.


This semester, I regularly conduct RFC / whitepaper / chapter reading sessions during my hours. I let students use perplexity.ai, Bard, and ChatGPT to help them understand what they otherwise can't.

Once they're done, they submit a one-pager on 3 to 5 subtle / smaller things they found the most interesting or counterintuitive. At the end of the semester, I intend to share all their one-pagers among all of their classmates and give an open-book test on it. Let's see how that pans out.


I hope it is successful. I'm too old to be in primary education anymore, but I would have loved to have had access to an LLM during that time, one I could pester with an infinite number of questions until I grokked the subject matter.


A calculator is an impressive single-function tool. LLMs and other forms of AI are multi-function problem solving tools. ChatGPT and other AI tools are closer to the introduction of the world wide web than they are to the invention of the calculator.


> The main issue that is not addressed is that students need points to pass their subjects and get a high school diploma.

This is a solution (evidently imperfect and arguably obsolete) that the education system uses to address the problem of proving that a student has gained desirable knowledge and experience. It can and should be deprecated altogether once we come up with a better solution. And we certainly can. For example, we could invent an AI that interviews a student/candidate in a way proven (by numerous comparisons with their actual performance as measured by other methods) to estimate their level of expertise and capability with a sufficient degree of precision.

If such a new way proves reliable and efficient, we could even decouple expertise measurement from actual education on a mass scale - let people gain knowledge whatever alternative way they want and can, then just come and pass the test to receive an accredited degree. This way we could automate the production of an unlimited stream of certified experts.


I'm not saying it's a good or bad idea - but the idea of a future where an interview with a computer judges people's ability and decides whether they get to go to college or not sounds like something from Futurama.


I can see no problem here if it is objectively proven to be capable of reliably confirming whether a candidate is an expert. Unless you artificially rule that candidates can only apply once. For cases when a candidate emerges who believes he somehow really can't pass the AI-powered test despite being perfectly competent, there obviously should be fallback alternatives available, i.e. a jury of experts who would talk to him and tell whether there is a chance his claim is legitimate or he's just kidding.


It's a good idea, but doesn't solve the problem of the student then using AI to answer the questions in the interview. We're back to square one.


When the calculator came out, people had the same worries.



