Hacker News

This was one of the reasons why the use of such tools was strictly prohibited at my former university.

Another argument given was: even if you only have to *think* about using such a tool, you are already in a situation where good scientific practice is no longer guaranteed. In other words: if you/the students had followed all the rules of good scientific practice right from the beginning, you would never need such a tool. But I guess if you are the developer of such a tool, or work in that area of research, you probably see things differently...

Also: how many different ways are there to explain the rules of scientific practice within 150 words? How much similarity would you expect from O(100) different students, even if they write independently? I'm not sure whether such tools take that into account. On a different scale: when piping e.g. a typical PhD thesis through such a tool, the first introductory paragraphs will always raise red flags (simply because that topic has already been introduced 10000 times and everyone has read more or less the same introductory textbooks). The important part - the main part of the thesis - should of course be unique (but if the supervisor/examiner/committee is not able to "detect" this on their own, well...). Of course, literally copy&pasting an introduction is still not okay. But - as the blogger also said - this can easily be detected with a simple yawhoogle search if the text already reads suspiciously (e.g. if the style of writing varies a lot between paragraphs, etc.).

So yes, I'd agree that the usefulness of such tools is relatively limited when it comes to "real" scientific works, but in this particular case it was quite neat to see how easily you can use one to automate the collection of evidence if you have a large class of students...
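For what it's worth, the core of such a pairwise comparison is not hard to sketch. Below is a minimal toy version, assuming a hypothetical setup where each student's 150-word answer is compared against every other via word-trigram Jaccard overlap; the function names, threshold, and normalisation are my own invention and real tools do far more (stemming, fingerprinting, reference corpora):

```python
from itertools import combinations

def ngrams(text, n=3):
    # Lowercase word trigrams; real tools normalise much more aggressively.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    # Overlap of two n-gram sets; 1.0 means identical phrasing.
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def flag_similar_pairs(submissions, threshold=0.5):
    # Compare every pair of submissions and collect suspiciously similar ones.
    grams = {name: ngrams(text) for name, text in submissions.items()}
    flagged = []
    for (n1, g1), (n2, g2) in combinations(grams.items(), 2):
        score = jaccard(g1, g2)
        if score >= threshold:
            flagged.append((n1, n2, round(score, 2)))
    return flagged
```

Note that with O(100) students this is O(n^2) pairs, which is still trivial; and as argued above, short prescribed answers will legitimately share many trigrams, so any threshold only flags candidates for a human to inspect, not verdicts.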




