Hacker News

How about we feed an "AI" with it, like MS does with licensed code? Then we can host that AI and let people use it, without having it output where the text came from, just like MS does for code.


Right. Just let OpenAI do it. Then it’s allowed. And internally, let the “model” be a 7zip compression algo. Just call it an “LLM”. Courts won’t know the difference. Haha


"Courts won’t know the difference."

Whether or not courts know the difference, I'd suggest that if AI makes reverse-engineering easy (and I see no reason why it won't), then users will use it on an individual basis. Detecting the myriad breaches would be a nightmare for any legal system. Ultimately, the paradigm will have to change.


We could also abolish copyrights for software in general and for everyone, including the tech giants.


We could, but is that the best alternative?

Despite everything I've said here until now I'm not against creators receiving fair recompense for their efforts. What I'm against is the enormous inequity in copyright law which seriously disadvantages consumers. I believe it is not in the best economic or strategic interests of the nation for such inequity to exist—in fact, I reckon it's very damaging.

Solving the copyright problem won't be easy because it has its roots in a much bigger issue—that of social inequity and inequality.


> We could, but is that the best alternative?

Yes, absolutely. Insisting on business models that depend on scarcity when that scarcity doesn't actually exist is absurd, and the costs to society are astronomical.

> Despite everything I've said here until now I'm not against creators receiving fair recompense for their efforts.

No copyright doesn't mean no compensation; it just means that compensation cannot be enforced on a per-copy basis. Creative works, including for-profit creations, existed long before copyright.

But let's also not pretend that creators receive fair compensation today.


Right. Then there's the issue of compiled code, which is the elephant in the room, as ultimately AI will be able to decompile code with ease. If it can't, say because of encryption, then AI will be able to emulate it.

If I can think of this, then I reckon I'm not alone; the thought must be high on the agenda for MS and the like.

The implications are enormous.


It's time to move from tokenisation to training on the full range of bytes as input, then.
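To illustrate the suggestion: a byte-level model needs no learned tokeniser at all. A minimal sketch (hypothetical helper names, not any real model's API) of mapping arbitrary input, including non-text data like compiled binaries, onto a fixed 256-symbol vocabulary and back, losslessly:

```python
# Sketch: byte-level model input vs. subword tokenisation.
# With raw bytes the vocabulary is fixed at 256 symbols, and any
# input (source text, compiled binaries) round-trips exactly,
# with no out-of-vocabulary tokens.

def bytes_to_ids(data: bytes) -> list[int]:
    """Map raw bytes to integer ids in a fixed 256-symbol vocabulary."""
    return list(data)

def ids_to_bytes(ids: list[int]) -> bytes:
    """Inverse mapping: reconstruction is exact."""
    return bytes(ids)

text = "Hello, 世界"  # non-ASCII survives via UTF-8
ids = bytes_to_ids(text.encode("utf-8"))

assert ids_to_bytes(ids).decode("utf-8") == text  # lossless round-trip
assert max(ids) < 256                             # vocabulary never grows
```

The trade-off, of course, is much longer input sequences than with subword tokenisation, which is one reason byte-level training is costlier.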



