
I'm kind of excited to see how scifi authors will tackle the generative AI revolution in their novels.

As of now, the models still need large amounts of human-produced creative works for training. So you can imagine a story set in a world where large swathes of humanity are relegated to being basically gig workers for some quadrillion-dollar AI megacorp, where they sit around and wait to be prompted by the AI. "Draw a purple cat with pink stripes and a top hat" and then millions of freelance artists around the world start drawing a stupid picture of a cat because the model determined that it had insufficient training data to produce high-quality results for the given prompt. And that's how everyone lives their lives... just working to feed the model, while everything consumed is generated by the model. It's rather dystopian.



I have a novel I've been working on intermittently since the late 2000s, the central conflict of which grew to be about labor in an era of its devaluation. The big reveal was always going to be the opposite of Gibson's Mona Lisa Overdrive: rather than something human-like turning out to be AI, society's AI infrastructure turns out to depend on mostly human "compute" (harvested in a surreptitious way I thought was clever).

I've been trying to figure out how to retool the story to fit a timeline where ubiquitous AI that can write poems and paint pictures predates ubiquitous self-driving cars.


I would say it's very fertile ground in terms of ideas... if you put in the work. The problem is that most mass-market sci-fi is not about ideas, but about cool special effects and good guys vs. bad guys.


Sure, 90% of everything is crap.


> As of now, the models still need large amounts of human produced creative works for training.

That will likely always be the case; even 100% synthetic data has to come from somewhere. Great synopsis! Working for hire to feed a machine that regurgitates variations of its missing training data sounds dystopian. But here we are, almost there.


Eventually models will likely get their creativity by:

1. Interacting with the randomness of the world

and

2. Thinking a lot, running through loops of thought and seeing what they discover.

I don't expect them to need humans forever.


Agreed. By some definitions, specifically the ability to associate unrelated things, models are already creative.

Hallucinations are highly creative as well. But unless the technology changes, large language models will need human-made training data as a substrate for a long time to come.


It's ironic that you nonetheless think "scifi authors" will be writing those novels, not language models.


I would read that! But hopefully it won’t be written by ChatGPT.



