I sometimes hear people promote this kind of thing as a “good” use of LLMs and I just really don’t see it working out
It's the classic tale of making a wish and having it go horribly wrong
In this specific case, it seems like you're creating a ton more work for yourself. You have a book fake-written that you then have to un-write and reshape into an actual book. It just seems like less effort to use literal pen and paper.
I find reactions to this very odd. This is a natural progression of writing. With LLM tools at their disposal, more people have the time/capacity to write. It's similar to moving from writing by hand to typewriters. We need to regulate these tools and train people to use them, sure, but the usage itself is good.
If anything they have it backwards: LLMs are potentially useful for editing and refining a draft (checking grammar and such), but not great at structure or at creating a harmonious, creative text.
it can't be a creative or imaginative tool when it can't create or imagine
I've seen exactly ONE good use for ChatGPT, and it's writing the summaries of the weekend's matchups in Fantasy Football.
If you need a trillion dollar LLM to get writing prompts then maybe writing isn't your thing.
In fact, none of the "good uses" of LLMs have actually worked out. The specialized AI for analyzing cancer X-rays? Turns out it's wrong more often than the specialists employing it. AI to help with coding? It requires constant checking, and at best it writes wildly inefficient code.
I think it's like saying a machine that does your crunches for you is good for your abs. You lose the actual exercising motion and knowledge, and eventually you're going to need it.
at that point you are basically an editor, not a writer, which is a completely different skill set