d416@lemmy.world to Linux@lemmy.ml • How can I use a local LLM on Linux to generate a long story?
The limited context lengths of local LLMs will be a barrier to writing 10k words from a single prompt. One approach is to have the LLM hold a conversation with itself or with other LLMs. There are prompts out there that can simulate this, but you will need to intervene every few hundred words or so. Check out agent-orchestration frameworks like AutoGen that can manage this for you; CrewAI is one of the better ones. Hope this helps.
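For a sense of what frameworks like AutoGen or CrewAI are automating, here is a minimal do-it-yourself sketch of the same idea: generate the story chunk by chunk and feed a rolling summary back into each prompt so you stay inside the context window. It assumes a local model served through an OpenAI-compatible endpoint (e.g. Ollama's /v1 API); the model name, premise, and word targets are placeholders you'd adjust for your setup.

```python
# Rough sketch of the "LLM talks to itself" loop against a local,
# OpenAI-compatible endpoint (Ollama shown here -- adjust to taste).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
MODEL = "llama3"  # whatever local model you have pulled

outline = "A retired sysadmin discovers an abandoned datacenter in the desert."
summary = ""       # rolling summary stands in for the full story so far
chapters = []

for i in range(10):  # ~10 chunks of ~1000 words gets you near 10k words
    prompt = (
        f"Story premise: {outline}\n"
        f"Summary of the story so far: {summary or '(nothing yet)'}\n"
        "Write the next chapter (~1000 words). End at a natural break."
    )
    chapter = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
    chapters.append(chapter)

    # Compress the new chapter into the rolling summary so the next
    # prompt stays well under the model's context limit.
    summary = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content":
                   f"Summarize this in under 200 words:\n{summary}\n{chapter}"}],
    ).choices[0].message.content

print("\n\n".join(chapters))
```

The agent frameworks essentially do this loop for you, plus role assignment and turn-taking between agents, so you intervene less often.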
Here is the definitive thread on whether to use Microsoft Copilot. Some good tips in there… https://lemmy.world/post/14230502