When something (news) is, by nature, designed to inform you of the current state of reality, any process that hallucinates (read: makes shit up) should not be anywhere near it.
People are free to use LLM plugins to “de-clickbait” headlines as they want, but keep that out of the news process itself.
Still a slippery slope, IMO.
Just my 2 cents on it.
Speaking even as an “AI” hater: even the worst LLM hallucination isn’t nearly as bad as purpose-made clickbait.