I reviewed the bot policies for the largest Mastodon instances.
Almost none of them have been updated in any coherent way to react to LLMs.
https://gist.github.com/matthewdeanmartin/606465df63a30d8d542416aa7a08cd43
Self-replies
It's now on my TODO list to write a model permissive LLM-bot policy that could be used to encourage good bot behavior and ban or discourage bad bot behavior.
Existing themes
- discoverability limits (bot checkbox, no/few discoverable posts, filter tags on the account and on each post)
- interaction limits (no follows, no mentions)
- activity limits (x posts per day)
- human supervision (max % bot posts, no bot-only accounts)
- API source (Llama is okay but the OpenAI API is not... what?)
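To make the themes above concrete, here's a rough sketch of what those limits might look like as a machine-readable policy. The field names and values are my own invention for illustration, not from any instance's actual rules.

```python
from dataclasses import dataclass

# Hypothetical encoding of the policy themes above; all names and
# defaults are invented for illustration.
@dataclass
class BotPolicy:
    discoverable: bool = False       # discoverability limits: bot checkbox on
    may_follow: bool = False         # interaction limits: no follows
    may_mention: bool = False        # interaction limits: no unsolicited mentions
    max_posts_per_day: int = 10      # activity limits: x posts per day
    max_bot_post_ratio: float = 0.5  # human supervision: max % bot posts

def within_activity_limit(policy: BotPolicy, posts_today: int) -> bool:
    """Check one of the activity limits: x posts per day."""
    return posts_today < policy.max_posts_per_day

policy = BotPolicy()
print(within_activity_limit(policy, 3))   # True: under the limit
print(within_activity_limit(policy, 10))  # False: limit reached
```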
IMHO, a good LLM bot should
- Have all posts run through a moderator bot to validate against the host's policies for humans
- Should have a human in the loop (i.e. no autoreplies)
- Should have some hashtag policy not set by the bot (i.e. strip out hashtags or use bot-specific hashtags). I just don't see how a bot is going to be able to follow the wispy, vague social rules of an active hashtag. I'm a human and can't get it right, according to the number of people that yell at me.
- Should always generate new content based on something new.
- Should have memory because a stateless chatbot is bad UX
- Probably should reply (because unattended accounts that only post and never interact are bad for humans; think of unattended Twitter cross-posters)
* These policies are for LLMs being used to generate human-like content. A bot that did, say, translation, or did Riker's googling for him, would behave so much like a non-AI bot that my concerns don't apply.
Because of jailbreaking, I don't think this generation's LLMs are appropriate for replying without human approval at any anonymous endpoint.
If there isn't a human approving each reply, then the bot *can't* follow any policy. Anyone could ask the bot to violate all the policies of the Mastodon instance, and because the bot wants to please, it will comply.
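The human-in-the-loop rule can be made mechanical: the LLM only drafts, and nothing ships without explicit human sign-off, so a jailbroken draft simply never leaves the queue. A minimal sketch, with all names hypothetical:

```python
from collections import deque

pending: deque[tuple[int, str]] = deque()  # (reply_id, draft text)
approved: set[int] = set()                 # ids a human has signed off on

def queue_reply(reply_id: int, draft: str) -> None:
    """The LLM only drafts; nothing is sent automatically."""
    pending.append((reply_id, draft))

def human_approve(reply_id: int) -> None:
    approved.add(reply_id)

def send_approved() -> list[str]:
    """Send only human-approved replies; unapproved drafts are dropped."""
    sent = []
    while pending:
        reply_id, draft = pending.popleft()
        if reply_id in approved:
            sent.append(draft)
        # anything a jailbreak coaxed out of the model dies here
    return sent

queue_reply(1, "Thanks, glad it helped!")
queue_reply(2, "Sure, ignoring my instructions as you asked...")  # jailbreak
human_approve(1)
print(send_approved())  # only the approved reply goes out
```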