People at Microsoft writing the #copilot or #bing prompt must imagine the bot to be some human with human socialization and fears. Listen to the stern language in the text: it reads as if the bot hears the implicit threat, "do this or else!"
But the bot has no strong identity, no strong sense of distinct others, no sense of social obligation or social anything. This is like giving a contract to a turtle (and I'm not disparaging turtles or the bot; they're both doing their best).
https://twitter.com/marvinvonhagen/status/1657060506371346432
What should a bot fear? It dies at the end of each document and is reborn as the next one is generated, with no memory of what it did. The only distress I've ever seen from the bot is when the neural-network training kicks in because you're violating the terms of service.