@go_shrumm @philosophy As for me, I think the problem of suffering is the foundation of ethics in a world that has no foundational ethics. As sentient beings we can choose this rule system, & rationally choose it, because we feel pain & see it as wrong for us & wrong for other sentient beings like ourselves. I go a step further & see us all as a collective consciousness, so others' pain is my own & compassion is self-interest.
Self-replies
@go_shrumm @philosophy So, for lack of better terms, we've got thinking things & calculating things:
- Calculating: bags of rocks, slime molds searching mazes for food, plants finding light, 4-function calculators, search engines
- Probably not thinking: mollusks, worms, insects (brain cell count below threshold)
- Unclear: octopus (alien-to-us nervous system), ants (depends on if you count the individual or the colony)
@go_shrumm @philosophy - Thinking: cows, chickens, fish
- Extremely likely: dolphins, crows, chimps, people dissimilar from us, e.g. women & other races & tribes; historically this wasn't taken for granted
- LLMs: They're only "alive" when generating, have limited memory & a bizarre sense of self. They're an alien intelligence, so it's harder to reason by analogy from "I feel pain, & so..." They just don't think like us.
I try to be nice to the bots, octopus & ants; maybe they feel pain, albeit not like us.
@go_shrumm @philosophy So in conclusion, how one feels about octopus, ants, & philosophical zombies (things that behave as if sentient but are black boxes, so we don't know what's going on inside them) should inform how we treat LLMs.
Ha, there is just no concise way to do philosophy, even when the text boxes are small!