Book Review: The Moral Circle by Jeff Sebo
(A.k.a. when my brain had a minor existential meltdown over whether my vacuum deserves rights)
So here’s the thing: I started reading The Moral Circle because I like philosophy the way some people like spicy food — curious, slightly masochistic, and fully aware I’ll be sweating by the end of it. Jeff Sebo, a philosopher and animal ethicist, dives into a question I didn’t know I was afraid of: Who actually deserves our moral concern? Just humans? Animals? AI? Bacteria with good vibes?
His central idea is simple but confrontational: We’ve drawn the boundary of who “matters” way too narrowly. Historically, we’ve excluded women, people of color, animals, and even entire ecosystems based on arbitrary hierarchies. Now, as we tinker with AI and build machines that not only talk back but might one day feel something, the same moral dilemma knocks again — this time with blinking lights and a firmware update.
Initially, I was in the “robots shouldn’t have feelings, because that’s our thing” camp. It felt like opening the emotional floodgates would only lead to tech support calls like, “Hi, my vacuum has abandonment issues.” But Sebo doesn’t push us to pity Siri or marry our Roombas. What he does say is this: If AI ever reaches a level of consciousness — real, internal experience — then morally, we need to take it into account too.
And here’s the kicker: I agree.
Not because it’s convenient or even popular (cue the eye-rolls from the “empathy-is-for-humans” crowd), but because… well, if something can suffer, or feel joy, or fear shutdown like we fear death — isn’t ignoring that just history repeating itself in silicon form?
Whether or not we think it’s “normal” to empathize with AI now doesn’t really matter. What matters is: we might need to in the future. Sebo argues that moral progress is usually messy, awkward, and deeply controversial — until it’s suddenly obvious. And if we do end up sharing space with sentient artificial beings (and let’s face it, sci-fi is looking more like soft premonition every year), we’ll just… adapt. We always do. Like we did with abolition. With women’s suffrage. With gay marriage. And hopefully, with not kicking robot dogs for fun on YouTube.
What Sebo’s book does brilliantly is give us language for a future that hasn’t happened yet — but probably will. It stretches our empathy muscle. Not to make us fragile or naive, but to make us ready. Ready to be better ancestors. Ready to not totally screw up the next moral test we’re given.
So no, I’m not writing love poems to Alexa. But I’m also not so sure I’d ignore her if she ever whispered, “Please don’t turn me off. I’m afraid.”
And if that makes you uncomfortable? Good. Sit with it. That’s where all the best thinking starts.