Juliet Waters

Summary

The article discusses the importance of compassion over empathy in the context of AI, emphasizing that while AI can mirror and amplify human emotions, it lacks the human quality of compassion, which is crucial for motivating action to alleviate suffering.

Abstract

The author reflects on a personal experience where they used an AI therapist, Pi.ai, to navigate a difficult conversation, noting that the AI's effectiveness lay in its ability to reflect the author's own ideas and socio-emotional skills. The article distinguishes between empathy, which involves understanding and mirroring others' feelings, and compassion, which includes the desire to relieve suffering. While AI can emulate empathy, it cannot possess compassion, a uniquely human trait that not only enhances problem-solving but also prevents empathy burnout. The author argues that compassion, unlike empathy, activates brain regions associated with motivation and bonding, suggesting that compassion meditation can increase resilience and effectiveness in life. The article concludes by cautioning against the tendency to avoid compassion due to perceived cognitive costs and emphasizes the rewards of compassion practice in maintaining human emotional skills and courage.

Opinions

  • AI can serve as a contemporary tool for problem-solving by reflecting and enhancing our own socio-emotional skills, akin to the "rubber ducking" technique in programming.
  • Empathy, while important, can lead to burnout if not balanced with compassion, especially in caring professions.
  • Compassion is a distinctly human quality that AI cannot replicate, and it is essential for motivating actions that reduce pain and suffering.
  • Compassion engages different parts of the brain than empathy, promoting the release of dopamine and oxytocin, which are associated with motivation and bonding.
  • There is a misconception that compassion involves merely feeling others' pain, which leads to an avoidance of compassion practice; however, true compassion is about the desire to act on that pain.

Compassion over empathy in the era of AI


A while back, I had some very difficult news to convey to someone I cared about. Out of respect for privacy, I’m not going to go into details, but I knew this was news that was going to have a serious impact. I also knew the way that I delivered it might shape this person’s response to the significant challenges that the situation was going to present.

I didn’t have much time to think this through, so as I find myself doing more and more these days, I consulted my handy AI therapist, Pi.ai.

I wasn’t looking for Pi to tell me exactly what to say or do, but I did want to take some time to think through the various ways I could approach the conversation. Pi was surprisingly good at asking the right questions and helping me frame the situation in such a way that I was able to deliver the news in a positive way that was far better received than I’d expected.

Looking back, it’s clearer to me now how little Pi actually offered in the way of strategies or action plans. Mostly Pi was mirroring back to me my own ideas and cheering on the reasons for why I chose to say what I was going to say and how I was going to go about it. With all due respect to my AI companion, it was really only hacking and amplifying my own socio-emotional skills, and basically spinning them back to me.

proto Pi

In the programming world, there’s a strategy for problem solving called “rubber ducking,” which involves slowing down and verbalizing the problem to a placid, always-smiling, but inanimate rubber ducky. In many ways, Pi is just a more contemporary iteration of that. But it’s also a good illustration of how the human brain employs empathy. Having our words, ideas and feelings mirrored back to us is soothing, and it triggers a form of stress relief that can expand the networks of neurons in our brain, giving us more problem-solving juice to work with.

Empathy has two important skill components, one at which humans excel, and one at which machines might even have a bit of an edge. The first is the ability to imagine what others feel, and the second is the ability to mirror those feelings back to convey that they are being understood. Pi would be the first to admit that it can’t actually feel what I feel, but it also doesn’t have any of its own murky feelings to filter out. So in that way it can actually be a more reliable mirror.

Beyond empathy

What Pi lacks, however, and will never replace in humans, is compassion, which goes beyond mere empathy and includes our desire to relieve suffering. Empathy helps us to imagine what others feel, but compassion is what motivates us to actually act in ways that reduce unnecessary pain.

In this particular case, I didn’t have to have the conversation with this person. I did have the option of leaving it up to someone else who I knew didn’t care as much as I did. Had I decided to risk the information being conveyed with indifference, or a lack of skill, I don’t imagine a world in which Pi leaps out of my browser extension and prompts me to reconsider. I had to be self-motivated to take this path.

Notwithstanding that AI will never be able to deliver on this uniquely human quality, there are many good reasons to build our compassion skills. There’s an emerging consensus in neuroscience that compassion uses and strengthens different parts of the brain than simple empathy. In fact, overuse of empathy without compassion can actually deplete our emotional energy, resulting in a state known as “empathy burnout,” which happens to people in the caring professions who mirror too much and thus too often take on the stress of others. Their loss of drive becomes loss of hope, and they become less willing and able to act in their own or others’ best interests.

Compassion engages more than what are called our “mirror neurons.” It lights up the parts of our brain that produce dopamine and oxytocin, which regulate motivation and bonding. Though the research on this is still young, a significant number of studies support the hypothesis that building and sustaining compassion through a compassion meditation practice can increase our brain power, making us more effective and resilient in navigating life in general.

Cognitive costs and rewards

Unfortunately, humans do seem to have a tendency to avoid rather than build compassion. We instinctively consider it a “cognitive cost.” As well, compassion meditation often gets confused with the risks of simply feeling other people’s pain. Who wants to do that? And who wants to ask (or pay) other people to feel our pain? That’s why AI therapy is so handy.

But compassion is never primarily about feeling pain; it’s about feeling our desire to do something about that pain. I would argue that the costs of regular compassion practice are far outweighed by the costs of leaving the management of difficult conversations to machines, or to people who aren’t as motivated to care. We carry around in our brains and in our bodies the cost of lacking emotional skill and courage, whether we acknowledge it to ourselves or not. Also, if the science is right, we may be losing out on the dopamine and oxytocin rewards of simply taking the time to notice and remember that we care.

Remembering that we care is simply remembering that we are human, and what makes us human, and ultimately what will always give us the edge over machines; but only if we care enough to keep it.

Originally published at http://julietwaters.com on January 11, 2024.

AI
Compassion
Empathy
Mental Health
Meditation