Sunday, November 1, 2009

Empathy

I never realized it before, but I have always thought of compassion, empathy and sympathy from one perspective: that of the person feeling them for others. Perhaps that reflects a selfish desire on my part, that by being compassionate, for example, I am in part being granted the dignity of "tak[ing] pity" (Anthology, 274K) on others, along with all the other positive connotations of these words. While I would honestly want to help people in pain or in trouble, to feel for them and attempt to understand their situation, I would never think about what my feelings could do for them. Dick presented a different definition to me. In Do Androids Dream of Electric Sheep?, he showed that compassion, sympathy and empathy aren't just nice, but necessary.
What is compassion?

This idea came most clearly to me when Rick Deckard began to climb the Mercerian hill alone. He is struck by the first rock and, "the pain, the first knowledge of absolute isolation and suffering, touched him throughout…" (Dick, 231) He begins to panic and thinks, "I have to get out of here, down off this hill!" (231) Jumping into his car, he tries to work out how this moment was different from any other empathy box situation and comes to the conclusion that this time was different "…because, he thought, I did it alone." (231) Rick needed others' support to get through the pain, a feeling that could cause "absolute isolation." Our pain is only ours to feel; it is completely individualized. But the knowledge that others are there for us, or are feeling pain with us, or are happy, can make a world of difference. Humans are social creatures, and feeling that you are the only person going through something, especially something painful, is highly discomforting. Empathy dissolves this feeling. Rick would have felt more secure in a normal empathy box scenario, where he could at least feel others' presence. Alone, he felt he could not even make it. Iran stressed the importance of spreading good feelings by urging Rick to share his happiness after he purchased the goat, saying "I want you to transmit the mood you're in now to everyone else; you owe it to them. It would be immoral to keep it for ourselves." (173) In this sense, compassion and empathy aren't just feeling others' pain but "desir[ing] to relieve it" (274J). Sometimes when we feel sorry for someone, that is all we do; we fail to attempt to fix the situation in any way. Dick shows that reminding troubled people of goodness is highly beneficial in helping them overcome their difficulties, showing them the 'light at the end of the tunnel.' To use a less-than-serious example, this is like when you were a kid and were crying and someone tried to make you laugh. And you hated it because for some reason you just wanted to be sad. But eventually you caved in, almost begrudgingly, and started laughing, and instantly crying seemed so stupid. The power of happiness and goodwill should not go unnoticed.


People need to feel like they are not the only ones.


This incident raised another question: what is it about the empathy box? When I first read about it in the book, I thought it was a brilliant idea, but upon closer thought I decided that it only served to underline serious problems in Dick's world. We should never need a machine to feel other people's feelings; the people of 2021 are just so far removed from one another that they have a very difficult time connecting. The empathy box is empathy in abstraction, "withdrawn or separated from matter" (Website), separated from empathy itself. Is this a sign that people were losing their sense of empathy and were forced to artificially create it in order to feel more connected? And a more serious question came up: could humans ever lose empathy? What would that take?

On a more unrelated note, I thought that Dick's solution to the ethical problem of killing androids was very interesting: it's wrong, but it's necessary. Deckard reflects toward the end of the book, "As Mercer said, I am required to do wrong. Everything I've done has been wrong from the start." (226) The fact that Mercer, the quasi-religious figure of the book, imparts this solution is especially conflicting. He is the figurehead of empathy in the book, the voice of compassion, yet he tells Deckard that what he does is necessary even though it is wrong. It follows that the androids must have a presence and some rights of their own; otherwise retiring them would be nothing more than dealing with a broken machine. I can understand Dick's viewpoint, that some "wrongs" are unavoidable and even essential, but it would take a lot for me to agree. Perhaps I'm too idealistic, but it seems like there can always be an alternative. If it's wrong, shouldn't attempts be made to prevent it? Dangerous androids could be captured and placed somewhere else, away from people, for example. The androids can obviously think on their own and try at all costs to preserve their cognition, to avoid being retired.

What rights do the androids have?

http://www.3d-box.de/sr_szene09.asp


In ethical issues such as this, I don't care about costs or convenience. I believe, to use a cheesy reference, that MasterCard is right: certain things are priceless, life being one of them. We do not have the right to take away anyone else's life, even if, as in the case of the androids, we are not certain they ever had that right at all. I believe this case may be directly compared to the rights of prisoners and criminals. Some people believe that criminals, by committing a crime and becoming a danger to others, have forfeited their right to their own lives. I do not believe this is true. Who are we to determine the worth of others? Dick's analogy, drawn through the roles of bounty hunters and androids, was interesting but ultimately reached a conclusion different from my own.
