What Really Happens When We Die?

By Michael De Dora

There are many different conceptions of the term "life after death." Adherents generally agree that there is a permanent life force, such as the soul, which survives bodily death, and that there exists another realm of being beyond what we see on Earth. Western religions tend to endorse a Heaven (eternal bliss) and a Hell (unending suffering). Reasons for such beliefs include: "it keeps people moral"; "it is reassuring to know that regardless of what happens on Earth, good people will be rewarded and bad people will be punished"; "it is comforting to know I will be reunited with my loved ones."

But our desires, no matter how strong, are not good reasons for our beliefs. We should not believe something merely because it sounds advantageous. Our beliefs should be supported by good scientific evidence or sound philosophical argument. For example, I love my family and friends, and sharing eternity in paradise with them sounds quite nice (though I think eternity with any company would eventually get boring). But I see little to no scientific or philosophical basis to believe this will happen (I find Eastern ideas about the afterlife equally unsupportable). To be sure, I do not know that I am correct. But I am reasonably comfortable with my position.

What are the implications of my belief that life truly ends with death? I plan to maximize, enjoy, and treasure the time I have on Earth, and when I reach the end, I will have two comforts. First, I will fear nothing, as I believe there is nothing to fear. And second, while I will miss my loved ones and this wonderful existence, I will feel sufficiently happy with the life that I have lived. That is the best I believe I can hope for.

Michael De Dora is executive director at the think tank Center for Inquiry in New York City.

