By that I mean, it must be an inherently comforting thing to believe - on some level we want there to be something after death, because it feels right, or more meaningful. There's a reason basically every civilization in history has had some sort of afterlife ethos.
I realize I am basically horseshoeing my way into evangelicalism, but still. Maybe life would be better if we believed there was something beyond this. [edit - please note that yes, the world is shitty, things are awful and getting worse, and that is exactly my point - we get THIS SHIT, and nothing else? god, that's awful]
After I die, I'm dead. Who gives a fuck if there's no afterlife? I certainly won't; I'll be dead.