The zombie apocalypse is one of the most popular post-apocalyptic scenarios in pop culture today. A viral outbreak occurs, the world is overrun by the infected/undead, and the narrative follows a group of survivors grappling with the tragedy of their new world. The Walking Dead, The Last of Us, Crossed, etc. This scenario has become so ubiquitous that even the CDC has published a preparedness guide for an undead outbreak. However, there's one thing that's always bothered me about the zombie apocalypse and any given series/movie that uses it:
How do you end it?
Yes, the survivors get through the immediate narrative of the movie, but then what? Zombies still roam the Earth and aren't going away, society has descended into anarchy, and the world remains irrevocably screwed. Our protagonists survive horror after horror, slowly trade away their morality to survive, and are left forever scarred. And for what? To live the rest of their lives in a barren wasteland with no hope in sight? Many zombie fans say that an inescapably horrible world and the depravity of humanity are the entire point of the apocalypse, but is that really all there is?
Is it too corny for a zombie apocalypse to have a cure, or some way back to normal? Is there a way to inject a shining light into a world of the undead, or is all zombie fiction destined to end in a screwed-up world and a desperate scramble for survival? I don't know if I'm the only one who thinks this, but I find it very difficult to care about characters doing horrible things to survive when you know it will never get better and their survival is ultimately meaningless.
(P.S. If there are zombie apocalypse stories noticeably brighter than what I've described, I'd love to hear about them. The closest I can think of is Warm Bodies.)