I'm talking Game Of Thrones, Akame Ga Kill, Walking Dead...
Is it me or are shows these days really fucking nihilistic? Like we have these shows where the essential conflict happens, and it's never going to get better for the protagonists. There is no hope for a happy ending. And look, it may be good storytelling to chase our protagonists up a tree and throw rocks at them, but it's just so fucking bleak.
I feel like TV shows these days are just like "Here's a cute little puppy dog, it's yours. Love it." and then "Okay, it's been 3 months. Sit right there, I'm going to beat your dog to death with this ball-peen hammer. Oh, don't cry, here's another puppy." And then there's another hammer.
You can even argue that it's realistic, but oh my God, real life is so fucking horrible, who wants this to be their escapist fantasy?