[ Drama ] Open Question : The Walking Dead (Season 5); the best season?
So I’ve been watching TWD for the past 2 months and I’m currently in the middle of Season 5 and wow… just WOW! So deep and emotional! And not in a chick flick way… The acting is impressive too, a major step up from previous seasons. I think it’s the best season in the WHOLE series, even though I haven’t watched Season 6 onward! What’s your opinion? Does it get better in the upcoming seasons, or is that it? The peak of the TWD series?