The Walking Dead Has Become a White Patriarchy

At Salon, Lorraine Berry reviews the new season of “The Walking Dead” and finds something very familiar in the zombie-apocalypse television series. “I have been disappointed to discover that, while the writers occasionally take a moment to comment on the state of gender—and of race—in this new world, in the end they leave these issues to die and reconstitute a world in which white men rule.”

Read at Salon

© 2011 Religion & Politics