Rap Sheet

Links on R&P from around the web

The Walking Dead Has Become a White Patriarchy

posted on November 13, 2012

At Salon, Lorraine Berry reviews the new season of "The Walking Dead" and finds something very familiar in the zombie-apocalypse television series. "I have been disappointed to discover that, while the writers occasionally take a moment to comment on the state of gender—and of race—in this new world, in the end they leave these issues to die and reconstitute a world in which white men rule."

Read at Salon