Zombies first appeared in literature in the 17th century, and in film in 1932. But it was the popular TV series "The Walking Dead" that elevated the cannibalistic undead to pop-culture stardom.
- Paul Scott, associate professor of French at the University of Kansas