The walking dead have been a mainstay of horror films for decades. Although zombie lore was originally confined to the West Indies, Hollywood has made sure audiences will fear zombies anywhere and everywhere.