Hollywood's recent obsession with the apocalypse continues. Above and below are three new trailers showing how the world copes with a global catastrophe (World War Z) or what life is like afterward (Oblivion, After Earth).

Disasters and the post-apocalypse are Hollywood staples (Planet of the Apes [1968] is one example, and by my count, After Earth is Will Smith's third effort in the genre), but if they really are more popular now, what's the reason? I would suggest a mixture of politics (many of these films carry thematic and visual echoes of the 9/11 attacks) and economics (because the home theatre experience has improved so much, the only reason left to go to the cinema is visual spectacle).

There are also mega-disaster films that play to the fears and prejudices of particular audiences. The forgettable Knowing, starring Nicolas Cage, seemed to be aimed subtly at evangelicals (and Cage is now involved in a film version of the Left Behind book series), while The Day After Tomorrow was rather less subtle about its environmental message.