Reading about dystopias is all fun and games, until you realize you’re living in one.

dys • to • pi • a
/dis’tōpēǝ/

noun

an imagined state or society in which there is great suffering or injustice, typically one that is totalitarian or post-apocalyptic.

Definition from Oxford Languages

The Bible is arguably the first apocalypse novel out there, but throughout the 20th and 21st centuries, dystopian stories have thrived, encouraged by the trauma inflicted on the human race through events as far back as the French Revolution, the Crimean War, and of course, World Wars One and Two. Every time our existence is threatened we tell stories of it, imaginations of what it could have been like if the ‘bad guys’ had won.

But a common thread through all the stories, from Brave New World (1932) to Nineteen Eighty-Four (1949) or Fahrenheit 451 (1953), is the perspective that this post-apocalyptic, totalitarian regime is nonetheless undesirable, that it should be railed against, fought tooth and nail to the last of our dying breaths, because it represents the total lack of all freedoms we have come to enjoy and expect in our civilization.

We enjoy these sorts of novels – and later, of course, films – because they present a thrilling view into a terrible society from a safe place. At worst, they offer a few hours of escapism from our otherwise mundane lives; at best, they offer insight into why we believe in freedom and justice, and why we should continue to prevail against what we perceive to be evil.

Did George Orwell not predict omnipresent CCTV?

Photo by Scott Webb on Pexels.com

The realms of the disaster novel, the dystopian future, and the post-apocalypse are, of course, more than idle entertainment. The very best of those authors carefully analyze visible trends in today’s existing society and extrapolate where they might lead if left unchecked. And in many of those cases, the spirit of the prediction, if not the letter, has come eerily true. Everything from CCTV and media propaganda to a distaste for all things intellectual and scientific was at some point predicted by some of the best science fiction authors the world has ever put forth.

And yet, for decades, we’ve relegated these stories to the file of ‘interesting, but couldn’t really happen’, simply because we like to believe that the real world is more grounded, that society’s checks and balances would kick in to prevent such a disastrous outcome. We like to feel that our privileged lives can’t be touched by the ugly realities that we’ve been warned about for over a century now. And we forget; we forget the true injustices of great wars and holocausts and genocides, because they didn’t happen to us.

So what happens when one day you wake up, and realize that the world you thought you believed in, the one in which you were safe from persecution, is gone? Worse yet, what happens when you come to the realization that for many, it never existed at all?

An imagined state or society in which there is great suffering or injustice.

Let’s focus on this first part of the definition of the word ‘dystopia’. We don’t need to imagine such a state; our society, here in the United States, exhibits tremendous suffering and injustice. It has for centuries: it was founded on the blood of indigenous people who were savagely conquered, and then built by slaves who were ripped from their homes and severed from every nationality, culture and family they ever knew. And despite movements to give these people equal rights dating back as far as the American Civil War, we continue to live in a world that judges men and women all the harsher for the color of their skin.

How blind is justice, really?

Photo by JJ Jordan on Pexels.com

The worst type of society is not one in which discrimination is legal and supported; it’s one in which it’s professed to be abolished, and yet allowed to persist. It’s not one in which black people must sit at the back of the bus; it’s one in which they’re silently judged if they don’t. It’s not one in which a black person knows they’ll be treated worse by police; it’s one in which they fear it.

There is great suffering and injustice in this country, and the world over. The society we live in – the one I grew up in – is run by white men standing on the shoulders of black workers. The worst of it is that when President Obama was elected, people began to feel hope that a change was coming; people began to wonder – could a black person really make a difference to the world? And when his terms were over, the white supremacists retaliated, hard. They made damn sure to elect someone who could undo all the progress made in eight years, to find someone who would speak their language: the language of oppression.

And this brings me to the second part of the definition of a dystopia:

Typically [a society] that is totalitarian or post-apocalyptic.

For those of you who cry, “but we live in a democracy!”, I challenge you to look around you at your government’s reaction to the Black Lives Matter protests that are sweeping the nation. Let’s define totalitarian for a moment:

To • tal • i • tar • i • an
/tōˌtaləˈterēən/

adjective

relating to a system of government that is centralized and dictatorial and requires complete subservience to the state.

Definition from Oxford Languages

Let’s break this down. The United States government is heavily centralized, to the extent that Washington is the be-all and end-all of the government itself. Representatives are elected from their states, of course, but all paths eventually lead to DC. It is exceptionally difficult to get a law passed in one state that is considered unlawful in others (look at the effort to legalize marijuana as an example), and federal law takes precedence over all local and state laws.

And if you think the United States isn’t an elected dictatorship … a dictatorship is really nothing more than a form of government “characterized by a single leader or group of leaders with little or no toleration for political pluralism or independent programs or media”. Let’s think about the United States for a moment in this context: we have a single leader who has definitively demonstrated his lack of tolerance for any kind of political pluralism and independent media. From phrases such as “fake news” to the glorious “Fox isn’t working for us anymore!”, the president of the United States has never appeared more dictatorial.

And he requires – demands – complete subservience. Look at the violence surrounding the George Floyd protests, in which unarmed protestors have been viciously attacked by heavily militarized police – beaten, bruised, bloodied and left in the streets. Ask yourself: is a government in which police brutality is not only tolerated but outright taught not a totalitarian regime?

And finally, we are undoubtedly post-apocalyptic. From entire continents burning to deadly viruses and violence in the streets, one could be forgiven for thinking the end of the world is definitely upon us. And whilst practically speaking, of course, the world will keep on spinning with or without the human race, perhaps the end of the world is closer than we think – in a different sort of way.

And this is the only place where I feel I can draw any sort of hope. Perhaps the end of the world isn’t the end of all humanity, but rather the end of inhumanity. Perhaps … just perhaps the slew of apocalyptic events that have decimated 2020 can lead to a change – something that could bring people together, allow space for listening, allow for justice; a space where people stop rejecting science in favor of ignorance.

It’s hard to see, especially when you’re in the midst of it. But the very worst thing that could happen to the world, and to this country right now, is for us to simply pretend it isn’t happening, and that our lives can continue unaffected. For some, that may well be true – the wealthy and privileged, naturally, are exempt in any good dystopian story – but the rest of us need to resist the status quo with every ounce of our strength.

Otherwise, to quote an otherwise questionable movie: “So this is how liberty dies … with thunderous applause.”

Movie Night: I Kill Giants

Year: 2018
Genre: Fantasy/Drama
Cast: Zoe Saldana, Imogen Poots, Madison Wolfe

Rating: 5 out of 5.

Barbara Thorson struggles through life by escaping into a fantasy life of magic and monsters.

IMDb

I’m not a super big fan of graphic novels (which isn’t to say I don’t like them; I just don’t have much experience with the medium), so it came as a pleasant surprise to learn that this charming, sad and rewarding tale originated on illustrated pages (and quite acclaimed ones, as I understand it).

Not that this should – or did – affect my take on the film itself, which stands strong in its own right. Masterfully crafted – somewhat in the style of Peter Jackson’s take on The Lovely Bones, with a seamless blend of intimate personal shots and grandiose, epic CGI giants – the visuals nonetheless serve only as a backdrop to an intense and rewarding story of love, despair, loss, grief and renewal.

Going into the movie with no previous knowledge of the story, and having seen it billed as ‘fantasy’ with glorious posters of villainous-looking giants, it genuinely wasn’t clear to me for a large portion of the film whether the titular creatures were real, or merely in the imagination of the protagonist, played ably by Madison Wolfe. When the truth is finally revealed, it’s done in a truly heartbreaking manner, and by the end of the movie I wasn’t crying ugly tears, you were.

Unfortunately, this touching story of growing up with tragedy seemingly flopped hard on release, with IMDb showing it made less than $500K globally on a budget of almost $15M. One of the reviews there blames a terrible marketing campaign, and I mostly agree; I was expecting the movie to be an action/adventure giant-killing romp, when in fact all of that serves only as scenery for what is at heart a coming-of-age drama.

Despite the poor reception, for me this was a flawless piece of cinema, albeit in a somewhat niche category, and I would wholeheartedly recommend it to anyone interested in the sadder side of things.

10/10 would watch again.

What Would It Take to Make a Good Video Game Movie?

I’ll happily admit to being a fairly casual gamer. I don’t have a console, I’m not the first person in line at GameStop when a new title is released, and I typically idle the minutes and hours away with mindless entertainment like Angry Birds on my phone.

That being said, there are a couple of more ‘serious’ games I enjoy playing from time to time; particularly id Software’s titles such as Doom and Quake (I’ve been playing those games since the late ’90s), and one of my all-time favorite PC games back in the day was Max Payne, mainly because of its heavy emphasis on plot and storytelling. I recently finished playing through Doom (2016), and although the story was minimal, the combat mechanics were fun, and the whole 15+ hours of gameplay were hugely entertaining.

Sometimes, though, I want the experience of a solid video game without the effort of having to, you know, actually play it. I guess I’m not the only one to think this, because throughout the years there have been endless adaptations of video games to film. Sadly, most of these have met with spectacular failure, both at the box office and critically. This led me to wonder: why are so many video game adaptations terrible, and what might it take to make one that is actually good?

Strong Source Material

The original Macintosh FPS, Marathon, was exceptionally plot-centric.

Not all video games are created equal. Whilst early PC titles such as Quake, Doom, Myst and others may have broken boundaries in terms of 3D graphics, gameplay mechanics and multiplayer options, most of them were pretty thin on plot. It was mostly just find the bad guy, kill the bad guy, repeat. There were some exceptions to this; I recall playing an early Bungie-developed game called Marathon, which was groundbreaking not only in its physics modeling and its introduction of LAN-based multiplayer, but also in how heavily the plot featured in the game – certain levels were impossible to complete unless you interacted with the story.

It wasn’t until I played Max Payne in the early ’00s, however, that I realized just how strong a video game story could be. With graphic novel-inspired cutscenes and a strong emphasis on character development, I ended up playing the game through dozens of times just to relive the story.

It stands to reason, then, that in order to make a successful adaptation, you need something to adapt in the first place. A strong plot and compelling characters are necessary for any story, and unfortunately, many video games lack these elements.

An Understanding of Adaptations

One of the biggest points of contention when non-filmic source material is adapted to film is how faithfully the writers maintain what was in the original story. Take Peter Jackson’s efforts with The Lord of the Rings: to many, it represents a masterpiece of western cinema, and a fitting adaptation of an equally timeless and epic set of books. To others, however, Jackson took too many liberties with the source material, from omitting characters to changing plot devices, and even creating scenarios that never occurred in the books at all.

In the books, Narsil was reforged right at the beginning; in the movies, not until the end.

However, I believe the critics of these films are missing the point of an adaptation. It isn’t meant to faithfully replicate every scene in the book on celluloid; to do so would be uninventive, slow-paced, and frankly boring. An adaptation should take the core, central elements of a story and rework them into the new format – that being a 2-3 hour film that you sit and watch. If that means changing characters, motivations and plot points, then so be it – it’s an adaptation, not a replication.

I think this balance is something many video game adaptations miss the boat on. Some try too hard to match the original material; others deviate too far from it. Some pander too heavily to the fans; others try too hard to be accessible to people who’ve never experienced the original game. There’s a fine line between these two extremes, and a successful adaptation should be able to satisfy the original players’ desire for familiarity whilst creating a world that can be easily experienced by someone who’s never heard of it before.

A Strong Cast

This is probably essential for films in general, but it’s just as relevant for video game adaptations as it is for any other type of movie. A strong cast is vital to the success of a movie: as viewers, we need to feel invested in the characters, their motivations and their relationships, and the chemistry between the actors is what makes that investment possible.

The chemistry between these characters was unmistakable.

Now, this doesn’t mean the cast necessarily needs to be famous or well-known; perhaps one of the best examples of chemistry in film is the original Star Wars from 1977. No one knew who Mark Hamill, Carrie Fisher or Harrison Ford were at the time, but their on-screen chemistry is arguably what makes the movie. Compare this to the lackluster connection between Hayden Christensen and Natalie Portman in the prequels; it didn’t matter that Portman was arguably the bigger up-and-coming star at the time, because the two simply didn’t seem to have any real connection.

There are examples of video game adaptations that bagged well-known actors and yet failed on the chemistry; Doom (2005) has both Karl Urban and Rosamund Pike (the latter of whom would go on to be nominated for an Oscar), but the relationship between them as brother and sister falls flat at every turn. This is in part due to a failure to develop the relationship through the plot (the two share almost no screen time), but also because the two actors just don’t seem to ‘click’.

Homage to the Original Game

Certain games are known for inventing, developing, or innovating certain types of gameplay features. Doom brought us the BFG – a massively overpowered weapon that can decimate almost any enemy in a single shot; Max Payne was one of the first games to introduce ‘bullet time’ – a feature where gameplay slows down during battle sequences, allowing the player to see individual bullets flying past. And there are movies where these concepts are adapted well, of course – and others where they aren’t.

In the 2008 adaptation of Max Payne, we see Mark Wahlberg make his way through a very noir New York City – just as in the original game – but the limited use of bullet time was frustrating. This was one of the cornerstones of the game, and although it features at certain points in the film, it never felt like as important an aspect as it should have been. Other aspects of the original game were modified as well, including some of the key character motivations and climactic scenes.

The first-person sequence was one of Doom’s best assets.

On the flip side, a film that I felt did this well was, again, Doom (2005). Not only did it bring us the BFG in a way that could never have been done in a game (when fired, it takes out massive chunks of wall and ceiling, an effect that would be exceptionally difficult to recreate in a game engine), but it also boasts an incredible first-person scene that bears all the classic hallmarks of an FPS game, including using multiple weapons to defeat multiple demons in a non-stop, long-take action sequence.

Not all video game movies are bad, and not all are as bad as some people make them out to be. That being said, the highest-rated game adaptation on Rotten Tomatoes is Angry Birds 2, which holds a score around the level of something like, say, the John Wick trilogy (83%, 89% and 90% for each film respectively) – a series that bears all the hallmarks of a video game movie without actually being one.

I think the key thing is a successful blend of many of the smaller elements that work in various movies – faithfulness, strong casting choices, and an understanding of how to make a good adaptation. Who knows? Perhaps one day we’ll see a game adaptation that truly checks all the boxes, and I know I’ll be first in line to see it when it does come out!