For the past thirty years, movie critic Roger Ebert has been
the last word in box quotes. He can bite his thumb at you and make or break your film. So in 2010,
when he declared on his Chicago Sun-Times blog that video
games could never be art, you can imagine the schism that followed within the nerd
community.
Now, I realize that geeks and games have always been
synonymous. The two share an almost symbiotic relationship in the cultural
development of untold millions of adolescent boys and girls the world
over, myself included.
Film buffs are a different story.
There was a time when dorks and pock-marked youths were on the front lines of test screenings, sneak previews and old-fashioned scuttlebutt when it came to the movie industry. There were no forums or torrent links. There was no online grapevine, nor were they there to make a buck or stick it to the man. They were there because they had to be. Literally.
In what was perhaps the precursor to ‘The Scene’, the subculture of film geekdom
in the pre-Internet days was a place where bootleg copies
of lesser-known films would circulate only as fast as they could be handed off,
and where rabidly obsessive idolaters akin to the fanboys of today could convene in the dark corners of the campus
quad to discuss the latest print of “Duck Soup” or “Night of the Living Dead”.
In the words of a few old timers I’ve met, it was back when geeks
weren’t afraid of a little grunt work to live out their fantasies.
The films of the Marx Brothers saw a revival on college campuses throughout the 1960s.
Something since then has changed. Blame shortening attention spans or whatever, but there is a marked
difference in the geek genome these days. Film buffs are quite different from
video game enthusiasts (with the notable exception of the San Diego Comic-Con,
where the two schools occasionally meld in a strange industry backchannel).
Yet where one has been elevated to a pretentious standard,
the other has been shoved to the wayside like some kind of fallen angel. Why?
Again, Ebert’s infamous article serves as a useful primer. Consider
his assessment of the difference between art and games.
“One
obvious difference between art and games is that you can win a game. It has
rules, points, objectives, and an outcome. [One] might cite a[n] immersive game
without points or rules, but I would say then it ceases to be a game and
becomes a representation of a story, a novel, a play, dance, a film…You can
[also] go back in time and correct your mistakes. In chess, this is known as taking
back a move, and negates the whole discipline of the game. Nor am I
persuaded that I can learn about my own past by taking back my mistakes in a
video game.”
Subjectivity of his analysis aside, he makes an interesting
point, one at which most gamers would understandably bristle.
There’s a certain immaturity (others would call it audacity)
in the gaming community that tends to value the kind of wit you’d find on a Reddit
forum over the thoughtful sentimentality of “art”. This isn’t born
of any particular insensitivity towards gushy hipsters or self-serious museum
curators. Rather, it is a reflection of
a relationship that we, as a culture, are only beginning to understand.
If art requires a soul to bear and a heart to rend, games
demand a mind to match and a logic to reason. In this way, games and the
legions who follow them are more like the lampooners of yore: crass but
incisive (not to mention thoroughly analytical) rabble-rousers, hotshots and
trolls who revel in their cliquish adoration like virtual Thomas Nasts or digital Mark Twains. They are driven by an urge that is as complex as it is
expressive: the desire for answers. In this manner, games can be an art form.
But not in the same way that dance or music or literature is.
If pride wasn’t a driving motivation for Ebert to write his
article, it certainly was for the gamers who responded to him.
He cited a debate he had with a noted expert on gaming wherein
Georges Méliès’s film “Le Voyage dans la Lune” (1902) is
used to compare the early stages of cinematic maturity with the early
stages of contemporary video game design.
While most NES aficionados would readily point to arcade-era precursors like “Pong” or “Space Invaders” as the developmental equivalent, the reasoning they
apply is far too literal. Ebert, too, is mistaken in this comparison because he
misunderstands the medium’s ultimate goal. Simplicity of form is not synonymous
with simplicity of function. Both camps therefore draw some erroneous
conclusions about where we’re heading.
Instead, let’s back-track.
Consider the transitional phase between photography and
motion pictures at the turn of the 20th Century. The early
experiments were practically nothing more than a camera capturing an action. A
simple action. No story, no grand metaphors, no production design. The English photographer Eadweard Muybridge
became famous for his series of stills that have come to be known as “Sallie Gardner at a Gallop”.
The story goes that in 1872 he was hired to conduct an
experiment designed to determine whether a galloping horse was, at any point
during its stride, entirely airborne (that is, with all four legs off the
ground at once).
While working for the University of Pennsylvania
in the 1880s, Muybridge made an entire career out of what might be dubbed
“zoopraxiscopy”. It was a technique
he applied to the photographic capture of all manner of basic activities such
as dancing, military marches, wrestling, baseball, football, fencing, domestic
work, manual labor and sometimes just random inanimate objects in motion.
Now consider the numerous devices that traded in the same spectacle. The phenakistoscope. The zoetrope. The praxinoscope. The
mutoscope. Edison’s kinetoscope. The very illusion of movement was enough to draw the
crowds. The majority of these attractions featured simple animatics that were
nothing more complex than what Muybridge had recorded in his work for the
University.
For the motion picture medium, the techniques pioneered here
would ultimately give birth to perhaps the prototype genre upon which all
others were based: the actuality.
When film photography was introduced to the scene, the
graduation from crude flip-book techniques to machine-driven projection meant that the
medium could explore other possibilities. But that’s not to say that people
forgot their simple “actualities”.
In fact, one might argue that the early experiments in ‘cinéma vérité’ and Western culture’s
long-standing relationship with the theater fused at this point to create a
divergence in genre. Fiction versus non-fiction.
But don’t mistake the kind of non-fiction of Muybridge’s day
for that of a modern documentary. Even early experiments in the portrayal of
“real life” by film pioneers like Dziga Vertov aren’t really up to the challenge of analogizing this division.
We have to turn to the likes of the Lumière brothers for further study.
Take their famous 46-second clip, “La Sortie des usines Lumière à Lyon” (1895).
Roughly translated as “Workers Leaving the Lumière Factory”,
the short clip shows just that: a group of women workers in shirtwaists and
morning hats pouring out of the factory at 25
rue St. Victor, Montplaisir. Now consider the other nine films that the
brothers showcased at their filmic debut at the Grand Café in Paris. “Horse Trick Riders”. “Blacksmiths”.
“Cordeliers Square in Lyon – a street scene”.
This is a quicksilver re-imagining of exactly the same kinds
of things that Muybridge was documenting for the ravenous public eye. And once
you concede that such early attempts appear primitive and lacking in imagination, it becomes
hard to condemn video games or the other new technologies with
which we are similarly experimenting today.
Which brings me to the larger point.
Where does all this talk about video games come into play?
The same television screen can house the moving images of
either a film or a video game, and the difference between the two mediums
isn’t as stark as people would have you believe.
In order to channel my argument to a finer point, I’m going
to use my own affinity for and experience with first-person shooters to demonstrate the common thread.
I’m going to forgo a comprehensive history of the FPS genre,
skipping over ground-breakers like the original “Wolfenstein 3D” (1992)
and “Half-Life” (1998) in order to expedite the argument.
Such titles were not imitations of some
pre-existing form; they are notable for the creation of an entirely new format
within the gaming medium.
Forget the cosmetic leaps that were made between “Maze War” and “Quake”. It’s fallacious to assume that advances in
3-dimensional graphics alone constitute a fundamental advance in the medium.
Visuals aside, the real stand-out in this chaotic mixture of
innovation and design came to a head with the release of the console-based “GoldenEye 007” for the Nintendo 64. This was the first major mainstream breakthrough for the FPS genre.
Here, you begin to see a certain trend emerging. It was no
mere accident that the wildly popular GoldenEye
just happened to be an interactive off-shoot of a major film franchise. Not
only was there a built-in market for the game itself, but the game capitalized
on the same principles that the designers of modern shooters apply to
addictively good effect (which is to say nothing of the guns-glamour-and-girls
factor innate to Ian Fleming’s famed
spy-master).
From there, numerous sub-divisions sprouted forth. One of
the more recent and successful adaptations has been the tactical shooter, a close cousin to war-themed games like the “Battlefield” or “SOCOM” series.
When it comes to imitation, these games unabashedly
transplanted existing elements from the popular culture and suffused them
within the medium in a kind of competitive social experiment.
Take the “Rainbow Six” franchise. The title, characters and scenarios are all gleaned from the
912-page best-selling novel written by none other than Tom Clancy. The fact
that his name was lent to later shooters like “Ghost Recon” and “Splinter
Cell” plays hell with the boundaries of artistic overlap. Couple that with the fact that Clancy’s books
were also translated into numerous film adaptations throughout the 1990s and you can start to see where the line
begins to blur.
The success of “Saving Private Ryan” (1998) similarly spurred the rise of another shooter subset,
namely in the form of the World-War-II-based “Medal of Honor” (conceived
by Steven Spielberg and developed by DreamWorks Interactive).
The oft-imitated beach landing.
The cavalcade of war games that followed in its wake, like
the original “Call of Duty” or “Day of Defeat”, continued this trend
of reproduction. During the 2000s, the competing franchises did little to mask
their inspirations.
Look at this sequence
from the “Call of Duty: United Offensive”
expansion released in 2004. Now compare that to a scene from episode eight of the HBO miniseries “Band
of Brothers” (2001).
Or compare the orchestration of levels from “Call of Duty 4: Modern Warfare” (2007). “Charlie Don’t Surf” is not only a
direct reference to the Coppola film “Apocalypse Now” (1979), but the
entire operation is modeled after the ‘four-corners’ sequence from Ridley Scott’s “Black Hawk Down” (2001). On top of that, look at the mission “All Ghillied Up”,
in which the operator conducts a two-man sniper operation in the irradiated
ghost town of Pripyat, Ukraine. Does this seem at all similar to the entirety of the 2001
film “Behind Enemy Lines”?
Nor were those examples merely an adolescent phase of gaming
development. Look at Treyarch’s 2010 release of “COD: Black Ops”. The entire Russian roulette scene is essentially a
recreation of the famous scene from Michael Cimino’s “The Deer Hunter” (1978). Observe.
What about regurgitations of the darkly observed miniseries “Generation Kill” (2008) by David
Simon? It’s not uncommon for the players themselves to adopt the same tropes you’ll find in pop-culture reflections
of 21st-Century combat.
Need I go on?
My point is that the act of mimicry is the natural process
through which we tend to explore and test the boundaries of a new medium. The
imitation and replication of a familiar concept in an unfamiliar environment.
Do you think that the ubiquitous similarities that James
Cameron’s Colonial Marines from “Aliens”
(1986) or the mobile infantry from Verhoeven's "Starship Troopers" (1997) share with the set-ups of innumerable sci-fi shooters like “Halo” or
“Doom” were just some kind of coincidence?
The implications of this trend are not nearly so alarming or
horrible as they might appear. While a pessimist might interpret this behavior
as outright plagiarism, industry moguls would be wise to take note of this and
accept it for what it truly is: flattery.
Sure, anyone can go pick up a Jane’s guide to military
firearms and a copy of “The Hurt Locker”
from their local Redbox and get cracking on a new design. But it’s hard to separate
these virtual “experiments” (as they should be seen) from the early days of the
Lumière brothers and their actualities. Here and now, art is no longer the
imitation of life but of concept. Of situation. Of ideas manifested in an
entirely new realm beyond the physical plane.
It’s hard to see the difference between the World’s Columbian Exposition of 1893
and the E3 convention or
similarly-minded bazaars of modernity. We are as anxious to develop a new
technology today as we were in the time of the steam engine. Once the potential for application has been discovered, it's almost inevitable that multiple branches sprout forth all at once. Some of them have to be pruned, of course, but others thrive and diverge entirely.
That is the biggest error Ebert made. The touchstone
of this entire misconception can be traced back to that schism of geek culture,
which divorced the genera cinephile and gamer. The filmmaker and the
programmer. The “artist” versus the “logician”. It’s a bridge that can no
longer be crossed, but it need not be burnt.
For instance, consider the evolution and development of the
middle class and the concept of leisure or private life.
The most basic interpretation of events following the
Industrial Revolution (the reduction of the labor force, the expansion of the
bureaucracy, the surplus of material goods, coupled with the generational booms
in between) adds up to an individual with a whole lot of free time on their
hands. This is by no means a universal development. But in a highly stratified
and largely middle-class society like our own, idle hands will likely forge
their own unique forms of cultural distraction.
It is at this juncture that the essentially practical mind
meets the pensive expressionist. Where one set of skills may serve a person in
strict matters of survival, the other serves the life of the mind, not the
body.
In that respect, video games and art (and by extension:
films) are not all that dissimilar in the function they ultimately serve.
Where art might deal with the classical realm of emotions
and human nature, perhaps video games will explore a yet-untapped dimension of the human experience such as free will
or agency in a world that seems increasingly suffocating and restrictive.
True, the gamer now lives in a world defined by different laws
and customs than the classically-defined connoisseur of “art”. But his is a world
no less rich in vibrancy and ingenuity.
For gaming requires a more scientific approach. A more grounded discipline of
inquiry and analysis. It is an exercise in the Socratic method.
In this manner, games can serve as pedagogical wizardry,
cocooned in layers of delectable distraction.
For me, the biggest allure of video games is not
necessarily that they are an escape or a fantasy. It’s that I have
control over something. Control over an ideal that the user is then free to
mold at will. In this way, a game is more an amplification of the person
using it than a magnification of the person making it.
It allows us all to become the artists of our own destinies.
And in that sense, it seems to be an equalizing force in a world where the
monopoly of visions rests in the professional lives of but a select few.
Ebert ultimately conceded that he believes there is
hope yet for games to come of age. But he also thinks that most of us won’t
live to see the maturation of the medium. And I’ll give him that.
The early film culture reached its apogee when American
director D.W. Griffith released his
landmark film “The Birth of a Nation” (1915), a supremely romanticized story of
the post-war South during Reconstruction following the assassination of Abraham
Lincoln. All racial and cultural baggage aside, the movie was a roaring
success. It was massively popular, and even the sitting U.S. President, Woodrow
Wilson, reportedly declared that the experience was like “writing history with lightning”. It
introduced the world to narrative techniques that were unique to the medium
of film. It popularized intercut action sequences (scenes which jumped from one
location to another) in order to establish suspense and
drama. It was a new frontier for the human soul to explore.
And while games like “Lollipop Chainsaw” leave little room for soul-searching, it’s hard to dispel the
nagging suspicion that we won’t need that extra life to see what’s coming our
way.