This won't surprise anyone who knows me. I spend most of my work breaks and micro-breaks listening to videogame podcasts and reading videogame blogs.
This sporadic immersion in game journalism steeps my brain in videogaming conversations throughout the day as I work. It's almost blissful, really. I'm continually amazed at how the flame-ridden, nerdy, proto-adolescent conversations in the Usenet enclaves of yesteryear have gradually transformed into a culture of mostly intelligent conversations surrounding game design, theory, and possibility.
While quite a few venues of videogame discourse seem to have grown up with their interlocutors, gaming as a medium and practice appears to be entering the awkward, slowly evolving, "teenage" phase of its development, seemingly trailing the critical expectations surrounding it by a few years.
This prototypical stage has led to an admittedly frustrating disconnect between public mass media, which continue to approach gaming as a primarily juvenile pop-cultural phenomenon (with all of the hysteria accorded to this cultural sector whenever its potency or influence becomes a perceived issue), and a deeply invested critical community of game designers who see where videogaming can evolve as a new art form.
I'm sure many critics in the blogosphere would like to see the evolution of games accelerated toward their obvious potential, as recent discussions concerning the need for a "Citizen Kane" of gaming have indicated. But I have a confession: I actually hope that games remain in their current stage for a bit longer as the critical community eventually breaks free of the apologetic rut in which it seems to be trapped and instead begins to foment, mature, and refine the discourse surrounding games and their relationship with aesthetic cultural practice.
I think the current state of gaming is, in fact, the perfect backdrop for a deepening understanding of what the subject of gaming criticism should be, and what games themselves are capable of becoming.
In the past couple of years, for economic and technological reasons beyond the scope of this post, the attention of game criticism has been effectively split between two spheres: "AAA" games created by large development studios and marketed by publishers with deep budgets, and indie games usually developed by teams that can be counted on one hand.
The former sphere is driven by mass-market sales and largely composed of franchises or "intellectual properties" which are executed as flawlessly as possible and iterated for as long as the market or brand loyalty for that IP stands. Recent examples include Uncharted 2, Batman: Arkham Asylum, and Ratchet & Clank Future. Growth here is iterative and evolutionary, honed primarily to the tastes and expectations of each title's targeted demographic.
The indie sphere is characterized largely by innovative design and a rapid development model, a necessity given its restricted budgets and resources. It also seems to have a stronger imperative to create uniquely engaging and/or experimental game experiences aimed at a smaller, more auteur-oriented audience. Iterations and sequels are sparse here, and usually beside the point. Examples include Machinarium, Flower, and Braid.
I'm admittedly painting these two spheres in large brushstrokes, but I think that anyone who has read or participated in game criticism lately has noticed this rough distinction and the split attention it has effected. This has created some very productive and encouraging moments of critical dissonance where expectations developed in one arena have been brought to bear upon the other.
For instance, such dissonance has been fruitful in galvanizing critics into holding games accountable for "growing up" in the face of market forces that seem to be actively discouraging this growth in the AAA arena.
I think of this as the "ludic pull" in criticism, a drive to break gaming out of the imitative constraints and genre assumptions it has placed upon itself and to explore what makes videogaming experientially distinct and important as a medium unto itself.
In the opposite direction, the paradigm of the super-produced "blockbuster" title has continued to contextualize gaming in the tropes of cinematic narrative in its critical, marketing, and visual vocabularies, for better or worse.
While this has allowed games to mature in some aspects, such as visual devices and narrative structure, arguments have been brought against this implicit "cinematic imperative" in game design, which purportedly risks hampering growth and exploration, relegating videogames to a perpetual "para-cinematic" medium.
I think this fear of marginalizing videogames is largely misplaced and, to a certain extent, disempowering. This is the "apologetic pull" in game criticism, and it has outlasted its own usefulness. I agree with much of Michael Abbott's argument concerning the places where cinematic appropriations are actually worthwhile in videogaming, if they're taken confidently as tools in a larger palette. The problem is that most game critics and designers aren't entirely confident in that palette yet, mostly because it hasn't been fully defined.
In opposition to this apologetic pull, I'd ask critics to consider the following: Could films, in fact, come to be perceived as "paraludic" in the coming century?
When it comes down to it, I think the tables are slowly turning in this direction. Though it's difficult for visual arts and film critics to see it now, I believe that cinema in its current form will eventually be perceived as a subset of whatever it is that games are becoming.
In my perception, videogames aren't just a new narrative medium, visual art, or interactive entertainment. Agency, interactivity, and systemic thinking are indeed significant aspects of gaming, but they meld with subjective experience and imagination to such an unprecedented extent that I'd venture to say videogames are becoming a completely new cultural aesthetic practice. What we're facing is the birth of a new technology of the subject, or technology of subjectivity, which I don't think has really occurred since
That's a pretty big change to be evolving toward. As with most paradigm shifts of this order, criticism really won't have the vocabulary to wrestle with it until the shift has already occurred. Modernist critics couldn't entirely fathom or articulate the rupture that art had undergone in the late 1950s until well into the 1970s, when a philosophical discourse on contemporary art had finally solidified around necessary ruptures in its own assumptions, namely with the advent of post-structural and post-historical criticism.
I think gaming criticism is finally entering the preliminary stages of developing such a framework.
However, the recent pursuit of a "Citizen Kane" of gaming indicates that escape velocity from quasi-modernist genre concepts hasn't been achieved yet. When we can move past that question and put it to rest as, at best, misplaced, then the real questions can begin to be asked.