Today, the most novel feature of new technology is ordinariness.
The logo for the Dutch videogame studio Guerrilla Games is an object lesson in mixed metaphor: an orange "G" contorted into the chevron shape of a military rank insignia. Guerrilla insurgencies are often organized and sometimes even state-backed, but they are hardly represented by the formal emblem of a command-and-control military structure. Guerrilla warfare is irregular, asymmetrical, and lithe. It ambushes and sabotages, seeing itself as a noble defense of the many against the oppression of the few.
Guerrilla Games's managing director Hermen Hulst is a hulking blond with a square jaw and a name that deserves to run an insurgency. He's taken the stage in front of his studio's big logo at the PlayStation 4 launch announcement in New York City, a kind of meta-marketing effort at which the electronics giant announces but doesn't quite reveal its "next generation" videogame console.
Hulst is anything but a guerrilla. His studio has been a wholly owned subsidiary of Sony Computer Entertainment since 2005, during which time it has exclusively produced sequels to one title: Killzone, a first-person shooter set in a science-fictional future in which human descendants have mutated to adapt to life on the harsh planet Helghan. It is a game in which burly dudes in body armor discharge big guns at shadowy men with superhuman powers in orange-eyed gas masks. It is a game that makes you want to use the word "motherfucking" unnecessarily when you talk about it.
Hulst has taken the stage to announce the latest specimen in the franchise his studio is destined to iterate forever. Its title, Killzone: Shadow Fall, feels almost computer generated, as if chosen from a thousand candidates for the most absurdly, mind-numbingly plausible phrase to be rendered in riveted-metal or orange stenciling on a shiny, black box.
Shadow Fall is set in a sprawling, futuristic city. Like everything in this PlayStation 4 presser, it looks gorgeous: glass and metal spires, floating island-ships, fluorescing aqua ribbon-windows flanking skyscrapers that rise from waterfalls. Hulst explains that in this chapter of Helghast lore, the city has been divided by a security wall, making the game feel "like Cold War Berlin."
The first of many boring game walkaround videos plays behind Hulst. A flyover of the environment. A player-character in a bulky suit. He (always he) maneuvers through a crowd before an explosion initiates a barrage of gunfire. A helicopter-like vehicle. Some sort of metal rope. The metal rope looks nice, I guess, if you're into metal rope. In any case, it's nothing like Cold War Berlin, no more than Transformers is like the Rwandan genocide.
As it turned out, Hulst wasn't the only "guerrilla" executive shilling mainstream pulp as faux-politics. Sucker Punch game director Nate Fox came out swinging like Sean Penn, decrying the rise of the police state in contemporary America and Britain. "Our security comes at a high price: our freedom," he moralized, waiting a beat before proposing an exit from such dystopia: "What if a handful of people developed superhuman abilities?" You can look forward to the results in inFamous: Second Son, yet another sequel in yet another franchise.
Meanwhile, as two of its invited developers use police-state metaphors to introduce their titles, Sony officials announce a slew of me-too "social" features -- including a "Share" button on every controller -- meant to connect the console to the myriad corporate surveillance services that have become today's norm.
Other promises of novelty seem all too familiar, including the console's continued obsession with visual verisimilitude. Even the irascible French writer/designer David Cage equated emotional depth with visual resolution, promising that this time the hardware could finally provide it. His evidence? The empty, chestnut eyes of an old man's disembodied head. Drive Club creator Matt Southern argued that better racing games demand more accurate seat upholstery leather grain simulation. Blizzard, Bungie, and Square Enix showed projects they'd already announced, confirming that the PS4 would serve as a decent enough host for high-gloss, big budget videogames. Sony made its own concessions, adding enough digital download, streaming, social, and touch-control features to lightly tick the boxes of current trends so as not to terrify its shareholders for a quarter or two.
It's easy to feel disappointed by Sony's ambitions, but perhaps that's the wrong attitude. What if the real delusion can be found in our expectation for something "revolutionary" in Sony's announcement rather than in their having failed to deliver such revolution? What if the problem is not the lack of novelty, but our assumption that novelty ought to be imparted in the first place? Perhaps the best thing about the PS4 is that there's nothing very special about it.

What's novel about novelty anyway? It seems like a tautological question. We get excited about newness for the novelty of it. In order to see or do something we previously couldn't. In order to do the same things better or faster or cheaper. Novelty excites us because it promises something fresh. Novelty is the product of innovation, that favored value of our contemporary technological lives.
But innovation installs a trap for itself: it must continue endlessly. Like its parent, economic growth, innovation must be ceaseless to be coherent. If a medium or a technology can simply be improved to a point beyond which further improvements become incremental or invisible, then it risks plateauing, ceding ground to another one capable of imparting even newer newness.
Over the past decade in particular, innovation in consumer electronics has become a cultural affair as much as a technical one. Internal details and outer capacities that a previous generation would have found only in esoteric enthusiast publications now produce mainstream headlines. Even the announcements themselves, previously limited to industrial and retail audiences, have become cultural events. An Apple product reveal shares more in common with a televised awards ceremony than it does with a corporate sales conference.
Of course, inevitably, everything plateaus. The new MacBooks or iPhones or PlayStations become less surprising, and yet our expectations rise with each iteration. At such a juncture, innovation becomes less about generating creativity, originality, ingenuity, and other related virtues and more about calling a set of reasonable if tepid decisions "innovative" in order to produce the rhetorical force of novelty absent its earnest payload. Innovation is often a simulation of innovation. Yet, nobody wants to run the headline, "Consumer Electronics Company Announces Slightly Revised Version of Popular Product" -- nor to have it run about them.
But we need not assume that the era of technical innovation has concluded to recognize that the rhetoric of innovation may exert a cognitive and cultural burden that is too onerous to bear. In order for a technology to become widespread and familiar enough to have broad influence, it must begin to disappear. We must use it more than we talk about it.
This is the paradox the media philosopher Marshall McLuhan described with a notion borrowed from Gestalt psychology: figure and ground. The figure is what we notice, the medium or technology we see and use, think and talk about. But that medium cannot exist in a vacuum; it works inside a context, or a ground. Understanding a technology requires an examination of both facets. McLuhan's famous quip "the medium is the message" also invokes the figure/ground idea: instead of focusing on the "content" of a medium, like a television program or an app or a videogame, we ought to consider the technological forms that deliver that message.
Technologies like television and automobiles work so well because we have forgotten about them. We've put them into the backs of our minds. They are the water we swim in. That's not to say that nothing new can arise. HDTV and electric vehicles offer examples of innovations that are rapidly and broadly comprehensible thanks to our familiarity with the domains they extend. And as a result, novelty in TV or automobiles is precious. It's much harder to make an innovation stick precisely because we have so deeply incorporated such devices into our lives. Instead, novelty appears only when we have reason to consider it: helping your parents face the broadcast-to-HDTV transition, or familiarizing yourself with the latest display choices (plasma? OLED?) when an old TV breaks and you must select a new one.
At first blush it might seem like videogame consoles are unlike televisions and automobiles, subject to more disruptive and lurching change. After all, we need not buy new programming or new roads every time we upgrade our sets or our cars, while a new PS4 requires a new investment in games, peripherals, subscription services, and so on. But thanks to the acceleration of planned obsolescence, we have become more accustomed to losing whole libraries of media when new devices emerge. The VCR gives way to the DVD and then to the Blu-ray. Or for that matter, the Apple device that ran your favorite apps and games effectively last year suddenly slows to a crawl with the latest OS update, one that proves necessary to run the latest program updates anyway. And besides, media like apps, movies, and games are consumables, in the sense that we use them up rather than using them over and over. The truth is, all media technologies have become seasonal fashions more than stable furnishings.
Given our increasingly promiscuous technological lives, perhaps the dissonance in Sony's PS4 announcement has less to do with mismatches in the claims of its hardware and software designers, and more to do with the idea that anybody would bother to make such claims in the first place. It would have been enough just to say, "Look, we made a badasser PlayStation, just like you expected."
A first-person shooter is a first-person shooter. A driving sim is a driving sim. FIFA is FIFA. There's nothing revolutionary about them, no more than there's anything revolutionary about a wacky family sitcom or an apocalyptic action flick. Sure, some new digital filmmaking technique or digital distribution mechanism might slightly alter our experience of such media, but the moment we focus on that purported innovation, we tend to become annoyed and distracted. Just think of Hollywood's recent experiments in technical innovations like 3D and 48 fps -- they leave audiences cold. All we really want are decent films that don't run so damned long.
David Cage's dreams of interactive cinema notwithstanding, big budget, high-gloss commercial videogames aren't appealing because there's some new zenith to reach, one only made possible by a controller with a "Share" button or a touch-pad or a streaming delivery system. They're appealing because they have reached their zenith, and their zenith is also their nadir. In some sense, we love big budget, high-gloss videogames because they are terrible, because they have to compromise to a lowest common denominator to justify their absurd budgets, and because we secretly know that our highest aspirations for them are actually quite low. We don't really want Cold War Berlin; we want big motherfucking guns.
This is an unpopular claim to make out in the open. Few will own up to the fact that they mostly want to shoot aliens and play Madden NFL on PlayStation, just like they mostly want to see explosions and watch football on TV. That's not to say that's all videogames can do, of course. Just as Downton Abbey or Storage Wars or Mad Men make different uses of the plasma TV, so some games disrupt videogames' crass pornography of playable action: Cage's quasi-interactive drama Heavy Rain, Thatgamecompany's stylized environmental dreamscape Journey, or Jonathan Blow's dense world-puzzler The Witness, a game whose PS4 launch was even revealed at the Sony announcement. But to become figures, these works have to arise from the wide, stable base of videogames' ground. We need the tripe to make the Kobe sensical. After all, 50" TVs and surround sound home theaters weren't invented to make Lord Crawley appear more visually captivating or more aurally directional.
Such pragmatism may seem vulgar, but it's also weirdly bewitching. There's something earnest about embracing the dumb ugliness of stupid games, of wanting to play the latest, shiniest versions of Killzone and Gran Turismo not because they contain anything new or different, but just because we recognize our own propensity to produce endless desire for them.
The PlayStation 4 errs not by being something other than what it is, but by holding on to the idea that its particular brand of novelty is in any way novel, by mistaking itself for figure rather than for ground. By calling itself "PlayStation 4" instead of just "PlayStation," because really all anyone wants is whatever PlayStation is made available, doing whatever things it ought to do at whatever moment it does them. Apple recognized this problem when it tried to correct the mistake of the "iPad 2" by referring to its follow-up as just "the iPad," a name that still hasn't stuck. Leica, the old and traditional German photographic and optical equipment company, stopped numbering its digital M rangefinder cameras this year, after burning through as many numeric increments in six years as it had in the previous two decades. At some point, a camera is just a camera, no matter how nice it is.
Maybe it's time. Our gizmos and gadgets, our phones and laptops and tablets and videogame consoles just aren't very special anymore. And counter-intuitively, that's what's so newly special about them: their familiarity, their ordinariness. Their ever-accelerating status as ground rather than figure. We mistakenly believe that the label "next generation" implies newness and innovation, a promise of the technological utopia we've been dreaming of. But if you pause to reflect on the matter, you'll quickly realize that all those earlier generations were once next generations themselves, for some previously current generation. Innovation is like a Chinese finger trap: the more you tug deliberately at progress, the less progress you make, because the deepest, most profound novelty is the kind that blinds us to novelty. Every "next" thing shouldn't have to be a revolution. It can just be what comes next.