Age of Intellivision(s)

On the Paris-to-London-to-New York routes, circa 1955, you’d often hear and read that young writers were beginning to abandon their typewriters for experimental filmmaking. Sometimes, like black author James Baldwin, they’d go so far as to specify, “sixteen millimeter”; semi-pro, underground, but with some audience enthusiasm and help from the right critics, it could go anywhere. The guys who would have been novelists a generation ago were now aspiring towards an easier muse.

That beginning became the first stage of a metaphor that took shape in the minds of a number of the game and computer industries’ 90s visionaries, and would-be visionaries: games were leaving their shoot-‘em-up phase, entering a realm of creative potential that stood not only to equal Hollywood’s century of narrative art, but eventually to surpass it with new storytelling forms we could scarcely imagine. That era would begin at around the turn of the millennium.

With National Public Radio much in the news lately, you might enjoy this prescient bit of futurism from 32 years ago. One of the first mentions of what we’d now call “streaming video” is found in this June 24, 1993 article in the Chicago Reader. “One of the most telling moments of my ordeal-by-NPR came while Linda Wertheimer was interviewing a computer developer on what will happen when computers are linked into televisions—the so-called intelligent TV. He predicted the development of literally hundreds of new interactive television networks and services ‘that would give the individual TV viewer an incredible amount of power to program for their own tastes rather than have to rely on these programming guys.’ Replied a perturbed Wertheimer: ‘Is there any way we can dodge this bullet?’”

PC gaming took flight, with a distinctive sub-culture of its own, including low-bandwidth online games. But living rooms didn’t tend to have computers; around the household couch, simplified, hot-rodded video game consoles still had the lure of being easy to use, with much lower prices than computers, and faster screen performance.

I’ve always thought Intellivision was a clever brand name. It was Mattel’s early video game system, and the name stood for “intelligent television”: both a bit of an insult, earned or not, to TV-as-it-existed, and Mattel’s proud but inflated boast of what their plastic toy could do to make it smarter.

The Nineties started with a duopoly in living room console gaming between rivals Nintendo, a century-old maker of card and board games (think: the Japanese equivalents of Bingo and Monopoly), and Sega, which dominated Japan’s public video arcades (think: pool halls, juke boxes, and the Mob). Sega achieved a win in the home gaming wars with the Genesis. For a long time, Nintendo and Sega were like the Leno and Letterman of their field, a stable rivalry.

The contest evolved into a rivalry between the kings of the home computer and the kings of the living room television; which empire would conquer the other? It didn’t quite work out that way, but it wasn’t a crazy way to look at the starting positions of the players in an extraordinary media and consumer industry gamble.

Gaming consoles could present email and proto-websites on ordinary pre-HD TVs. In an era of 56K modems, before broadband, let alone online multimedia, wouldn’t that have been enough to win what was then called ‘the living room war’? How much more help from cheap tech would it take to turn a “toy” game machine into a family’s digital entertainment center? It was a pragmatic, bottom-up way of looking at household computer needs circa 1985-’95.

In the mid-Nineties, video game graphics made a major jump in quality, partly thanks to what SGI (Silicon Graphics) had learned from designing the vastly more expensive workstations that were coming into motion picture CGI by then. 3D graphics (not stereoscopic 3D, the kind you see with glasses) were a cinematic-level advance in visual realism. The Nintendo 64 was a clear winner over rival Sega’s Saturn, but now there was a massive and determined new player in video games, Sony, leveraging its assets in consumer electronics and manufacturing, its successful invasions of the movie and music industries, and a limitless war chest.

But Nintendo would survive, then and later, by tacitly dividing up the battlefield with a rival. They didn’t compete on the latest and coolest of fast game hardware, and would consistently market a less expensive console and a smaller library. That was the root of their success: they aimed their products at kids and families. Their most popular game characters were like classic age Disney. In the insular world of Japanese business, Nintendo tacitly left most over-15s to Sony.

Unsurprisingly, Bill Gates, Steve Ballmer, and the rest of Microsoft looked at it another, top-down way: could a PC, or some version of one, be stripped down into an un-computerlike push-button game console? It could already do all the big stuff. Could it do the small stuff cheaply and flashily enough to take over the living room?

“Opening the Xbox” (2002) is a fairly typical, maybe slightly better-than-average technical/business history. It’s another case of the world of 25-30 years ago being mostly the same as today’s, but as you read, differences emerge.

For example, there are women with roles in the story, but in 1998 it’s still a 95% male cast of characters. Because Microsoft is (at least supposedly) stodgy and buttoned-down, team building for the game team seems to be performative jerk behavior, stupidly overcompensating for nerdiness along the lines of “We got kicked out of the restaurant for being rowdy! We bought matching leather jackets and got matching Microsoft tattoos! Then, we hired strippers and models to wear slutty outfits and serve green vodka Jello shots! Xbox green, get it? We are so cool!!”

It’s basically a high-tech work environment familiar to anyone who’s read Tracy Kidder’s 1981 “The Soul of a New Machine”. There are still young teams in internal competition, whether designing hardware or writing software. There are still tradeoffs and compromises to get a product out the door. “It’s better than what it’s replacing and is (claimed to be) better than the competition.” How much better? How much more does it cost to achieve it? Looking ahead, how long are they likely to be able to maintain a lead in a competitive market?

To simplify a bit, in “Opening the Xbox”, there are three major rivals: Sony, the world’s largest consumer electronics company; Microsoft, the world’s largest software company; and Nintendo, effectively the world’s largest toy company. One sidelight of the Xbox mission statement was the central conviction that if Microsoft created the finest-in-class platform for graphic and narrative art, the strongest artists in video gaming would just naturally migrate to the Xbox.

There’s some truth in that, but there are so many conflicting tradeoffs to make. How big is the installed base of consoles? What’s the royalty fee structure? What’s the fickle public perception of the platform, and its gameplay characteristics? Cool or utterly sad?

As the story begins in 1996-‘98, Atari, the once spectacularly successful American games pioneer of the Eighties, is long gone. (Technically, no. Their entry in the high-powered, high-stakes console wars, the Atari Jaguar, was in stores. But yeah, effectively it was already dead.)

By the end of the Nineties, a new generation of “super consoles” was being developed that would eventually include Sony’s 2000 PlayStation 2, Microsoft’s so-far on-paper-only 2001 Xbox, and archrival Nintendo’s 2001 GameCube.

The first one out of the gate, Sega’s 1999 Dreamcast, sold a million units in Japan in its first year. But it didn’t acquire much worldwide momentum. The console was so expensive to develop and manufacture that Sega soon threw in the towel and became a pure games software company.

That seemed to leave just Sony and the upstart, Microsoft, both with so much money from other businesses that they could afford to slug it out nearly forever. That’s the main drama of the few, intense years covered by “Opening the Xbox”, and it’s where the cliffhanger ends, after the Xbox’s New York premiere in November 2001, less than two months after 9/11.

The author is prescient enough to guess a few bumps in the road ahead. One brave choice was the Xbox’s internet-only, no-modem design. It made things consistent for developers and brought online gaming to a new degree of visual quality, but broadband took a couple of years to reach most people. So did HDTV, but that wasn’t really an issue. Xbox, like consoles from that point on, could be connected to either standard.

As predicted, even with a better-than-decent launch, it took a while to build up a game library that was even one quarter of PlayStation’s. The Xbox console’s fate with Japanese consumers, a matter of some initially optimistic interest to Microsoft, would be a lastingly disappointing one. Far from being an American samurai, it was a flop. But in most of the rest of the world, Xbox games would eventually be competitive with Sony PlayStation’s, and in some years surpass them.

A quarter century later, both companies plus “little” Nintendo would all still be in the game console business, although in recent years, the party has finally seemed to slowly fade for the hardware side of Xbox. Most recently, the Microsoft ecosystem was rocked by the news that Xbox’s iconic hit Halo would now be non-exclusive to Xbox. It seemed to mark the end of an era, one that was conceived in the world of the Nineties.

These articles are derived from lectures, talks and web posts. Most have also been posted on Ricochet.com.