With a Whimper: How Science Fiction Shows Die

I’ve just finished watching the fourth and final season of Continuum, a show I raved about back in 2013. By the final episode I was watching just to get it over with. I’d come this far with the show, I’d be damned if I’d quit before I found out how it ended.


There’s nothing left now but the Continuum trading cards.

It was a disappointing experience. Continuum had barely managed to win renewal for a six-episode final season and felt drained of energy as it trudged toward the finale. The budget appeared to be lower than in previous seasons, which is not necessarily bad in itself — Continuum isn’t a show that needs the sort of spectacle it had back in its first year — but there didn’t even seem to be enough money for retakes. If an actor gave a flat line reading, it was in the episode. And the actors were moving through the scripts like zombies. Maybe cast morale was low. After all, they were six episodes away from being out of work.

The last episode, where we finally learned whether Kiera Cameron was able to return to her son in the year 2077, was perfunctory. It more or less resolved the story, albeit with a sad twist at the very end, but too many plot threads from earlier seasons went nowhere and seemed utterly pointless in retrospect. By the time the final episode arrived there were far too many characters, and too many of them were uninteresting. It was quite a comedown from the brilliant first season.

Continuum. And a bearded guy.

Who is that bearded guy, anyway? And what’s he doing on what used to be a good show?

Yet when I gave it a nanosecond of thought I realized that this is the rule for science fiction shows, not the exception. Remember the wonderfully conceived Battlestar Galactica reboot on SyFy/Sci-Fi? (Of course you do. It was only six years ago.) It was a beautifully filmed, morally complex show, much like Continuum, yet by the final season it had degenerated into mystical BS. And then there was Lost, which ultimately managed to give mystical BS a bad name. (The other day I came across a blog post by Lost writer Javier Grillo-Marxuach explaining that Lost ended exactly the way they’d planned from the beginning, which didn’t make me feel any better about the hot mess that the show turned into in its final year.)

What about other great science fiction and fantasy shows? Firefly never had a chance to jump the shark because it was cancelled after only fourteen episodes, but that other Joss Whedon show, Buffy the Vampire Slayer, had a funereal, almost tedious final run. (Sister show Angel pulled off a decent final season by the skin of its canine teeth.) Fringe managed to keep it — mostly — together, but it was clear that its truncated 13-episode final-season renewal (abbreviated farewell seasons apparently being a thing now) didn’t give the writers enough space to resolve everything that needed resolving. Characters alluded to events that we hadn’t even seen, suggesting that entire scripts had been dropped. Too bad, because the last season involved a nifty twist that had clearly been planned from the beginning. (Watch the first two seasons again and notice how many hints are dropped about what eventually happens in Season Five.)

The iconic example of a science fiction show ending with a long, drawn-out whimper is The X-Files. Creator Chris Carter has said that they expected to run for five seasons and had just enough story for that many, which explains why by the sixth season the show was monotonously vamping its way through its so-called mytharc episodes. Frankly, I still don’t understand what the show’s underlying mythology was about, but maybe the miniseries this coming winter will explain it.

To be fair, this isn’t just a problem with science fiction shows. Most successful shows are allowed to stay on the air until they reach their level of incompetence, with only a few gracefully stepping aside once they’ve put together enough seasons for a syndication package (or a Blu-Ray set). It’s harder to name a show that stayed good until the end than it is to name a show that fell apart. Those two AMC stalwarts Mad Men and Breaking Bad pretty much pulled it off, though both had seen better years than their last ones. Despite a calamitous dip in the middle when showrunner Aaron Sorkin left, The West Wing came close, finishing with a bravura two-season election arc that only faded at the very end, when the death of star John Spencer forced a hasty rewrite of the election results.

In the age of serial television, though, the tendency of shows to plummet in quality toward the end seems particularly regrettable, given that viewers caught up in the continuing plot arcs are reluctant to abandon shows that just aren’t as good as they used to be. (Okay, I’m reluctant. I can’t speak for anybody else.) With a standalone show like Law & Order, there comes a day when you simply stop watching and never look back. But if I’d given up on Continuum, I’d be forever wondering whether Kiera Cameron eventually got back to the future.

The Most Miserable Time of the Year: Sad Christmas Songs

Christmas is a happy time, right? Okay, I know there are people who immediately groaned when they read that. Some of my friends turn into Ebenezer Scrooges immediately after Thanksgiving (though I won’t name any names). In general, though, Christmas is a time of joy and celebration, a time for televising the umpty-umpth rerun of every Rankin-Bass Christmas cartoon ever animated, for getting drunk on spiked eggnog and mulled wine, and for listening to Christmas music on the radio, your iPod or your favorite streaming music service. And Christmas music must be the happiest music ever written. Right?

Blue Christmas

Elvis may be smiling on this album cover, but it’s a sad smile.

Not necessarily. It took Amy, who being Jewish has a somewhat different perspective on Christmas music than I do, to point out what should have been obvious to me years ago: A lot of modern Christmas songs, maybe more Christmas songs than not, are real downers.

I’m not talking about religious Christmas music, which is mostly about how thrilled the singers are over the birth of their savior. And I’m not counting Greg Lake’s “I Believe in Father Christmas,” which isn’t really a Christmas song so much as it’s a “bitter-rejection-of-Christmas” song.

When you get into secular Christmas music, though, the landscape starts to change. Sure, songs like “It’s Beginning to Look a Lot Like Christmas” and “Let It Snow” are so upbeat that they make you want to dance. “The Christmas Waltz” even tells you what dance you’re supposed to dance to it. But a surprising number of pop Christmas songs are about how depressing a time of year this can be, especially if you want to be someplace that you’re not or with somebody you can’t be with.

I don’t know if it was the first depressing Christmas song, but Irving Berlin’s “White Christmas,” at one time the most popular Christmas song in the Great American Songbook, is definitely a sad song. As the rarely used intro says, “It’s December the 24th and I’m longing to be up north.” It’s about a person (probably Irving Berlin himself, stuck in Los Angeles writing songs like “White Christmas” for the movies) having a somber Christmas because it’s not “like the ones [he] used to know.”

During World War II, there were thousands of GIs who wouldn’t be home for Christmas because they had a war to fight in Europe and the Pacific. The 1943 song “I’ll Be Home for Christmas” acknowledged this by painting a picture of a perfect Christmas homecoming — “Please have snow and mistletoe and presents on the tree” — then blowing it apart with what may be the most heartbreaking punchline of any song from that era: “I’ll be home for Christmas…if only in my dreams.”

By the late 1950s, Elvis Presley was singing about having a “Blue Christmas” (a song actually written in the late 1940s), lamenting that the object of his affections would “be doing all right with [her] Christmas of white, but I’ll have a blue, blue, blue Christmas.” (Three “blues” in a row. Can’t get much bluer than that.) The song was a huge hit and still gets played, in versions by Elvis and dozens of others, every Christmas. And note how it neatly references the Irving Berlin song, not only in its title but in that line about a “Christmas of white.”

It may have been Karen Carpenter, though, who really started the sad Christmas ball rolling with 1970’s “Merry Christmas, Darling,” a song written by her brother Richard in collaboration with Frank Pooler and not made any happier by the fact that Karen died a much-too-early death a little more than a decade later. It’s a sweet but heartbreaking song about a woman separated from her beloved for unspecified reasons and remains one of the greatest Christmas songs to come out of the 1970s.

The Karen and Richard Carpenter Center for the Performing Arts

The Karen and Richard Carpenter Center for the Performing Arts, made relevant only by the fact that we saw David Benoit and his trio perform a tribute to A Charlie Brown Christmas there a few nights ago

Dan Fogelberg must have been drinking from the same pain-spiked bowl of eggnog when he wrote and sang 1980’s admittedly rather sappy “Same Old Lang Syne,” a song that’s often played as a New Year’s song, but that takes place on Christmas Eve. In it, the singer meets an old girlfriend who apparently broke his heart long ago and who is now in a loveless marriage with an architect. (Honestly, Dan, I don’t know how you got through the song without gloating — unless the whole song is one big gloat.)

It’s hard to say if Mariah Carey’s 1994 Christmas hit “All I Want for Christmas Is You,” released again in 2011 as a duet with Justin Bieber, is a happy song or a sad song, given that she leaves us hanging in the end as to whether the person she loves leaves her standing, kissless, under the mistletoe. The uptempo arrangement suggests that this is really a happy Christmas song, but we don’t know for sure. Come on, Mariah. End the suspense and tell us if Mr. Right finally showed up! (Given that form-fitting Santa suit you’re wearing on the album cover, I’m guessing he did.) And feel free to give me one of those presents under the tree that you seem so uninterested in. I’m guessing there’s some pretty expensive stuff in there.

Mariah Carey

Yeah, I’m sure the guy never showed up, maybe because her then-husband Tommy Mottola would have had his fingers broken.

When talking about sad Christmas songs, though, there’s one that leaves all the others in the dust (or snow) and you certainly wouldn’t guess it from the song’s name: “Have Yourself a Merry Little Christmas.” As sung by Judy Garland in the 1944 film Meet Me in St. Louis, it’s about a family being torn away from friends and relatives, not to mention having to miss the 1904 World’s Fair, because of their father’s job in New York. Garland sings the song to comfort her little sister, played by Margaret O’Brien, but even so, lyricist Hugh Martin had to soften his original draft of the song, changing lines like “Have yourself a merry little Christmas./It may be your last./Next year we may all be living in the past” to the marginally more cheerful versions heard in the film. And when Frank Sinatra recorded the song, he asked Martin to “jolly up” the lyric “Until then we’ll have to muddle through somehow.” It became “Hang a shining star upon the highest bough.” Ironically, when Judy Garland sang the song to her daughters on television in the 1960s, she sang the Sinatra version rather than the one she had sung in the film.

The truth is, Christmas music is like all other types of pop music. There are sad songs, like the ones we’ve discussed. There are upbeat songs, like “Sleigh Ride.” And there are dance songs, like “Rockin’ Around the Christmas Tree.” Christmas may be a happy occasion — although we’ve been told many times that suicides soar around the holidays, Snopes.com says that’s just an urban legend — but the truth is that we’d eventually get sick of hearing nothing but cheerful music.

I say bring on the heartbreak songs, if only because sometimes it brightens your day to learn that someone else’s life is going even worse than yours is.

UPDATE: After I posted the column above, Amy and I discussed some other downbeat Christmas songs. The most obvious was Wham!’s 1984 “Last Christmas,” which earns its sadness rather cheaply: The singer apparently had a one-night stand last Christmas and expected it to outlast the holiday. It didn’t. Another Christmas song that could conceivably be seen as a downer is “My Grown-Up Christmas List,” a much-covered song from 1990 that Amy Grant turned into a hit. Although the song is hopeful — it’s about an adult who asks Santa Claus to cure everything that’s wrong with the world — it requires the singer to list all of those things, and that’s the downer part. “What,” the singer asks, “is this illusion called the innocence of youth? Maybe only in our blind belief can we ever find the truth.” Upper or downer? Your call.

My vote for most suicide-inducing Christmas song of all time, though, goes to “I Heard the Bells on Christmas Day,” which has lyrics taken from an 1863 poem by Henry Wadsworth Longfellow called “Christmas Bells.” Depressed by the death of his wife and by his son’s severe wounding in the American Civil War, the poet wrote:

And in despair I bowed my head
“There is no peace on earth,” I said
“For hate is strong and mocks the song
Of peace on earth, good will to men!”

Then pealed the bells more loud and deep:
“God is not dead, nor doth He sleep
The Wrong shall fail, the Right prevail
With peace on earth, good will to men.”

I suppose Longfellow was trying to find hope and meaning in the tragedies he had been through, but there’s a sense that he doesn’t believe a word of it. If “Have Yourself a Merry Little Christmas” is about a family falling apart, “I Heard the Bells on Christmas Day” is about a world falling apart. It makes “Merry Christmas, Darling,” where the missing lover is at least alive and presumably coming back some day soon, sound positively cheerful.

Have Yourself a Haunted Little Christmas

Disneyland loves holidays. They commemorate the major ones by, at the very least, sprinkling decorations along Main Street USA. Halloween gets pumpkins and skeletons. And for Christmas there are colored lights on almost everything, sometimes covering an attraction completely. (See Sleeping Beauty’s Castle, below.) A few attractions receive a full Christmas overlay, which is more than just colored lights on the outside. In my last post, I talked about the Christmas overlays for the Jingle (usually Jungle) Cruise, It’s a Small World, and the World of Color. But there’s one attraction where Disney installs an overlay in September that stays in place right through Christmas: the Haunted Mansion.

The Haunted Mansion

“Ho Ho Arrrggh!”

The Haunted Mansion gets an overlay based on the movie The Nightmare Before Christmas. Because the movie is about both Halloween and Christmas, so is the overlay, mixing scary (and somewhat silly) monsters with Christmas ornaments. All of this is on top of the Haunted Mansion’s usual array of blood-curdling screams, tricky elevators and holographic ghosts.

A haunted ballroom

A haunted ballroom decorated with Christmas ornaments

The best way to get into the Haunted Mansion is to get a FastPass. FastPasses can be obtained, at no extra charge, by sliding your Disney Passports into a small machine an hour or more before you plan to visit an attraction, in return for which you get these small tickets:

Haunted Mansion FastPasses

FastPass tickets for the Haunted Mansion attraction

This tells us that our passes are for the Haunted Mansion and can be used between 2:05 and 3:05 p.m. (Your times may vary.) It used to be that Disney was relaxed about the length of the time window you had for using the passes. Alas, they claim to be cracking down. For instance, we couldn’t have used those FastPasses later than 3:05, though there was a time when you could have. (It was never possible to use the passes earlier than the time stamped on them.) During the magic hour within which the pass is active, you can skip the often lengthy Mansion lines and leap ahead of most of the crowd, getting onto the ride in five or ten minutes. If you’ve ever stood in the line for a popular Disneyland ride, you know what a time savings like that is worth.

Once inside the Haunted Mansion, you find yourself in a foyer that turns out to be an elevator, with pictures on the walls and ceiling that stretch into more and more terrifying forms as the elevator goes down.

The Haunted Elevator

Beautiful Christmas images in the Mansion’s elevator entrance…

Monsters in the elevator

…transform into monsters as the elevator descends.

Once you get downstairs, you wind through hallways that lead to the moving cars that carry you through the mansion. And, on the wall, you see images relating to the movie:

Scenes from the Nightmare Before Christmas

Pictures of everything from an innocent snowman to Jack the Pumpkin King playing Sandy Claws

Most of what’s fun about the Haunted Mansion are the animatronic horrors you see along your ride and the haunted ballroom filled with holographic ghosts that you can view from the rafters above. Here are some of the photos that Amy and I took on our last visit:

A tentacle with pumpkins?

Is that a tentacle covered with jack o’lanterns or a giant wizard’s hat?

Halloween at Christmas

The town of Halloween discovers Christmas.

Christmas list

Sandy’s naughty-or-nice list

Jack as Santy

Jack the Pumpkin King disguised as Sandy Claws

Angel skeletons

“Angel skeletons we have heard on high!”

After the ride, we dined at the Pizza Port restaurant in Tomorrowland, then went back to Main Street and watched the park’s imagineers light the Sleeping Beauty Castle, gateway to Fantasyland:

Sleeping Beauty's Castle

Sleeping Beauty’s Castle with Christmas lights turned on

We did some other things, like visit Crush the turtle (from Finding Nemo) and draw pictures of characters from Disney cartoons at the Animation Academy in California Adventure, but I’ll devote entire blog posts to those things later on in our second Year of Living Disney. By then, Christmas will probably be over, though Disneyland Resort will continue its holiday celebration right up through January 6, when we may go back to get one last glimpse of the Christmas overlays before they go away. We’ll see.

Another Year of Living Disney

How you feel about Disney — the corporation, their movies, their parks — is a litmus test for something, but damned if I know what. One thing it’s surely a litmus test for is how closely you’re paying attention. The parks and the feature-length animations have gone through dramatic ups and downs over the years: under Walt Disney himself, during the Jeffrey Katzenberg renaissance, under later budget-cutting management and now under the creative auspices of Pixar’s John Lasseter, who has, at least for the present, stabilized the company’s heart and soul and put them in a very good place.

In the Lasseter era, Amy and I have come to love most things Disney. Three years ago, we had SoCal annual passes to Disneyland, which is about 40 minutes away from us in Anaheim. I wrote a blog post about it when it was over and promised to write more, but I never got around to it. I want to keep that promise now and write about some of our most interesting experiences at the park as they happen, before they fade into the vasty nothingness of my memory and become all bibbledy. (That was a paraphrase from Kaylee in Serenity, in case you’re trying to remember where you’ve heard it before.) We activated the passes on December 5, 2014, which means they’ll be active through December 5, 2015 (they throw in the 366th day as a bonus), and we’ll get to see two Christmas seasons at Disneyland. And if you can’t get into the Christmas spirit at Disneyland, you’re due to be visited by three ghosts on Christmas Eve. Prepare accordingly.

Christmas in Disneyland

The most wonderful time of the year at the happiest place on earth

(My apologies to those who don’t celebrate Christmas. I’m an atheist and I’m already planning how to decorate the tree. I consider it a season for happiness, love and lots of pretty lights. Then again, I don’t even mind when they start playing Christmas songs before Halloween.)

Our friends George and Greg were visiting from out of town, so we had a busy day planned. I expected to conk out somewhere during what became nearly a 12-hour visit, but at some point the Christmas spirit, the Disney spirit and sheer adrenaline kicked in. I haven’t had a better time in years.

To get an annual pass, you first have to register for one, then get it validated at the park. I’ve had the registration on my bulletin board for months:

Pass Voucher with Mr. Incredible

Mr. Incredible says: Give these guys their annual pass or…or I’ll do something incredible.

Disneyland is nothing if not efficient. After a short bus ride from the Toy Story parking lot (the easiest-to-use parking lot on Earth), you go to a booth and swap the voucher for a card:

Disneyland Annual Passcard

Donald, Mickey and Goofy in handy wallet size

What I’m calling Disneyland is technically Disneyland Resort, which consists of two parks (Disneyland and California Adventure), plus the Downtown Disney restaurant and shopping district, as well as several hotels. We headed for Disneyland first:

Train around Disneyland

The circumferential railroad over the entrance to Disneyland

where they waste no time in letting you know that this entrance is to Disneyland what the wardrobe was to Narnia:

Sign on Disneyland entrance

Abandon hope all ye…wait, that’s a different sign.

It being our first time back at Disneyland in a while, we took it easy on hitting the rides, just walking around looking at the sights. (It was also a Friday, when we tend not to go because of the line lengths.) We did, however, go on a few, the first of which was the Jingle Cruise, which is normally the Jungle Cruise but decorated with what the park calls a holiday overlay. It’s a far less elaborate holiday overlay than the ones for It’s a Small World and The Haunted Mansion:

Elephant decorated for Christmas

“Jingle tusks, jingle tusks!”

The overlay also meant that we got some new holiday jokes from the aspiring stand-up comics that Disney hires to serve as tour guides on the jungle, er, jingle boats. (To see what the Jungle Cruise looks like during the non-holiday season, check out my YouTube video from 2012. You’re welcome.)

I had plenty of time to make myself sick on the irresistible pastries sold in store after store on Main Street USA:

Mickey Mouse, Rice Krispies and M&M: Motion sickness waiting to happen

Mickey Mouse, Rice Krispies and M&Ms: Motion sickness waiting to happen

We also checked out several gift shops for potential Christmas presents. The gift shops, actually, are a wonder in themselves. There are dozens of them and every one seems to have a completely different line of t-shirts, plush animals and trinkets, obviously providing enough work for half the population of China, where most of them seem to be made. This includes the department-store-sized gift shop in Downtown Disney:

The World of Disney Gift Shop

The World of Disney gift shop in Downtown Disney

The World of Disney gift shop is vast and practically a park in itself:

Gift shop interior

Inside the gift shop park

But I’m getting ahead of myself. We met George and Greg at the Carthay Circle Restaurant in the relatively new Carthay Circle theater in California Adventure, a replica of the theater where Disney’s first feature film, Snow White and the Seven Dwarfs, opened in 1937, thus launching an era of movie animation that continues to this day. The restaurant itself is an elegant and old-fashioned luxury establishment where we had our own private dining nook:

Carthay Circle Restaurant

Our friends George and Greg dining with us in the Carthay Circle restaurant

Afterwards we went back to Disneyland and took the Disneyland Railroad around to It’s a Small World, a much-derided attraction that becomes spectacular at Christmas:

It's a Small World becomes a dazzling array of Christmas lights this time of year

It’s a Small World becomes a dazzling array of Christmas lights for the holidays.

They occasionally turn off the lights to project images on the exterior.


It's pretty spectacular on the inside too.

It’s pretty spectacular on the inside too.

Here’s my video of the 2011 Christmas overlay for It’s a Small World:

From It’s a Small World it was just a few feet to the viewing area for the Disneyland Christmas Fantasy Parade.

The Frozen float at the parade.

Frozen, Disney’s biggest cash reindeer in years, was well represented at the parade.

Woody in the Christmas Parade

Woody dashing through the snow

Santa in the Christmas Parade

Santa Claus is coming to town — and bringing the North Pole with him.

Here’s my video of the 2011 Christmas Fantasy Parade:

It’s had 145,000 hits so far, making it far and away my best-viewed video. (Apologies for the blurriness at the beginning, but it goes away quickly.) Here are some shots of this year’s parade:

We had a break before our next big event — more about that in a minute — so we caught the Disneyland Railroad into the prehistoric past, where the dinosaurs from the 1964 New York World’s Fair still roam:

At the end of the line (and back in the 21st Century), we cut through Downtown Disney to the Grand Californian hotel, where George and Greg were staying. We collapsed for a while in the lobby and looked at the Christmas decor:

Christmas tree in the Grand Californian lobby

The Grand Californian lobby, complete with humongous Christmas tree

which included a giant gingerbread house, recipe included:

Giant gingerbread house, also in the Grand Californian lobby


We weren’t the only ones who needed a break. Santa headed for the Grand Californian lobby too, but those darned kids kept bothering him:

Santa at the Grand Californian

Santa trying to take a break at the Grand Californian

Finally, we added the pièce de résistance (yes, that was a direct cut-and-paste from Wikipedia) to the evening by watching what I consider Disneyland’s single greatest attraction: the World of Color. Named for the Disney television show from the early 1960s, the World of Color is to other water shows what the Starship Enterprise is to a frisbee. Somehow the Disney imagineers make colored fountains of water dance in choreographed patterns as they project animated images onto them. Combined with sounds, projections on the roller coaster in the background and occasional gouts of fire, this results in one of those experiences that can only be fully appreciated in person, but I’ll do my best to convey it to you in words and images.

Since this was the holiday version of World of Color and Frozen has been such a monster hit for Disney, it was hosted by Olaf the Snowman, who I fully expect to replace Mickey Mouse as the iconic symbol for the entire corporation. (Maybe then they’ll stop fighting to get copyright extensions on old cartoons.) The character images have a kind of three-dimensional appearance when projected on water and they photograph rather blurrily, but here’s Olaf in all his fuzzy wet glory:

Olaf in World of Color

Olaf the snowman in World of Color

And here are his friends Anna and Elsa plotting to build him:

Anna and Elsa build Olaf

Do you wanna build a snowman?

One of the reasons it’s difficult to represent World of Color in pictures is that it’s hard to convey the sheer, soaring magnitude of it, but look at this photo:

The World of Color fountains

The World of Color fountains erupt over the crowd.

See that image of Mickey Mouse in the background? That’s on a Ferris wheel — and it’s HUGE. Yet the World of Color fountains dwarf it. And this isn’t even as high as they go.

By the way, the World of Color holiday show doesn’t just honor Christians (and Christmas-loving atheists like me):

World of Color honors dreidels

“I have a little dreidel…” (They play “Feliz Navidad” too.)

All amazing things must end. World of Color plays the crowd out with bowing fountains and a holiday farewell:

Happy Holidays from World of Color

Happy Holidays from World of Color!

Here’s my video of the 2011 version:

The images were quite different then, though it’s not hard to find a video of the current World of Color on YouTube:

And that’s it. It was nearly 11 p.m. and the first day of our new annual pass had come to an end. We said goodbye to George and Greg, drove home listening to Christmas playlists on Spotify, greeted our two cats and collapsed.

But our second Year of Living Disney is just beginning. More to come.

Sherlock Who? The Brilliance of Steven Moffat

I was never a fan of the original Doctor Who. Maybe I would have been if I’d given it half a chance — a science-fiction-writer friend of mine was so enamored of it that he was guest of honor once at a Doctor Who convention — but whenever I’d catch a few minutes of it on my local PBS station it looked like a science fiction home movie filmed in someone’s basement. (I still can’t bring myself to sit through those old episodes, even though I’ve now got them on two different streaming services.)

The Doctor and friends

Rory Williams, the Doctor, Amy Pond: Two parents, one child.

But the new Doctor Who, rebooted in 2005 by Russell T Davies, is different. For one thing, it looks like it was filmed in a much larger basement, one with a budget for special effects. Even that would be meaningless, though, without good actors, good characters and good writers. The new Doctor Who has those in spades and maybe the old Doctor Who did too; I’ll probably never sit still long enough to find out. Davies cast two wonderful Doctors, Christopher Eccleston (for only one season/series, alas) and David Tennant, and one of the best Companions ever, pop singer Billie Piper as Rose Tyler, with her dingbat mother Jackie (Camille Coduri) and deceptively useless boyfriend Mickey (Noel Clarke). Davies also gave stage actor John Barrowman probably the best television role of his career (Malcolm Merlyn on Arrow not excluded): the irresponsible, omnisexual Captain Jack Harkness, who was later spun off to his level of incompetence on Torchwood, an unfortunate move. But that’s another story.

During Davies’ tenure a new writer, Steven Moffat (who had earlier created the Britcom Coupling), joined the staff and wrote quite a few episodes, including a couple of brilliant ones: “Blink,” which introduced both the Weeping Angels and future movie star Carey Mulligan, and the two-parter “Silence in the Library”/“Forest of the Dead,” which introduced River Song (Alex Kingston), a time traveler whom the Doctor didn’t know yet but who definitely knew him. There was no question that River Song would be back; she said as much. The real shock was the way, or several ways, in which she came back.

When Davies stepped down as showrunner after the David Tennant years, Moffat replaced him and recast the Doctor with Matt Smith, who played him with the same manic energy as his predecessors but also with a surprising edge of melancholy. Moffat also introduced a new Companion, Amy Pond (Karen Gillan), who would go on to become as important to Smith as Rose Tyler had been to Eccleston and Tennant, perhaps more so, because her presence on the show ultimately revealed as much about the Doctor as it did about her. The first two-and-a-half seasons of Moffat’s tenure are basically the Amy Pond Cycle, and despite the standalone nature of many of the filler episodes — you know, the ones that come between the two-parters — Moffat managed to make everything seem more coherent, more of a whole, with recurring themes woven throughout a season or half season, even if only in the final moments of each episode: Amy repeatedly glimpsing a crack in space with light pouring out; Amy repeatedly glimpsing a woman with an eye patch staring at her through a rectangular hole; the Doctor repeatedly recalling his own death notice, the one he wasn’t supposed to see, with the time and place of his demise printed on it. These recurring images gave each season a focus and gave the viewer a sense of what sort of startling developments the story was hurtling toward.

Okay, if you’re much of a Doctor Who fan at all you probably saw all of this a while ago, but I’m running late and only catching up on the Moffat years now. If I’d known how good they were, I’d have done it sooner. But stick with me. I’m going somewhere here.

The Amy Pond Cycle — and I’ll explain why I call it a “cycle” in a moment, though if you saw it you can probably guess — explores a question that’s always hung over the series like a cloud but, to my knowledge, has never actually been addressed on the show itself: Why does the Doctor need a Companion? Sure, the Companion is a kind of audience surrogate and a device for giving the Doctor a sounding board for his thoughts and exposition, but you can’t rely for — what is it now? 36 years? — on a screenwriter’s crutch. The Companions, who are almost always attractive young women, must mean something to the Doctor, and it isn’t sexual (though Davies explored that angle by having Billie Piper’s Rose fall in love with David Tennant’s heartthrob Doctor).

But after some brief sexual tension in the beginning, it became clear that Amy Pond had no romantic designs on Smith’s Doctor. She loved her fiancé Rory Williams (Arthur Darvill) and returned to marry him, with the surprising result that Rory became not only a co-companion of the Doctor’s but an indispensable member of the cast. Whatever Amy meant to the Doctor, Rory came to mean it too. The idea that the Doctor is a child who likes to run off on impetuous adventures with his chums has been explored before, but never as intensely as Moffat explored it here. Smith’s ageless adolescence eventually outlasts the semi-ageless adolescence of the Ponds (who really should be the Williamses, or at least the Pond-Williamses, but only once, in what looked like it would be her final episode at the end of Season 6, does the Doctor call Amy by her married name).

Over time Amy manages to become mother to almost everybody on the show, including the Doctor, her husband and her daughter (and that last twist is so stunning that I’m not even going to mention it in case someone hasn’t seen it). And Rory, as the Centurion, actually manages to become older than the Doctor, who’s only 1,200 years old, while Rory makes it to 2,000. Even Amy ages into perhaps her 50s at one point (though that timeline is erased). In the end his chums have all outgrown him, or at least outmatured him, except perhaps for River, who drops in and out of his life like she drops in and out of the show (but is apparently seeing him on a nightly basis for a while after their marriage, as one episode implies). But it’s Rory, not Amy, who finally points out to the Doctor how irresponsible it is to flit randomly through space/time with innocent people aboard the TARDIS. Ultimately, the Doctor outlives the Pond-Williamses (though the show conveniently ignores the fact that he could simply drop back into the 20th century and visit them any time he likes).

I call the Amy Pond era a cycle because — and you probably guessed this — it’s circular, ending with an explanation of why Amy was waiting for him in the first place, even as a little girl, and closing on a shot of her young face. The most significant things revealed about the Doctor during the Pond Cycle are that he can’t stand to be alone and he can barely stand to sit still. He can’t settle down to a normal family life, as he tries to do at one point with the Pond-Williamses. He has to be going somewhere constantly, almost as though he’s running from something, and what he’s running from seems to be loneliness, maybe the loneliness of being the last of the Time Lords, or maybe guilt at having been complicit in the destruction of his homeworld, Gallifrey. (Note: Gallifrey is apparently revealed in an upcoming episode to still exist in an alternate universe. How that will affect the Doctor I don’t know, but I suspect it will be up to Peter Capaldi’s Doctor to show us, and from the couple of Capaldi episodes I’ve seen, he’ll be different in a number of ways.)

The Companions, then, are a hedge against loneliness. But why are they almost always attractive young women? I haven’t watched many of the Clara Oswald episodes yet, but when he meets her (for the second time, as he finally realizes) in “The Snowmen” and asks her to sail away with him on the TARDIS, she asks, “Why me?” And he responds (I’m quoting from memory here), “I never know why. I only know who.”

Which is about the best description of romantic love I’ve ever heard.


Meanwhile, Moffat has been running a second show based on an even more iconic character, Sherlock Holmes. He’s created (along with Mark Gatiss) what I regard as the best television version of Holmes to date and he’s done it by being faithful to Sir Arthur Conan Doyle in the only way that matters: in spirit. His Holmes has been lifted out of the late Victorian era and plopped down into our own as if by TARDIS, and the result is that the old stories, or new stories based on the old themes, seem fresh again, as Conan Doyle’s stories must have seemed back when the pages of The Strand hadn’t turned yellow and crumbly yet. It doesn’t hurt that Benedict Cumberbatch’s over-the-top yet tightly controlled performance catches Holmes’ arrogant self-confidence with such convincing bravado that you feel like you’re discovering the character for the first time, even if you finished reading the originals by the time you were out of middle school and read the Solar Pons stories afterward as literary methadone.

Sherlock Holmes and Doctor Watson

Sherlock Holmes and Doctor Watson: Together again for the first time.

Moffat hasn’t given Sherlock the kind of overarching thematic structure that he magically imposed on Davies’ existing structure for Doctor Who. Instead it’s a show of moments, some large, some small, a great many of them brilliant. And getting better: My favorite moments are from the two most recent episodes, “The Sign of Three” and “His Last Vow,” specifically the drunken bachelor party (What could be better than being inside the head of a drunken Sherlock Holmes?), Holmes’ rambling toast at Watson’s wedding that went from insulting to moving (and I seem to recall that he solved a crime in there someplace too), and his slo-mo near-death sequence after being shot in the chest by an unlikely assassin (and if anything can be better than being in the head of a drunken Sherlock Holmes, it’s being in the head of a dying Sherlock Holmes trying desperately to deduce how NOT to be a dying Sherlock Holmes).

Sherlock could be accused of being everything from maudlin to too-clever-by-half, but that it can be these things and more in such an original and spectacular way is what makes it such transcendentally good TV. (No, I’m not sure what “transcendentally” means either, but I’m not using it to talk about meditation.) The visual innovations are particularly impressive. Some of them remind me of the screen tricks that the CSI shows have been pulling since the early 2000s, but when the first episode began with comical text messages appearing above the cell phones of a room full of reporters at a press conference, I knew that I’d stay with the show for its visual bravura alone and immediately called Amy in to watch with me. (The first season had just hit Netflix.)

It isn’t just the brilliance of Moffat and Gatiss that makes Sherlock so good. It’s seeing Benedict Cumberbatch and Martin Freeman in what will probably be the best roles they’ll ever be offered and, for Cumberbatch at least, the one that will lead off his obituary several decades from now. Cumberbatch’s talents are so distinctive that he’s a difficult actor to cast, but in the right role he’s the Olivier of our age. (I suppose that’s self-contradictory, in that what made Olivier stand out was his extraordinary range; he was never typed by a single part, not even Heathcliff or his immortal Hamlet. Cumberbatch has range too, but he’s so much more interesting at this end of it that I don’t even enjoy watching him at the other.)

If Cumberbatch goes down as the great British actor of our era, I think Moffat will go down as the great British showrunner of our era — and maybe just the greatest showrunner period. I hope he turns down requests to go Hollywood, except perhaps to develop something for HBO, because there’s something quintessentially English in his style. But if he has the showrunner’s equivalent of Olivier’s range, maybe he can do something quintessentially American too. I’d just rather he not come up against the Hollywood executives who have made such an uneven hash out of the career of his American equivalent, Joss Whedon (and don’t even get me started on J.J. Abrams). I want Moffat to remain forever an original — and as brilliant as he is now.

Why “The Watchers on the Wall” Was the Best Episode of Game of Thrones Yet

Go read or at least glance at this article from Wired before you read what follows. Don’t worry. I’ll still be here when you get back.

Now, I wish somebody would give me a convincing explanation, as that article tried to do and failed, of why last Sunday’s penultimate episode of Game of Thrones’ fourth season is inferior to the penultimate episode of Season Two, “Blackwater.” Yes, this seems to be the common wisdom (and became the common wisdom about three nanoseconds after the episode ended), but I don’t get it.


The Battle of Blackwater

One really cool special effect!

“Blackwater” had two things going for it, Peter Dinklage and a really cool wildfire effect, but otherwise it was all about saving the despicable Lannister family from the slightly less despicable (but still despicable) Stannis Baratheon. Not much to root for there, except Tyrion, the only non-despicable Lannister, and maybe Sansa Stark, who had the misfortune to get caught up in all this. Tyrion used the wit and strategic cleverness that nobody in his family gives him credit for to win the battle that nobody else in King’s Landing had the guts to fight, and thus both he and Sansa were saved. Yay. And, yes, it was tragic that Tyrion’s father came in at the last minute to steal all the credit for it from him. But otherwise, am I supposed to have cared which side won? The best thing I can say for the Lannisters is that they’ve got Tyrion. The worst thing I can say about Stannis is that he has all the charisma of slime mold. It’s hard to root for the lesser of two evils, especially when you’re not even sure which evil that is.



Winter: Still coming!

“The Watchers on the Wall,” on the other hand, was about the show’s central theme: Winter is coming. And with it are coming things that will destroy the Seven Kingdoms of Westeros, a place where people are too busy squabbling over the throne to care that giants with mammoths are getting ready to eat their children. (That was not based on any spoilerish foreknowledge from the books but on a reasonable guess about what the stakes are.) This was about people fighting over something that, in the world of the show, genuinely matters: keeping the horrors that are north of the Wall from getting south of the Wall. This was about the few, the not-so-happy few, the band of brothers who are the only ones in Westeros who understand how important what they’re doing is and are willing to die for it despite the fact that nobody else gives a flying f*** about them or will ever think themselves accursed because they weren’t there. This was about stakes that were a lot higher than whether a Baratheon or a Lannister is sitting on the throne. This was about saving everybody in the Seven Kingdoms that viewers care about right along with the ones they don’t. This was about stakes that actually made me care who won. This was about stakes that mattered.

But according to the article I linked to above, this episode had less “heft” than “Blackwater” because “Blackwater” had Peter Dinklage as Tyrion in it. As much as I love both Dinklage and the character he plays, that’s not enough. Apparently nobody besides me feels it was sufficient that “The Watchers on the Wall” had Jon Snow, who in case nobody was paying attention has probably just become the most important character in Westeros.

At least until Daenerys and her dragons show up.

Captain America, S.H.I.E.L.D. and the Age of Multidimensional Media

It wasn’t until I saw Captain America: The Winter Soldier and the last six episodes of the first season of Marvel’s Agents of S.H.I.E.L.D. that I realized just how radical an experiment Marvel Studios is performing with their Marvel Cinematic Universe (MCU) movies and TV shows.

Captain America: The Winter Soldier

S.H.I.E.L.D. goes down in flames.

I’m a huge fan of serial TV shows. The broadcast networks have traditionally objected to them because they don’t rerun well and are hard for viewers to catch up with if they haven’t been watching from the beginning, but it’s gotten to the point where, if a show doesn’t have a serious serial continuity, I don’t have any interest in watching it. It turns out that the formula developed many decades ago on radio for soap operas is, in fact, ideal for showcasing what makes television in many ways superior to movies — i.e., the long-term ability to develop characters, relationships and situations such that the whole of a television series becomes greater than any of its individual episodes. But what Marvel Studios is doing with the MCU is even better than serial television. They’ve taken the concept of serial content in a series — of movies, of TV shows — and made it three- or even four-dimensional. They’re effectively doing something that I’ve only seen done before in one medium: comic books.

Let me back up for a moment. Marvel Studios is the Hollywood wing of Marvel Entertainment Group, which also publishes the Marvel line of comics. That’s the line where, back in the early 1960s, writer/editor Stan Lee and a few artists, primarily Jack Kirby and Steve Ditko, created what have become some of the most popular superheroes ever to don spandex unitards. The difference is that, in the 60s, their popularity was isolated to comic books and a few animated television shows. Today their popularity has expanded to movies and live-action television (though one character, the Incredible Hulk, achieved live-action TV success as far back as the late 1970s).

Marvel Studios was initially created in 1996 as a clearing house for licensing movie and TV rights to those heroes and, though it did a remarkably good job of attracting buyers, those buyers did an even better job of making money from Marvel-owned properties. Sony parlayed the Amazing Spider-Man into an ongoing series of summer blockbusters and Twentieth Century Fox has created what is, if anything, an even more popular series of movies out of the X-Men and their most popular solo member, Wolverine. (The Hulk, who was initially licensed by Universal, has had a somewhat more checkered cinematic history, and The Fantastic Four, while they turned a profit for Fox, generally proved to be a critical embarrassment in movie form. Fox is scheduled to reboot that series in summer 2015.)

In 2004, Marvel Studios realized that if other companies were making this much money off their characters, they could make even more money, or at least keep a larger percentage of the profits, if they made the movies themselves. They would also have more control over what was done with their characters and concepts. Over the next few years they quietly reacquired the rights to superheroes who either hadn’t done well for other studios (the Hulk) or had never even been given their own films (Iron Man). In 2008 Marvel Studios surprised everyone, or at least critics, by releasing a remarkably good film based on the latter character, who had mostly been a second-string superhero in the comic book world, starring Robert Downey, Jr., as alcoholic billionaire and arms merchant Tony Stark, who escapes from Afghan terrorists and a potentially heart-stopping load of shrapnel in his chest by building a supersuit that not only keeps his heart beating but lets him slug bad guys like the Hulk and fly through the air like Superman.

The real surprise, though, comes at the end of the film, mostly after the credits, when Stark is recruited by Clark Gregg’s Agent Phil Coulson and then Samuel L. Jackson’s Nick Fury to become part of the Avengers Initiative, a superhero collective being assembled (a pun that old Avengers fans will get) by Marvel’s superspy organization S.H.I.E.L.D. The same basic coda was appended, in one way or another, to the next three films in what Marvel Studios was now calling the MCU: The Incredible Hulk (2008), Thor (2011) and Captain America: The First Avenger (2011). (I’ve skipped Iron Man 2 (2010), a film for which this now predictable coda would have been redundant.) While each of these movies was basically standalone or the launching point for a series, it was becoming clear that they were also part of a larger whole. This whole, which eventually became known as Phase One, culminated in Marvel’s The Avengers, the highest-grossing movie of 2012 and the point at which it became clearest that all of these films were taking place in a shared universe, something that had only been hinted at up until then. This shared universe concept is common in superhero comics and has resulted in continuities so tangled that you pretty much need Wikipedia to sort them out, but it has only occasionally been used in films, so occasionally that I’m having trouble thinking of examples. (It’s more common in television, where character crossovers between shows and spinoffs from hit shows were almost a requirement in the 70s and 80s and still occasionally occur, with the interconnections between the Law and Order and CSI shows in the late 2000s probably being the most recent examples, unless the NCIS shows are doing something similar.)

Marvel’s The Avengers took elements and characters, some of them quite minor, from all of the previous films and threw them together into one big superhero soup. Marvel had been doing this in the Avengers comic books since 1963 and comic books in general had been doing this at least since DC Comics launched the Justice Society of America in All-Star Comics #3 back in 1940. Having such a series-jumping chronology in the movies was remarkable but it didn’t become extraordinary until it made the leap to television in the fall of 2013 with Marvel’s Agents of S.H.I.E.L.D., a direct spin-off from Marvel’s The Avengers.

I’ve talked before about how I had great hopes for Agents of S.H.I.E.L.D. and also about my frustration that it was taking its sweet time about realizing them. The reason why it was taking so long finally became apparent with the 17th episode, “Turn Turn Turn”: The show’s writers had been waiting for the second Captain America movie, Captain America: The Winter Soldier, to come out.

Just as Captain America: The First Avenger had been, quite unexpectedly, the best movie of Phase One, Winter Soldier was the best movie so far of Phase Two and possibly the best MCU movie yet, better even than Marvel’s The Avengers. (To be fair, Joss Whedon was handed a nearly impossible task in writing and directing The Avengers. He had to balance at least half a dozen major characters, four of whom had film series of their own — or maybe three, the underperforming Hulk having apparently been phased out after Phase One — and all of whom had to be given roughly equal screen time and importance to the plot. Not surprisingly, the standout was Tom Hiddleston’s Loki, borrowed from the Thor films, who chewed the scenery with charmingly vengeful gusto as the movie’s villain. More surprisingly, the other standout was Scarlett Johansson’s Black Widow, who I’m pretty sure first hit the screen in Iron Man 2, with her clever backhanded method of interrogating villains by making them think they’re interrogating her.)

Winter Soldier ends with — stop here if you’re one of the few MCU fans on earth who still don’t know what happens — the near total disintegration of S.H.I.E.L.D., which turns out to have been riddled since World War II with sleeper agents from their sworn enemies, the Nazi carryover organization Hydra. The movie ends with Captain America more or less triumphant but S.H.I.E.L.D. in shambles and Samuel L. Jackson’s Nick Fury erroneously believed to be dead. And that’s where it impacted the TV show. Agents of S.H.I.E.L.D. in the absence of S.H.I.E.L.D. had become a program without a premise and that suited it beautifully. After floundering all season in search of a theme, it had finally found one: a team of agents without an agency trying to defeat the enemy that had stolen it out from under them.

Turn Turn Turn

Things fall apart and S.H.I.E.L.D. becomes very centered.

For the final seven episodes of the season, S.H.I.E.L.D. was the best thing on television — yes, even better than Game of Thrones, which is straining admirably not to start plodding toward its climax the way George R.R. Martin’s books are doing. Agent Coulson’s team developed personality along with purpose. They fought against one another — Agent Ward turned out to be one of the sleeper agents — as well as against other agencies and ended up as a team of self-described vigilantes. The final episode resolves all this a bit too neatly, or at least too quickly, but it leaves some interesting plot threads dangling and the hint that at least one of those threads is going to generate the premise for the second Avengers film, which will terminate Phase Two in 2015.

It’s not just the way the MCU continuity has jumped back and forth between movies but the way (and the speed) with which it has jumped between movies and TV (and apparently back again) that makes it revolutionary. (There was only a four-day lag between the opening of Winter Soldier and the introduction of its aftereffects into the show.) It would still be possible for a newcomer to jump into the multidimensional network of the MCU without being completely confused, but that window is rapidly closing and I would expect that, by some point in Phase Three, figuring out not only the plot but the interconnections between films, characters and TV shows (with yet another MCU television series, Agent Carter, debuting during S.H.I.E.L.D.’s midseason hiatus in the 2014-2015 season) might become nearly impossible for a newbie.

This is clearly a studio executive’s nightmare and precisely the reason that broadcast television has fought — in vain, fortunately — against serial TV shows. If the audience doesn’t buy in early, it becomes extremely difficult to buy in late. But the way in which we watch television and movies is changing. We don’t necessarily catch TV shows while they’re on the air, the way we used to in the long-ago 20th century. We DVR them or buy the DVD sets or we get them On Demand or we binge watch them off Netflix or Amazon Prime Streaming. If we’re really desperate we resort to certain Internet back channels, which I’ll leave unnamed, to get our hands on content. The producers of Breaking Bad credited Netflix (and probably some of those back channels) with the show’s abrupt surge of viewership in its two-part final season, with viewers who had finally gotten word about how good the show was rapidly catching up through all-day streaming sessions.

My friend Sean Tucker thinks Marvel Studios is using the MCU to position themselves for a brand new media world and I think he’s right. Now that widescreen TVs with Internet connections have come to dominate the living room, the age of genuine on-demand viewing, which we’ve been promised since at least the 1980s, has arrived at last and I for one wouldn’t mind seeing the cable-TV companies die out altogether. (Unfortunately, they also own much of the Internet infrastructure and until that de facto monopoly is taken away, the true age of multidimensional media is going to be postponed — but I doubt for very long.)

Very soon now, we’ll be watching television and movies in the way people have long read comic books — picking up back issues and reading new ones in whatever order necessary to follow tangled continuities or just indulge sudden whims. To some extent, we’re already there — Amy is downstairs now binge-watching the entire seven years of West Wing on Netflix, something I did a few years ago myself — and I think we’ll need the original thinking of companies like Marvel Studios, which is taking continuity concepts from comic books and repurposing them for higher-budget visual media, to provide content that fits the new way we view what soap opera fans have long referred to as “our stories.” The multidimensional interconnections provided by the MCU may be the perfect model for a world in which TV and movies are only distinguishable by the size of the screens we watch them on — and much of the time not even by that.

I, for one, am thrilled to see the new era arrive. I just wish it hadn’t taken so long.


Video Games as Story, Video Games as Life: Part Three

So where were we? In the first two parts of this series on video games as virtual reality, I talked about how in 1981 I’d had a vision — not a full-blown clairvoyant vision with my eyeballs rolling backward in my head but more of a daydream about the future — of a time when computer game programmers would create games so realistic that playing them would be like entering an actual world and interacting with the objects and the people there. I talked about how various games, like the SubLOGIC/Microsoft Flight Simulator and Doom, had upped the ante on realism in computer graphics to the point where, today, we have games in which the world you see on the computer screen is almost indistinguishable from actual video. It’s as though game designers can invent an entire planet, populate it with intelligent beings, wait a few million years for those beings to build infrastructure and then let the player carry a camera to that planet and snap pictures of it. Take a look at this screenshot from developer Crystal Dynamics’ recent reboot of the Tomb Raider series:

Tomb Raider 2013

The new, photorealistic Lara Croft

If you’ve played the game (and I strongly recommend it, even if you hated the old Tomb Raider series where Lara Croft looked like a Barbie Doll on steroids) you know that the jungle village the young Lara is entering isn’t a painting or a photo but a detailed three-dimensional model that you, the player, can explore in almost breathtaking detail. The pulleys and elevators actually work (in a virtual kind of way), the wooden surfaces are detailed down to the smallest splinter, and those mountains in the background really can be climbed, at least where they aren’t too steep. And this village is only a tiny fraction of an entire tropical island filled with lovingly designed and vividly rendered sets. (Unfortunately, while this new version of Tomb Raider is better and more playable in almost every way than the old games were, it suffers from one flaw of the original: Lara still tends to fall down and die a lot. But the game itself is so good that I found I really didn’t mind that much.)

Essentially, the graphics problem facing designers of realistic virtual reality games has been cracked. The worlds in modern games can look real, at least if the game designers want them to. (Some designers deliberately opt for nonrealistic graphics, the way painters in the 19th century began opting for impressionistic images on their canvases rather than photorealistic ones.) But to my mind the best games are built around stories, and story is a harder problem than photorealistic graphics, because it can’t be cracked using mathematical algorithms executed by extremely fast multiprocessing CPUs, at least not any algorithms on the game-design horizon at the moment. Putting a story in a game isn’t like putting a story in a novel or a movie, which writers and directors have been doing since long before any of us were born. A game story has to be interactive, capable of changing according to decisions made by the player, and that creates a paradox. A story is a series of events with a dramatic shape, building to a climax, but a story shaped by a player could easily collapse into a mass of unrelated events about as interesting as watching a traffic jam on the Los Angeles freeways. How can game designers reconcile story and interactivity without creating chaos?

In my 1981 vision, I imagined that artificial intelligence would have developed sufficiently by now that the game designer would merely have to create a world with the potential for story and each player would find their own story in that world. For instance, the game could be set in a Central American country on the verge of revolution, with a well-meaning but hot-tempered dictator whose beautiful wife is thinking about having an affair. You, the player, would be an American reporter who could choose to get to know the dictator, the leader of the revolutionaries and/or the dictator’s wife and interact with them in ways that could shift this explosive situation in an almost infinite number of directions. If these characters were full-blown artificial intelligences, the story’s outcome would be genuinely unpredictable. Even the game designer would have no way of knowing how events would proceed or what the ultimate results would be for each individual player. That’s a game I’d love to play.

Game developers could create that game today — maybe they already have — but it would be a cheat. The designer would have to create certain story paths in advance that would be affected by the decisions you make, each decision making you more likely to go down one path than another. Plenty of such games have been created, and it’s these predetermined story paths that determine what happens in the course of gameplay. Still, the designer has a certain amount of leeway in how much freedom they actually give the player and how many different predetermined paths the game can take. At one extreme, this type of design can produce an open world with so many possibilities for story that it comes very close to being the kind of game I envisioned back in 1981. At the opposite extreme, the number of paths is so limited that the player has only the illusion of free choice and is channeled down a single predetermined story line with only minor variations along the way. Let’s call the first kind of game an Open World game and the second kind a Closed Story game. A really clever designer can create a game that combines both approaches and, in fact, many designers have. We’ll call this the Open Story game.

In the rest of this post, I’m going to describe (and review) three games from the last two or three years, each of which takes one of these approaches. For the Open World game, I’m going to talk about The Elder Scrolls: Skyrim from Bethesda Game Studios. For the Closed Story game, my example will be The Walking Dead: Season One from Telltale Games. And for the Open Story game, I’ll describe Dishonored from Arkane Studios. Let’s start with the first:
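The predetermined-path idea is easy to sketch in code: the designer authors a small graph of scenes in advance, and the player’s choices merely select which authored path unfolds. This is a hypothetical toy illustration (the scene names reuse the revolution scenario above), not code from any real game engine:

```python
# A minimal sketch of "predetermined story paths": every scene and every
# branch is authored in advance; the player's choices only pick a route
# through the designer's graph. (Hypothetical example, not a real game.)

STORY = {
    "palace":    {"text": "You arrive at the dictator's palace.",
                  "choices": {"interview": "dictator", "sneak_out": "rebels"}},
    "dictator":  {"text": "The dictator rants about loyalty.",
                  "choices": {"press_him": "coup", "flatter": "status_quo"}},
    "rebels":    {"text": "The rebel leader offers you a scoop.",
                  "choices": {"publish": "revolution", "refuse": "status_quo"}},
    "coup":      {"text": "Your questions spark a palace coup.", "choices": {}},
    "revolution": {"text": "The revolution succeeds.", "choices": {}},
    "status_quo": {"text": "Nothing changes.", "choices": {}},
}

def play(choices, start="palace"):
    """Walk the authored graph and return the scenes the player visits."""
    node, path = start, [start]
    for choice in choices:
        nxt = STORY[node]["choices"].get(choice)
        if nxt is None:  # no such branch authored: the path simply stops
            break
        node = nxt
        path.append(node)
    return path
```

Here `play(["interview", "press_him"])` visits the palace, the dictator and the coup: three scenes out of a graph written entirely in advance. An Open World game multiplies the nodes and branches by orders of magnitude; a Closed Story game funnels every choice toward the same few endings.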

The Elder Scrolls: Skyrim

Back in the first installment of this post, I talked about how I’d seen the first Elder Scrolls game, Arena, in development in 1993 at Bethesda Softworks back when I lived around the corner from their offices and had my mind genuinely, if not literally, blown by what I saw. My mind was even further blown — honestly, I don’t have much of it left — when I saw the game itself. It created an entire continent, Tamriel, and let the player travel across it to complete a lengthy quest. Along the way you could pursue hundreds, perhaps thousands, of side quests and visit hundreds of cities and dungeons, some of which would appear on the game map from the beginning, allowing you to fast travel to them (saving a lot of walking time), and some of which you just had to discover by talking to people or wandering around the countryside, at which point they would appear on your game map.

I had always loved the exploration element of computer role-playing games, ever since I played Ultima III on my Commodore 64 back in the mid 1980s, and Arena had the hugest world I’d ever explored. The graphics were low-resolution by modern standards and most of the dungeons were algorithmically generated, so that after a while they started to look alike, but I was stunned by the sheer size of Arena’s world. I still am when I go back and play it using the DOSBox emulation program.

With each subsequent Elder Scrolls game the world has grown smaller — most of the later games have restricted themselves to a single province of Tamriel — but it’s also become more detailed. Skyrim restricts the action largely to the eponymous province, but dear God is it detailed! And it feels huge, as big as the whole continent of Tamriel did in Arena. It contains a multitude of environments, from forests to mountains to arctic wastes, and you can’t go far without stumbling on ancient ruins, tiny villages or someone just spoiling to lock weapons with your character.

There’s no plot forced on you in Skyrim, but the opening sequence, where you nearly get your head sliced off on a chopping block, hints that there are larger conflicts involved and it isn’t long before you find out what they are. You can choose to engage in these conflicts or not, and if you choose to do so you can either pick sides or never get around to picking sides. Mostly Skyrim provides you with a world to live in and explore. (You can buy your own house in several of the game’s many cities and the game DLC — downloadable content — even supplies a feature that lets you build your own log cabin from materials you forage in the wilderness.)


Skyrim: Not so much a game as a world to live in

Skyrim comes about as close as any game I know to that game I envisioned more than three decades ago. It still doesn’t feature true artificial intelligence — the game’s many quests and character interactions are still pre-scripted — but the number of ways in which you can combine the game’s dozens of quests and personal choices, along with the sheer exploration element and the game’s smooth, unforced system of gradual character building, makes it feel like an RPG version of the old SubLOGIC Flight Simulator. You really do feel that your game experience in the Skyrim world is unique and that while thousands if not millions of players have probably fought the same battles that you have, the way in which you combine the events that make up your character’s life, the total experience that you have in the game, may be different from anyone else’s. There’s a story in Skyrim — actually, there are multiple stories — but after hundreds of hours of play I’ve yet to figure out what the end of it is or even if it has an end. Of the many characters I’ve created, the one that I’ve worked up to level 45 on my Xbox has come up against a dragon encounter that has an awfully climactic feel to it, but he wasn’t capable enough to handle it, so I reset to the last saved game and sent him off to deal with some of the game’s other plot elements, like the battle for secession being fought between the Skyrim rebels and the occupying army sent by the Empire to put them down. My various characters have chosen to fight with the rebels and with the Empire; both factions have strong arguments in their favor and notable moral flaws. Sometimes I choose not to get involved with them at all and just busy myself with a little dungeon diving, returning magical artifacts to the NPCs — non-player characters — who’ve asked me to retrieve them.
Last night I looked at the number of quests I’ve completed in the Xbox version and was startled at how long the list has become and yet there are whole towns full of people I’m afraid to talk to lest they add dozens of new quests to the list of ones I’ve yet to complete. The very fact that Skyrim has this degree of complexity and no obvious goal for the player to work toward other than the goal the player selects for him or herself is one of the many things that make it feel real. It has a story, but in many ways the story is what you make it. The game designers never force you to do anything, except defend your life when attacked by a hungry wolf or marauding bandits. Rather, the story of the game is what you choose to make out of what may well be hundreds if not thousands of possible quests and decisions, and if you’re not interested in story you can simply explore the world and look at the scenery, playing it as a sandbox game. In Skyrim, you come about as close to shaping your own experience as in any game I’ve ever seen.

The Walking Dead: Season One

It’s hard to imagine something as different from Skyrim in its approach to story as the old adventure games were, though occasionally they could come close. I talked in the first installment of this post about how the early Infocom game Deadline created a surprisingly vivid, interactive world using only words and made me feel like I was actually living in that world, or at least visiting it, for the duration of the game’s 12-hour story. Honestly, though there were more technically impressive adventure games in later years, especially when graphics were added to the equation, I don’t think I ever played another adventure game that had the combination of story and personal freedom that Deadline did. That may well be because game players were put off by the sheer freedom that Deadline allowed. They didn’t know what to do with it. They were used to games that pushed them down a single, linear path where the player’s next move was always obvious, leaving little room for nonlinear exploration. I remember people complaining about having to talk to suspects in Deadline because they had no idea what to ask. I’ve heard many of the same complaints about the Elder Scrolls games. People find themselves plunked down in the middle of a vast and complex world and have no idea what to do. Funny, I’ve never had any trouble figuring out what to do in these games. I do the same thing I’d do if I found myself plunked down in a strange city on a Saturday night. I walk around looking for fun and excitement.

But after Deadline, most adventure games became fairly linear, at least by comparison with Deadline or Skyrim, and the player was usually given a clear-cut goal plus a series of problems that had to be solved in order to reach that goal. This could be fun if cleverly done (my favorite later adventure games were the ones from LucasArts, the Monkey Island games in particular), but after a while it got boring. By the late 1990s adventure games were largely dead, at least in the United States. European adventure games would occasionally reach these shores and threaten to revive the genre, but none really succeeded. A few persistent adventure game designers in the U.S., like Jane Jensen, continued to design games for indie publishers, but faced an uphill struggle to get those games to a wide audience.

Then, around 2007, a company called Telltale Games, staffed in part by former employees of LucasArts, began to reignite interest in the genre. They began by taking some of the best of the old LucasArts franchises, like Monkey Island and Sam & Max, and producing new, shorter installments of them, treating them like episodes of a television show and publishing them in “seasons” consisting of five or six games apiece, often involving a continuing story line. The games were sharply produced, with excellent graphics and clever puzzles, drawing old adventure game fans like me back out of the woodwork and creating new fans largely through word of mouth. Pretty soon Telltale had become the new LucasArts and was developing games around franchises other than old adventure games, like the Back to the Future films and Jurassic Park.

Their most successful franchise to date, one that has garnered Telltale a raft of awards and apparently a substantial number of sales, is one based on the AMC-TV series The Walking Dead, probably the most popular original program currently running on basic cable. (Amy and I just finished watching the fourth season and are huge fans.) Games based on other media, especially movies, have a shaky reputation, because they tend to be, well, bad. There are any number of reasons why, including the need to produce them quickly so they’ll come out with the movie, the expense of acquiring the franchise rights (which siphons money out of the game-design budget), and the differences between movies and games. Movies and TV shows have fixed stories, while games are interactive and the player should be allowed to change the story. But how do you allow a player to change a story that the player already knows the end of, having seen it in the theater?

I don’t know what it cost Telltale to acquire the rights to The Walking Dead, but they’ve come up with a clever way of dealing with story. Instead of following the characters and plot points of the TV show, they’ve created an entirely new group of characters experiencing the same zombie apocalypse that the show’s characters are dealing with. This allows them to tell a completely different story, one where you don’t already know who will live and who will die, or where events will lead over the course of multiple “episodes.” As the player, you take the part of Lee Everett, a university professor who has been convicted of murdering his wife’s lover. For Lee, the apocalypse is both a tragedy and a blessing. As the game starts, Lee is being transported to prison by a police officer, whose squad car crashes when a “walker” — the name the series uses for zombies — steps in front of it. The officer dies in the crash and Lee escapes, only to encounter a young girl named Clementine whose parents have gone to Savannah, Georgia, and haven’t returned. The remainder of the story is about Lee’s attempt to reunite Clementine with her parents, an adventure during which the pair fall in with a group of apocalypse survivors, each searching for their own salvation from the end of the world as they know it.

The Walking Dead Season One

The cartoonishly “realistic” graphics of The Walking Dead Season One

The graphics in The Walking Dead, while more or less three-dimensional, are by no means photorealistic. In fact, they look faintly cartoonish, like rotoscoped images of real human actors given a comic twist. The story gives the illusion of interactivity and your choices in conversations and certain actions have an effect on the course of the game’s plot, but the effect is minimal. For instance, in an early scene from the game, if you lie to a farmer about how you wound up injuring yourself in the police car crash, he detects the lie and eventually expels you from his farm, forcing you to join others on a trip to Macon, Georgia. But if you tell him the truth, you still eventually end up on the trip to Macon, just for other reasons. No matter what you do, you’ll end up going to Macon, because that’s where the next section of the story takes place. (Hint: Never lie to anyone in The Walking Dead game. They always pick up on it and you inevitably do better, though only a little better, if you tell the truth.)
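If you’re curious what that kind of branch-and-merge design looks like under the hood, here’s a toy sketch in Python. The node and choice names are my own invention, not Telltale’s actual data model, but the shape of the graph is the point: both branches funnel back into the same plot node.

```python
# Toy branch-and-merge story graph illustrating the "illusion of choice."
# Node and choice names are invented for illustration; this is not
# Telltale's actual engine or data.
STORY = {
    "farm": {
        "lie to the farmer": "expelled",   # the lie is detected
        "tell the truth": "welcomed",
    },
    "expelled": {"continue": "macon"},     # both branches converge...
    "welcomed": {"continue": "macon"},     # ...on the same plot point
    "macon": {},                           # the next act of the story
}

def play(node, choices):
    """Follow a list of choices through the graph; return the nodes visited."""
    path = [node]
    for choice in choices:
        node = STORY[node][choice]
        path.append(node)
    return path

# Either opening choice ends up in Macon:
print(play("farm", ["lie to the farmer", "continue"]))  # ends at 'macon'
print(play("farm", ["tell the truth", "continue"]))     # ends at 'macon'
```

The middle nodes differ, so the player remembers making a meaningful choice, but the writers only ever have to script one next chapter.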

I’ve played through the entire first season of the game, albeit without that level of experimentation in most of it, but I suspect the level of interactivity remains about the same. Assuming you don’t do anything that gets you killed (which is quite easy to do in some scenes), you have some minor control over the details of the plot, but the broad story will continue on the same path no matter what you do. The designers of the game have done a brilliant job of making you feel as though you have freedom of choice in the game, but it’s only an illusion. You’re a puppet on a string, but the story is told well enough that it’s a string that’s fun to be on.

The Walking Dead: Season One is the antithesis of that virtual reality game I visualized back in 1981. The graphics are only semi-realistic, there’s very little freedom to branch out and explore, and the story is always going to come out the same. Yet they somehow make it fun and the story they tell is both exciting and moving — much like the TV show, if not quite as well written. I recommend it as a good story decently told, but if an open world is what you’re looking for, The Walking Dead: Season One is not the place to find it.


Dishonored

Dishonored was a surprise. I’d heard that it was a good game, but never imagined how deeply I’d get caught up in it or how its imaginary world would affect me emotionally. It’s a linear game, in the sense that it’s mission-based, much like the Call of Duty games, but in the Call of Duty games you feel like you’ve got a drill sergeant at your back the entire time, telling you where to run and where to shoot. In Dishonored you don’t have any choice as to what missions you go on, but once you’re on a mission your freedom is almost total, and the number of options open to you in the environments where the missions take place, and in the activities you can perform within those environments, is surprisingly large. Dishonored is one of the most replayable games I’ve ever encountered, maybe not quite as much so as Skyrim but more so than any other basically linear, mission-based game I’ve yet seen. Arkane Studios, a sister studio of Bethesda Softworks under their shared parent company ZeniMax Media, has absolutely outdone itself with this game in terms of both storytelling and player freedom. I can’t recommend it highly enough.


The stylized but not really cartoonish graphics of Dishonored

The setting is a steampunk version of some late 19th, early 20th century European/American society. You play the Empress’s personal bodyguard, who in the opening sequences is framed for her murder by the very people who have engineered it in an attempt at a palace coup. These blackguards have also kidnapped the Empress’s daughter, the rightful heir to the throne and (if rumor is to be believed) your own daughter as well. You are given a kangaroo trial, thrown in a maximum security prison and assigned an almost immediate execution date, but you escape with the help of a rebel underground that knows you’re innocent and wants you to rescue the young Empress so she can take the throne in place of the pretender who is overseeing the government.

That’s the set-up, and each mission is engineered to bring you closer to this goal. Near the beginning you are given some cleverly conceived tools to help you in your missions by a godlike young man known only as the Outsider, whom you encounter in a dream. You soon learn methods of acquiring more tools, the use of which is one of the most thrilling parts of the game. The most amazing tool is the Heart, which is apparently the heart of a deceased young woman who doesn’t quite realize yet that she’s dead but who can whisper information to you about your surroundings when you hold her heart in your hand and activate it. This is a chillingly ingenious touch, not only useful but oddly moving. You also acquire a teleport power that allows you to jump from the street to ledges and from rooftop to rooftop, so you can move stealthily through the cities where the missions are set.

And stealth is crucial to this game. You can kill your enemies (or just the people who get in your way) using guns and bladed weapons, but the more you do so, the more chaotic the society becomes, and the more the plague that’s running through this society increases, because the dead bodies you leave behind attract rats, which spread the plague further. Simply put, the more dead bodies you leave in one mission, the more difficult the next mission becomes, until you find yourself fighting a lawless society and swarms of rats that can strip the flesh off your body in seconds if you don’t kill them first. You are much better off sneaking up on people and choking them into temporary unconsciousness than you are killing them, never mind that these “people” are just collections of pixels acting according to flowcharted scripts.
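Here’s a toy Python sketch of how a system like that might be wired up. The weights and thresholds are numbers I made up for illustration, not Arkane’s actual tuning, but they capture the feedback loop the game describes: corpses now mean rats and guards later.

```python
# Hypothetical "chaos" meter in the spirit of Dishonored's system.
# Lethal play raises chaos, which scales the difficulty of the next
# mission. All weights and thresholds are invented for illustration.
def mission_chaos(kills, nonlethal_takedowns, detections):
    """Score one mission. Nonlethal takedowns add nothing; kills dominate."""
    return 3 * kills + 0 * nonlethal_takedowns + 1 * detections

def next_mission_state(chaos):
    """More corpses left behind -> more rats and guards in the next mission."""
    if chaos >= 20:
        return {"rat_swarms": 6, "guards": 14, "ending": "dark"}
    elif chaos >= 8:
        return {"rat_swarms": 3, "guards": 10, "ending": "mixed"}
    return {"rat_swarms": 1, "guards": 8, "ending": "hopeful"}

# Playing it as a shooter versus playing it stealthily:
print(next_mission_state(mission_chaos(kills=9, nonlethal_takedowns=0, detections=4)))
print(next_mission_state(mission_chaos(kills=0, nonlethal_takedowns=9, detections=1)))
```

The design point is that the penalty is diegetic: the player is never shown a morality score, just a world that gets uglier the more bodies pile up.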

For me, the most stunning part of the game was the climax. I had successfully achieved my goal, I thought, but I had played the game as though it were a shooter, killing whenever possible because it’s easier to play that way. But there are consequences to this kind of play, and they caught me off guard when, at the start of the final mission, a character I’d considered a friend performed an act that startled the hell out of me. And what should have been a happy ending had a dark pall cast over it by the effects my actions had on someone I loved. The final sequence is hallucinogenic in its intensity, as the Outsider takes you on an animated 3D tour of the acts you’ve performed throughout the game and you realize how insane some of your actions appear when viewed from outside time and space. Honestly, it took my breath away, and I immediately started playing the game again, concentrating on killing as few people as I possibly could. There have been games in the past that have tried to enforce morality and consequences on the player, but I don’t think I’ve ever seen it done as subtly and effectively as here.

Yes, Dishonored is linear in the sequence of its missions and offers nothing close to the vast, nonlinear landscape of actions and missions found in Skyrim, but the freedom allowed the player within the individual missions is so vast, and the use of its tools so varied and imaginative, that you feel you lived through this story to a much greater degree than you do the story of The Walking Dead. As in that game, the graphics are stylized rather than hyperrealistic, as they are in Skyrim, but the game’s Goya-esque character models and the huge steampunk machines — the giant, whale-oil-fueled mecha suits that city guards wear in some of the towns are both amazing and terrifying the first time you see them storming rebels in the city streets — make the game such a unique experience that I’d be hard put to say whether I prefer it or the more open Skyrim, which comes closer to my long-ago dream of virtual reality. The Walking Dead, though its story pulls nicely on the player’s heartstrings, comes in a distant third as a piece of interactive storytelling because it’s ultimately so lacking in true interactivity.

Have we achieved the ultimate in virtual reality storytelling? Will we someday see my game about the Central American nation on the verge of revolution? In some ways, maybe we already have. In others…I suspect game designers still have some surprises up their sleeves in terms of realism and interactive storytelling but I can no longer guess what they might be.

I can’t wait until I see them, though. And I hope I live to see a lot of them.

Welcome to Level Seven: S.H.I.E.L.D. at PaleyFest

I’ve already blogged more than once about the ABC series Marvel’s Agents of S.H.I.E.L.D., not because I think it’s a great show but because I think it could be a great show, and because it’s getting better with almost every episode. Shows produced by Joss Whedon, even ones like S.H.I.E.L.D. where he isn’t involved in production on a day-to-day basis, tend to start out slowly and hit their stride at the end of the first season or even the beginning of the second. Honestly, there are times when I wish I could look into the future and see if S.H.I.E.L.D. is really going to live up to the promise I think it has, because I may just be wasting 60 minutes of my life each week by watching it. But I don’t think I am.

And, as if by magic, I got a peek into the future last weekend and saw next week’s episode of S.H.I.E.L.D. I can report that it’s getting even better, but it’s still not quite as good as I’d like it to be.

Felicia Day

Geek goddess Felicia Day hosts a panel of tiny little people from the ABC show Marvel’s Agents of S.H.I.E.L.D.

One of the advantages of living in Los Angeles, as I’ve been doing for the last five years, is that you occasionally get the opportunity to look into the future of television, even if it’s only a week into the future. Last Sunday I drove to the Dolby Theatre in Hollywood (that’s where the annual Academy Awards ceremony is held) to watch part of PaleyFest, an event put on annually by the Paley Center for Media. I was there to watch a panel on S.H.I.E.L.D. and it was an impressive panel indeed, at least in terms of who showed up. Basically the entire regular cast of the show — Clark Gregg, Ming-Na Wen, Brett Dalton, Chloe Bennet, Iain De Caestecker and Elizabeth Henstridge — was there, as were the show’s three showrunners: Jed Whedon (Joss Whedon’s brother), Maurissa Tancharoen (Jed’s wife) and Jeffrey Bell, along with executive producer Jeph Loeb. The panel was hosted by geek goddess Felicia Day, best known to Joss Whedon fans like myself for her roles in Dr. Horrible’s Sing-Along Blog and several episodes of Dollhouse.

Actually, you didn’t have to live in LA to see this panel. If you knew about it in advance, you could have watched it on streaming video using the PaleyFest app — and maybe you did. But if you were actually at the Dolby Theatre you got to see something extra, something that wasn’t included in the streaming video. You got to see the April 1 episode of Agents of S.H.I.E.L.D.

Jeph Loeb asked all of us to go home and tweet or blog about the episode or at least about how much we liked it. We were advised not to give out spoilers. Yeah, right.

I suspect by now hundreds of attendees have given out spoilers in defiance of Mr. Loeb’s request, so it would be completely redundant for me to do so here. Therefore all I’ll say is that it’s one of the best episodes so far, largely because it focuses purely on the show’s serial arc and doesn’t attempt to stand alone in any way. I suppose it would be a minor **SPOILER** to mention that it centers around J. August Richards’ character Deathlok and the oft-mentioned but never-seen character known as “the Clairvoyant.”

I love serial TV shows and generally don’t care for standalone episodes of those shows, so the mere fact that this episode stuck to the serial arc was enough to make me happy. But it also advanced the serial arc significantly with a few surprises and plot twists, which is more than I can say about most episodes to date. And that’s all you’ll get out of me without inserting flaming bamboo splinters under my fingernails. For more spoilers, you’ll have to go elsewhere. Try Google. Or Twitter.

As for the panel, I have surprisingly little to say about it. It was fun seeing the cast and producer/writers in person, along with the lovely Ms. Day, whom I’ve followed on Twitter for a year or two now because she’s a great source of geeky news about Joss Whedon TV shows. In general, though, the conversation on the panel didn’t reveal anything that I didn’t already know or at least suspect, such as the fact that Clark Gregg, who plays Agent Phil Coulson on the show, is a really nice guy. Twice, he literally leaped off the stage and ran into the audience to give someone a hug, most touchingly when the huggee was a youngish female fan who appeared to have Down syndrome and was having difficulty articulating her question to the cast. I laughed when Chloe Bennet mentioned that fanfic ‘shippers — fan fiction writers who like to invent relationships between fictional characters from television and elsewhere — had created a romantic couple out of her character (Skye) and Elizabeth Henstridge’s character (Simmons) and were calling the couple “Skimmons.” (A quick check on Google revealed that, yes, there’s actually fanfic about “Skimmons.” It had slipped right past me.) Otherwise, the conversation consisted largely of cast members answering banal questions from the audience like “What superhero would you like to be?” (For the record, not all of the questions were banal. If you were there and asked a question, I can assure you that it wasn’t one of the banal ones. Hey, you’re intelligent enough to be reading my blog, right?)

Also, if you were watching the streaming video you probably saw me without realizing it. When a heavyset guy in a Captain America sweatshirt got up to ask a question, I was the gray-haired guy with glasses over his shoulder on screen right. I reached up twice to touch my glasses, not as a signal to anyone viewing the video stream or as a nervous tic, but to make sure the guy I was seeing on the giant video screen above the panel was actually me. Sure enough, the guy on the video screen reached up to touch his glasses too.

Two other pieces of important news gleaned from the panel — actually, from Jeph Loeb’s introduction to the panel — are that, starting with the April 1 episode, there will be no more interruptions in the Agents of S.H.I.E.L.D. schedule for seven straight episodes, right up through the season finale in May, and the April 8 episode of S.H.I.E.L.D. will be a direct follow-up to the movie Captain America: The Winter Soldier, which will be released to theaters on Friday, April 4. (Hey, that’s today! Better get tickets soon!)

If you saw the screening of next week’s episode or you’re reading this after it’s already appeared on TV, I’d be curious to hear what you thought of it. Feel free to leave a comment on this post.

And keep your fingers crossed that S.H.I.E.L.D. gets renewed (word has it that it will) and that it has the greatest second season of any show in the history of television. Or at least of any show executive-produced by Joss Whedon.

That would make a lot of people, including myself, very happy.

Video Games as Story, Video Games as Life: Part Two

(This is Part Two of an article about the evolution of computer games into a form of virtual reality, largely as witnessed by the fanatic computer gamer who writes this blog. If you’ve already read Part One, you might want to skim over it again, because I’ve added several paragraphs of material.)

In early 1997, a revolution occurred in computer gaming on DOS and Windows PCs.

I was lucky enough to be in a position to witness this revolution from the very beginning, something most gamers weren’t. In January of 1997, I was working on a project for a local software company and, to help me accomplish it, they loaned me a computer that was considerably better than the aging DOS/Windows machine I already owned. The loaner had a 200-MHz, first-generation Pentium processor (still more or less state of the art at the time and vastly superior to the 66-MHz 486 machine I used for word processing and gaming) and something else that was on the bleeding edge of microcomputer technology: a 3DFX Voodoo graphics board.

For much of that decade, computer equipment manufacturers had been attempting to create 3D accelerator cards for microcomputers and gaming consoles that would allow programmers — mostly game developers — to write code for three-dimensional animation that would move polygons across the video display at considerably faster rates than were possible using purely software-based 3D programming, the type that I’d written about in my book Flights of Fantasy. The first PlayStation used one of these, and its graphics were very good for the time (about 1996), if not quite revolutionary, though they were good enough to turn Lara Croft into the world’s first video game sex symbol (unless you count Zelda and Princess Peach). These graphics cards took over many of the more time-consuming tasks that the computer’s main processor had to perform in drawing 3D images, freeing up the CPU to handle other work while the graphics card did most of the drawing.

Tomb Raider 1

Lara Croft on PlayStation 1: The world’s first 3D-accelerated sex symbol

Unfortunately, nobody had been able to produce a 3D accelerator card that had all the capabilities programmers really needed for truly realistic graphics, until a previously unknown company called 3DFX introduced the Voodoo Graphics chip set. A graphics accelerator board built around this chip set could not only draw 3D polygons on a computer display at unheard-of speeds, it could do so in resolutions of at least 640×480 pixels (considered fairly high-resolution on 14″ monitors) with more than 32,000 different colors on the screen simultaneously, an astonishing number when you consider that most computer games of the period ran in 320×200 mode and only put 256 colors on the screen at one time. 3DFX also released an API — application programming interface — for the chip set called Glide that simplified the task of writing programs that used the chips’ capabilities.
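To put those numbers in perspective, here’s a quick back-of-the-envelope calculation in Python. (The bit depths are inferred from the color counts: 256 colors is 8 bits per pixel, and “more than 32,000” colors implies at least 15 bits per pixel, with 16 being the common case.)

```python
# Back-of-the-envelope framebuffer sizes for the two display modes
# described above. 256 colors = 8 bits/pixel; "more than 32,000"
# colors implies at least 15 bits/pixel (16 was the common case).
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to hold one frame at the given resolution and depth."""
    return width * height * bits_per_pixel // 8

vga = framebuffer_bytes(320, 200, 8)      # classic DOS game mode
voodoo = framebuffer_bytes(640, 480, 16)  # 3DFX-accelerated mode

print(vga)           # 64,000 bytes per frame
print(voodoo)        # 614,400 bytes per frame
print(voodoo / vga)  # roughly 9.6x the pixel data per frame
```

Pushing nearly ten times the pixel data per frame, at playable frame rates, is exactly the kind of work that dedicated 3D hardware made practical.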

The first game company to recognize the potential of the 3DFX chip set and make a game available that utilized it was, not surprisingly, Id Software, the company that had revolutionized 3D gaming and invented the first-person shooter with Wolfenstein 3D and Doom. In 1996, Id had released Quake, a game that I had played and enjoyed but that hadn’t struck me as the kind of quantum leap beyond Doom that Doom had been beyond Wolfenstein 3D. What was clever about Quake was buried deep inside its code, where the average player couldn’t see it. Id’s head programmer, John Carmack, had designed the Quake graphics engine so that it could easily take advantage of graphics hardware quite different from the cards it was initially intended to run on. Just before I received my loaner computer, Carmack released a modification to Quake that anyone could download freely from Id’s Web site. It allowed Quake to generate its 3D imagery using the 3DFX chip set, eliminating the need for much of the 3D code built into the game program. Being the first gamer on my block — and probably one of the first gamers in the state of Maryland — who had a computer at his disposal with a 3DFX Voodoo board installed, I downloaded it immediately and ran the accelerated version of Quake. The results were breathtaking.

Here’s a detail of the Quake screen running in 320×200 pixel mode with 256 colors on-screen:

Quake without 3D acceleration

Quake running in low resolution in 256 colors

And here’s a detail of the Quake screen in higher resolution with more than 65,000 colors on screen at one time:

Quake with 3D acceleration

Quake running in high resolution with more than 65,000 colors

I’ve blown up these crops to make the differences more obvious, which causes them both to look blurry, but the differences should indeed be clear. And though few gamers had 3DFX accelerator boards in January of 1997, a lot of game developers did, and when they saw what the new boards did for Quake, the revolution began. Everybody who wanted to write 3D games wanted to write them for 3DFX-based graphics boards.

To get a better idea of what happened when 3D graphics were introduced, watch this YouTube video:

Graphical Evolution of First-Person Shooters: 1992-2012

Be sure you’ve got your YouTube quality settings at 720p or higher when you start watching. The 3D-accelerator shift becomes most apparent here with Half Life in 1998 rather than with Quake 1 or Quake 2 in the previous two years, but that may have as much to do with the resolution settings or video-capture software used by the video’s creator as with the games themselves; it was the accelerated version of Quake 1 that really set off the revolution. But it’s still obvious in this video that something dramatic happened between roughly 1996 and 1998, and that something was 3DFX.

Within a few years 3DFX had serious competition from chip sets developed by Nvidia and ATI. By the end of 2000 the company had collapsed, selling off its patents to Nvidia, which along with ATI remains one of the two major producers of 3D graphics boards for gamers. But by then, 3D-accelerator technology had become ubiquitous. Even ordinary computers, those used exclusively for spreadsheets or Web surfing, had 3D acceleration built in, even though in many cases it was never used.

Once the revolution began, 3D computer animation improved so rapidly that by 2004 games like Half Life 2 could produce scenes that looked like they’d been shot in the real world by video cameras:

Half Life 2

Half Life 2: Is it real or is it a video game?

By 2006, games like The Elder Scrolls: Oblivion had even achieved my holy grail of computer graphics realism: You could see individual blades of grass blowing in the wind:

The Elder Scrolls: Oblivion

Blades of grass seemingly blowing in the wind

This was actually a bit of a cheat. You weren’t seeing individual blades of grass. You were seeing transparent polygons with clusters of grass “painted” on them through a technique known as texture mapping. But it looked real and in the world of computer game virtual reality, that’s what counts.
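For the curious, here’s a toy Python version of the trick. Real engines do this on the GPU with alpha-tested texture mapping; the tiny texture and framebuffer below are stand-ins I invented for illustration.

```python
# A toy version of the grass trick described above: a textured quad
# where most texels are transparent, so only the "painted" blades of
# grass get drawn and the background shows through everywhere else.
T, G = None, "grass"  # None stands for a fully transparent texel

GRASS_TEXTURE = [     # a tiny 4x4 texture with an alpha channel
    [T, G, T, G],
    [T, G, G, G],
    [G, G, T, G],
    [G, G, G, G],
]

def draw_billboard(framebuffer, texture, x0, y0):
    """Copy only opaque texels onto the framebuffer (a software 'alpha test')."""
    for dy, row in enumerate(texture):
        for dx, texel in enumerate(row):
            if texel is not None:  # skip transparent texels entirely
                framebuffer[y0 + dy][x0 + dx] = texel
    return framebuffer

fb = [["sky"] * 8 for _ in range(8)]
draw_billboard(fb, GRASS_TEXTURE, 2, 4)
# The sky still shows through wherever the texture was transparent:
print(fb[4])  # ['sky', 'sky', 'sky', 'grass', 'sky', 'grass', 'sky', 'sky']
```

A field of grass is just thousands of these quads, always rotated to face the camera, which is vastly cheaper than modeling every blade as geometry.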

Only a few years over schedule, computer graphics had achieved the vision I’d had for them in 1981, when I’d looked at a picture of a crudely drawn warrior facing down a crudely drawn monster in a very crudely drawn maze on the box for the game Morloc’s Tower:

Morloc's Tower

A crudely drawn warrior in a crudely drawn maze. Computer game virtual reality had come a long way in 25 years.

There was only one element missing from my original vision of computer games as virtual reality: artificial intelligence. MMOs — massively multiplayer online games — allowed real human beings to interact as computer-drawn characters, but not really in a way that produced a story. I wanted computer-generated characters who were as artificially smart as they were artificially realistic, so that I could enter a world and create my own story from existing elements, stepping into a kingdom on the verge of war or a conflict between mobsters and becoming involved with its people in such a way that I would actually affect the direction of that world’s history.

Computer game designers were working on this too, but without true AI, creative substitutes had to be found that would give players the illusion that they were in a real world interacting with real people. Essentially, designers have gone in two different directions to create this illusion: allowing players maximum freedom within a richly imagined world or forcing them down a preconceived story path that offers only a veneer of true freedom. In the next (and, I hope, final) installment of this post, I’ll discuss three games that take very different approaches to creating stories within virtual worlds: The Elder Scrolls: Skyrim from Bethesda Game Studios, The Walking Dead Season One from Telltale Games and Dishonored from Arkane Studios.