Overvolted

How is this a game?
— 5:24 PM on January 26, 2009

I checked dictionary.com before starting this blog post to make sure my definition of "game" was correct, but apparently, a game is just "an amusement or pastime." So congratulations to Prey, Mario is Missing, and the new Prince of Persia. You are, technically, games. I don't agree, but at least you've got a dictionary on your side.

I remember when Mario is Missing came out on the Super NES (I was in junior high, so I'm dating myself a bit). One of the biggest points of ridicule for the game was that you couldn't lose. It just couldn't happen. You pretty much just followed the "game" along its set path until it concluded. This was around the time we were getting our asses handed to us in the Mega Man X games and, especially in my case, repeatedly loading Gradius III for that extra dose of humility, so a game that was impossible to lose just seemed, well, stupid.

Flash forward to the past few years, and we've had titles like BioShock, Prey, and Prince of Persia take the sting out of losing. BioShock's Vita-Chamber system didn't bother me that much, because it still meant dragging my wrench back to wherever the Big Daddy elected to wander. However, it ticked off my girlfriend and obviously a number of other people, and the option to disable the chambers eventually materialized in an official patch. At least someone at 2K Games learned a valuable lesson there.

My major disappointment was Prey. The Spirit World mechanic broke that game completely—worse than, say, how Knights of the Round broke the classic Final Fantasy VII, because you could still lose that game if you were a complete mouth-breathing moron. Dying in Prey just meant going to the blue world for a little while, then coming back exactly where you were when you died. You could shoot the flying things in the spirit world for bonus health or what have you, but it didn't really matter, because dying was just sort of a formality in that game and not of any actual consequence.

Unfortunately, once I realized I could stand in a room full of enemies, not shoot anything, leave, and come back to the game in a half hour and still be more or less in the same place, I got decidedly sloppy. What's the worst that could happen? I go to the Spirit World a couple more times? Oh noes! Not that! I essentially stopped caring entirely, which is a shame, because although the weapon design was just a little too esoteric for its own good, the environments in Prey were pretty incredible. I spent the rest of the game looking at the pretty architecture and periodically killing bosses with the wrench. The actual thrill of playing was pretty much gone.

I confess I haven't picked up the new Prince of Persia. I've heard mixed things about it, but I have zero interest in a game I can't lose. If Elika is just going to bail me out of every situation, why am I even playing? If it's just to wander around a beautifully rendered world, I'm sorry, I'll just go watch a movie. The Pang Brothers film Re-Cycle is available on Blu-ray, and it's not going to trouble me with the formality of pressing buttons just to see the cool stuff.

I can appreciate that game designers have grandiose cinematic visions for their games and want us just to experience them. Every nascent art form has been defined by what came before it, just like how film was defined by photographs and plays, and photographs were defined by paintings. It just so happens that video games are being defined by film, and that's fine—great even—because it produces brilliant works of art like Mass Effect and Call of Duty 4: Modern Warfare. The problem comes when designers seem to forget their end of the bargain: that they have to produce a game that rewards the player for doing well and punishes him for poor performance. With something like Prey, I wonder why they even bothered making it a game.

One of the justifications for this "no death" design is that dying breaks the flow of the game, and the player is just going to load an auto-save or what have you. To that, I have two responses: first, I was a big fan of the checkpoint system employed in Far Cry and the Rainbow Six: Vegas series, because it forced you to make it through on the game's terms while doing away with any auto-save, auto-load busywork. Second, the frustration of being delayed, of having to get back to where you were when you died, is a punishment for failure. It makes the game a game. Do well and you get to advance. Fail and you stay where you are until you figure things out.
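If that sounds abstract, here's a minimal sketch of the checkpoint idea in Python (my own hypothetical names, not any shipping game's code):

```python
# A minimal sketch of a checkpoint system (hypothetical names, not any
# shipping game's code): progress persists only at designer-placed markers,
# so dying costs you everything since the last one.
class CheckpointSystem:
    def __init__(self, start_state: dict):
        self.last_checkpoint = dict(start_state)

    def reach_checkpoint(self, game_state: dict) -> None:
        # The designer, not the player, decides when progress gets banked.
        self.last_checkpoint = dict(game_state)

    def on_player_death(self) -> dict:
        # Failure has a cost: you replay everything since the marker.
        return dict(self.last_checkpoint)

# Get sloppy past the marker and you restart from it, ammo count and all.
checkpoints = CheckpointSystem({"level": 3, "health": 100, "ammo": 40})
state = {"level": 3, "health": 6, "ammo": 2}   # the player got careless
state = checkpoints.on_player_death()          # back to the marker's state
print(state)                                   # {'level': 3, 'health': 100, 'ammo': 40}
```

Contrast that with Prey, where the death handler would just hand you back the very state you died in. The delay is the whole point.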

The auto-save system works just fine, anyhow. My first run through Mass Effect involved lots of retries, and on repeat plays, I remembered where the more difficult encounters were so I could save in advance and experiment with ways to pass them successfully. When I finally did clear a room or kill that last Geth Sniper, I got a feeling of actual accomplishment. The game wasn't holding my hand; I had to figure it out and make it through myself. If I had just re-materialized exactly where I was before with barely a skip in my step, how much skill would advancing really have required? I'm not good; I'm just persistent.

I wrote this blog post because I wanted to articulate the importance of losing a game and the role that simple, time-honored game mechanics like saving and loading—be they off a function key or off a save point—have in maintaining balance and keeping a game from getting too easy. If I'm playing a game that thinks so little of me as a player that it has to keep me from dying, what's the point?

84 comments — Last by khands at 9:37 AM on 02/04/09

Stereoscopic 3D still sucks
— 1:06 PM on January 21, 2009

Some of you may have noticed a sort of 3D renaissance taking place recently in the entertainment industry. Such high-profile, quality releases as Beowulf and My Bloody Valentine 3D haven't been doing the movement a whole lot of favors, but James Cameron's next magnum opus—and first feature film since Titanic—is being filmed in 3D. All kinds of wacky stuff is being produced to play in 3D in IMAX theatres, as well. If you didn't know any better, you'd think this was some crazy new technology threatening to revolutionize entertainment. Especially if you're one of our younger readers, you may not be aware that this stuff already had its time, and guess what? It failed miserably.

This point is relevant because plenty of other firms are trying to get stereoscopic 3D going on your computer, too, including Nvidia with its GeForce 3D Vision. A kiosk I continually pass by at Fry's Electronics has a monitor tweaked for stereoscopic vision, too. All you have to do is put on these crazy 3D glasses and you, too, can enjoy Hellgate: London (yes, this really is one of the games they demo it with) in stereoscopic, questionable-quality 3D! Given how well it worked, I wouldn't pay five bucks for it, much less three figures.

I'm not inclined to point fingers at this company or that company when I say 3D is frankly still an awful idea. Consider what's arguably one of the best stereoscopic 3D implementations on the market right now: Nvidia's GeForce 3D Vision. I'm told it produces excellent image quality, but there are some real barriers to entry here. The technology requires a 120Hz monitor, and because the GPU has to render a separate view for each eye, it incurs a fairly precipitous performance drop. Most damningly, even though these are pretty much the pinnacle of 3D glasses, the fact that 3D glasses are present at all is a problem in itself. If I take a head count of my close friends, at least 50% of them wear glasses, and while that's anecdotal (I also associate with an abnormally large number of southpaws, for what it's worth), it's still indicative of a large number of people for whom 3D frankly isn't going to work that well. Ever put 3D glasses on over your regular glasses?
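As for those hardware requirements, the arithmetic behind them is simple enough. Here's a back-of-the-envelope sketch (my own rough figures, not Nvidia's published specs):

```python
# Back-of-the-envelope math for active-shutter stereo 3D (my rough figures,
# not Nvidia's published specs). The display alternates left-eye and
# right-eye frames, so each eye sees half the refresh rate, and the GPU
# has to render the scene once per eye.
display_refresh_hz = 120
per_eye_hz = display_refresh_hz // 2   # 60Hz per eye; hence the 120Hz panel
views_per_frame = 2                    # two camera positions per displayed frame
print(f"{per_eye_hz}Hz per eye, roughly {views_per_frame}x the rendering work")
```

Half the refresh per eye and double the rendering work, and that's before we even get to the glasses.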

The problem is that stereoscopic 3D is still, in my opinion, more or less asinine. When I have to put on these kooky glasses (or fit them over my regular glasses, which is uncomfortable at best), I feel like it just screams "gimmick." In fact, the whole concept has always felt like a gimmick to me. All I have to do is pop out my Freddy's Dead: The Final Nightmare DVD for proof of that. The 3D finale of the movie isn't just a failure because the movie sucks (and it does, but I love it anyway), but because 3D just isn't all that exciting. It had a very brief run of popularity in the 70s and 80s before dying a well-deserved death. I love Captain EO as much as the next guy, but 3D had to go.

My biggest problem with 3D is that, frankly, it's not 3D—just an approximation of it. When you put on the glasses, it just feels like an optical illusion. This is compounded by the fact that for some of us (maybe just me?), there's something of a conscious effort required to get the most out of the experience. It's not one of those things where it just looks perfect from every angle; there are certain ways to look at it that seem better than others. Having my viewing experience of a film or game significantly altered by not looking at it dead on just seems silly to me. And as I said before, after all this trouble, I don't think it looks that good, especially with the coloring, contrast, and brightness problems it can induce.

I wouldn't care quite as much if 3D were just a fad, but it unfortunately has lasting effects. Although video games aren't going to have this problem—they're inherently predisposed to producing three-dimensional spaces anyhow—movies produced for 3D have historically turned into artifacts of the era. Taking advantage of 3D in film means shooting scenes in a fashion you ordinarily wouldn't, and there's a very good reason you wouldn't: outside of 3D, those shots tend to look really, really lame. Consider the sword pointed directly at the camera in the cinematic abortion Beowulf. I'm sure it looked fantastic in 3D, but it just doesn't work in regular viewing.

The final bone I have to pick with 3D is a more obvious one: do they really think we're this naive? I've noted that 3D came and went decades ago, and I believe very little has changed since then. The technology hasn't undergone any radical evolutions, and you're still wearing the same stupid glasses. They're just slightly different stupid glasses. As a consumer, I feel somewhat offended at having old trash pawned off on me as the new hotness, especially knowing full well that it never had legs to begin with.

When I spoke to Scott about this, he said 3D was one of the ways movie theaters were trying to stay relevant. This certainly wouldn't be the first time theaters—and studios—have tried to innovate to maintain business. Your movies are in wide-screen right now because of the advent of television; prior to television, films were shot in standard aspect. But wide-screen worked because it more closely approximated how we see, which is to say we have a greater field of vision laterally than we do vertically. It also didn't require us to do anything different to watch our movies. We didn't have to put on glasses, or stand on our heads, or anything.

I think 3D is a naked attempt to stave off obsolescence for a means of distribution that wouldn't be fast approaching death if its real shortcomings were addressed (but that's a rant for another time). And at the end of it all, my point stands: 3D sucked then, and I don't believe anything has changed.

48 comments — Last by Firestarter at 5:17 AM on 01/25/09

The Radeon HD 4000 series pushes things forward
— 1:08 PM on January 14, 2009

I'm sure I'm not the only person looking forward to AMD's Mobility Radeon HD 4000 graphics processors wending their way into notebooks everywhere. I'm currently in a curious position where I'm shopping for a laptop I don't necessarily need, because once I graduate from college this quarter, I may not have much use for one anymore outside of the odd trip. And let's face it: when you're a mild agoraphobe, trips aren't exactly common on your "to do" list. Yet I can't help but be excited about the 4000 series hitting the market.

To be honest, the transition to DirectX 10-class hardware seemed to lower the bar for Nvidia's mid-range GPUs. In the generations leading up to the G80, every mid-range Nvidia part was more or less a top-of-the-line GPU chainsawed in half. The GeForce 6600 GT was the first eight-pipeline part to hit that market, and it was able to run Doom 3 at 1600x1200—an impressive feat at the time. The GeForce 7600 GT wound up being so powerful that it performed eerily close to the 7800 GS, and as a former 7600 GT owner, I can tell you it was a monster for its time.

Then the GeForce 8600 GT and GTS came along and were met with a collective "meh." Maybe Nvidia was tired of mainstream hardware cannibalizing its high-end parts, but the G84 was a smaller slice of the G80 than some were expecting, and its performance often fell below the previous generation's performance-class hardware. Meanwhile, AMD struggled with somewhat mediocre Radeon HD 2600-series offerings (and, before those, the unremarkable Radeon X1600 lineup).

So why the history lesson? Because AMD and ATI's missteps seem to have hurt the mobile market by letting Nvidia get away with slower mainstream products. ATI mobile hardware suffered a steady decline after the Mobility Radeon 9600 (which dominated the market during its time), and only with the Mobility Radeon HD 3000 series did AMD manage to offer somewhat compelling performance. With little competition in sight, the G84 became Nvidia's high-end mobile GPU. For how mediocre Nvidia's current mobile hardware is, though, the 3000 series is often worse. G84 and G96 (essentially two sides of the same coin) derivatives run amok largely because of how middling the Mobility Radeon HD 3650 is. And now, nearly two years after the G84's launch, the best we've got is an underclocked GeForce 8800 GT. That's the mobile top of the line. It really doesn't get any better than that.

This is why the Radeon HD 4000 series is so important. The desktop HD 4670 doesn't trail the GeForce 9600 GT and 9800 GT by very much at lower resolutions, and the RV730 is a much smaller chip with a smaller power envelope. When it finally lands in notebooks, there's a very good chance that even its underclocked, mobility-optimized flavors will be competitive with GeForce 9800M-series GPUs. AMD's gamble on a smaller, more efficient die may very well pay off. While the GTX series is simply too much chip to fit into a mobile chassis right now, AMD will be fitting the more svelte RV770 into desktop replacement notebooks across the land (assuming they nab the design wins, which they very well could).

Even better, while we still can't fit a full 128-SP G92 into a laptop (remember, the 9800M GTX is essentially an underclocked 112-SP 8800 GT), the RV770 will be making the transition with all 800 stream processors intact. Finally, a low-end market that's steadily being cannibalized by strengthening integrated graphics will once again be able to justify a dedicated entry-level GPU. The HD 4350 and 4550 are both capable enough parts for the casual gamer, with the 4350 essentially doubling the performance of its predecessor and anagram, the 3450. When you realize that the HD 3450 and 3470 are two of the fastest low-end mobile GPUs currently available, you can get excited even about these entry-level parts.

The Mobility Radeon HD 4000 series will hopefully force Nvidia out of its complacency, and the Mobility Radeon HD 4670 stands to be an outstanding contender for 15.4" notebooks, allowing mobile gaming platforms to advance again. My upcoming review of MSI's GX630 laptop, which carries a GeForce 9600M GT with 512MB of GDDR3 RAM, should be proof enough of how poor mobile gaming is right now. When a GPU that can barely handle Enemy Territory: Quake Wars at 1280x800 (settings maxed, antialiasing off) constitutes mainstream mobile gaming hardware, something is seriously off. The HD 4600 series, if the performance of its desktop cousins is anything to go by, could very well change this paradigm and push things forward.

While I can't get too excited about the Mobility Radeon HD 4800 line (large gaming notebooks are, in my opinion, just massive sinks of money), the possibility of enjoying a Radeon HD 4600-series chip in a 15.4" notebook is exactly the kind of thing that makes me keep an eagle eye on the future of laptops—and if I wind up going to grad school, I can almost certainly see one of those in my future. If AMD can get these out to manufacturers in quantity and pull off another coup in the mobile market the way it did with the desktop HD 4000 line, the next few months may be happy ones indeed for mobile gamers.

26 comments — Last by paulWTAMU at 6:20 PM on 01/15/09

HP and the opposite of progress
— 2:37 PM on January 7, 2009

I'll come right out and say from the get-go that I'm typically a big fan of HP's laptops. I'm currently on my seventh laptop in as many years, and two of those systems have been HPs. Why I parted with the others is a discussion for another time. Prior to the release of the current dv lines (dv4, dv5, dv7 et al), HP was often the first brand I looked at when I was either in the market or window shopping. Unfortunately, I'm not as big a fan anymore. I'll explain.

When I bought my dv2500t, custom-built through HP's website, in December 2007, the company had a fantastic coupon deal going, and I'd already tried a few of the 14.1" HP models in stores to get a feel for the chassis. Prior to that, I'd owned a special edition dv6000z (15.4" with a Turion 64 X2 and GeForce Go 6150). The chassis design HP was using at the time seemed to be growing a bit long in the tooth, but I actually found it a very attractive compromise between the all-out gaudiness of modern Toshiba "Fusion" builds, with their hideously glossy keyboards, and the dull silver units Dell is still pumping into the marketplace.

HP's dv2500t.

One of my favorite things about the HP design was the slope on the bottom half, which felt more ergonomic and comfortable to use than traditionally flat laptops. The contours of the unit both open and closed were attractive to me, too. So while I'm not the type to impede progress, I can understand why HP stuck with this build for as long as it did.

In the time I've had to spend with my dv2500t, I've encountered the kinds of quirks I'd come to expect. There's no such thing as a perfect laptop: design is a balancing act between heat, performance, battery life, comfort, aesthetics, cost, and so on. The upsides of this particular build are the latch-less design, the incredibly firm hinges on the lid (I'm typing this on a plane, and the tray table wobbles more than the laptop screen does), high-quality speakers, and a generally pleasing aesthetic.

The dv2500t is a little heavier than comparable models, but it feels surprisingly solid. Even though the black and silver of mine is the most common coloring (my special edition dv6000z had a very distinctive white instead of black), it's still attractive. The imprint design under the glossy finish is borderline impossible to damage, so a thorough cleaning would be all the shell needs to look brand new. Finally, the recessed touchpad can be toggled on and off via a small button just above it. The beauty of the touchpad placement is that I can count on one hand the number of times I've accidentally brushed it with my palms.

Of course, it's not all kittens, unicorns, and rainbows. The wireless switch is very loose on HP laptops from this era, the headphone jack placement in the front is less than ideal if you're using the laptop on your lap, and the optical drive could be a bit more secure. On top of that, HP followed the trend of touch-sensitive media buttons, and I'm not a big fan of it. With no feedback to speak of, I sometimes have trouble gauging if I'm hitting one of those buttons properly, and the touch-sensitive volume control only makes matters worse. This is a minor flaw, though; media keys have Fn key shortcuts, and I just use the speaker icon in the Windows tray to change volume. I can adapt.

The number one issue for me—and I'm sure for others—has been the hard drive bay. It lies under the left palm rest, and I'm fairly certain the GeForce 8400M GS is hanging out under there, too. If not that, then the Core 2 Duo T7250. Either way, something there is getting mighty hot, because even my 320GB Western Digital Scorpio Blue hits punishing temperatures of 56C when the system is under load. It's enough to make my palm sweat, so it makes me nervous about the laptop's longevity (not to mention the hard drive's). Heat is the whole reason I'm running the Scorpio Blue instead of a Scorpio Black—believe me, if I could get a 7,200-RPM drive in here, I would.

So, the dv2500t has one major flaw in a sea of minor ones, but overall I'm a smitten kitten with this laptop, and I've had a hard time finding anything else that excites me. Asus makes some great laptops, but while I can easily find an extended battery for this HP system, spare batteries for Asus notebooks are scarcer, and worse, they often top out at 2.5 hours from the factory. No dice; I want three hours bare minimum, and even my two-year-old 12-cell still gets more than four hours to the charge. As for Dell and Lenovo laptops, I generally find them aesthetically unappealing, and aesthetics are a big deal for me: I want to enjoy using my laptop. I want it to feel like an extension of myself rather than just a device I use to read The Tech Report at school when I should be paying attention.

Where am I going with this? Frankly, the overhaul of HP's laptop lineup (which some call overdue) has left me disappointed. My favorite aspects of the old build are gone: the black, textured keyboard has given way to shiny silver or bronze ones, the incline of the keyboard is now flat, and the tasteful bi-color design that accentuates the inside of the unit has been changed in favor of one color for the lid and one color for the body. I could have handled these updates fine if HP had improved the overall internal design, but that doesn't seem to be the case with the dv4t. The hard drive bay is still right smack under the left palm rest, and if you play with the units in retail stores, you'll see that corner is still notably hotter than the rest of the notebook.

The new dv4t model.

The final, killing blow for me may be a simple one. I like Intel's Centrino 2 platform, and the fact that you can get a 2.4GHz dual-core laptop processor with a 1,066MHz front-side bus in a 25W envelope two years after the seminal E6600 desktop chip is, at least to me, impressive. However, the discrete graphics option is dismal compared to the system's predecessor: HP includes a GeForce 9200M with DDR2 video memory. This part is similar to the one in my current laptop, though mine at least has faster-clocked GDDR3 memory.

Ultimately, HP's current dv-series laptops (and the dv4t specifically) seem like one step forward and two steps back. I find them less impressive than their predecessors from an aesthetics standpoint, and while I can't say the same with certainty for the 15.4" models, the dv4t seems to have inherited all of the dv2000t series' problems. It's disappointing to me because HP has been my go-to brand for a while. Yet even when I'm getting tired of the sluggish performance of my GeForce 8400M GS, a year and a half of progress would net me no real improvement in graphics performance. The best thing I can do is hope that HP announces something more compelling or starts offering Nvidia's GeForce 9400M IGP, which is actually competitive with low-end discrete GPUs at this point.

For a while, I felt HP's laptops were the most attractive, beautifully designed machines on the retail floor, and now they look little better than their over-glossed Toshiba rivals. I sincerely hope HP makes a good showing at CES 2009, but if not, I won't be too upset. My laptop isn't long in the tooth just yet, so I can happily keep it if no one wants to sell me something more interesting.

27 comments — Last by dmitriylm at 11:32 AM on 01/14/09

On the contrary: Dead Space and Fallout 3
— 6:23 PM on December 22, 2008

Cyril and Geoff both recently posted positive reviews of two of the more popular games of 2008, Dead Space and Fallout 3. Being the contrarian I am, I wasn't terribly taken with either of these games. Don't get me wrong; they're both fun. I've beaten Dead Space and played to level 20 in Fallout 3, having gone through what I estimate must be at least two-thirds of the story.

However, people talk about Dead Space like it's one of the scariest games ever made, and Fallout 3 as though it's the game of the year. Neither game deserves its reputation, in my view. I'll tell you what my vote is for game of the year at the end of this blog post.

Dead Space
As I said before, Dead Space is not a bad game. I played it to the end and for the most part had a good time with it. While Scott has had major issues with the PC controls, I found my Radeon HD 4870 (with vsync disabled) was able to deliver a satisfying enough experience. I do agree that there are many times when it's over- and under-sensitive, and I personally had to get used to not walking diagonally into everything due to the awkward third-person perspective, but I did get the hang of the game eventually.

A large part of my problem with Dead Space—and maybe it's just because I'm a film student who makes horror movies—is that the horror aspect is simply bad. When people talked about it being one of the scariest games ever made, I was on board. I'm fairly hard to scare, but I do know what constitutes scary. Freddy Krueger in the original A Nightmare on Elm Street constitutes scary. The tunnel scene in 28 Days Later constitutes scary. The chilling finale of Ringu constitutes scary. Dead Space is not scary. It isn't like I can't be scared by a horror game. Fatal Frame scared the daylights out of me the first time I played it. The first three Silent Hills are clearly creepy games. Even parts of Doom 3 wigged me out. Dead Space is not scary.

Yahtzee over at Zero Punctuation did a fantastic review of the horror elements of the game, probably better than I ever could, but here's my basic thesis: this is a game made by people who might have heard of horror movies and video games, even seen and played a couple (specifically Event Horizon and Doom 3), and figured they knew all they needed to know to scare the crap out of everyone. EA likes what they see, the game gets hyped up as being ultraviolent and absolutely terrifying, and all of a sudden we have a shiny new IP that comes with its own animated movie on release day.

Why am I so certain about the design team's unfamiliarity with quality horror? Simply put, I think Dead Space makes all the same mistakes crappy American horror films make: assuming that gore and violence are somehow inherently scary. While small children (and teenagers looking to get laid) might be scared by Friday the 13th, that philosophy clearly doesn't hold up later in life, when you'll be playing Dead Space.

Let's start with the decor, and I'll compare that to the game's possible spiritual predecessor—Doom 3. id Software's last shooter gradually ramps up how bloody and generally messed up the Martian base is. On the other hand, Dead Space just paints blood all over the walls indiscriminately and copiously, and anyone who's seen Peter Jackson's genre classic Dead Alive (Braindead outside of the States) knows blood and gore just become silly after a certain point.

Some atmosphere-related design decisions in Dead Space also seem to defy all logic. There's a door to a slab in the morgue that swings open of its own volition when you enter the room, but the game doesn't really have the kind of supernatural trappings to support something like that, and it just feels like a cheap Halloween trick. Likewise, the abundance of flies and maggots makes no sense at all, and I know I'm not the only person who was put off by this. On Earth or some other planet? Sure, flies and maggots, go for it. In a carefully controlled environment on a space station? It just doesn't work. And finally, there's the lab full of canisters with fetuses in them. Really? Didn't anyone think that was reaching a bit? Very little in the game supported that, either.

As a side note on the horror trappings segment, consider this: I wouldn't be so critical of these decisions if the game didn't make a big deal of being a serious horror venture, the same way the Transformers movie decided to cram two hours of worthless story down my throat when all I wanted was giant robots destroying stuff.

Most people would agree that Doom 3 was too dark. While I'm not one of those people, I can definitely appreciate that sentiment. The designers of Dead Space apparently can, too, because the game is wonderfully, magically, beautifully lit. It's never too dark for you to see anything, never too dark for something to jump out and get you, and consequently, never dark enough to be creepy.

To close my tirade about the visual style, I'll say a word about the necromorphs themselves. I must be the only person who wasn't impressed by these, and I wish I had a more articulate way of saying why. There was no rhyme or reason to them, no sense of design other than "this might look cool." This excludes the tentacle babies, of course, but those feel like a point where someone just said, "wouldn't it be scary if there were like... mutant babies?" Your immediate comparison might be to the cherubs in Doom 3, and guess what: the cherubs were much creepier in my opinion. The cherubs giggled like infants. The tentacle babies just feel like a set of also-rans.

The visual stuff wouldn't be so criminal if the sound design could at least carry its end of the bargain. But I don't think it does. "Creepy sounds" aboard the Ishimura almost always feel random, like metal sometimes clanking in places you shouldn't even be able to hear it. There's no restraint or control here; it just sounds like a looping "creepy sound" background track. The only time the sound design rises to the occasion is in a vacuum, and sadly, those zero-gravity vacuum scenes are the only ones I found inspiring.

Finally, while the much-touted "strategic dismemberment" gameplay just felt like moving the hit boxes from the torso and head to the joints, it was fun enough. The real problem lies in the fact that, honestly, most of the weapons in the game are fairly worthless. I set myself on fire with the flamethrower the first time I used it. The pulse rifle doesn't fire shots in a way that's really intuitive for severing limbs. The best alternative weapons I found were the line gun and the ripper, but the plasma cutter you start the game with is easily all you'll ever need—especially when the ammo counter is full. Good ammo capacity, quick reload, excellent firing speed, abundant ammo... I mean, I used it to kill the end boss, for crying out loud. Using anything else just felt silly.

So, I found Dead Space competent and potentially enjoyable in terms of pure gameplay, but it irritated me whenever it started putting on airs of grandeur. It was still fun enough for me to finish it, although that's more of a reflection on my desire to play games that involve killing lots of things as opposed to being pulled into the story or atmosphere.

Fallout 3
Speaking of atmosphere, let's move on to Fallout 3—a game that pretty clearly has its share of devotees screaming about just that. Now, I haven't played Fallout or Fallout 2, but I can say with relative certainty that Fallout 3 feels almost nothing like them. Why? Because it feels too much like Oblivion.

If you've logged some ungodly hours on Oblivion like I have, you've probably become intimately aware of all the ins and outs of the game—particularly the technology of the engine. You know where to expect slowdowns; you know how the models look and act; you know all of it. So naturally, Fallout 3 in many ways looks exactly like its predecessor. Movement and jumping even feel identical to Oblivion. Dialogue operates in almost the exact same way, where the game pauses and centers on the person you're speaking to. And dialogue is really the first place where Fallout 3 fails spectacularly in my opinion.

Look, it's 2008. Mass Effect exists. Heck, I think even the original Half-Life 2 had better models with better animation and better voice acting. Why is Bethesda still camping out in the uncanny valley? Why has something that was a problem with Oblivion gone more or less unchanged? It gets worse when you listen to the pacing of how the characters speak and realize they were clearly directed to speak slowly enough that the player could actually read the entire subtitle at the same time. In other words, totally unrealistically.

The way the dialogue itself branches doesn't do the game any favors either. This was barely a good idea back in Jade Empire, until BioWare realized how to more or less fix it and did so in Mass Effect by giving you options that were "the gist" of what you would say while not being exactly it. The game then became involving because you were not only wondering how your interlocutor would respond, but also what exactly your character would say.

And to finish harping on the dialogue, Liam Neeson was a horrible idea as a voice actor to play your father. No matter what ethnicity you choose for yourself (and thus for him), your father always sounds exactly like Liam Neeson. The disconnect between the character's appearance and his voice becomes maddening, and it threw me out of the game.

So, from the technology and dialogue perspectives, Fallout 3 unfortunately feels dated and poorly directed to me. I'm glad Bethesda finally fixed the hideous water from Oblivion, but they also made everything gray and brown. While everyone else seems to love these color choices, I can't help but feel like this palette didn't do Quake any favors back in the day, and it's not exactly doing Fallout 3 any favors now. For this reason, I prefer Oblivion, which utilizes crazy things like colors and contrast. I do recognize these colors are supposed to be appropriate considering the game's wasteland theme, but I feel Bethesda stuck too closely to that theme. It's like the ending to the movie Identity: just because something works perfectly with what you had in mind, that doesn't mean it was ever a good idea to begin with. Fallout 3's color palette needed flexibility it never got, and as a result, every environment feels the same, and the game feels even more damningly repetitive than Oblivion did.

Finally, while I love the Perks system, I feel like Bethesda applied a band-aid to the leveling issues of Oblivion rather than actually fixing the problem. Capping your level at 20 was a horrible idea for a game as expansive as Fallout 3, because it removes one of the chief ways of rewarding the player for exploration: experience points. I have no desire to continue exploring. I've already seen what I'm fairly certain is almost every weapon in the game, and everything so far has been gray or brown, so there's not going to be anything new out there. What have you got left to offer me, really?

I will say that the lockpicking mini-game is a nice change of pace from Oblivion's, and the hacking game is loads of fun for someone like me who loves word puzzles. While lockpicking could get tedious and old in both games, the hacking game was always enjoyable for me and, in an odd bit of contrast, highlighted how poor BioShock's really is. I adore BioShock, but I got sick of playing Pipe Dream all the time. The simple word puzzles in Fallout 3 were more my speed by far.
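For anyone who hasn't played it, the hacking game boils down to a Mastermind-style letter match. Here's a rough sketch of my own reconstruction (pieced together from playing, not from Bethesda's actual code):

```python
# A rough reconstruction of Fallout 3's terminal hacking puzzle (my own
# sketch from playing the game, not Bethesda's code). You pick candidate
# words, and the terminal reports the "likeness": how many letters match
# the hidden password in the same position, Mastermind-style.
def likeness(guess: str, password: str) -> int:
    return sum(g == p for g, p in zip(guess, password))

candidates = ["WASTES", "WONDER", "WANDER", "WASHED"]
password = "WASHED"  # hidden from the player, of course
for word in candidates:
    print(f"{word}: {likeness(word, password)}/{len(password)} likeness")
```

Each guess narrows the candidate list, which is exactly why it scratches the word-puzzle itch.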

For what it's worth, I do enjoy Fallout 3, and I frankly suspect it's going to live and die by its mod community the way Oblivion did (vanilla Oblivion isn't particularly great, either). Fallout 3's combat system is leaps and bounds ahead of Oblivion's, and I've found VATS to sync up nicely with real-time combat (your mileage may and probably will vary). In many ways I feel like Fallout 3 is the livelier, more exciting cousin of S.T.A.L.K.E.R.: Shadow of Chernobyl. It has many elements I liked from that game but none of the ones I could do without. But it's also painfully repetitive, a flaw that's compounded by some terrible design decisions and a strange disconnect between the fantastic concept art and the mediocre visual execution I've come to expect from Bethesda. The dialogue, something that an RPG often lives and dies by, is in every way awful: badly written, badly directed, and badly executed visually.

Given this protracted list of grievances on these two generally popular games, one might find themselves wondering what I like—if anything at all. What holds up to my scrutiny? What would I peg as "game of the year?"

Mass Effect
I'm probably extraordinarily biased here. Mass Effect feeds on nerds like me who grew up largely on Star Trek: The Next Generation. It's a funny thing: thematically, it feels like it sits squarely between Star Trek and Star Wars, offering Star Trek's sense of adventure and intelligent writing, with the action and grand mythos of Star Wars. There's something that feels very grown up about the Mass Effect experience.

Where Mass Effect really beats the competition—save perhaps for the Half-Life 2 series—is in its delivery of a truly cinematic experience through the use of smart writing and outstanding direction. It's funny what a difference subtitles make; without being burdened by them, Half-Life 2 and Mass Effect are able to successfully mesh the interactive quality of playing a game with the passive feeling of watching a movie. Subtitles can draw you out sometimes and remind you that all you're doing is playing a game. They're a very small detail, but they can make all the difference in the world.

Part of what makes the voice acting really work is the use of big—but not too big—voice actors, coupled with a voice for your own character that never feels mismatched, no matter how much you customize his look. In Fallout 3, your father always sounds like good ol' Oskar Schindler regardless of whether he's black or Asian. Shepard's voice never has that effect, and the voice acting surrounding him (or her) is equally consistent. Cult movie nerds are going to recognize Keith David's voice acting as Captain Anderson, but he doesn't distract. In fact, the only voice that doesn't work is Star Trek: The Next Generation alumna Marina Sirtis as Matriarch Benezia. Benezia is underwritten to begin with, and Marina's voice acting just doesn't match the character. The mercy of her being underwritten is that she's underused, making her easy enough to ignore.

What finally nails the dialogue is the incredible modeling and animation for the characters, matching Half-Life 2 in some ways, exceeding it in others. Facial expressions are effective while being just stylized enough to avoid the precarious drop into the uncanny valley that Fallout 3 kicks around in.

As I mentioned before, the dialogue branching is also absolutely stellar, and BioWare smartly got rid of the single "good vs. evil" bar from its previous games, letting you more or less be simultaneously good and evil. It's a great dynamic, and the "good and evil" aspects are somewhat less blatant than before; the "Renegade" and "Paragon" meters might as well just be labeled "Riggs" and "Murtaugh." (If this makes no sense to you, I'm sorry you never saw any of the Lethal Weapon movies.)

On the other side of the coin, I found Mass Effect's combat to be shallow at first, but I eventually came to appreciate the depth of it. The space bar tactical menu is both intuitive and ingenious, allowing slowpokes like me time to think about and give orders, though targeting with it is sometimes problematic. After playing and enjoying the Rainbow Six: Vegas games and using the cover system in Mass Effect, I find myself now wondering why more action games don't employ a cover system. It's such a simple way to increase the depth of combat while making the environment feel more like an actual place instead of a backdrop. If I did have a major qualm, it would be how uninteresting the tech skills are in combat compared to the biotics. While biotics do all kinds of simple, fun stuff, tech skills are all basically variants on "grenade." Outside of AI hacking (which is awesome), they're pretty dull.

If Mass Effect has major failings, they have to be the abysmal inventory management system (which I hear is impressively even worse on the Xbox 360) and the landing missions that have you piloting the Mako. The inventory gets too complex over time, and it's difficult to determine whether or not half the stuff you're carrying even has any use. Something more visual that actually takes advantage of the mouse and keyboard would've been more appropriate here. At least, I would've liked some level of intelligent organization.

The Mako sections of the game, which are unfortunately numerous, range from enjoyable (flat terrain and shooting things) to the kind of thing you'd punish your kids with for misbehaving (everything else). I'm sure BioWare is very proud of the physics they've implemented here, but driving the Mako over harsh terrain is the kind of thing that makes you wish getting kicked in the Good and Plenty's would magically teleport you to wherever you wanted to go. If you play the game enough, you'll start to wonder if a swift roundhouse to the genitals isn't preferable to driving places. This is a genuine internal monologue you'll have. You'll really consider it.

So, even though the Mako portions of the game make me want to shave off my nipples, everything else is like a mad descent into nerd rapture. The number of subtle ways you can play and replay the game, the decisions you can make, and the incredibly intelligent dialogue branching... everything coalesces into one of the most replayable games I've ever played. The amount of time I've spent playing and replaying Mass Effect borders on the obscene; this is the kind of time you usually only devote to games designed to be simple and easy to return to, like Tetris and Audiosurf. It seems like the amount of complexity built into a game often serves to reduce its replay value, but in Mass Effect, it's been balanced beautifully.

The most curious thing about Mass Effect and how massively it's affected me (har har) is this: while visual quality tends to play a major part in whether or not I enjoy a game, I couldn't really care less about Mass Effect's graphics. I love playing the game on my Radeon HD 4870 at home, but even on the shamelessly overclocked GeForce 8400M in my laptop at the lowest possible settings, I still find it insanely fun and involving. With other games, I can't deal with having to reduce settings so far.

Call of Duty: World at War gets to wait until I get back home to my Radeon. But Mass Effect? My laptop would have to be fully immolated by my overheating GPU for me to put it down. Easily my favorite game of the year, and definitely in my overall top five.

72 comments — Last by Ojhysseus at 2:17 AM on 04/03/09

A look at laptops for the season
— 6:00 PM on November 25, 2008

As I'm sure you've noticed from the Christmas lights going up and the endless repetition of "Jingle Bell Rock" and "Santa Baby," the holiday season is upon us. While America faces a nasty economic downturn that is spiraling out into layoffs all over the country, we must now more than ever remember the true meaning of Christmas: to put ourselves blindly into debt buying gifts for others and ourselves. This guide is for the selfish jerks like myself who might seize the opportunity to upgrade their own laptops at holiday prices.

I must point out that, even though Christmas towers above you like a red-and-green commercial juggernaut, you need not trample old ladies at your local Best Buy on Black Friday to "get yours." Deals have been good all year round, and mobile hardware has been evolving relatively slowly as of late. Centrino 2 wasn't a huge upgrade over previous platforms, and Nvidia's GeForce 9M-series mobile GPUs have a lot in common with the previous generation. Basically, if you bought your laptop last Christmas like I did, you're probably okay. My HP Pavilion dv2500t with a GeForce 8400M GS (featuring a whopping 64MB of video memory) still runs Left 4 Dead, Quake Wars, and Call of Duty 4 pretty well, and those are the only games I really need to run on the go. But if Old Reliable is starting to feel a lot older and less reliable, and your average battery life is around the 30-minute mark, then I suggest reading on.

A brief treatise on netbooks
First, if you're looking for a simple Internet and word processing machine, a good netbook will probably fit the bill. The netbook guide we published in September remains a helpful resource, although two new and notable contenders have popped up since then. HP has produced a consumer-friendly Mini 1000 netbook lineup, which swaps the somewhat mediocre Via C7-M processor from the Mini-Note 2133 for an Intel Atom while retaining the same delightful keyboard design. Dell has also started selling its Inspiron Mini 12 for $549—that's a little expensive for a netbook, but Dell does provide a 12" 1280x800 display.

The last netbook I'll discuss here is a strange new contender: the Asus N10J. Asus has produced a weird little chimera that squeezes a 1.6GHz Intel Atom processor and a 256MB GeForce 9300M GS into a 10" chassis. Shoving dedicated graphics hardware into tiny laptops has been par for the course for Asus for years, but considering the unusual nature of an Atom netbook with a decent GPU, I'm surprised the N10J doesn't have an ASRock logo on its lid. We're currently working on a review, but if you're willing to shell out for the privilege of owning the only netbook designed to game (sort of), you can already do so.

This leaves us with the crowd favorites: Asus' Eee PC 1000H and Acer's Aspire One. Our editors have already sung the praises of the Eee PC 1000H as being one of the most well-rounded netbooks on the market, but if you're willing to compromise for a lower price tag, the Acer Aspire One is a dandy alternative. Though it has a smaller display and keyboard than the 1000H, the Aspire One's attractive, well-rounded design has made it a solid alternative to cheaper Eee PCs. The Eee PC 1000H tends to run about $100 more than the Aspire One, so it's up to you to determine whether or not the increased screen and keyboard real estate is worth the extra dosh.

A quick word about Asus
Asus gets a special mention here for one specific reason: the tendency to shove discrete graphics into just about everything it can. A visit to Newegg should yield 14.1" Asus laptops with your choice of GeForce 9650M GT or Mobility Radeon HD 3650 graphics chips. This is important, because I'm fairly certain there's a healthy gaming-oriented readership here, and those people may be happy to get solid gaming performance in a portable, utilitarian machine. Probably best of all, these notebooks are generally quite affordable. The most expensive one on Newegg right now costs only $1,199.

There is, as always, a catch. I've had experience with Asus notebooks, and depending on how picky you are, you may or may not be able to forgive their quirks. I've owned two—the 14.1" A8Jm (with a GeForce Go 7600) and the 12.1" F9Dc (with a GeForce 8400M G)—and both of them had weak hinges and mediocre battery life. The G50V I recently reviewed didn't really share these issues, but after lurking around forums, I've found that battery life remains fairly middling on Asus laptops (with 14.1" models pushing 2.5 hours tops). If you find this a reasonable trade-off for a gaming-friendly GPU, go hog wild. Personally, I've found myself willing to give up some graphics performance in exchange for better battery life and the possibility of an extended battery (which is hit and miss with Asus). It's not like I don't have a gaming desktop at home or anything.

I just wish my 8400M GS would run Rainbow Six Vegas 2 a little better—that's what I play with friends back home, and I'd really like to die less. When I'm in the market again, I'll no doubt be eyeballing Asus laptops once more.

And a little about Apple
First of all, if you're considering Apple, it's probably fair to say you're not looking for the best deal in the world. The new MacBooks and MacBook Pros have met with derision in some corners and praise in others, and our own Cyril seems pleased with his aluminum MacBook. I'm personally happy to see GeForce 9400M integrated graphics materialize, and it's nice to see Apple more or less call Intel out for foisting crappy graphics on the marketplace. That said, the new aluminum machines have problems of their own, like a very transparent market separation in the removal of FireWire from the consumer-grade MacBook. $1,299 is pricey for a 13.3" laptop, as well. If you must get your Mac fix, my vote goes to the original MacBook, which Apple sells for under a grand.

This recommendation comes with hesitation, of course, since I'm not a huge fan of Macs myself. I find Windows Vista to be just as good as Mac OS X, and the first-gen MacBook tends to run somewhat toasty. Finally, I like having an individual right-click trackpad button. Your mileage may vary.

The mainstream cheapsauce: HP's Compaq CQ50
Integrated graphics from AMD and Nvidia beat the stuffing out of Intel's chipsets and allow for cheap, low-end gaming on even the lowliest of laptops. By extension, I've found the CQ50 to really be the sweet spot for people who just want a cheap full-on laptop. It includes a dual-core AMD processor and GeForce 8200M integrated graphics with a basic 15.4" screen, and while the battery life is less than two hours, asking for more than that under $500 is barking up the wrong tree. The GeForce 8200M itself may not be quite as fast as ATI's Radeon HD 3200 IGP, which produces startlingly good performance, but it's definitely an improvement over the GeForce 7150M and its predecessors.

Of course, I also recommend this machine with some reservations. Talking to you as a friend, I've had rotten luck with anything that had the word "Compaq" on it. While HP swallowed up Compaq some time ago, I've always been wary of Compaq machines, and I can't guarantee your CQ50 won't develop some kind of horrible problem. When something this complex is $499, think about the steps that needed to be taken to get there. Reviews online are generally positive, though, so if you're willing to take the plunge, you can start here.

The ultraportable of choice: Lenovo's ThinkPad X300/X301
Let's say a netbook just doesn't cut the mustard for you. And let's say you find the MacBook Air appealing, but you feel Apple cut too many corners and pushed the price too high. If you want something that has more than a single USB port and is still remarkably portable and powerful, Lenovo's ThinkPad X300 is probably what you're looking for. (Some of you may have already seen this, but Lenovo's "ad" for the X300 comparing it to the MacBook Air comes highly recommended.) Although the X300 is a hair thicker than Apple's MacBook Air and has a slightly slower processor (an Intel Core 2 Duo SL7100 clocked at just 1.2GHz), it features a higher-resolution screen and—heaven help us—a DVD burner. You can get an X300 here, but if you're really interested in the cutting edge, there's the recently released X301.

The X301 bumps the integrated graphics up to Intel's new GMA 4500MHD and the processor up to a 1.4GHz, 45nm Core 2 Duo. If you go for the whole kit and kaboodle with dual batteries (one of which goes in the optical drive bay), the X301 will push over seven hours of useful battery life. And remember, you can actually remove the batteries yourself without having to disassemble your laptop. Unfortunately, the X301 starts in the neighborhood of $2,300, so it's not for the timid or money-crunched. Let's be realistic, though: if you could afford it, you'd probably buy it. I would.

I should also mention that Lenovo generally outfits ThinkPads with some of the best keyboards in the business, and the units themselves are often so solidly constructed they could serve as murder weapons and continue to work. With these ultraportables, you can at least confidently say you're getting what you pay for. Or at least most of it. Two large is still a lot of money.

The casual performer: Lenovo's ThinkPad T400
As much of an HP fan as I am, and as much as I like having a stylish notebook, I would probably be shortlisting the ThinkPad T400 if I were in the market today. Lenovo starts this model at a pretty reasonable $949, given that it's a 14.1" Centrino 2-based system with an optional Mobility Radeon HD 3470. The 3470 isn't the kind of powerhouse you're liable to find in a similar Asus laptop, but remember what I said about the Lenovo pedigree in the previous section. The T400 is a very well-constructed notebook with great battery life, and it brings with it the perks of buying Lenovo beyond the great shell and keyboard: the swappable drive bay and the myriad of accessories you can cram in it (a DVD writer, a second hard disk, or just another battery).

I will say that the ThinkPad look isn't for everyone, and I enjoy my HP notebook for that reason. Its aesthetic screams "business or pleasure" next to the monolithic black slab that is the typical ThinkPad. If you don't mind having an aesthetically challenged notebook, you can configure a T400 on Lenovo's site or buy a pre-built one through our price search engine.

The behemoth: OCZ's DIY 17"
This is a hard recommendation for me to make. It's hard because, as popular as 17" notebooks might be, these monstrous desktop replacements have always seemed kind of asinine to me. You pay a hefty premium for gaming performance, but you don't get much for your money—mobile GPUs are always less powerful than their desktop cousins, and upgrade options are severely limited. If you're just using it to crash LAN parties, you're worlds better off just building a small-form-factor desktop. Sure, it might involve a little extra packing and moving, but you'll also save something like a grand while getting equivalent performance in a much more upgradable form factor.

Dell, HP, Sony, et al make things even worse by foisting massive "notebooks" with 18" and bigger displays upon us. At this point, you just look sillier and sillier not going for a straight LAN desktop. Because of how impractical I find these machines, I'm omitting them from consideration entirely. Also omitted from consideration are Dell XPS and Alienware systems, since these tend to be somewhat overpriced for what they offer.

To be frank, I'm not particularly enthralled with "gaming PC" brands like Alienware. All the die-hard gamers I know make do with either a reasonably specced laptop (like one of the smaller Asus machines) or a desktop they or someone they know built. This continues to be one of the elusive joys of being a PC gamer: building your own machine and tweaking it yourself, something you just can't really do with an Xbox 360 (except for the odd Linux junkie). If you go out and buy an HP Blackbird 002 for serious gaming, it's a respectable system, but it's not yours. There's some cred to be had here. Alienware exists to try and foist that sort of cred onto the frat-boy set. This is purely my opinion, but that's what a blog is for, right?

This is why, if you simply must have a hulking gaming laptop, my favorite contender in this market is the OCZ Do-It-Yourself 17" machine. We'll be reviewing one of these in the future, but it's essentially a notebook that lets you install as much hardware as you reasonably can by yourself, giving you at least some feeling of accomplishment as a do-it-yourself sort of gamer. The notebook comes equipped with a Centrino 2 chipset and is one of the rare machines sporting ATI's Mobility Radeon HD 3870, available in both single-GPU and CrossFire configurations. At Intel's Centrino 2 launch, I actually had the opportunity to see a Mobility Radeon HD 3870 X2 running Assassin's Creed on a gaming machine, and I was fairly impressed. My admitted AMD bias, coupled with my joy of tinkering with computer hardware, makes the OCZ DIY 17" an easy recommendation for those who simply must have a big, beefy gaming laptop but would like to maintain some of their geek cred in the process.

Conclusions
At TR, we're fans of the smaller, more utilitarian machines. I also like having a laptop I can really play with, forcing it to run all kinds of crazy crap and seeing what it can and can't do, and with that desire comes a need for something other than an Intel GMA X3100. So for me at least, the 14.1" neighborhood (and 13.3" by extension) is where I like to play the most and where I've found the coolest stuff. I like what Asus offers in this segment, but battery life is a big deal for me, since I don't feel like lugging my AC adapter everywhere I go. I'd rather carry a spare battery than have to hunt for a plug, but doing without either one is best.

In the end, I might well recommend the ThinkPad T400 for its well-roundedness. It's reasonably light, configurable, a solid performer, and it has great battery life. Odds are I'll personally buy an Asus or HP on the next go around, however. Looks aren't everything, but they're something.

86 comments — Last by A_Pickle at 1:51 PM on 12/01/08