Overvolted

Low definition
— 10:51 AM on July 10, 2009

Sony and some market analysts are wondering why Blu-ray uptake isn't skyrocketing like they'd hoped. I have a few ideas.

About a year ago, I decided that, since I had a high-definition television and had it on good authority that movies looked better in high def, I might as well take the plunge. At the time, the best deal for a Blu-ray player was still the Sony PlayStation 3, and I had a problem with that: I just didn't want one. As many of the more PC-inclined gamers reading this know, most of the more interesting and desirable games these days go multi-platform. They often take a while to hit the PC, but it's usually worth the wait—they tend to be cheaper and sometimes even much improved, like Mass Effect and Assassin's Creed. As PC gamers, we even just got an incredibly solid port of Street Fighter IV, but that's a blog post for another day. Suffice it to say, buying a $400 PlayStation 3 just for Blu-ray seemed silly to me. So I did what any sensible geek would do: I built a media-center PC.

One of my biggest reasons was the availability of LG's HD-DVD and Blu-ray combo drive. This optical drive was selling for around $130 when I bought it, but the frugal shopper can grab it for as little as $100 now, if they know where to look. Amusingly enough, despite the death of HD-DVD, the drive seems to play those discs just fine. Better still, HD-DVDs are dirt cheap and typically include the exact same transfer you'd get on the pricier Blu-ray. Of course, inexpensive drive and media aside, AMD has been aggressive about making a media center easy to build, with dirt-cheap processors and its 780G chipset. For maybe a little more than I would've spent on a PlayStation 3, I was able to build a full-on computer capable of streaming Netflix and other online video, well before that capability started showing up in consoles (and now even televisions). Also, by popping in a cheap Radeon HD 3650, I created a box friends could come over and game on. Between that, my desktop, and my laptop, I'm pretty set for hosting my own small LAN party.

But I digress—let's get back to griping about HD movies. My biggest problem here is the prohibitively high cost of the media itself. While HD-DVD's death has produced a wealth of discs that are often even cheaper than their DVD counterparts, Blu-rays still command an absurd $10 premium over standard-definition media. Several friends and I now buy movies much less often: we won't buy the DVD of a movie that's available on Blu-ray, but we also won't pay the exorbitant price for the Blu-ray itself. As a result, we just don't buy either one. I'd have a hard time believing more people aren't running into the same issue.

Second is the software compatibility nightmare. Between the three versions of Blu-ray and the draconian digital rights management schemes for both Blu-ray and, to a lesser extent, HD-DVD, you can expect all kinds of cheerful software problems to plague your viewing experience. I actually keep both PowerDVD 7 and WinDVD 9 on my media center because, every so often, a disc comes out that just won't play on one of them. First, it was The Matrix on HD-DVD. I've since had the privilege of dropping $80 on two different import copies of Brotherhood of the Wolf on HD-DVD, only to discover that StudioCanal did such a dismal job mastering these discs that I'd have to buy a dedicated HD-DVD player to even dream about subtitles. I love the movie and have seen it multiple times, but I still don't speak French.

Third, and probably most irritating, is how often the improvement just feels incremental at best and unnoticeable at worst. High-def movies are woefully inconsistent in terms of quality, far worse than DVD has ever been. At worst, DVD looked like a clearer VHS; the worst-looking DVD I ever owned was an old Canadian copy of Hellraiser III: Hell on Earth, which looked like it had been mastered off a VHS tape and had a jumpy quality to the picture. Meanwhile, HD movies often have nothing but better color contrast and saturation going for them. I remain skeptical about how much of this is artificial tweaking, the same way retail televisions have their contrast and saturation jacked up to make them more eye-popping for consumers.

If you could explain why my HD-DVD of Clerks II looks better and sharper than my Blu-ray of Iron Man, I'd really appreciate it, because this problem has really sucked the wind out of my interest in high definition. There's just no way of knowing whether a high-definition disc is going to be an appreciable improvement over the DVD short of checking reviews online, and about the only safe bets are CG movies (TMNT looks stunning in HD). Video quality has become such a crapshoot in high definition that it prevents me from impulse-buying discs. I just watched the DVD of Splinter last night, and its upsampled picture was stunning, easily rivaling some of my high-definition discs. Meanwhile, my HD-DVD of Army of Darkness looks awful. Movies from the 1980s like The Thing often wipe the floor with more recent releases. (By the way, The Thing makes a fantastic case for high definition.)

The unfortunate reality is that, the vast majority of the time, high-definition media just isn't compelling enough to warrant the absurd cost of entry and the price of the discs, let alone the DRM headaches. If you're a movie buff, I'd strongly encourage you to think twice before making this leap.

Requiem for a 4870
— 12:14 PM on June 29, 2009

It's been a bad month for celebrities, so forgive the poor taste. We've lost David Carradine, Ed McMahon, Michael Jackson, Farrah Fawcett, and Billy Mays. My loss isn't as great as an entire human life, but the little things always irritate us the most, don't they? I therefore request a moment of silence for my fallen Radeon HD 4870.

Around August of last year—my birthday—I bought myself a very special present: a VisionTek Radeon HD 4870 512MB. It was replacing a pair of CrossFired Radeon HD 3850s from the same company. The 3850s were pretty solid, but I wanted to go back to having a single card. Multi-GPU setups can be finicky creatures, and I hated having to disable CrossFire whenever I wanted to use my third monitor.

First, a bit of history. I've almost always been an ATI guy. I've owned five All-in-Wonder cards in my lifetime and loved them all. (And, as a side note, I do miss that line dearly.) My All-in-Wonder 9800 Pro was an awesome card for its era. When I sold off my desktop and switched to a desktop-replacement laptop, I begrudgingly used a Mobility Radeon X600 (a major disappointment in the mobile and desktop sectors to be sure) before eventually jumping to a GeForce Go 7600 in my next laptop. That 7600 was an absolute demon at the time, and it made me a believer in the Nvidia team. I went on to put a GeForce 7600 GT in the new desktop PC I built for college, and that card performed incredibly for the time. I later made the switch to the even more amazing GeForce 7950 GT, before finally using EVGA's marvelous trade-up program to secure a GeForce 8800 GTS 640MB, largely as a result of reviews on TR (which I wouldn't find myself writing for until two years later).

And that's when the tears came. While Nvidia's coverage sampling antialiasing (CSAA) is fantastic, and its transparency AA is, in my opinion, superior to AMD's alternative, it's no secret that Nvidia's Vista drivers were dismal and unreliable for the first year or so. While running the 8800 GTS, I used a cheap GeForce 7100 GS to drive my third monitor. OpenGL games crashed on loading unless I disabled the 7100, and one stage of Tomb Raider: Legend simply refused to run on the 8800. Tomb Raider: Legend is a fantastic game, but I could deal with having to play that one section on the 7100. Quake 4's single-player campaign, on the other hand, is one of my favorites, for better or worse, and if that's not running, I've got problems.

So, I defected back to the AMD camp when the Radeon HD 3850 came out. Life was good, for the most part. Then the Radeon HD 4800 series dropped, and I, like many of you, got very excited—AMD was back and truly competitive again. I cheerfully picked up the Radeon HD 4870 and brought it home.

Once again, the tears came. While performance was absurdly better and smoother than my 3850s, the 4870's cooler was, at the default settings, woefully incapable of dealing with the heat generated in my case. Raising the fan speed helped, but ultimately, having to choose between noise and having my video card combust proved to be an unacceptable compromise. After experimenting with different cooling solutions, I finally settled on a Zalman VF1000 coupled with the red metal backplate from the stock AMD cooler. This was an "acceptable" compromise that appeared on a lot of different forums. While the VRMs still ran punishingly hot, the GPU remained cool enough to keep operating, provided there was adequate airflow from the front of the case.

Indeed, once I had this situation in order, the 4870 was problem-free... until last Friday, when it decided it'd had enough amid a game of Ghostbusters. After monitoring the voltage, amperage, and temperatures under stress, then swapping drivers, swapping entire video cards, and experimenting with clock speeds, the culprit finally revealed itself: the memory was dying, if not dead. The card now only operates in 2D mode; visits to 3D mode result in driver crashes if I'm lucky, but more often the system just hard locks.

I still gaze at AMD with something of a fanboy's eyes. I game at 1920x1200, so I must now decide whether I want to step up to a 4890, or whether I should grab a 4850 (or a 4830) as a stopgap until the DirectX 11 lineup arrives later this year. While Nvidia tempts me with promises of CUDA-accelerated Adobe Premiere Pro and the chance to fool around with PhysX, I still find myself gravitating toward AMD's GPUs. They generally offer better performance for the price, in my opinion, and AMD remains the underdog. The 4800 series also offers incredible performance at 8xAA, which negates my desire for CSAA. I even like to fool around with shader-based antialiasing options.

Still, the disappointment is palpable. While I eye the Radeon HD 4890 for the opportunity to tweak it, I can't help but feel kind of screwed here. It's unfortunate. I didn't want to splash out on another card, since I quite liked my 4870, but what am I gonna do now?

Bad call, Valve, bad call
— 10:28 AM on June 25, 2009

Today, I'm going to get away from ridiculously controversial subjects. Instead, I'm going to visit with a smaller controversy, one that already seems to be dying down a touch. I am, of course, talking about Left 4 Dead 2.

I won't disagree with anyone who says announcing Left 4 Dead 2 at E3 this year was a bad move on Valve's part, a slap to its best customers. That Valve would have the gall to announce a sequel, for release this year, to a game that just came out less than a year ago is completely nuts, especially since it still can't seem to bring itself to talk at all about Half-Life 2: Episode Three, a game whose window of opportunity is rapidly closing. Episode Two was a fantastic game, but the enthusiasm wasn't quite there the way it was for Episode One or the original Half-Life 2. Delays have a funny way of doing that to a game. Would anyone be excited if Duke Nukem Forever finally came out tomorrow, or would you just shrug and go, "Thank God that's over?"

A lot of people on Valve's forums were up in arms over the content announced for Left 4 Dead 2, suggesting it's content Valve promised them from the get-go with the original Left 4 Dead. For starters, I don't think Left 4 Dead was worth $50, much less $60, when it came out, and I have a hard time even with $30. The game didn't really have any more content than Team Fortress 2 did at release, and free patches in the Team Fortress 2 mold have since brought it nearly in line, with the Survival Pack adding just enough content for me to consider it "complete." But I don't think it would be far-fetched to suggest that, in some ways, Team Fortress 2 and its massive support and continued patching kind of ruined Left 4 Dead.

I'll come out and say it: I don't think what Valve has announced for Left 4 Dead 2 should come for free to those of us suckers who purchased their rough draft. The sheer amount of content and the breadth of changes being discussed for the sequel are just too vast to warrant anything but a new release. If we all swallowed Doom II when it came out, Left 4 Dead 2 should be a walk in the park. Far-reaching changes to level design, reworking of the AI director, new melee weapons, new special infected...we're, in many cases, talking about major changes to the core gameplay that will require moderate to substantial rewrites and additions to the existing game code. This doesn't even feel like an expansion pack. It feels like a true sequel.

Where Valve dropped the ball was in not simply sitting on L4D2, quietly doing its due diligence in planning and updating it behind the scenes. Left 4 Dead's player base is still growing, and the recent release of the SDK for it feels, quite honestly, kind of pointless. It's not a bone being thrown to the community; it's just a waste of a good opportunity. If Left 4 Dead had been struggling in the marketplace, the way Unreal Tournament 2003 did back in the day, a rushed sequel might have been warranted. Yet Left 4 Dead is more than healthy and doing better every day. If nothing else, there's still a great deal of money to be made off of it. There's a game here to grow, and waiting at least another year would've been by far the smarter play for Valve. In the interim, it could maybe, oh, I don't know, release Half-Life 2: Episode Three.

This announcement cuts Left 4 Dead's useful life drastically short. I'd be surprised to see anyone get much mileage out of that SDK, since most of us are now pretty much just playing to prepare for Left 4 Dead 2, which by all accounts is beginning to look like the game Left 4 Dead was supposed to be. I suspect Valve wants to make the case that it's releasing the sequel so quickly because that's the game it wants to expand and build on, but it could still have doled out one or two more small updates to the first game, or even just finished off with the SDK and waited to announce the new one. I doubt anyone would be complaining then, and I doubt there'd be the kind of backlash we've seen. As it stands, though, announcing Left 4 Dead 2 this soon is just bad business sense, bad consumer relations, and bad timing.

Enough with the booth babes, please
— 3:23 PM on June 17, 2009

Continuing in the spirit of my last blog post, in which I tackled issues probably just a little bit too heavy for this kind of column, I'm making a push to engender more intelligent consumerism on the part of gamers and, indeed, entertainment consumers in general. Specifically the male ones.

A couple years ago, another site I write for gave me the opportunity to cover the Game Developers Conference in San Francisco. Funnily enough, I normally live in the Bay Area but was staying in San Diego for school and had to be flown out. Being able to cover an event for a publication was actually pretty exciting for me. The time I spent in San Francisco radically changed my formerly poor opinion of the city, too, and I now visit regularly. (Folks in the Bay Area, by the way, owe it to themselves to check out the Hypnodrome, which is currently putting on the most unspeakably insane play I've ever seen.) While I was covering GDC, however, I had the privilege—or rather, the misfortune—of experiencing a convention stalwart in person. I witnessed the booth babe in her native habitat.

For the uninitiated (all four of them), booth babes are, generally speaking, beautiful, scantily-clad women whose sole purpose is to draw attention to the individual booths at most trade shows. They may engage you in conversation about whatever product is being shilled at their booth, but for all intents and purposes, they exist to look pretty and pull the mostly male convention-goers from the crowd.

Those of you who've always wanted to be at a convention and mingle with the booth babes will just have to take my word for it: these are the most depressing creatures I've ever seen—and I've been in my share of crappy pet stores. Meth-addled hookers cause me less psychological distress than these ladies. There's something authentically unsavory about this practice, and while I'm not going to sit here in my ivory tower and preach about how bad I feel for these girls—I hope they're making good money, at least—I will say I'm pretty disgusted with the institution that produces them.

Consider a commercial for Axe products, something rife with beautiful women who are so incredibly turned on by a man using an Axe product that, well... frankly, if you haven't raced out and bought it by now, you're an idiot doomed to a horrible, sexless existence. Now consider what this commercial is actually talking about. It suggests women are sex objects, nothing more, whose panties get in a tizzy whenever someone wearing a body spray is wandering around. It suggests you like watching beautiful, nubile women in this tizzy. And it suggests you, the male viewer in the target demographic, are stupid enough to buy this crap. How much of the commercial has anything to do with the quality of the product? And how much of it has to do with just flaunting female flesh in front of you under the assumption that you'll be dumb enough to both marginalize the more numerous half of the species and buy something because it'll make you look cool with girls? Noodle it out for yourself, because we're circling back to my initial point.

How much do you think the booth babe really has to do with the products on display? At least Axe is somehow tied to the scent of the male body, which women do key off of sometimes, just as males will key off of how a woman smells. The booth babe often has nothing whatsoever to do with the product at hand; her job is to look pretty and make whoever's around interested in whatever's available. The really hilarious thing is that if you're a regular male reader of The Tech Report, odds are good that just getting a look at the hardware or games floating around would've been enough. This stuff is like porn to me, at least. On Friday mornings I thumb through the Fry's ad like a sex addict through a Hustler Barely Legal magazine.

What's really depressing is the way the electronics and gaming industries go hand-in-hand in revealing themselves to be profoundly misogynistic through this kind of practice. Those of you who've been paying attention may have noted the hit E3 took when it banned booth babes from the show floor in 2006, though admittedly, some awful restructuring decisions would come on the heels of that ban and nearly fatally wound the expo in 2007. Attendance in 2006 was actually down 14% from the previous year, even though all three next-generation gaming systems were on the floor. And sadly, in a bid to resurrect E3 this year, booth babes were reinstated.

Of course, if you want to get your fix of booth babes, there's always Computex, among other trade shows.

I just honestly find the practice depressing and dehumanizing. I feel dumber for being part of the demographic—an owner of a functional penis—that these women are trotted out for. And you have to wonder whether the female attendees and journalists, the comparatively rare female gamers and tech enthusiasts, don't feel at least a little disenfranchised or objectified by crap like this. There's no reason these fields have to be male-dominated, except that they've been a boys' club for so long that most of the incumbents would sooner just shrug it off and forget about it.

That's an easy thing to do, too. I'm sure someone in the comments will cry foul, saying I'm taking this too seriously or being oversensitive or just trolling for brownie points with the ladies. Yet as a writer and producer of content, it's my job to produce something of intelligence and value, and booth babes actually do merit some discussion here since they're a symptom of a greater issue (which I will cheerfully explore in future blogs). I don't cry myself to sleep at night thinking of those poor girls in the tiny clothes being fawned over by the unwashed masses, but I definitely got uncomfortable around them at GDC 2007. (Side note: S3's babes, much like the booth, were the epitome of depressing.)

Still, I feel obligated to ask: is stuff like this really doing anyone any favors?

The nascent art form
— 9:44 AM on June 11, 2009

The subject I'm going to discuss in this post probably isn't new to you, but it's also going to be the basis for some other blog posts I want to write further down the line. I'd like to address a question that rears its ugly head every now and then about video games and hopefully put a concrete stamp on it.

Are video games art?

I expect The Tech Report's peanut gallery will overwhelmingly (and correctly) respond in the affirmative. If you don't agree, then you're in the right place, because this blog will hopefully help elucidate the question.

The first and most basic thing to recognize is that video games are a nascent art form. They haven't been around that long, especially compared to other, more entrenched forms of art. For example, classical visual art like painting and sketching has been around since proto-humans scraped charcoal on the insides of cave walls. Over time, the art form changed and mutated, going through phases and ages, and it will continue to do so. For what it's worth, I'm not a fan of where it presently sits with respect to the "art scene" proper—art produced in the past twenty years tends to be too meta for me, too concerned with answering the profound and profoundly dull question "what is art?" and too little concerned with just making something that inspires thought beyond that.

Likewise, when we introduce photography into the discussion, we bring with it the idea of art forms being defined by one another. I don't find it unusual to suggest that the introduction of photography in the 1830s was key to steering visual art away from more traditional representative work and into more abstract, experimental endeavors.

When you move forward in history to motion pictures—really just a rapid series of photographs played back at speed—it isn't surprising to find the medium defined chiefly by the art that preceded it: photography and theatre. And just in case you're curious, I feel compelled to point out that Americans were nowhere near the pioneers of film that artists in other countries were. German expressionism gave birth to many of the special effects and lighting tricks we enjoy today, and Sergei Eisenstein helped pioneer the concept of montage—two images played in sequence to represent a third idea.

Bringing up contributors from other countries is key to my central thesis—video games as art—because it helps shed light on why the debate even exists. Motion pictures as introduced in America were ghettoized from the get-go. Film was nowhere near the respected art form it is today; just a century ago, it was a baser means of entertainment for the masses and ill-regarded in artistic circles. Does that sound familiar at all?

The funny thing is, in other countries, film was being more aggressively pursued as an exciting new means of artistic expression. As a result, while Americans were futzing around with the Hays Code until the mid-1960s, Europeans and Russians were going absolutely nuts with the new medium and pushing it much further than we were inclined to. I'm not one of those "old films are the best films" blah blah jackasses; for my own enjoyment and education, I generally don't watch anything made before 1970.

But one of my favorite examples comes from a French film I watched in one of my classes, François Truffaut's Shoot the Piano Player from 1960. There's a scene in the bedroom where the main character has just had relations with a prostitute; they're lying in bed together, and her breasts are exposed. They're having a conversation, and the prostitute says something to the effect of "look at me, I'm an American" before covering up her breasts with the sheet. Keep in mind that in 1960, with the Hays Code still active, this scene would never have made it to American screens. So if anything, what you learn from this bit—in context—is that censorship stunts art, and the French will mock us for it.

I've digressed by ranting about the Hays Code (this is the tip of the iceberg; such is my loathing for censorship), but hopefully you're starting to see some parallels here. Video games are taken more seriously in another country (Japan), while here, in their youth, they're regarded as amusement for man-children and lacking in artistic merit.

And it's important to note that video games are still very young. We still have developers flirting with mature content. Violent content is almost a non-issue in America, where John Carpenter's seminal horror classic The Thing gets played virtually uncut on the Sci-Fi Channel while people have fits over Janet Jackson's nipple in the Super Bowl telecast (a case that's still an ongoing legal issue). Violence, though it's had its watersheds with Doom, Mortal Kombat, and the Columbine incident, is still something developers are fairly comfortable with. Complex issues like war can actually be handled in a fairly intelligent way. I was very impressed by Infinity Ward's Call of Duty 4: Modern Warfare in that respect—and subsequently disheartened by Treyarch's World at War entry, which squandered Modern Warfare's good will.

Turn around and take a look at how sexual content is presently handled in video games, though, and you'll find it's still very stunted. BioWare seems to have had the most success in introducing sexuality without making it completely moronic, but the love scenes in the otherwise outstanding Mass Effect are still awkward, to say the least, and the rumors about Dragon Age: Origins are disappointing. Likewise, look at how sexuality is handled in games like God of War, which has generated a bafflingly small amount of negative media attention, or in The Witcher, where sexual conquests are literally collected like trading cards. (The Witcher is a Polish-developed game, but the others are North American.)

At the same time, I also want you to note how video games are currently progressing. Given that newer art forms often define themselves against pre-existing ones, consider the heavy push for theatricality in video games—even going as far back as when SquareSoft, then still interested in making good games, marketed Parasite Eve for the PlayStation as "the Cinematic RPG." Consider how Mass Effect extends this theatricality by opening up options to control dialogue and relationships in-game to an extent. One of my favorite examples of the potential of video games as an art form, Call of Duty 4, follows similar lines by driving you to your own execution or forcing you to drag your mangled body out of a downed helicopter, exploiting the first-person perspective and interactivity in ways film never could.

Or you can just go on your merry way analyzing BioShock.

I hope I've made my point. Video games are indeed art—a nascent art form, as I enjoy saying. Those who would claim otherwise seem to have a very short memory.

When overclocking doesn't work out
— 11:41 AM on June 3, 2009

I'm writing this entry because I feel it's necessary to raise a point about overclocking processors and GPUs. It's hot off another late-night attempt to coax just a little more love from my Core 2 Quad Q6600, which ended in tears, misery, and a reinforcement of the status quo. And I'm not overclocking on an ECS board here, just so we're clear; I'm using an X38-based Gigabyte motherboard. Simply put, my G0-stepping Q6600 isn't happy going over 3GHz. I shoot for round numbers and at one point was able to get it up to 3.2GHz, but now we hang out in the magical land of 3GHz, using a multiplier of nine and a 1,333MHz front-side bus. This seems to be the best compromise, since I don't have to ramp up my CPU fan just to keep temperatures down and thus defeat the purpose of having a silent machine.
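
(If you're wondering where that round number comes from, the arithmetic is simple: the Core 2's front-side bus is quad-pumped, so a "1,333MHz" FSB is really a roughly 333MHz base clock, and the core clock is just that base clock times the multiplier. Here's a quick back-of-the-envelope sketch; the function name is mine, and the 1,422MHz figure on the last line is just one hypothetical FSB setting that would land near 3.2GHz with a 9x multiplier, not necessarily the one I used.)

```python
# Back-of-the-envelope Core 2 overclocking arithmetic. The advertised FSB rating
# is quad-pumped, so the real base clock is a quarter of it; the core clock is
# simply base clock times multiplier. Illustration only -- chips vary wildly.

def core_clock_mhz(fsb_rating_mhz, multiplier):
    """Effective CPU clock for a quad-pumped front-side bus."""
    base_clock = fsb_rating_mhz / 4.0   # e.g. 1,333MHz FSB -> ~333MHz base clock
    return base_clock * multiplier

if __name__ == "__main__":
    print("Stock Q6600:   %.0f MHz" % core_clock_mhz(1066, 9))  # ~2.4GHz
    print("My 3GHz spot:  %.0f MHz" % core_clock_mhz(1333, 9))  # ~3.0GHz
    print("3.2GHz or so:  %.0f MHz" % core_clock_mhz(1422, 9))  # hypothetical FSB
```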

My VisionTek Radeon HD 4870 512MB fares no better. It's rock stable at stock speeds, but a journey into overclocking land is always fraught with disappointment. The GPU can't handle any more than the stock core speed, and overclocking the memory on the RV770 chip is, in my experience at least, somewhat pointless without a good core clock to match. When I tested my 4870 at a core clock of 800MHz, it felt measurably smoother than at stock speed, which was great... if I only intended to game for maybe five or ten minutes. And this is under Unreal Tournament 3; imagine if I'd thrown Crysis at it.

I bring these things to your attention because if you Google overclocks on either of these chips, you'll see far better results than mine. People on forums will talk about getting their G0-stepping Q6600s all the way up to 3.6GHz and better, at a 1.25V core voltage or lower, and it's Prime95 stable, we swear! The fact is that overclocking is a massive crapshoot. It's the kind of thing we warn you about in the overclocking section of every CPU review, and other sites warn you about it, as well. Overclocking is incredibly unpredictable. When the Phenom II was being promoted, AMD was shouting about how the chip could probably hit 4GHz, but our samples only reached the neighborhood of 3.5GHz—still a good overclock, but not the kind of angels-singing-from-the-rafters overclock for which one might hope.

If it seems like I'm beating a dead horse, it's only in an attempt to hammer home what some consumers still don't fully understand. You read about these crazy overclocks, or even a legitimate review gets a pretty good one, so you go out and buy the processor. Lo and behold, you don't get anywhere near what you read about. That's the nature of the beast, so you'd better buy a processor that's going to be good enough for you at or close to its stock clock speed.

Likewise, overclocking isn't an exact science. We have an excellent guide on overclocking here, written by Geoff, that can at least get you started, but there's a lot of tweaking that you may have to do, and you also have to decide just how you define "stable." For me, for example, "stable" means rock stable. I do heavy high-definition video editing on my desktop, which has been very heavily designed and tuned specifically for that task. An overclock with "good enough" stability isn't going to be good enough for me when Adobe Media Encoder slams all four cores and redlines them for hours on end. Your standards may be a bit looser than mine, but an overclock could easily involve tweaking timings and all kinds of esoteric settings in the BIOS to reach the kind of crazy heights some of the chips on the web hit.
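
(To put "redlining every core for hours" in concrete terms, the rough idea is something like the sketch below: one busy worker pinned to each core for as long as you care to wait. This is my own crude illustration, not a real stability test, and the burn() helper is just something I cooked up for this post; Prime95 or your actual workload remains the real judge.)

```python
# A crude all-core burn-in sketch: spin one busy worker per core for a set
# number of hours. Surviving this proves nothing on its own, but failing it
# means the overclock is nowhere near "rock stable" by my definition.
import math
import multiprocessing
import time

def burn(duration_s):
    """Keep one core pegged with pointless floating-point math."""
    end = time.time() + duration_s
    x = 0.0
    while time.time() < end:
        x = math.sin(x) + 1.0001   # arbitrary churn to keep the core busy

if __name__ == "__main__":
    hours = 4   # pick your own pain threshold
    workers = [multiprocessing.Process(target=burn, args=(hours * 3600,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print("Survived the burn-in; now go run the real workload.")
```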

Overclocking video cards is even less exact. The "old standby" for video card stability testing these days seems to be FurMark, with ATITool running a close second, but I've had ATITool green-light overclocks after more than an hour of testing, only to have them promptly cause crashes once the card entered an actual game. The overclock testers built into software like ATI's Catalyst Control Center and Nvidia's System Tools, formerly known as nTune, have seemed even less reliable. Catalyst Control Center is all too happy to let me overclock to 800MHz on the core, declare it "stable," and then watch the card choke in actual gameplay. Nvidia's System Tools had the same issue on my laptop (an Asus X83Vm-X2), where it would sign off on clocks for my immensely overclockable GeForce 9600M GS that Left 4 Dead wouldn't actually bear out. For what it's worth, though, the 9600M GS does purr along quite happily at near-9700M GT speeds.

I'm not sure exactly how this is going to go over, how successful the warning will be. We're certainly past the era where the Barton-core AMD Athlon XP 2500+'s quick tweak to 3200+ speeds was considered a major overclock, especially if I'm disappointed in only scoring a 25% overclock off of my Q6600. The vast majority of chips on the market these days do have some pretty impressive overclocking headroom in their designs, but these overclocks aren't guaranteed. They can be facilitated, as AMD does with its Fusion Overdrive utility, but they can't be guaranteed by any stretch of the imagination. You may wind up being the unlucky sucker with the Phenom II X4 that only goes up to 3.2GHz or the Core i7 that only hits 3.4GHz at most. I do say "unlucky" with tongue planted firmly in cheek, given how crazy some overclocks can be these days, but we've also gotten to the point where the increased headroom has brought increased expectations with it.

All the forum posts in the world aren't going to change the fact that you're gambling when you bet on overclocking your next processor. The ATI Radeon HD 4890 may have been designed to hit 1GHz, and some partners are even releasing cards running at 1GHz from the factory, but that doesn't mean you can buy a cheaper one and just magically expect to hit 1GHz.

Buyer beware: you may get exactly what you paid for.

So what if I play with bots?
— 11:21 AM on February 5, 2009

Some of my favorite games since I started really getting into PC gaming include Unreal Tournament, Unreal Tournament 2004, Rainbow Six Vegas (1 & 2), Doom 3, Quake 4, Enemy Territory: Quake Wars, Call of Duty 4: Modern Warfare, Elder Scrolls IV: Oblivion, and Mass Effect. It's fair to say that I love the first-person stuff. Big time. And as I've gotten older and my attention span has waned (side note: Chrono Trigger is still awesome), my desire for hard-core shooting action has only grown. However—and this is a big however—I almost never game online.

"Dustin," I can hear you say, "most of these games are great because of their online component. Why don't you like playing online?" Well, simply put, I suck. Your follow-up, I'm sure: "Well, if you play online, you'll get better." That's true, but what if I don't care? What if I don't want to? What if I'm perfectly content hammering the bots in Quake Wars on Easy mode?

I'm writing this because I can't be the only person in the world who doesn't mind—and is in fact perfectly content—playing with bots. The online gaming scene has really exploded over the past few years, but I've elected to stay by the wayside. I have good reasons for it. While playing these games with friends and family is fun, and LAN parties are a blast, I simply don't like playing with the usual online folks. Playing online usually means some combination of the following: griefers, cheaters, jerks, horrible sound macros, being told to "learn to play noob" or an abbreviation thereof, and often just getting pulverized. From a technical perspective, I'm also vulnerable to Merciless Ping, God of Latency, and although my Radeon HD 4870 is more than happy to hold up its side of the bargain, there's no accounting for how random an Internet connection can be.

If I play with bots, however, I can control how difficult the game is. I don't have to worry about coordinating with other players, and I can lone-wolf it the way I like. I learned a long time ago I'm not much of a team player, and I can't be alone in not wanting to coordinate massive strikes. Quake Wars is a fantastic game, but the kind of teamwork required for playing online is something I just don't have in me. I'm no strategist. Bots may not be as random or clever as human players can be, but they get the job done, and some games (particularly the Unreal Tournament series, but also Quake Wars, surprisingly) have fantastic AI that creates a perfectly enjoyable solo experience.

The point toward which I'm carefully meandering is that, unfortunately, a lack of bots has made otherwise good games fall by the wayside for me. Prey could be salvaged by its bizarre deathmatch, but finding people to play that with me is like finding a needle in a haystack. Team Fortress 2 is a major offender, because it just plain doesn't have any kind of "practice" mode to speak of to acquaint you with the mechanics of the game. It's a fantastic game that has virtually no appeal for me whatsoever because of its strict, online-only design. I play the odd game with my girlfriend or my sister, but I get bored and tired after a little while because I don't have the opportunity to feel the game out or play it the way I'd like.

Bots could make games like these a bit more appealing for folks like me who don't actually want to go online but do want to enjoy a good deathmatch or multiplayer shooting environment. I'd spend absurd amounts of time in the deathmatch mode of even something like Doom 3 if there were some halfway decent bots running around with me (though in fairness, I love Doom 3's deathmatch for some unfathomable reason).

Finally, I am here to say that I can't be the only person who likes playing deathmatch and multiplayer-style games by myself. I don't have to hit the dance floor to listen to techno, and I don't have to hop online to see the appeal of a good shooter. So what about the rest of you? Would some of you rather see bots materialize in games more often, or am I hopelessly alone and cowering in a corner here?
