I made a dreadful mistake a week ago. With my Radeon HD 4870 in MSI's hands in City of Industry, my desktop has been cheerfully running on a Radeon HD 4670. The 4670 is a beast for what it is, and miraculously, a bunch of my games play perfectly fine at my display's native 1920x1200 resolution. Unfortunately, once you've tasted the incredible power of a higher end (albeit now quite affordable) graphics card, making the step back proves difficult.
So, I wound up going back to an older game. Specifically, I installed Diablo II.
I've never been a huge Blizzard fan, but I respect what the company does. Just to stir the pot a little, I don't think its games are actually that great. Certainly, by a commercial metric and by the rabid fan base, Blizzard games could be considered absolutely amazing. For me, though, they've always been a kind of gaming junk food. Much as I never feel very good about myself after eating a bag of Milano cookies, Diablo II leaves me feeling curiously empty after a few hours. I got the same feeling with World of Warcraft. To my credit, however, I was able to kick that habit pretty easily. Every class but the Rogue felt like a series of timers in the early going, and the game eventually no longer seemed worth the subscription fee. I ended up buying the Nicoderm CQ of MMORPGs, Guild Wars.
With that said, I feel like Diablo II and World of Warcraft manage to be devastatingly addictive without feeling very rewarding—for me, at least—simply because Blizzard designed them only to stimulate the ever-loving crap out of the human brain's reward center. When I play Diablo II, I am rewarded for my tireless, ceaseless clicking. I play a fairly popular mod called Eastern Sun (a friend admonishes me constantly for this, saying he thinks vanilla Diablo II is better), and with the tweaked drop rate and vastly increased number of uniques, set items, and so on, I get to see all kinds of new and exciting gear. Goals are set for which items to grind or gamble for, what level to hit, and so on.
By the time all is said and done, though, I may have bumped up a couple of levels and amassed a veritable wealth of gear, but nothing has actually happened. Even in the game, progress doesn't seem that exciting. The story itself never struck me as being terribly compelling, crippled by how orderly the game plays out. I never really feel like I'm getting anywhere, knowing I'll just be playing the same five acts on a harder and then harder-still difficulty level.
There's just something about the way Blizzard designs games. Diablo II has always had aggravating balance issues, and I gave up playing the vanilla version in part because only certain, specific builds could really make it through the game. It's the same reason I don't play Magic: the Gathering competitively, either; I don't like when one choice is so simply, purely better. The class I liked playing, the Amazon (javelin specialized), got nerfed a while ago, which caused the character build to just lose steam at a certain point. My understanding is that World of Warcraft isn't much better in this respect, and to me, that defeats the purpose of a class system and character building. In Mass Effect or Fallout 3, I largely adapted my builds to my playing style. The games gave me different ways of interacting with them, and all avenues were at least viable. Blizzard games seem to punish more casual players and balance things for the obsessive ones.
Speaking of balancing, the ever-changing nature of Diablo II (and even more so World of Warcraft) also bothers me. Game patches normally exist in large part to fix problems, but Blizzard patches often radically revamp the games themselves. Classes get heavily tweaked or nerfed, drops change, and so on. It certainly seems to keep the game fresh, but the policy never agreed with me.
Yet I can't stop playing. As I said before, there's something about Diablo II that just stimulates my brain's reward center. In an episode of the Garfield cartoon, there was a game show called "Hit the Buzzer, Win a Cookie," and that seems to be pretty much what's going on here. Kill some cows, gain a level, get some loot. Sell the loot, lather, rinse, repeat. Look at all the cool new gear I can go kill cows with! And the only progress I've made is that I've become more efficient at the grind. It's questionably satisfying, but I can't seem to stop.
With the other games I mentioned, I at least get the feeling I'm progressing through a story, like I have a set goal and will eventually get to some type of ending. Diablo II was never about that. From day one, it was about rapid character-building with friends, racing through the game, and settling on a specific build. There were just certain ways you did things, and we'd all get into it and talk about them, but that sense of "what exactly am I achieving?" eventually set in.
I'm genuinely curious if anyone else has gotten this same vibe. There's no question that Blizzard's games seem to demand a heck of a lot of play, especially with entire television stations devoted to Starcraft in South Korea, and the embarrassing amount of cash World of Warcraft rakes in. Am I alone here, or are these games (or at least Diablo II) just addictive despite not actually being that much genuine fun?

A tale of avarice, betrayal, and upgrading
It's story time with Uncle Dusty, kids, so scoot in and I'll spin you a tale of terror sure to make spiders crawl up your spine. It's a tale of avarice and betrayal. How one man's hunger for power ultimately led to his undoing. It's a tale... of upgrading.
As I mentioned in my last post, I found myself with the opportunity to make the jump from a Core 2 Quad Q6600 (overclocked to 3GHz) to an AMD Phenom II X4 955 Black Edition, yet this upgrade proved unwise. Some have questioned or criticized my desire for motherboard-based RAID, but I assure you I can tell the difference on this machine. I've seen video encoding and rendering processes get throttled by even the RAID 0's bandwidth. Yet a dedicated RAID card costs at least $300, and I have yet to see one that's universally recommended. Intel's ICH south bridges are a known quantity for me, and many benchmarks where they compete against affordable RAID cards are fairly tight races, as far as I've seen.
Being in need of decent motherboard-based RAID, I wound up crawling back to my Gigabyte GA-X38-DS4 motherboard and Core 2 Quad Q6600 processor, figuring I could just return the Phenom II, sell the motherboard I was given, and upgrade to a Q9550 or Q9650. I know some of you must now be asking, "Why would you bother with another Core 2? Why not just upgrade to a Core i7?" Your question is a valid one, and the answer is two-fold. First, upgrading to Core i7 would be prohibitively expensive. I already have 8GB of Corsair XMS2 DDR2-800 that plugs along happily for me, and I'm certainly not going to go with less RAM to upgrade. The expense of upgrading to a Core i7 pushes me over $600, at least, which is just too rich for my blood. Second, I have a completely irrational and ludicrous bias against Core i7. I don't like the branding, let alone how muddled it's become with Lynnfield, and overclocking seems daunting. The platform is also nowhere near as mature as the Core 2's, where everything is also a known quantity. And finally, since I know you're going to ask, "Why not wait for Lynnfield," my answers for i7 are for the most part applicable there, too.
So naturally, before even checking to make sure my old setup worked properly—why wouldn't it?—I ordered myself a Core 2 Quad Q9650. Again, this seems like another silly decision in a string of them. Why didn't I just buy a Q9550? Was the extra .5 on the multiplier really worth another hundred dollars? Well, first, I was able to get a healthy discount on the Q9650. Second, I wanted to start out from where I left off with the overclocked Q6600. And third, the extra .5 on the multiplier was worth the difference to me. With a 1600MHz front-side bus, that takes me all the way to 3.6GHz versus the 3.4GHz I'd get from the Q9550, and the higher multiplier means I don't have to push the front-side bus as hard for a given clock speed, which asks less of my older X38-based board and DDR2-800 memory. And finally, it's a psychological thing. I would have felt (and do feel) happier with the Q9650 than the Q9550. Part of the purchase, at least for me, is the experience of having, and I do attach strange sentimentality and feelings towards hardware (hence my aversion to the Core i7). Insane? Sure!
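The clock math is simple enough to sketch. This assumes the chips' stock multipliers (9.0 for the Q9650, 8.5 for the Q9550) and a quad-pumped 1600MT/s front-side bus:

```python
def core_clock_mhz(fsb_rated_mhz, multiplier):
    # Core 2 front-side buses are quad-pumped: a "1600MHz" bus
    # actually runs a 400MHz base clock.
    return (fsb_rated_mhz / 4) * multiplier

print(core_clock_mhz(1600, 9.0))  # Q9650: 3600.0
print(core_clock_mhz(1600, 8.5))  # Q9550: 3400.0
```

That 200MHz gap is the whole argument for the extra hundred dollars.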
With my old X38 board back in my case and the Q6600 humming along, I noticed an interesting quirk on startup: the system had reset itself to stock speeds. The computer was perfectly stable otherwise, but my overclock just seemed to have reset itself, and the system was taking an extra ten seconds to POST. So I went back and, naturally, set my overclock again at the settings that I knew were rock stable. Computer exits BIOS, saves, shuts down. Restarts, the lights come on... and no one's home. No POST screen, nothing. Ten seconds later, it shuts down again, then starts up again, POSTs: no overclock. I went through this process a couple of times with no success in changing settings. If you're like me, you're thinking something's wrong with the motherboard, like maybe some BIOS corruption or something. So I cleared CMOS, flashed the BIOS, changed the battery, sacrificed a goat... and the problem persisted. Worry set in. Did I damage my board? Was I careless? I tested it with my friend's Core 2 Duo E6600 and got the same issue.
At this point, I faced a conundrum. I couldn't "go back;" I was officially on the road to upgrading. Should I buy a new Intel-based board or purchase a RocketRAID card for the AMD platform? The Intel option would cost me roughly $40 more, but performance would be higher and I'd still have ICH-based RAID. The AMD processor and board would be easier to overclock, but the RocketRAID card would need to use a PCI Express x16 slot. Since the board is 790GX-based, that means my Radeon HD 4870's bandwidth would be cut in half. All signs point to PCI Express 2.0 x8 still being fast enough for the Radeon, but there's that voice in the back of my head going, "But what if they're wrong? What if those two frames per second you lose are off the minimum and not the average?" I ended up going the Intel route.
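For what it's worth, the bandwidth halving is easy to quantify. A minimal sketch, using PCIe 2.0's standard 500MB/s per lane per direction (the spec's figures, not my benchmarks):

```python
# PCIe 2.0 signals at 5GT/s per lane with 8b/10b encoding, which
# works out to 500MB/s per lane in each direction. Halving the lane
# count from x16 to x8 therefore halves the link bandwidth.
PCIE2_MB_PER_LANE = 500

def link_gb_per_s(lanes):
    return lanes * PCIE2_MB_PER_LANE / 1000  # GB/s, each direction

print(link_gb_per_s(16))  # 8.0 -- a full x16 slot
print(link_gb_per_s(8))   # 4.0 -- what the 790GX board would leave the 4870
```

Whether 4GB/s is "still fast enough" for a Radeon HD 4870 is exactly the question that voice in the back of my head kept asking.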
The only brick-and-mortar store selling motherboards in my area is Fry's Electronics, whose selection of Core 2-ready boards was remarkably anemic—many weak MicroATX boards and few offerings with at least a P45 chipset and a RAID-capable south bridge. In the end, I was left with exactly one option: the Gigabyte GA-EP45-UD3P. I came home, checked reviews, found I had made a good decision, and mounted it in the case while I waited for the Q9650 to arrive. I proceeded to live off of my laptop for the next few days, and I have to tell you... the GeForce 9600M GS just doesn't cut it for gaming if you've been using a Radeon HD 4870 for a year. Still, at least keeping the laptop attached to my 27" Dell monitor (another odd purchase choice, but it was cheap on campus) made the computing experience plenty enjoyable.
Only after starting to mount the GA-EP45-UD3P did I notice a major problem: Fry's had actually sold me an open-box product. Oh, it was presented as new and not marked as open-box, but it was missing the backplate and parts that should've been packaged in. I was so exasperated that I mounted the board anyhow and just used my old backplate, which matched nearly perfectly. Neuroses eventually set in, though, and I proceeded to order the same board new from Amazon. The new mobo arrived the same day as the Q9650. I removed the used EP45-UD3P, replaced it with the Amazon-bought model, and set up the computer again. Once I had a stable Windows 7 RC installation going, I proceeded to try overclocking the Q9650. I'm sure you can imagine what happened next. I'll give you a hint: the exact same problem I had with my old board. "Is this a Gigabyte issue? Did I get two bad boards? What's going on?"
At my wit's end, I drove an hour to the next closest Fry's to see if it had another option—maybe an Asus board in stock—but no dice. They did have an open-box EP45-UD3P, and by then I was just thinking, "You know, I've got two bad boards already. At worst, what's this one gonna do? Not work?"
Naturally, the new board had the exact same problem as the other two. Something was definitely up. Could I have damaged my power supply's auxiliary 12V connector with my constant fiddling? I'd taken to using needle-nose pliers to get in there and squeeze the clip. Had I damaged the power supply, and was the BIOS reading this as a reason not to let me raise or lower my front-side bus by so much as a megahertz? With no other options and in for a penny, in for a pound, I tried hooking up my old X38-based board and my friend's E6600 to the power supply in my dad's desktop. I plugged a spare Radeon HD 3650 into the graphics slot, and powered it on. Sure enough, it let me change the FSB. The board worked fine! Apparently, my PSU was the cause of all my woes.
The next day, I swapped my X38-based board back into my case and put in the Q9650, convinced a new PSU would do the trick. However, I wanted to double-check with another unit to be absolutely certain I had to go out and purchase a replacement. As it happened, a new case I had ordered (to build a secondary machine out of spare parts and sell at a later date) arrived. I tried connecting that enclosure's bundled PSU to my motherboard, and the problem suddenly reappeared—I still couldn't change the front-side bus. Even when I disconnected everything but the video card, keyboard, and mouse, the problem persisted. At that point, I theorized that it was because the PCIe connectors to the video card might not be working right (insane, I know), or that the video card itself may somehow have been causing the problem. I asked a friend to come over with his Radeon HD 4870, swapped that card in...
...and I could overclock again.
This is one of the weirdest quirks I've ever encountered, but there you have it. My otherwise perfectly functional and stable Radeon HD 4870 was somehow screwing up the POST process on my motherboard. With a heavy heart, I sent my 4870 back through the warranty service, knowing I would probably get it back after technicians failed to see anything wrong with it. Now, with the Radeon HD 4670 from my media center in my old X38 mobo's PCIe x16 slot, and a mountain of other hardware to return, my computer finally works fine again, and the Q9650 hits 3.6GHz stable at stock voltage.

Just fix the AHCI already, AMD
I've been partial to AMD for a long time. The first PC I built for myself had an Athlon Thunderbird running at a blistering 1.4GHz, and I later upgraded to the Barton-core Athlon XP 2500+. I was one of the lucky ones to get an unlocked model, too, and I took that baby straight up to XP 3200+ speeds. Remember when 200MHz was a major overclock? Crazy kids, spoiled with your Phenom IIs and Core 2s.
When I decided to start using a desktop-replacement notebook instead of a full-on desktop machine, even my shiny new laptop ran a Mobile Athlon 64 3700+, and that thing felt like a demon compared to the XP 2500+. Ultimately, I'd had good experiences with AMD.
After I transferred to the University of California, San Diego, though, I felt like I needed to build a proper desktop for video editing. I also wanted to game on something meatier than a Mobility Radeon X600. Unfortunately for AMD, Intel's Core 2 Duo was beating the Athlon 64 X2 soundly. I built an Intel machine based around everyone's favorite sweet-spot processor at the time, the Core 2 Duo E6600. Since then, the same PC has seen multiple upgrades of all types: video cards, memory, and hard disks. Eventually, even the CPU was upgraded to a Core 2 Quad Q6600—another sweet-spot contender. My Q6600 served me well and overclocked to a smooth 3GHz.
When my birthday came around this year, however, I was given an opportunity to upgrade to an AMD Phenom II. I liked what I'd read about these processors and what they offered, and the chance to have a top-of-the-line CPU for a reasonable price (the Phenom II X4 955 Black Edition with Lettuce, Tomato, and a Side of Mashed Potatoes) was very attractive to me. The Phenom II was ordered, and a friend of mine gave me a spare 790GX-based motherboard as a birthday gift. I was set.
Before getting into what happened, I should mention that as part of having a video editing rig, my scratch disk is a RAID 0 hanging off the south bridge. Intel's south bridges have, in my experience, offered excellent RAID quality. When you're editing high-definition video, you really do need as much storage performance as you can conceivably get so that your hard drives don't bottleneck the processor. A single drive often won't cut it, but a RAID 0 can help shorten render times tremendously.
The reason I bring up RAID is a simple one, and if you've been paying attention to AMD for the past couple years, you may already know where I'm going.
The RAID support on my motherboard's SB750 south bridge was dreadful. Enabling RAID to begin with disabled the expansion card I use to add SATA and eSATA ports, and I could only enable it again once Windows was installed. Also, Windows 7 required a driver installation to detect my optical drive. When I finally got past those hurdles, HDTune was averaging about 100MB/s for reads on the RAID 0, with a lot of nasty peaks and valleys.
I've seen people get much higher speeds than that with SB750-based RAIDs, so maybe my board was just having trouble handling both a RAID 0 and a RAID 1 at the same time. All I know is that installing Windows and getting everything up and running on Intel's ICH-based RAID is a picnic by comparison. Intel's RAID BIOS is extremely easy to use, and everything registers perfectly fine through the entire Windows setup (and in Windows itself) without the need for separate driver installations. In Windows, the RAID is cheerfully stable, and performance is solid across the board. My ICH10R RAID 0 averages about 160MB/s in HDTune with pretty consistent peaks and valleys.
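To put those two averages in perspective, a two-drive RAID 0 should ideally approach double a single drive's sequential throughput. A rough sketch, where the 90MB/s single-drive figure is my assumption for illustration (the two array averages are the HDTune numbers above):

```python
# How close a measured RAID 0 average gets to the ideal of
# (number of drives) x (single-drive throughput).
def scaling_efficiency(array_mbps, single_mbps, drives=2):
    return array_mbps / (single_mbps * drives)

SINGLE = 90   # assumed single-drive sequential read, MB/s
SB750  = 100  # measured SB750 RAID 0 average, MB/s
ICH10R = 160  # measured ICH10R RAID 0 average, MB/s

print(f"SB750:  {scaling_efficiency(SB750, SINGLE):.0%} of ideal")
print(f"ICH10R: {scaling_efficiency(ICH10R, SINGLE):.0%} of ideal")
```

Under those assumptions, the SB750 array barely outruns one bare drive, while the ICH10R gets a respectable chunk of the theoretical doubling.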
In addition to the SB750 RAID benching grossly below what it did on the Intel controller and oftentimes even below a single drive, AMD's RAIDXpert software was also downright bizarre. I'm stunned that the RAID manager for integrated south-bridge RAID would run in a browser window and even require a login and password. Not only that, but the software lists the login and password under the boxes as "admin/admin." Seriously? For what it's worth, RAIDXpert was sort of easy to use, but the browser-based interface and "login" screen reeked of a kludge slapped together by people who didn't care. Re-enabling Native Command Queuing on the drives in the RAID didn't solve any of my problems, either; the performance remained identical. Outright deleting the AMD drivers and running off native Windows drivers also did not correct the problem, and it left the RAID running painfully slowly.
The bottom line for me is that I need good, functional RAID—maybe not a full-on card, but at least solid motherboard RAID—and I just wasn't going to get it from AMD. It could very well have just been my board specifically, but you can't deny how poorly the SB750 comes across when it has trouble with one of its most basic applications: moving data on and off of hard disks. A visit here will show the SB750 underperforming due to needing to run in native IDE mode, but the real crime can be witnessed in this review written in late 2007, nearly two years ago. Basic AHCI support still hasn't been fixed.
And I'm sorry, but I'm just not interested in wasting time having to perform some strange driver voodoo to get RAID even working at all. I like the 790GX and was keen to use it to drive a spare monitor, but that just wasn't meant to be.
That's pretty much how AMD lost my business as a user. I instead opted to spend my upgrade funds on an Intel Core 2 Quad Q9650. My motherboard has an Intel RAID controller that works exactly as I'd hoped.
One of the reasons AMD purchased ATI was to have a complete platform, a processor and chipset that could be sold together. I don't like watching AMD play second-fiddle to Intel all the time, but letting even basic hardware AHCI languish like this for nearly two years is inexcusable, let alone the dismal RAID performance.
Sorry, AMD. I think the Overdrive software is a nice touch and your IGP is pretty stellar, all things considered. But the platform you offer me is still missing basic functionality other vendors deliver. Maybe when you can get your south bridges sorted out, maybe in the next generation, we can talk, but for now I'll be just fine with my Core 2 Quad and ICH10R. What good is offering RAID in all of your south bridges when it barely works?

Crysis averted
(Warning: spoilers below!)
I'm not sure if my preference has been stated in this blog before, but I'm decidedly not a fan of Crysis. My taste in gaming is probably pretty questionable; I do love Call of Duty 4: Modern Warfare and Portal just like anyone else, but there have been sacred cows I just didn't care for. I thought Half-Life 2 was grossly overrated, and I still think the series didn't really get to the point where I could say "this is a great game" until Episode Two. I've also had a general distaste for Crysis. I did beat it, so I can at least say I've played through the whole game and know why I dislike it.
My problems with Crysis are many: it's punishing even on today's hardware, and it's experienced such a minimal performance jump going from the previous hardware generation to this one that I've had to question the quality of its coding. I also felt like the weapons weren't balanced well, with the alien weapon being straight up weak. The story and the gung-ho, battle-hungry Americans drove me insane. I've never cared for how Cevat Yerli represented himself or his company when he's spoken in public, and the "A Cevat Yerli Game" credit at the beginning of Crysis seems gauche, especially now that the "A Film By" credit is going out of fashion in movies. The vehicles handle badly, especially the VTOL at the end, which I dub the "hippocopter" in deference to Yahtzee's Zero Punctuation review. And finally, Crysis has the same problem Far Cry did: when the non-human enemies show up, the game suddenly begins to suck, and the gameplay loses the lion's share of its depth.
Yeah, I'm not a fan of Crysis.
I'm also a writer and reviewer, though, and since Crysis Warhead now stands in for the original in most hardware reviews, the upgrade had to be made. I'd put it off for too long. I had to bite the bullet and buy the game.
Sojourns into Crysis-land have, if nothing else, been pretty ones. I figured the worst that could happen with Warhead was that I'd see some cool stuff before eventually getting sick of the poor design decisions, poor performance, and so on. Surprisingly, none of these things happened. Warhead is a better-looking game for sure, but I also found that the overwhelming majority of my problems with the original had been ameliorated.
One of the major improvements was largely getting rid of the gung-ho American crap. Psycho is a vastly more interesting and amusing character to spend a game with than the original's Nomad, due in no small part to the use of third-person cut scenes. The fact that he comes off with strong hints of Jason Statham makes him much more likable as an action hero. Any red-blooded male who doesn't want to spend a game pretending to be Jason Statham or some equivalent British badass is going to be missing some programming.
The more action-oriented gameplay tailors itself nicely around Psycho's character, too. Where Crysis was slower-paced, Warhead's willingness to provide you with plenty of ammo and powerful guns and then make the nanosuited KPA soldiers more common was the right call. Spending less of the game sneaking up on and gunning down helpless KPA jobbers and more time fighting enemies that can actually present a real challenge was a big plus.
The vastly improved alien A.I. didn't hurt, either. The tremendous shift in gameplay that brings the original Crysis into sub-Doom levels of complexity just doesn't really happen in Crysis Warhead, where Crytek opts to instead make the alien enemies much smarter. The new ones are more evasive, work in teams, and there are even new aliens that improve the defensiveness of the existing ones. The dynamic has changed radically, and combat regains some of the decision-making that was completely lost when the aliens were introduced in the first game.
Ultimately, that's what made the title work for me: Crytek finally figured out how not to completely screw up a game with a gameplay shift. The freeze comes early in Crysis Warhead, and the game smartly intersperses human and alien encounters, sometimes combining the two. It completely changes the dynamic of the game, and the more action-oriented pacing goes a long way toward making alien encounters feel less like a dramatic style shift. Compare this to where the trigens are introduced in Far Cry and all of your tactical gameplay is suddenly for naught, or as I mentioned before, when the aliens show up in Crysis and the game degenerates into Painkiller without the cool gun that shoots shurikens and lightning.
I recognize I may be in the minority here, and some people like the original far more than Warhead for its more measured gameplay. However, I'd just like to point out that Warhead doesn't feature an absurdly overlong and nightmarishly irritating-to-navigate floating journey through an alien spaceship. The worst parts of Warhead are the vehicle sections (go figure), but they're sparse, and none of them are as frustrating and badly handled as the hippocopter VTOL in Crysis. Warhead is able to maintain a consistent tone and minimize the parts of the game (hovercraft section, I'm looking at you) that bleed the fun out of it and turn it into an exercise in tedium.
As a last point: I'd just like to say that, while the C.U.D.A.A.T.s software noticeably improved the look of Crysis for me without impacting performance much, I found Warhead ran perfectly fine and looked much better on its own. Warhead makes a strong case for a 1GB video card, too, but let's be realistic—if you're going to be gaming seriously at this point, are you really going to buy a 512MB one?
While I'll never get over Cevat Yerli's griping about piracy throttling Crysis sales (gee, Cevat, it couldn't possibly be because no one wanted to invest in a game they weren't even sure they could run well), especially after they posted million-plus sales, at least his name isn't plastered all over Warhead. Now that Crytek has more or less perfected its gameplay, or at least improved it to the point where it doesn't take a swan dive at a certain point of the game, I can safely say I'll actually look forward to the next game the studio has in store.

Internet service provider, indeed
Going into this, I should probably state that I'm well aware of the deplorable state of broadband Internet access and management in this country. If Time Warner's naked attempt to extort its customers by testing tiered Internet access with absurdly low download caps is evidence of anything, it's how much we're at the mercy of the companies involved. I feel as though in many instances, we're dealing with a "race to the bottom," if you will: how little can a company offer, and how much can it charge for it? If I learned anything in college, it's that you don't have to be the best, you just have to be better than the alternatives. This all seems very grave for something much smaller, but I habitually look at the disease instead of the symptoms, even if that might blow things a little out of proportion.
In my area, the Internet access options consist of AT&T's DSL service and Comcast's cable modem service. Most people out here go with Comcast. I can't say I blame them. Comcast is an evil empire if ever there were one, with its history of futzing with BitTorrent packets among other things, but remember what I said about not having to be the best, just better than the alternatives?
My Internet connection with AT&T had been stunningly mediocre for the entire time I'd had its DSL service. Despite being on the 6Mbps "Elite" plan, download speeds only ever topped out at about 600KB/s. Fine, whatever. That was until around the 10th of this month, when my connection took a nasty nosedive, averaging about a sixth of what I'd been getting before. Ping was astronomically high (thus ensuring I wouldn't be enjoying my recently replaced copy of Call of Duty 4 online or playing Left 4 Dead with my friends), download speeds were dismal, and my connection would periodically drop. With my pitiful connection, I was able to speak to AT&T through its online chat and get on my way to having the problem corrected. First, the representative and I ran the company's bandwidth meter test which, surprise surprise, measured my download speed at about 2Mbps. Then eventually a line test was performed, and a problem was indeed noted. The rep told me he would escalate my issue to the line department, and I would receive a phone call within the next 24 hours.
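For the record, the arithmetic behind those figures is straightforward. A quick sketch, assuming decimal units (1Mbps = 1,000,000 bits/s) and a rough 20% allowance for PPPoE/TCP overhead (the overhead factor is my assumption):

```python
# Convert an ISP's advertised line rate in megabits/s to a realistic
# download speed in KB/s, after a rough protocol-overhead allowance.
def usable_kb_per_s(mbps, overhead_factor=0.80):
    raw_kb_per_s = mbps * 1_000_000 / 8 / 1_000  # bits -> bytes -> KB
    return raw_kb_per_s * overhead_factor

print(usable_kb_per_s(6))      # ~600 KB/s: the "Elite" plan on a good day
print(usable_kb_per_s(6) / 6)  # ~100 KB/s: roughly a sixth of that
```

In other words, 600KB/s was already about the best a 6Mbps line could deliver, which makes the post-nosedive speeds look even worse.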
I did not.
I eventually had to call back and jump through hoops to speak to AT&T's line department on the evening of the 13th. The specialist I spoke to said that there was nothing wrong with my equipment, but that there was indeed a problem with the line. Lots of packets were being lost, corrupted, and dropped. So he dialed back my bandwidth in an attempt to mitigate the problem somewhat and produce a more stable connection. He also put in to have a technician come out to my apartment to locate and fix the issue. On the 17th. Not a whole lot of options, so I went ahead and made the appointment.
So the technician arrived that day, tested the line and found nothing wrong with it, and asked me if I had a laptop to test the DSL modem. (Why, I don't know, when there was a tower on which the modem was sitting.) Ignoring the fact that I'm used to seeing technicians with their own laptops, I loaned him mine and marveled as he struggled to find the shortcut for the command window in Windows Vista.
Maybe I'm being unfair; I'm not really sure, but if I were a technician regularly handling network issues, I'd probably know the keyboard shortcut to open a command window: Windows+R, then "cmd."
Eventually, after running tests, he informed me that it was my router and not the modem or anything else, and he plugged the modem directly into my laptop. The connection was indeed faster. I sent him on his way, figuring "fine, I'll just go buy another router, this one's pretty old anyhow." The router (Linksys WRT54G v5) had been having trouble even registering the PPPoE connection—surely that must have been it.
Unfortunately, upon testing my connection using the direct link to the modem, I found it was faster than it was before, but by a hair, and, woefully, still not terribly stable. Download speeds were topping out at 200KB/s, still a far cry from what I was getting before this mess began and nowhere near what I was paying for. So I figured it could be the modem, and I went and purchased a DSL modem and router combo. (The technician informed me that if the modem was bunk, I would have to pay for a new one out of pocket anyway.) Naturally, the unit couldn't make the DSL connection, even with the same settings as the original modem: it returned "no PPPoE" errors. Finally, I went back to the original modem—this was on the 18th, mind you—and chatted up AT&T's tech support online again. Their bandwidth test again showed me having far more bandwidth than I was actually getting, and independent sites were testing bandwidth at 50% of that speed. Arguing with the tech and explaining what had been going on thus far got me escalated back to the line department.
The following morning, I received an automated phone call saying the technicians had resolved the problem and asking me to press "1" if everything was okay and "2" if it wasn't. This was at eight in the morning; an automated call that forces me to immediately check my Internet connection is preposterous. And at this point, I'd had it. The connection wasn't working at all, constantly redirecting me to a login page where the password had been reset. I had gone more than a week without stable Internet access, to the point of having none at all, and the one technician who came out didn't seem to have much of a clue what he was doing.
I wound up switching to Comcast the next day, a Monday. Not before calling Best Buy, though, which carries self-installation kits and which I remembered could set up a new connection in-store. I told the customer service rep I'd like to start a new account with Comcast and asked if they had the equipment in stock. "Yes, we have wireless and regular." She thought I was talking about routers.
The switch to Comcast should have been less painful than it was. I was able to go to a local store and pick up the equipment, add it to my account, and so on, and I didn't even need to buy the self-installation kit. "Just hook it up, then call our customer service line and have them activate the modem." Okay, fair enough. I did as instructed, and three hours after picking up the equipment, getting it home, and setting it up, the technician on the other end of the line informed me that high-speed Internet service was not on my account. I explained that it had been added just hours earlier, and he said it probably hadn't gone through their system yet. This was after I had already been on hold for half an hour. He put me on hold again to look things up.
Popping open my web browser in the interim, since Windows did indicate it was getting an Internet connection from the cable modem, I was given two options: one for home users setting up their new connection and one for technicians. The one for home users linked to a program you could theoretically download to install your connection, and I say "theoretically" because it refused to download. At this point, I figured, "What the hell," and went with the technician link. Lo and behold, it never asked me for credentials or anything, just the account number, and proceeded to activate the modem just fine. I hung up the phone, still on hold.

A plea for Lara Croft
No matter how many times I tell them, none of my friends believe me when I say the more recent Tomb Raider games are actually good. And so I write this blog post in a bid to educate, elucidate, and evangelize the good word of Lara Croft by way of Crystal Dynamics.
Eidos's prized heifer had gone from video gaming icon to more or less the laughingstock of the industry, as poorer and poorer games starring Lara Croft were released one after another, culminating in the dismally received Tomb Raider: Angel of Darkness. I'd like to point out that during this time, Lara's look went essentially unchanged. Polygons were added here and there, but she remained the same inhuman-looking breast-and-lip creature. I wasn't exactly a fan of the Core-developed Tomb Raider games and by and large dismissed the lot of them; time spent trying to figure out how to play Tomb Raider II only resulted in discovering that the camera would zoom in on Lara's swimming posterior if I hit one of the shoulder buttons on my PlayStation. For reference, this was a period in which I was willing to send hours to an unfortunate grave playing Square's abysmal Final Fantasy fighting game Ehrgeiz, so it's not like I was the pickiest gamer in the world.
Between the declining quality of the Tomb Raider games and the magazine cover with Duke Nukem's hands covering Lara's bare breasts, Croft's public image wasn't doing so well. Eidos, seeing its cash cow being led to slaughter by Core, suddenly had an outbreak of common sense and handed the reins to Legacy of Kain developer Crystal Dynamics, leading to the franchise reboot and all-around enjoyable title Tomb Raider: Legend.
I will come out and say that I freaking love the Crystal Dynamics Tomb Raider games. I'm working on Tomb Raider: Underworld right now, but I finished Tomb Raider: Legend a while ago and had a lot of fun with it, which is more than I expected. The demo immersed me in a world that was lush with the beauty of modern graphics, quality voice acting, and a shiny new Lara Croft that actually looked human. She spoke intelligently, she emoted, and she wasn't all T&A anymore. The game's story was definitely interesting enough, and the variety of locales was exciting and refreshing. About the only real problem I had with the game was the quirky camera, but I adjusted, and the gameplay was well worth it. Crystal Dynamics had taken the challenge of reviving Tomb Raider earnestly and produced a game of great quality.
Unfortunately, when you've burned customers with a string of inferior games, they're eventually going to ignore new ones that promise to be better. So, while it's my understanding that Legend still fared pretty well, looking at my own gaming circle, I see faces a little too skeptical to be swayed by my impassioned pleas to give the new games a chance.
The release of Tomb Raider: Anniversary probably didn't help things much. It's the one game in the Crystal Dynamics-developed trio that I just can't bring myself to get into. While beautiful, to be sure, this proper reworking of the very first Tomb Raider is missing some of the advances and changes that Legend brought to the table, including the slight skewing towards more action-oriented gameplay. Players who went from Call of Duty 4: Modern Warfare to World at War and found less game instead of more will know exactly what I'm talking about.
When Tomb Raider: Underworld finally hit the $20 mark, I ordered it, and I've been invested in it since. The graphics have taken a solid step forward, and it's a very beautiful game. The presentation has been bumped up some as well, and there are some really great changes to the design. I don't think it's of the same caliber as Legend, and the camera actually seems to have gotten a bit worse, but the story is immediately engaging and exciting, and I'm constantly anticipating where it will send me next. Lara has more freedom to move through her environment than in the previous games, and Underworld replaces the earlier games' button-press quick-time events with something far better: instead of prompting for an indicated button, the game simply slows down radically and forces you to react using the in-game controls. Floor crumbling beneath you? Hit the grapple key to shoot the grappling hook and save yourself. The fundamental principle is the same—press a key to avoid death—but having to interact on the normal game's terms instead of hitting arbitrary keys feels more involving and exciting.
I feel like these recent Tomb Raider games have been at least a little unfairly dismissed. It's my understanding that Eidos wasn't happy with how Underworld sold, which is a shame, because it's still a fine game. Underworld doesn't quite have that same revitalized sense as Legend did, but it's a more than acceptable sophomore effort. Care was clearly taken in trying to capitalize on and improve the things that worked the first time around.
Since any of the Crystal Dynamics-developed Tomb Raider games can be had for $20 now, and since demos are available for each one, I highly suggest checking them out. At the very least, Tomb Raider: Legend is worth your time. Who knows, you may find yourself a believer just like I did.
First, I'll come out and say it: Street Fighter IV for the PC is a tremendously impressive port. If you weren't paying attention to Capcom's efforts when they did a bang-up job porting Devil May Cry 4 to the PC, then Street Fighter IV may come as a bit of a surprise to you. But those of you who experienced the remarkably well-optimized DMC4 PC version will no doubt still be pleased at how well Street Fighter IV runs.
As a casual fighting game enthusiast (strictly 2D; I'm not a fan of 3D fighters) and a long-time fan of the Street Fighter franchise, I felt Street Fighter IV threatening to force me to buy a next-generation console just to play it, and the options there are pretty dismal. First, you have the grossly overpriced Sony PlayStation 3, which thus far is pretty much a black monolith representing Sony's unending hubris. The PlayStation 3 tends to get inferior ports of games that appear on both the PC and Xbox 360, and worthwhile exclusives for it are few and far between. So, given that I have a plenty capable PC and a media center for HD playback, Sony's glorified Blu-ray player wasn't for me. On the other hand, while I'd definitely have been happy to pick up Microsoft's Xbox 360, moronically high hardware failure rates (I've read figures as high as 30%) coupled with an obnoxious system fan ruled that console out too. My former roommate had a 360, and I could hear it from all the way down the stairs; I also don't know anyone who hasn't had one fail.
Thankfully, Capcom made the unusual decision to port a fighting game—its flagship fighting game, no less—to the PC, heretofore a fighting game wasteland relegated to quality programming like FX Fighter. So Street Fighter IV appeared late to the party, but fashionably so, running beautifully and delightfully bug-free (at least in my experience).
First, as far as gameplay goes, it's excellent and a lot of fun. The AI can be very cheesy and throw-happy; I'm not sure how I feel about how Chun-Li has evolved from SFII to SFIII to here (she's actually one of the slower female fighters now, as far as I can tell); and I'm really not happy with Capcom taking a page out of the SNK "all of our bosses are grossly overpowered" playbook with Seth (though that's not much of a surprise, given what a pain Gill was in SFIII). But the game is fun, a lot of the characters play the same as they always did, and the revenge gauge mixes things up just a touch. I've never been very good at anything other than Street Fighter Alpha 3 (X-ism Chun-Li), so a low difficulty level has worked out fine for me; any higher and the game just gets too throw-happy. Capcom's developers have said they tried to make the game casual-friendly, and for the most part, they've succeeded. I'm not sure how much more casual-friendly it could be made without turning into a brainless button masher (Soul Calibur, I'm looking at you).
As for the visuals, I've found myself a pretty big fan of the new art style. While some of the male characters are grossly over-ripped, the female models are uniformly excellent, with very expressive, intelligent faces. All of the characters ooze personality in ways their traditional 2D sprites never could. As much as I love classic sprite artwork and backgrounds, the move to a 3D engine (while maintaining 2D gameplay) has paid off in spades. The care taken in producing the game shows, and visual artifacts are pretty rare. The game just looks good.
Fair warning, though: anti-aliasing takes a big toll on frame rates. My old 512MB Radeon HD 4870 couldn't hit 8xAA without massive chugging; my new 1GB one can do it fine. But the 512MB Radeon HD 4670 in my media center can't even enable it at 720p, and the game chops at 1080p without several settings being reduced. I think the performance feels about as good as it could, and the media center could very well be CPU-limited by the Phenom X3 8750 (a hand-me-down from a friend). As an afterthought, the filters available to the PC version are cute, but none of them look any better than the regular game does, in my opinion.
Unfortunately I can't write a blog post without complaining about something, and with Street Fighter IV, there are two major issues that are delightfully intertwined with each other.
Though I purchased Street Fighter IV off of Steam, it comes with Games for Windows Live's tentacles penetrating its every nook and cranny like a hentai monster. You can't even play with your unlocked characters or saved settings unless it logs into Games for Windows Live, which makes no sense at all. On more than one occasion, I've been unable to log in, and as a result, the game was rendered not necessarily unplayable, but shamefully limited. My favorite character is Cammy, one of the unlockables; if I can't log in, I can't use her. That's insane. And unlike a genuinely useful service such as Steam Cloud, I was disappointed to discover that the unlockables aren't even saved to my Games for Windows Live profile. Upon installing the game on my media center after having unlocked everything on my desktop, I found that, even after logging in, I still had to go back and unlock everything again. Ridiculous. So apparently I just can't have nice things the way I want them.
This is all compounded horribly, of course, by the fact that Street Fighter IV just hasn't played well for me online. I'm sure fifty posts will pop up saying "but it worked great for me," and if it did, kudos to you; with something as timing-sensitive as a fighting game, though, anything short of a perfect connection is really going to be felt. I've found the online experience so hit-and-miss that I honestly lost any interest in it, which effectively ruins any reason to have Games for Windows Live attached. Unless you've gotta have those gamer points and achievements, Games for Windows Live is basically worthless. And with all of your unlockables chained up behind it, it's actually a detriment to the game.
The disaster that is Games for Windows Live's deep ties in Street Fighter IV PC keeps this port from being a home run, and that's a shame. I wanted to recommend it unequivocally and even encourage people to buy it to support the quality of the port and to suggest demand for more games like this on the PC, to prove that the PC is a viable platform for fighting games, too. Games for Windows Live is the fly in that ointment. If you think you can put up with something that ludicrous, then I definitely recommend the game. It's nice to have a 2D fighter floating around again that doesn't require obscene technical knowledge the way King of Fighters and Guilty Gear can, and it's even better to have it gorgeously rendered on the PC.