So, you heard the news: PC sales are tanking. Apparently, nobody wants to buy Dells or HPs anymore; nobody cares about clunky laptops and bulky mid-towers. People haven't necessarily stopped using them—aging PCs are still humming along in bedrooms, living rooms, and offices everywhere. It's just that those machines aren't getting replaced. Instead, people are spending their hard-earned dough on what analysts call post-PC devices: smartphones, tablets, phablets, and so on.
The PC industry is scrambling to adapt. Microsoft has retooled Windows into a weird hybrid that straddles post-PC tropes and legacy conventions. Laptop makers are bending over backwards to give us touch-enabled laptops that double as tablets. Everyone is working overtime to put a new spin on old concepts... and by all accounts, it isn't working. PC shipments suffered their greatest decline ever last quarter, in spite of Windows 8 and all those tablet-notebook hybrids.
Some say there's no hope, but I disagree. Because the PC is booming—just not the PC we know.
What is a PC? The initials stand for "personal computer." According to Merriam-Webster, a personal computer is a "general-purpose computer equipped with a microprocessor and designed to run especially commercial software (as a word processor or Internet browser) for an individual user."
In the 90s, when I was growing up, personal computers were few and far between. To record TV shows, we used VCRs and VHS tapes. To share music with friends, we spent hours copying cassette tapes. To get in touch with friends and relatives, we used a land line—or, if textual communication was more up our alley, we wrote a letter. By hand. With a pen and paper.
The only kind of personal computer you could buy was a big beige box with a matching CRT monitor. Or, if you were loaded, you could get a laptop with a crappy passive-matrix LCD and a trackball wedged in the palm rest. These were toys of the privileged. We, the geeky elites, used them to play Doom and to rack up preposterous phone bills surfing AltaVista and GeoCities on 14.4K modems. A PC was a badge of pride. Finding someone else who knew about the Internet was sufficient to spark a lasting friendship.
If you'd fallen into a coma in 1993 and awoken today, you'd realize that personal computers are everywhere now. You'd probably notice the laptops and mid-towers at first, but then you'd start to see the phones, the tablets, the game consoles... and you'd think, aren't they all the same thing? Aren't these all general-purpose computers equipped with processors and designed to run commercial software? Sure, some of them are a little more locked down than the IBM clones of old, but that's nothing a little jailbreaking or rooting won't fix. Hackers can still write their own software. Not only that, but they can also package that software, ship it, and make money from it with very little effort or risk. It's a far cry from the days of shareware trials on 1.44MB floppy disks.
Two decades ago, having the tools to play video games, to get on the Internet, and to write crappy BASIC programs made us special. Now that personal computing has grown into something so exceedingly ubiquitous, we feel like we're not so unique anymore. Some of us see the PC's evolution as a corruption of something precious—but I don't think it is. The basic formula that made PCs great 20 years ago is still there, more or less intact, in today's post-PC devices. You just have to look past the drastically different packaging and realize that modern personal computers are more beautiful, more versatile, and easier to use than they've ever been.
I still spend a lot of time in front of an old-school desktop PC. I wouldn't dream of giving it up. The thing is, though, my PC is big, heavy, difficult to operate, and required for serious work—Photoshop, Excel, web design, text editing, you name it. There's nothing terribly personal or cozy about it. If you think about it, today's tablets and phones fulfill the PC's original mission—making personal computing available to the masses—far more elegantly than this thousand-dollar workstation.
Because that's really what this is: a workstation. And that's really what most of today's traditional PCs are. They're workstations with multiple processor cores, Windows NT-based operating systems, and copious amounts of storage and memory.
There's nothing wrong with that. Workstations will always be needed, because there will always be work to do. But we shouldn't pretend that the PC is somehow dying because people aren't buying workstations they no longer need. The PC isn't dying, because today's real PCs are in our pockets. We're buying more of them than ever, and they're doing more for us than i486 Compaqs ever did.

Modern shooters and the atrophy of fun
I finished BioShock Infinite last weekend. It's probably one of the best shooters I've ever played.
I loved the cleverness of the storyline, the expressiveness of the characters, and the unique beauty of the graphics. I loved how the game got so immersive that, during late-night sessions, I almost felt like I really was fighting my way through Columbia—like I really was trying to tear a young woman named Elizabeth from the clutches of her fanatical, despotic father.
I loved everything about BioShock Infinite. Except for the gameplay.
Don't get me wrong. The exploration was great. Watching the story unfold was incredible. But the combat and looting got so repetitive—so downright boring—that I couldn't stand to play more than a couple hours at a time. I got sick of digging through trashcans inexplicably filled with silver coins and of fighting wave after wave of enemies, each one indistinguishable from the last. The combat sequences blurred together, ran into one another, and I found myself praying for them to end, hoping that I could proceed without playing hi-fi whack-a-mole with steampunk guns and angry crows.
So why is BioShock Infinite one of my favorite shooters? Because the others are just as bad, if not worse.
Since the days of Doom and Quake, we've seen shooters take quantum leaps in graphics, writing, voice acting, and just about everything else—except for gameplay. Somehow, gameplay hasn't evolved. It hasn't gotten more fun or more engaging or more interesting. Instead, it's atrophied into a bland rut, to the point where big-budget shooters feel just like old light-gun arcade games (Virtua Cop, House of the Dead, and so on). Players are still stuck on rails, still made to gun down easy target after easy target, pausing only to reload and to watch cut scenes. Today's visuals and stories might be Oscar-worthy, but the interactivity still feels like tasteless filler.
Shooters could be so much more. Instead of trivializing combat, they could make fights less frequent, longer, and more memorable. They could reward players for acting rationally when outnumbered—hide, flee, or die. Shooters could, when appropriate, encourage problem-solving and exploration over brute force. Hell, why couldn't they have players decide how the story plays out? But no, that's all too much to ask. Studios and publishers seem to have forgotten that games are supposed to be games, not CG films with playable action scenes.
Things weren't always this way. I have very fond memories of System Shock 2, BioShock Infinite's spiritual pre-predecessor. I remember desperately scrounging for ammo, cowering in fear from even lone mutants, since I knew a fight might leave me badly wounded—and the noise might attract other creatures. I recall sneaking past enemies and reprogramming turrets to dispatch them so that I wouldn't lose precious health or bullets in combat. I can still recall the satisfaction I felt when, later in the game, I finally had enough upgrades to gun down monsters in one shot.
In System Shock 2, each enemy encounter was an event: memorable, frightening, dangerous, and sometimes exhilarating. Getting lost in the corridors of the Von Braun was part of the game, and it made the experience all the more immersive. There was a real sense that you, the player, had to use your own cunning and skill and sense of orientation to survive. Because of that, beating the game felt like a true achievement, and it made you want to start all over again. Nothing about it felt like watching a bad Michael Bay flick.
System Shock 2's formula should have been spread far and wide and polished to a mirror shine by now. But instead, after 14 long years, that formula has been largely forgotten.
There's nothing dangerous or memorable about BioShock Infinite's gameplay. While players must still hunt for ammo, combat is so frequent that bullets are strewn everywhere, and picking them up feels like a chore rather than a relief. The anguish of an empty gun is nowhere to be found, either. Elizabeth replenishes your ammo supply during combat, and when things go south, dying causes you to be magically teleported to a safe location with spare magazines in your pockets. There's no longer any danger. Skill and cunning aren't really rewarded anymore.
Not even the game's "vigors" manage to spice up combat. Most of them basically do the same thing: stall bad guys for a few seconds and inflict a small amount of damage. Using a vigor at the right time can mean the difference between a cleared battlefield and a forced resurrection, but there's nothing hugely satisfying about the process. It just adds more steps to the tedium of depleting each enemy spawn point.
The most depressing thing about BioShock Infinite, though, is that it's actually one of the more original shooters out there. Compared to the endlessly multiplying Call of Duty clones, its gameplay is textured and tinged with depth and variety. While I was able to beat BioShock Infinite and derive pleasure from the experience, I've had to stop myself from playing war-themed shooters altogether. Their single-player campaigns are just awful. The last one I bothered to finish was Battlefield 3, and I hated everything about it.
I'm not sure who or what to blame. Maybe this is all an attempt to appeal to the lowest common denominator. Maybe game studios are so intent on catering to brain-dead 14-year-olds with Xbox 360s that they've lost sight of what makes games fun. If that's the case, then there may be little hope. It's entirely possible that the next crop of consoles will bring us unimaginably pretty games with sugar-free, decaffeinated gameplay that's as boring as ever. That would be even more soul-crushing than BioShock Infinite's failings.
Our only hope is that, eventually, even 14-year-olds will get sick of playing the same game over and over. They'll start to clamor for better games, where interactivity involves more than just pointing and aiming. And game developers will deliver. It might seem unlikely, but I remember being 14 quite well. It was around the time System Shock 2 came out, and I didn't toss that game aside for something with more instant gratification. I dug in, and I loved it.

The problem with Windows convertible tablets
I've been spending a fair bit of time with Windows convertible tablets lately. I reviewed the Samsung ATIV Smart PC Pro 700T last week, and I've had the Asus VivoTab RT kicking around in my benchmarking lair for a few weeks. I'm also currently testing an Atom-based tablet: the VivoTab Smart, which combines x86 support with the slender profile and long battery life you'd expect from an ARM-based device.
Oh, and I've tried both versions of Microsoft's Surface. Not in my office, though—there was a Microsoft kiosk at the mall, and I stopped by while shopping for a Valentine's Day gift. Yes, I'm that romantic.
Anyhow, the longer I spend with these devices, the more I grow convinced that convergence à la Microsoft is an ill-tasting recipe. I articulated some of my reservations in the Samsung 700T review, where I stated:
There seems to be little overlap between what people do on tablets, which is mainly content consumption, and what people need full-featured notebook PCs for, which is productivity. . . . So, why must we have both on one machine? What's so compelling about having Facebook and Kindle apps on the same physical system as Office and Photoshop? Since the combination is fraught with compromise, why not get a great tablet and a great ultrabook rather than a less-than-great combination of the two?
TR's own Geoff Gasior had a reasonable answer to this: because carrying one device is better than carrying two. And hey, I totally get that. My problem is that saving room in my backpack does me little good if it means passing up the best tool for the job. From my experience so far, Windows convertible tablets are rarely—if ever—the best tools for the job.
Think about it. What do you want from a tablet? You want plenty of quality apps to choose from, including games. You want a great display, long battery life, and something that's thin and light. You also want a device that's both fast and easy to use, because content consumption is no fun if waiting and troubleshooting are involved.
Win8 and WinRT systems just don't deliver there. Good Modern UI games and apps are still pitifully few in number. (There's no Flipboard, Feedly, or Google Currents. No HBO GO or BBC iPlayer. No Yelp, and no official Facebook client.) The handful of Windows tablets that are thin, light, and endowed with long battery life—those with Atom and ARM-based processors—all seem to have ugly, low-resolution displays. (1366x768 is just downright sinful on a tablet screen.) As for speed and ease of use, WinRT slates take forever to launch apps, and while Atom tablets strike a passable balance between performance and power efficiency, ease of use remains a concern. One must still put up with the awkward marriage between Modern UI and the desktop, not to mention the questionable design choices within the Modern UI environment itself.
Okay, now what do you want from a good laptop? This is a system you're going to be using for productivity, so you want it to be fast. You want a great keyboard and touchpad, because controlling Windows 8's desktop environment with a touch screen is an exercise in frustration. If this is a productivity machine, chances are you want more than 11.6 inches of screen space. Don't get me wrong; small, highly portable notebooks are great. Photo editing on a thimble-sized display, however, is not. Neither are the cramped keyboards and truncated touchpads that invariably accompany smaller screens.
Windows convertibles also fail to deliver in this department. For them to double as halfway decent tablets, convertibles must sacrifice desktop performance and capabilities in one way or another—either with plodding performance, like the Atom-based systems, or with a "let's pretend" desktop courtesy of Windows RT. They must restrict themselves to 11.6" or smaller screen sizes, as well, which inherently compromises the keyboard and touchpad arrangement. Worse, that compromise is often more dire than it ought to be. The Surface's Touch Cover is just plain awful (try touch-typing on the thing, I dare you), and the Samsung 700T's fickle and undependable touchpad really disappointed me.
Ideally, Windows convertible tablets should offer the best of laptops and tablets, all in a single device. They should, but they do not. Current offerings feel more like crappy tablets rolled into crappier notebooks—jacks of all trades, masters of none, with good design sense and usability discarded in the name of convergence.
What does that convergence get you?
Well, you can store all your music, photos, and personal files on a single device. That's nice, I suppose. Then again, cloud storage is starting to make that convenience a little old-fashioned. I don't carry very much music on my phone, for example, because I don't have to. When I want to listen to something that's absent from the device, I simply grab it through iCloud over the LTE connection. (And no, being an Apple-worshipping metrosexual isn't a prerequisite. Google and Amazon run similar services.)
What else? We've already addressed the saving-space-in-your-backpack thing, and I think the downsides of convergence make that a lopsided bargain. That leaves one major advantage: cost. Buying a convertible tablet is cheaper than springing for a separate tablet and notebook, isn't it? If you're strapped for cash, convergence must be a pretty solid proposition.
The price difference isn't as big as you'd expect, however. A Nexus 10 will set you back $400; an iPad, $500. An entry-level ultrabook can be had for $650, and a good one will cost about $1,000. Now look at Samsung's ATIV Smart PC Pro 700T—a fine example of a Windows 8 convertible with ultrabook-class performance, which is precisely what you need if you aren't buying a separate laptop. It costs almost $1,200 at Newegg. That's $150 more than the Nexus 10 and the inexpensive ultrabook, and only $300 cheaper than the iPad and deluxe ultrabook combo.
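Since that argument hinges on a bit of arithmetic, here's a quick sanity check of the figures quoted above (prices in US dollars; variable names are mine, and the numbers are the approximate street prices cited in the text):

```python
# Price comparison: standalone tablet + ultrabook combos vs. a Windows convertible.
nexus_10 = 400          # Nexus 10 tablet
ipad = 500              # iPad
cheap_ultrabook = 650   # entry-level ultrabook
good_ultrabook = 1000   # "deluxe" ultrabook
convertible = 1200      # Samsung ATIV Smart PC Pro 700T (approx. Newegg price)

budget_combo = nexus_10 + cheap_ultrabook   # 1050
deluxe_combo = ipad + good_ultrabook        # 1500

print(convertible - budget_combo)   # 150: the convertible costs $150 MORE than the budget combo
print(deluxe_combo - convertible)   # 300: and only $300 less than the deluxe combo
```

In other words, the convertible doesn't even undercut the cheaper two-device pairing.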
Now, why on Earth would you settle for the worst of both worlds?
Don't get me wrong; convergence can be done well. Smartphones are touch-based computers converged with mobile phones, and they're a great example of the concept taken to the right place. In that instance, though, convergence works because people don't want their pockets weighed down with extra hardware. In your trouser pockets, every ounce and every cubic inch counts. That's why nobody seems to mind that smartphones have pitiful battery life compared to basic cell phones. The benefits of convergence—having a little, Internet-connected computer, media player, and gaming console in your pocket—far outweigh the inconvenience of having to charge up every night.
I don't think you can make a strong case for convergence between tablets and notebooks. You don't carry those devices in your pocket. You carry them in a backpack, a briefcase, or a messenger bag, and so it doesn't really matter whether you're hefting a tablet and an ultrabook or a tablet and a removable keyboard dock. There's a small weight and thickness difference, but it doesn't amount to very much. The Samsung 700T weighs 3.54 lbs when docked. Put together, the iPad and deluxe ultrabook we talked about weigh 4.3 lbs. We're talking about a 12-ounce disparity, which is nothing compared to the nuisance of having to carry both a phone and a PDA in your pockets.
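The weight math works out the same way; using the figures quoted above, in pounds:

```python
# Weight comparison: docked convertible vs. separate tablet + ultrabook.
docked_700t = 3.54          # Samsung 700T with keyboard dock, in lbs
ipad_plus_ultrabook = 4.3   # iPad + deluxe ultrabook, in lbs

difference_oz = (ipad_plus_ultrabook - docked_700t) * 16  # convert lbs to ounces
print(round(difference_oz))  # roughly a 12-ounce disparity
```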
Of course, none of this means successful notebook-tablet convergence is unachievable. Once the hardware delivers ultrabook-class performance in the power envelope of a Tegra 3, and once Modern UI is sufficiently polished, fine-tuned, and loaded with great third-party apps, then I expect we'll see some excellent convertibles—devices good enough to make me ditch my iPad and my laptop. Perhaps all it will take is the next generation of processors—Haswell, Bay Trail, and Temash. Or maybe only Windows 9 and next year's hardware innovations will bring us there.
Or maybe it will take even longer than that.
For now, though, I'll keep watching Windows convertibles as I always have: with a mixture of curiosity and disappointment.

On the marginalization of consumer laptops
You probably saw that Gartner report earlier this week about the sluggishness of PC shipments last quarter. Shipments were so sluggish, according to Gartner, that they shrank by almost 5% compared to the same quarter in 2011. I'm sure there were many factors at play, but Gartner pins the blame on one in particular: users relinquishing PCs for daily use.
Whereas once we imagined a world in which individual users would have both a PC and a tablet as personal devices, we increasingly suspect that most individuals will shift consumption activity to a personal tablet, and perform creative and administrative tasks on a shared PC. There will be some individuals who retain both, but we believe they will be the exception and not the norm. Therefore, we hypothesize that buyers will not replace secondary PCs in the household, instead allowing them to age out and shifting consumption to a tablet.
I'm a PC enthusiast, and chances are you, the reader, are as well. We might therefore find it hard to imagine folks ditching their computers for comparatively limited tablets. I mean, you can't do much on a tablet, can you? Most of them lack Flash support, make multitasking awkward at best, and don't play terribly well with keyboards. I use mine for e-book reading and some light gaming, but I would never dream of taking it to a trade show instead of a laptop. No way.
Yet Gartner's suspicion is truer than you might think—and as it happens, I have some very convincing anecdotal evidence to support it.
I bought my girlfriend an iPad 4 for Christmas. Well, technically, we went to the Apple Store and picked it out together. Aline chose the base Wi-Fi model (in white) and a matching SmartCover (in pink) for a total of $538 U.S. before tax. She unwrapped everything a couple of days early (because waiting sucks), played with it some, and then promptly stashed away her notebook PC—a relatively speedy 13" machine with a Trinity APU and Windows 8.
The laptop has been sitting under her desk ever since. She hasn't switched it on in almost a month. Not once.
And really, the substitution makes perfect sense, if you think about it from her perspective: the iPad has a great many advantages over a cheap consumer laptop.
For the price of the iPad and SmartCover, Aline could have snagged an Asus VivoBook X202E, which is selling for $549.99 at Newegg right now. I had a chance to play with that pseudo-ultrabook before Geoff got to work on his review, though, and I wasn't impressed. The thing is abysmally slow, has a really ugly screen, and seems to run its fan continuously, even at idle. Geoff measured the battery life at four hours, which sort of sucks. Overall, I found it unpleasant and frustrating to use.
Sure, the X202E runs things the iPad cannot—things like Word, Excel, Photoshop, and a full-featured operating system with proper file management. If you need to do real work, then there's no substitute for a real laptop (although you'd be surprised how much an iPad can do with a Bluetooth keyboard and Apple's iLife apps). The thing is, however, most consumers already have an old PC they can use to write resumes or telecommute. Why should they buy a new laptop when a tablet can serve their other needs so much better?
I can't think of a good argument.
When the iPad came out in early 2010, I thought of it as a nifty companion device for folks who already owned smartphones and laptops. Tablets seemed, in short, like gadgets for the technologically privileged—cool but unnecessary. Yet in three short years, these new devices have become something else altogether. In a very real sense, they've become highly compelling replacements for consumer laptops in non-productivity usage scenarios. That's exciting... and, frankly, a little scary.

I wrote a novella! Here are some things I learned
Fluke: Langara's Prize will be free to download on Amazon.com (and on Amazon's international sites) until Saturday at 2:00 AM CST.
We've all written fiction. It might have been as part of a school assignment, or a loved one might have asked, "Did you take out the trash?" and you might have replied, "Yes, of course!" before going on to actually do it. That counts, too.
This year, I went a little further with the whole concept and wrote a 41,000-word novella. It's called Fluke: Langara's Prize, and it went up on Amazon last week. Scott even gave it a nice little introduction in the news section. He also edited it and published it. In exchange, TR is getting a cut of the proceeds.
I wrote Fluke in one- to two-hour stretches, five or six days a week, over a period of about six months. I started putting the first words down in early December 2011 and finished in late May 2012. It was a pretty wild ride, and I enjoyed most of it—even if by the end, I was starting to feel exhausted from the extra workload. Slowly putting Fluke together taught me a number of valuable things about writing, and I figured they'd be a good topic for a blog post. So, here goes.
The first thing I learned is that writing fiction is pretty counter-intuitive for a journalist. My day job at TR is all about relating facts and events in the most precise and accurate way possible. I already know everything I need to say; the trick is saying it the right way. It's like playing connect-the-dots or paint-by-numbers. Writing fiction, on the other hand, is more like doing a freehand drawing of something you've never seen before. All you've got is a blank page and some ideas. Turning the ideas into a compelling picture is really, really hard. The only way to pull through is to let your gut take over, which can take some coaxing at first.
I coaxed my gut (ew!) by spending some time reading books before sitting down to write. I blew through Game of Thrones, the Kingkiller Chronicles, a few Stephen King novels, and some other stories that way—just reading for a couple of hours every night before my writing session. I found that, after reading, words and descriptions came more naturally. Approaching a new scene, I knew which angle felt right and which angle wouldn't work. I knew what kind of pacing to use and how often to pepper the action with descriptions. Simply getting the rhythm of good fiction in my head before writing worked wonders.
I also had to suppress the urge to write flowery prose. Long, latinate words are great for sounding authoritative when you're talking about graphics cards, but they're pretty awful when you're telling a story. Shorter, simpler words usually have a more vivid meaning in the reader's mind—they certainly do in mine—because they're used much more often in everyday life. So, somewhat counter-intuitively, simpler descriptions are more striking. Something like "Tom could perceive the mellifluous tittering of seagulls circumnavigating the iridescent estuary" looks very pretty, but it's tedious to read. It's also bland from a descriptive standpoint, because the words carry more of an abstract meaning than a visceral one. Replace with, "Tom heard the soft squawking of seagulls flying above the river mouth, where the muddy rapids spilled into the shimmering sea," and you've got the start of something.
I used two other tricks to try and smooth out the writing as much as possible. The first was to revise my last 1,000 words or so before writing anything new. That had two advantages: the same paragraphs would get revised multiple times over the course of several days, and revising would get me in the right state of mind to continue from where I left off, which prevented abrupt changes in pacing or style. There was one disadvantage, which was that by the end of each chapter, the writing was so polished that I was afraid of writing anything new. I repeatedly had to remind myself that it's okay to write a bad first draft—in fact, you pretty much have to start with a bad draft to get a good one down the road.
Of course, occasionally, a draft is really bad. In that case, as much as it may hurt, the best course of action is to select all, delete, and start over. First drafts can be especially shaky if you haven't written in a while, which is why I tried as much as possible not to take days off. After a long, exhausting day, writing even 100 words is better than writing nothing at all.
So that covers the writing part. The rest is all about the plot, which takes a whole other set of skills to pull off—not to mention a lot of sleepless nights trying to get your story out of a jam.
I don't know if there's a recipe for imagination, but I often found that ideas came pretty much randomly, whether I was thinking about the story or not. Many flashes of lucidity came while I was in the shower or trying to fall asleep. My solution was to download Evernote on my phone and write down ideas as soon as I was able, regardless of the time or place. It's tempting to think that you'll remember a good idea the next morning, but it doesn't always work out that way—and you'd be a fool to risk it. Reaching for your phone and typing a few words only takes a minute. Once your idea is committed to ASCII, falling asleep is much easier. Well, unless you get another idea after that. But they usually taper off... eventually.
Once you've got your ideas and your technique down, there are two ways to write. You can write as you go along, which some authors do quite successfully, or you can meticulously outline everything. I did a bit of both, although more of the latter at the beginning and more of the former toward the end. I had my main story outline, which I would then extend with sub-outlines for the different chapters. When I'd get stuck, I would outline the next chapter in order to pull through the current one. I found that, in many cases, writing toward a goal can be easier than ticking boxes.
The last trick I used was something that, after six long years of working for TR, comes almost naturally: submitting myself to criticism. After polishing up each chapter, I would print it out on my laser printer and show it to my girlfriend. She would read it and give me her feedback, which ranged from gushing to disappointed. Her input led to plenty of revisions and tweaks. I also got input from my father and a few friends of mine, and I made a substantial number of revisions based on their comments and critiques. The rule of thumb here is never to be defensive. If one person finds a problem with your story, then others likely will, too. And the more comfortable that person feels, the more likely they are to give honest feedback. The last thing you want is for test readers to feel they have to praise shoddy work.
And... I think that's about all there is to it. That, and a lot of hard work and perseverance.
I'm happy I wrote Fluke. It has some rough edges, but the feedback on Amazon and TR suggests I've managed to entertain at least a few total strangers, and that's really all I could ask for. I've also learned a lot about the writing process, and I'm eager to get started on a new story. For now, the hard part is to try and promote this thing so more people read it—and just like when I started to write, I have pretty much no idea what I'm doing.

Why CS:GO is the best multiplayer shooter out there
I picked up Counter-Strike: Global Offensive last week. I don't know why it took me so long—the game came out in August, after all, and it costs only $15. Anyway, I was playing Battlefield 3 with a buddy of mine, and we were both getting slaughtered by a whole team's worth of veterans—you know, those folks with the golden eagles next to their names and every unlock in their arsenals. I mentioned CS:GO in passing, and my friend asked, "Why aren't we playing that right now?"
So we did. We logged out, opened up Steam, bought CS:GO, waited for the download to finish, and jumped in.
It took me a few hours to get back up to speed. This was my first time playing any version of CS in nearly six years, and I'd forgotten all the tricks—crouch to increase accuracy, walk to sneak up on enemies, take out the knife to run, camp whenever possible, and most of all, don't right-click to aim un-scoped weapons ('cause you can't). Making things even trickier, I had to familiarize myself with the slightly tweaked gameplay mechanics and new weapons in CS:GO. Somehow, the game felt both weirdly alien and tantalizingly familiar.
I pressed on. After a few hours, I rediscovered why CS is such a good game—and why other multiplayer shooters still pale in comparison.
It's not that other shooters aren't well designed or fun to play. A good round of BF3 (or whichever Call of Duty sequel all the pimple-faced teenagers are glued to right now) can be just as cathartic as any CS match. The problem is that, unlike CS, those games seem to require constant commitment—something I, as a grown man with a job and hobbies other than gaming, can never quite muster.
With today's shooters, you've pretty much got to pick up the game at launch and play on a regular basis. The more you play, the higher you rise through the ranks, and the more weapons you unlock. If you only jump in occasionally (for, dare I say it, recreation), then there's no way to keep up with more committed players. You might be just as skilled as the next guy, but not having this unlock or that weapon may mean losing a fight nine times out of 10. That seems to happen whenever I return to BF3 after a long hiatus.
By contrast, CS is totally egalitarian. Everyone has access to the same items, and players aren't ranked. Nobody cares if you play four hours a day, seven days a week. All that matters is how well you negotiate the next firefight. You might take out half of the enemy team... but then again, a much less skilled player might blind you with a flashbang grenade and unload his top-of-the-line shotgun into your skull. It's not unusual to see a good player climb to the top of the scoreboard only to slide back into mediocrity. In CS, skill and alertness are your primary weapons—and when you get tired, there are no unlockables to help you keep your edge.
That's not to say CS takes the alertness requirement to an uncomfortable extreme. I gave up on the Modern Warfare series a long time ago for that reason: multiplayer skirmishes are just too damn fast and hectic. Drop your guard for a microsecond, and someone is guaranteed to air out your skull with a few well-placed bullets. That kind of constant stress gets exhausting after a while. In CS, Valve has tuned the cadence and pacing almost to perfection. Some rounds are fast and intense, while others go on for several minutes, with two or three surviving players hunting each other in a deadly game of cat and mouse. Players are encouraged to retreat and flank enemies, too, so some battles are interrupted and resumed elsewhere, with wounded combatants quietly sneaking around, trying to get the drop on each other.
CS:GO is just loads of fun. I can jump in anytime I want, play for a few hours, and then quit until I feel like playing again. I never feel an obligation to grind my way through defeat after defeat just to catch up to other players. Nor do I find myself suppressing the urge to play because I know I've fallen too far behind.
Before I sign off, let me address why I think CS:GO is worth picking up over the classic CS 1.6 or CS:S. Valve hasn't modernized the basic mechanics or scrapped the classic maps—that would be sacrilege—but it's made a plethora of little enhancements that, in my view, make the game more modern and enjoyable. For example, players now get assist points when they inflict damage but die before getting a full kill. In older versions of CS, you could get someone down to a single health point and receive zero credit when another player finished him off. No longer.
On top of that, Valve has transplanted the multiplayer matchmaking and dedicated server mojo from its other, more recent titles, so getting into the action (either alone or with your friends) is now much easier. There are new game modes, if you don't mind the odd departure from the classic formula, and the graphics have gotten a much-needed coat of fresh paint. CS:GO still looks slightly dated next to Battlefield 3, but it's nowhere near as old-school as even CS:S. Don't get me wrong; graphics don't make or break a good game. But that doesn't mean a little eye candy can't improve the overall experience.
When I first read about CS:GO, I expected it to be a watered-down, prettied-up version of the original geared toward console players. Now, I see it's every bit as authentic as its predecessors, and it actually improves upon them in very tangible ways. If you've given up on other multiplayer shooters out of frustration—as I almost did—then try CS:GO. Trust me. For $14.99, it's more than worth a shot—and Steam has it on sale for $11.24 today.
Well, I've done it now. Like a mistreated spouse returning to his abuser, I've crawled back into Apple's aluminum, glass, and white polycarbonate arms—I've gone and bought an iPhone 5.
That would be a completely unremarkable purchase if I hadn't updated this very blog a month ago with a long tirade about Apple's failings. At the time, I was sick of Apple Maps, sick of Android users getting cool features I didn't have, and unimpressed with what I'd read about the iPhone 5. After considering my options carefully, I became determined to grab whatever Nexus phone Google cranked out next.
What the heck happened, then?
Funnily enough, the biggest factor was actually walking into an Apple Store and trying an iPhone 5. Within a minute, I realized Apple has done a pretty poor job of advertising this thing. Yes, the iPhone 5 has a larger screen. Yes, it's got LTE connectivity, a slightly thinner design, and new earbuds, too. But what makes the iPhone 5 amazing is how frickin' fast it feels. Web pages and apps load in the blink of an eye. Multitasking is almost seamless. Every corner of the user interface responds instantly with silky-smooth transitions. It's really a sight to behold, especially for someone upgrading from a two-year-old phone—as most prospective iPhone 5 buyers probably are.
I played with some Android phones immediately afterward, but none of them gave me that same sense of flawless fluidity. Not even the Galaxy S3 felt quite as quick. It didn't help that the TouchWiz user interface looked as ugly and messy as ever. Even my girlfriend, who's been using a beat-up HTC Desire with Android 2.2 for the past couple of years, commented on how uninspired the Galaxy S3's software looked. (This was with Ice Cream Sandwich, by the way. Jelly Bean still isn't out officially on the S3 here.)
I went home perplexed, thinking the upcoming Nexus 4 would perhaps be better. As more and more details leaked out, though, it became clear that this wouldn't be a premium phone like the iPhone 5 or Galaxy S3. With a $329 contract-free asking price, no LTE support, and just 8GB of storage on the base model, the Nexus 4 has turned out to be more of a lower-cost, no-frills alternative to the Apple and Samsung flagships. That's fine, of course. Good on Google for offering a reasonably priced, contract-free smartphone that doesn't suck. To someone both able and willing to spring for the iPhone 5, though, the Nexus 4 doesn't look like a very credible alternative.
I mean, just look at AnandTech's performance preview. The Nexus 4 trails the iPhone 5 by a wide margin in most graphics and web browsing performance tests, and its battery life is markedly worse. This is no iPhone 5 killer.
So, it came down to the iPhone 5 and the Galaxy S3. In one corner, I had the fastest and most finely crafted smartphone on the market—an exquisitely designed piece of technology so thin and light it almost felt like a plastic prop. In the other corner, I had ugly software and an even uglier PenTile display wrapped inside a bigger, heavier phone made out of actual plastic. Choosing option B would entail a trip through the time-consuming world of custom ROMs, and I'd be stuck with PenTile's fuzzy-looking fonts, too.
I chose option A.
The honeymoon lasted about 24 hours. All of a sudden, I realized the phone had a dark yellow smear at the top of the screen. The smear was particularly noticeable in the Kindle app, which shines a light on display uniformity issues by hiding UI elements, including the status bar. I Googled around and found forum posts advising me to wait a few days, because apparently, the yellow patch was a dab of glue that hadn't fully cured yet. I waited. The yellow smear stayed.
After nearly a full week, I took my phone to the Apple Store. I set up an appointment at the Genius Bar, waited about 30 minutes, and was finally greeted by a long-haired technician wearing a pair of those weird Vibram toe-shoes. (You know the ones.) I described the problem, but under the store's bright fluorescent lights, it was barely noticeable. Worse, I'd noticed that other iPhone 5s on display also had a slight yellow haze at the top of their screens. Some hemming and hawing ensued, and then the technician told me, "Yeah, to be honest, I don't really see it."
Then he offered to replace the phone anyway.
I asked if I could compare the replacement to my phone. "Sure, no problem," he said. The replacement wouldn't start up fully without a SIM card, so he went and fetched a spare SIM from the back and gave me free rein to load up white screens and compare the phones side by side. Ultimately, we agreed there wasn't much of a difference. But the replacement looked slightly better, so I asked if I could keep it. "Sure," said the Apple Genius with the weird toe-shoes. While he was filing the paperwork, I asked if the replacement was a refurb. "Nope, it's a brand-new phone," he said. "We don't actually have refurbs yet."
A few minutes later, I walked out of the Apple Store with a new iPhone 5 fresh from the factory—and the realization that Apple has some of the finest after-sales support on the planet.
Sadly, my adventure didn't end there. The replacement iPhone 5 turned out to have slightly wonky color calibration. Grays were reddish, some hues were oversaturated, and contrast seemed lacking compared to my previous iPhone 5. I hadn't noticed any of those things in the store, but they started to bother me over the next few days. I was even more miffed when I realized the headphone jack was mounted at a slight angle. Eventually, riddled with shame at my pickiness (and maybe some amount of undiagnosed OCD, too), I visited another Apple Store and asked if I could get the phone swapped out again.
The technician who greeted me this time wasn't wearing weird toe-shoes, but he was just as accommodating. He told me the display was within spec, but the slightly slanted headphone jack was "good enough for him" to justify an exchange. Again, a brand-new iPhone 5 fresh from the factory was fetched from behind the counter. Again, I asked to compare my faulty phone to the replacement, and again, the technician obliged.
The replacement I received is utterly perfect. The screen has a beautiful, warm color temperature, which is only slightly cooler than that of my desktop monitors (and TR North's iPad 3). Grays look gray, blacks look black, the backlight doesn't leak, and at maximum brightness, images are shockingly clear and vivid. Next to it, my iPhone 4's screen looks like an old 1980s TV—all murky, washed-out, and bluish. My new iPhone 5 is so perfect, in fact, that I'm now terrified of dropping it and breaking it.
I wouldn't have to cry myself to sleep if that ever happened, though. I ponied up $99 for AppleCare, which now includes a provision allowing for up to two instances of accidental, user-inflicted damage. Over the next couple of years, I can break my phone twice and, each time, get a replacement for only $49.
I went back to play with the Galaxy S3 earlier this week. After obsessing over the iPhone 5's color calibration and display imperfections for days on end, I noticed not without amusement that both S3s on display had wildly miscalibrated screens, with pale-green whites and overblown colors. Also, up close, the side-effects of the PenTile subpixel layout were just as obvious and ugly as I remembered. Text looked noticeably fuzzy compared to the iPhone 5's beautiful IPS panel.
I'm still not totally happy with where iOS is at the moment. The Maps app remains imperfect. Although I've finally found a great public transit app ("Transit"), I wish bus and train directions were built-in again. The Mail app is still missing some features, like full conversation view and Priority Inbox, and the new App Store interface feels clunky. That said, after spending some time with recent Android phones, I get the sense that iOS still offers a cleaner, smoother experience overall. I also feel like Apple offers a level of polish the competition lacks, and using another platform would leave me with more grievances—not fewer.
It's no contest on the hardware side, though. The iPhone 5 actually feels too light, but the construction is anything but cheap. While I was still getting used to the weight, the phone slipped out of my hand, bounced on my desk, and landed on my carpet. The phone was unscathed, but the anodized aluminum band left a noticeable gash in my desk. This thing is built out of metal and glass, and there's no mistaking that fact when you run your fingers along its ridges, buttons, and panels. Also, like most other Apple products I've owned, the iPhone 5 is so beautifully made that I sometimes pick it up just to admire it.
Finally, Apple's support staff has displayed a tremendous level of care and attention to user satisfaction. Being able to walk in with a minor, almost frivolous issue and come out less than an hour later with a brand new phone is pretty incredible. Maybe Apple Geniuses are simply compensating for quality control issues—and certainly, getting a perfect iPhone 5 the first time around would have been great. However, other phone makers also ship lemons. Would any of them exchange a product on the spot because of a minor cosmetic flaw or a problem their technicians can't replicate? I doubt it.
One last thing. After getting the iPhone 5, I relieved my girlfriend of her crummy HTC phone and gave her my iPhone 4. Those two handsets felt pretty similar when we got them a couple of years back, but the HTC has aged rather poorly. It never got any official updates past Android 2.2, and it's gotten slower and slower over time. I tried rooting it and installing Jelly Bean, which took me the better part of a Saturday afternoon, and the result was almost unusably slow. Unofficial Android 2.3 ROMs were about the best I could do, and they still felt somewhat sluggish and choppy—though better than the stock ROM. Meanwhile, the iPhone 4 happily runs iOS 6 with no coaxing or hacks, and it feels considerably faster and more responsive than the Desire—surprising, considering the two devices came out literally one month apart and were pretty comparable at the time.
That, plus my abysmal experience using a friend's Nexus S running Ice Cream Sandwich, suggests Apple phones stand the test of time better than their peers. Considering I don't plan to upgrade again until 2014, I definitely find that reassuring.