You probably saw that Gartner report earlier this week about the sluggishness of PC shipments last quarter. Shipments were so sluggish, according to Gartner, that they shrank by almost 5% compared to the same quarter in 2011. I'm sure there were many factors at play, but Gartner pins the blame on one in particular: users relinquishing PCs for daily use.
Whereas once we imagined a world in which individual users would have both a PC and a tablet as personal devices, we increasingly suspect that most individuals will shift consumption activity to a personal tablet, and perform creative and administrative tasks on a shared PC. There will be some individuals who retain both, but we believe they will be the exception and not the norm. Therefore, we hypothesize that buyers will not replace secondary PCs in the household, instead allowing them to age out and shifting consumption to a tablet.
I'm a PC enthusiast, and chances are you, the reader, are as well. We might therefore find it hard to imagine folks ditching their computers for comparatively limited tablets. I mean, you can't do much on a tablet, can you? Most of them lack Flash support, make multitasking awkward at best, and don't play terribly well with keyboards. I use mine for e-book reading and some light gaming, but I would never dream of taking it to a trade show instead of a laptop. No way.
Yet Gartner's suspicion is truer than you might think—and as it happens, I have some very convincing anecdotal evidence to support it.
I bought my girlfriend an iPad 4 for Christmas. Well, technically, we went to the Apple Store and picked it out together. Aline chose the base Wi-Fi model (in white) and a matching SmartCover (in pink) for a total of $538 U.S. before tax. She unwrapped everything a couple of days early (because waiting sucks), played with it some, and then promptly stashed away her notebook PC—a relatively speedy 13" machine with a Trinity APU and Windows 8.
The laptop has been sitting under her desk ever since. She hasn't switched it on in almost a month. Not once.
And really, the substitution makes perfect sense, if you think about it from her perspective. The iPad has a great many advantages over a cheap consumer laptop:
For the price of the iPad and SmartCover, Aline could have snagged an Asus VivoBook X202E, which is selling for $549.99 at Newegg right now. I had a chance to play with that pseudo-ultrabook before Geoff got to work on his review, though, and I wasn't impressed. The thing is abysmally slow, has a really ugly screen, and seems to run its fan continuously, even at idle. Geoff measured the battery life at four hours, which sort of sucks. Overall, I found it unpleasant and frustrating to use.
Sure, the X202E runs things the iPad cannot—things like Word, Excel, Photoshop, and a full-featured operating system with proper file management. If you need to do real work, then there's no substitute for a real laptop (although you'd be surprised how much an iPad can do with a Bluetooth keyboard and Apple's iLife apps). The thing is, however, most consumers already have an old PC they can use to write resumes or telecommute. Why should they buy a new laptop when a tablet can serve their other needs so much better?
I can't think of a good argument.
When the iPad came out in early 2010, I thought of it as a nifty companion device for folks who already owned smartphones and laptops. Tablets seemed, in short, like gadgets for the technologically privileged—cool but unnecessary. Yet in three short years, these new devices have become something else altogether. In a very real sense, they've become highly compelling replacements for consumer laptops in non-productivity usage scenarios. That's exciting... and, frankly, a little scary.

I wrote a novella! Here are some things I learned
Fluke: Langara's Prize will be free to download on Amazon.com (and on Amazon's international sites) until Saturday at 2:00 AM CST.
We've all written fiction. It might have been as part of a school assignment, or a loved one might have asked, "Did you take out the trash?" and you might have replied, "Yes, of course!" before going on to actually do it. That counts, too.
This year, I went a little further with the whole concept and wrote a 41,000-word novella. It's called Fluke: Langara's Prize, and it went up on Amazon last week. Scott even gave it a nice little introduction in the news section. He also edited it and published it. In exchange, TR is getting a cut of the proceeds.
I wrote Fluke in one- to two-hour stretches, five or six days a week, over a period of about six months. I started putting the first words down in early December 2011 and finished in late May 2012. It was a pretty wild ride, and I enjoyed most of it—even if by the end, I was starting to feel exhausted from the extra workload. Slowly putting Fluke together taught me a number of valuable things about writing, and I figured they'd be a good topic for a blog post. So, here goes.
The first thing I learned is that writing fiction is pretty counter-intuitive for a journalist. My day job at TR is all about relating facts and events in the most precise and accurate way possible. I already know everything I need to say; the trick is saying it the right way. It's like playing connect-the-dots or paint-by-numbers. Writing fiction, on the other hand, is more like doing a freehand drawing of something you've never seen before. All you've got is a blank page and some ideas. Turning the ideas into a compelling picture is really, really hard. The only way to pull through is to let your gut take over, which can take some coaxing at first.
I coaxed my gut (ew!) by spending some time reading books before sitting down to write. I blew through Game of Thrones, the Kingkiller Chronicles, a few Stephen King novels, and some other stories that way—just reading for a couple of hours every night before my writing session. I found that, after reading, words and descriptions came more naturally. Approaching a new scene, I knew which angle felt right and which angle wouldn't work. I knew what kind of pacing to use and how often to pepper the action with descriptions. Simply getting the rhythm of good fiction in my head before writing worked wonders.
I also had to suppress the urge to write flowery prose. Long, Latinate words are great for sounding authoritative when you're talking about graphics cards, but they're pretty awful when you're telling a story. Shorter, simpler words usually have a more vivid meaning in the reader's mind—they certainly do in mine—because they're used much more often in everyday life. So, somewhat counter-intuitively, simpler descriptions are more striking. Something like "Tom could perceive the mellifluous tittering of seagulls circumnavigating the iridescent estuary" looks very pretty, but it's tedious to read. It's also bland from a descriptive standpoint, because the words carry more of an abstract meaning than a visceral one. Replace that with "Tom heard the soft squawking of seagulls flying above the river mouth, where the muddy rapids spilled into the shimmering sea," and you've got the start of something.
I used two other tricks to try and smooth out the writing as much as possible. The first was to revise my last 1,000 words or so before writing anything new. That had two advantages: the same paragraphs would get revised multiple times over the course of several days, and revising would get me in the right state of mind to continue from where I left off, which prevented abrupt changes in pacing or style. There was one disadvantage, which was that by the end of each chapter, the writing was so polished that I was afraid of writing anything new. I repeatedly had to remind myself that it's okay to write a bad first draft—in fact, you pretty much have to start with a bad draft to get a good one down the road.
Of course, occasionally, a draft is really bad. In that case, as much as it may hurt, the best course of action is to select all, delete, and start over. First drafts can be especially shaky if you haven't written in a while, which is why I tried as much as possible not to take days off. After a long, exhausting day, writing even 100 words is better than writing nothing at all.
So that covers the writing part. The rest is all about the plot, which takes a whole other set of skills to pull off—not to mention a lot of sleepless nights trying to get your story out of a jam.
I don't know if there's a recipe for imagination, but I often found that ideas came pretty much randomly, whether I was thinking about the story or not. Many flashes of lucidity came while I was in the shower or trying to fall asleep. My solution was to download Evernote on my phone and write down ideas as soon as I was able, regardless of the time or place. It's tempting to think that you'll remember a good idea the next morning, but it doesn't always work out that way—and you'd be a fool to risk it. Reaching for your phone and typing a few words only takes a minute. Once your idea is committed to ASCII, falling asleep is much easier. Well, unless you get another idea after that. But they usually taper off... eventually.
Once you've got your ideas and your technique down, there are two ways to write. You can write as you go along, which some authors do quite successfully, or you can meticulously outline everything. I did a bit of both, although more of the latter at the beginning and more of the former toward the end. I had my main story outline, which I would then extend with sub-outlines for the different chapters. When I'd get stuck, I would outline the next chapter in order to pull through the current one. I found that, in many cases, writing toward a goal can be easier than ticking boxes.
The last trick I used was something that, after six long years of working for TR, seems almost natural: submit myself to criticism. After polishing up each chapter, I would print it out on my laser printer and show it to my girlfriend. She would read it and give me her feedback, which ranged from gushing to disappointed. Her input led to plenty of revisions and tweaks. I also got input from my father and a few friends of mine, and I made a substantial number of revisions based on their comments and critiques. The rule of thumb here is never to be defensive. If one person finds a problem with your story, then others likely will, too. And the more comfortable that person feels, the more likely they are to give honest feedback. The last thing you want is for test readers to feel they have to praise shoddy work.
And... I think that's about all there is to it. That, and a lot of hard work and perseverance.
I'm happy I wrote Fluke. It has some rough edges, but the feedback on Amazon and TR suggests I've managed to entertain at least a few total strangers, and that's really all I could ask for. I've also learned a lot about the writing process, and I'm eager to get started on a new story. For now, the hard part is to try and promote this thing so more people read it—and just like when I started to write, I have pretty much no idea what I'm doing.

Why CS:GO is the best multiplayer shooter out there
I picked up Counter-Strike: Global Offensive last week. I don't know why it took me so long—the game came out in August, after all, and it costs only $15. Anyway, I was playing Battlefield 3 with a buddy of mine, and we were both getting slaughtered by a whole team's worth of veterans—you know, those folks with the golden eagles next to their names and every unlock in their arsenals. I mentioned CS:GO in passing, and my friend asked, "Why aren't we playing that right now?"
So we did. We logged out, opened up Steam, bought CS:GO, waited for the download to finish, and jumped in.
It took me a few hours to get back up to speed. This was my first time playing any version of CS in nearly six years, and I'd forgotten all the tricks—crouch to increase accuracy, walk to sneak up on enemies, take out the knife to run, camp whenever possible, and most of all, don't right-click to aim un-scoped weapons ('cause you can't). Making things even trickier, I had to familiarize myself with the slightly tweaked gameplay mechanics and new weapons in CS:GO. Somehow, the game felt both weirdly alien and tantalizingly familiar.
I pressed on. After a few hours, I rediscovered why CS is such a good game—and why other multiplayer shooters still pale in comparison.
It's not that other shooters aren't well designed or fun to play. A good round of BF3 (or whichever Call of Duty sequel all the pimple-faced teenagers are glued to right now) can be just as cathartic as any CS match. The problem is that, unlike CS, those games seem to require constant commitment—something I, as a grown man with a job and hobbies other than gaming, can never quite muster.
With today's shooters, you've pretty much got to pick up the game at launch and play on a regular basis. The more you play, the higher you rise through the ranks, and the more weapons you unlock. If you only jump in occasionally (for, dare I say it, recreation), then there's no way to keep up with more committed players. You might be just as skilled as the next guy, but not having this unlock or that weapon may mean losing a fight nine times out of 10. That seems to happen whenever I return to BF3 after a long hiatus.
By contrast, CS is totally egalitarian. Everyone has access to the same items, and players aren't ranked. Nobody cares if you play four hours a day, seven days a week. All that matters is how well you negotiate the next firefight. You might take out half of the enemy team... but then again, a much less skilled player might blind you with a flashbang grenade and unload his top-of-the-line shotgun into your skull. It's not unusual to see a good player climb to the top of the scoreboard only to decline back into mediocrity. In CS, skill and alertness are your primary weapons—and when you get tired, there are no unlockables to help you keep your edge.
That's not to say CS takes the alertness requirement to an uncomfortable extreme. I gave up on the Modern Warfare series a long time ago for that reason: multiplayer skirmishes are just too damn fast and hectic. Drop your guard for a microsecond, and someone is guaranteed to air out your skull with a few well-placed bullets. That kind of constant stress gets exhausting after a while. In CS, Valve has tuned the cadence and pacing almost to perfection. Some rounds are fast and intense, while others go on for several minutes, with two or three surviving players hunting each other in a deadly game of cat and mouse. Players are encouraged to retreat and flank enemies, too, so some battles are interrupted and resumed elsewhere, with wounded combatants quietly sneaking around, trying to get the drop on each other.
CS:GO is just loads of fun. I can jump in anytime I want, play for a few hours, and then quit until I feel like playing again. I never feel an obligation to grind my way through defeat after defeat just to catch up to other players. Nor do I find myself suppressing the urge to play because I know I've fallen too far behind.
Before I sign off, let me address why I think CS:GO is worth picking up over the classic CS 1.6 or CS:S. Valve hasn't modernized the basic mechanics or scrapped the classic maps—that would be sacrilege—but it's made a plethora of little enhancements that, in my view, make the game more modern and enjoyable. For example, players now get assist points when they inflict damage but die before getting a full kill. In older versions of CS, you could get someone down to a single health point and receive zero credit when another player finished him off. No longer.
On top of that, Valve has transplanted the multiplayer matchmaking and dedicated server mojo from its other, more recent titles, so getting into the action (either alone or with your friends) is now much easier. There are new game modes, if you don't mind the odd departure from the classic formula, and the graphics have gotten a much-needed coat of fresh paint. CS:GO still looks slightly dated next to Battlefield 3, but it's nowhere near as old-school as even CS:S. Don't get me wrong; graphics don't make or break a good game. But that doesn't mean a little eye candy can't improve the overall experience.
When I first read about CS:GO, I expected it to be a watered-down, prettied-up version of the original geared toward console players. Now, I see it's every bit as authentic as its predecessors, and it actually improves upon them in very tangible ways. If you've given up on other multiplayer shooters out of frustration—as I almost did—then try CS:GO. Trust me. For $14.99, it's more than worth a shot—and Steam has it on sale for $11.24 today.
Well, I've done it now. Like a mistreated spouse returning to his abuser, I've crawled back into Apple's aluminum, glass, and white polycarbonate arms—I've gone and bought an iPhone 5.
That would be a completely unremarkable purchase if I hadn't updated this very blog a month ago with a long tirade about Apple's failings. At the time, I was sick of Apple Maps, sick of Android users getting cool features I didn't have, and unimpressed with what I'd read about the iPhone 5. After considering my options carefully, I became determined to grab whatever Nexus phone Google cranked out next.
What the heck happened, then?
Funnily enough, the biggest factor was actually walking into an Apple Store and trying an iPhone 5. In less than a minute, I realized Apple has done a pretty poor job of advertising this thing. Yes, the iPhone 5 has a larger screen. Yes, it's got LTE connectivity, a slightly thinner design, and new earbuds, too. But what makes the iPhone 5 amazing is how frickin' fast it feels. Web pages and apps load in the blink of an eye. Multitasking is almost seamless. Every corner of the user interface responds instantly with silky-smooth transitions. It's really a sight to behold, especially for someone upgrading from a two-year-old phone—as most prospective iPhone 5 buyers probably are.
I played with some Android phones immediately afterward, but none of them gave me that same sense of flawless fluidity. Not even the Galaxy S3 felt quite as quick. It didn't help that the TouchWiz user interface looked as ugly and messy as ever. Even my girlfriend, who's been using a beat-up HTC Desire with Android 2.2 for the past couple of years, commented on how uninspired the Galaxy S3's software looked. (This was with Ice Cream Sandwich, by the way. Jelly Bean still isn't out officially on the S3 here.)
I went home perplexed, thinking the upcoming Nexus 4 would perhaps be better. As more and more details leaked out, though, it became clear that this wouldn't be a premium phone like the iPhone 5 or Galaxy S3. With a $329 contract-free asking price, no LTE support, and just 8GB of storage on the base model, the Nexus 4 has turned out to be more of a lower-cost, no-frills alternative to the Apple and Samsung flagships. That's fine, of course. Good on Google for offering a reasonably priced, contract-free smartphone that doesn't suck. To someone both able and willing to spring for the iPhone 5, though, the Nexus 4 doesn't look like a very credible alternative.
I mean, just look at AnandTech's performance preview. The Nexus 4 trails the iPhone 5 by a wide margin in most graphics and web browsing performance tests, and its battery life is markedly worse. This is no iPhone 5 killer.
So, it came down to the iPhone 5 and the Galaxy S3. In one corner, I had the fastest and most finely crafted smartphone on the market—an exquisitely designed piece of technology so thin and light it almost felt like a plastic prop. In the other corner, I had ugly software and an even uglier PenTile display wrapped inside a bigger, heavier phone made out of actual plastic. Choosing option B would entail a trip through the time-consuming world of custom ROMs, and I'd be stuck with PenTile's fuzzy-looking fonts, too.
I chose option A.
The honeymoon lasted about 24 hours. All of a sudden, I realized the phone had a dark, yellow smear at the top of the screen. The smear was particularly noticeable in the Kindle app, which shines a light on display uniformity issues by hiding UI elements, including the status bar.

I Googled around and found forum posts advising me to wait a few days, because apparently, the yellow patch was a dab of glue that hadn't fully cured yet. I waited. The yellow smear stayed.
After nearly a full week, I took my phone to the Apple Store. I set up an appointment at the Genius Bar, waited about 30 minutes, and was finally greeted by a long-haired technician wearing a pair of those weird Vibram toe-shoes. (You know the ones.) I described the problem, but under the store's bright fluorescent lights, it was barely noticeable. Worse, I'd noticed that other iPhone 5s on display also had a slight yellow haze at the top of their screens. Some hemming and hawing ensued, and then the technician told me, "Yeah, to be honest, I don't really see it."
Then he offered to replace the phone anyway.
I asked if I could compare the replacement to my phone. "Sure, no problem," he said. The replacement wouldn't start up fully without a SIM card, so he went and fetched a spare SIM from the back and gave me free rein to load up white screens and compare the phones side by side. Ultimately, we agreed there wasn't much of a difference. But the replacement looked slightly better, so I asked if I could keep it. "Sure," said the Apple Genius with the weird toe-shoes. While he was filing the paperwork, I asked if the replacement was a refurb. "Nope, it's a brand-new phone," he said. "We don't actually have refurbs yet."
A few minutes later, I walked out of the Apple Store with a new iPhone 5 fresh from the factory—and the realization that Apple has some of the finest after-sales support on the planet.
Sadly, my adventure didn't end there. The replacement iPhone 5 turned out to have slightly wonky color calibration. Grays were reddish, some hues were oversaturated, and contrast seemed lacking compared to my previous iPhone 5. I hadn't noticed any of those things in the store, but they started to bother me over the next few days. I was even more miffed when I realized the headphone jack was mounted at a slight angle. Eventually, riddled with shame at my pickiness (and maybe some amount of undiagnosed OCD, too), I visited another Apple Store and asked if I could get the phone swapped out again.
The technician who greeted me this time wasn't wearing weird toe-shoes, but he was just as accommodating. He told me the display was within spec, but the slightly slanted headphone jack was "good enough for him" to justify an exchange. Again, a brand-new iPhone 5 fresh from the factory was fetched from behind the counter. Again, I asked to compare my faulty phone to the replacement, and again, the technician obliged.
The replacement I received is utterly perfect. The screen has a beautiful, warm color temperature, which is only slightly cooler than that of my desktop monitors (and TR North's iPad 3). Grays look gray, blacks look black, the backlight doesn't leak, and at maximum brightness, images are shockingly clear and vivid. Next to it, my iPhone 4's screen looks like an old 1980s TV—all murky, washed-out, and bluish. My new iPhone 5 is so perfect, in fact, that I'm now terrified of dropping it and breaking it.
I wouldn't have to cry myself to sleep if that ever happened, though. I ponied up $99 for AppleCare, which now includes a provision allowing for up to two instances of accidental, user-inflicted damage. Over the next couple of years, I can break my phone twice and, each time, get a replacement for only $49.
I went back to play with the Galaxy S3 earlier this week. After obsessing over the iPhone 5's color calibration and display imperfections for days on end, I noticed not without amusement that both S3s on display had wildly miscalibrated screens, with pale-green whites and overblown colors. Also, up close, the side-effects of the PenTile subpixel layout were just as obvious and ugly as I remembered. Text looked noticeably fuzzy compared to the iPhone 5's beautiful IPS panel.
I'm still not totally happy with where iOS is at the moment. The Maps app remains imperfect. Although I've finally found a great public transit app ("Transit"), I wish bus and train directions were built-in again. The Mail app is still missing some features, like full conversation view and Priority Inbox, and the new App Store interface feels clunky. That said, after spending some time with recent Android phones, I get the sense that iOS still offers a cleaner, smoother experience overall. I also feel like Apple offers a level of polish the competition lacks, and using another platform would leave me with more grievances—not fewer.
It's no contest on the hardware side, though. The iPhone 5 actually feels too light, but the construction is anything but cheap. While I was still getting used to the weight, the phone slipped out of my hand, bounced on my desk, and landed on my carpet. The phone was unscathed, but the anodized aluminum band left a noticeable gash in my desk. This thing is built out of metal and glass, and there's no mistaking that fact when you run your fingers along its ridges, buttons, and panels. Also, like most other Apple products I've owned, the iPhone 5 is so beautifully made that I sometimes pick it up just to admire it.
Finally, Apple's support staff has displayed a tremendous level of care and attention to user satisfaction. Being able to walk in with a minor, almost frivolous issue and come out less than an hour later with a brand new phone is pretty incredible. Maybe Apple Geniuses are simply compensating for quality control issues—and certainly, getting a perfect iPhone 5 the first time around would have been great. However, other phone makers also ship lemons. Would any of them exchange a product on the spot because of a minor cosmetic flaw or a problem their technicians can't replicate? I doubt it.
One last thing. After getting the iPhone 5, I relieved my girlfriend of her crummy HTC phone and gave her my iPhone 4. Those two handsets felt pretty similar when we got them a couple of years back, but the HTC has aged rather poorly. It never got any official updates past Android 2.2, and it's gotten slower and slower over time. I tried rooting it and installing Jelly Bean, which took me the better part of a Saturday afternoon, and the result was almost unusably slow. Unofficial Android 2.3 ROMs were about the best I could do, and they still felt somewhat sluggish and choppy—though better than the stock ROM. Meanwhile, the iPhone 4 happily runs iOS 6 with no coaxing or hacks, and it feels considerably faster and more responsive than the Desire—surprising, considering the two devices came out literally one month apart and were pretty comparable at the time.
That, plus my abysmal experience using a friend's Nexus S running Ice Cream Sandwich, suggests Apple phones stand the test of time better than their peers. Considering I don't plan to upgrade again until 2014, I definitely find that reassuring.

My six days with Windows 8
So, I installed Windows 8 on my desktop PC last week.
It was late on the eve of the launch. The download link for the $39.99 Win8 Pro upgrade had just gone live. I felt that familiar twinge in my stomach—the one that always precedes major upgrades, especially those conducted when they really shouldn't be. My rational side tried to argue against clicking the button. It made some pretty good points, too: I had to work in the morning, and if something went wrong, I wouldn't have time to fix it. My work PC would be hosed for the next day. However, my impulsive side retorted with an extremely persuasive argument: "Dude, you could upgrade right now." And so I clicked.
I lucked out, because everything went without a hitch. I downloaded the installer and ran it straight from Windows 7. It asked me to uninstall a couple of incompatible applications, rebooted my computer a couple of times, and showed me a fine collection of progress bars. Oh, and I think I had to answer a few questions, too, like whether I wanted to install right away or make a bootable USB thumb drive first. (I chose option A.) Finally, at 10:50 PM on the evening of Thursday, October 25, my PC booted into Windows 8 for the first time.
I stayed up entirely too late that night bobbing back and forth between Modern UI tiles and the new-and-improved desktop. I fulfilled Microsoft's wishes and turned my local user account into an online one, which let Windows 8 sync my settings with the cloud. I agonized over which background graphic and color combo to use for the Start screen, and I wrestled with the hot corners, teaching myself to use the Charms bar and the new multitasking mojo on the left side of the screen. I stormed the Windows Store and downloaded a whole bunch of Modern UI apps to see what third-party developers had cooked up. I bounced back to the desktop and marveled at the multi-monitor taskbar and ribbon UI in the File Explorer. Ooh! Aah...
Mostly, I played around like an excited child. I've pretty much always done that after OS upgrades.
The next morning, I got up for work and was dismayed to find that, when I was actually being productive, Windows 8 really didn't feel all that different from Windows 7.
Then the sound in my headphones went out, and Creative's X-Fi driver caused a blue screen of death, complete with the new sad-face smiley. Classic Creative! Everything worked fine after I rebooted and installed the X-Fi's Windows 8 x64 beta driver, though. I guess Windows 8 is happier with bespoke drivers, even if it'll take your Win7 ones in a pinch. Then again, this is Creative we're talking about. Maybe I encountered what underpaid Singaporean developers call a "worst-case scenario."
I encountered only one other hitch, which is that Sublime Text has mysteriously disappeared from the "Open with" list for HTML files. It refuses to return no matter how many times I manually select it in the new "Choose default program" pop-up. Oddly enough, Sublime Text is still right there in the File Explorer ribbon's "Open" menu. Hmm.
Back to my point: somehow, Windows 8 doesn't really feel like an upgrade—not in the way Windows 7 did over Vista, and definitely not in the way Vista did over XP. When I'm busy working in the desktop and ignoring the Modern UI Start screen, which is about 99.9% of the time, Windows 8 feels more like a Windows 7 service pack with a custom skin than a whole new operating system. It's kind of underwhelming. Now, don't get me wrong; I don't regret upgrading. It only set me back 40 bucks, and the improvements I do notice (the new File Explorer, Task Manager, file copy dialogs, multi-monitor taskbar, and so forth) are very much welcome. It's just that... well, sometimes, I find myself looking around and saying, "That's it?"
I tried spending some time in the Modern UI interface, thinking some exciting new paradigms might be there waiting to be discovered. Mostly, what I found was unfinished apps that poorly replicated the functionality of major websites. Even apps that actually did something useful, like the Modern UI version of Skype, felt pared down and lacking compared to their desktop counterparts. Not that I found many of those. The Windows Store catalog is awfully thin right now. I'm only using one Windows Store app with any regularity, and that's the Windows 8 version of Jetpack Joyride.
I think that's the problem with Modern UI for us desktop users. You see, on tablets and smartphones, mobile apps like IMDB and Yelp and Facebook make a lot of sense. They're usually easier to navigate than the corresponding websites on a tiny touch screen, and they're often faster to open, as well. But on a desktop? You've got a keyboard, a mouse, and a big screen right there in front of you. Websites load in a picosecond, and you get to navigate them with a pretty optimal set of tools. What's the point of going to the Start screen and loading up a big, clunky, dumbed-down app when you can load up a full-featured website in a fraction of the time?
I can think of only one instance where Modern UI would come in handy on the desktop. Let's say I had a Windows 8 tablet or a touch-enabled laptop. Let's say I'd gotten awfully cozy with a certain Modern UI app. Now, I'd be delighted if I could use it on my big computer. The alternatives—having to use a website with a different interface or, worse, another piece of software entirely—wouldn't be nearly as convenient. Not having a full array of features wouldn't matter, because replicating familiarity would be the whole point.
Maybe I should take the touch-enabled laptop out of that hypothetical, though. I went and tried some of those on Friday, and let me just say Modern UI looks really, really awful on a 15" touch screen. The fact that the on-screen keyboard pops up when you tap into a text field, even though you've got a hardware keyboard right there underneath, doesn't help. (Yes, this happens, and it doesn't just happen on one system. I tried two touch laptops from two different vendors, and they both did the same thing.)
While grimacing at those half-baked machines, I realized yet another way in which Windows 8's forced convergence is hopelessly awkward.
The store shelves were packed with Windows 8 systems. Some of those systems were touch-enabled, some were not, and right there in between was a lonely Windows RT convertible tablet. For an average user without prior knowledge, there was no way to tell whether a given Win8 machine would: a) respond to touch input or b) run x86 applications. None whatsoever. Heck, I caught myself pawing at non-touch-enabled Win8 laptops and being disappointed when that did nothing but smear the screen. Things will only get more confusing later this month, when Clover Trail-powered Windows 8 convertible tablets start to coexist with ARM-powered Windows RT ones.
This is a bigger deal than it seems. For the past couple of decades, people have been able to count on the fact that Windows PCs all operate the same way and all run the same software—generally speaking, at least. Windows 8 totally throws that out the window, and it does so in the worst way possible: by forcing a consistent appearance on systems that work totally differently.
I don't regret upgrading to Windows 8 on my desktop. All of a sudden, though, Apple's strategy of cleanly segregating iOS and OS X is looking awfully sensible.

Dishonored: A nice change of pace
Warning: minor spoilers follow. If you haven't beaten the game and want a completely untarnished experience, don't read this!
Good single-player games are something of a rarity these days. Way too many titles emphasize multiplayer over single-player, and those that don't usually offer frustratingly linear experiences, with cut scenes punctuated by hours and hours of repetitive gunplay. (I'm looking at you, Max Payne 3.) Notable exceptions include the handful of open-world RPGs out there—but those are repetitive in their own way. After so many side quests and dungeon cleanups, the Skyrims and Fallouts of this world start to feel like second jobs.
I was pleasantly surprised when I picked up Dishonored last week. The game has no multiplayer component—the developers say they were never pressured to add one—and while it's very much story-driven, Dishonored gave me free rein to approach missions as I liked. I could jump from rooftop to rooftop and silence guards with sleeping darts. I could sneak through the sewers and try to avoid the zombies, er, plague victims roaming there. Or I could stab and shoot my way from start to finish, taking down everyone who got in the way.
The game offered an impressive set of tools to speed things along, too. After a few upgrades, I was able to teleport silently behind enemies, stop the passage of time, and briefly possess foes, The Exorcist-style. If I'd favored aggression over sneaking, I probably would have chosen some other powers—like the one that conjures a swarm of plague-infested rats to devour an unlucky target. Mmm.
That's only part of the reason I enjoyed Dishonored so much, though.
This game has a unique, captivating atmosphere, which is something few other titles get right. The designers managed to put together a totally believable alternate universe that is different enough to feel otherworldly but consistent enough to seem genuine. All that talk about getting an industrial designer on board wasn't just a load of hot air, by the looks of it. Everything in the game world, from the costumes and architecture to the furniture and whale oil-powered electric devices (don't ask), feels like it belongs and serves a purpose in that strange, pseudo-19th-century universe. The fact that everything is so well tied together is pretty cool, and it makes you want to explore and discover—not just cruise along to the next checkpoint.
Some have poked fun at Dishonored's plot for mixing and matching disparate elements, but I think it works, if you suspend your disbelief just a bit. More importantly, the plot isn't about a cartoonish struggle between good and evil. If I had to sum it up, I'd say the story is about how power both attracts and corrupts. There are shades of gray here—moral dilemmas that have no easy solution; characters that do bad things despite good intentions. The protagonist may be an assassin, but assassinations are only one way to dole out justice. Usually, you can snoop around, learn a bit about each target's backstory, and find a non-lethal alternative to their murder.
I got drawn in so much that I balked at killing pretty much anybody. Could I justify murdering city guards who were misled into believing I killed their empress? Could I assassinate a noblewoman simply for being the antagonist's lover—or kill a brilliant scientist for serving the wrong cause? At one point, the game even gave me a choice between torturing a character and finding a way to bribe him into revealing sensitive information. I picked option B. It involved a little more effort, but it worked, and the character ended up joining the resistance later.
It's refreshing to see a video game story try for a little nuance and complexity, and I think Dishonored's makes a very respectable attempt. Of course, you don't have to be the good guy. The lethal approach is just as much fun as sneaking around, and judging by some of the videos on YouTube, it's possible to become quite a prolific killer. (I expect I'll start a second playthrough to try that out soon.)
If I had one complaint, it's that Dishonored's developers made their inspirations a little too obvious. The game looks an awful lot like Half-Life 2 in places. Yes, I know the art director is the guy who designed City 17, but it feels like he should have mixed it up a little more. Also, following the non-lethal path feels a bit too much like playing a Thief game. I'd have appreciated some new twists to character AI beyond the same old basics—multiple levels of alertness, pre-programmed patrol routes, and so forth—that are the bread and butter of seemingly every stealth action game.
Oh, and why the heck is the left mouse button bound to the weapon in your right hand, and vice versa? Come on.
We're on the outs, my iPhone 4 and I.
Oh, we had a long and beautiful honeymoon. The iPhone 4 was my first proper smartphone ever, and I was immediately in love. The effortless sliding of icons on the home screen, the silky smoothness of the inertial scrolling, and the razor sharpness of fonts at 326 PPI... it was dizzying. I loved the iPhone for its brains, too. Anywhere, anytime, it gave me instant access to everything—maps, e-mail, navigation, music, Twitter, Facebook, news, weather, e-books, you name it. The list went on and on, and the convenience factor was off the charts.
But time started to take its toll on my beloved. The iPhone 4 began to feel more sluggish than before. I started to notice more hitching in the user interface. My usage patterns hadn't changed—I still mostly checked my e-mail, kept up on my RSS feeds, found my way around in the Maps app, and wasted time on Facebook. However, Apple kept piling on new feature after new feature in successive iOS releases, and each one seemed to make my iPhone 4 feel slower and more dated.
So I started itching for an upgrade. I patiently waited for Apple to introduce the iPhone 5. I mean, what else was I going to get? My girlfriend made the mistake of buying an Android phone a few months after I got my iPhone 4, and I saw first-hand the problems with that device: mediocre industrial design, a low-quality camera, and worst of all, no support for software updates past Android 2.2. My few, short brushes with Android 4.0 on other phones weren't too encouraging, either. As for Windows phones, as much as I like Metro in a mobile context, I wasn't thrilled with any of the devices out there—or the integration with Bing.
Alas, my desire for an iPhone 5 instantly vanished as soon as I saw the mess Apple made with iOS 6. Apple's new Maps app was the biggest and most appalling mistake from that release. It stripped away two major features upon which I relied heavily: public transit directions and Google Street View. And that's not even the worst part. Some information was just plain missing. Some places were mislabeled. And the satellite imagery was awful, at least for my city. Folks in other places have had no better luck.
To replace the missing public transit functionality, I had to buy a third-party app, TransitTimes+, which set me back $3.99. To its credit, TransitTimes+ offered me a lot of things the Google-powered Maps app didn't, like individual bus routes and nearby departure times. But it also didn't do some of what the Google app used to do, and I found it a chore to launch from the neutered Maps app. After spending a fair amount of time with it, I still longed for the old, Google-powered experience.
I thought about toughing it out for a few months until the Google Maps app comes out for iOS. If that turns out to be any good, then I can grab the iPhone 5. But thinking about that option, I realized something.
Google clearly has the better mapping software. I also think there's no question that Chrome is a better browser than Safari on the iPhone. Chrome handles tabbed browsing more elegantly, and it syncs with my PC, while Safari on iOS can only sync with Safari on OS X (the Windows version isn't available anymore) and Internet Explorer (which I don't use). Meanwhile, I use Gmail for both personal and work e-mail, and the iOS Mail app still doesn't properly support all of Gmail's features, like push notifications and Priority Inbox.
So, if the Google apps and services are better, then why even bother with iOS? Why not just get a Google phone?
I've been keeping an eye on Android, and with version 4.1 (a.k.a. Jelly Bean), the operating system finally seems to have received the polish and responsiveness that prior releases so sorely lacked. Android reminds me a little of Windows in its early days. It feels like Google iterated and iterated until, through sheer brute force, it started outclassing the previously superior solution. Mac OS began looking dated next to Windows in the late 90s, and it feels like iOS is starting to look a little crummy next to Android nowadays. Apple has excellent hardware, but I keep watching demos of Android 4.1 and thinking, why doesn't my iPhone do that?
Of course, the Android handset market is a veritable minefield of mediocrity—much like the PC clone market in the late 90s. Perhaps the best Android phone out there right now is Samsung's Galaxy S III, but like most Android smartphones from major manufacturers, it has a custom user interface and custom apps layered on top of the Google OS. Samsung calls those customizations TouchWiz, and I'm not a fan of them. I think TouchWiz icons and UI widgets look cheap, and what I've seen of Samsung's custom apps hasn't impressed me. Custom software layers also increase the potential for security vulnerabilities—like the recent TouchWiz exploit that allowed Galaxy phones to be wiped remotely.
What I want, then, is a good, solid phone that runs the stock version of Android 4.1.
My friends tell me I can root the Galaxy S III and install a TouchWiz-free version of Android on it. I considered that option, but you know what? I don't want to do that. I don't want to have to jeopardize my warranty, spend hours digging through guides and downloading custom firmware, hoping throughout the whole process that I don't brick my phone. I don't want to have to worry about reinstalling the default firmware and resetting flash counters if I need to return the device for warranty service. I want a handset that, out of the box, looks and works just the way I want it to—like my iPhone 4 back in 2010.
That requirement leaves me with pretty much only one option: get a Google Nexus phone. Problem is, the Galaxy Nexus is already about a year old, and its replacement hasn't arrived yet. There have been leaked pictures of an LG Nexus handset purportedly due out in the near future, and I may well get that one. However, I'm not in love with the industrial design portrayed in the leaked shots, and I have no idea if the camera or the display are any good. I like the feel of the Galaxy Nexus' curved, textured back, but the LG Nexus' back will apparently be flat and smooth. Too bad.
My indecision has left me idly wondering if, maybe, I shouldn't get a Windows Phone 8 device. I'm probably going to upgrade my desktop PC to Windows 8, and I like the idea of running a scaled-down version of it on my phone. Microsoft has said the two operating systems will have a lot of common code, and porting apps between the two should be feasible with very little work. If all goes as planned, I might be able to run downscaled versions of desktop apps on my phone, and vice versa. I expect there would be some amount of synchronization going on, as well, so my settings and preferences would carry over from my PC to my phone.
Unfortunately, I don't really like the few Windows Phone 8 devices that have been announced so far. The Lumia 920 looks too fat, and as with the HTC 8X, garb choices include "black" and "several garish colors you're going to get sick of within six months." Beyond that, I worry about the software. Bing Maps and Internet Explorer 10 may work just fine, but if I'm ditching Apple because its apps aren't as good as Google's, then why would I even bother with Microsoft? Plus, even if I stopped worrying and learned to love Bing Maps, I'd still be using an underdog platform with fewer third-party apps than Android or iOS.
That, folks, is my smartphone conundrum for 2012.
I hate being in this situation. Back in 2010, the iPhone 4 was clearly the best phone to get. Sure, there were hiccups with the antenna, but I never experienced those—and I got a free bumper out of the resulting scandal. Today, I'm left scratching my head and wondering what the heck to buy. Maybe that's a reflection on the smartphone market's maturity. Maybe it just means there are too many good phones to choose from. But I'm a little more pessimistic; I think it means Apple has gotten too complacent, and because of that, there are no clear winners anymore.