Ceci est un blog

I choose to be spied on
— 11:27 AM on July 17, 2013

I choose to trust Google.

I choose to trust Facebook, Microsoft, and Apple. I choose to give those companies access to all my data—every picture I take, every e-mail I send, and every document I save online. They get my vacation photos and birthday wishes, and all the Skype calls I make with family members and coworkers. I could use the phone, but why bother? My phone calls are recorded, too.

I choose to trust the government. Not just the government of Canada, where I live, but also the governments of the United States, the United Kingdom, France, and other western countries that monitor online communications. I know that fighting terrorism is just a pretext; I know there are a million reasons to keep tabs on citizens, because knowledge is power, and power is irresistible. But I choose to trust that that knowledge won't be used to blackmail me, to detain me indefinitely, or to get me to inform on my friends, family, and coworkers. That kind of thing went down in East Germany, but Canada and the U.S. aren't East Germany. Our countries are free and democratic and governed by the rule of law, and free nations never become un-free.

Except sometimes.

The NSA's National Security Operations Center Floor. Source: NSA.

I choose not to hear too much. When I see the latest leak about how my data is harvested, or about how the government coerces businesses into collaborating, I read the headline, sigh, and move on. I know how the rest of it goes. I know things are ugly. But the more I know, the more I hate myself for my own inaction. Each news story is a reminder that I've been robbed of my privacy and that I've done nothing to take it back. I should close my Facebook account, encrypt all my communications, and disable iCloud on my iPhone... but what's the point? They can keep my data forever. And one day, they'll be able to decrypt anything.

Thankfully, there aren't that many stories. There aren't that many reminders. In the news, I mostly hear about Edward Snowden. What was his girlfriend like? Will he seek asylum in Venezuela? I hope they don't catch him. I think he's a good guy. I'd hate to see him rot in prison for the rest of his life.

I choose to go on with my life. I'm a busy man: I have a job, a girlfriend, and hobbies. I have movies to see and cable shows to watch. These flagrant incursions on my privacy don't affect the way I live, because for the most part, I'm still free to say what I want and to do what I want. When you have a job and a home and a flat-screen TV, complacency is always the easiest course of action—even when important ideals need safeguarding.

I choose to leave it up to others. The enormity of it all, the way it's all coming out in the open, makes me hopeful that someone, somewhere will do something. Maybe congressmen and MPs will stand up for my rights. That's what they're supposed to do, isn't it? Or maybe activists will march in the streets, wave flags, shout slogans, and wash pepper spray out of their eyes until the NSA and its Canadian and British and French counterparts are neutered or dismantled. If enough people started marching, I would probably join them. But I wouldn't go down there all by myself.

I choose to wait. Subtle changes are always subjectively nonthreatening, whether it's the oceans rising or the Internet turning from a wild frontier into a mass surveillance tool. There will be plenty of time to solve it tomorrow. Or next Wednesday. Or next year. I haven't been personally inconvenienced yet, so what's the rush? Wait, hold on... I think Futurama is on tonight.

Finally, when it's late at night and I can't sleep, I choose to feel hopeless. Because I understand technology. I've been using technology, thinking about it, and writing about it for most of my life. I know what it can do and what it shouldn't. I of all people should be getting royally, supremely worked up about all this.

But I'm not.

And if I'm not, then who is?


At WWDC 2013, Apple showed its new soul
— 11:59 PM on June 11, 2013

There was something different about Apple during yesterday's WWDC keynote. I couldn't put my finger on it at first.

Part of it was the glitchy developer demo at the start of the keynote. Right around the time Tim Cook should have been waxing poetic about Apple's accomplishments, two scruffy guys from a small robotics start-up took the stage and horsed around with toy cars.

Then came Craig Federighi. Apple's new software chief made the OS X Mavericks and iOS 7 demos come alive. He joked around with the audience and poked fun at himself. He came across as warm and personable—the polar opposite of Scott Forstall and Bertrand Serlet, the former iOS and OS X gurus, who always exuded cold intensity and rarely, if ever, strayed from their rehearsed remarks.

And then there was the new Designed by Apple in California campaign. After years of squeaky-clean, product-centric ads, Apple ditched the white backdrops, the oversaturated colors, and the catchy indie hits. It gave us honesty and emotion, and it tried to communicate something profound about its identity.

I think I know what's happening: Apple is growing a new soul. It's growing a new identity based not on one man's ego, but on human ideals we can all connect with.

It did something similar in 1997 with the Think Different campaign, which changed Apple's image from that of a dying PC maker to that of a champion for idealism. But then Jobs fused his identity with Apple's, and there was no longer a need for the "Think Different" credo. From 1997 until October 5, 2011, Jobs was the soul of Apple. He shaped the business, vetted the products, and stood on stage alone to introduce everything that came out of the company. He made Apple seem like a flawless machine whose only purpose was to bring his vision to fruition.

When he passed away, Apple became soulless. The people Jobs had hired were still there, and so were the products he had helped create. So, too, was the design and management infrastructure he had put in place. But there no longer seemed to be anything holding it together. The loss was so great that, after just a few months, people began to wonder if Apple had lost its way. They wondered this even as Apple continued to carry out Jobs' plan and to release the products he had vetted. Because he was gone, the magic was gone.

Apple stayed in this uncomfortable limbo for 20 months. Then, at WWDC 2013, we saw it finally fill the void left by its founder's death. There was no bold talk of corporate restructuring or rebranding. Rather, the new ad campaign, the tweaked keynote style, and Federighi's antics showed a side of the company we'd never seen before—a human side, a relatable side that might have been stifled by Jobs' perfectionism and arrogance before. Watching the keynote, I felt like Apple had gotten a new lease on life. The company seemed emboldened by its founder's legacy yet free from the weight of his influence.

And all it said was, "We are Apple. This is who we are."

Companies without soul can prosper. Firms like Microsoft and ExxonMobil post healthy profits and, for the most part, delight their investors. But nobody feels a personal connection to them. I think Apple came dangerously close to following those companies down that dark and dreary road. However, I think Tim Cook and his team were perceptive enough to steer clear of it and, once again, imbue Apple with human qualities. Those aren't the qualities of the old Apple—charisma and persistence and arrogance. They're new qualities like warmth, playfulness, devotion, and humor.

That's the sense I'm getting from yesterday's keynote, anyway. The new Apple may never be like the old Apple, but from what I saw, it could turn out even better.


An ode to the Kindle Paperwhite
— 6:56 PM on May 2, 2013

Oh Paperwhite, my sweetest Paperwhite
In softness clad, and black as moonless night
Submerged in all thine words, I do thee clutch
And thou, to me, feel'st like a lover's touch

My eyes aloft o'er fonts as clean as dew
So easy 'tis to lose myself in you
Caecilia and Palatino, my dear
Did not before look quite so fine and clear

I loved thy sister, trusty Kindle Touch
But next to thee, my sweet, she lacks so much
Face sunken in and fussy as a bee
Could not tell cloth from skin, it saddened me

And when the sun at day's end went to rest
The words at once became as dim as west
So dark were they, I huddled by the light
of bulb, or tube, or candle in the night

One day I laid the words on fine a slate
A thing with bigger screen and greater weight
Yet soon my eyes grew weary from the glare
And seeing the faint reflection of my hair

But thee, O fairest reader of them all
Have no such flaws, no penchant to appall
When set all day upon thine front-lit face
Mine eyes do not fatigue, do not give chase
To chapter's end, to hasty epilogues
They seek the words like famished pollywogs

And though 'tis true you are not free of kink
One glowing edge, one corner tinted pink
Your screen, when dimmed to match the light around
Is fairer than what printed things abound

Except, perchance, a Gutenberg Bible
But paperbacks were never as noble

*  *  *


The PC is booming—just not the PC we know
— 11:05 PM on April 18, 2013

So, you heard the news: PC sales are tanking. Apparently, nobody wants to buy Dells or HPs anymore; nobody cares about clunky laptops and bulky mid-towers. People haven't necessarily stopped using them—aging PCs are still humming along in bedrooms, living rooms, and offices everywhere. It's just that those machines aren't getting replaced. Instead, people are spending their hard-earned dough on what analysts call post-PC devices: smartphones, tablets, phablets, and so on.

The PC industry is scrambling to adapt. Microsoft has retooled Windows into a weird hybrid that straddles post-PC tropes and legacy conventions. Laptop makers are bending over backwards to give us touch-enabled laptops that double as tablets. Everyone is working overtime to put a new spin on old concepts... and by all accounts, it isn't working. PC shipments suffered their greatest decline ever last quarter, in spite of Windows 8 and all those tablet-notebook hybrids.

Some say there's no hope, but I disagree. Because the PC is booming—just not the PC we know.

Source: German Federal Archive.

What is a PC? The initials stand for "personal computer." According to Merriam-Webster, a personal computer is a "general-purpose computer equipped with a microprocessor and designed to run especially commercial software (as a word processor or Internet browser) for an individual user."

In the 90s, when I was growing up, personal computers were few and far between. To record TV shows, we used VCRs and VHS tapes. To share music with friends, we spent hours copying cassette tapes. To get in touch with friends and relatives, we used a land line—or, if textual communication was more up our alley, we wrote a letter. By hand. With a pen and paper.

The only kind of personal computer you could buy was a big beige box with a matching CRT monitor. Or, if you were loaded, you could get a laptop with a crappy passive-matrix LCD and a trackball wedged in the palm rest. These were toys of the privileged. We, the geeky elites, used them to play Doom and to rack up preposterous phone bills surfing AltaVista and GeoCities on 14.4K modems. A PC was a badge of pride. Finding someone else who knew about the Internet was sufficient to spark a lasting friendship.

If you'd fallen into a coma in 1993 and awoken today, you'd realize that personal computers are everywhere now. You'd probably notice the laptops and mid-towers at first, but then you'd start to see the phones, the tablets, the game consoles... and you'd think, aren't they all the same thing? Aren't these all general-purpose computers equipped with processors and designed to run commercial software? Sure, some of them are a little more locked down than the IBM clones of old, but that's nothing a little jailbreaking or rooting won't fix. Hackers can still write their own software. Not only that, but they can also package that software, ship it, and make money from it with very little effort or risk. It's a far cry from the days of shareware trials on 1.44MB floppy disks.

Two decades ago, having the tools to play video games, to get on the Internet, and to write crappy BASIC programs made us special. Now that personal computing has grown into something so exceedingly ubiquitous, we feel like we're not so unique anymore. Some of us see the PC's evolution as a corruption of something precious—but I don't think it is. The basic formula that made PCs great 20 years ago is still there, more or less intact, in today's post-PC devices. You just have to look past the drastically different packaging and realize that modern personal computers are more beautiful, more versatile, and easier to use than they've ever been.

I still spend a lot of time in front of an old-school desktop PC. I wouldn't dream of giving it up. The thing is, though, my PC is big, heavy, difficult to operate, and required for serious work—Photoshop, Excel, web design, text editing, you name it. There's nothing terribly personal or cozy about it. If you think about it, today's tablets and phones fulfill the PC's original mission—making personal computing available to the masses—far more elegantly than this thousand-dollar workstation.

Because that's really what this is: a workstation. And that's really what most of today's traditional PCs are. They're workstations with multiple processor cores, Windows NT-based operating systems, and copious amounts of storage and memory.

There's nothing wrong with that. Workstations will always be needed, because there will always be work to do. But we shouldn't pretend that the PC is somehow dying because people aren't buying workstations they no longer need. The PC isn't dying, because today's real PCs are in our pockets. We're buying more of them than ever, and they're doing more for us than i486 Compaqs ever did.


Modern shooters and the atrophy of fun
— 9:07 AM on April 5, 2013

I finished BioShock Infinite last weekend. It's probably one of the best shooters I've ever played.

I loved the cleverness of the storyline, the expressiveness of the characters, and the unique beauty of the graphics. I loved how the game got so immersive that, during late-night sessions, I almost felt like I really was fighting my way through Columbia—like I really was trying to tear a young woman named Elizabeth from the clutches of her fanatical, despotic father.

I loved everything about BioShock Infinite. Except for the gameplay.

Don't get me wrong. The exploration was great. Watching the story unfold was incredible. But the combat and looting got so repetitive—so downright boring—that I couldn't stand to play more than a couple of hours at a time. I got sick of digging through trashcans inexplicably filled with silver coins and of fighting wave after wave of enemies, each one indistinguishable from the last. The combat sequences blurred into one another, and I found myself praying for them to end, hoping that I could proceed without playing hi-fi whack-a-mole with steampunk guns and angry crows.

So why is BioShock Infinite one of my favorite shooters? Because the others are just as bad, if not worse.

Since the days of Doom and Quake, we've seen shooters take quantum leaps in graphics, writing, voice acting, and just about everything else—except for gameplay. Somehow, gameplay hasn't evolved. It hasn't gotten more fun or more engaging or more interesting. Instead, it's atrophied into a bland rut, to the point where big-budget shooters feel just like old light-gun arcade games (Virtua Cop, House of the Dead, and so on). Players are still stuck on rails, still made to gun down easy target after easy target, pausing only to reload and to watch cut scenes. Today's visuals and stories might be Oscar-worthy, but the interactivity still feels like tasteless filler.

Shooters could be so much more. Instead of trivializing combat, they could make fights less frequent, longer, and more memorable. They could reward players for acting rationally when outnumbered—hide, flee, or die. Shooters could, when appropriate, encourage problem-solving and exploration over brute force. Hell, why couldn't they have players decide how the story plays out? But no, that's all too much to ask. Studios and publishers seem to have forgotten that games are supposed to be games, not CG films with playable action scenes.

Things weren't always this way. I have very fond memories of System Shock 2, BioShock Infinite's spiritual pre-predecessor. I remember desperately scrounging for ammo, cowering in fear from even lone mutants, since I knew a fight might leave me badly wounded—and the noise might attract other creatures. I recall sneaking past enemies and reprogramming turrets to dispatch them so that I wouldn't lose precious health or bullets in combat. I can still recall the satisfaction I felt when, later in the game, I finally had enough upgrades to gun down monsters in one shot.

In System Shock 2, each enemy encounter was an event: memorable, frightening, dangerous, and sometimes exhilarating. Getting lost in the corridors of the Von Braun was part of the game, and it made the experience all the more immersive. There was a real sense that you, the player, had to use your own cunning and skill and sense of orientation to survive. Because of that, beating the game felt like a true achievement, and it made you want to start all over again. Nothing about it felt like watching a bad Michael Bay flick.

System Shock 2's formula should have been spread far and wide and polished to a mirror shine by now. But instead, after 14 long years, that formula has been largely forgotten. 

Source: GOG.com.

There's nothing dangerous or memorable about BioShock Infinite's gameplay. While players must still hunt for ammo, combat is so frequent that bullets are strewn everywhere, and picking them up feels like a chore rather than a relief. The anguish of an empty gun is nowhere to be found, either. Elizabeth replenishes your ammo supply during combat, and when things go south, dying causes you to be magically teleported to a safe location with spare magazines in your pockets. There's no longer any danger. Skill and cunning aren't really rewarded anymore.

Not even the game's "vigors" manage to spice up combat. Most of them basically do the same thing: stall bad guys for a few seconds and inflict a small amount of damage. Using a vigor at the right time can mean the difference between a cleared battlefield and a forced resurrection, but there's nothing hugely satisfying about the process. It just adds more steps to the tedium of depleting each enemy spawn point.

The most depressing thing about BioShock Infinite, though, is that it's actually one of the more original shooters out there. Compared to the endlessly multiplying Call of Duty clones, its gameplay is textured and tinged with depth and variety. While I was able to beat BioShock Infinite and derive pleasure from the experience, I've had to stop myself from playing war-themed shooters altogether. Their single-player campaigns are just awful. The last one I bothered to finish was Battlefield 3, and I hated everything about it.

I'm not sure who or what to blame. Maybe this is all an attempt to appeal to the lowest common denominator. Maybe game studios are so intent on catering to brain-dead 14-year-olds with Xbox 360s that they've lost sight of what makes games fun. If that's the case, then there may be little hope. It's entirely possible that the next crop of consoles will bring us unimaginably pretty games with sugar-free, decaffeinated gameplay that's as boring as ever. That would be even more soul-crushing than BioShock Infinite's failings.

Our only hope is that, eventually, even 14-year-olds will get sick of playing the same game over and over. They'll start to clamor for better games, where interactivity involves more than just pointing and aiming. And game developers will deliver. It might seem unlikely, but I remember being 14 quite well. It was around the time System Shock 2 came out, and I didn't toss that game aside for something with more instant gratification. I dug in, and I loved it.


The problem with Windows convertible tablets
— 10:56 PM on March 13, 2013

I've been spending a fair bit of time with Windows convertible tablets lately. I reviewed the Samsung ATIV Smart PC Pro 700T last week, and I've had the Asus VivoTab RT kicking around in my benchmarking lair for a few weeks. I'm also currently testing an Atom-based tablet: the VivoTab Smart, which combines x86 support with the slender profile and long battery life you'd expect from an ARM-based device.

Oh, and I've tried both versions of Microsoft's Surface. Not in my office, though—there was a Microsoft kiosk at the mall, and I stopped by while shopping for a Valentine's Day gift. Yes, I'm that romantic.

Anyhow, the longer I spend with these devices, the more I grow convinced that convergence à la Microsoft is an ill-tasting recipe. I articulated some of my reservations in the Samsung 700T review, where I stated:

There seems to be little overlap between what people do on tablets, which is mainly content consumption, and what people need full-featured notebook PCs for, which is productivity. . . . So, why must we have both on one machine? What's so compelling about having Facebook and Kindle apps on the same physical system as Office and Photoshop? Since the combination is fraught with compromise, why not get a great tablet and a great ultrabook rather than a less-than-great combination of the two?

TR's own Geoff Gasior had a reasonable answer to this: because carrying one device is better than carrying two. And hey, I totally get that. My problem is that saving room in my backpack does me little good if it means passing up the best tool for the job. From my experience so far, Windows convertible tablets are rarely—if ever—the best tools for the job.


Think about it. What do you want from a tablet? You want plenty of quality apps to choose from, including games. You want a great display, long battery life, and something that's thin and light. You also want a device that's both fast and easy to use, because content consumption is no fun if waiting and troubleshooting are involved.

Win8 and WinRT systems just don't deliver there. Good Modern UI games and apps are still pitifully few in number. (There's no Flipboard, Feedly, or Google Currents. No HBO GO or BBC iPlayer. No Yelp, and no official Facebook client.) The handful of Windows tablets that are thin, light, and endowed with long battery life—those with Atom and ARM-based processors—all seem to have ugly, low-resolution displays. (1366x768 is just downright sinful on a tablet screen.) As for speed and ease of use, WinRT slates take forever to launch apps, and while Atom tablets strike a passable balance between performance and power efficiency, ease of use remains a concern. One must still put up with the awkward marriage between Modern UI and the desktop, not to mention the questionable design choices within the Modern UI environment itself.

Okay, now what do you want from a good laptop? This is a system you're going to be using for productivity, so you want it to be fast. You want a great keyboard and touchpad, because controlling Windows 8's desktop environment with a touch screen is an exercise in frustration. If this is a productivity machine, chances are you want more than 11.6 inches of screen space. Don't get me wrong; small, highly portable notebooks are great. Photo editing on a thimble-sized display, however, is not. Neither are the cramped keyboards and truncated touchpads that invariably accompany smaller screens.

Windows convertibles also fail to deliver in this department. For them to double as half-way decent tablets, convertibles must sacrifice desktop performance and capabilities in one way or another—either with plodding performance, like the Atom-based systems, or with a "let's pretend" desktop courtesy of Windows RT. They must restrict themselves to 11.6" or smaller screen sizes, as well, which inherently compromises the keyboard and touchpad arrangement. Worse, that compromise is often more dire than it ought to be. The Surface's Touch Cover is just plain awful (try touch-typing on the thing, I dare you), and the Samsung 700T's fickle and undependable touchpad really disappointed me.

Ideally, Windows convertible tablets should offer the best of laptops and tablets, all in a single device. They should, but they do not. Current offerings feel more like crappy tablets rolled into crappier notebooks—jacks of all trades, masters of none, with good design sense and usability discarded in the name of convergence.

What does that convergence get you?

Well, you can store all your music, photos, and personal files on a single device. That's nice, I suppose. Then again, cloud storage is starting to make that convenience a little old-fashioned. I don't carry very much music on my phone, for example, because I don't have to. When I want to listen to something that's absent from the device, I simply grab it through iCloud over the LTE connection. (And no, being an Apple-worshipping metrosexual isn't a prerequisite. Google and Amazon run similar services.)

What else? We've already addressed the saving-space-in-your-backpack thing, and I think the downsides of convergence make that a lopsided bargain. That leaves one major advantage: cost. Buying a convertible tablet is cheaper than springing for a separate tablet and notebook, isn't it? If you're strapped for cash, convergence must be a pretty solid proposition.

The price difference isn't as big as you'd expect, however. A Nexus 10 will set you back $400; an iPad, $500. An entry-level ultrabook can be had for $650, and a good one will cost about $1,000. Now look at Samsung's ATIV Smart PC Pro 700T—a fine example of a Windows 8 convertible with ultrabook-class performance, which is precisely what you need if you aren't buying a separate laptop. It costs almost $1,200 at Newegg. That's $150 more than the Nexus 10 and the inexpensive ultrabook, and only $300 cheaper than the iPad and deluxe ultrabook combo.
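(If you want to check the math, here's a minimal back-of-the-envelope sketch in Python, using only the early-2013 street prices quoted above; the exact figures will obviously drift over time.)

# Back-of-the-envelope price comparison using the figures quoted above.
nexus_10 = 400           # Nexus 10 tablet
ipad = 500               # iPad
entry_ultrabook = 650    # entry-level ultrabook
good_ultrabook = 1000    # a "good" ultrabook
samsung_700t = 1200      # Samsung ATIV Smart PC Pro 700T, roughly, at Newegg

cheap_combo = nexus_10 + entry_ultrabook    # $1,050
deluxe_combo = ipad + good_ultrabook        # $1,500

print(samsung_700t - cheap_combo)    # 150 -> the convertible costs $150 more than the cheap pair
print(deluxe_combo - samsung_700t)   # 300 -> and only $300 less than the deluxe pair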

Now, why on Earth would you settle for the worst of both worlds?

Don't get me wrong; convergence can be done well. Smartphones are touch-based computers converged with mobile phones, and they're a great example of the concept taken to the right place. In that instance, though, convergence works because people don't want their pockets weighed down with extra hardware. In your trouser pockets, every ounce and every cubic inch counts. That's why nobody seems to mind that smartphones have pitiful battery life compared to basic cell phones. The benefits of convergence—having a little, Internet-connected computer, media player, and gaming console in your pocket—far outweigh the inconvenience of having to charge up every night.

I don't think you can make a strong case for convergence between tablets and notebooks. You don't carry those devices in your pocket. You carry them in a backpack, a briefcase, or a messenger bag, and so it doesn't really matter whether you're hefting a tablet and an ultrabook or a tablet and a removable keyboard dock. There's a small weight and thickness difference, but it doesn't amount to very much. The Samsung 700T weighs 3.54 lbs when docked. Put together, the iPad and deluxe ultrabook we talked about weigh 4.3 lbs. We're talking about a 12-ounce disparity, which is nothing compared to the nuisance of having to carry both a phone and a PDA in your pockets.

Of course, none of this means successful notebook-tablet convergence is unachievable. Once the hardware delivers ultrabook-class performance in the power envelope of a Tegra 3, and once Modern UI is sufficiently polished, fine-tuned, and loaded with great third-party apps, then I expect we'll see some excellent convertibles—devices good enough to make me ditch my iPad and my laptop. Perhaps all it will take is the next generation of processors—Haswell, Bay Trail, and Temash. Or maybe only Windows 9 and next year's hardware innovations will bring us there.

Or maybe it will take even longer than that.

For now, though, I'll keep watching Windows convertibles as I always have: with a mixture of curiosity and disappointment.


On the marginalization of consumer laptops
— 12:33 AM on January 18, 2013

You probably saw that Gartner report earlier this week about the sluggishness of PC shipments last quarter. Shipments were so sluggish, according to Gartner, that they shrank by almost 5% compared to the same quarter in 2011. I'm sure there were many factors at play, but Gartner pins the blame on one in particular: users relinquishing PCs for daily use.

Whereas as once we imagined a world in which individual users would have both a PC and a tablet as personal devices, we increasingly suspect that most individuals will shift consumption activity to a personal tablet, and perform creative and administrative tasks on a shared PC. There will be some individuals who retain both, but we believe they will be exception and not the norm. Therefore, we hypothesize that buyers will not replace secondary PCs in the household, instead allowing them to age out and shifting consumption to a tablet.

I'm a PC enthusiast, and chances are you, the reader, are as well. We might therefore find it hard to imagine folks ditching their computers for comparatively limited tablets. I mean, you can't do much on a tablet, can you? Most of them lack Flash support, make multitasking awkward at best, and don't play terribly well with keyboards. I use mine for e-book reading and some light gaming, but I would never dream of taking it to a trade show instead of a laptop. No way.

Yet Gartner's suspicion is truer than you might think—and as it happens, I have some very convincing anecdotal evidence to support it.

I bought my girlfriend an iPad 4 for Christmas. Well, technically, we went to the Apple Store and picked it out together. Aline chose the base Wi-Fi model (in white) and a matching SmartCover (in pink) for a total of $538 U.S. before tax. She unwrapped everything a couple of days early (because waiting sucks), played with it some, and then promptly stashed away her notebook PC—a relatively speedy 13" machine with a Trinity APU and Windows 8.

The laptop has been sitting under her desk ever since. She hasn't switched it on in almost a month. Not once.

And really, the substitution makes perfect sense, if you think about it from her perspective. The iPad has a great many advantages over a cheap consumer laptop:

  • It's fast. Aline's laptop wasn't slow by any means, but many consumer notebooks are. The iPad isn't. The iPad's user interface feels snappy and responsive. Apps load quickly, and you rarely get the sense that you're waiting on the device. Part of the credit goes to Apple's excellent A6X processor, but a big part goes to iOS, which is expertly tailored to run smoothly on the low-power hardware. The same can be said for a lot of third-party iOS apps. Using a brand-new iPad is just a very satisfying experience all around.
  • It's cool and quiet. The iPad has no fans to whine and moan at you when you're running games or watching online videos. It never gets uncomfortably hot to the touch. It never scalds your thighs. Getting those same perks from a cheap PC laptop is difficult if not impossible. During Netflix marathons, Aline usually had to prop her Trinity laptop atop an Amazon box to keep her thighs cool. She did the same with her old Intel CULV ultraportable before that. The iPad can be cradled comfortably in her arms no matter what it's running.
  • It's supremely portable. At 1.44 lbs, the iPad is lighter than virtually any notebook south of $1,000. And since it's just one big, super-thin screen with some hardware glued to the back, you can use it comfortably anywhere—on the couch, in bed, on an airplane, and even in the john. (Or so I hear. Ahem.)
  • It actually has all-day battery life. Notebook vendors have promised us all-day battery life for years now, and they keep falling short more often than not. A fancy ultrabook might get you seven or eight hours, but cheaper systems aren't even close. The iPad, meanwhile, stayed up for over 12 hours in our web surfing test. With a device like that, there's no need to worry about running out of juice or sitting near an outlet during use. Charge it overnight every other day or so, and you're good. Heck, you don't even have to shut it down, because its standby time is preposterous—something like a month, according to Apple.
  • It doesn't get gross. Have you seen a consumer laptop after a few months of use? It's a mess: crumbs in the keyboard, gunk on the screen, finger oil on the touchpad, food stains on the palm rest, and so on. The iPad doesn't get anywhere near that filthy. All of the action happens on the touch screen, which is easy to wipe clean with pretty much any piece of non-abrasive cloth. I usually wipe mine on my t-shirt. The back doesn't really get dirty, either, because it's just a slick sheet of anodized aluminum. The buttons and connectors might gather a little lint or miscellaneous gunk here and there, but that's nothing compared to a well-loved notebook PC.
  • It runs all the games you can buy for the platform. Intel's integrated graphics have made some huge strides over the past few generations, but let's face it: laptops without discrete GPUs are still iffy for gaming. If you're a neophyte, there's no telling whether or not a game will run well. That isn't a problem on the iPad. Every title you can buy or download runs smoothly, and there are some shockingly pretty ones out for iOS. Sure, triple-A games aren't available—but you can't really run Far Cry 3 on an Intel IGP, anyway.
  • It makes consuming content delightfully easy. Everything is right there, a few finger taps and swipes away: movies, TV shows, music, e-books, comics, magazines... Even the web is more fun to browse on a tablet than on a laptop with a crappy touchpad. (And yes, most laptop touchpads are still really crappy.)
  • It takes data loss out of the equation. Hardware failures happen. So do thefts and accidental damage. Those events can mean losing years' worth of data with a consumer laptop, but not so with the iPad. If the device breaks, you can just go to the Apple Store, get it swapped out, and reload your backup from iCloud. Your software and data will be pretty much just like you left them.
  • It looks pretty. People like beautiful objects. If we didn't, we wouldn't have invented jewelry, Art Deco, and German cars. PC laptops have gotten a lot prettier in recent years, but for the most part, they're still pretty ugly. The iPad looks gorgeous by comparison—especially with a matching SmartCover. It doesn't hurt that iOS's candy-coated icons are a lot prettier than Windows 8's drab tiles—or that text and graphics are razor-sharp on the Retina display.

For the price of the iPad and SmartCover, Aline could have snagged an Asus VivoBook X202E, which is selling for $549.99 at Newegg right now. I had a chance to play with that pseudo-ultrabook before Geoff got to work on his review, though, and I wasn't impressed. The thing is abysmally slow, has a really ugly screen, and seems to run its fan continuously, even at idle. Geoff measured the battery life at four hours, which sort of sucks. Overall, I found it unpleasant and frustrating to use.

Sure, the X202E runs things the iPad cannot—things like Word, Excel, Photoshop, and a full-featured operating system with proper file management. If you need to do real work, then there's no substitute for a real laptop (although you'd be surprised how much an iPad can do with a Bluetooth keyboard and Apple's iLife apps). The thing is, however, most consumers already have an old PC they can use to write resumes or telecommute. Why should they buy a new laptop when a tablet can serve their other needs so much better?

I can't think of a good argument.

When the iPad came out in early 2010, I thought of it as a nifty companion device for folks who already owned smartphones and laptops. Tablets seemed, in short, like gadgets for the technologically privileged—cool but unnecessary. Yet in three short years, these new devices have become something else altogether. In a very real sense, they've become highly compelling replacements for consumer laptops in non-productivity usage scenarios. That's exciting... and, frankly, a little scary.
