My PC is too big. Much too big. I'd always vaguely suspected it, but testing Corsair's Obsidian Series 350D case earlier this week made it quite clear.
My PC is full of air and unoccupied slots and bays. I have four 5.25" optical drive bays that I don't use. The top one houses a DVD burner, but I can't remember the last time I stuck a disc in it. I moved to Canada over three years ago, and I'm positive that I've never purchased a blank DVD in this country.
Half of the expansion slots on my motherboard are set dressing. I only have a dual-slot graphics card and a sound card. In fairness, I use five of my six hard-drive bays—but that's because I'm still holding on to old drives, including a 320GB WD Caviar SE16. If I were to build a new system today, I would probably need just two 3.5" bays, with one 4TB hard drive in each. Add a 2.5" solid-state drive for my OS and applications, and I'd be set.
I'm sure I'm not alone. In fact, I'm willing to bet the vast majority of PC gamers and enthusiasts out there have just as much empty space in their PCs. Oh, don't get me wrong; leaving room for upgrades is fine. However, in the age of laptops, iPads, and smartphones, it seems a little strange that we should all have humongous mid-tower PCs full of air.
Over the past few days, I've been trying to picture what a modern desktop PC ought to look like. We could redesign everything completely, of course—introduce new form factors all over the place and wind up with something close to perfection. However, I think we can already improve things greatly with a few simple, practical steps:
That's about as far as I've gotten just now, but I'm sure there are other things we could do. And I'm sure you folks have ideas, too.
The broader point, though, is that desktop PCs could use a makeover. With just a handful of good initiatives, and maybe a new standard or two, we could make desktop PCs substantially simpler to build, more straightforward to use, and easier to carry around. Not every enclosure needs built-in cabling for everything plus a dozen front-panel ports, but we should at least offer those options. The easier it is to build a PC, the more people will do it, and the better the industry will be.

I choose to be spied on
I choose to trust Google.
I choose to trust Facebook, Microsoft, and Apple. I choose to give those companies access to all my data—every picture I take, every e-mail I send, and every document I save online. They get my vacation photos and birthday wishes, and all the Skype calls I make with family members and coworkers. I could use the phone, but why bother? My phone calls are recorded, too.
I choose to trust the government. Not just the government of Canada, where I live, but also the governments of the United States, the United Kingdom, France, and other western countries that monitor online communications. I know that fighting terrorism is just a pretext; I know there are a million reasons to keep tabs on citizens, because knowledge is power, and power is irresistible. But I choose to trust that that knowledge won't be used to blackmail me, to detain me indefinitely, or to get me to inform on my friends, family, and coworkers. That kind of thing went down in East Germany, but Canada and the U.S. aren't East Germany. Our countries are free and democratic and governed by the rule of law, and free nations never become un-free.
I choose not to hear too much. When I see the latest leak about how my data is harvested, or about how the government coerces businesses into collaborating, I read the headline, sigh, and move on. I know how the rest of it goes. I know things are ugly. But the more I know, the more I hate myself for my own inaction. Each news story is a reminder that I've been robbed of my privacy and that I've done nothing to take it back. I should close my Facebook account, encrypt all my communications, and disable iCloud on my iPhone... but what's the point? They can keep my data forever. And one day, they'll be able to decrypt anything.
Thankfully, there aren't that many stories. There aren't that many reminders. In the news, I mostly hear about Edward Snowden. What was his girlfriend like? Will he seek asylum in Venezuela? I hope they don't catch him. I think he's a good guy. I'd hate to see him rot in prison for the rest of his life.
I choose to go on with my life. I'm a busy man: I have a job, a girlfriend, and hobbies. I have movies to see and cable shows to watch. These flagrant incursions on my privacy don't affect the way I live, because for the most part, I'm still free to say what I want and to do what I want. When you have a job and a home and a flat-screen TV, complacency is always the easiest course of action—even when important ideals need safeguarding.
I choose to leave it up to others. The enormity of it all, the way it's all coming out in the open, makes me hopeful that someone, somewhere will do something. Maybe congressmen and MPs will stand up for my rights. That's what they're supposed to do, isn't it? Or maybe activists will march in the streets and wave flags, shout slogans, and wash pepper spray out of their eyes until the NSA and its Canadian and British and French counterparts are neutered or dismantled. If enough people started marching, I would probably join them. But I wouldn't go down there all by myself.
I choose to wait. Subtle changes are always subjectively nonthreatening, whether it's the oceans rising or the Internet turning from a wild frontier into a mass surveillance tool. There will be plenty of time to solve it tomorrow. Or next Wednesday. Or next year. I haven't been personally inconvenienced yet, so what's the rush? Wait, hold on... I think Futurama is on tonight.
Finally, when it's late at night and I can't sleep, I choose to feel hopeless. Because I understand technology. I've been using technology, thinking about it, and writing about it for most of my life. I know what it can do and what it shouldn't. I of all people should be getting royally, supremely worked up about all this.
But I'm not.
And if I'm not, then who is?

At WWDC 2013, Apple showed its new soul
There was something different about Apple during yesterday's WWDC keynote. I couldn't put my finger on it at first.
Part of it was the glitchy developer demo at the start of the keynote. Right around the time Tim Cook should have been waxing poetic about Apple's accomplishments, two scruffy guys from a small robotics start-up took the stage and horsed around with toy cars.
Then came Craig Federighi. Apple's new software chief made the OS X Mavericks and iOS 7 demos come alive. He joked around with the audience and poked fun at himself. He came across as warm and personable—the polar opposite of Scott Forstall and Bertrand Serlet, the former iOS and OS X gurus, who always exuded cold intensity and rarely, if ever, strayed from their rehearsed remarks.
And then there was the new Designed by Apple in California campaign. After years of squeaky-clean, product-centric ads, Apple ditched the white backdrops, the oversaturated colors, and the catchy indie hits. It gave us honesty and emotion, and it tried to communicate something profound about its identity.
I think I know what's happening: Apple is growing a new soul. It's growing a new identity based not on one man's ego, but on human ideals we can all connect with.
It did something similar in 1997 with the Think Different campaign, which changed Apple's image from that of a dying PC maker to that of a champion for idealism. But then Jobs fused his identity with Apple's, and there was no longer a need for the "Think Different" credo. From 1997 until October 5, 2011, Jobs was the soul of Apple. He shaped the business, vetted the products, and stood on stage alone to introduce everything that came out of the company. He made Apple seem like a flawless machine whose only purpose was to bring his vision to fruition.
When he passed away, Apple became soulless. The people Jobs had hired were still there, and so were the products he had helped create. So, too, was the design and management infrastructure he had put in place. But there no longer seemed to be anything holding it together. The loss was so great that, after just a few months, people began to wonder if Apple had lost its way. They wondered this even as Apple continued to carry out Jobs' plan and to release the products he had vetted. Because he was gone, the magic was gone.
Apple stayed in this uncomfortable limbo for 20 months. Then, at WWDC 2013, we saw it finally fill the void left by its founder's death. There was no bold talk of corporate restructuring or rebranding. Rather, the new ad campaign, the tweaked keynote style, and Federighi's antics showed a side of the company we'd never seen before—a human side, a relatable side that might have been stifled by Jobs' perfectionism and arrogance before. Watching the keynote, I felt like Apple had gotten a new lease on life. The company seemed emboldened by its founder's legacy yet free from the weight of his influence.
And all it said was, "We are Apple. This is who we are."
Companies without soul can prosper. Firms like Microsoft and ExxonMobil post healthy profits and, for the most part, delight their investors. But nobody feels a personal connection to them. I think Apple came dangerously close to following those companies down that dark and dreary road. However, I think Tim Cook and his team were perceptive enough to steer clear of it and, once again, imbue Apple with human qualities. Those aren't the qualities of the old Apple—charisma and persistence and arrogance. They're new qualities like warmth, playfulness, devotion, and humor.
That's the sense I'm getting from yesterday's keynote, anyway. The new Apple may never be like the old Apple, but from what I saw, it could turn out even better.

An ode to the Kindle Paperwhite
Oh Paperwhite, my sweetest Paperwhite
In softness clad, and black as moonless night
Submerged in all thine words, I do thee clutch
And thou, to me, feel'st like a lover's touch
My eyes aloft o'er fonts as clean as dew
So easy 'tis to lose myself in you
Caecilia and Palatino, my dear
Did not before look quite so fine and clear
I loved thy sister, trusty Kindle Touch
But next to thee, my sweet, she lacks so much
Face sunken in and fussy as a bee
Could not tell cloth from skin, it saddened me
And when the sun at day's end went to rest
The words at once became as dim as west
So dark were they, I huddled by the light
of bulb, or tube, or candle in the night
One day I laid the words on fine a slate
A thing with bigger screen and greater weight
Yet soon my eyes grew weary from the glare
And seeing the faint reflection of my hair
But thee, O fairest reader of them all
Have no such flaws, no penchant to appall
When set all day upon thine front-lit face
Mine eyes do not fatigue, do not give chase
To chapter's end, to hasty epilogues
They seek the words like famished pollywogs
And though 'tis true you are not free of kink
One glowing edge, one corner tinted pink
Your screen, when dimmed to match the light around
Is fairer than what printed things abound
Except, perchance, a Gutenberg Bible
But paperbacks were never as noble
The PC is booming—just not the PC we know
So, you heard the news: PC sales are tanking. Apparently, nobody wants to buy Dells or HPs anymore; nobody cares about clunky laptops and bulky mid-towers. People haven't necessarily stopped using them—aging PCs are still humming along in bedrooms, living rooms, and offices everywhere. It's just that those machines aren't getting replaced. Instead, people are spending their hard-earned dough on what analysts call post-PC devices: smartphones, tablets, phablets, and so on.
The PC industry is scrambling to adapt. Microsoft has retooled Windows into a weird hybrid that straddles post-PC tropes and legacy conventions. Laptop makers are bending over backwards to give us touch-enabled laptops that double as tablets. Everyone is working overtime to put a new spin on old concepts... and by all accounts, it isn't working. PC shipments suffered their greatest decline ever last quarter, in spite of Windows 8 and all those tablet-notebook hybrids.
Some say there's no hope, but I disagree. Because the PC is booming—just not the PC we know.
What is a PC? The initials stand for "personal computer." According to Merriam-Webster, a personal computer is a "general-purpose computer equipped with a microprocessor and designed to run especially commercial software (as a word processor or Internet browser) for an individual user."
In the 90s, when I was growing up, personal computers were few and far between. To record TV shows, we used VCRs and VHS tapes. To share music with friends, we spent hours copying cassette tapes. To get in touch with friends and relatives, we used a land line—or, if textual communication was more up our alley, we wrote a letter. By hand. With a pen and paper.
The only kind of personal computer you could buy was a big beige box with a matching CRT monitor. Or, if you were loaded, you could get a laptop with a crappy passive-matrix LCD and a trackball wedged in the palm rest. These were toys of the privileged. We, the geeky elites, used them to play Doom and to rack up preposterous phone bills surfing AltaVista and GeoCities on 14.4K modems. A PC was a badge of pride. Finding someone else who knew about the Internet was sufficient to spark a lasting friendship.
If you'd fallen into a coma in 1993 and awoken today, you'd realize that personal computers are everywhere now. You'd probably notice the laptops and mid-towers at first, but then you'd start to see the phones, the tablets, the game consoles... and you'd think, aren't they all the same thing? Aren't these all general-purpose computers equipped with processors and designed to run commercial software? Sure, some of them are a little more locked down than the IBM clones of old, but that's nothing a little jailbreaking or rooting won't fix. Hackers can still write their own software. Not only that, but they can also package that software, ship it, and make money from it with very little effort or risk. It's a far cry from the days of shareware trials on 1.44MB floppy disks.
Two decades ago, having the tools to play video games, to get on the Internet, and to write crappy BASIC programs made us special. Now that personal computing has grown into something so exceedingly ubiquitous, we feel like we're not so unique anymore. Some of us see the PC's evolution as a corruption of something precious—but I don't think it is. The basic formula that made PCs great 20 years ago is still there, more or less intact, in today's post-PC devices. You just have to look past the drastically different packaging and realize that modern personal computers are more beautiful, more versatile, and easier to use than they've ever been.
I still spend a lot of time in front of an old-school desktop PC. I wouldn't dream of giving it up. The thing is, though, my PC is big, heavy, difficult to operate, and required for serious work—Photoshop, Excel, web design, text editing, you name it. There's nothing terribly personal or cozy about it. If you think about it, today's tablets and phones fulfill the PC's original mission—making personal computing available to the masses—far more elegantly than this thousand-dollar workstation.
Because that's really what this is: a workstation. And that's really what most of today's traditional PCs are. They're workstations with multiple processor cores, Windows NT-based operating systems, and copious amounts of storage and memory.
There's nothing wrong with that. Workstations will always be needed, because there will always be work to do. But we shouldn't pretend that the PC is somehow dying because people aren't buying workstations they no longer need. The PC isn't dying, because today's real PCs are in our pockets. We're buying more of them than ever, and they're doing more for us than i486 Compaqs ever did.

Modern shooters and the atrophy of fun
I finished BioShock Infinite last weekend. It's probably one of the best shooters I've ever played.
I loved the cleverness of the storyline, the expressiveness of the characters, and the unique beauty of the graphics. I loved how the game got so immersive that, during late-night sessions, I almost felt like I really was fighting my way through Columbia—like I really was trying to tear a young woman named Elizabeth from the clutches of her fanatical, despotic father.
I loved everything about BioShock Infinite. Except for the gameplay.
Don't get me wrong. The exploration was great. Watching the story unfold was incredible. But the combat and looting got so repetitive—so downright boring—that I couldn't stand to play more than a couple hours at a time. I got sick of digging through trashcans inexplicably filled with silver coins and of fighting wave after wave of enemies, each one indistinguishable from the last. The combat sequences blurred together, ran into one another, and I found myself praying for them to end, hoping that I could proceed without playing hi-fi whack-a-mole with steampunk guns and angry crows.
So why is BioShock Infinite one of my favorite shooters? Because the others are just as bad, if not worse.
Since the days of Doom and Quake, we've seen shooters take quantum leaps in graphics, writing, voice acting, and just about everything else—except for gameplay. Somehow, gameplay hasn't evolved. It hasn't gotten more fun or more engaging or more interesting. Instead, it's atrophied into a bland rut, to the point where big-budget shooters feel just like old light-gun arcade games (Virtua Cop, House of the Dead, and so on). Players are still stuck on rails, still made to gun down easy target after easy target, pausing only to reload and to watch cut scenes. Today's visuals and stories might be Oscar-worthy, but the interactivity still feels like tasteless filler.
Shooters could be so much more. Instead of trivializing combat, they could make fights less frequent, longer, and more memorable. They could reward players for acting rationally when outnumbered—hide, flee, or die. Shooters could, when appropriate, encourage problem-solving and exploration over brute force. Hell, why couldn't they have players decide how the story plays out? But no, that's all too much to ask. Studios and publishers seem to have forgotten that games are supposed to be games, not CG films with playable action scenes.
Things weren't always this way. I have very fond memories of System Shock 2, BioShock Infinite's spiritual pre-predecessor. I remember desperately scrounging for ammo, cowering in fear from even lone mutants, since I knew a fight might leave me badly wounded—and the noise might attract other creatures. I recall sneaking past enemies and reprogramming turrets to dispatch them so that I wouldn't lose precious health or bullets in combat. I can still recall the satisfaction I felt when, later in the game, I finally had enough upgrades to gun down monsters in one shot.
In System Shock 2, each enemy encounter was an event: memorable, frightening, dangerous, and sometimes exhilarating. Getting lost in the corridors of the Von Braun was part of the game, and it made the experience all the more immersive. There was a real sense that you, the player, had to use your own cunning and skill and sense of orientation to survive. Because of that, beating the game felt like a true achievement, and it made you want to start all over again. Nothing about it felt like watching a bad Michael Bay flick.
System Shock 2's formula should have been spread far and wide and polished to a mirror shine by now. But instead, after 14 long years, that formula has been largely forgotten.
There's nothing dangerous or memorable about BioShock Infinite's gameplay. While players must still hunt for ammo, combat is so frequent that bullets are strewn everywhere, and picking them up feels like a chore rather than a relief. The anguish of an empty gun is nowhere to be found, either. Elizabeth replenishes your ammo supply during combat, and when things go south, dying causes you to be magically teleported to a safe location with spare magazines in your pockets. There's no longer any danger. Skill and cunning aren't really rewarded anymore.
Not even the game's "vigors" manage to spice up combat. Most of them basically do the same thing: stall bad guys for a few seconds and inflict a small amount of damage. Using a vigor at the right time can mean the difference between a cleared battlefield and a forced resurrection, but there's nothing hugely satisfying about the process. It just adds more steps to the tedium of depleting each enemy spawn point.
The most depressing thing about BioShock Infinite, though, is that it's actually one of the more original shooters out there. Compared to the endlessly multiplying Call of Duty clones, its gameplay is textured and tinged with depth and variety. While I was able to beat BioShock Infinite and derive pleasure from the experience, I've had to stop myself from playing war-themed shooters altogether. Their single-player campaigns are just awful. The last one I bothered to finish was Battlefield 3, and I hated everything about it.
I'm not sure who or what to blame. Maybe this is all an attempt to appeal to the lowest common denominator. Maybe game studios are so intent on catering to brain-dead 14-year-olds with Xbox 360s that they've lost sight of what makes games fun. If that's the case, then there may be little hope. It's entirely possible that the next crop of consoles will bring us unimaginably pretty games with sugar-free, decaffeinated gameplay that's as boring as ever. That would be even more soul-crushing than BioShock Infinite's failings.
Our only hope is that, eventually, even 14-year-olds will get sick of playing the same game over and over. They'll start to clamor for better games, where interactivity involves more than just pointing and aiming. And game developers will deliver. It might seem unlikely, but I remember being 14 quite well. It was around the time System Shock 2 came out, and I didn't toss that game aside for something with more instant gratification. I dug in, and I loved it.

The problem with Windows convertible tablets
I've been spending a fair bit of time with Windows convertible tablets lately. I reviewed the Samsung ATIV Smart PC Pro 700T last week, and I've had the Asus VivoTab RT kicking around in my benchmarking lair for a few weeks. I'm also currently testing an Atom-based tablet: the VivoTab Smart, which combines x86 support with the slender profile and long battery life you'd expect from an ARM-based device.
Oh, and I've tried both versions of Microsoft's Surface. Not in my office, though—there was a Microsoft kiosk at the mall, and I stopped by while shopping for a Valentine's Day gift. Yes, I'm that romantic.
Anyhow, the longer I spend with these devices, the more I grow convinced that convergence à la Microsoft is an ill-tasting recipe. I articulated some of my reservations in the Samsung 700T review, where I stated:
There seems to be little overlap between what people do on tablets, which is mainly content consumption, and what people need full-featured notebook PCs for, which is productivity. . . . So, why must we have both on one machine? What's so compelling about having Facebook and Kindle apps on the same physical system as Office and Photoshop? Since the combination is fraught with compromise, why not get a great tablet and a great ultrabook rather than a less-than-great combination of the two?
TR's own Geoff Gasior had a reasonable answer to this: because carrying one device is better than carrying two. And hey, I totally get that. My problem is that saving room in my backpack does me little good if it means passing up the best tool for the job. From my experience so far, Windows convertible tablets are rarely—if ever—the best tools for the job.
Think about it. What do you want from a tablet? You want plenty of quality apps to choose from, including games. You want a great display, long battery life, and something that's thin and light. You also want a device that's both fast and easy to use, because content consumption is no fun if waiting and troubleshooting are involved.
Win8 and WinRT systems just don't deliver there. Good Modern UI games and apps are still pitifully few in number. (There's no Flipboard, Feedly, or Google Currents. No HBO GO or BBC iPlayer. No Yelp, and no official Facebook client.) The handful of Windows tablets that are thin, light, and endowed with long battery life—those with Atom and ARM-based processors—all seem to have ugly, low-resolution displays. (1366x768 is just downright sinful on a tablet screen.) As for speed and ease of use, WinRT slates take forever to launch apps, and while Atom tablets strike a passable balance between performance and power efficiency, ease of use remains a concern. One must still put up with the awkward marriage between Modern UI and the desktop, not to mention the questionable design choices within the Modern UI environment itself.
Okay, now what do you want from a good laptop? This is a system you're going to be using for productivity, so you want it to be fast. You want a great keyboard and touchpad, because controlling Windows 8's desktop environment with a touch screen is an exercise in frustration. If this is a productivity machine, chances are you want more than 11.6 inches of screen space. Don't get me wrong; small, highly portable notebooks are great. Photo editing on a thimble-sized display, however, is not. Neither are the cramped keyboards and truncated touchpads that invariably accompany smaller screens.
Windows convertibles also fail to deliver in this department. For them to double as halfway decent tablets, convertibles must sacrifice desktop performance and capabilities in one way or another—either with plodding performance, like the Atom-based systems, or with a "let's pretend" desktop courtesy of Windows RT. They must restrict themselves to 11.6" or smaller screen sizes, as well, which inherently compromises the keyboard and touchpad arrangement. Worse, that compromise is often more dire than it ought to be. The Surface's Touch Cover is just plain awful (try touch-typing on the thing, I dare you), and the Samsung 700T's fickle and undependable touchpad really disappointed me.
Ideally, Windows convertible tablets should offer the best of laptops and tablets, all in a single device. They should, but they do not. Current offerings feel more like crappy tablets rolled into crappier notebooks—jacks of all trades, masters of none, with good design sense and usability discarded in the name of convergence.
What does that convergence get you?
Well, you can store all your music, photos, and personal files on a single device. That's nice, I suppose. Then again, cloud storage is starting to make that convenience a little old-fashioned. I don't carry very much music on my phone, for example, because I don't have to. When I want to listen to something that's absent from the device, I simply grab it through iCloud over the LTE connection. (And no, being an Apple-worshipping metrosexual isn't a prerequisite. Google and Amazon run similar services.)
What else? We've already addressed the saving-space-in-your-backpack thing, and I think the downsides of convergence make that a lopsided bargain. That leaves one major advantage: cost. Buying a convertible tablet is cheaper than springing for a separate tablet and notebook, isn't it? If you're strapped for cash, convergence must be a pretty solid proposition.
The price difference isn't as big as you'd expect, however. A Nexus 10 will set you back $400; an iPad, $500. An entry-level ultrabook can be had for $650, and a good one will cost about $1,000. Now look at Samsung's ATIV Smart PC Pro 700T—a fine example of a Windows 8 convertible with ultrabook-class performance, which is precisely what you need if you aren't buying a separate laptop. It costs almost $1,200 at Newegg. That's $150 more than the Nexus 10 and the inexpensive ultrabook, and only $300 cheaper than the iPad and deluxe ultrabook combo.
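For the curious, the comparison can be laid out explicitly. This little sketch uses only the prices quoted above (the 700T figure is the approximate Newegg price cited, rounded to $1,200):

```python
# Prices as quoted in the article, in US dollars.
nexus_10 = 400
ipad = 500
entry_ultrabook = 650
deluxe_ultrabook = 1000
samsung_700t = 1200  # approximate Newegg price for the convertible

budget_combo = nexus_10 + entry_ultrabook    # separate tablet + cheap ultrabook
deluxe_combo = ipad + deluxe_ultrabook       # separate tablet + good ultrabook

print(samsung_700t - budget_combo)  # 150: the convertible costs $150 MORE
print(deluxe_combo - samsung_700t)  # 300: and only $300 less than the deluxe pair
```

In other words, the convertible's price sits between the two combos, and much closer to the expensive one than you might expect.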
Now, why on Earth would you settle for the worst of both worlds?
Don't get me wrong; convergence can be done well. Smartphones are touch-based computers converged with mobile phones, and they're a great example of the concept taken to the right place. In that instance, though, convergence works because people don't want their pockets weighed down with extra hardware. In your trouser pockets, every ounce and every cubic inch counts. That's why nobody seems to mind that smartphones have pitiful battery life compared to basic cell phones. The benefits of convergence—having a little, Internet-connected computer, media player, and gaming console in your pocket—far outweigh the inconvenience of having to charge up every night.
I don't think you can make a strong case for convergence between tablets and notebooks. You don't carry those devices in your pocket. You carry them in a backpack, a briefcase, or a messenger bag, and so it doesn't really matter whether you're hefting a tablet and an ultrabook or a tablet and a removable keyboard dock. There's a small weight and thickness difference, but it doesn't amount to very much. The Samsung 700T weighs 3.54 lbs when docked. Put together, the iPad and deluxe ultrabook we talked about weigh 4.3 lbs. We're talking about a 12-ounce disparity, which is nothing compared to the nuisance of having to carry both a phone and a PDA in your pockets.
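The 12-ounce figure follows directly from the weights quoted above; a quick conversion makes it concrete:

```python
# Weights as quoted in the article, converted to ounces (16 oz per pound).
docked_700t_lbs = 3.54          # Samsung 700T with its keyboard dock
ipad_plus_ultrabook_lbs = 4.3   # iPad plus the deluxe ultrabook, combined

gap_oz = (ipad_plus_ultrabook_lbs - docked_700t_lbs) * 16
print(round(gap_oz, 1))  # 12.2: roughly the 12-ounce disparity
```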
Of course, none of this means successful notebook-tablet convergence is unachievable. Once the hardware delivers ultrabook-class performance in the power envelope of a Tegra 3, and once Modern UI is sufficiently polished, fine-tuned, and loaded with great third-party apps, then I expect we'll see some excellent convertibles—devices good enough to make me ditch my iPad and my laptop. Perhaps all it will take is the next generation of processors—Haswell, Bay Trail, and Temash. Or maybe only Windows 9 and next year's hardware innovations will bring us there.
Or maybe it will take even longer than that.
For now, though, I'll keep watching Windows convertibles as I always have: with a mixture of curiosity and disappointment.