Ceci est un blog

For Valve, is the Steam controller both a blessing and a curse?
— 10:06 PM on April 10, 2014

As excited as we may be about Steam machines democratizing PC gaming in the living room, there's no question Valve's new platform will encounter some obstacles on its road to success. Valve will have to persuade other developers to support a new platform, for starters, and it will be forced to work with hardware makers, particularly AMD and Nvidia, to ensure SteamOS gets top-notch driver support. On top of that, Valve will need to convince PC gamers to step out of their comfort zone and embrace a wildly different operating system without support for many familiar apps and games.

At the Game Developers Conference last month, I became aware of another potential bump in the road: the Steam controller.

The Steam controller has a unique design with dual touchpads instead of analog sticks or d-pads. That layout is supposed to let gamers play not just shooters, but also titles that traditionally require a keyboard and mouse. (Think Civilization V and Diablo III.) When Valve first revealed the Steam controller, it said the device's dual touchpads have a much higher resolution than typical analog sticks. Thanks to a "new generation of super-precise haptic feedback," the touchpads can also convey "speed, boundaries, thresholds, textures, action confirmations, or any other events about which game designers want players to be aware." This video shows just how fast and versatile the Steam controller can be for a seasoned user.

On paper, that all looks great. Conventional console controllers are markedly slower and less accurate than a mouse and keyboard, especially in first-person shooters. The Steam controller promises not just to remedy that, but also to make a whole bunch of new games comfortable to play from the couch.

But there's a downside. As I learned first-hand, the Steam controller has a pretty steep learning curve—steep enough, I fear, to put off some potential converts.

Valve had several Steam machines set up at its GDC booth, all hooked up to the very latest Steam controller prototypes. Over a period of about 10 minutes, I gave the controller a shot in both Portal 2 and a Japanese-looking side-scroller whose name I can't recall. In Portal 2, the Steam controller was so unlike anything I'd ever used that I was completely useless with it. Oh, the shape of the device was excellent, and the positioning of the buttons was great. But I could barely circle-strafe, and precise aiming was entirely out of the question. So was the kind of quick response needed to complete the next puzzle. All I could do was run around and try to get a feel for the elusive touchpad controls.

At this point, one of the Valve guys shepherding the demos told me I'd better not run Portal 2, since it tended to crash on the particular Steam machine I was using. So, I fired up the aforementioned side-scroller. There, the 2D environment made it easier to steer my character around, but I still had trouble with the sensitivity and response of the two touchpads. The main thing that threw me off, I think, was that the touchpads register absolute finger positions—not relative ones like the touchpad on a laptop might. Holding your thumbs in exactly the right positions with only a couple of concentric ridges for guidance is... tricky.
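
To make that distinction concrete, here's a minimal sketch of the two mapping schemes. It's purely illustrative Python, not Valve's code, and the pad and screen dimensions are made up:

    # Relative mapping (laptop touchpad): the cursor moves by however far
    # the finger moved, so where your thumb happens to rest doesn't matter.
    def relative_update(cursor, delta, sensitivity=1.0):
        x, y = cursor
        dx, dy = delta
        return (x + dx * sensitivity, y + dy * sensitivity)

    # Absolute mapping (Steam controller touchpad, as I understand it): the
    # thumb's position on the pad maps directly to a position in the output
    # range, so a thumb resting a few millimeters off-center points somewhere
    # else entirely.
    def absolute_update(touch, pad_size, screen_size):
        tx, ty = touch
        pw, ph = pad_size
        sw, sh = screen_size
        return (tx / pw * sw, ty / ph * sh)

    # The same small thumb offset barely nudges the cursor in one scheme...
    print(relative_update((400, 300), (5, 0)))                # (405.0, 300.0)
    # ...and jumps to wherever the thumb sits in the other.
    print(absolute_update((45, 40), (80, 80), (1920, 1080)))  # (1080.0, 540.0)

The second scheme is what makes mouse-like precision possible in theory, and fumbling likely in practice, at least until muscle memory catches up.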

After getting myself killed a few times, I put down the controller and spoke with the Valve guy. Was I just uncommonly clumsy, I asked, or was the learning curve really so steep? To my surprise, he said it personally took him eight hours to get fully acquainted with the Steam controller. The learning process does vary from user to user, he added, and faster learners can apparently pull off the same feat in only 15 minutes. But based on my own experience, that probably requires some uncanny dexterity.

Even 15 minutes is a long time, though, especially for someone who's used to instant familiarity with mice, keyboards, and conventional gamepads. A quick brush with a Steam machine at a store, a friend's house, or some other venue might easily discourage a future purchase. If we're looking at eight hours of learning time for a Valve employee, I worry that some folks will spend much longer wrestling with the thing. For a brand-new platform whose success will hinge on broad adoption, that's not a good thing.

Simply plugging an Xbox controller into a Steam machine would take care of that problem. I'm sure Valve will make it possible. That said, an Xbox controller would also put players at a disadvantage in multiplayer skirmishes, where they'd likely fight PC players armed with keyboards, mice, and (yes) Steam controllers. An Xbox gamepad would also seriously limit the playability of Steam's many point-and-click games—and one of the big selling points of Steam machines is their ability to bring those titles to the living room.

What does Valve think of all this? Well, the Valve guy I spoke to said the company is indeed concerned about the Steam controller's steep learning curve—but it thinks the size of Steam's current user base will work in its favor, and so will the Steam controller's support for titles that can't be played with conventional gamepads. In other words, it's willing to gamble that the pros will outweigh the cons.

I suppose I can't argue with that. Still, I can't shake the feeling that one of the Steam machines' hottest features may also be a barrier to their success.


Broken Age and the Kickstarter factor
— 9:40 PM on February 6, 2014

I haven't done a ton of gaming on my PC lately. It's not for a lack of games (I have access to a lot of new releases for testing purposes) or a lack of hardware (there are literally crates full of graphics cards in my office). It's not even for a lack of free time, so long as I'm not crunching away on another time-sensitive TR review.

No. My problem is that, these days, big-budget games are stuck in a rut.

As graphical fidelity has grown and development costs have ballooned, originality seems to have atrophied. Most of the new releases out there feel like carbon copies of their predecessors, with similar gameplay, similar level design, and similar stories. I've found that to be true whether one looks at multiplayer or single-player. In action games, single-player tends to be especially bereft of variety: you shoot some bad guys, hide behind cover to heal for a second, move on to the next area, watch the cut scene, rinse, repeat. Multiplayer sometimes gets spiced up with co-op or expansive, vehicle-laden maps, but basic gameplay doesn't vary all that much.

Indie games have proved to be a good way for people like myself—those tired of cookie-cutter shoot 'em ups—to scratch our gaming itch without boring ourselves to death. I've grabbed a number of fun, experimental indie titles on Steam, like Race the Sun, Dyad, and Gone Home. In terms of originality, they're worlds better than Call of Duty XLVII: Ghost Strike Team At War. But by virtue of being indie, they also tend to be saddled with low production values. Lack of funding tends to limit scope and quality in ways that, at times, can be disappointing.

So, is there a way for big budgets to feed originality rather than destroy it?

Perhaps Kickstarter is the way to go. Last month, a friend of mine told me about Broken Age, a project led by Tim Schafer, one of LucasArts' former adventure game gurus. Schafer and his team asked for $400,000 to fund a "classic point-and-click adventure," and they wound up with $3.34 million in pledges. That larger-than-expected windfall allowed them to build a game with killer art and top-notch voice talent, with folks like Elijah Wood, Jack Black, Wil Wheaton, and Pendleton Ward on board.

Last week, I sat down to play the first act of Broken Age. (The second act is coming later this year.) I made it last a couple of days, logging about four hours of play time in all—about what you'd expect from a blockbuster shooter's single-player campaign these days. And for the first time in forever, I had actual fun playing a big-budget game.

In Broken Age, you alternate between the roles of Shay, a teenage boy who's apparently alone on a spaceship full of overbearing robots, and Vella, a teenage girl from a coastal village where tradition calls for her impending sacrifice to a giant, Miyazaki-esque monster. There's no apparent connection between these two characters at first, though the game lets you switch between them at will. Sick of talking to Vella's grandmother about the monster? Step into Shay's shoes and explore space a while. As the story progresses, however, the game offers up small hints about how Shay and Vella's destinies are linked. More becomes clear during the act-one finale, which left me honestly (and pleasantly) surprised.

As far as gameplay mechanics go, Broken Age should be immediately familiar to any fan of LucasArts classics like Monkey Island or Day of the Tentacle. It should be easy enough for the uninitiated to pick up, too. In that respect, I suppose one could make the case that Broken Age is just as cookie-cutter as Call of Duty—just with a multi-decade gap between cookie cuttings. Here, though, gameplay mechanics are very much incidental. They're simply a delivery vehicle for the game's top-notch writing, voice acting, and art.

Playing Broken Age feels surprisingly like watching a Pixar movie or an episode of Adventure Time. The humor is universal, likely to amuse kids as much as the older and more jaded among us, and the delivery is extremely polished. Modern shooters may feel like cut-rate action movies, but Broken Age doesn't feel like a cut-rate cartoon. Hearing Wil Wheaton play a lumberjack terrified of talking trees (which, as it turns out, are real), or listening to Elijah Wood argue with a spaceship AI who calls him "sweetie," you never get the sense that you're wasting your time with sub-par entertainment. On the contrary, I would often sift through dialogue trees to make sure I didn't miss out on any punchlines.

I've heard some folks complain that Broken Age is too straightforward, that its puzzles are too easy to solve, and I suppose that's true. Someone with half a brain and some experience with adventure games probably won't get stuck on any of the puzzles, at least not for very long. I certainly didn't. But you know what? That's okay. For me, playing Broken Age was about enjoying the ride, not about being challenged or validated. I love puzzles in the right context—I'm a big fan of the Myst series—but I've never enjoyed contrived riddles in point-and-click adventures. They're often frustrating to solve, and they get in the way of the story.

Unchallenging as it may be, Broken Age feels like a breath of fresh air amid all the brown-and-gray levels, assault rifles, and overwrought military themes. And it gives me hope for future Kickstarter projects.

This path may not be easy for other indie developers to follow. With the funding drive that led to Broken Age's creation, Tim Schafer appealed to the nostalgia of a whole generation of folks—and he offered absolute expertise in the genre. (This is the guy who made Grim Fandango.) Few others in the industry can make such a claim. At the same time, there are other industry veterans like Tim with potentially great ideas that wouldn't fly with major publishers. I'd love to see what someone like, say, Tom Hall could do with three million bucks and complete creative control.

More to the point, Broken Age has shown that a Kickstarter project can blossom into a high-quality piece of interactive entertainment. The more projects like Broken Age succeed, the more studios will feel encouraged to pursue unorthodox ideas. And I think that's a good thing.


High-PPI support in Windows 8.1: still not so great
— 8:25 PM on December 19, 2013

Displays with high pixel densities are pretty much standard in tablets, and we're all waiting for them to become standard in notebooks. Take a trip to your local Best Buy, though, and chances are a majority of systems in the laptop aisle will have 1366x768 panels—even large notebooks that really have no business with a display resolution that low.

It's a sad state of affairs. If Google can serve up two megapixels in a $229 tablet, then why can't PC makers do the same in $800 ultrabooks? Why isn't 1080p the new standard by now? And why aren't truly high-PPI screens (think 2560x1440 or more) widely available for those who don't mind paying a premium?
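
To put rough numbers on that gap, here's a quick back-of-the-envelope pixel-density calculation in Python. The $229 tablet figure assumes something like the 2013 Nexus 7's 7", 1920x1200 panel, and the laptop figures assume a typical 13.3" diagonal:

    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch along the panel's diagonal.
        return hypot(width_px, height_px) / diagonal_in

    panels = [
        ("1366x768, 13.3-inch budget laptop", 1366, 768, 13.3),
        ("1920x1080, 13.3-inch ultrabook",    1920, 1080, 13.3),
        ("2560x1440, 13.3-inch high-PPI",     2560, 1440, 13.3),
        ("1920x1200, 7-inch tablet",          1920, 1200, 7.0),
    ]

    for name, w, h, d in panels:
        print(f"{name}: {w * h / 1e6:.1f} MP, ~{ppi(w, h, d):.0f} PPI")

In other words, a cheap 7-inch tablet packs nearly three times the pixel density of the typical laptop panel at Best Buy, and even a 1080p ultrabook panel manages only about half of it.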

I'm sure costs and margins partly explain why PC makers continue to ride the 1366x768 gravy train. As I discovered recently, however, there's another, even more infuriating hurdle on the path to high-PPI nirvana.

You see, high-PPI support in Windows still kinda sucks.

Behold exhibit A: the Zenbook Prime UX31A from Asus. This is an Ivy Bridge-powered ultrabook with a 13.3", 1920x1080 IPS panel. It's already more than a year old, and there's nothing all that remarkable about it. Last month, I dug it out from my pile of review samples, loaded it up with Windows 8.1, and took it to AMD's APU13 conference in San Jose, California. There, I used it as my primary computer for about four days.

Windows 8.1 did a great job of recognizing the Zenbook Prime's display as a high-PPI one, and it scaled the user interface accordingly right away. Default applications like the File Explorer looked crisp and clean, with readable text and correctly sized widgets. I couldn't really fault Microsoft there; it seemed to have done its part.

Things got ugly once I started installing third-party apps, though. Here's what Google Chrome looked like at the default scaling setting:

Total blurry mess. For reference, here's Chrome next to a File Explorer window that's scaled properly. Note the difference in font sharpness:

It's not just Chrome's user interface that was scaled up and blurred. The whole application was blurred—even web pages. I tried switching to the Chrome beta channel and even toggling an obscure switch in the hidden "chrome://flags" settings ("HiDPI Support," in case you're wondering), but nothing helped. The beta looked no better, and the toggle just made everything broken and ugly.

I encountered the same blurriness in other third-party apps: iTunes, 7-Zip, and Sublime Text 2, my favorite text editor. They all scaled up to the right size, but without making proper use of the extra pixels available. The result was invariably atrocious. Looking at blurred fonts all day is a recipe for headaches.

Now, a few third-party applications did handle themselves better. Here's Firefox, for instance:

Mozilla's browser at least understood that it was running on a high-PPI system, and it scaled page contents and UI fonts sans blur. As you can see above, however, the UI widgets didn't quite look right. The icons were blurry, and the Firefox menu was full of giant black arrows for some reason. Don't get me wrong; this was still worlds better than Chrome. But it was hardly the kind of experience you'd expect from a premium ultrabook with a fancy screen.

At the other end of the spectrum, you have apps like Photoshop that pretty much ignore Windows' PPI settings altogether. Here's Photoshop CS6 and Word running side by side; Word scales correctly, while Photoshop doesn't:

I couldn't track down a fix for Adobe's negligence. I did, however, find out how to get rid of the blur in apps like Chrome: right-click the application shortcut, go to Properties, open the Compatibility tab, and tick "Disable display scaling on high DPI settings." Boom! All better. Except, not really. Ticking that checkbox means UI widgets stay the same size at any scaling level, and fonts may or may not scale up as needed. In Chrome's case, that means tiny buttons, big text labels, and illegibly small fonts on web pages. You have to raise Chrome's default page zoom to 125% to make the web readable.
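
As far as I can tell, that checkbox essentially tells Windows to treat the program as DPI-aware, so the OS stops bitmap-stretching its output and leaves scaling up to the app. A well-behaved application can declare the same thing itself. Here's a minimal sketch of the relevant Win32 call, shown through Python's ctypes purely for illustration; real applications would normally declare DPI awareness in their manifest instead:

    import ctypes

    if hasattr(ctypes, "windll"):  # Windows only
        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32

        # Opt this process into DPI awareness (available since Windows Vista).
        # Without this, Windows renders the app at 96 DPI and bitmap-stretches
        # it to the chosen scaling level -- hence the blur.
        user32.SetProcessDPIAware()

        # A DPI-aware app is then expected to query the real DPI and size its
        # own fonts and widgets accordingly.
        hdc = user32.GetDC(None)
        dpi = gdi32.GetDeviceCaps(hdc, 88)  # 88 = LOGPIXELSX
        user32.ReleaseDC(None, hdc)
        print(f"Desktop DPI: {dpi} ({dpi / 96:.0%} scaling)")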

There is a simpler alternative to enabling that compatibility setting for half your apps. In Windows' Display control panel, ticking "Let me choose one scaling level for all my displays" will restore the legacy scaling from previous Windows releases. That means no blur, but also no improvement over the per-app compatibility setting. Fonts and UI widgets are still sized inconsistently, and in apps like Chrome and Sublime Text, you still have to scale page or document contents manually.

So, yeah. Running Windows on a notebook with a high-PPI screen is an exercise in frustration right now. With the Zenbook Prime, I sometimes wished that I had a 1366x768 screen—not because I didn't enjoy the extra pixel density in software that supported it, but because I just wanted everything else to look right. And no matter how much I tinkered, some things never did look quite right.

Now, can you imagine a technically illiterate user grappling with these same problems? Yikes. No wonder HP, Dell, & co. aren't tripping over themselves to sell you high-PPI notebooks. The software support just isn't mature enough yet.

If PC makers aren't going to take the first step, then Microsoft needs to reach out to developers and make sure Windows software is ready for high-PPI screens. We're not talking about cleaning the Augean stables here. Apple has already pulled off something quite similar. There was a rough transition period after Retina MacBooks came out a couple of years back, but high-PPI support in OS X and Mac software has improved dramatically since then. Today, laptops like the $1,299 MacBook Pro with Retina display are very compelling, partly because they offer a high-PPI experience that Windows just can't match.

Microsoft, I know you're all about tablets right now—but if you want to make Windows notebooks sexy again (and goodness knows they need it), then fixing high-PPI support should be high on your list of priorities. If this doesn't get done, my next laptop might just have an Apple logo on it.


My Steelcase Leap chair fixed my crappy posture
— 2:59 PM on October 18, 2013

I do a lot of sitting.

Like, a lot. I sit in my home office for eight to nine hours every day doing TR-related work. Once I'm done, I go on sitting there doing other things—working on personal projects, playing Trackmania, wasting my time on Reddit, and so forth.

I take breaks, of course. Every now and then, I'll walk to the kitchen, open the fridge door, decide that I'm not hungry, and walk back to my desk. If it's not raining, I'll go outside for a walk. Even if it is raining, I may venture out into the city to run errands.

But, yeah, I mostly sit. That's why, nearly six years ago, I paid an almost outrageous sum of money for a fancy ergonomic chair.

The Herman Miller Mirra served me well. It encouraged me to sit properly, and even when I slouched, it was far more comfortable and supportive than the cheap office chairs I'd sat in before. I could get the lumbar support and seat depth just so, and I could make the chair lean forward when I needed. And, heck, the thing looked plain cool, like something out of Star Trek. I was delighted.

Well, at least at first. A couple of years ago, I started noticing some tingling in my right pinky and ring fingers. I blamed my mouse initially, and when changing mice didn't help, I fiddled with the armrests and tried to use my left hand to mouse for a while. Some of those things helped. I got better, then worse, then better, then worse again. Finally, last winter, I got some x-rays done and went to see a physiotherapist. I was told that my upper back and neck were the problem. In short, it was a posture issue. I started going to the gym, doing stretches, and watching my posture more closely.

But it wasn't just me showing signs of wear. Over the years, the curve of the Mirra's back had flattened somewhat, and the lumbar support had lost much of its rigidity. Recently, I tried sitting in my girlfriend's cheap Ikea chair for a few days. And guess what? The tingling in my fingers got better.

In the end, I decided to call my local Herman Miller distributor and get the Mirra serviced—then to sell it and buy another, better chair.

I settled on the Steelcase Leap. The Leap is a favorite among many, and some, like the folks at TheWireCutter, recommend it over the venerable Aeron as well as Herman Miller's new flagship, the Embody. The Wall Street Journal called the original version of the Leap "Best Overall" in 2005. I ordered the V2 model, which has softer arm rests, a taller back, and other design tweaks. It set me back $755 before tax, which is a lot, but not that much for something in which I spend most of my waking hours.

On October 4, the Leap showed up at my door. Here's what I typed in our staff IRC channel immediately after sitting in it and making the requisite adjustments:

[9:51:17 AM] wow, this steelcase chair... instant relief

The Leap looks pretty unimpressive next to its Herman Miller counterparts. It has padded cushions instead of fancy mesh materials, and there's a lot of plastic covering things up. Steelcase has put an adjustment guide under each arm rest, too, and it has labeled the adjustment knobs with both printed text and Braille. Looking at this thing, you get the sense the Leap was designed to populate boring offices filled with normal people—not European design studios rife with iMacs and glass-top desks. If Herman Miller can be accused of favoring form over function, Steelcase is the polar opposite.

Yet, as boring as it looks, the Leap is just as adjustable as the Mirra—and, more importantly for my needs, its back has a much more pronounced curve with some much-needed padding. Adjusted properly, the Leap almost punishes me for not sitting up straight. Even brand new, the Mirra only ever encouraged good posture, and it never insisted too terribly much.

The Leap made my back better instantly, but it took me over a week to get really used to the thing. See, the Mirra has a flexible mesh seat, kind of like a hammock, that molds itself to the shape of your butt. The sides and front of the chair are rock-hard, but the part where your butt hangs is very soft. The Leap is the other way around. The front edge never cuts off circulation to your legs, and the sides are soft, but the part where your butt goes is quite firm. There's a couple inches of padding and a hard surface underneath, and that's it.

This is a deliberate design choice on Steelcase's part. Here's what the company says about it:

Does a thicker seat cushion mean a chair is more comfortable?
Not necessarily, some chairs have thicker foam that may feel softer initially, but will lead to user discomfort after an hour or two of sustained sitting since thicker foam typically provides little ergonomic support. This is not good for the life of the chair or the long-term comfort of the user. In essence, foam that feels great initially does not always translate into long-term seated comfort.

Steelcase also badmouths mesh seat designs like the Mirra's. It claims they restrict user movement and cause discomfort when your body touches the hard frame supporting the mesh. "Moreover," it adds, "the side forces that are felt when you push down on mesh will have a tendency to 'squeeze' you into the chair, resulting in uneven pressure distribution."

I don't know about that; the Mirra's seat was pretty comfortable. The Leap, on the other hand, is literally a pain in the butt unless it's adjusted just so. Seriously, it's very unforgiving.

However, now that I've found the correct seat depth, lumbar height, and back tension to accommodate my flabby body, the butt soreness has given way to a feeling of firm support. The firmness keeps me alert and aware of my posture—and every now and then, it encourages me to change position or to get up and walk around, which is what you're supposed to do. The back isn't cold and hard like the Mirra's, but it's just as punishing as the seat if you slouch. When I get up at the end of the day, my back is still curved, and the whole middle third of my body is a little sore—but in a good way, like after a visit to the gym.

More to the point, the Leap helps to keep my ulnar nerve from getting pinched. Even after a long day of typing, benchmarking, and Excel jockeying, I feel little to no tingling in my fingers. And now, sitting in other chairs—even the Mirra—brings back the symptoms in a hurry.

So, yeah. Good job, Steelcase. You made me super uncomfortable for a week or so, but it was worth it.


Are Valve and AMD about to ruin PC gaming?
— 12:25 PM on October 11, 2013

When the PlayStation 4 and Xbox One were announced earlier this year, I saw it as a victory for the PC.

Soon, it seemed, bringing major blockbusters to the PC would be easier than ever. There would be three major gaming platforms—the PlayStation 4, the Xbox One, and the Windows PC—and each one would be only a slight twist on the same basic formula. Each one would feature an x86 processor, a DirectX 11 graphics chip, and its own, custom-tailored operating system. How could it get any simpler?

It couldn't. Instead, it got more complicated.

Last month, Valve announced SteamOS, a Linux-based operating system built around the eponymous game distribution service. SteamOS will show up on a whole lineup of Steam machines next year, and although it will let users stream games from Windows PCs, it will also run games natively. On the operating system's reveal page, Valve teases, "Watch for announcements in the coming weeks about all the AAA titles coming natively to SteamOS in 2014." In other words, big, third-party publishers may soon offer games for both Windows and SteamOS.

On the heels of Valve's announcement, AMD revealed Mantle, an API that lets developers optimize games for AMD hardware. Unlike Valve, AMD wasn't shy about naming one of its partners. A version of EA's Battlefield 4 optimized for Mantle will be released in December, not long after the game's scheduled October 29 debut. Other partners will no doubt follow, as will other titles.

Now, all of a sudden, next year's gaming landscape looks to be shaping up very differently. Instead of three major platforms based on a common hardware architecture, game developers will face two monolithic platforms and a fragmented one—the PC—that will have two starkly different operating systems and three different APIs—Direct3D, OpenGL, and Mantle.

SteamOS will complicate things by virtue of its existence. I wouldn't be surprised to see it become so successful that games must be ported to it, yet not successful enough to dislodge Windows completely. If that happens, then developers will have no choice but to support both Windows and Valve's operating system. This will mean extra work. Given the funding and time restrictions game studios often grapple with, we may see longer delays between console and PC releases as a result—not to mention lower-quality ports.

Mantle is kind of a double-edged sword, as well. While it may simplify some facets of cross-platform development, allowing console optimizations to be shared with the PC, Mantle may also encourage developers to prioritize AMD hardware at the expense of Nvidia GPUs and ever-faster Intel IGPs. In a worst-case scenario, non-AMD systems will end up delivering a second-rate experience, with worse performance, worse image quality, and more bugs.

This could all make PC gaming somewhat daunting to newcomers. Today, buying a decently powerful PC opens up access to a huge library of games, from point-and-click adventure titles to the latest cross-platform shooters. Next year, things will be different. Because of SteamOS, not all gaming PCs will be able to run all PC games, unless one is prepared to install a second operating system. And because of Mantle, buying a machine with a GPU from the wrong vendor could mean missing out on critical optimizations.

Now, don't get me wrong. SteamOS and Mantle also have the potential to do great things for the PC. Microsoft's custodianship of the platform has been marred by stagnation and split loyalties, and Valve could do a far better job, especially in the living room. Mantle may also enable optimizations that let PCs match or surpass the performance of next-gen consoles more easily. That could make PC gaming more, not less accessible, even if buying an AMD GPU is required.

Nevertheless, I think it's a dangerous time to tinker with the PC gaming formula. We're on the verge of a new console cycle, and the PS4 and Xbox One are about to cut deeply into the PC's performance and image quality lead. Next-gen consoles will likely be more affordable than comparable gaming PCs, as well. If the PC becomes too fragmented, and if playing games on this platform becomes too complicated, then I fear we'll see many folks take the easy road and switch to a console.

There's always been talk about innovations on the console front spelling doom for PC gaming, and it's never been true. Today, however, I worry that innovations from within the PC camp are threatening the platform. At a time when the line between PCs and consoles is getting blurrier than ever, too much fragmentation could damage the platform beyond repair. And PC titans like AMD and Valve would only have themselves to blame.


iOS 7: replacing the elegant with the tolerable
— 12:04 AM on September 20, 2013

If I had to describe the totality of Apple's hardware design during the final years of Steve Jobs' tenure, I would use one word to do it:

Elegant.

My iPhone 4 was elegant. When I got it, I would often take it out of my pocket just to tinker with the software and to admire the hardware. It was sleek, sexy, modern, and fast. It was like Scarlett Johansson in a red dress or George Clooney in a tuxedo. It wasn't just effortlessly desirable; there was something special about it, a magnetism that made you want to keep looking, and staring, and admiring.

The same goes for many Apple products I've used over the years—my old aluminum MacBook, my new iPhone 5, the wired aluminum keyboard on which I'm typing this blog post. Even the lowliest of Apple cables and connectors have an elegance about them. Sometimes, when I have to charge my phone, I'll take an extra moment just to look at the Lightning connector—and then I'll plug it in, and I'll get a small whiff of satisfaction from the way it clicks into place. This doesn't happen consciously. Something about Apple's hardware design just seems to trigger that kind of reaction.

Apple owes this all to one man: Jony Ive. Ive has been in charge of the company's industrial design for close to two decades. Recently, he was put in charge of human interface design, as well. iOS 7, the first release to bear his mark, came out yesterday, and millions of Apple users rushed to download it.

I was one of them.

I was excited about iOS 7. As an admirer of Ive's hardware creations, I was eager to see what he'd contribute to the software side of things. Plus, iOS was starting to look a little dated. Some of the UI widgets had been around since the release of the original iPhone in 2007. Six years is a long time. In six years, even the prettiest thing can start to get tiresome. It was high time for a new injection of elegance.

Well, I've been using iOS 7 for about a day now. I've used it on my iPhone and on my girlfriend's iPad. I've poked around the UI, agonized over a new wallpaper selection, and rearranged my home screen icons.

My verdict? It's okay. It's a little bit cleaner, a little bit brighter, and a little bit more colorful than the previous release. Apple has added some nice features, like Control Center, and it's made much-needed improvements to old ones, like multitasking. The animations look neat, although they do make the phone seem a little slower. Using iOS 7 kinda makes you feel like a disembodied spectator sometimes—unlike iOS 6, which was very fast and responsive.

For the most part, though, iOS 7 is okay. It's new enough not to look old, and it's pretty enough not to look ugly. It's fine.

And that's exactly what's wrong with it.

There's no elegance anymore. No magnetism. Nothing about the way iOS 7 looks makes me feel happy to be an iPhone user. Nothing about it makes me want to poke around the interface just to admire it. There's some mild curiosity, perhaps, but no admiration. Nothing like what I get from looking at Apple hardware and holding it in my hand.

Part of the problem, I think, is that Apple went overboard with simplifying the UI. Simple design is good, but make something too simple, and there's a real danger of it losing its identity. Some of the new iOS 7 apps, like the Calendar app, remind me an awful lot of Google apps. The Apple version is usually a little nicer, a little cleaner, but the difference is subtle enough not to matter much.

Some of the icon and button designs also feel a little half-baked. The Safari icon looks sort of sad. The Music icon has an angry red-orange gradient. The Reminders icon is kinda nondescript, and the Calculator icon is bland and not pleasing to the eye. In Safari, three of the five buttons in the bottom bar have pale blue outlines and a roughly rectangular shape, so they're instantly forgettable (and a little tricky to tell apart at first).

Then there are the new, over-saturated backdrops that clash with the icons and obscure the text labels on the home screen. See the image of the iPhone 5C variants above. Where's the elegance? Where's the charm?

iOS 7 will no doubt be improved, refined, buffed out. Jony Ive will hopefully get better at the whole UI design thing. But until then, iOS 7 will dilute the elegance of Apple's hardware with a look and feel that's merely tolerable. At a time when Apple's dominance is being challenged more than ever, being merely tolerable is very dangerous.


The desktop PC needs a makeover
— 1:25 AM on August 16, 2013

My PC is too big. Much too big. I'd always vaguely suspected it, but testing Corsair's Obsidian Series 350D case earlier this week made it quite clear.

My PC is full of air and unoccupied slots and bays. I have four 5.25" optical drive bays that I don't use. The top one houses a DVD burner, but I can't remember the last time I stuck a disc in it. I moved to Canada over three years ago, and I'm positive that I've never purchased a blank DVD in this country.

Half of the expansion slots on my motherboard are set dressing. I only have a dual-slot graphics card and a sound card. In fairness, I use five of my six hard-drive bays—but that's because I'm still holding on to old drives, including a 320GB WD Caviar SE16. If I were to build a new system today, I would probably need just two 3.5" bays, with one 4TB hard drive in each. Add a 2.5" solid-state drive for my OS and applications, and I'd be set.

I'm sure I'm not alone. In fact, I'm willing to bet the vast majority of PC gamers and enthusiasts out there have just as much empty space in their PCs. Oh, don't get me wrong; leaving room for upgrades is fine. However, in the age of laptops, iPads, and smartphones, it seems a little strange that we should all have humongous mid-tower PCs full of air.

Over the past few days, I've been trying to picture what a modern desktop PC ought to look like. We could redesign everything completely, of course—introduce new form factors all over the place and wind up with something close to perfection. However, I think we can already improve things greatly with a few simple, practical steps:

  • Let's make microATX the new default for desktops. microATX provides enough expansion for a couple of graphics cards plus one wildcard, uh, card, which is about all most of us will ever need. We can keep ATX around for workstations and extreme quad-GPU rigs.
  • Get rid of 5.25" bays. Just get rid of 'em. Optical media is dead, and there are far better ways to back up your data than to burn a DVD or Blu-ray.
  • While we're at it, let's have smaller power supplies, too. Pretty much nobody needs a 1kW PSU. Heck, I figure most gaming PCs draw less than 500W. I'm sure we don't need to devote a cubic foot at the bottom of every case to AC-DC conversion. Switching to the SFX form factor could be a viable option there; Silverstone already makes a nice 450W SFX PSU.
  • Speaking of power, we could save users a lot of grief by simplifying power cabling. Heck, we could build it right into the enclosure—connect the PSU to the case with a big, standardized connector, and have strategically placed plugs and connectors sprout off where they're needed. All of a sudden, you no longer need loads of space around the motherboard and behind the motherboard tray for cable routing.
  • In line with the above, we might as well integrate SATA data connectors into drive bays, too. Just make every bay behave like a docking station and pre-route the cables. I guess we'll also want an option to bypass or upgrade the integrated cables, since high-end SATA Express SSDs are presumably just around the corner. Not all drives will need a 2GB/s interface, though.
  • Come up with a unified connector for front LEDs and buttons. This is long, long overdue. Seriously, how hard could it be to call up major motherboard makers and make them all agree on a common pin-out? Give it a snazzy marketing name, add it to the list of features along with your military-grade capacitors and auto-overclocking voodoo, and move on. Sheesh.
  • On the cooling side of things, let's try to arrange the stock fans in order to maintain positive internal pressure. And let's avoid having huge, unfiltered grates at the top of the case. You don't see anyone cracking open their laptop to vacuum dust out of it every six months. Desktop PCs shouldn't require that, either.
  • Oh, and give us more I/O at the front. Even high-end cases usually have only four front USB ports, and those tend to be all crowded together. I'd like to be able to leave at least a couple of charging cables plugged in permanently and still have room for chunky thumb drives and USB headsets.

That's about as far as I've gotten just now, but I'm sure there are other things we could do. And I'm sure you folks have ideas, too.

The broader point, though, is that desktop PCs could use a makeover. With just a handful of good initiatives, and maybe a new standard or two, we could make desktop PCs substantially simpler to build, more straightforward to use, and easier to carry around. Not every enclosure needs built-in cabling for everything plus a dozen front-panel ports, but we should at least offer those options. The easier it is to build a PC, the more people will do it, and the better the industry will be.
