Notes from TR's next-gen storage testing
A few tidbits from preliminary testing
In between work on other projects, I've been running exploratory tests on PCIe SSDs to see how their performance characteristics differ from the SATA drives we've been reviewing for the past few years. This research is part of a larger effort to come up with a new collection of tests for our next-gen storage suite.
Our old suite dates back to 2011, and though it's been tweaked here and there, it's long overdue for a major overhaul. The tests we conceived three years ago aren't ideal for the latest SATA drives, let alone the coming wave of PCIe SSDs. So, I've been testing some newer drives to see what it takes to exploit their full potential.
SSDs have been bottlenecked by the Serial ATA interface for quite some time. The 6Gbps host connection is the obvious limitation, but it's not the only one. SATA is also constrained by the AHCI protocol, which was conceived in an era of much slower mechanical drives.
Fortunately, the table is set for a PCI Express revolution. Windows 8.1 offers native support for NVM Express, a newer protocol designed specifically for PCIe SSDs. Intel's 9 Series chipsets have their own provisions for PCIe drives, and compatible M.2 slots can be found on most enthusiast-oriented motherboards from that camp.
There aren't a lot of purebred PCIe SSDs on the market right now, but we can get a sense of what's possible from Intel's DC P3700. This datacenter drive comes on a half-height expansion card with a beefy heatsink. It has a four-lane Gen3 interface, and it's based on the NVM Express protocol. It's also extremely expensive; our 800GB sample sells for a whopping $2,600 at Newegg.
Although the P3700 works in standard desktop systems, we haven't been able to boot Win8.1 on the thing. The motherboards we've tried don't even show the drive as a boot option in the firmware. That's not entirely surprising given the P3700's target market, but it does highlight the fact that not all PCIe SSDs are fully supported in the PC arena.
Plextor's M6e is a whole other story. This drive has a tiny M.2 2280 form factor, and it's only $220 for 256GB. The dual-lane Gen2 interface is a good fit for 9-series motherboards, and we haven't had any issues booting Windows. However, the M6e uses AHCI instead of NVMe, so it's not a truly next-gen product.
For comparative reference, we've also been running preliminary tests on Samsung's 850 Pro 512GB. It's the fastest SATA SSD we've encountered to date, making it a good control of sorts.
Our first batch of results comes from Iometer, which lets us tweak the queue depth and the number of workers (threads, basically) hammering the drive with I/O. The number of concurrent I/O requests is the product of the number of workers and the queue depth. For example, one worker at QD32 produces 32 concurrent requests—the same as for a four-worker config at QD8.
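The worker-times-queue-depth relationship is easy to sketch. The worker/QD pairs below are illustrative examples rather than the exact configurations in our suite:

```python
# Concurrent I/O requests in Iometer = workers x queue depth per worker.
# These worker/QD pairs are examples, not necessarily our exact test configs.
configs = [(1, 1), (1, 4), (1, 32), (4, 1), (4, 8), (4, 32)]

for workers, qd in configs:
    outstanding = workers * qd
    print(f"{workers} worker(s) @ QD{qd} -> {outstanding} concurrent requests")

# One worker at QD32 and four workers at QD8 both keep 32 requests in flight.
assert 1 * 32 == 4 * 8 == 32
```

That equivalence is what lets us compare single-worker and multi-worker configs at the same total load.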
Even with the lightest load, the P3700 more than doubles the sequential speeds of the other SSDs. Its read performance scales up aggressively as the queue depth rises, but there's less improvement with writes. Interestingly, the P3700's sequential speeds drop in our four-worker tests, at least versus single-worker configs with the same number of simultaneous requests.
In the write speed test, the four-worker setups produce similar slowdowns on the M6e and 850 Pro. There's less of an impact in the read speed test, where the performance of those drives is fairly consistent across our six load configurations. Neither the M6e nor the 850 Pro hits substantially higher speeds under heavier loads.
Additional workers do help with random I/O, at least for some of the SSDs. Check out the peak 4KB random write rates:
The P3700 and M6e both get a boost from additional workers. The gains are bigger with heavier loads, especially on the Intel SSD. Check out the 50% jump in IOps from one worker at QD32 to four workers at QD8.
Curiously, the 850 Pro doesn't respond well to loads spread across multiple workers. Its random write rate drops substantially when we switch from one worker to four, even when the total number of concurrent requests remains the same. That's a shame, because the 850 Pro actually outperforms the M6e with a single worker.
Those random write peaks are much higher than the sustained rates that each SSD achieves. Here's a closer look at how the drives compare across a 30-minute test. Click the buttons below the graph to switch between the various worker-and-queue combos.
All the drives peak early before trailing off as the clock ticks. The speed and shape of the decline differ for each drive, in part because of the large differences in overprovisioned area.
The M6e 256GB and 850 Pro 512GB allocate roughly the same percentage of flash to overprovisioned area, but since the Samsung has a higher total capacity, it has more of this "spare" area to devote to accelerating incoming writes. The P3700 800GB has an even higher total capacity, but that's not all. Like most server-grade gear, it also sets aside a much larger percentage of its flash as overprovisioned area.
The P3700 is a beast, and so is this test. I can't think of a client application that generates an uninterrupted stream of random I/O for any considerable length of time. One of the biggest challenges with developing this new suite is balancing our desire to push drives to their limits with the need to present performance data that's actually relevant to desktop workloads.
Sequential speeds don't waver over longer tests, so there's no need to draw those results out over time. The same goes for random read rates. IOps is the most commonly used metric for random I/O, but we think response times can be more instructive.
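Part of the reason is that IOps and response times are two views of the same thing: by Little's law, the number of outstanding requests equals throughput times mean latency. A quick sketch, using hypothetical numbers purely for illustration:

```python
# Little's law: concurrency = throughput x latency, so
# mean latency (ms) = outstanding requests / IOps * 1000.
def mean_latency_ms(outstanding, iops):
    return outstanding / iops * 1000

# Hypothetical figures for illustration, not measured results.
print(mean_latency_ms(32, 160_000))  # 32 requests at 160K IOps -> 0.2 ms each
print(mean_latency_ms(32, 80_000))   # half the IOps -> twice the latency
```

At a fixed queue depth the two metrics are interchangeable, but latency maps more directly to how long an application actually waits.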
All the SSDs are in the same ballpark up to four simultaneous requests. The M6e and 850 Pro slow down considerably after that, and they really struggle under our heaviest load. The P3700's response times get slower, as well, but not by nearly as much.
Thanks to our resident developer, Bruno "morphine" Ferreira, we have another storage benchmark with a configurable load. RoboBench is based on Windows' robocopy command, which can be run with up to 128 simultaneous threads. With the aid of a RAM drive, we can use RoboBench to test read, write, and copy speeds with real-world files. Here's a taste of how RoboBench scales when reading files:
The work test uses tens of thousands of relatively small spreadsheets, documents, web-optimized images, HTML files, and the like. Read speeds increase dramatically to start, but the gains peter out as the thread count rises.
The media test comprises much larger movie, RAW, and MP3 files. Four threads are sufficient to reach top speed even on the P3700.
Robocopy defaults to eight threads, so that's probably a good test to use along with the single-threaded config. It's more difficult to make a case for testing additional configurations, in part because of the time required to secure-erase and pre-condition SSDs before any test that writes to them.
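RoboBench itself isn't shown here, but the thread-scaling idea behind it is easy to sketch. The following is a minimal stand-in, not RoboBench's actual implementation: the file counts and sizes are placeholders rather than our real "work" or "media" file sets, and ordinary temporary files stand in for the RAM drive.

```python
# Minimal stand-in for a RoboBench-style scaling test: read a batch of
# files with varying thread counts. Sizes/counts are placeholders only.
import os
import tempfile
import time
from concurrent.futures import ThreadPoolExecutor

def read_file(path):
    with open(path, "rb") as f:
        return len(f.read())

with tempfile.TemporaryDirectory() as d:
    paths = []
    for i in range(64):
        p = os.path.join(d, f"file{i}.bin")
        with open(p, "wb") as f:
            f.write(os.urandom(64 * 1024))  # 64KB stand-ins for small files
        paths.append(p)

    for threads in (1, 4, 8):
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=threads) as pool:
            total = sum(pool.map(read_file, paths))
        elapsed = time.perf_counter() - start
        print(f"{threads:2d} threads: {total} bytes in {elapsed:.3f}s")
```

On cached or RAM-backed storage the gains from extra threads flatten out quickly, which mirrors what we see in the media test.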
The above results provide a small taste of what we're working on for future SSD reviews. I have a tendency to go a bit overboard with testing, but I'm trying to exercise more restraint this time around. We'll see how that works out. Stay tuned.

Gaming on the Grid with Nvidia's Shield Tablet
Music and video streaming are big business these days. Game streaming hasn't taken off in the same way, but Nvidia has big plans for it. For more than a year, the company has been working on a Grid Game Streaming service that pipes PC games to its Shield devices. A preview of that service is available for free on Shield handhelds and tablets until June 30, 2015. Since we've already looked at the Shield tablet's native gaming chops and its local PC streaming capabilities, we decided to give Grid a shot.
Getting in on the preview requires more than just a compatible Shield device. Nvidia also has stringent networking requirements: at least 10Mbps of downstream bandwidth and a ping time under 60 milliseconds. Ample bandwidth is needed to deliver a decent picture at a high frame rate, while low latency is required to minimize the delay between the user's inputs and the on-screen action.
Grid's list of demands also includes a GameStream-ready 5GHz Wi-Fi router, but that requirement isn't strictly enforced. The preview let me play through my ISP's not-at-all-fancy 802.11n router. I also tested the service with an approved Asus RT-N66U, but I didn't notice a difference between the two. GameStream-ready routers may be more important on networks with lots of wireless devices and other local congestion.
According to Grid's built-in network analyzer, even my relatively pedestrian DSL connection meets the bandwidth and latency requirements. Despite measuring a ping time of just 18 milliseconds to Nvidia's servers, the analyzer says Grid is "not yet available" in my location. "You may experience stutter or high latency," it warns. TR's Benchmarking Sweatshop in North Vancouver, Canada apparently lies outside the official catchment area for Nvidia's west-coast Grid installation.
Grid servers are also hosted on the east coast of the U.S. and in Ireland. The Irish deployment is meant to cover "most of Western Europe," and there are plans to expand into Asia next year.
Along with a fast network connection, Grid pretty much requires a proper gamepad. The PC games on tap weren't designed with touchscreen input in mind, and they don't even map well to gamepads all of the time. Fortunately, the Shield controller has a built-in touchpad, and the Grid service has a virtual keyboard, making it possible to navigate situations that call for more traditional PC input.
With all the requisite pieces in place, plugging into the Grid is very straightforward. The library of available titles is displayed in Nvidia's Shield Hub app, and launching a game takes no more than a tap. Nvidia claims the start times are faster than for Sony's PlayStation Now streaming service, but I don't have any PlayStation hardware, so I can't compare the two. The initial load times seem slower than on an SSD-equipped PC, probably due to the time required to spool up the virtual session. After games are up and running, the subsequent load times are reasonably short.
Right now, Grid's selection is limited to 20 titles. There are some high-profile inclusions, like Batman: Arkham City and Borderlands 2, but the library pales in comparison to what's available with other game streaming services. OnLive and PlayStation Now both have much bigger catalogs with over 100 games.
Nvidia clearly has a lot of catching up to do, but it promises to add new titles weekly. For the "Netflix for games" tagline to ring true, Nvidia will also have to add more recent content. The bulk of the games in the Grid catalog are 2-3 years old at least, which is a little stale compared to the fresher content on Netflix.
Now, don't get me wrong. Older PC games can still be a lot of fun, especially on mobile devices. Grid just isn't an effective way to sample the latest and greatest releases—at least so far.
In part because the games are a little older, the visual quality isn't exceptional. Video compression is probably the biggest culprit there. Even with most games configured with high detail levels in addition to antialiasing and anisotropic filtering, the output suffers from a distinct lack of sharpness that's particularly apparent in motion. The 720p default resolution doesn't help on that front, though it's possible to raise the resolution and tweak other graphics settings in some games.
From a normal viewing distance, Grid's output doesn't look too bad on the Shield Tablet's 8" screen. The visual compromises are similarly easy to overlook with the feed piped to my 50" TV, but only when I'm sitting across the room on the couch. The flaws are more glaring up close, including at arm's reach from my 24" desktop monitors.
Nvidia says Grid streams run at 60 frames per second. The frame rate feels that fluid, but moments of lag interrupt the flow. These hitches are generally infrequent and brief, and they're less evident in slower-paced games. However, they're very apparent in Race Driver: GRID, whose constant, high-speed motion makes even minor hiccups noticeable. Here are a few minutes of GRID footage captured with the Shield Tablet's built-in recording capability. Watch for the obvious hitch around the 0:37 mark. (And ignore the frame counter in the top right corner; even though Grid streams should run at 60 FPS, the tablet's real-time recording seems to be capped at half that rate.)
The Shield Tablet makes game recording incredibly easy, so here are some snippets from Arkham City and Borderlands 2. Like the GRID clip, these clips were captured with the default in-game settings and the highest recording quality. Blame YouTube for degrading the picture quality with its own compression.
Although Grid games feel responsive in between the hiccups, lag has definitely caused me to miss apexes in GRID, headshots in Borderlands, and counters in Batman. Worse than that, the threat of stutters striking without warning makes me feel hesitant and disconnected while playing any game.
To be fair, some of my sessions have suffered more lag than others. My first extended stint was late at night, and the experience was generally good. Hitching was more apparent in a subsequent session during prime-time hours, though. Folks with faster connections or closer proximity to Nvidia's servers may have better luck.
For me, the hiccups taint the experience more than the graphical compromises. That's probably because I prefer action-oriented titles that are more sensitive to lag. Also, I've streamed games to the Shield Tablet from a local PC, so I know what the experience is like with a really fast network connection: consistently smooth and much prettier. Local streaming isn't limited to Grid's dated library, either.
For those without access to a local PC, Grid games offer a lot more depth than typical mobile fare. It's also pretty cool to be able to play big-name blockbusters on an Android tablet. The biggest question that remains is how much the service will cost when the free preview expires. Nvidia hasn't divulged any details on that front.
At least in its current state, Grid streaming seems like a better fit for casual gamers than for serious or discerning ones. I can't easily put myself into the head of someone who isn't as picky about graphical quality or consistent responsiveness, but those people do exist, and I'm genuinely curious to see what they make of the Grid preview. The experience may well be good enough for folks who haven't been spoiled by superior local streaming or native PC gaming. Grid probably works very well with Google Fiber, too.

A subjective look at the A8-7600's gaming performance
Update — Faulty memory appears to be behind the crashing we experienced with the A8-7600T. The AMD-branded DIMMs provided with the Kaveri test system produce errors when running Prime95 alongside the Unigine Valley graphics benchmark. These errors occur with the memory clocked at 2133MHz, the maximum speed officially supported by both the DIMMs and the processor. Dialing back the modules to 1866MHz eliminates the errors, and so does swapping in a pair of Corsair Vengeance DIMMs. The Corsair modules passed a 12-hour stress test at 2133MHz without so much as a single error.
Kaveri is AMD's first APU to feature integrated graphics based on the latest generation of Radeon graphics cards. As we learned in our review of the A8-7600, even a cut-down version of this DirectX 11-class GPU can keep up with the latest blockbuster games. Battlefield 4, Batman: Arkham Origins, and Tomb Raider are all playable at a 1080p resolution. The frame rates aren't great—around 25-30 FPS—and the in-game detail settings need to be turned down to get the games running that well. But the action is smooth enough and the graphics are good enough to deliver an enjoyable experience, especially for so-called casual gamers with less refined tastes. Not bad for a $120 processor that can fit inside small-form-factor and all-in-one systems.
Our deadline for the Kaveri review was extremely tight, so there was no time to test the A8-7600 in additional games. However, I wanted to see how the chip handled a broader collection of titles, specifically the older, less demanding games so frequently discounted on Steam. These games may not be the latest and greatest, but they're still a lot of fun, and they're very cheap to buy. Perhaps the A8-7600 could run them with fewer compromises.
Since I was pretty much zombified the day after the review went up, I decided to find out. Installing and playing games was my only real hope of productivity in that state. The following are my subjective impressions and some accompanying screenshots. Clicking the screenshots will bring up a larger, full-resolution image that provides a better sense of how things look.
First, here are some shots from the games we tested in the review. (Our full, inside-the-second analysis begins here.)
I didn't include the full-sized images in the initial article, but they're worth perusing. All three games look better than one might expect from integrated graphics, especially given the display resolution. That said, Batman and Tomb Raider both crashed to the desktop multiple times during testing, and they weren't the only games to have issues.
Speaking of other games, let's look at a batch of first-person shooters.
Borderlands 2 ran reasonably smoothly with only depth of field, ambient occlusion, and antialiasing disabled. The frame rate stuck to around 30 FPS, and I didn't perceive any obvious stuttering. This game is definitely playable, though it did crash to the desktop twice.
Serious Sam also crashed a couple of times. Otherwise, the game ran pretty well with high details and only ambient occlusion and antialiasing disabled. Frame rates bounced around within the 25-50 FPS range depending on how many baddies there were on the screen. The occasional slowdown was noticeable during the heaviest action, but it didn't really affect my enjoyment of the game.
Fraps' frame rate counter showed 30-45 FPS during my Dishonored session. The action felt smooth, with no apparent interruptions to fluid frame delivery. And the game looked decent, too. All the graphical settings were maxed with the exception of the model detail, which was set to normal rather than high, and antialiasing, which was disabled.
Next on the shooter front: Mirror's Edge and Counter-Strike: Global Offensive.
Both of these games ran well on the A8-7600. Counter-Strike regularly hit 60 FPS with the details maxed and FXAA turned on. It felt noticeably silkier than the other shooters, which is exactly what you want in a game that relies on quick reactions.
Mirror's Edge had slightly lower frame rates than Counter-Strike, and I had to disable antialiasing and PhysX effects to make the action stutter-free. After those adjustments, Fraps' FPS counter never dropped below 35 FPS, and Faith's free running felt fluid. Or it did until the game crashed. Twice. Noticing a pattern yet?
Dirt: Showdown crashed to the desktop multiple times, too. It was actually part of the original test suite for the A8-7600 review, but I switched to Tomb Raider after encountering a couple of early crashes on the Kaveri setup. Given another shot, Dirt: Showdown ran pretty well, at least between subsequent crashes. With high details and antialiasing disabled, the frame rate hovered around 30-35 FPS. There were no obvious stutters or slowdowns.
The only crashing problem in Need for Speed: Shift 2 Unleashed was hitting other cars, and I can't blame the game or the APU for that. With high in-game detail settings, the A8-7600 cranked out 25-30 FPS. The frame rate dipped to the lower end of that range when there were more cars in front of me, but that didn't make the gameplay feel sluggish.
More game crashes hit when I tackled Sleeping Dogs. One of them even hosed part of the Windows install, forcing me to re-image the system. Ugh.
When it wasn't crashing, Sleeping Dogs was too choppy with high details. The game ran at 30-45 FPS with medium details, though. Scaling back the eye candy sacrificed the slickness of the environment, but it was necessary to even out the frame delivery and eliminate stuttering. And the graphics still looked all right.
Just Cause 2 is enjoying something of a renaissance thanks to a free multiplayer mod. The mod crashed on me several times, but the standard, single-player version of the game ran without issue. And it ran very well, too. With high details and everything but antialiasing and ambient occlusion disabled, there were no noticeable slowdowns in the frame rate. Fraps reported 30-45 FPS for the duration of my test session.
A couple of more casual games, Dyad and Trials Evolution Gold, performed impeccably with all their in-game detail settings turned up. Not that we should be surprised. These titles are much simpler than the other games we've looked at so far.
Dyad and Trials Evolution Gold were immune to crashes, and the Kaveri system was perfectly stable in all our non-gaming tests, including those that tapped the integrated Radeon via OpenCL. The A8-7600 still had problems with exactly half of the games we played, though. That's a lot, especially since these aren't overly obscure titles. (I'm not counting Just Cause 2 multiplayer, which could probably use more polish.)
AMD's OverDrive utility showed no evidence that the APU was overheating. Also, there were no problems with Richland-based APUs running in the same test system and with the same drivers. Those chips have an older integrated graphics architecture that may use a separate driver code path, so perhaps this is just a software issue that can be ironed out with a future Catalyst driver release. Fingers crossed.
In between crashes, the A8-7600's gaming chops impressed me. This APU is fast enough to run lots of really good titles at 1080p resolution, and it can handle older games without too much sacrifice. That said, there are still some compromises involved. Even in older games, it's rare to be able to turn the detail settings all the way up, and antialiasing often causes slowdowns. Some visual fidelity is inevitably lost versus what can be achieved with a more powerful GPU.
Some smoothness is lost, as well. Although the A8-7600 was largely stutter-free in the games we tested, the lower frame rates we experienced in more recent titles didn't feel as fluid as the 60 FPS we got in Counter-Strike. The APU was fast enough to run the games we played at the settings we used, but the performance definitely wasn't ideal for most of those titles.
For casual audiences with less refined appetites, the A8-7600 is probably fast enough. Connoisseurs are unlikely to be satisfied, though, and I question how well future titles will run on the chip. The Xbox One and PlayStation 4 have a lot more GPU grunt than Kaveri's integrated Radeon, and developers are likely to target those platforms as their new baseline. Perhaps Kaveri can serve as a sort of gateway drug by giving people a taste of PC gaming without all the trimmings.
For more than a decade, an Xbox console has lived under my television. I started with the original and moved on to the 360, and I have to admit that both offered a great couch-gaming experience in their day. The Xbox One isn't in my future, though. Instead, I'm going to build myself a new home-theater PC.
Now, don't get me wrong. I'm not anti-Xbone or even anti-console. The latest generation of Xbox and PlayStation machines has definite appeal. But, for the first time ever, the PC is comfortable enough in the living room that I don't need a complementary console.
PCs have long worked in the living room, of course. I've had one hooked up to my TV forever. For a long time, though, home-theater gaming rigs felt like transplanted desktops. Even when dressed up in spouse-friendly enclosures and filled with quiet hardware, they were always hampered by a Windows operating system that wasn't designed to be used from the couch.
Then, along came Steam's Big Picture GUI. Unlike the Windows desktop, this interface can be navigated comfortably at a distance with little more than a gamepad. The graphics are decent, the layout is reasonably intuitive, and everything feels generally snappy. Big Picture Mode fundamentally changed my living room gaming experience for the better, and it's the single biggest reason I'm passing on the new generation of consoles.
The supersized Steam UI still has some rough edges. Not all games have controller support, and some of those that do still require a keyboard and mouse to log in to third-party services like Uplay and GFWL. New additions can require a trip to the desktop to install DirectX and other packages. These quirks are annoying, but they reinforce the superiority of the Big Picture UI. When everything works as it should, the interface offers a seamless experience from purchasing to playing.
Big Picture mode is limited to Steam games, but that's more of a benefit than a detriment. Steam is easily the best game distribution platform around. Downloads are speedy, the DRM restrictions are generally reasonable, community-generated content is encouraged, and the selection of titles is incredibly diverse. Bargains abound, too, making it possible to build an extensive game library on the cheap. I've amassed a collection of really excellent indie titles at only a few bucks a pop. Blockbusters aren't as cheap, but they receive plenty of discounts of their own.
Thanks to Steam, PC games are generally cheaper than their console counterparts. I wouldn't go so far as to say PC gaming is cheaper as a whole, but over the long run, the difference is narrower than the console sticker prices suggest.
Up front, there's no question that buying a gaming PC is more expensive than getting the latest console. Once you factor in the cost of comparable hardware, a decent controller, and a Windows license (sorry, SteamOS just isn't there yet), it's hard to come anywhere close to the Xbone's $500 price tag, let alone the PS4's $400 sticker. PCs deliver a lot more flexibility—and often a lot more power—but you pay for it.
The thing is, consoles aren't really as cheap as they seem on the shelf. The Xbox One and PlayStation 4 both have subscription fees attached to online multiplayer gaming. Microsoft charges $60 for a year of Xbox Live Gold, and Sony demands $50 for its equivalent PlayStation Plus package. Even with the attached freebies and perks, those fees add up to quite a lot over the typical console lifespan.
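Here's the back-of-the-envelope version of that math. The seven-year lifespan is an assumption for illustration, not an official life-cycle figure:

```python
# Subscription fees over an assumed console lifespan.
# The seven-year lifespan is an assumption, not an official figure.
xbox_live_per_year = 60  # USD, Xbox Live Gold
ps_plus_per_year = 50    # USD, PlayStation Plus
years = 7

print(f"Xbox Live Gold over {years} years: ${xbox_live_per_year * years}")
print(f"PlayStation Plus over {years} years: ${ps_plus_per_year * years}")
```

Under those assumptions, the fees alone approach the consoles' sticker prices.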
Well, they do over the typical console life cycle. It's not uncommon for consoles to die long before their successors are released. I'm on my third Xbox 360 already, and I was never a heavy user. I know folks who have suffered more Xbox failures and some who have been through multiple PS3s. Most of these deaths have occurred after the warranty expired, leaving owners on the hook. Console prices tend to fall over time, so at least replacements are cheaper than the initial units. They still represent an additional cost, though.
PCs fail, too. Unlike with consoles, however, DIY repairs are a breeze. Individual components can be replaced with off-the-shelf hardware—and without special tools or firmware hax0ring. PC parts may not get cheaper like consoles do, but the options at each price point improve over time. For example, low-end graphics cards generally perform similarly to mid-range offerings from a few years prior.
The rapid rate of PC hardware development has definitely lowered the cost of a decent living room rig. Even relatively modest machines can render the latest blockbusters at 1080p resolution without issue. Yet the Xbone and PS4 have to scale some games back to lower resolutions to deliver smooth frame rates. There's no way to upgrade their guts to get better visuals, either. Equipping a PC for the latest 4K TVs isn't cheap, but at least it's an option.
To be honest, cost has never really been a deciding factor for me. My home-theater gaming rigs have largely been cobbled together from old review hardware, so they've always been cheap to build. They've also had a lot more horsepower than the consoles sitting next to them. But, for more than a decade, I kept using consoles because they provided a smoother, more enjoyable overall gaming experience from the couch.
The Xbone and PS4 are still simpler propositions than a modern gaming PC. However, this latest console generation faces a PC gaming ecosystem that's much more competent in the living room. The gaming rig under my television no longer feels like a second-class citizen, so I can finally ditch consoles completely. Anyone want to buy a dust-covered Xbox 360?

Brawling my way through Batman: Arkham Origins
I keep hearing that winter is coming, but in Gotham, it's already here. Snowflakes swirl as I glide through the dim moonlight and onto another empty rooftop. My boots crunch into inches of fresh powder, and the next thing I hear is voices below. The crude, thuggish banter tells me what to expect before I even peer over the ledge. There's at least half a dozen of 'em, some armed and armored, and all oblivious to the shadow perched above.
My mission lies blocks away, and these miscreants aren't related. But they're here and, well, I can't help myself. A moment later, I'm careening through the night toward my first target. His body crumples into the pavement, crushed under my weight. Then the dance begins.
The next victim gets clocked before he realizes what's happening. I connect again and again while his compatriots clue in and begin to circle. One swings a pipe, but my deft counter delivers a critical hit. The next few minutes are a flurry of violence. I leap between opponents like a wrecking ball, my attacks growing more brutal with the building momentum. Most strikes are delivered with carefully timed precision. Every so often, though, I give in to the beast and explode in a burst of unbridled aggression.
Time slows down as the last man falls. For a moment, I feel at peace. This is exactly why I was so excited to play Batman: Arkham Origins.
This franchise and its satisfying combat system were introduced in 2009 with Batman: Arkham Asylum. Two years later, Arkham City polished the package and added a giant city to the mix. Both of those games were created by Rocksteady Studios, but the torch was passed to Warner Bros. Games Montreal for the latest Arkham Origins chapter. That fact made me nervous at first, but a few moments with the game confirm that the core principles remain intact.
For me, the Batman games are all about brawling. Arkham Origins sticks to the established formula here. The controls are tight, and the mechanics reward thoughtful, coordinated attacks over frantic button mashing. Reckless assaults will still do in a pinch, though. The balance between aggression and restraint is what makes the combat so addictive for me. Every encounter is different, and Arkham Origins adds a couple of new enemies to spice things up.
Some situations demand a stealthier approach. Players often find themselves in cavernous settings littered with heavily armed opponents and multiple ways to skulk around. Enemies must be picked off carefully, and there are several ways to subdue them from above, below, and with the gadgets stowed in our hero's utility belt. Arkham Origins has a couple of new toys, including a slick double-ended claw that strings up tightropes and can be used to hurl explosive canisters at enemies.
Although much of the action offers players multiple paths and methods, some of the boss fights feel very scripted. One of the early ones is pretty much just an endless stream of quick-time counters. I couldn't pass it without robotically following the on-screen cues, and even then, it took a couple of tries. That sapped all the fun out of the experience.
The indoor missions lead players along linear paths, and there's generally only one way around a given obstacle. The hand-holding continues with the detective work, which is essentially an interactive cinematic that revolves around "finding" clues identified by giant red markers. It's neat to see crimes recreated from the evidence, but there's no real thinking involved.
Outdoors, Arkham Origins provides a vast playground filled with side quests, activities, and random thugs to engage. There's a new fast-travel system that can be unlocked by capturing various control points across Gotham. I like having the option to warp across the map, but so far, I've been traveling on foot and grappling between rooftops. Coupled with the gliding and diving dynamics, the ability to accelerate past grapple points makes covering distance easy. There's plenty of scenery to take in, too.
I've been playing on a hot-clocked GeForce GTX 680 with the eye candy and PhysX dials turned all the way up. The graphics aren't mind blowing, but this is still a good-looking game with some very slick effects. The snow is especially crystalline; though it doesn't swirl with particle-driven excess during battle, beatdowns leave a nice impression, as do the player's feet. There's some neat heat shimmer and smoke, too, and Batman's cape looks reasonably fluid, despite a few clipping problems. Perhaps I'm spoiled, but some of the models and textures could use more detail. The same goes for the cinematics I've seen thus far.
After nine hours of play time, I'm about halfway through the story. The narrative hasn't really grabbed me, perhaps because I'm constantly distracted by random street fights and other excuses to throw down. The challenge maps are particularly fun; they serve up one fist fight after another and are perfect for quick sessions. Arkham Origins also has a whole online multiplayer component I haven't even touched.
Since I haven't sampled the multiplayer, which is all new and developed by Splash Damage, it seems a little unfair to criticize the game for offering more of the same. But the single-player component very much feels like a tweaked and massaged version of Arkham City. There's some comfort in the familiarity, and there's nothing wrong with offering a slight variation on a successful recipe. I clearly haven't tired of brawling. However, part of me wishes Arkham Origins brought something bigger to the experience. There was quite a jump between Arkham Asylum and Arkham City, but the step up to the latest chapter feels like a small one. I hope a bolder leap forward lies in Batman's future—just as long as it doesn't ruin the combat.
Google, I love you, but you're bringing me down
I was listening to LCD Soundsystem the other day, and New York, I love you, but you're bringing me down somehow coalesced my feelings on Google's recent behavior. Dunno what happened there, but being well into a growler of Tofino Brewing's Hoppin Cretin IPA might have contributed. In any case, I'll take inspiration where I can get it.
Google, you're perfect, please don't change a thing.
If it weren't for you, we might have to use Bing.
Gmail captured my heart, and Android played a part.
If only I could, I would give you a ring.
I love Google. I really do. Its search engine has been a vital part of my Internet experience for what feels like forever. As far as I can tell, Google search remains the best way to find information on the web, especially since it's started spitting out knowledge along with links to third-party content.
Then there's Gmail, the slickest free webmail solution around. Gmail has been boss since back when it was an exclusive, invite-only club. I use it constantly for my personal and work correspondence on every computing device I own.
Android is my preferred platform for mobile devices. To me, it feels much more like a real operating system than iOS. I'll take basic freedoms like file management over the tightly controlled Apple experience any day. Android had some rough edges in its infancy, but it's improved dramatically over the past couple of years thanks to sensible optimizations and thoughtful updates. Speaking of which...
Google, I love you, but you're bringing me down.
I thought we were close, but that's all turned around.
Your KitKat may be sweet, still I fear we'll never meet.
Turns out my Nexus is too old for this round.
In the time that I've owned it, my Galaxy Nexus smartphone has become a faster, more capable device thanks to new Android releases. Google Now integration, combined with speech recognition for search, has made it much easier to get the information I want quickly and easily. Smaller tweaks have been welcomed, too, but Android 4.1's "Project Butter" optimizations take the cake. They made the whole UI much smoother and more responsive.
You can imagine my delight when Google revealed that Android 4.4 KitKat includes "Project Svelte" enhancements aimed at more efficient memory usage. The latest version of the OS is designed to improve performance on devices with as little as 512MB of RAM and to speed up multitasking for all. Such an update seems perfectly suited to the two-year-old Nexus. Google doesn't agree, however. Its KitKat FAQ says the Galaxy "falls outside of the 18-month update window when Google and others traditionally update devices."
What's the point of optimizing Android for less potent hardware if you're not going to bring older, less potent devices along for the ride? Cheaper handsets like the Moto G, probably. And wearables, I suspect.
As Google points out, its 18-month upgrade window is typical for the industry. And that's the problem. Google wasn't supposed to be like the others; it was supposed to be better. Nexus devices don't look quite as sweet when their ticket to OS updates expires 18 months after introduction. At least the ROM community might be able to pick up the slack.
Google, you're awesome, but you're cramping my style.
Your SD aversion has irked me for a while.
The Nexus 5 looks legit, and part of me lusts for it.
But 32 gigs are too few for my files.
Even with a limited OS upgrade path, Nexus devices are still pretty sweet. If only Google weren't allergic to equipping them with expandable storage. The official line is that Google wants to unify storage on a single volume. There are benefits to that approach, and the internal storage should be faster than the microSD alternative. However, Google has been negligent on the other side of the equation. All of its current Nexus devices top out at a measly 32GB. Meanwhile, the latest Apple devices are available with up to 64GB and in some cases 128GB of flash.
Flash chips have become smaller and prices have plummeted in recent years, so there's no excuse for skimping. Apps are only getting larger, especially games designed to take advantage of the latest hardware. Modern camera sensors pack ever more megapixels, expanding the footprint of the pictures we take and the videos we record. Meanwhile, high-PPI displays encourage us to consume media with the highest fidelity—and corresponding file size.
Cloud storage is supposed to bridge the gap, but that's an ugly compromise for all kinds of reasons. What if you have a limited data cap or a slow connection? What if you don't want your data floating around in the ether, where the NSA and others might be able to find out about your Nickelback bootleg collection? Being limited to 32GB of local storage is potentially crippling for both power users and folks with extensive media libraries.
Google, I love you, but you're bringing me down.
You've taken this smilie, turned it to a frown.
Maps that guided my life are now the cause of much strife.
What ever happened to holding my hand out of town?
Google Maps is in a class of its own. For some reason, though, Google seems intent on neutering the Android app. First, it obfuscated the process of caching maps locally, an essential feature for travelers who don't want to pay exorbitant roaming fees. Then, it removed My Maps functionality entirely, preventing carefully crafted maps from being accessed on mobile devices.
The Maps app has undergone some questionable UI changes, as well, although I'm not nearly as irate about those tweaks as some of the recent reviewers on Google Play. The lost functionality bothers me the most, in part because it makes me reluctant to depend on anything Google makes.
Google, you give and then you take away.
Yet it's hard to complain because I never pay.
Reader may be the worst, but it wasn't the first.
Why get attached if I don't know if you'll stay?
I never really got into Google Reader. When it was shuttered this summer, my life was largely unaffected. But I felt for the folks who had come to rely on the service to digest all of their feeds. Google gave them something they loved and then took it away.
iGoogle didn't stick, either, though it had a good run before being shut down at the beginning of the month. Google has ended other services, too, making me question whether one of my own favorites will be targeted in the next round of "spring cleaning." Since Google's products tend to be free, it's hard to complain too loudly when they're yanked. If only that lessened the feeling of loss.
Google, come on now, you're up in my grill.
I thought we were cool, I thought we were chill.
Now I get why there's plus, social network's a must.
But making me sign up... I swear I could kill.
When Google+ was introduced, I barely noticed. Everyone needed a social network because, um, Facebook, or something. Google's approach to sharing at least seemed to be more sensible than the status quo, but I never paid too much attention, mostly because I'm just not interested in social networking. Google was OK with that, at least for a while. Lately, it's been trying to jam Google+ down my throat.
Google+ first snuck into my life via Gmail, and it's now infected YouTube. Pretty soon, I wonder if any of the company's services will be accessible without a plus account. And for what? So Google can claim to have a bunch of active accounts owned by people who still prefer to hang out at Zuckerberg's house?
I know, I know. Other companies pull this kind of crap all the time. But Google+ was supposed to be the opt-in social network, and Google was supposed to be better than all this.
And oh, just take me off your circles list.
That no one even knows exists.
No matter how much you insist.
Maybe I'm wrong. And maybe you're right.
And maybe I'm spoiled, and this is driven by spite.
Google, I love you, but you're bringing me down.
The little things add up, and then they start to drown.
I take all this for free, then make it about me.
But deep inside I feel like I've been clowned.
Adventures in left-handed mousing
I'm right-handed. I use my dominant hand for everything from scrawling my name to hurling tennis balls for my dog to brushing my teeth. My right hand also spends an awful lot of time clutching my mouse. That wasn't a problem years ago, when I had the stamina to put in a full day working for TR and then spend hours in the evenings dealing out headshots in first-person shooters. But perhaps due to those marathon sessions, I've developed a bit of an RSI issue in my right shoulder. Recently, it's become difficult to make it through the day without some mousing-related discomfort, especially if I spend a lot of time in Excel. The twinge that manifests in my shoulder toward the end of my shift is painful enough that my freshly downloaded copy of Battlefield 4 remains unplayed.
Working less isn't really an option; there are news posts to write, articles to edit, and reviews to crank out. I could revamp my workstation, but my better half is an occupational therapist, and she tells me that my current setup is pretty close to ideal. The desk could be a bit lower—or the chair a little higher—but that's about it. Even if I got the height just right, I'd still be reaching to the right of the keyboard to use the mouse. That's the problem, she says.
My solution, at least for now, is switching hands.
Having done a fair amount of left-handed mousing after breaking my right ring finger last year, I'm no stranger to the off-hand approach. That initial foray involved moving my usual mouse, a Cyborg Gaming Rat 7, over to the left side of the keyboard. The Rat worked in a pinch, but its asymmetrical body is a poor fit for lefties. The shape is all wrong, and the thumb buttons are on the opposite side. Pressing side buttons with one's pinkie is more than a little awkward.
My lefty stint with the Rat 7 taught me another important lesson: I'm pretty lousy with my non-dominant hand. I can move the pointer more or less where I want it within a reasonable amount of time, but forget about hitting a precise target with any semblance of speed. This dynamic was particularly frustrating when transitioning between simple desktop tasks and more detailed work like photo editing, which often requires pixel-perfect positioning. The Rat 7's on-the-fly sensitivity switch proved to be invaluable, allowing me to dial down the DPI to compensate for my lack of coordination.
With those memories fresh in my mind, I started looking for a suitable mouse—something with a thumb button on the right side and an easily accessible sensitivity switch. The selection of left-handed and ambidextrous mice is pretty limited, and most are uber-cheap models that lack premium features like DPI control. In the end, I settled on the SteelSeries Sensei Raw, which has an ambidextrous shell, buttons on both sides, and a high/low DPI switch just behind the scroll wheel. The Sensei is pretty affordable, too. Newegg sells it for only $48, which is less than the ambidextrous Razer alternative.
After a few days of using the Sensei, I'm already in love with its soft-touch exterior. The body is a little small for my tastes, but it's a big improvement over the Rat, at least for my left hand. The wheel and buttons feel solid, the braided cord is incredibly long, and the feet slide smoothly on my desk. Admittedly, the pulsing internal LEDs are a bit much for me, but there are options to tone down the brightness, swap the pulsing for a steady glow, and turn off the lights completely.
Configuring the Sensei for left-handed use is easy. The drivers switch the left and right buttons automatically, but the thumb buttons must be bound manually. That's easy enough, and thanks to built-in macro functionality, side-scrolling and other combos can be tied to any button. SteelSeries software also includes sliders for each of the dual sensitivity modes. Sensitivity can be set anywhere between 90 and 5760 DPI, which is plenty of range for my needs. There's more than enough granularity, too.
After a simple initial setup, integrating the Sensei into my daily routine has proven to be somewhat difficult. The problem isn't mousing with my left hand. Instead, it's simultaneously executing key combinations with my right.
Despite its dominant nature, my right hand is comically inept at hitting vital keyboard shortcuts for copy, cut, paste, and undo. Not only are those shortcuts on the wrong side of the keyboard, but they also feel backward when executed with my right hand. The same functions can be performed with mouse input alone, of course. I can also lift my hand off the mouse and punch Ctrl+whatever with my left hand. But both of those solutions are slower and less efficient than a tag-team approach, especially with my mousing hand already at a disadvantage.
On the flip side, I'm used to moving my right hand back and forth between the numpad and mouse when entering data into Excel. Using the numpad with my left hand had never felt natural, probably because it involved twisting my body or relocating the keyboard. With the mouse in my left hand, my right rests comfortably on the numpad, avoiding the side-to-side movement that aggravates my shoulder.
Speaking of which, mousing with my left hand has definitely dampened the RSI symptoms on my right side. I'm still using my right-handed mouse from time to time, usually when something needs to be done as quickly as possible, but balancing the load definitely helps. Just days after adding a lefty to my arsenal, my right shoulder already feels fresher.
Mousing with my left hand feels less awkward, too. My speed and accuracy seem to be improving with each day, and I'm finding that I have to concentrate less to get the cursor just where I want it. Movement that was once thoughtful is becoming more automatic. You won't find me gaming with a lefty stance anytime soon, though. I may become sufficiently productive on the desktop, but I doubt I'll ever be as deadly with my non-dominant hand.
The thing is, I don't have to be as good with my left hand. The more time I spend dual-wielding, the more I like the approach. I'm getting used to shifting high-priority tasks to my right hand and more casual mousing to my left. So far, I've been able to lighten the load on my shoulder without completely compromising my productivity. With some custom keyboard macros, I might even be able to get around my shortcut woes. Even if I don't, my days of one-handed mousing are definitely over.