TR's VR journals, part one: setting the stage
— 5:33 PM on July 13, 2016

Over the past couple months, I've been places. I've dangled from the side of a rock wall hundreds of feet above crashing waves. I've set foot on a shipwreck deep beneath the ocean and hung out with a humpback whale. I've laid waste to pallets of unsuspecting fruit with a pair of katanas. I've led a research mission on an alien world full of exotic life forms. I've been to Taipei to look at a bunch of new PC hardware. (OK, that last one actually happened.) No, I don't have a holodeck or transporter pad installed in my office. Consumer virtual reality gear is here now. I've been spending lots of time strapped into the big two virtual reality headsets that came out this year: Oculus' Rift and HTC's Vive (built in partnership with Valve Software).

Since dedicated VR headsets require powerful gaming PCs to do their thing, both AMD and Nvidia are betting big on VR tech, too. AMD's latest graphics card, the $200-and-up Radeon RX 480, is priced specifically to make VR-capable graphics more affordable, while Nvidia's Pascal architecture even boasts a couple of VR-specific features in its silicon. Nvidia's VRWorks and AMD's LiquidVR both help developers address the specific programming needs of VR titles on Radeons and GeForces. Even PC companies like Zotac and MSI are thinking about how VR is going to change the personal computing experience. This is just a tiny slice of the vast number of software and hardware companies making a move into VR. If you hadn't already guessed, this technology is a Big Deal.

All that buzz has arisen for good reason. VR headset makers often talk about "immersion" and "presence," the sense that you're really inhabiting the virtual worlds viewed through these headsets. At their best, VR headsets really do create a feeling of being transported to another place, whether that's the seat of a Group B rally car or the bowels of Aperture Science. That immersion becomes especially deep when one can actually reach out and touch virtual environments in a room-size space, like one can with the Vive. I often find myself saying "wow" when I enter a new VR experience for the first time, and I imagine I've looked like the clichéd image of the guy below more often than I'd like to admit. When you put on these goggles for the first time, though, it's easy to understand why Oculus relies so heavily on this image to convey what VR is like.

Now that the Vive and Rift are on the market, though, one could be forgiven for thinking that some of the shine has come off. There are some fully fleshed-out gaming experiences available on both platforms, but many of today's VR games are experimental and rather short. Valve's The Lab demo is typical of the breed: a collection of mini-games that's a fun, if not particularly deep, introduction to VR. While it's easy to understand that developers aren't yet committing the full force of triple-A resources to either headset, that approach might go some way toward explaining the apparently tepid response to these technological wonders a couple months in.

Recent Steam hardware survey results report that vanishingly small numbers of respondents on the platform have bought into either VR hardware ecosystem. While not every Rift user is going to have Steam installed, the numbers still aren't large. Even when people do buy into VR, it seems like they travel to virtual worlds less and less. Razer CEO Min-Liang Tan recently conducted a Twitter poll regarding the frequency of VR headset usage, and of 2,246 respondents, 50% said they don't even use their VR hardware any more. Another 33% use their headsets "just once in a while," and only 17% claim to strap on their head-mounted displays daily. Those are potentially troubling numbers for systems that cost hundreds of dollars—and that's before we consider the $1000 or more needed to build a compatible PC. More likely, we're just seeing the slow development of a nascent platform.

Another obstacle to getting folks to buy into VR may be that describing what it's like is incredibly hard without actually strapping a headset to someone's face. (Valve's clever augmented-reality setup, as seen above, comes close, though.) Words, pictures, and video are all abstractions of reality to begin with, and it's really hard to use them to talk about something that promises to whisk you away to another world. I imagine many people I've told about VR feel sort of like families did when our parents used to pull out slide projectors and show off vacation photos. "You have to be there" may be true, but it's ultimately unsatisfying for the folks who haven't yet shared the experience.

Judging the Rift and Vive is also challenging because these are incredibly personal pieces of equipment. No other computing device of late straps on to one's head and blocks out the world, so it's important to get some face time with both devices instead of relying on one reviewer's sweeping proclamations. To give just one example, I have a huge head that stretches the Rift's strap adjustments to their limits. Its ring of face foam also doesn't make even contact with my skull, so the headset creates uncomfortable pressure points on my forehead. The Vive's strap system may not be as slick as the Rift's, but its generous foam donut and broad range of strap adjustments let me get a much more comfortable fit. Other people find the Vive unbearably heavy and clunky. Since we all have different heads and musculatures, I think trying on the Rift or Vive before plunking down hundreds of bucks is mandatory.

Given those challenges, trying to review these headsets in an open-and-shut manner as we do with cases and graphics cards just isn't going to work. Along with the need for individual test-drives and the rapidly changing software landscape, not all of the pieces of the puzzle are in place yet for us to really issue a verdict on the Rift or Vive. For example, Oculus just finished fulfilling its pre-order backlog for the Rift, and it has yet to release its Touch hand controllers. Those controllers are going to massively change the way that Rift users interact with VR, but for now, they're only in the hands of developers and conference demonstrators. Every day brings new software for the Rift and Vive that could change the value proposition for either platform.

Because of this rugged new frontier, we're not going to write one review of each headset and call it good. Our conclusions now may not have anything to do with the way VR plays out on the Rift or Vive months down the line. Instead, we're going to be writing a series of articles—TR's VR journals, if you will—that chronicle our journeys through virtual worlds and the hardware we're using to get there. This way, we'll be able to talk about VR as it stands right now while we map out the broadening VR universe over time.

In the coming weeks, we'll talk about the Rift and Vive hardware, the process of setting them up in various spaces, and the different types of experiences each platform offers. Later on, we'll examine what's inside each headset's shell, the hardware and software magic that makes them work, and the challenges and methods involved in measuring VR performance. We'll try and intersperse that coverage with brief reviews of new games and experiences as they arrive so that you're up-to-date on the places you can go with your own Rift or Vive. We'll also be examining VR experiences that don't even require a PC, like Samsung's Gear VR, and what they mean for this burgeoning space. We expect it'll be a wild ride. Stay tuned.

Radeon Pro Duo spearheads AMD's push for VR dominance
The red team gets spicy with Capsaicin
— 6:00 PM on March 14, 2016

GDC—At its Capsaicin event this evening, AMD took the wraps off a wide range of software and hardware projects that put the spotlight on virtual reality. The company boasts that its products underpin 83% of VR-capable entertainment systems around the world, a figure driven in large part by the company's presence in the console space. AMD is also exploring VR opportunities beyond gaming in the health care, education, media, training and simulation, and entertainment industries.

First and foremost, the company is releasing a new graphics card called the Radeon Pro Duo. This is the dual-Fiji card (previously known as Gemini or the Fury X2) that CEO Lisa Su first showed off in June of last year. The card comes with 8GB of HBM VRAM. Like the Fury X before it, this card relies on liquid cooling to manage its prodigious heat output. According to AnandTech's Ryan Smith, the card will come with ISV-validated drivers for professional applications, much like AMD FirePro cards.

The Pro Duo is the first card in what AMD is calling its "Radeon VR Ready Creator" program—products meant to serve double duty as powerful VR development and playback platforms alike. The company says the card will deliver 16 TFLOPS of compute performance, and it'll be available early in the second quarter of this year with a $1500 price tag.
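
For a rough sense of where that 16-TFLOPS figure comes from, here's a back-of-the-envelope check. The two-Fiji, 4096-stream-processor-per-chip configuration is known; the roughly 1-GHz peak clock is our own assumption, since final clocks haven't been quoted here.

```python
# Back-of-the-envelope check on the Pro Duo's quoted 16 TFLOPS.
# Known: two Fiji GPUs with 4096 stream processors each.
# Assumed: a peak clock of roughly 1.0 GHz (final clocks weren't given).
gpus = 2
stream_processors = 4096       # per Fiji GPU
flops_per_sp_per_clock = 2     # one fused multiply-add counts as two FLOPs
clock_ghz = 1.0                # assumption

tflops = gpus * stream_processors * flops_per_sp_per_clock * clock_ghz / 1000
print(f"Peak FP32 throughput: about {tflops:.1f} TFLOPS")   # -> about 16.4 TFLOPS
```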

Polaris may be the first step in AMD's next generation of GPU architectures, but Radeon Technologies Group VP Raja Koduri also shared a tantalizing look at the company's longer-term roadmap. While AMD admits this roadmap is subject to change, the projected info does offer a first look at when we can expect various features (like HBM2) to arrive on future AMD graphics cards.

AMD is also partnering with Crytek to put Pro Duo-equipped PCs in universities around the world as part of Crytek's VR First project. Crytek and AMD plan to work together to promote VR development for head-mounted devices like the Oculus Rift and HTC Vive using AMD's LiquidVR SDK. LiquidVR is a set of development tools for virtual reality applications. AMD says that its tools let devs perform GPU-accelerated head tracking, harness multiple graphics cards to scale rendering performance, pass head-tracking data to the GPU with a minimum of latency, and ease the process of connecting and displaying VR content on VR headsets.
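
To give a feel for why passing head-tracking data to the GPU at the last possible moment matters, here's a toy illustration. The millisecond figures below are made-up assumptions for the sake of the example, not LiquidVR measurements or API behavior.

```python
# Toy illustration of why minimizing head-pose latency matters in VR.
# All numbers here are illustrative assumptions, not LiquidVR measurements.
refresh_hz = 90                            # typical Rift/Vive panel refresh rate
frame_interval_ms = 1000 / refresh_hz

early_pose_age_ms = 18    # assumed age of the head pose at display time if sampled early
late_pose_age_ms = 5      # assumed age if the pose is updated just before rendering

for label, age_ms in (("early pose sample", early_pose_age_ms),
                      ("late pose sample", late_pose_age_ms)):
    print(f"{label}: pose is ~{age_ms} ms stale when displayed "
          f"(~{age_ms / frame_interval_ms:.1f} refresh intervals)")
```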

Along with the VR Creator product tier, AMD is also introducing a "Radeon VR Ready Premium" badge that will identify Radeon graphics cards and Radeon-equipped PCs that should offer a high-quality VR experience. One of those systems is the HP Envy Phoenix desktop we looked at a couple weeks ago. Cards from the Radeon R9 290 series and up should be eligible for the VR Ready Premium badge.

Last, but certainly not least, AMD showed off a version of its Polaris 10 GPU running Hitman. From what we know so far, its smaller sibling, Polaris 11, is AMD's "console-class" graphics chip for thin-and-light notebooks. The company expects that graphics chips and cards using Polaris 11 silicon will be able to deliver as much as two times the performance per watt of Nvidia's GeForce GTX 950. The company says that the chip it demonstrated in December of last year was Polaris 11. If that's the case, a Polaris 11 chip running Star Wars Battlefront at 1080p and 60 FPS drew about 90W at the wall. A similarly-configured, GTX-950-equipped PC drew about 140W during that same demo.
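
Those wall-socket figures make for a quick sanity check on the perf-per-watt claim. Since both demo systems were capped at the same 60 FPS, performance per watt scales inversely with power draw; the 50-W "rest of system" figure below is purely an assumption for illustration.

```python
# Quick perf-per-watt math from the Battlefront demo (1080p, capped at 60 FPS).
# With both systems delivering the same frame rate, perf/W scales inversely
# with power draw. The 50 W "rest of system" figure is an assumption.
polaris_system_w = 90
gtx950_system_w = 140
rest_of_system_w = 50   # assumed draw of CPU, RAM, storage, etc. in each system

print(f"Whole-system advantage: {gtx950_system_w / polaris_system_w:.2f}x")
print(f"GPU-only advantage:     {(gtx950_system_w - rest_of_system_w) / (polaris_system_w - rest_of_system_w):.2f}x")
# -> about 1.56x at the wall, and a bit over 2x for the GPU alone under that assumption
```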

Microsoft's push for a unified cross-platform gaming experience backfires
— 6:26 PM on March 1, 2016

Over the past few years, Microsoft has made a few attempts to build bridges to the PC gaming community. More often than not, though, those efforts have ended with unhappy customers and the perception that Redmond is out of touch with PC gamers. For a while, it felt like Microsoft and the PC gaming community had reached an uneasy peace. Over the past week, though, it feels like we've gone from zero to "WTF?" again at a rapid pace.

This most recent spat started with the release of some weird-looking benchmark results for the latest beta of Ashes of the Singularity from the folks at Guru3D. Ryan Shrout at PC Perspective attempted to get to the bottom of this issue, and he discovered some troubling behaviors that might become par for the course for DirectX 12 games in general and Universal Windows Platform games in particular.

According to Shrout, vsync remains turned on for DirectX 12 games running on AMD hardware, even if it's turned off in a game's settings. He discovered that turning off vsync in Ashes doesn't actually do what one might expect. Instead, the game engine will render as fast as it's able, but any frames that fall outside the vsync interval will simply be dropped from the pipeline. Shrout warns this behavior can still lead to judder during gameplay, even if it does eliminate tearing.

Shrout says that's all thanks to games and drivers taking advantage of the Windows Display Driver Model (or WDDM) 2.0. In conjunction with DirectX 12, this new model apparently requires games to use a new compositing method that's similar to borderless windowed mode in today's applications. Using this compositing path lets games run without tearing, but it apparently makes it harder for games to render with the uncapped frame rates that PC gamers have come to know and enjoy.
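
A quick sketch shows why "render as fast as you can, show only what makes the refresh" can still feel rough. The 60-Hz display and 80-FPS render rate below are illustrative assumptions, not measurements from Ashes.

```python
# Minimal sketch of judder under a compositor that drops frames missing the refresh.
# Assumes a 60 Hz display and a game rendering at a steady 80 FPS (illustrative only).
refresh_ms = 1000 / 60
render_ms = 1000 / 80

frame_done = [i * render_ms for i in range(20)]   # completion time of each rendered frame

shown = []
for refresh in range(1, 11):                      # ten consecutive refresh intervals
    deadline = refresh * refresh_ms
    # the compositor displays the newest completed frame; older ones are simply dropped
    shown.append(max(t for t in frame_done if t <= deadline))

steps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print(steps)   # -> [12.5, 25.0, 12.5, 12.5, 25.0, ...]: uneven animation steps, i.e. judder
```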

The furor only intensified when it came to light that Quantum Break, an upcoming DirectX 12 game from Max Payne developer Remedy Entertainment, would only be available on Windows 10 through the Microsoft Store. It appears that Store apps—specifically, UWP games—come with the same kinds of restrictions Shrout discovered when testing Ashes.

According to Mark Walton at Ars Technica, the Windows Store version of Rise of the Tomb Raider won't let users disable vsync, for example, and it has problems running with CrossFire and SLI multi-GPU configs. The Store version of RoTR also doesn't appear to expose an executable file to apps that need one to work, like Steam's Big Picture mode, graphics card control panels, and game overlays like Fraps.

Since other programs can't hook into Store apps, PC Perspective's Shrout worries that a wide range of tools that PC enthusiast sites use to benchmark hardware will no longer work. He thinks that restriction means developers will have to begin writing benchmarking tools into games themselves, something that Ashes of the Singularity developer Oxide Games has done quite competently.
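
For a sense of what "writing benchmarking tools into games themselves" might look like at its simplest, here's a hypothetical sketch; it isn't Oxide's tool, just a bare-bones frame-time recorder living inside the application, where external hooks can't reach.

```python
# Hypothetical sketch of a frame-time logger built into a game, for when external
# tools (Fraps, FCAT overlays) can't hook the executable. Not Oxide's actual code.
import time

class FrameTimeLogger:
    def __init__(self):
        self.samples_ms = []
        self._last = time.perf_counter()

    def on_frame_presented(self):
        """Call once per frame, right after the present/swap call."""
        now = time.perf_counter()
        self.samples_ms.append((now - self._last) * 1000)
        self._last = now

    def report(self):
        frames = sorted(self.samples_ms)
        avg = sum(frames) / len(frames)
        p99 = frames[int(0.99 * (len(frames) - 1))]
        print(f"avg frame time {avg:.2f} ms ({1000 / avg:.1f} FPS), "
              f"99th-percentile frame time {p99:.2f} ms")
```

A game would call on_frame_presented() every frame during a benchmark run and report() at the end—roughly the data an external tool would otherwise have gathered on its own.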

Even if one developer has made a good tool, though, a fragmented benchmarking environment that only lets hardware reviewers see as much of a game's performance as its developer allows is a chilling prospect. I think that arrangement also puts control of benchmarking results in the hands of those with the most incentive to meddle with them, and it's troubling to think that PC hardware reviewers might not be able to independently verify the accuracy of the tools we're given.

It doesn't help that Microsoft's communication about the effects of these new technologies and platforms has been quite muted, too, given the potential magnitude of the changes they could hold for the future of gaming on Windows. The company held a press event last week to showcase its vision of the future of gaming across the PC and the Xbox One, two islands that it wants to bridge with universal Windows apps. The broader public is just hearing about the details of this event today.

Attendees brought up complaints about vsync and benchmarking (among other issues) with Xbox head Phil Spencer. Going by Sam Machkovech's account at Ars Technica, Spencer said "These [issues] are all in our roadmap...We hear the feedback from the community, and we will embrace it."

The problem from this PC gamer's perspective is that these issues have rarely been problems for games sold through any platform save the one Microsoft is trying to establish. If the company truly understood the wants and needs of the PC gamer, these issues would surely have been ironed out before major titles like Gears of War: Ultimate Edition and Rise of the Tomb Raider exposed them in another publicity firestorm.

No matter how you slice it, this debacle is another black mark on Microsoft's efforts to reach the PC gaming community, and it's one the company could ill afford given its past relationships with that market. Only time will tell whether the company will truly embrace community feedback and make Windows Store games into the kinds of experiences that PC players (and testers) value. For now, though, gamers can simply choose to put their dollars into other, more "open" platforms like Steam, Origin, Uplay, and GOG, and we imagine they'll do just that.

Ashes of the Singularity's second beta gives GCN a big boost with DX12
— 11:21 AM on February 25, 2016

Ashes of the Singularity is probably the first game that will come to most people's minds these days when we talk about DirectX 12. That title has been the source of a bunch of controversy in its pre-release lifetime, thanks to an ongoing furor about the game's reliance on DirectX 12's support for asynchronous compute on the GPU and that feature's impact on the performance of AMD and Nvidia graphics cards alike.

Ashes of the Singularity will be getting a second beta with a new benchmark ahead of its March 22 debut, and several outlets have been given early access to that software in advance of its official release. AnandTech, ExtremeTech, and Guru3D have all benched this new beta in both single and multi-card configurations, including the kinds of Frankenstein builds we saw when developer Oxide Games previewed support for DirectX 12's explicit multi-adapter mode. Explicit multi-GPU support is being added to the game's public builds for this beta release, too, so more of the press has been able to put Radeons and GeForces side-by-side in the same systems to see how they work together.

Performance highlights
While all three reviews are worth reading in depth, we want to highlight a couple of things. Across every review, Radeon cards tend to lead comparable GeForces every step of the way in single-card configurations, at least in the measure of potential performance that FPS averages give us. Those leads widen as resolution and graphics quality settings are turned up.

For example, with Ashes' High preset and the DX12 renderer, the Radeon R9 Fury X leads the GeForce GTX 980 Ti by 17.6% at 4K in AnandTech's testing, going by average FPS. That lead shrinks to about 15% at 2560x1440, and to about 4% at 1080p. Ashes does have even higher quality settings, and Guru3D put the game's most extreme one to the test. Using the Crazy preset at 2560x1440, that site's Fury X takes a whopping 31% lead over the GTX 980 Ti in average FPS. Surprisingly, even a Radeon R9 390X pulls ahead of Nvidia's top-end card with those settings.
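
As a quick aside, here's how those leads fall out of average-FPS numbers. The FPS values below are illustrative stand-ins rather than AnandTech's published results; only the resulting percentage matches the 17.6% example above.

```python
# How a percentage lead is derived from average FPS. These FPS values are
# illustrative stand-ins, not AnandTech's published numbers.
fury_x_avg_fps = 52.9
gtx_980_ti_avg_fps = 45.0

lead_pct = (fury_x_avg_fps / gtx_980_ti_avg_fps - 1) * 100
print(f"Fury X leads the GTX 980 Ti by {lead_pct:.1f}%")   # -> 17.6%
```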

As we saw last year in an earlier Ashes benchmark series from PC Perspective, switching to DirectX 11 reverses the Radeons' fortunes. Using that rendering path causes a significant drop in performance: 20% or more, according to AnandTech's results.

The new Ashes benchmark lets testers flip asynchronous compute support on and off, too, so it's interesting to examine what effect that has on performance. AnandTech found that turning on the feature mildly harmed performance on GeForce cards. Radeons, on the other hand, got as much as a 10% boost in frame rates with async compute enabled. Nvidia says it hasn't enabled support for async compute in its public drivers yet, so that could explain part of the performance drop there.

In the "things you can do, but probably shouldn't" department, the latest Ashes beta also lets testers turn on DX12's Explicit Multiadapter feature in unlinked mode, which we'll call EMU for short. As we saw the last time we reported on performance of cards in an EMU configuration, the feature does allow for some significant performance scaling over single-card setups. It also allows weirdos who want to throw a Fury X and a GTX 980 Ti in the same system let their freak flags fly.

AnandTech did just that with its Fury X and GTX 980 Ti. Using the red team's card as the primary adapter, the site got a considerable performance increase over a single card when running Ashes at 4K with its Extreme preset. The combo delivered about 39% more performance than a GTX 980 Ti alone and about 24% more than an R9 Fury X. With the GTX 980 Ti in the hot seat, the unlikely team delivered about 35% more frames per second on average.

EMU does come with one drawback, though. Guru3D measured frame times using FCAT for its EMU testing, and the site found that enabling the feature with a GTX 980 and GTX 980 Ti pairing resulted in significant frame-pacing issues, or "microstutter," an ugly problem that we examined back in 2013 with the Radeon HD 7990. If microstutter is a widespread issue when EMU is enabled, it could make the feature less appealing.
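
A tiny example shows why frame-time data like FCAT's catches microstutter that average FPS hides. The two traces below are synthetic, assumed purely for illustration; they share the same average frame time but differ wildly in frame-to-frame consistency.

```python
# Why frame-time data reveals microstutter that average FPS hides.
# Both traces below are synthetic examples, not Guru3D's FCAT captures.
smooth  = [16.7] * 8                                     # steady ~60 FPS
stutter = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4, 8.0, 25.4]   # same average, alternating

for name, trace in (("smooth", smooth), ("microstutter", stutter)):
    avg = sum(trace) / len(trace)
    swings = [abs(b - a) for a, b in zip(trace, trace[1:])]
    print(f"{name:>12}: avg frame time {avg:.1f} ms, "
          f"mean frame-to-frame swing {sum(swings) / len(swings):.1f} ms")
# Both traces average 16.7 ms (~60 FPS), but only one of them would feel smooth.
```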

Caveats
As with any purportedly earth-shattering numbers, we think a few caveats are in order. For one, this is a beta build of Ashes of the Singularity. AnandTech cautions that it's "already seen the performance of Ashes shift significantly since our last look at the game, and while the game is much closer to completion now, it is not yet final."

For two, each site tested Ashes with the Radeon Software 16.1.1 hotfix, which was AMD's latest beta driver at the time. After those results were published, AMD released the Radeon Software 16.2 hotfix, which contains some Ashes-specific optimizations. We're curious to see what effect the updated driver has on the game's performance on AMD hardware, and it's entirely possible that the effect could be positive.

For three, as we mentioned earlier, Nvidia says that it still hasn't enabled support for asynchronous compute in its public drivers. GeForce software product manager Sean Pelletier took to Twitter yesterday to point this out, and he also noted that Oxide Games' statement that asynchronous compute support was enabled in the green team's public drivers was incorrect. Given how heavily Ashes appears to rely on asynchronous compute, the fact that Nvidia's drivers apparently aren't exposing the feature to the game could partially explain why GeForce cards are lagging their Radeon competitors so much in this benchmark.

Still, if these numbers are any indication, gamers with AMD graphics cards could see a big boost in performance with DirectX 12, all else being equal. It's unclear whether other studios will take advantage of the same DX12 features that Oxide Games has with Ashes of the Singularity, and a game's claim of DirectX 12 support alone tells us nothing about how many of the API's features it actually uses. Even so, we could be on the threshold of some exciting times in the graphics performance space. We'll have to see how games using this new API take shape over the coming months.

Taking the reins
— 11:04 AM on December 4, 2015

Many readers are already aware that The Tech Report’s Editor-in-Chief, Scott Wasson, is leaving us to help AMD’s Radeon Technologies Group use the "inside the second" methods he pioneered to improve its hardware and software products. If you’ve been under a rock for some reason, you should go read his farewell post for all the details.

Though Scott is leaving us, the site he built isn’t going anywhere. I’ve agreed to become TR’s Editor-in-Chief to continue the mission he started 16 years ago. It’s a great honor and a tremendous responsibility for me to carry on the site’s traditions of staunchly independent reporting, deep technical knowledge, and innovative, data-driven reviews. Those values will continue to shape our coverage going forward.

In my time writing for the site—first as a freelancer, then as Managing Editor—Scott, Geoff, and Cyril all worked with me to instill the same principles they wove into The Tech Report’s DNA from the beginning. I have been immeasurably lucky to learn this trade from some of the best minds in the business.

Above all, it’s my hope that you, dear reader, will stick with us. We cannot do what we do without your support and feedback, and I know firsthand just how bright and passionate a community has grown around TR over the years. I look forward to bringing you many more years of news, reviews, and more. Thanks for your continued support.
