We haven't said much about TR subscriptions for a little while after the rush of the launch, but this little experiment is off to an excellent start so far. You all proved that reader-supported content can work, and you saved our bacon after weak sales in early 2014. We learned some lessons from the introductory period, and now we're making additions and changes to the subscription service in response.
One thing that we've wanted to do is add more value for subscribers, so that more of you who are regular readers will find it worth your time to sign up. To that end, we're very happy to announce our first external benefit for TR subscribers: some handsome discounts on software purchased from the Macrium website, including the outstanding Macrium Reflect backup and imaging solution.
Anyone who subscribes for any amount of money at all, down to a $1 payment in our pay-what-you-want system, will get a code good for 20% off at Macrium.com. Those folks who beat the average and get a Gold subscription will instead receive a code for a whopping 40% off.
If you're a TR Silver or Gold subscriber now, your discount code is already waiting for you. Just go to the user control panel and look for it under the "Features" tab. The code should be redeemable throughout the next year.
I'm very pleased to be able to offer a subscriber discount on a product as good as Reflect. I make use of Reflect in Damage Labs constantly thanks to your recommendations. The program writes a bootable WinPE utility onto a thumb drive, and I use it for imaging all of my test systems. I also back up my own PC with Reflect, and it has saved me from an SSD failure with a flawless restore of a weekly image backup. Not only that, but I've received free updates from Macrium for more than a year now without once being held hostage to a required, paid upgrade due to an "incompatibility" with an upgraded version of Windows—unlike *ahem* some imaging companies.
We have more subscriber benefits in the works along these lines, so do yourself a favor and sign up now. You'll also get all of the other subscriber perks, including single-page article views, print templates, comment reply notifications, a subscriber badge, and access to the Smoky Back Room. Beat the average to get triple upvote/downvotes and access to our four-megapixel image galleries, as well.
Finally, remember, if you like what we're doing, you can always add to your subscription amount to support the cause. Thanks!
More light bulbs? Yep, more light bulbs
The Internet is a strange and wonderful place. A couple of months ago, I posted a Friday night topic on light bulbs that incited a fair amount of discussion. Not long after that, I kid you not, I started receiving press releases and phone calls from the world's light-bulb brands, as if it made perfect sense for a website with the tagline "PC hardware explored" to be writing about LEDs versus CFLs.
This is a dangerous development.
As you may have gathered from my FNT post, I'm more than happy to geek out about lighting technologies. Quite a few of you are, too, apparently. Heck, I can even tie in my off-hours semi-obsession with my day job.
Watch and learn, kids.
After all, 2014 is already shaping up as the Year of the Display in PC hardware, with technologies like 4K and adaptive refresh rates hitting the market for the first time. There's huge overlap between lighting tech and displays. Backlight quality helps determine the temperature and color gamut of an LCD monitor. Beyond that, we're gonna need some serious candlepower (and efficiency) to make high-dynamic-range displays a reality. And one of the most promising display technologies, OLED, may also be the most promising lighting technology on the horizon. The fates of lighting technology and visual computing are deeply intertwined.
Hence, I've spent a silly amount of my free time lately screwing in various sorts of light bulbs for comparison, and here I am in the middle of a work day writing a blog post about it. It's educational, career-development type stuff.
I'm not sure any sane boss would buy that line, which is why it's great to be your own boss.
Anyhow, I've made a few new discoveries in my light bulb vision quest since last time out. Let me bring you up to date.
The Cree TW Series odyssey
First, I think it was one of you people, out there on the Internet, who posted in my Friday night topic and first made me aware of Cree's TW Series bulbs, a follow-up to the excellent LED lights selling across the U.S. at The Home Depot. Whoever you are, you cost me a fair chunk of change on light bulbs.
I was already a big fan of the Crees, which are superb in fixtures and other sorts of indirect lighting, but the stock Cree 60W replacements aren't quite up to replacing incandescents in every case. Above our kitchen table, for instance, in a triple-socket fixture with exposed bulbs, the regular Cree LEDs produce bright but somewhat harsh light. Under that light, the wood in our table and chairs looks kind of yellowy-green, more so than it does in daylight or with incandescents.
Cree cooked up the TW (or True White) Series in an attempt to rectify that shortcoming. The TW Series bulbs are rated for a Color Rendering Index of 93, substantially higher than the CRI rating of 80 for the regular Cree bulb. I'm not quite sure what all voodoo Cree put into the TW Series in order to achieve this improvement, but one component is a neodymium coating on the glass (similar to GE's Reveal bulbs) that filters out a portion of the light spectrum. I believe there may be a different mix of LED colors inside, as well.
There is a tradeoff involved: the TW Series 60W equivalent uses 13.5W to produce 800 lumens of illumination, while the regular Cree bulb requires only 9.5W to do the same. The TW Series bulb also has a somewhat larger base, so it may not fit into certain fixtures as easily as the stock Cree.
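To put that efficiency tradeoff in concrete terms, here's a quick sketch using only the rated specs quoted above (the 60W incandescent figure is the usual ballpark for that class of bulb, not a Cree spec):

```python
# Luminous efficacy (lumens per watt) computed from the rated specs.
bulbs = {
    "Cree 60W-equivalent": (800, 9.5),
    "Cree TW Series 60W-equivalent": (800, 13.5),
    "Typical 60W incandescent": (800, 60.0),
}

for name, (lumens, watts) in bulbs.items():
    print(f"{name}: {lumens / watts:.1f} lm/W")
```

The TW Series gives up roughly 30% of the standard bulb's efficacy in exchange for its higher CRI, while remaining several times more efficient than the incandescent it replaces.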
Anyhow, I ordered up some TW Series bulbs with a silly amount of anticipation, and I have to say: I was not disappointed.
Although the TW Series has the same 2700K color temperature rating as the regular Cree bulb, the light produced by the TW Series is much better balanced. When I installed the TWs in our kitchen fixture, the wood in our kitchen table regained its deep red and brown tones. No longer did it look sickly and yellow-green.
Under a lampshade, especially, the TW Series is virtually indistinguishable from an incandescent bulb. I look at it periodically and shake my head. Although there's surely more room for improvement, I think LED lighting technology has hit an important high-water mark here. I don't think most folks could tell the difference between this thing and a 60W incandescent in a casual, side-by-side "taste test."
The only downside of note is that the TW Series bulb doesn't appear to be quite as bright as the stock Cree 60W-equivalent, in spite of the matching lumen ratings. The TW Series illuminates as well as a 60W incandescent, but the stock Cree goes above and beyond that. Depending on the situation, you may find that you prefer the brighter but somewhat less balanced light from the regular bulb. For example, I wound up mixing two TW bulbs with one regular one in our kitchen fixture in order to get the right mix of brightness and quality.
I ordered my TW Series bulbs online, since they weren't available in stores locally, but that's since changed, as I learned from, ahem, an official Cree press release mailing. The TW Series is now available at The Home Depot stores across the U.S. The 60W-equivalent TW Series bulb goes for $15.97 a pop—four bucks more than the standard Cree offering. Since you're potentially looking at owning this thing for 10 years or more, I'd say the premium is worth paying.
My order of a six-pack of bulbs left me armed with a mix of regular and TW Series 60W equivalents. I thought I'd maybe use them to replace some of the remaining incandescents in my house, in places where CFLs just wouldn't cut it. Here's what happened instead: I found myself wandering through the house, swapping out a bunch of the CFLs for LEDs. Turns out my affinity for light quality trumps any pretensions of being green. For me, the advent of high-quality LED lighting means the death of CFLs, and I couldn't be happier about it.
Rosewill gets into the LED game
We've reviewed several of Rosewill's keyboards, so when I mentioned light bulbs in that Friday night topic, a keen-eyed PR person from Rosewill insisted on sending me some of their new LED bulbs. You can "see if you like ours better than Cree's!" she suggested perkily via email.
Upon reading that statement, I actually sat back in my chair, inhaled, and said to myself, "That is a bold statement."
But hey, the folks at Rosewill have done a nice job with their mechanical keyboards, so who knows?
These LED bulbs are apparently brand-new products that have just become available at Newegg. Rosewill sent me two different models of LED lights to try out, the warm-white 6.5W bulb rated for 560 lumens and the warm-white 8.2W bulb rated for 660 lumens. The firm doesn't provide an incandescent wattage equivalent for these things. Both of them fall somewhere in between the usual output of 40W and 60W incandescents.
My first impression of the Rosewill LEDs was quite positive. As you can see in the picture above, these bulbs have a compact ceramic base that's less bulky than the Cree's, and they're somewhat shorter in terms of total height, too—very much the size and shape of a traditional incandescent light bulb.
Rosewill rates its soft-white products at a color temperature of 3000K, slightly cooler than the 2700K rating for most soft-white bulbs. In theory, at least, I like the idea of a slightly cooler everyday bulb. So many of the 2700K CFLs I've been using for years are too yellowy and seem "off." (There's also a 5000K "cool white" version of each bulb, but I told them not to bother sending those. Ugh. I don't need a grow lamp.)
After screwing the Rosewill 8.2W LED into a lamp and firing it up, I decided maybe 3000K wasn't a great idea. Perhaps it's the emitted spectrum rather than the color temperature alone, but my first thought was that the Rosewill soft-white bulbs emit light that's just a little too Walmart-esque for my tastes. Too much blue to the hue, in my view.
Sorry about that.
I will say Rosewill has one-upped Cree on another front, though. The light produced by this bulb is distributed evenly in a broad, nearly spherical area limited only by the presence of that ceramic base. There aren't any obvious hotspots or dark areas. The Cree's LED tower is more compact and more closely resembles an incandescent filament, but it doesn't emit as much light straight up.
I was torn on whether the Rosewill lights really produced better illumination quality than a CFL when I first tested them in several shaded lamps. The bluish light seemed pretty similar overall. Any doubts on that front were squelched when I subbed in the Rosewill 8.2W bulbs for 13W CFLs in a couple of those three-light open fixtures. The Rosewills performed surprisingly well in direct lighting situations, producing brighter and subjectively higher-quality light than the CFLs they replaced. I also found that the 6.5W bulbs were a nice upgrade in lumen output from a 40W incandescent.
Still, the light quality doesn't really come close to Cree's regular offerings, let alone the TW Series.
The biggest drawback to the Rosewill LEDs, though, is probably the delay on start-up. Like most LEDs, these bulbs reach peak brightness pretty much as soon as they ignite. Trouble is, there's a pretty pronounced delay of a half-second or so (it seems to vary) between flipping the switch and ignition. Seriously, that is a long time. Even CFLs, which take several minutes to reach peak brightness, start producing some light almost instantly. The Cree LEDs are nearly instant-on, too. You can decide how annoying you find this quirk, but personally, I want a faster response when I flip the light switch.
Add in the fact that the Rosewill LEDs aren't compatible with dimmers and only come with a two-year warranty (versus Cree's decade-long pledge), and it's clear this isn't quite the same caliber of product. That means the Rosewill bulb needs to be cheaper than the Cree, and right now, the 8.2W version is selling for $12.45 at Newegg. This thing needs to cost less, not more, than the market leader.
I suspect Rosewill knows that, and I suspect they'll run discounts and promotions that effectively drop the price of these bulbs over time. At a bit of a discount, these Rosewill bulbs could be a nice value, particularly for use in fixtures where their compact bases, conventional height, and well-distributed illumination would be appreciated.
A new contender emerges
LEDs are getting to be mighty good, but they're not the only lighting technology vying for a spot in sockets after the incandescent ban. The folks at a new start-up company have refined and miniaturized a form of induction lighting in order to create the Finally Bulb, whose story was told at length in this New York Times write-up.
Induction lighting has been used in commercial settings for ages, apparently, but was too large to be practical elsewhere. Finally calls its version of induction tech "Acandescent" lighting, which isn't bad as marketing names go.
The Finally Bulb is set for release this summer, and it looks to be almost exactly the same size as a 60W incandescent. The rest of the specs look pretty decent, too. It requires 14.5W and produces 800 lumens, and the company claims the bulb turns on instantly, with a rated life of 15,000 hours and a warranty spanning 10 years.
Two things could possibly set this bulb apart. One is light quality. The 2700K bulb has a CRI rating of 83, which is higher than the standard Cree LED's rating. The Finally marketing materials focus quite a bit on light quality, claiming this is "the first bulb to truly replicate the look, reassuring warmth and omnidirectional light of the incandescent bulbs you love."
That's a strong claim for a bulb with an 83 CRI. Still, CRI is an imperfect measure, so I'm eager to have a look at one of these things in operation as soon as possible.
The other big deal with the Finally Bulb is its projected price of about $8. That's cheap. If this bulb produces truly appealing light, meets its specs, and undercuts quality LEDs by a few bucks per socket, it might become a viable alternative.
TR subscriptions: our progress so far
Several days ago, we introduced our pay-what-you-want subscription system. We asked for you all to support us, and we offered a handful of extra perks on the site in return. So far, your response has been overwhelmingly positive and gratifying.
In a short span, we've already received enough subscription funds to improve our bottom line by $1,100 per month for the next year. Even more remarkably, the current average payment stands at over $50 per user.
I don't think any of us here on staff would have predicted an average that high. This is, after all, a pay-what-you-want system where the majority of the benefits are available for as little as one dollar.
So, to everyone who has signed up to support us, thank you very much, from all of us. We appreciate the support and the kind words that have come with it. We built a system that essentially relies on your goodwill, and you have confirmed our faith in you by responding with uncommon generosity.
Of course, the amount we've taken in so far isn't nearly enough to sustain us outside of our usual advertising sales—but this has been an incredibly encouraging start. Not only are we better funded during a difficult time, but we've also demonstrated that a creative crowd-funding system can be an effective supplement to ad sales. Simply knowing that fact makes us stronger, better able to pursue our work with independence and confidence. That's one of the reasons that we took this route. TR has always been about serving a community of readers.
At the risk of lowering the average, I should probably mention something. I've noticed a certain reluctance among folks to make a smaller contribution in order to get a Silver subscription. There's a bit of a "Gold or nothing" mentality, seems like. Let me do my best to discourage that kind of thinking. There is no shame in paying well below the average in order to pick up a Silver subscription. Every bit of support we get helps. Heck, we built most of the good perks into the Silver tier and set the price to "whatever you want" in order to encourage broad participation.
For the perplexed, Cyril made a swanky infographic that maps out the various features of the two tiers in a nice visual format. I have to say, I'm really digging the e-mail notifications when folks reply to my comments.
If you can't afford to beat the current ~$50 average for an annual subscription, one of the best things you can do to support TR is choose one of the pre-set payment amounts and allow your subscription to auto-renew next year. Having that consistent support over time is what will allow us to plan, build, and grow—even if it's a small amount per person each year.
I should also point out something else about how the payment system works. The amount you give is cumulative for your subscription term. If you start with a $25 payment to get Silver, wait a month, and then pay another $26 after your next paycheck, your total amount contributed will be $51. At that point, assuming the average is still $50, you'll have beaten the average, so you'd be automatically upgraded to Gold for the remaining 11 months. I hope that's clear enough. There's more info about how this whole thing works in the FAQ. Also, the slick little predictor Bruno cooked up for the payment page will tell you exactly what you need to pay to get Gold or earn a spot in the top-10 list.
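The cumulative-payment rule can be sketched in a few lines of Python. This is just my illustration of the logic described above, not TR's actual code; the function name and the $50 running average are assumptions for the example.

```python
def subscription_tier(payments, current_average=50.0):
    """Illustrative sketch: any payment at all earns Silver, and a
    cumulative total that beats the site-wide average earns Gold."""
    total = sum(payments)
    if total <= 0:
        return None
    return "Gold" if total > current_average else "Silver"

# A $25 payment gets Silver; adding $26 a month later brings the
# cumulative total to $51, beating a $50 average and triggering an
# automatic upgrade to Gold for the rest of the term.
print(subscription_tier([25]))      # Silver
print(subscription_tier([25, 26]))  # Gold
```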
Speaking of the top-10 list, here's how it stands right now. These are some incredibly generous folks. Remember, the top 10 contributors as of noon Central time on March 21, 2014 will get to choose from one of these three things:
If you'd like to bump one of these guys off the list and grab a spot, you can add to your subscription total at any time. The predictor will tell you exactly what it'll take to get there.
Introducing TR subscriptions
We've been independently publishing The Tech Report for nearly 15 years, and today, we've come to a crossroads. You see, TR has been supported primarily by advertising sales since its beginning, but that business has been difficult for a while, for a number of reasons.
Although PC gaming and enthusiast systems are a growing segment, the overall market for PCs has been in a prolonged slump. Advertising dollars have moved elsewhere. We have been attempting to hold steady and keep our current, long-tenured staff intact for the past few years, but the reality is that we've been slowly bleeding money.
That reality is a strange one for us to face, since your interest hasn't waned as the ad market has weakened. TR as a publication is more vibrant than ever. We know from our daily experience that you all, our readers, crave the sort of in-depth reviews, articles, and podcasts we produce. Our web traffic is strong. Things like our Inside the Second gaming benchmarks and our SSD endurance experiment have drawn huge appreciation. And the TR Podcast has become so popular, we had to find a higher-bandwidth hosting arrangement.
Our challenge, then, has been finding a way to enable the community to support us. Heck, we want to depend on you. We've always tried to run the site with our readers' best interests in mind, and we've given up countless business opportunities as a result. Might as well make our relationship official.
We've been pondering adding a subscription option to the site literally for years, but we couldn't find a satisfactory way to make it work. Lots of independent web publications face a similar struggle to fund the production of high-quality journalism. We searched far and wide to see how others were handling it. Sadly, virtually all of the existing crowd-funding approaches have obvious drawbacks. Too often, they involve gating off public access to core content, burdening staff with hours of tedious rewards-fulfillment work that distracts from content creation, or setting an arbitrary subscription price that doesn't work for everyone.
We didn't think any of the existing models were right for TR and its community, so we did what we've done in the past in tough situations: we innovated. We conceived of a better model, and we quietly spent the last of our cash reserves building it. What we've come up with is novel, and we think it's the best attempt of its type so far.
We've created a "subscriptions" system, but we're not gating off our articles from the public. Virtually everything we publish will remain freely available to all. Instead, we've built a number of new features into the site, many of them often requested. If you subscribe, then you'll get access to them.
Best of all, you can name your own price for a subscription, so folks are free to support TR as much or as little as they think is appropriate. If you like what we're doing and want to support us further, you can add to your contribution total at any time.
If you contribute any amount, you'll get access to a full year of our Silver subscriber features:
Those who contribute enough to beat the current average payment will get a Gold subscription, which gets them:
Additionally, the usernames of our very best supporters will be shown on our list of the top 10 contributors, visible in various places across the site.
2. Anonymous Gerbil: $286
7. Mr. Took: $50
8. Anonymous Gerbil: $50
Of course, the biggest benefit of subscribing is TR's continued health and ability to publish the sort of high-quality content we always have. In fact, if this crazy scheme works out well, we'll put any extra funds we receive back into producing articles for the site and building additional features for subscribers. We have a whole bucket-load of cool ideas for subscriber perks that we'd love to implement.
Speaking of added perks, we've decided to kick off our subscriptions push with a little something extra. The top 10 subscribers as of noon Central time on March 21, 2014 will get to choose from one of these three items:
Don't worry if your initial contribution gets bumped out of the top 10 by a small amount. You can add more to your total at will.
I'm hoping I can treat somebody to some genuine Kansas City barbecue.
Right now, though, we need your help. You can go right here to sign up for an annual subscription. If you still have questions about how this whole deal works, you can check out our FAQ for more info. And I'll be answering questions not covered by the FAQ in the comments below. Thanks in advance for your support.
You can snag a 39'' 4K display for $404
The other day, my friend and fellow PC enthusiast Andy Brown pinged me and told me I needed to come over to his house to see his new toy: a 39" 4K display that he ordered from Amazon for 500 bucks. Coming from anybody else, I'd have been deeply skeptical of this purchase, but Andy is actually the co-founder of TR and has impeccable taste in such matters.
The product he purchased is this Seiki Digital 39" 4K 120Hz LED television. I started asking him more questions and looking into it. The more I learned, the more intrigued I became. Soon, I was at Andy's place peering into this large and glorious panel. He had the thing placed directly in front of his old 2560x1600 30" HP monitor, and I can't say I blame him. After all, you can almost get four copies of TR, or any other standard web-width site, side by side on the thing.
Yeah, this beast has more real estate than Ted Turner. And it has dropped in price to $404 at Amazon as I write. With free Prime shipping. And it's still in stock.
Sounds too good to be true, right?
Not really. This thing is just a killer deal, available to anyone. But there are a few caveats.
First, there's the matter of refresh rates. This display has a single HDMI input that can support the panel's native resolution of 3840x2160 at a refresh rate of 30Hz. That's a fast enough update rate for desktop and productivity work, but 30Hz is not good for gaming, even with vsync disabled.
Your fall-back option is to drop down to 1920x1080 while gaming, where this thing supports a nice, fast 120Hz refresh rate. That's a compromise on resolution, yes, but this puppy is probably faster than your current display, since 60Hz is the usual standard. Also, 1080p is a nice resolution for gaming because it doesn't require heaps and heaps of GPU horsepower in order to maintain acceptable performance.
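There's a tidy bit of arithmetic behind those two modes: both push exactly the same number of pixels per second, which is presumably the ceiling of that single HDMI input. A quick check:

```python
# Pixel throughput of the Seiki's two headline modes.
modes = [
    ("3840x2160 @ 30Hz", 3840 * 2160 * 30),
    ("1920x1080 @ 120Hz", 1920 * 1080 * 120),
]

for label, pixels_per_second in modes:
    print(f"{label}: {pixels_per_second / 1e6:.1f} Mpix/s")
```

Both work out to 248.8 Mpix/s; the 120Hz 1080p mode is simply a different way of spending the same link bandwidth.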
And did I mention the price?
The other matter of some importance is the image quality of the display. I believe it's an S-MVA-type panel, which should make it superior to a TN panel and faster than an IPS one. Standing in front of it, that seems about right. There's less color shift than on most TN panels, and there's a heckuva lot of pop to the deep reds and oranges that often seem muted on TN panels.
This is a TV, though, so color correctness is an issue. You may want to buy or borrow a calibrator for it. Andy didn't yet have his display calibrated properly in Windows. The blues in the TR header were alarmingly neon and bright, to the point of being annoying. He'd had more luck with calibration on his Hackintosh, though. When he switched over there, the blues were somewhat tamed, though still brighter and more saturated than I would have liked. He'd put some work into dialing down the backlight intensity in one of the TV's config menus in order to reach non-retina-searing brightness levels appropriate for a computer monitor.
But did I mention the price?
The simple fact is that you can have a massive array of pixels a couple of feet from your face for about $400. Stretched across a 39" panel, the pixel density is obviously higher than on my own 30" 2560x1600 monitor, but it's not so incredibly high that text becomes completely unreadable. If you do need to bump up the font size, the PPI shouldn't be so out-of-bounds high that the default Windows scaling options are overwhelmed. (I'd still recommend Windows 8.1 for a better experience. Or Mac OS X for the best.)
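For the curious, the pixel-density comparison is easy to work out from the panel specs. A quick sketch, using the rated diagonals of the two displays mentioned above:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count over the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'Seiki 39" 3840x2160: {ppi(3840, 2160, 39):.0f} PPI')
print(f'30" 2560x1600:       {ppi(2560, 1600, 30):.0f} PPI')
```

That works out to roughly 113 PPI versus 101 PPI: denser, but not so dense that default font sizes become a problem.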
And there are so, so many pixels.
I know there are a lot of display innovations on tap for this year, including dynamic refresh schemes like G-Sync and 4K TN panels for around $700. This one comes at you from a different angle, and it's not something I expected, to say the least. But if you're willing to front a fraction of the cost of most 4K monitors, you can have the same pixel count today at a crazy discount.
Oculus Rift's 'Crystal Cove' prototype tickles our rods and cones
The absolute highlight of last year's CES was getting a first look at an Oculus Rift prototype. Strapping on a Rift for the first time is a mind-blowing experience. It will change your view of what's possible in gaming in the next 5-10 years. Naturally, then, when it came time to plan for CES 2014, I made sure to schedule some time with the folks at Oculus to see what they—and especially new Oculus CTO John Carmack—have been doing.
As you may have heard, the new "Crystal Cove" prototype that Oculus brought to the show this year captured a major award: Best in Show for CES 2014. The news came to the folks in the Oculus meeting room late on Thursday last week, as we were getting a demo of the headset. Given what I saw through those goggles, the recognition seems well deserved.
Crystal Cove is the third generation of hardware Oculus has put on public display. The first generation, with a 720p LCD screen inside, was the one they showed at CES 2013. Later last year, Oculus upgraded to a higher-resolution 1080p LCD. Crystal Cove takes several important steps beyond that.
Much of the new tech in Crystal Cove is intended to overcome one of Oculus' biggest challenges: VR headsets don't work well for everyone. A lot of people develop a sense of vertigo, nausea, or fatigue after using a Rift prototype for a while, sometimes in only minutes. The problem is apparently caused by the disconnect between what your senses expect to see in response to head motions and what's actually displayed. If the system doesn't respond quickly or accurately enough, you may find yourself unexpectedly executing a technicolor yawn.
Even several tens of milliseconds worth of delay can be enough to trigger a problem, so Oculus has been pushing to squeeze any latency it can out of the sensor-to-display feedback loop. That's why the Crystal Cove prototype contains a 1080p AMOLED display. The AMOLED delivers markedly better color saturation and deeper blacks than the earlier LCDs. More importantly, though, the AMOLED has a much faster pixel-switching time: less than a millisecond, versus about 15 ms for the LCDs in prior Rift prototypes.
Interestingly enough, switching to an AMOLED alone doesn't fix the ghosting that's often evident when making sweeping movements with a Rift attached to your noggin. Oculus claims this ghosting effect isn't inherent to the display itself and isn't visible on a high-speed camera; instead, it's caused by an interaction with the human visual system. They have been able to mitigate the problem, however, by implementing a low-persistence display mode. The AMOLED is quick enough to flash on and off at a rate high enough that no flicker is perceptible to the human eye. What you'll notice, instead, is that the ghosting effect is essentially eliminated.
I got to see low-persistence mode in action, and it works. In the demo, I had the Rift attached to my face while I was looking at some big, red text in the virtual world ahead of me. The Oculus rep had me waggle my head back and forth, and I saw obvious ghosting. He then flipped on the low-persistence mode. The entire display became somewhat dimmer, though without any obvious flicker. I again waggled around my enormous noggin, and the text no longer left a blurry trail of red behind it as I moved.
Given the latency sensitivity of the application and the fact that a low-persistence display mode appears to be in the works for monitors based on Nvidia's G-Sync technology, I had to wonder if Oculus has been experimenting with G-Sync-like dynamic refresh rates, as well. (They totally are.) Sadly, the Oculus rep handling our demo wasn't willing to discuss that subject.
The other big enhancement in Crystal Cove is a major upgrade to the head tracking hardware. The sensors in previous Rift prototypes could detect orientation—roll, pitch, and yaw—but that was it. This revision incorporates infrared LEDs placed all around the front and sides of the headset, and their movement is tracked by a camera placed in front of the user. The camera and LEDs give the Rift true positional tracking of the wearer's head in 3D space.
As with the display changes, the positional tracking appears to work well. In our demo, we were encouraged to crane our necks around 180 degrees in an attempt to throw off the tracking. The display was set to revert to a grayscale mode with the loss of tracking, and invoking it was tough to do while sitting in a chair facing the camera, which is how the Rift is intended to be used. Even when one demo subject managed to contort himself well enough to hide the LEDs from the camera and cause a tracking failure, the system recovered quickly. The display shifted back to full color within two or three seconds after the headset came back into plain view.
The combination of positional tracking, a faster display, and low-persistence mode is meant to provide a better, more comfortable VR experience than past Rift prototypes. I wasn't able to use the Crystal Cove headset long enough to judge for myself, and I haven't felt many ill effects during brief stints with the earlier prototypes. However, the Oculus folks seem to think they've pretty much conquered the sickness problem. Even late in the week at CES, after presumably hundreds of demos to the press and industry, they claimed not to have found anyone yet who was sickened by using a Crystal Cove prototype. If true, that's very good news.
I can tell you that the Crystal Cove hardware provides an even more immersive and borderline magical experience than earlier revisions of the Rift. The AMOLED is a big upgrade just for the color quality and sense of depth. Also, the software being demoed makes much better use of the VR headset.
We first got a look at an Unreal Engine 4 demo called Strategy VR, created by the guys at Epic. The visuals in it are rich and detailed. I found myself hunching over and looking down, with my head nearly between my legs, peering over the edge of a virtual cliff in wonder.
The real star of the show, though, was the demo of Eve Valkyrie, the in-development game that's slated to be a Rift launch title. The Rift and this game breathe incredible new life into a genre that's been on the brink of death for some time now. When you slide on the headset, you find yourself sitting in the virtual cockpit of a space fighter. Some of the gauges are hard to make out at first, but if you lean forward, the text becomes clearer and easier to read. Above the gauges is a canopy, with a reeling space battle taking place in the sky beyond. The illusion of being there is strong, more so when you find yourself craning your neck to peer out of the canopy above and to your left, attempting to track an enemy fighter positioning itself on your six.
Having never played before, I scored -30, and my demo was over quickly due to an early death. The realism was impeccable.
Given the progress Oculus has made in the past year, we were left wondering how long it will be until the consumer version of the Rift hits store shelves. Right now, Oculus is being very cautious; it hasn't stated any timelines for the release of a final product. The firm says its goal is to be sure "VR is the right experience" for everyone who buys a headset.
Several components of that experience still need to come together before the Rift is ready for prime time. Oculus admits it's still working to improve the Rift's display resolution between now and the consumer product launch. That seems wise to me. When it's that close to your face and divided between two eyes, a 1080p display feels pretty low-res. If you stop and look, you can see the individual subpixels in the Crystal Cove's AMOLED array.
Also, the Rift currently lacks an audio component, which is a major omission. Oculus admits as much, calling positional audio "super-critical" to a VR experience, but it says it won't reveal any info yet about partnerships on the audio front. I assume that means there will be some.
For what it's worth, AMD had a gen-two Rift prototype on display in its CES booth along with a pair of headphones featuring positional audio generated by GenAudio's AstoundSound middleware and accelerated by a TrueAudio DSP block. I gave this setup a brief spin, and I'd say that's a pretty good start.
Oculus also has to make sure the Rift's game support is broad and deep enough to make the VR headset a compelling purchase. Eve Valkyrie looks amazing, but it won't suffice on its own. Fortunately, the company claims to have shipped about 50,000 Rift developer kits already, which should mean plenty of developers have Rifts strapped to their faces. In fact, one of the strange problems Oculus has now is not being able to track what everyone is doing with its development hardware. If the final headset is anywhere near as compelling as the prototypes, we've got to think there will be a steady stream of Rift-enabled applications released in the next couple of years.
That said, we could easily be waiting until CES 2015 or beyond before the Rift makes its way into its final, near-$300 form and ships to consumers everywhere. Given everything, it's easy to understand why that's the case. Still, having seen the goodness of Crystal Cove in action, a big part of me would like very much to hurry up and get on with the future, because it's really gonna be good.

An update on Radeon R9 290X variance
We're still tracking the issue of Radeon R9 290X performance variance after our investigation into the matter last week and AMD's subsequent statement. As noted in that statement, AMD acknowledges that the apparent performance gap between the initial press samples and retail cards is wider than expected. Essentially, the variance from one 290X card to another ought not to be as broad as the 5-10% we're seeing. The firm is still investigating the reasons for this disparity, and an AMD rep paid a visit to Damage Labs yesterday in order to discuss the matter.
One thing the folks at AMD asked me to do is test the press and retail R9 290X cards against one another in the "uber" fan mode. Flipping the switch to enter "uber" mode raises the card's blower speed from 40% to 55% of its potential maximum. (Don't be fooled by the percentages there. The default 40% cap is loud, and the 55% cap is pretty brutal. At a full-on 100%, which you'd never experience during normal operation, the 290X blower sounds like a power tool. A noisy one.) You may recall from our original review that the 290X is quite a bit faster in "uber" mode. That's because in the default "quiet" mode, 290X cards constantly bump up against their thermal limits. "Uber" mode is intended to raise those limits by providing more cooling capacity.
At AMD's request, I ran our HIS retail 290X card and our original review sample through our 30-minute Skyrim test three times each and then through our MSI Kombustor worst-case torture test. The results are straightforward enough that I don't need to plot them for you. Cranking up the blower speed limit allows both the 290X press sample and the HIS retail card to run at a constant 1GHz in Skyrim. There are virtually no clock speed reductions via PowerTune. Consequently, both cards perform the same in "uber" mode, averaging about 83 FPS over the duration of the test.
The results from Kombustor are similar. The press sample stays pretty much glued to 1GHz. The HIS retail card's GPU clock dips intermittently to as low as 978MHz in this peak thermal workload, but it spends the majority of its time at 1GHz, as well.
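The dynamic here is simple enough to sketch: the GPU holds its boost clock only as long as the cooler can remove the heat that clock generates, so raising the blower-speed cap raises the sustainable clock. Below is a toy model of that relationship. The function name and every constant in it are illustrative assumptions for this sketch, not AMD's actual PowerTune parameters.

```python
# Toy model of thermal throttling: a higher fan-speed cap means more cooling
# capacity, which means a higher clock the card can sustain indefinitely.
# All numbers are made up for illustration; only the shape of the behavior
# mirrors what the 290X's "quiet" (40%) and "uber" (55%) modes do.

def sustained_clock_mhz(fan_cap_pct, boost_mhz=1000, base_mhz=727,
                        heat_per_mhz=0.11, cooling_per_fan_pct=2.5):
    """Return the highest clock (MHz) whose heat the fan cap can dissipate."""
    max_cooling = fan_cap_pct * cooling_per_fan_pct   # arbitrary "heat units"
    thermal_limit = max_cooling / heat_per_mhz        # clock the cooler sustains
    # Clamp between the base clock and the advertised boost clock.
    return round(min(boost_mhz, max(base_mhz, thermal_limit)))

quiet_mode = sustained_clock_mhz(40)  # throttles below the 1GHz boost clock
uber_mode = sustained_clock_mhz(55)   # enough headroom to hold 1GHz
print(quiet_mode, uber_mode)
```

Under these made-up constants, the 40% cap settles below 1GHz while the 55% cap holds the full boost clock, which is the qualitative pattern the Skyrim and Kombustor runs showed.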
This is an important point to the folks at AMD, because it means the card-to-card performance variance we've cataloged can be eliminated quite literally at the flip of a switch. Owners of retail R9 290X cards will have to be willing to put up with a non-default configuration that produces substantially more noise, but their cards should be able to obtain the same performance that the 290X review sample achieved in "uber" mode in our review.
Obviously, the "uber" mode switch isn't a fix for everything that ails retail R9 290X cards. Many folks will prefer the combination of noise levels and performance offered by the stock configuration, and the card-to-card variance there remains an issue.
The next step from here is interesting, because AMD expects its partners to produce cards, like the DirectCU II 290X that Asus teased recently, with custom cooling that surpasses the stock cooler by a fair amount. With more effective cooling, these third-party cards could offer "uber" mode-type performance—and, potentially, less card-to-card variance—even at lower noise levels.
We'll have to see how that shakes out once we get our grubby little hands on one of those custom 290X cards. I'm hoping that will happen soon. I also expect AMD to have something more detailed to say about the reasons for the unexpectedly broad card-to-card variance on current retail 290X cards before too long, so stay tuned.