The Tech Report secures an independent future
— 2:40 PM on February 28, 2018

A little over two years ago, Scott Wasson, the founder and long-time Editor-in-Chief of The Tech Report, began a new role with AMD to make life better for gamers using Radeon hardware and software. After his departure, Scott maintained his ownership of TR's parent business while searching for a new caretaker that would let us continue to enjoy the editorial independence that's been a hallmark of our work from day one.

Today, I'm pleased to announce that search has come to an end. The Tech Report will remain an independent publication under the ownership of Adam Eiberger, our long-time business manager. I will be staying on as The Tech Report's Editor-in-Chief. We're excited that this arrangement will allow us to sustain and grow the project Scott started almost two decades ago for many years to come, on terms of our own choosing.


Our 2003 home page

It is critical to note that during the past two years, Scott exercised no control over the editorial side of TR. What we chose to cover and not to cover, the testing we performed or didn't perform, and the conclusions we reached or didn't reach were entirely under my direction. That remained the case even when our methods led to conclusions that favored products from AMD's competitors, and when we wrote articles that were complimentary to AMD's products.

These facts will not surprise anybody who has paid attention, but this final passing of the torch should put to rest any lingering concerns about our editorial independence.

With new ownership, the inevitable first question will be what else is going to change. Happily, the answer is nothing. You've already seen what The Tech Report looks like under my leadership over the past two years, and we'll continue to bring you the day-to-day news and in-depth reviews that we've always produced. We may explore new platforms and new audiences, but the core reporting and reviewing that made The Tech Report famous will remain the bedrock of our work going forward.


Our 2010 home page

Although we're excited for the opportunity to continue what Scott started in 1999, this changing-of-the-guard comes at a tough time for online media. While we enjoy (and are deeply grateful for) support from a wide range of big names in the PC hardware industry, advertising support simply isn't as lucrative as it once was for many online publications, and The Tech Report is no exception. At the same time, there's more going on in personal computing and technology than ever, and we want to remain a leading voice in what's new and what's next.


Our home page today

If you believe in The Tech Report's work as strongly as I do, there is no better time to help us build the foundation for the next steps on our journey. Please, for the love of all things holy, whitelist us in your ad blocker. Subscribe for any amount you like. Comment on our articles, participate in the forums, and spread our work far and wide on Facebook and Twitter. Help us help you. Thanks in advance for your continued readership and support.

I'd also like to share a message from Scott regarding this news:

To the TR community:

As you may know, I left my position as TR's Editor-in-Chief to take a job in the industry just over two years ago. In the time since, I have technically retained ownership of The Tech Report as a business entity. Naturally, I've had to stay well away from TR's editorial operations during this span due to the obvious conflict of interest, and I have been looking for the right situation for TR's ownership going forward.

I'm happy to announce that we completed a deal last week to sell The Tech Report to Adam Eiberger, our long-time sales guy. The papers are signed, and he's now the sole owner of the business.

I think this arrangement is undoubtedly the best one for TR and its community, for two main reasons.

For one, I'm still passionate about the virtues of a strong, independent media. With this deal in place, TR will maintain its status as an independent voice telling honest, unvarnished truths in its reporting and reviews.

Second, putting the site into the hands of a long-time employee gives us the best chance of keeping the TR staff intact going forward. Jeff, Adam, Bruno, and the gang have done a stellar job of keeping The Tech Report going in my absence, and I'm rooting for their continued success and growth.

I want to say thanks, once again, to the entire community for making our incredible run from 1999 to today possible. I'll always look back with gratitude on the way our readership supported us and allowed me to live my dream of running an independent publication for so many years. With your continued support and a little luck, TR should be able to survive and thrive for another couple of decades.

Thanks,
Scott

The TR staff would like to extend our thanks to Scott and the many TR alumni for their hard work in building not only one of the finest technology sites around, but also one of the best audiences any writers could ask for. We wish Scott the best as we part ways and carry TR's mission forward.


DirectX 11 more than doubles the GT 1030's performance versus DX12 in Hitman
— 4:02 PM on February 16, 2018

It's been an eventful week in the TR labs, to say the least, and today had one more surprise in store for us. Astute commenters on our review of the Ryzen 5 2400G and Ryzen 3 2200G took notice of the Nvidia GeForce GT 1030's lagging performance in Hitman compared to the Radeon IGPs and wondered just what was going on. On top of revisiting the value proposition of the Ryzen 5 2400G and exploring just how much CPU choice advantaged the GT 1030 in our final standings, I wanted to dig into this performance disparity to see whether it was simply how the GT 1030 gets along with Hitman or an indication of a possible software problem.

With our simulated Core i3-8100 running the show, I fired up Hitman again to see what was going on. We've seen some performance disparities between Nvidia and AMD graphics processors under DirectX 12 in the past, so Hitman's rendering path seemed like the most obvious setting to tweak. To my horror, I hit the jackpot.

Hitman with DirectX 11 on the GT 1030 ran quite well with no other changes to our test settings. In its DirectX 11 mode, the GT 1030 turned in a 43-FPS average and a 28.4-ms 99th-percentile frame time, basically drawing dead-even with the Vega 11 IGP on board the Ryzen 5 2400G.

Contrast that with the slideshow-like 20-FPS average and 83.4-ms 99th-percentile frame time our original testing showed. While the GT 1030 is the lowest rung on the Pascal GeForce ladder, there was no way its performance should have been that bad in light of our other test results.
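
For readers curious how an average FPS figure and a 99th-percentile frame time relate, here's a minimal Python sketch of the general idea behind those metrics: take a capture of per-frame render times, average them for an FPS number, and pull the 99th percentile to characterize the slowdowns an average hides. The sample frame times below are made up for illustration and this isn't our exact analysis pipeline.

```python
# Minimal sketch: deriving an average FPS and a 99th-percentile frame time
# from a capture of per-frame render times. The sample values are invented
# for illustration; they are not our actual Hitman capture data.
import numpy as np

frame_times_ms = np.array([22.0, 24.5, 21.8, 23.1, 80.2, 22.6, 25.0, 23.8])

avg_fps = 1000.0 / frame_times_ms.mean()        # traditional average frame rate
p99_ms = np.percentile(frame_times_ms, 99)      # 99% of frames finish faster than this
p99_fps_equiv = 1000.0 / p99_ms                 # FPS-equivalent of that worst-case pace

print(f"average: {avg_fps:.1f} FPS, 99th percentile: {p99_ms:.1f} ms "
      f"({p99_fps_equiv:.1f} FPS-equivalent)")
```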

This new data puts the GT 1030 in a much better light in our final reckoning than our first round of tests did. Even if we use a geometric mean to lessen the effect of outliers on the data, a big performance drop like the one we observed with Hitman under DirectX 12 will have disproportionate effects on our final index. Swapping out the GT 1030's DirectX 12 result for its DirectX 11 one is only fair, since that's apparently how gamers should play with the card for the moment. That move does require a major rethink of how the Ryzen 5 2400G and Ryzen 3 2200G compare to the entry-level Pascal card, though.
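
To show why that happens, here's a quick sketch of a geometric-mean index built from hypothetical per-game 99th-percentile FPS figures (not our actual results). Even in a geomean, one slideshow-like result drags the whole index down noticeably.

```python
# Sketch of how one bad result still drags down a geometric-mean index.
# The per-game 99th-percentile FPS values here are hypothetical.
from math import prod

def geomean(values):
    return prod(values) ** (1.0 / len(values))

with_dx12_outlier = [35, 40, 38, 12]   # one slideshow-like result
with_dx11_swap    = [35, 40, 38, 35]   # same card, DX11 result swapped in

print(geomean(with_dx12_outlier))  # ~28.3
print(geomean(with_dx11_swap))     # ~36.9
```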

With the parts lists we put together yesterday, the Ryzen 5 2400G system is about 15% less expensive than the Core i3-8100 system, and its 99th-percentile FPS figure is now about 11% lower than that of the Core i3-8100-and-GT-1030 box. That's still a better-than-linear relationship in price-to-performance ratios for gaming, and it's still impressive. Prior to today, gamers on a shoestring had no options short of purchasing a discrete card like the GT 1030, and the Ryzen 5 2400G and Ryzen 3 2200G now make entry-level gaming practical on integrated graphics alone.

Dropping a Ryzen 3 2200G into our build reduces 99th-percentile FPS performance about another 16% from its beefier sibling, but it makes our entry-entry-level build 17% cheaper still. As a result, I still think the Ryzen 5 2400G and Ryzen 3 2200G are worthy of the TR Editor's Choice awards they've already garnered, but it's hard to deny that these new results take a bit of the shine off both chips' performance.

To be clear, I don't think this result is an indictment of our original data or testing methods. We always set up every graphics card as identically as possible before we begin testing, and that includes gameplay settings like anti-aliasing, texture quality, and API. Our choice to use Hitman's DX12 renderer across all of our test subjects was no different. This is rudimentary stuff, to be sure, but it simply didn't occur to me that using Hitman's DirectX 12 renderer would pose a problem for the GT 1030.

We've long used Hitman for performance testing despite its reputation as a Radeon-friendly title, and its DirectX 12 mode hasn't caused large performance disparities between GeForces and Radeons even as recently as our GeForce GTX 1070 Ti review. Given that history, I felt it would be no problem to continue as we always have in using Hitman's cutting-edge API support. Testing hardware is full of surprises, though, and putting the Ryzen APUs and the GT 1030 through the wringer has produced more than its fair share of them.


Revisiting the value proposition of AMD's Ryzen 5 2400G
— 3:23 PM on February 15, 2018

The mornings before tight deadlines in the world of PC hardware reviews often follow a week or less of nonstop testing, retesting, and more testing. Sleep and nutrition tend to fall by the wayside in the days leading up to an article in favor of just one more test or just one more hardware combination. None of those conditions is ideal for clear thinking, and as a human under stress running that gauntlet, I sometimes err in the minutes before a big review needs to go live.

So it went when I considered the bang-for-the-buck of the Ryzen 5 2400G, where my thinking fell victim to the availability heuristic. I had just finished the productivity value scatter chart and overall 99th-percentile frame time chart on the last page of the review before putting together my conclusion, and having those charts at the top of my mind blinded me to the need for the simple gut check of, y'know, actually putting together a parts list using some of the CPUs we tested. Had I done that, I would have come away with a significantly different view of the 2400G's value proposition.

While the $170 Ryzen 5 2400G would seem to trade blows with the $190 Core i5-8400 on a dollar-for-dollar basis for a productivity system, even that forgiving bar favors the Ryzen 5 once we start putting together parts lists. Intel doesn't offer H- or B-series motherboards compatible with Coffee Lake CPUs yet, so even budget builders have to select a Z370 motherboard to host those CPUs. That alone adds $30 or more to the Ryzen 5 2400G's value bank.

The Multi-Tool
  CPU            AMD Ryzen 5 2400G                                  $169.99
  CPU cooler     AMD Wraith Spire                                   --
  Memory         G.Skill Ripjaws V 8 GB (2x 4 GB) DDR4-3200 CL16    $103.99
  Motherboard    ASRock AB350 Pro4                                  $69.99 (MIR)
  Graphics card  Radeon Vega 11 IGP                                 --
  Storage        WD Blue 1TB                                        $49.00
  Power          Corsair VS400                                      $34.99
  Case           Cooler Master MB600L                               $46.99
  Total                                                             $474.95

To demonstrate as much, here's a sample Ryzen 5 2400G build using what I would consider a balance between budget- and enthusiast-friendliness. One could select a cheaper A320 motherboard to save a few more bucks, but I don't think the typical gamer will want to lose the ability to overclock the CPU and graphics processor of a budget system. The ASRock AB350 Pro4 has a fully-heatsinked VRM and a solid-enough feature set to serve our needs, and the rest of the components in this build come from reputable companies. Spend less, and you might not be able to say as much.

The Caffeinator
  CPU            Intel Core i5-8400                                 $189.99
  CPU cooler     Intel boxed heatsink                               --
  Memory         G.Skill Ripjaws V 8 GB (2x 4 GB) DDR4-2666 CL15    $103.99
  Motherboard    Gigabyte Z370 HD3                                  $99.99 (MIR)
  Graphics card  Intel UHD Graphics 630                             --
  Storage        WD Blue 1TB                                        $49.00
  Power          Corsair VS400                                      $34.99
  Case           Cooler Master MB600L                               $46.99
  Total                                                             $524.95
  Price difference versus Ryzen 5 2400G PC                          $50.00

For our Core i5-8400 productivity build, the $20 extra for the CPU might not seem like a big deal, but it's quickly compounded by the $30 extra one will pay for the Z370 motherboard we selected—and that's after one chances a mail-in rebate to get that price. Intel desperately needs to get B- and H-series motherboards for Coffee Lake CPUs into the marketplace if it wants non-gamers to have a chance of building competitive or better-than-competitive systems with AMD's latest.

The Core i5-8400 can still outpace the Ryzen 5 2400G in many of our productivity tasks, though, and on the whole, the $50 extra one will pay for this system is still more than worth it for folks who don't game. If time is money for your heavier computing workloads, the i5-8400 could quickly pay for the difference itself. Ryzen 5 2400G builders can probably make up some of the performance difference through overclocking, but we don't recommend OCing for productivity-focused builds that need 100% stability.

The Instant Coffee
  CPU            Intel Core i3-8100                                 $119.99
  CPU cooler     Intel boxed heatsink                               --
  Memory         G.Skill Ripjaws V 8 GB (2x 4 GB) DDR4-2400 CL16    $103.99
  Motherboard    Gigabyte Z370 HD3                                  $99.99 (MIR)
  Graphics card  Asus GT 1030                                       $89.99
  Storage        WD Blue 1TB                                        $49.00
  Power          Corsair VS400                                      $34.99
  Case           Cooler Master MB600L                               $46.99
  Total                                                             $544.94
  Price difference versus Ryzen 5 2400G PC                          $69.99

Those building entry-level PCs might not have the luxury of choosing between productivity chops and gaming power, though. To get gaming capabilities similar to those of the Ryzen 5 2400G, a build around the Core i5-8400 quickly reaches a bottom line that's too expensive to really be considered budget-friendly. That's thanks to the need for an Nvidia GT 1030 like the one we employed with our test system. Those cards were $70 or $80 until just recently, but a mysterious shortage of them at e-tail has suddenly led to a jump in price.

Regardless, back-ordering one of those cards will run you $90 at Amazon right now, and even though we're rolling with that figure for the sake of argument, $90 is honestly too much to pay for a discrete card with the GT 1030's performance. Unless you absolutely need a card right away, we'd wait for prices to drop once stock levels return to normal.

To restore our system to something approaching budget-friendliness, we have to tap a Core i3-8100 for our Coffee Lake gaming system instead of the Core i5-8400, and that suddenly puts the CPU performance of our build behind that of the Ryzen 5 2400G in most applications. Oof.

TR Editor's Choice award: AMD Ryzen 5 2400G, February 2018

With new information gleaned from retesting the GeForce GT 1030 in Hitman, the Ryzen 5 2400G no longer beats out that card in our final reckoning. On the whole, though, it clears the 30-FPS threshold for 99th-percentile frame rates that we want to see from an entry-level gaming system. Before this week, that's not something we could say of any integrated graphics processor on any CPU this affordable. As part of a complete PC, it does so for $70 less than our GT 1030 build. Gamers don't have to tolerate 1280x720 and low settings on the 2400G, either; we used resolutions of 1600x900 and 1920x1080 with medium settings for the most part.

So there you have it: the Ryzen 5 2400G is a spectacularly balanced value for folks who want an entry-level system without compromising much on CPU or graphics performance, just like its Ryzen 3 sibling is at $100. Both CPUs were equally deserving of a TR Editor's Choice award for their blends of value and performance, and I'll be updating our review post-haste to reflect AMD's dominance in that department. Sorry for the goof, and I'll make a better effort to look before I leap in the future.


Tobii makes a compelling case for more natural and immersive VR with eye tracking
— 4:36 PM on January 26, 2018

We've heard murmurs about the benefits of eye tracking in VR headsets for quite some time now, but even with the number of press days and trade shows we attend in the course of a year, I'd never had the opportunity to give the tech a spin. That changed at CES this year, where Tobii, probably the leading company in eye-tracking technology, invited us in for a private showing of its most recent round of VR eye-tracking hardware. The company had a prototype HTC Vive headset on hand with its eye trackers baked in for me to kick the tires with, and I came away convinced that eye tracking is an essential technology for the best VR experiences.


Tobii's prototype HTC Vive

Tobii's demo took us through a few potential uses of eye-tracking in VR. The most immediate benefit came in setting interpupillary distance, an essential step in achieving the sharpest and clearest images with a VR headset. With today's headsets, one might need to make a best guess at the correct IPD using an error-prone reference image, but the Tobii tech gave me immediate, empirical feedback when I achieved the correct setting.

Next, the demo pulled up a virtual mirror that allowed me to see how the eyes of my avatar could move in response to eye-tracking inputs. While this avatar wasn't particularly detailed, it was clear that the eye-tracking sensors inside the headset could translate where I was looking into virtual space with an impressive degree of precision and with low latency.

I was then transported to a kind of courtyard-like environment where a pair of robots could tell when I was and wasn't looking at them, causing one to pop up a speech bubble when I did make eye contact. That cute and rather binary demo hints at a future where VR avatars could make eye contact with one another, a huge part of natural interaction in the real world that's missing from most human-to-human (or human-to-robot) contact in VR today.

After that close encounter, I was transported to a simulated home theater where I was asked to perform tasks like dimming a light, adjusting the volume of the media being played, and selecting titles to watch. With eye tracking on, I had only to look at those objects or menus, with my head mostly still, to manipulate them with the Vive's trackpads; without it, I had to move my entire head, much as one would have to do with most of today's VR HMDs. It was less tiring and more natural to simply move my eyes to perform that work than to engage my entire neck.

Another more interactive demo involved picking up a rock with the Vive's controller and throwing it at strategically-placed bottles scattered around a farmyard. With eye tracking off, I was acutely aware that I was moving around a controller in the real world to direct the simulated rock at a bottle. This motion didn't feel particularly natural or coordinated, and I'd call it typical of tracked hand controllers in VR today.

With eye-tracking on, however, I felt as though I was suddenly gifted with elite hand-eye coordination. The eye-tracking-enhanced rock simply went where I was looking when I gave it a toss, and my aim became far more reliable. I wouldn't say that the software was going so far as to correct wildly off-course throws, but it was somehow using the eye-tracking data to smooth some of the disconnect between real-world motion and its effects in VR. The experience with eye-tracking on simply felt more immersive.

Another interactive demo simulated a kind of AR game where a military installation on Mars was poised to fire at UFOs invading Earth. With eye-tracking off, I had to point and click with the controller to adjust the various elements of the scene. When my gaze was tracked, I simply had to look at the stellar body I wanted to adjust and move my finger across the touchpad to move it, rather than selecting each planet directly with the controller. This experience wasn't as revelatory as the rock toss, but it was more inviting and natural to simply look at the object I wanted to manipulate in the environment before doing so.

The final demo dropped me into a sci-fi setting where I could toggle a number of switches and send an interplanetary message. Without eye-tracking on, this demo worked like pressing buttons typically does in VR right now: by reaching out with the Vive controller and selecting the various controls with the trigger. With eye tracking on, however, I had only to look at those closely-spaced buttons and pull the trigger to select them—no reaching or direct manipulation required.


A simulation of how a foveated frame might look in practice. Source: Tobii

The big surprise from this experience was that Tobii had been using a form of foveated rendering throughout the demos I was allowed to try out. For the unfamiliar, foveated rendering devotes fewer processing resources to portions of the VR frame that fall into the user's peripheral vision. Early efforts at foveation relied on fixed, lens-dependent regions, but eye-tracked HMDs can dynamically change the area of best resolution depending on the direction of the wearer's gaze. The Tobii-equipped VR system was invisibly putting pixels where the user was looking while saving rendering effort in parts of the frame where it wasn't needed.
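
As a rough illustration of the dynamic part of that idea, here's a hedged Python sketch of how a renderer might scale shading effort per screen tile based on distance from the tracked gaze point. The coordinates, radii, and shading rates are arbitrary assumptions for illustration, not Tobii's actual algorithm.

```python
# Illustrative sketch of gaze-driven (dynamic) foveation: shading effort per
# screen tile falls off with distance from the tracked gaze point. The radii
# and rates below are arbitrary assumptions, not Tobii's implementation.
import math

def shading_rate(tile_center, gaze_point, inner_radius=0.15, outer_radius=0.35):
    """Return the fraction of full shading work to spend on a tile.

    Coordinates are normalized screen space (0..1); the radii mark where
    quality starts to fall off and where it bottoms out.
    """
    dist = math.dist(tile_center, gaze_point)
    if dist <= inner_radius:
        return 1.0          # full quality where the user is actually looking
    if dist >= outer_radius:
        return 0.25         # coarse shading deep in the periphery
    t = (dist - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t   # linear falloff between the two radii

# With fixed foveation, gaze_point stays pinned to the lens center (0.5, 0.5);
# with eye tracking, it follows the wearer's eyes every frame.
print(shading_rate((0.52, 0.48), gaze_point=(0.5, 0.5)))  # near the fovea -> 1.0
print(shading_rate((0.95, 0.90), gaze_point=(0.5, 0.5)))  # periphery -> 0.25
```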

Indeed, the company remarked that nobody had noticed the foveation in action until it pointed out the feature and allowed folks to see an A-B test, and I certainly didn't notice it until it was revealed to me through that test (though the relatively low resolution and considerable edge aberrations of today's VR HMDs might have concealed the effects of foveation on some parts of the frame). Still, if foveation is as natural on future hardware as Tobii made it feel on today's headsets, higher-quality VR might be easier to achieve without the major increases in graphics-hardware power that would be required to naively shade every pixel.

All told, Tobii's demos proved incredibly compelling, and I was elated to finally experience the technology in action after hearing so much about it. The catch is that getting eye-tracking-equipped headsets onto the heads of VR pioneers is going to require all-new hardware: the company says its sensors require a companion ASIC to process and communicate the eye-tracking data to the host system, and the tech can't simply be retrofitted to existing HMDs. Asking early adopters to dump their existing hardware for a smoother and more immersive experience might prove to be an uphill climb. Keep an eye out for Tobii tech in future HMDs, though; it makes for a much more natural and immersive VR experience.


Synaptics' Clear ID fingerprint sensor feels like the way of the future
— 10:30 AM on January 17, 2018

Edge-to-edge screens are poised to be the new hotness of smartphone design in 2018, but pushing pixels right out to a device's borders leaves little room for the range of sensors we've come to know and love on the front of a phone—especially fingerprint sensors. By all accounts, Apple is dealing with this new reality by gradually retiring the fingerprint as a biometric input. You can still get a Touch ID sensor on an iPhone 8 or some MacBook Pros, but the future as seen from Cupertino clearly relies on Face ID, its array of depth-mapping hardware, and the accompanying notch.

Fingerprint sensors still have some advantages over face-sensing tech, though. They allow owners to unlock their devices without looking directly at the front of the phone, an important capability in meetings or when the device is resting on a desk or table. They can't be tricked by twins, and they can't be as easily spoofed as some less-sophisticated forms of facial identification. It's simple to enroll multiple fingerprints with most fingerprint sensors, as well, whereas Face ID is limited to one user at the moment. I appreciate being able to enroll several of my ten fingers with my iPhone to account for my left and right hands, for example, while other owners might enroll a spouse's fingerprint for emergencies. Ideally, we'd have both technologies at our disposal in the phones of the future.

Some Android device makers have been coping with the demand for ever-shrinking bezels by introducing less-sophisticated facial unlock schemes of their own, but the overwhelming majority of serious biometric inputs on those devices comes from a fingerprint sensor on the back of the phone. Sometimes those back-mounted sensors are placed well, and sometimes they aren't. As a long-time iPhone user, I believe that the natural home for a fingerprint reader is on the front of the device, but edge-to-edge displays mean that phone manufacturers who aren't buying Kinect makers of their own simply have to put fingerprint sensors somewhere else.

The intensifying battle between face and fingerprint for biometric superiority, and the question of where to put fingerprint sensors in tomorrow's phones, is fertile ground for Synaptics. You might already know Synaptics from its wide selection of existing touchpad and fingerprint-sensing hardware, and last week at CES, the company made a big splash by showing off the first phone with one of its Clear ID under-screen fingerprint sensors inside: a model from Vivo, a brand primarily involved in southeast Asian markets.


The demo Vivo phone. Fingerprints go on the glowing blue spot.

In short, Clear ID sensors let owners enjoy the best of both edge-to-edge screens and front-mounted fingerprint sensors by taking advantage of the unique properties of OLED panels to capture fingerprint data right through the gaps in the screen's pixel matrix itself. Clear ID results in an all- (or mostly-) screen device with no visible fingerprint sensor on its face and no notches for face-sensing cameras at the top of the phone. We covered Clear ID in depth at its debut, but I was eager to go thumbs-on with this technology in a production phone.

What's most striking about Clear ID is how natural it feels to use. Enrolling my fingerprint required the usual lengthy sequence of hold-and-lift motions that most any other fingerprint sensor demands these days. Once the device knew the contours of my thumb, though, unlocking the phone proved as simple and swift as resting my opposable digit on a highlighted region of the screen that's always visible thanks to the self-illuminating pixels of the Vivo phone's OLED panel. The process felt as fast as using Touch ID on my iPhone 6S, and it may even have been faster when I got the phone into a state where it would unlock without playing the elaborate animation you see above.

In the vein of the best innovations, Clear ID feels like the way fingerprints ought to be read on phones with edge-to-edge screens, and it'll likely serve as a distinguishing feature for device makers planning to incorporate OLED panels in their future phones. The backlight layer of LCDs won't let fingerprint data pass through to Clear ID sensors, so the tech won't be coming to phones relying on those panels yet, if it ever does. Clear ID is so obvious and natural in use that it was my immediate answer when folks asked about the most innovative thing on display at CES, and I'm excited to see it make its way into more devices soon.


How much does screen size matter in comparing Ryzen Mobile and Kaby Lake-R battery life?
— 1:15 PM on November 30, 2017

As we've continued testing AMD's Ryzen 5 2500U APU over the past few days, we've been confronted with the problem of comparing battery life across laptops with different screen sizes. Many readers suggested that I should take each machine's internal display out of the picture by hooking them up to external monitors. While I wanted to get real-world battery-life testing out of the way first, I can certainly appreciate the elegance of leveling the playing field that way. Now we have.

Before we get too deeply into these results, I want to point out loudly and clearly that these numbers are not and will never be representative of real-world performance. Laptop users will nearly always be running the internal displays of their systems when they're on battery, and removing that major source of power draw from a mobile computer is an entirely synthetic and artificial way to run a battery life test. We're also still testing two different vendor implementations of different SoCs, and it's possible that Acer's engineers might have some kind of magic that HP's don't (or vice versa). Still, for folks curious about platform performance and efficiency, rather than the more real-world system performance tests we would typically conduct, these results might prove interesting.

To give this approach a try, I connected both the Envy x360 and the MX150-powered Acer Swift 3 to 2560x1440 external monitors running at 60 Hz using each machine's HDMI output. I then configured each system to show a display output on the external monitor only and confirmed that both laptops' internal displays were 100% off. After those preparations, I ran our TR Browserbench web-browsing test until each machine automatically shut off at 5% battery before recording their run times.

As we'd expect, both machines' battery life benefits from not having to power an internal display. Counter to our expectations, though, the Envy x360 doesn't actually seem to spend a great deal of its power budget on its screen. The Envy gained only 53 minutes, or 15%, more web-browsing time than when it had to drive its internal display. The MX150-powered Acer, on the other hand, gained a whopping five hours of battery life when we removed its screen from the picture. I was so astounded by that result that I retested the Envy to ensure that a background process or other anomaly wasn't affecting battery life, but the HP machine repeated its first performance.

We can take battery capacity out of the efficiency picture for this light workload by dividing minutes of run time by the capacity of the battery in watt-hours. This approach gives us a normalized "minutes per watt-hour" figure that should be comparable across our two test systems. HWiNFO64 reports that the Envy x360 has a 54.8 Wh battery, and since it's brand-new, a full charge tops up that battery completely. Using the technique described above, we get 7.8 minutes of run time per watt hour from the HP system.

The Acer Swift 3 I got from Intel appears to have been a test mule at some point in its life. HWiNFO64 reports that the Swift 3 has already lost 10% of its battery capacity, from 50.7 Wh when it was new to 45.7 Wh now. In this measure of efficiency, though, that capacity decrease actually helps the Swift 3. The system posts a jaw-dropping 19 minutes of run time per watt-hour for light web browsing, or a 2.4-times-better result.
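
For those who want to check the math, here's the normalization in a few lines of Python. The Envy's screen-off run time is its original six hours and 12 minutes plus the 53-minute gain; the Swift 3's is its nine-and-a-half-hour result from our i5-8250U review plus the roughly five-hour gain described above.

```python
# Normalizing run time by battery capacity, using the figures from our testing.
envy_runtime_min  = 6 * 60 + 12 + 53      # ~425 minutes with the screen off
swift_runtime_min = 9.5 * 60 + 5 * 60     # ~870 minutes with the screen off

envy_capacity_wh  = 54.8                  # reported by HWiNFO64
swift_capacity_wh = 45.7                  # after ~10% wear from 50.7 Wh new

envy_min_per_wh  = envy_runtime_min / envy_capacity_wh    # ~7.8 min/Wh
swift_min_per_wh = swift_runtime_min / swift_capacity_wh  # ~19 min/Wh

print(f"Envy x360: {envy_min_per_wh:.1f} min/Wh, Swift 3: {swift_min_per_wh:.1f} min/Wh")
```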

Although this is a staggering difference, I'll emphasize again that it's not representative of real-world performance. With its display in the picture, the Optimus-equipped Swift 3 posted nine and a half hours of run time in our i5-8250U review, or only about half again as long as the Envy's six hours and 12 minutes. Drop the MX150 from the equation, and the IGP-only Swift 3s and their 10.5 hours of battery life run only 67% longer than the Envy. Those are only rough assessments of platform potential, given that we aren't normalizing for battery capacity or screen size. Still, Ryzen Mobile systems might have a ways to go to catch Intel in the battery-life race. The blue team has been obsessed with mobile power management for years, and technologies like Speed Shift are just the latest and most visible results of those efforts.

In any case, it's clear that there are a lot of moving parts behind the battery life of these systems. I've repeatedly cautioned that it's early days for both drivers and firmware for the Ryzen 5 2500U, and it's possible that future refinements will close this gap somewhat. Benchmarking a similar Intel-powered system from HP might also help even the field, given my research in my first examination of the Ryzen-powered Envy x360's battery life. (If you'd like to help with that project, throw us a few bucks, eh?) Still, if you favor battery-sipping longevity over convertible versatility and raw performance, it seems like the Envy x360 requires a compromise that our GeForce-powered Acer Swift 3 doesn't. Stay tuned for more battery-life testing soon.


Here's a first look at the battery life of HP's Ryzen-powered Envy x360
— 8:15 AM on November 29, 2017

My initial tests of AMD's Ryzen 5 2500U APU gave us a fine picture of the APU's performance, but we admittedly didn't test battery life in that initial article. Part of the reason for that omission was to avoid drawing unfair comparisons between the 15.6" HP Envy x360 that plays host to the Ryzen 5 2500U and the 14" Acer Swift 3 machines we used to represent Intel's Core i5-8250U.

Although the jump in screen size might not sound large on paper, the practical effects of the Ryzen system's bigger screen on battery life are likely quite significant. Getting the same light output from a bigger panel requires more power, as just one variable, and the HP system also has a pen digitizer in its LCD panel with unclear power-management characteristics.

So long as we keep those caveats in mind, though, we can at least offer a basic picture of how long the Ryzen 5 2500U lets the Envy x360 run on battery using a couple different tests. The one quirk of our test rig compared to the $750 default configuration you'd get off the shelf at Best Buy is the Samsung 960 EVO 500GB SSD we're using as our system drive. A hard-drive-only Envy might run for less time, though Windows' power-management features should generally take the mechanical hard drive out of the equation when it's not in active use.

To run these tests, I set the Envy's screen brightness to 50% and left Windows on its default Balanced power plan. The only changes we made to that default configuration involved disabling the operating system's Battery Saver safeguard and forcing the screen to remain on over the entire course of the test.

First off, I ran the Envy's battery down with TR's Browserbench. This benchmark runs a loop of an older version of our home page with plenty of Flash content sprinkled in, along with some cache-busting code to make for more work on the test system. Browserbench is getting up there in years, but it is repeatable and still offers a decent proxy for light web use. We're working on a new version of Browserbench that runs through a range of real-world web sites, but for now, the old version will have to do.
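
For the curious, the cache-busting part of that test rests on a simple principle: tack a unique query string onto each request so the browser can't serve the page's assets from its cache. Here's a generic sketch of that idea in Python; the URL and parameter name are made up and aren't Browserbench's actual code.

```python
# Generic cache-busting sketch: append a unique query string so each request
# misses the browser cache. The URL and "nocache" parameter are illustrative.
import random
import time

def cache_busted(url):
    token = f"{int(time.time() * 1000)}-{random.randint(0, 99999)}"
    return f"{url}?nocache={token}"

print(cache_busted("https://example.com/old-tr-home-page.html"))
```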

When the Envy shut off automatically with 5% remaining in its juice pack, it registered six hours and 12 minutes of battery life under Browserbench. That's well short of HP's claimed 11-hour battery-life figure, but it would at least get you from New York to Los Angeles on in-flight Wi-Fi. To see how that figure stacks up against similar machines, at least, I scoured the web for reviews of comparable PCs.
 
Reviews of Envy x360s with Intel eighth-gen processors inside remain scarce, but I did find that an Intel Kaby Lake-powered Envy x360 with a configuration and battery similar to that of our Ryzen system turned in six hours of web-browsing battery life for the folks at Laptop Mag. That test suggests Ryzen Mobile could be delivering competitive battery life against similar Intel systems, but it's hard to say just how competitive the Ryzen 5 2500U is without more in-depth (and possibly less representative) directed testing on our part.

Web browsing isn't the only use of a machine on the go, of course. To test video playback, I set up Windows 10's Movies and TV app to loop the 1920x1080, 55 Mbps H.264 version of the Jellyfish reference video until the Envy's battery died. I confirmed that Movies and TV was firing up the GPU's video decode engine using the GPU-monitoring tools in the latest version of Windows 10's Task Manager before letting the test run. Incidentally, here's a full accounting of the Ryzen APU's video-decoding capabilities, as ferreted out by DXVA Checker:

After displaying four hours and 37 minutes of pulsating sea life, the Envy x360 went dark once more. That result parallels the four-hour-and-32-minute run time achieved by the folks at HotHardware in their video playback testing, although the site claimed it had to run its x360's screen at 100% brightness to achieve a comparable output level with its other laptops. While I didn't run our Envy x360 so brightly, HotHardware's results still suggest we can be confident that our run time is in the right ballpark.

With these two tests in the bag, it seems like our 15.6" Ryzen system delivers only average battery life for its size class. I'd still caution against drawing too many comparisons between the Envy x360's battery life and that of other laptops at this stage, though. Implementation differences matter, and we don't know how Ryzen Mobile will behave in smaller and lighter systems. It's still early days for drivers and firmware, too. Our preliminary results and research suggest that Ryzen Mobile's battery life could be competitive with that of similar Intel systems, though, and that's as good a bit of news for AMD as its chip's well-rounded performance.
