Aorus’ X3 Plus v5 gaming laptop reviewed

A close friend who shares my passions for hardware, writing, and whiskey recently wrote about the value of a Scotch called Lagavulin. At $90, Lagavulin’s regular bottle is one of the more expensive Scotches on the shelf. By just about any measure, Lagavulin is a luxury Scotch, not a value pick. I could bring home a smoky Scotch, a rich bourbon, and a peated Islay malt for less than I would pay to get one bottle of Lagavulin.

My friend makes a point that I can’t ignore, though. There’s something special about the way Lagavulin combines those smoky, rich, and peaty flavors into one sip. While I can look elsewhere and find all those elements separately, I can’t quite replicate the experience that the luxury Scotch provides any other way.

Over the last couple of weeks, I’ve had the pleasure of working with a notebook that’s indisputably a luxury item. Aorus, Gigabyte’s gaming brand, sent us a tricked-out version of its X3 Plus v5 laptop. Powered by an Intel Core i7-6700HQ and Nvidia’s GeForce GTX 970M, this slim 14″ notebook houses some impressive hardware. At $2200 for the configuration we tested, it also carries a hefty price tag. While the X3 Plus has more graphics muscle than a similarly priced MacBook Pro, it faces some serious competition in gaming machines from Razer and MSI.

The Intel Core i7-6700HQ is a Skylake part with a 45W TDP. It has a 2.6GHz base clock and a 3.5GHz Turbo speed. The processor is paired with 16GB of DDR4-2133 RAM. For graphics power, Aorus turns to Nvidia’s GeForce GTX 970M, which has its own 6GB of GDDR5 RAM. Aorus includes Nvidia’s Optimus tech in the X3 Plus v5, which allows the machine to seamlessly switch between the GeForce chip and the Intel integrated graphics processor to save power.

TR readers will be familiar with the GM204 GPU that powers the GTX 970M from its past appearances in the GeForce GTX 970 and GeForce GTX 980. This chip is cut down a fair bit compared to those desktop monsters, though. Its 1280 shader processors, 48 ROPs, 924MHz base clock, and 192-bit path to memory put this chip closer to the desktop GTX 960 in a rough comparison of resources—not that there’s anything wrong with that, mind. GPU-Z can’t reveal the boost clock on this mobile part, but observation tells us that the GTX 970M in the X3 Plus v5 can run in a boost range around 1040MHz.

To complement the other high-end components, Aorus included a 512GB Samsung SM951 SSD. This speedy NVMe drive should turn some heads when we run it through some benchmarks. The Aorus X3 also comes with a 3200×1800 IPS display. It’s an IGZO-TFT panel that could be brighter and more power-efficient than the average LCD.

Gaming laptops and peripherals have a reputation for bizarre shapes and garish lighting. The Aorus X3 suffers from neither of those afflictions. It has straight, clean lines and cool white keyboard backlighting. One might not even take it for a gaming laptop at first glance, save for the large vents on the back. The sturdy, all-aluminum chassis exhibits no flex, even in the critical keyboard and palm-rest regions. Another hint of luxury is the Aorus logo on the back of the display. The mirrored glass logo faintly glows when the machine is turned on.

The thick rear edge of the X3 Plus houses the fans and heatsinks for the X3’s cooling system, as well as the display hinge. The hinge offers an appropriate amount of stiffness when opening the display, and it holds firm when the display is closed, too. That’s important, because the X3 Plus uses no latch to hold the display shut. The screen does wobble a bit when tapped, so if this laptop had a touchscreen, we’d want an even stiffer hinge. As it stands, I only tapped the screen out of habit; the X3 Plus doesn’t have one.

The bottom of the laptop has six angular feet and six ventilated openings. The entire bottom panel is secured with ten Torx screws, and it feels plenty rigid. The ventilation on the bottom and back of the laptop should produce a reasonable amount of airflow as long as the vents aren’t covered. However, it would be pretty easy to block much of the laptop’s air intake simply by setting the laptop on top of a lap or other soft surface.

We didn’t have the necessary screwdrivers on hand to take apart the X3 Plus v5, but the folks over at Notebookcheck took apart their X3 Plus and found two SO-DIMM slots, two M.2 2280 slots, and one M.2 2230 slot for the wireless card. With no 2.5″ bays in sight, owners who want to expand the X3 Plus v5’s storage will have to rely on potentially expensive gumstick SSDs.

Here are the X3 Plus v5’s essential specs, for easy comparison with other notebooks on the market:

                    Aorus X3 Plus v5
Processor           Intel Core i7-6700HQ
Memory              8GB or 16GB DDR4-2133
Chipset             Intel HM170 Express
Graphics            Intel HD Graphics 530
                    Nvidia GeForce GTX 970M with 6GB GDDR5 RAM
Display             13.9″ IPS panel with 3200×1800 resolution
Storage             Samsung SM951 NVMe SSD, 256GB or 512GB
                    1 free M.2 2280 expansion slot
Audio               2W stereo speakers
Expansion and       1 USB 3.1 Type-C
display outputs     3 USB 3.0
                    HDMI 2.0
                    Mini DisplayPort
Card reader         SD card reader
Communications      Killer E2200 Gigabit Ethernet adapter
                    Intel 802.11ac Wi-Fi
                    Bluetooth 4.1+LE
Input devices       Backlit keyboard
                    Clickpad
                    Internal microphone
Camera              HD webcam
Dimensions          12.9″ x 10.4″ x 0.9″ (330 x 263.5 x 22.9 mm)
Weight              3.96 lbs (1.8 kg)
Battery             73.26Wh Li-polymer
OS                  Windows 10 Home

Now that we’ve taken stock of the X3 Plus v5, let’s turn it on and see what it can do.

 

Living with a high-DPI screen

With the prices of high-resolution panels trending downward and the availability of adaptive-sync monitors increasing rapidly, this may be the year that I finally replace my 1080p monitor. Reviewing the Aorus X3 gave me the opportunity to play with a 3200×1800 panel and find out what working with a high-DPI screen is like.

This high-density panel isn’t really meant to be run at its native resolution. Instead, with Windows scaling on, it’s supposed to provide smoother-looking text (as Apple does with its Retina displays). It’s also an IPS panel, meaning that it offers superior colors and viewing angles compared to TN screens. The only thing that I found myself missing about my regular monitor is its size. A 14″ screen makes for a portable notebook, but it’s not as immersive as the 24″ screen that I’ve become used to.

I should give some credit to Aorus for shipping the machine with an appropriate Windows scaling factor turned on. From the moment I first turned on the Aorus X3, the taskbar and desktop icons were appropriately sized and text was clearly readable. For example, I didn’t find myself zooming in much when browsing in Firefox. Not everything worked correctly right out of the box, however. Here’s a snippet from our beloved home page in two different browsers:

Firefox on top, Chrome on the bottom

While Aorus may be doing its part to provide users with a seamless experience on a high-resolution display, I can’t say the same about software developers. Firefox handled the screen well, for example, but Chrome doesn’t look as good at 3200×1800. From my two screenshots, you can see that Firefox makes better use of the screen. It displays eight items in our news feed without scrolling down, compared to five in Chrome, and it does so without sacrificing readability. I ran across occasional problems in other applications as well. Here’s a picture of one of the worst offenders:

Recycle Bin for scale.

Inside a number of applications, I found menus and interfaces that were almost too small to read. Bending over and squinting to look for the play button makes me feel like an old man in need of bifocals. I wouldn’t call the inconsistency in scaling behavior a dealbreaker for high-resolution displays, but it is a reminder that the industry in general is still transitioning to support QHD and 4K screens.
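For the curious, the arithmetic behind display scaling is straightforward. Here’s a quick sketch (the function is my own illustration, not any Windows API) of how a scaling factor trades raw pixels for a smaller logical workspace:

```python
# Illustrative only: how a Windows scaling factor maps a panel's
# native resolution to the logical desktop size applications see.
def effective_resolution(width, height, scale_percent):
    """Return the logical desktop size at a given scaling factor."""
    factor = scale_percent / 100
    return round(width / factor), round(height / factor)

# At 200% scaling, the X3 Plus v5's 3200x1800 panel offers the same
# workspace as a 1600x900 screen, just with far crisper text.
print(effective_resolution(3200, 1800, 200))  # (1600, 900)
print(effective_resolution(3200, 1800, 150))  # (2133, 1200)
```

Applications that ignore the scaling factor render at the panel’s full pixel density, which is how you end up with the microscopic menus pictured above.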

Display testing

Subjective impressions are one thing, but we put the X3 Plus’ screen under the scrutinizing eye of a colorimeter to get an idea of its accuracy and consistency. We used a Datacolor Spyder4 Elite and its bundled software to perform our testing, along with the free-and-open-source DisplayCal (formerly dispcalgui) and HCFR tools to fill in some holes in the Datacolor software.

Out of the box, the X3 Plus v5’s screen is a bit too cold, and it’s also somewhat inaccurate. DisplayCal’s pre-calibration report indicates a roughly 7000K white point, and its gamma is a not-particularly-accurate 1.9 compared to the 2.2 gamma we want with sRGB.

After calibration, the X3 Plus’ screen comes into near-perfect conformance with the sRGB gamut. Datacolor’s utility says the screen covers 100% of the sRGB gamut, while DisplayCal claims 98% coverage. We don’t think most people will be able to see that minor difference.

The graph above gives a little more insight as to where the Aorus’ screen is most and least accurate. Blues seem to give the display the most trouble. All told, though, the screen’s average delta-E is just 1.2. We’d consider any average dE below three to be a good result.
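For reference, those delta-E numbers boil down to a distance measurement in color space. Here’s a minimal sketch of the classic CIE76 formulation (newer variants like CIEDE2000 weight the terms more carefully, but the idea is the same):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two
    colors expressed in L*a*b* coordinates."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# A dE around 1.0 is near the threshold a trained eye can spot,
# which makes the X3 Plus' 1.2 average a genuinely good result.
print(delta_e_76((50.0, 0.0, 0.0), (50.0, 2.0, 2.0)))
```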

The gray ramp above suggests the X3 Plus’ display has trouble with grayscale accuracy, though, a problem that’s echoed somewhat in the delta-E figures above. Instead of a flat line near 6500K, the gray ramp graph is all over the place as we approach pure white.

Even after calibration, the X3 Plus’ gamma doesn’t quite track the ideal 2.2 curve we want.

The X3 Plus’ brightness and contrast figures are good, if not outstanding. At the 50% brightness setting, the display’s white level is about 123 cd/m2 at the center of the screen, and our colorimeter measured a 330:1 contrast ratio. Cranking the display up to 100% shows a maximum brightness of 243 cd/m2.
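Those two figures imply a third: the panel’s black level. Contrast ratio is just white luminance divided by black luminance, so a quick bit of arithmetic (a sketch using our measured numbers) recovers it:

```python
def black_level(white_cd_m2, contrast_ratio):
    """Implied black luminance given a white level and contrast ratio."""
    return white_cd_m2 / contrast_ratio

# 123 cd/m2 of white at a 330:1 ratio works out to a black level
# of roughly 0.37 cd/m2.
print(round(black_level(123, 330), 2))  # 0.37
```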


As you can see in the four pictures of the X3 Plus v5’s screen above, the IPS screen offers viewing angles that are more than good enough. The screen does undergo some color shifts when viewed at the most extreme angles, but that’s hardly a flaw when you consider that anything on-screen will be barely visible at those angles anyway.

Datacolor’s luminance uniformity tests reveal a distressing flaw in the X3 Plus’ screen, though. Going by our results, there’s about a 40% drop in brightness from the top of the screen to its bottom. That’s far worse uniformity than we’d expect from the screen in a $2200 laptop. The falloff manifests as a sort of dirtiness on pure white or gray backgrounds like you’d see in Photoshop and Windows Explorer, though it’s less noticeable in “real-world” scenarios like pictures and games. Picky media professionals will want to seek out a different laptop or plug in an external monitor.

Aorus doesn’t specify a gray-to-gray response time for the X3 Plus’ display, but that figure is important for a gaming screen. We don’t have the necessary gear to check gray-to-gray response time at TR, but the folks at Notebookcheck do, and they measured a 47.2ms black-to-white response time and a 52.4ms gray-to-gray response time for the X3 Plus’ screen. Those numbers might be typical for a notebook screen, but they’re an order of magnitude greater than the 4ms response times we’re used to from IPS displays on the desktop.

To see just what kind of effect these response times have in practice, we rigged up a pursuit camera in tandem with the Blur Busters motion test. In the photo above, you can see just how much smear is evident on the X3 Plus v5’s screen. This blur is quite visible to the naked eye. It can be pretty distracting, especially on scrolling webpages and high-contrast edges in games.
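As a rough first-order estimate, the width of that smear is just how far a moving edge travels while pixels are still transitioning. The numbers and formula below are my own back-of-the-envelope approximation, not Blur Busters’ methodology, using the 960 pixels-per-second motion speed common in its tests:

```python
def smear_width_px(pixels_per_second, response_time_ms):
    """Approximate smear width: the distance a moving edge covers
    during one pixel transition."""
    return pixels_per_second * response_time_ms / 1000

# A 52.4ms gray-to-gray transition smears a 960 px/s edge across
# roughly 50 pixels; a 4ms desktop IPS panel smears about 4.
print(round(smear_width_px(960, 52.4)))  # 50
print(round(smear_width_px(960, 4)))     # 4
```

This ignores sample-and-hold persistence, which blurs motion on every 60Hz LCD equally, but it illustrates why the X3 Plus’ panel trails desktop IPS displays so badly here.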

Whether that blurriness will bother you depends on the games you like to play. In Counter-Strike: Global Offensive and other fast-twitch shooters, the X3 Plus v5’s screen feels a bit like treading through mud. Input in those titles feels frustratingly laggy. Less-twitchy games like Grand Theft Auto V aren’t so bothersome to play on this machine, but some smearing is still evident if you’re looking for it.

Given the X3 Plus v5’s price tag, it’s also annoying that Nvidia’s mobile G-Sync tech isn’t included, either. As it happens, G-Sync and Optimus can’t coexist in notebooks right now, so this omission isn’t Aorus’ fault. Even so, the 3200×1800 native resolution of this display can be tough for the GeForce GTX 970M to drive smoothly, so the feature would be nice to have. Even at 1920×1080, G-Sync would be a useful safety net for graphically intense games that aren’t running at or above 60 FPS all the time.

All told, the X3 Plus v5’s display is something of a letdown. Content creators won’t appreciate the considerable change in brightness from the top to the bottom of the display, and gamers will probably notice its slow response time. The screen does offer accurate colors and great viewing angles, but those are just two requirements that a good display needs to meet.

 

Keyboard and trackpad

I had some trepidations about the X3 Plus v5’s keyboard and clickpad. If there’s one aspect of laptops that I continually struggle with, it’s the input devices. As a child, I learned to type on an IBM Model M, and it’s set the standard for every keyboard I’ve used since.

The primary keys are all full-size at 0.63″ square. The top row, which includes the F-keys, is a bit shorter, measuring in at 0.44″ tall. Along the left-hand side of the keyboard are five 0.38″-wide macro buttons. A glowing button immediately above the macro buttons cycles through five different macro groups, changing color to indicate the selected profile. The functions of these keys can be changed in Aorus’ MacroEngine software, which we’ll cover in a bit. The white backlight can be set to full brightness or a dimmer mode, and it can also be turned off if its glow isn’t wanted.

The switches underneath the keys might not be the Cherry MX Browns that delight my fingers in my Rosewill RK-9000V2, but they have just enough buckle to remind me that I’m pushing a key. They’re also far quieter than my Rosewill, which is preferable in many situations outside of the office. To test my accuracy and speed, I went to typingtest.com. In three one-minute runs, my adjusted typing speeds were 91 wpm, 103 wpm, and 106 wpm, for an average of 100 words per minute. Doing the same tests on my Rosewill, I averaged 102 words per minute. While I’m a bit more accurate on my full-sized mechanical keyboard, I’m impressed with how close those results are. Overall, I’d say the Aorus X3 has a reliable, functional keyboard.

I’m also not a big fan of trackpads. At best, they provide the mousing functionality I need when it’s not feasible to plug in a mouse. At worst, they move my cursor around every time the heel of my hand drops down too far. Aorus uses an Elan trackpad in the X3 Plus v5. This multi-touch surface has a textured area at the bottom border to indicate left- and right-click functionality, but users can also simply tap anywhere on the pad to left click. It has a smooth, responsive surface, and there’s a function key dedicated to turning it on and off quickly. For the times when I don’t have a mouse handy, the X3 Plus v5’s trackpad serves just fine.

Connectivity and utilities

Aorus scoffs at Apple’s “less is more” strategy for ports, and instead provides a wide range of connectors on the X3 Plus v5. First up are the ports for attaching a secondary display to the machine. On the left panel, we get an HDMI 2.0 port and a Mini DisplayPort. We can also see a USB 3.0 port, headphone and microphone jacks, and a USB 3.1 Type-C port from this angle. Aorus doesn’t include the necessary hardware to turn that USB-C port into a Thunderbolt 3 connector, but we still appreciate the fast peripheral I/O.

I plugged a 24″ 1080p monitor into the HDMI port, and I was pleased with how quickly the system handled the display change. One of the built-in function buttons seamlessly switches back and forth between the displays. The only annoyance I encountered with switching displays concerned changing resolutions. To make text and icons readable on the laptop’s 3200×1800 display, everything has to be scaled up. When I switch to the 1080p monitor, Windows uses the same scaling factor as the rest of the system, so I have to manually change those settings every time I connect and disconnect my external monitor. To make matters worse, some of the scaling settings require you to log out of Windows and back in.

On the other side of the machine, Aorus includes an SD card reader and two USB 3.0 ports. You can also see the power button. On the rear of the machine, we get an Ethernet jack and the connector for the machine’s AC adapter.

There are no buttons or ports on the laptop’s thin front edge, but there is a row of five tiny LEDs that indicate the Bluetooth, Wi-Fi, storage, battery, and power status. The speakers are located at the front of the notebook on its left and right edges. The speakers perform about as well as I’d expect for tiny drivers. They don’t offer much in the way of bass, and have a rather tinny treble range. However, I expect that most users will either use a headset or plug the laptop into a real speaker system.

The Aorus X3 offers a number of utilities to access the laptop’s important functions. I’ll draw your attention to the Command & Control interface, the System Gauge panel, and the macro engine.

Through the Command & Control interface, you can access a number of important functions. Notably, you can change the machine’s power profile, adjust the fan settings, access the System Gauge, and turn on and off a number of functions, like the Windows key.

The System Gauge gives you a quick look at vital information about your hardware. Five gauges indicate your used disk space, GPU and memory utilization, and remaining battery life. At the bottom of the screen, the System Gauge provides CPU and GPU temperatures, the current speed of the fans, and an estimate of the time remaining on the battery. In my experience, the battery estimate was fairly accurate.

Through the MacroEngine, users can program the five macro buttons on the left side of the keyboard. Users can define up to five different groups and cycle through them using a colorful button in the upper left-hand corner of the keyboard.

 

Memory subsystem performance

To assess the performance of the Aorus X3, we used benchmark applications and data from our reviews of AMD’s A10-7800 and FX-8370E. Additionally, there’s some data from our review of Intel’s Broadwell-powered NUC5i5RYK, which packs a Core i5-5250U. This data set includes some Extreme Edition desktop processors in addition to the ultra-low-power variants, so the graphs show a broad spectrum of performance.

There’s a large delta here between the bandwidth on tap from the Extreme Edition Core i7 processors and everything else. The X3 comes out near the top of the pack, but its DDR4 RAM doesn’t give it a notable edge over the other CPUs in our stable.

Productivity benchmarks

The following desktop-style applications test both single-threaded and multithreaded CPU performance. Let’s see what happens when we give the Core i7-6700HQ a chance to stretch its legs and run.

The Intel Core i7-6700HQ posts a middle-of-the-pack score in the SunSpider benchmark, but otherwise, it’s a great performer. It’s particularly impressive to see how this mobile processor not only competes with desktop processors, but also beats a few of them. Considering that the i7-6700HQ is a 45W part in a 14″ notebook chassis, that’s no small feat.

The Handbrake video encoding test also puts the i7-6700HQ in some prestigious company. It’s just shy of a tie with the i7-4790K, an unlocked Haswell desktop processor. I also ran the i7-6700HQ through the x264 benchmark using the settings and encoder build from our Intel Core i7-6700K review. It averaged 62 frames per second, putting it well behind the Core i7-5960X, but in competition with the Core i7-5775C and AMD FX-8370. That’s remarkable for a mobile part.

 

Cinebench rendering

Cinebench benchmarks a processor’s performance by rendering a 3D scene without the help of the GPU.

In the single-threaded test, the Core i7-6700HQ holds a small lead over the other lower-power processors in the list, but it falls a bit behind its desktop Intel brethren. When the application takes advantage of the i7-6700HQ’s four cores and eight threads, however, the CPU takes a commanding lead over lower-power parts and begins trading punches with some powerful desktop-class company.

From a CPU-performance standpoint, we’re quite impressed with the X3 Plus v5. We don’t think it’s a stretch to say this machine offers desktop-class performance in a tiny footprint. For media and photo pros who need to pack light without sacrificing processing power, this notebook could be just the ticket.

Storage performance

The Aorus X3 we tested comes with a 512GB Samsung SM951 M.2 drive. We’ve already reviewed this drive in detail, and we found it to be among the speediest SSDs around. To briefly sum up that review, the SM951 can provide spectacular performance when it’s appropriately cooled. The tight enclosure of a slim laptop is likely one of the hotter environments this drive will encounter, so let’s take a look at sequential read results from Iometer at two different queue depths.

In this quick test, my numbers came in a little bit lower than the ones we recorded in our earlier review, but not significantly so. The SM951 in the Aorus X3 still performs at a very high level. It looks like the X3 Plus’ cooling system is up to the task of keeping the SM951 cool.

It’s especially revealing to look at the difference between the SM951 and the popular 850 EVO. In synthetic benchmarks, the NVMe SSDs that have cropped up recently have set new standards for storage performance. I’ll add that the numbers fit my subjective experience with the Aorus X3. It’s been a number of years since I upgraded the system drive in my personal desktop, so the difference between my aging SATA SSD system drive and the SM951 was startling. Boot times and application load times are noticeably shorter, and the drive is just overall more responsive.

As a final note about the drive, I’ll say that I’m happy that Aorus included a 512GB drive in the X3 Plus v5. It’s really easy to fill up 256GB with a decently sized Steam library. If variety in your gaming isn’t that important to you, 256GB should be fine for now, but I think people buying gaming laptops this year will end up regretting their decision if they don’t spring for a 512GB or larger drive.

 

Gaming performance

The greatest challenge a gaming laptop faces is, well, how it handles games. Our goal when testing the X3 Plus v5 was to figure out what sort of graphics-quality settings we could dial in while maintaining playable frame rates. We checked how well the machine could run games at its 3200×1800 native resolution, and we also dropped the resolution back to 1920×1080 to see how the machine performed under that less-demanding load.

Ashes of the Singularity
Ashes of the Singularity has been getting plenty of attention thanks to its built-in DirectX 12 benchmark. That tool fills the screen with countless warring spaceships, and then zooms out a few times so that you can see even more of the extraterrestrial battlefield.

I tested Ashes of the Singularity at three of its graphics quality presets. At the high quality setting, the game uses high quality textures, medium quality shadows, and 2x MSAA. Dropping down to standard maintains the texture quality, but lowers shadow quality, terrain object quality, and shading samples. The low preset turns down textures and turns off antialiasing, among other features.

During my tests, Nvidia released a driver that provided optimizations for Ashes of the Singularity. It was an eye-opening lesson in the importance of driver support. In my first DirectX 12 tests, I encountered color glitches, flickering textures, and stuttering animations. With updated drivers, these problems disappeared immediately.


Using the DirectX 12 renderer at 3200×1800, I had to dial the settings way down to get average frame rates into the neighborhood of 45-50 FPS. However, dropping the resolution to 1080p didn’t change my results as much as I expected. Even at the lower resolution, the low graphics preset was the best option. 34 FPS on standard settings is marginal for a real-time strategy game, in my opinion. These results just go to show that Ashes is an extremely challenging game for today’s systems to run well. Gamers after the best results should probably use the DirectX 11 renderer with the Aorus, too.

Tomb Raider

Next, I queued up the built-in benchmark from the 2013 reboot of the Tomb Raider franchise. The Aorus X3 performed admirably here, even at the display’s native 3200×1800 resolution.

The ultra graphics preset employs tessellation, 16x anisotropic filtering, and FXAA. Dropping down to high turns off tessellation, lowers the anisotropic filtering to 8x, and lowers the texture quality. The normal preset dials back anisotropic filtering even more, and the game lowers the settings for shadows, overall level of detail, and hair quality. The difference between the ultra and high presets isn’t that dramatic, but there’s noticeably less eye candy at the normal preset.


At the screen’s native resolution, my frame rate was on the south side of playable on the ultra presets. I saw quite a few dips below 30 FPS using those settings. However, the game was more than playable on high settings. Turning down the resolution to 1080p increases the average FPS so much that gamers should look around in the settings for some visual effects to turn up.

As a brief note, no setting impacted the game’s performance more than TressFX. It’s a gorgeous bit of technology. My wife insisted that I keep it on when she tried the game out for a few minutes, but I found that performance can be significantly increased by simply turning it off—in some cases, by up to 40 FPS, on average. To find playable frame rates with TressFX turned on, I had to stick with 1920×1080.

Fallout 4

Bethesda’s most recent addition to the venerable Fallout franchise was less friendly to the Aorus than Tomb Raider was. I played with the settings for quite some time to find something comfortable. I used FRAPS to benchmark runs through an early scene in the game where the main character, along with many of his neighbors, runs to the local Vault. That scene shows numerous character models in an outdoor setting, so it should serve as a decent indicator of how the game will perform.

I tested the ultra and low graphics presets. On ultra settings, the game uses ultra quality textures and shadows, temporal anti-aliasing, and 16x anisotropic filtering. On low settings, Fallout 4 drops down to medium quality textures, low shadows, and FXAA, and turns off anisotropic filtering.


I don’t recommend playing Fallout 4 at the Aorus X3’s native resolution. Using the low preset at 3200×1800, this machine churns out about 44 FPS. Normally, I’d consider that frame rate playable, especially since the game’s VATS system reduces the need for the hyper-fast reflexes so vital in most first-person shooters. While the average frame rate might look inviting, though, I experienced a number of frame-rate drops that made the experience less than smooth. Worse, character faces just don’t look that good at the low preset. I hate to pick on Bethesda, but the faces in this game aren’t that amazing to begin with, and they look even worse with low-quality shadows and textures.
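This is a good example of why an average frame rate can mislead. Given a FRAPS-style frame-time log, a few lines of code can separate the average from the dips. The log below is hypothetical, not my Fallout 4 data:

```python
def fps_stats(frame_times_ms):
    """Average FPS plus the share of frames slower than 33.3 ms
    (i.e., momentary dips below 30 FPS)."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    slow = sum(1 for t in frame_times_ms if t > 1000 / 30)
    return 1000 / avg_ms, slow / len(frame_times_ms)

# Mostly smooth ~22 ms frames with a handful of 40 ms spikes: the
# average FPS looks inviting, but 10% of frames are visible hitches.
avg_fps, dip_share = fps_stats([22] * 90 + [40] * 10)
print(round(avg_fps, 1), dip_share)  # 42.0 0.1
```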

At 1920×1080, however, the game looks great, especially when you turn up the settings a bit. The ultra preset was perhaps a bit too demanding, but the game looks quite a bit better at a lower resolution with improved textures and shadows.

Dota 2

Despite their relatively simple graphics, MOBA games like Dota 2 are quite popular. While these games may not have the prettiest eye candy, they need to run at high frame rates. Gamers need to be able to respond quickly and decisively at critical moments, or risk being flamed by their teammates. Dota 2 has a slider in its graphics menu to quickly adjust settings. I started at the “best looking” preset, and then did some runs at the second and third best presets.

Gamers should have no trouble getting the performance they need out of Dota 2, even at the display’s native resolution.

Older favorites

Newer, more demanding titles are useful for benchmarking, but a game’s popularity is rarely determined by the quality of its eye candy. The biggest e-sports titles, for example, aren’t all that demanding on your hardware. I ran a few of my personal favorites to get a sense of how the X3 Plus can handle them. Since it’ll become painfully obvious in a moment, I’ll just admit from the start that I’m a total Blizzard fanboy, and have been since the days of Warcraft II. I tested a few of my Blizzard favorites on the X3 Plus, and it crushes all of them.

In an absolute worst-case scenario for StarCraft II (maxed-out settings and four maxed-out armies at 3200×1800), I did get frame rates below 30 FPS. Adjusting the settings slightly ensured that StarCraft II never dropped below 60 FPS. My beloved Hearthstone, a great waster of my time these last two years, looks gorgeous at 3200×1800, even if the mouse cursor is tiny at this resolution. (Considering that Hearthstone could probably run on a potato, I don’t know how impressed you should be here.) At max settings, Diablo III cruises along at 75 FPS.

Similarly, Portal 2 and other Source engine games have never looked better. While it’d be overkill to purchase a $2200 laptop just to play these older titles, there is an argument for investing in decent hardware. If these are the kinds of games you play now, and you’d like to keep your options open for upcoming titles, a capable gaming machine isn’t a bad consideration even if you won’t immediately stress its capabilities.

Can it do VR?

Nvidia’s mobile GeForce GTX 980 (non-M) is the only mobile graphics card the company calls VR-ready, so the GTX 970M in the X3 Plus v5 probably isn’t up to the task of driving Oculus’ Rift or HTC’s Vive. Still, I’m a curious man, so I tried a few things to assess the X3 Plus’ VR capabilities.

Valve’s recently released SteamVR Performance Test offers a simple way to test systems for VR readiness, so I gave it a whirl. I was surprised to not only see this system fail, but fail spectacularly. With a score of 3.7, this laptop doesn’t even qualify for the “Capable” category. I poked around and discovered that the SteamVR test was running on the integrated Intel HD 530 graphics instead of the much more capable GeForce GTX 970M. 

I tried a few tricks to convince SteamVR to use all of the X3 Plus v5’s graphics horsepower, but to no avail. Forcing the SteamVR performance test to run on the GTX 970M using Nvidia’s control panel had no effect, so I started rummaging around in the machine’s BIOS. To my chagrin, Aorus provides the option in BIOS to disable graphics acceleration, but not to turn off the integrated graphics. That’s because Optimus uses the Intel IGP to connect to the laptop’s display.

At that point, I called off my attempts to get the SteamVR Performance Test running. We’ll have to wait and see what kind of VR performance mobile hardware like this can deliver at a later date.

 

Battery life and mobile gaming performance

To test the Aorus X3’s battery life, I ran down the machine in two different situations. To simulate regular usage, I used the tried-and-true Tech Report BrowserBench. BrowserBench reloads an old version of our site’s home page (ah, the nostalgia) and cycles through content every 45 seconds. To mimic gaming on the go, I ran Unigine’s Heaven benchmark until the system automatically shut down.

For the BrowserBench run, I changed the power setting to “balanced.” I dimmed the brightness of the screen to a reasonable level—dim enough to save power, bright enough to actually read the screen comfortably. The Aorus shuts down with 10% of its battery life remaining, and it reached that point after 4 hours and 35 minutes of web browsing. That’s nothing spectacular for a modern machine, but it’s not bad, either.
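Given the 73.26Wh battery and that 10% cutoff, the runtime implies an average system power draw we can back out with a little arithmetic (a sketch using our measured figures):

```python
def average_draw_watts(battery_wh, usable_fraction, runtime_hours):
    """Implied average power draw from battery capacity and runtime."""
    return battery_wh * usable_fraction / runtime_hours

# 73.26 Wh pack, shutdown at 10% remaining, 4 hours 35 minutes of
# browsing: roughly 14.4 W average draw for the whole system.
print(round(average_draw_watts(73.26, 0.90, 4 + 35 / 60), 1))  # 14.4
```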

For the gaming tests, I ran the same test three times at the laptop’s different power settings: high performance, balanced, and power saver. I turned the high-performance run into a worst-case scenario for the battery by leaving the screen brightness at 90%. For the other runs, I let the power preset lower the brightness to a dimmer setting.

According to my tests, the Aorus X3 can game on battery for a little less than an hour. Changing the power profile buys you an extra ten or fifteen minutes of playing time. For a more subjective test, I unplugged the laptop and played some games for a bit. The laptop lasted a bit longer, about an hour and a half. That’s somewhat disappointing, but it’s also to be expected given the powerful hardware inside the X3 Plus.

During this subjective testing, I noticed something peculiar. I expected performance to drop when I set the laptop in balanced or power saver mode, but I didn’t expect it to drop as far as it did. When the laptop is plugged in, my old favorite Diablo III averages about 120 FPS at 1080p. When I unplug it—even at the high performance setting—average FPS drops to around 30, and input lag starts causing some serious frustration. According to some folks on the Battle.net forums, though, this is a problem with Blizzard games and Nvidia’s Optimus switchable graphics technology.

Fallout 4 ran better unplugged than Diablo did, offering a playable experience at 1080p. Given these wildly varying experiences, I dug into the Aorus’ mobile gaming performance a bit further. To get a more objective sense of how much performance varies between the different power settings, I ran several benchmarking runs using Tomb Raider. I ran each test at 3200×1800, testing three different graphics presets with each of the laptop’s three power profiles.

While the difference in performance isn’t as dramatic here as it was for Diablo III, dropping the power profile down to balanced incurs a steep performance penalty. At each setting, average FPS dropped by about 25%. Interestingly, there wasn’t much performance difference between the balanced and power saver settings in this particular benchmark. It seems that the middle path doesn’t make as much sense here. If you’re going to accept the performance hit, you might as well drop all the way down to power saver mode. In other applications, I found that power saver mode had a larger effect on performance.
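As a quick sanity check on those numbers, the percentage drop works out as follows. This trivial helper isn’t from any benchmark tool; the 120-to-30 FPS figures are the Diablo III case described above, and the ~25% figure is the Tomb Raider case:

```python
def pct_drop(plugged_fps, unplugged_fps):
    """Percentage of average FPS lost when running on battery."""
    return 100.0 * (plugged_fps - unplugged_fps) / plugged_fps

# Diablo III: ~120 FPS plugged in, ~30 FPS unplugged
print(pct_drop(120, 30))  # 75.0 -- far steeper than Tomb Raider's ~25% drop
```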

These performance drops come as no surprise: the X3 Plus throttles back its CPU and graphics card considerably when it’s on battery. I did some informal monitoring of these clock speeds using AIDA64 with the machine unplugged, and the CPU’s highest clock speeds didn’t even reach the base clocks I saw while the Aorus was plugged in. The graphics card’s core and memory clocks fall significantly, too. All told, those clock speed reductions lower the machine’s performance quite a bit.

To be fair, this behavior is understandable. If the X3 Plus ran all-out while it was unplugged, gamers would see an even shorter runtime than our already-abbreviated benchmarking runs. It’s probably most sensible to look at this machine as a high-performance PC that can easily move between power outlets rather than as a true solution for gaming or heavy number-crunching away from a plug.

Cooling performance and noise levels

To my way of thinking, one of the primary ways that well-engineered systems distinguish themselves is in how they handle heat. It’s no mean feat to balance performance, temperature and noise inside of a constrained space. With the X3 Plus v5, Aorus crams some high-performance parts inside of a slim chassis, and that creates challenges for the machine’s cooling system.

At idle, the X3 Plus is quiet. Its fans never stop spinning, so there’s always a hint of noise from the cooling system. Any ambient noise will drown it out, though. The X3 Plus’ temperatures and noise levels certainly rise once you put the system to work. Using the built-in System Gauge tool to track temperatures and an Android app to measure sound levels, I put the X3 to the test.

After starting a demanding application, CPU and graphics card temperatures both rise above 70 degrees Celsius, causing the fans to spin up to about 3780 RPM. At this speed, the fan noise registers about 35 dBA near the machine. As the system continues to work, the entire laptop heats up. The keyboard is warmest underneath the “K” key, which is thankfully a healthy distance away from where my hand usually sits while gaming. The bottom and back panels of the laptop both get fairly hot; my infrared thermometer registered temperatures above 40 degrees Celsius.

The Aorus X3 will spin its fans up to 4321 RPM when CPU temperatures climb past 80 degrees Celsius. I measured noise levels over 45 dBA when the system was running all-out, too. The system certainly makes itself known when the fans are running this fast, but the character of its sound isn’t unpleasant. To my ear, there was none of the whine and rattle that occurs when cheap fans are pushed to their limits.
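Putting the measured thresholds together, the fan behavior I observed looks roughly like a stepped curve. This sketch is only a model of what I saw, not Aorus’ actual firmware logic, and the idle speed below is a placeholder I made up; only the two load steps were measured:

```python
def fan_rpm(cpu_temp_c):
    """Stepped fan response as observed on the X3 Plus v5.
    The two load steps are measured figures from this review;
    the idle value is a hypothetical placeholder (the fans never
    fully stop, but I didn't measure their idle speed)."""
    if cpu_temp_c > 80:
        return 4321  # ~45+ dBA measured at this speed
    if cpu_temp_c > 70:
        return 3780  # ~35 dBA measured at this speed
    return 1500      # idle: assumed low speed, not measured
```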

Firing up the Prime95 Small FFTs test on the X3 Plus’ CPU and Unigine Heaven on its graphics card all at once can push the X3 Plus v5 into thermal throttling. To be fair, though, this extreme case isn’t at all representative of the way most owners will use this machine, and we never saw thermal throttling in regular use. The relatively high temperatures I observed during my tests do suggest that regular cleaning will be important for this laptop, though. If the bottom vents get clogged, the cooling system could be overburdened. You also won’t want to game with this machine on your lap, but that’s not a very natural position for fragging, anyway.

 

Conclusions

To wrap things up, let’s start with what the Aorus X3 does well. It offers truly impressive performance for a small, slim notebook. Intel’s Core i7-6700HQ and Nvidia’s GeForce GTX 970M are a potent combination that’ll change your expectations of what laptop hardware can accomplish. This machine has enough pixel-pushing power to run modern titles with plenty of eye candy turned on at 1080p, and I was also impressed by how well the X3 Plus v5 handled the unforgiving benchmark in Ashes of the Singularity. Excuse us while we pinch ourselves, but it’s hard to believe a 14″ notebook can be so powerful.

All that being said, I felt let down by the X3 Plus v5’s display. The lack of G-Sync in a machine this expensive is hard to overlook, though G-Sync and Optimus can’t coexist right now, so that’s not Aorus’ fault. The 3200×1800 IPS screen itself is pretty and calibrates well, but the GTX 970M struggles to drive recent triple-A titles at the panel’s native resolution. The panel’s slow response times can also make fast-twitch gaming unpleasant, and the screen’s below-average luminance uniformity might cause some content creators to blanch. Those flaws hurt this machine’s gaming and workstation cred alike.

Aorus does deserve praise for the design of the X3 Plus v5. Too many companies make “gaming” hardware that’s gaudy and adolescent-looking, but the X3 Plus v5 is sleek and understated. If you don’t want your laptop to embarrass you in front of clients and coworkers, the X3 Plus (or one of Aorus’ other notebooks) is definitely worth a look. I also appreciate the X3 Plus’ cooling system, which is more than up to the demands of the high-performance parts inside. The fans get noisy enough under load that you might not want to play Grand Theft Auto V down at the local Starbucks, but the X3 Plus should be a great mobile gaming companion in a hotel room or other more private spaces.

With all its virtues, we’d heartily recommend the X3 Plus v5 if it used a better display, even for its $2200 as-tested price. Slipping a machine this powerful into a backpack or briefcase feels like living in the future, and that’s the kind of special experience we’d want if we were to shell out for a PC like this one. As it stands, though, the flawed display keeps us from sending the X3 Plus v5 home with our full endorsement.

Comments closed
    • Vhalidictes
    • 4 years ago

    This article is odd. A high-DPI screen isn’t appropriate for a laptop, in more ways than just the Windows desktop and applications needing a scaling factor to be seen well. There are also games running at a native resolution that can’t be supported by mobile GPUs.

    In fact, the only possible use for High DPI in a smaller laptop is running games without any AA; but a smaller number of pixels with AA would get around the same framerate (if not better depending on the game engine).

    This would be a solution looking for a problem, but it’s not even that since not all applications are high-DPI-aware.

    • Crackhead Johny
    • 4 years ago

    “At $90, Lagavulin’s regular bottle is one of the more expensive Scotches on the shelf.”
    I’m not a scotch guy, but all the 42s (for 42nd birthdays) I have bought have cost well more than that. At the liquor stores I go to (dedicated liquor stores, not “Jerry’s Gas and Liquor” or “Frat Row Booze Warehouse”), $90 seems about middle of the road*. I have seen $600 bottles of scotch at Costco, for Christ’s sake.

    As for gaming laptops, I have begun to wonder about the point. We all have hauled gaming boxes to LAN parties (are those still a thing?). If you are not going to a LAN party, why do you need the portability?
    For “Work but able to game” it would seem better to build a gaming rig and have a cheap work laptop for less money than a gaming laptop.
    Am I wrong or is a gaming laptop the computer equivalent of “All season radials” AKA “No season radial”. When I got mine it seemed like a good idea until I realized I should have had 2 machines for less money than the 1 laptop. It isn’t like I need to play WoW at a job site.
    Is it mainly great if you are a game reviewer for a tech site who can test the games they need to review then write the review while at business meetings?

    *This does not include the thing called “blended scotch” as Scotch.

      • EricBorn
      • 4 years ago

      I wondered if anyone would complain about that sentence about Scotch. Absolutely, there are more expensive bottles out there.

      However, when you start to compare the baseline offerings of many distillers, you find that Lagavulin’s price starts higher than most. Glenmorangie’s lowest price bottle, for example, starts at about $50 in my state. Do they have more expensive expressions? Absolutely, but the entry point is a lot lower.

      As to the gaming laptop questions, I’ll just briefly say that while I was writing this article, a friend of my wife asked me for my opinion on what laptop she should buy for her college-bound son. She wanted something portable so he could use it for school, but she also wanted something that he could play his favorite games on.

        • Crackhead Johny
        • 4 years ago

        As someone who was playing 9+ hours a day, 7 days a week, of Q2 throughout most of college, I’m not sure a gaming laptop is a good plan.
        If I was the student, that would be awesome. If I was that student 20 years later, it might not be as awesome in hindsight. If I was the parent who was paying for the college, I’d flay the person who recommended it.

        I can only imagine the GPA nightmare that WoW was. I have met people that EQ took out of college.

        Nuke your GPA with the Aorus’ X3 Plus v5 gaming laptop!

    • WaltC
    • 4 years ago

    If people really like gaming and do it more than casually, I’d recommend they steer completely clear of laptops and go with a desktop. Best option: build your own desktop with handpicked peripherals. A big problem with laptops is that most of them use custom hardware that requires you use GPU drivers (among others) furnished only through the laptop OEM…and some of them really drag their feet in the driver department. On most any game forum you visit, 8 out of every 10 problems have to do with a laptop being used for gaming. Use the laptop for light tasks that require mobility; use the desktop for the heavy lifting and grunt work, like gaming. (Unless by gaming you mean those dinky little cell-phone games and Solitaire, etc. But you don’t need a “gaming” laptop for anything like that.)

    I also wholeheartedly agree with Kretschmer, cramming a 3k display into a teeny-tiny little 14″ screen is idiotville revisited…;) For all of the reasons he listed and more.

      • travbrad
      • 4 years ago

      I tend to agree. You can build a gaming desktop with better performance AND still get a cheap decent laptop for non-gaming purposes for the same price or cheaper than you can get a “gaming laptop” for. The only people a gaming laptop really makes sense for IMO are people who are constantly travelling. Even then you will need a power outlet to actually play games though.

      Gaming laptops aren’t great as laptops because their battery life sucks, and they are way too expensive for the gaming performance you are getting. Plus it’s much harder (or impossible) and more expensive to upgrade or replace parts when something goes wrong.

      The biggest problem with this particular one is still the display though. That resolution combined with a GPU similar to a desktop GTX960 seems like a terrible match, not to mention that ridiculous response time.

      • ronch
      • 4 years ago

      Yup. Build a serious gaming desktop for $1,500 and buy a $700 laptop. Unless you’re one of those RKOI (Rich Kids of Instagram), you probably don’t have time or energy to play games while on a plane to Paris anyway.

        • travbrad
        • 4 years ago

        You definitely won’t have the energy unless you bring a car battery with you.

          • ronch
          • 4 years ago

          Yeah. And if you need to game everywhere you go, you probably need a life.

            • iBend
            • 4 years ago

            or just buy a smartphone… it can fulfill your gaming needs for 4-5 hours or more each charge…

            a gaming laptop that can only run games for less than an hour is totally useless, lol

        • Vhalidictes
        • 4 years ago

        Something I’ve found out by accident – modern Intel integrated graphics and CPUs are fast enough to use Steam Streaming.

        I’ve played a game on my desktop using my laptop as the input/output device over WiFi. It works really well.

    • Captain Ned
    • 4 years ago

    “However, I only tapped the screen out of habit.”

    May I never have to utter those words.

    • coolflame57
    • 4 years ago

    I’m liking this Eric Born dude.

      • EricBorn
      • 4 years ago

      It’s mutual.

    • alrey
    • 4 years ago

    How can a 970M be a luxury? From the glamorous introduction, I was expecting at least a 980M.

      • EricBorn
      • 4 years ago

      There’s always something a little better.

      Seriously, though, 970M exceeded my expectations.

      • TwoEars
      • 4 years ago

      It’s a 14-incher. Try finding a 14-incher with the 980M.

    • Kretschmer
    • 4 years ago

    I’m beginning to feel like a broken record, but there is no reason to cram a 3K display into a 14″ unit. Scaling issues, battery life degradation, and price trade-offs (e.g. display quality) make it a losing proposition, and the unit can’t..you know..game at that resolution.

      • Deanjo
      • 4 years ago

      That is still a lower PPI than most phones have nowadays. The iPhone 4 was 329 PPI (vs. this one’s 260 PPI), and there is a noticeable improvement going to an even higher PPI on newer phones.

        • DrDominodog51
        • 4 years ago

        Your phone is a lot closer to your face than a laptop.

          • Deanjo
          • 4 years ago

          Not really, but if you think that is that big of a deal, many tablets also have a much higher PPI and are held further away, and you can still tell the difference between a hidpi display and one that isn’t.

          Let’s put it this way, look at a 15″ MBP non-retina @ 1,680 × 1,050 (132 PPI) vs one with retina @ 2880×1800 (220 PPI). If you cannot immediately tell the difference, then you seriously have really poor eyesight.

          That’s not that far off from a 1080P (157 PPI) display vs actual display on this guy 3200×1800 (260 PPI).
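(For reference, the PPI figures being compared in this thread all follow from the standard diagonal-pixel formula; a quick sketch to check them:)

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(int(ppi(960, 640, 3.5)))     # 329 -- iPhone 4
print(int(ppi(2880, 1800, 15.4)))  # 220 -- 15" retina MacBook Pro
print(int(ppi(1920, 1080, 14)))    # 157 -- 1080p at 14"
print(int(ppi(3200, 1800, 14)))    # 262 -- the X3 Plus v5's panel (~260)
```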

            • DrDominodog51
            • 4 years ago

            I’m about 20″ away from a 23″ 1920×1080 (96 PPI) display and I can’t see pixels without straining my eyes looking for them on the edge of small font. Any distance beyond 9″ from my old, decommissioned 3″ LG Rumor Touch @ 420×240 (161 PPI) and I can’t see the pixels.

            I probably could tell the difference between the two screens, but I have no need for the retina screen. A better screen would be nice and all, but with my vision, more battery life and a screen the GPU can drive easily would be more useful.

            • Deanjo
            • 4 years ago

            Battery life has not suffered at the hands of higher resolutions. Again take a look at the MacBook Pro, battery life went UP after introducing higher res screens. As higher res screens come out, the rest of the system has become more efficient.

            (Again, I would get your eyesight checked if you cannot see the difference of hiDPI displays, seriously, my father with progressives can even tell the difference).

        • Froz
        • 4 years ago

        The problem is that smartphone and tablet software is prepared to handle that kind of PPI, but Windows is not. I’m really surprised it’s taking them this long, but DPI scaling is still very problematic. Text, for example, is often less readable than without DPI scaling, contrary to what the article claims. Google “blurry text” to find lots of complaints about that, especially in Win 10. I guess there are some ways to fix it at least partially, but it should work out of the box. On top of that, you of course have issues with software: some will accept the rules and scale, some will not. In general, it’s a mess.

          • Deanjo
          • 4 years ago

          Well that is (and always has been) a Windows problem. Fortunately there are other OS’s that scale beautifully.

        • travbrad
        • 4 years ago

        I would say gaming performance should be more important than PPI on a GAMING laptop though, even if you do generally like ridiculously high PPI displays.

    • thesmileman
    • 4 years ago

    After seeing a G-Sync monitor I will not buy a laptop without it. It really is that nice.

      • VincentHanna
      • 4 years ago

      Which makes perfect sense. Even high-end laptops will struggle at 60fps in many games, and playing at essentially 30-40fps is simply untenable without Gsync (barring the occasional MMO.) Laptops ought to be the number one group pushing gsync/freesync.

    • DragonDaddyBear
    • 4 years ago

    Another scotch lover, eh?

    • TwoEars
    • 4 years ago

    Top tip for you: [url<]http://www.amazon.com/GIGABYTE-P34Wv5-SL3K2-GTX970M-i7-6700HQ-Computer/dp/B0195XZW64/ref=sr_1_1?ie=UTF8&qid=1459525996&sr=8-1&keywords=gigabyte+p34w+v5[/url<] 14-inch IPS 1080p, 128GB ssd + 1TB regular, skylake, 970m, $1600. I have a slightly older version and it's great.

      • cygnus1
      • 4 years ago

      That’s not far off from being a newer version of the Gigabyte laptop I have, the U2442. This laptop is about 3 or 4 years old now, I think. It spent its first year with me in Afghanistan, and I’ve swapped several of its parts re-configuring the storage, RAM, and wifi. It’s honestly never been abused, but it’s unfortunately very much showing its age. The dGPU is fried (literally, burned traces leading from the power input to the GPU; no idea how that happened, and I have to keep it disabled in BIOS or the system hard locks), many bits of plastic are broken (the keyboard is close to falling out), and it runs very hot and loud. I don’t think I plan to buy another laptop with a dGPU, mainly because of that heat and noise.

        • TwoEars
        • 4 years ago

        The gigabyte I have switches to the intel gpu under light load and all the fans switch off completely. It becomes totally silent except for a VERY faint clicking from that 1TB mechanical hard drive.

        Under load it’s a bit loud, but not louder or hotter than other gaming laptops. They’re all loud and hot; hard to get around.

        Just FYI.

        • Ikepuska
        • 4 years ago

        I had a similar experience with a Y series laptop in Afghanistan. The dGPU crashes under load immediately; it looks like the dirty power and problems with the lead-free solder were the problem in my case.

    • Anovoca
    • 4 years ago

    You want to see bad resolution scaling? Try playing Mass Effect 3 on that laptop.

      • TwoEars
      • 4 years ago

      Yeah, I don’t like those high-res displays. Too many games and applications that can’t handle it.

      14-inch, 1080p, ips. That’s perfection for me.

    • mczak
    • 4 years ago

    Well, the lack of G-Sync is expected, it’s just not feasible. G-Sync with Optimus isn’t possible, so you’d have to say good bye to battery runtime. Not going to happen in this form factor (there’s some more bulky notebooks which skip Optimus in favor of G-Sync).
    Unless intel starts supporting adaptive refresh (frankly I expected that for Skylake but apparently it didn’t happen).
    The display seems like a poor choice indeed though – the measured response times are close to highest ever measured at notebookcheck, there’s definitely faster IPS notebook screens out there.

      • Jeff Kampman
      • 4 years ago

      [s<]Aorus offers other, larger machines with G-Sync and Optimus on board, so it doesn't appear to be a limitation of that combo.[/s<]

        • cygnus1
        • 4 years ago

        I know this will sound harsh, but I believe that’s a factual error, Jeff. I actually would really like to see a laptop with both, so I double-checked… I see 3 models on their website: the 14″ TR reviewed with Optimus and no G-Sync, a 15″ with the same, Optimus and no G-Sync, and finally a 17″ with G-Sync and no Optimus mentioned in its spec list.

        Please point to a laptop that actually lists both and explain why you think the nVidia FAQ on G-Sync is wrong.
        [url<]http://www.geforce.com/hardware/technology/g-sync/faq#q17[/url<]

          • Jeff Kampman
          • 4 years ago

          Welp, you’re right. Wasn’t aware of that.

            • cygnus1
            • 4 years ago

            No worries. It really would be the ideal config if they could pull it off. Being able to save power to make the thing actually mobile. There’s a reason they don’t mention battery life at all on their 17″ with G-Sync (from dual/SLI 970M’s) and no Optimus.

            Honestly I think my next laptop purchase is going to be some sort of touchscreen ultrabook with TB3 that can hook up to something like Razer’s exGPU dock for gaming at home on the big monitor.

        • mczak
        • 4 years ago

        Which model would that be? I’ve just checked and haven’t seen any. It’s either G-Sync or Optimus.
        The only way to make both Optimus and G-Sync work without intel supporting adaptive refresh would be with a display mux. Optimus nowadays doesn’t support that type of operation (it certainly did when display mux were in fashion more than 5 years ago, but straight from nvidia: [url<]http://www.geforce.com/hardware/technology/g-sync/faq#q17).[/url<] But there are indeed some notebooks which seem to use a mux. So, you can switch them in the bios - either you get optimus (display always driven by IGP), or you get G-Sync (display always driven by discrete chip). So, in that sense, it's possible to have G-Sync and Optimus with the same notebook - but not at the same time. That looks like a very half-baked solution to me, and adding cost as well (so it looks like few vendors do it). The real solution will really be intel supporting adaptive refresh, everything else is a dead end. So, maybe it's not quite infeasible, but still, the hurdles are considerable (so, you really only see it on some high-end gaming notebooks using gtx980m or sli).

          • Jeff Kampman
          • 4 years ago

          Much as it pains me to admit it, I was wrong and skimmed a spec sheet. As other posters have pointed out, G-Sync and Optimus can’t coexist right now. As you note, we’ll probably have to wait for Intel to start supporting Adaptive-Sync in its IGPs before Optimus and G-Sync can work together. I’ve also softened the conclusion of our review a bit in light of this new info.

            • mczak
            • 4 years ago

            Thanks for the update!
            FWIW the rumors are saying Kaby Lake might support adaptive sync, so let’s hope it indeed will…
            (btw some funny combination which technically should work already at least from the hw perspective (I sort of doubt the drivers would be ready for it) for “G-Sync” with Optimus, would be an amd apu (carrizo) with a discrete nvidia gpu – not a combination which is supported anywhere though…)

            • cygnus1
            • 4 years ago

            There’s another plus to picking Optimus over G-Sync, I just realized: Miracast streaming. Miracast doesn’t work if the iGPU is disabled and the dGPU is primary, as is required for G-Sync.

            • Andrew Lauritzen
            • 4 years ago

            Note that you are never going to see “gsync” in particular supported outside of NVIDIA hardware because it is not an open standard. Thus you could possibly see “freesync” or equivalent out of these multiadapter machines, but not gsync.

            Similar for VR – I’m not sure why you tried to test that in the review as VR is fundamentally incompatible with multiadapter systems like this today. The HMD needs to be plugged into the same GPU as is doing the rendering, so there’s no way you’re going to be able to force it to work on a system like this. Even if you could, as noted, it does not meet the min spec for VR anyways.

            We have a few of these machines at the office and they are certainly some of the nicer “gaming” laptops. I personally would not use anything more obnoxious than this. Agreed that the 4k monitor is overkill and a bad tradeoff, but at least it’s not the disaster it once was in terms of browsers and so on. That said, you definitely *do* notice the blur and bleeding/smearing in motion in games on this monitor, which is highly unfortunate…

            But if you want a higher end gaming laptop that isn’t completely obnoxious it’s one of the better choices right now.

            • mczak
            • 4 years ago

            My guess is nvidia will still call this G-Sync. G-Sync is as much a branding as it’s a technology. So, a discrete nvidia gpu coupled with a intel adapative sync compatible IGP would probably get a G-Sync moniker (albeit nvidia could also put some additional requirements to earn that brand moniker, like minimum supported range of frequencies, not too slow response time or whatever).
            Unless they are willing to give up on G-Sync completely (there’s imho no way you’ll see “true” G-Sync notebooks anymore once intel supports adaptive sync).

            • Jeff Kampman
            • 4 years ago

            [quote<]Similar for VR - I'm not sure why you tried to test that in the review as VR is fundamentally incompatible with multiadapter systems like this today. The HMD needs to be plugged into the same GPU as is doing the rendering, so there's no way you're going to be able to force it to work on a system like this. Even if you could, as noted, it does not meet the min spec for VR anyways.[/quote<] Same reason we climb mountains 😉 In all seriousness, though, this is an excellent point and I'll add a note to the review explaining why VR won't work well on Optimus-equipped machines.
