Latest Alienware laptops get desktop Pascal and Polaris graphics

Alienware teased the Internet a few days ago about an important product reveal, and the curtains have now opened. The company is showing off some hefty updates to its well-known lineup of gaming laptops.

Let's kick things off with the big one. The Alienware 17 is now equipped with Tobii's EyeX gaze-tracking hardware, letting owners control games, and Windows itself, with their eyes alone. Recent graphics card releases didn't go unnoticed by Alienware, either. In the green corner, buyers get the option of the celestial GTX 1080, the demigod GTX 1070, and the mighty GTX 1060.

Most interestingly, though, Alienware is perhaps the first laptop maker to incorporate one of AMD's Polaris chips into its gaming notebooks. The company is offering the same Radeon RX 470 graphics chip that's available in desktop cards as an option for the latest Alienware 17 (and 15) notebooks. While we're curious to see what effect the thermal constraints of a laptop form factor have on the 120W RX 470's performance, this is a big design win for AMD.

CPU options range up to a Core i7-6820HK overclocked to 4.1 GHz, in tandem with up to 32GB of RAM. Alienware offers multiple options for the laptops' storage compartments, up to a mix of PCIe and SATA drives. Screen options abound, too: 1920×1080, 2560×1440, and 3840×2160 panels are all on offer. Touchy-feely gamers should enjoy the "TactX" keyboard, which offers RGB LED lighting, a steel-reinforced base, and 2.2 mm of key travel.

Next up is the Alienware 15. This machine is mostly similar to its 17" brother, and offers GeForce GTX 1070 and GTX 1060 graphics options, as well as the fully-enabled Radeon RX 470. Buyers can get a 1920×1080 IPS screen or a 120-Hz panel with G-Sync variable-refresh-rate tech. There's no indication of a FreeSync display for the RX 470, though. The stock battery has a 68 Wh capacity, with an optional 99 Wh upgrade.

Last but by no means least, there's the Alienware 13. Although it could easily be dismissed as the runt of the litter, this machine has a couple of tricks up its sleeve. First, gamers can order it with a GeForce GTX 1060 inside, an unusually potent pairing for a 13" machine. Second, Alienware offers an OLED screen option. Yes, you read that right. The company says this panel has a 1-ms response time and a contrast ratio of 100,000:1. Given our experience with OLED screens, we'd expect black levels like nothing else out there.
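For a sense of scale on that contrast claim, a quick back-of-the-envelope sketch helps. The peak brightness below is an assumed, illustrative figure (Alienware hasn't quoted one), and the LCD contrast ratio is a round ballpark rather than a measured spec:

```python
# Black level implied by a contrast ratio at a given peak brightness.
# peak_nits is an assumed figure for illustration only; the contrast values
# are the panel's claimed spec and a round LCD ballpark, respectively.
peak_nits = 300  # assumed peak white, in cd/m^2

panels = [
    ("typical IPS LCD", 1_000),
    ("Alienware 13 OLED (claimed)", 100_000),
]

for name, contrast in panels:
    black = peak_nits / contrast
    print(f"{name}: black level ~{black:.4f} cd/m^2")

# The LCD works out to ~0.3 cd/m^2, while the OLED's claimed spec implies
# ~0.003 cd/m^2, a black level roughly 100x deeper.
```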

Comments closed
    • Anovoca
    • 3 years ago

    OLED for gaming makes me a bit nervous. I have read enough about the problems with screen burn-in to feel reasonably certain that after a month of usage on that machine, you'd be able to see exactly where my action bar and minimap are located while trying to watch Netflix.

      • tipoo
      • 3 years ago

      I thought that was a solved problem, but I guess we'll only find out if 2016's OLEDs burn in by 2021.

        • Pwnstar
        • 3 years ago

        Five years is a decent amount of time to own a monitor. Get a new one in 2021.

    • ImSpartacus
    • 3 years ago

    Can we stop falsely claiming that desktop graphics cards are making it into laptops and SFF machines that definitely cannot support desktop graphics?

    Yes, I know that modern GPUs have closed the gap between desktop-level performance and mobile-level performance enough for Nvidia to release the m-less “laptop” 980 and 960 (really just a 970m) as well as the entire mobile Pascal lineup without the “m” suffix. However, that doesn’t mean that they magically become desktop cards. They are configured differently and it’s misleading to keep claiming that they are “desktop” graphics. This is the 3rd or 4th time that TR has made this mistake.

    If you want to say “desktop-level performance” or something like that, then that’s tolerable, but these are definitely not desktop cards. They do not use PCIe and they do not have PCIe-like TDPs. They use MXM or MXM-like form factors, which can only support 100W (130ishW if you’re daring).

    I know someone is going to say that the 470 has a 120W TDP and could presumably be implemented in a laptop-friendly MXM or MXM-like form factor without any major issues, but there's no way that's happening. AMD hasn't announced its mobile Polaris lineup yet, but at the very least, it'll be a well-binned Polaris 10 Pro die (NOT a desktop-class Polaris 10 Pro die). I wouldn't be surprised if it were an underclocked/undervolted Polaris 10 XT die that achieves 470-like performance. Nvidia did that with the laptop 1070 to make it more mobile-friendly, so I wouldn't be surprised if AMD did the same for the laptop 470.

    The bottom line is that this behavior is misleading to the audience and TR is better than that.

      • gc9
      • 3 years ago

      (Article says same chip, not card.)

      • RAGEPRO
      • 3 years ago

      They’re the same chips, same RAM. Probably lower clocks, but they’d suffer the same in a cramped desktop. Not sure what you’re on about. Maybe you didn’t realize that they’re actually the same chips?

      They do use PCIe. How did you think they are connected? You should back off the self-righteous attitude a little.

        • ImSpartacus
        • 3 years ago

        We’ve had “the same chips, same ram, etc, etc” for YEARS.

        It’s not like Nvidia and amd make entirely new graphics cards for mobile.

        The only difference is that where we used to have the “chip” from something like a 7870 desktop card get renamed 7970m when it gets used in laptops, now the model numbers are generally the same.

        It’s a marketing change. Nothing more.

        And yet, tr confuses its audience by claiming that desktop graphics cards are fitting in laptops and sff machines.

        I had mentioned that this wasn't the first time. Go read the comments here: https://techreport.com/news/29867/msi-vortex-puts-two-gtx-980s-in-one-cylindrical-case. People honestly thought that desktop graphics cards were fitting in that case. They talked about power requirements and switching out the 980s for Nanos. They were genuinely confused.

        I can overlook it when a site like The Verge does that kind of thing. But I come to a site like Tech Report to learn. My goal is to humbly fill an empty space in my noggin with new information that I couldn't always get elsewhere. I imagine there are others like me, but stuff like this could cause false information to be learned instead. Tech Report built its reputation by ensuring that the things it taught its audience were correct. I hope it continues that in the future.

        EDIT: The Verge removed the mention of "desktop graphics" from their Alienware article's headline. You'll just have to trust me on that one. Or you could continue to argue with a random guy from the internet that doesn't always properly differentiate between PCIe the interface and PCIe the form factor...

          • RAGEPRO
          • 3 years ago

          You’re [i<]kinda[/i<] making an ass of yourself here, boss. Maybe tone it down a little. You should read [url=https://techreport.com/news/30522/nvidia-brings-its-gtx-1080-gtx-1070-and-gtx-1060-to-laptops<]this.[/url<] In short, yes, aside from thermals, mobile Pascal are literally the same chips. Mobile 1080 = desktop 1080. Mobile 1070 even has MORE units enabled. We are much more competent than you give us credit for. The Verge is a fine site, but they cover a much wider range of topics than us. We are more authoritative on the topic of specifically "PC hardware" than them. So we know what we're talking about, and what The Verge says or does is irrelevant. And we'd never try to intentionally mislead our readers. Give us some credit, bro.

            • sweatshopking
            • 3 years ago

            correction: the verge is a *terrible* site, and shouldn't be read by anyone. it's literally the worst tech site on the internet, and only exists because it's so incredibly biased and clickbaity that it gets page hits. Please don't make this mistake again.

            • ImSpartacus
            • 3 years ago

            I’m fully aware that Nvidia adjusted their mobile part naming scheme to match up with their desktop offerings. I literally commented on this marketing change in my original comment.

            It’s undeniable that the article’s wording is technically “true”. If that’s all that’s necessary, then TR is hitting it out of the park.

            But if that’s all you need, then you could just regurgitate press releases. No, that’s not good enough. TR (and similar blogs) exist because there’s a little bit of extra educational commentary that can be added to ensure that everything can be understood with all of the necessary context.

            If we just want to regurgitate facts, then TR could’ve used that “literally the same chips” comment for the past five years. Yes, we’ve had a mobile G*104-type part in every generation of Nvidia GPUs since that chip codename nomenclature was first used in Fermi. Thankfully TR didn’t do that because that would be misleading.

            The performance margin between desktops & laptops has steadily closed, but it’s still misleading today and TR can do better.

            Also, I, in no way, intended to compare TR to the Verge. I literally said that I could forgive them (and I can). But I like to think that I can hold TR to a higher standard. Or at least I hope. :/

            • RAGEPRO
            • 3 years ago

            I don’t really understand why you don’t get this. It isn’t the same as it has been in the past.

            The GTX 770 in a PCIe card is a fully-enabled GK104 GPU. It has 8 Kepler SMX. By contrast, the GTX 770M is actually closer to a GTX 660: it uses the GK106 GPU, with 5 SMX. The difference in performance between the two is much larger than the numbers would imply. To get a GK104 in mobile during the 700 series, you had to have a GTX 780M. However, the GTX 780 on desktop used (a partially enabled version of) the much larger GK110 GPU that debuted in the GTX Titan.

            Yes, it's confusing and frustrating. We didn't like this situation any more than you, and if you go back and read our articles, we went to some lengths to consistently point out the truth among the confusion.

            Another example: the GTX 980 in a PCIe card is a fully enabled GM204 GPU. The GTX 980M in laptops is also based on the GM204 GPU, but it has 4 SMMs disabled. It is NOT the same product. There is also a "desktop GTX 980" for laptops. Surprise surprise, it's a fully-enabled GM204 GPU, just like the desktop version. Given the same clock rates, they have the same performance. This was a big deal when it was new, and we covered it as such.

            So now, with mobile Pascal, the GTX 1060 in a PCIe card is a fully-enabled GP106 GPU. The GTX 1060 in laptop form is a fully-enabled GP106 GPU. Both cards have 6GB of GDDR5 memory on their 192-bit bus. This is NOT the same as previous generations.

            Obviously laptops will be thermally limited more than a custom-built desktop (at least, more than a decently-built one). However, they ARE the same graphics cards. At the same clock, they'll offer the same performance. It is not disingenuous, misleading, or in any way incorrect to say that they are the same chips.
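            The generational shift described above is easy to lay out side by side. A minimal sketch, using only the chip names and shader-module counts cited in this thread (SMX/SMM/SM, depending on the generation):

            ```python
            # Desktop vs. mobile GeForce configs across three generations,
            # using the chips and module counts cited in the comments above.
            CONFIGS = [
                # generation, desktop part, (chip, modules), mobile part, (chip, modules)
                ("Kepler",  "GTX 770",  ("GK104", 8),  "GTX 770M",          ("GK106", 5)),
                ("Maxwell", "GTX 980",  ("GM204", 16), "GTX 980M",          ("GM204", 12)),
                ("Pascal",  "GTX 1060", ("GP106", 10), "GTX 1060 (laptop)", ("GP106", 10)),
            ]

            for gen, d_name, (d_chip, d_units), m_name, (m_chip, m_units) in CONFIGS:
                same_chip = d_chip == m_chip
                same_config = same_chip and d_units == m_units
                print(f"{gen}: {d_name} ({d_chip}, {d_units} units) vs "
                      f"{m_name} ({m_chip}, {m_units} units) -> "
                      f"same chip: {same_chip}, same config: {same_config}")
            ```

            Only the Pascal row comes out "same chip, same config," which is the point being argued.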

            • ImSpartacus
            • 3 years ago

            Again, I have a fantastic handle on what’s going on here and, I’ll repeat, the entire content of this article could easily be argued as factual.

            I’m asking for TR to take that step past just sharing facts, to explaining facts in a manner most unlikely to confuse its audience.

            This is a confusing subject and, fittingly enough, you've confused a couple of things with your examples:

            > the GTX 980 in a PCIe card is a fully enabled GM204 GPU. The GTX 980M in laptops is also based on the GM204 GPU, but it has 4 SMMs disabled. It is not the same product. There is also a "desktop GTX 980" for laptops. Surprise surprise, it's a fully-enabled GM204 GPU, just like the desktop version. Given the same clock rates, they have the same performance. This was a big deal when it was new and we covered it as such.

            The laptop 980 does not perform like a typical desktop 980. I'm not digging on Nvidia, because I respect what they tried to do, and the laptop 980 was "version 1.0" of this effort. There are a couple of reasons for the performance difference:

            - The laptop 980 runs at clocks over 5% lower than a stock desktop 980 (1064 MHz vs 1126 MHz base clock, per Wikipedia's GPU lists; boost clocks are unknown for the laptop 980).
            - But even if it did reach stock clocks, stock clocks aren't enough. Most "typical" desktop 980s are overclocked out of the factory. Mr. Wasson famously said that "The reality is that what you buy at Newegg generally will not be a reference card with reference clocks" (https://techreport.com/discussion/28612/asus-strix-radeon-r9-fury-graphics-card-reviewed?post=921416#921416) in the context of why TR used an overclocked 980 in its reviews. That particular 980 is clocked 6-9% higher than stock (1228/1329 MHz vs 1126/1253 MHz).
            - But even if it did reach a "typical" factory overclock (whatever that is, lol), it still wouldn't be enough, because the actual distribution of clocks between base & boost is heavily affected by the power the GPU is allowed to draw. The 980's stock power limit is 180W (http://cryptomining-blog.com/tag/gtx-980-power-target/), despite its advertised TDP of 165W. The laptop 980's limit is unknown, but it's generally agreed to be roughly 130W. All the binning in the world isn't going to get a laptop 980 to boost as hard as a non-reference desktop 980 when its power limit is gimped by over 25%.

            > So now, with mobile Pascal, the GTX 1060 in a PCIe card is a fully-enabled GP106 GPU. The GTX 1060 in laptop form is a fully-enabled GP106 GPU. Both cards have 6GB of GDDR5 memory on their 192-bit bus. This is NOT the same as previous generations. Obviously laptops will be thermally limited more than a custom-built desktop (at least, more than a decently-built one.) However, they are the same graphics cards. At the same clock, they'll offer the same performance. It is not disingenuous, misleading, or in any way incorrect to say that they are the same chips.

            The laptop 1060 is in a similar situation:

            - Its clocks are 2-7% lower than stock (1404/1670 MHz vs 1506/1708 MHz).
            - TR's 1060 review unit was a Founders Edition, but in today's market, the cheapest 1060 (http://pcpartpicker.com/products/video-card/#c=373&sort=a8&page=1) is roughly 6% faster than stock (1594/1809 MHz, per http://www.newegg.com/Product/Product.aspx?Item=N82E16814137033, vs 1506/1708 MHz reference).
            - The laptop 1060's TDP is predicted to be roughly 75W, while the desktop 1060 clocks in at 120W.

            > The GTX 770 in a PCIe card is a fully-enabled GK104 GPU. It has 8 Kepler SMX. By contrast, the GTX 770M is actually closer to a GTX 660. It uses the GK106 GPU, with 5 SMX. The difference in performance between the two is much larger than the numbers would imply. To get a GK104 in mobile during the 700 series, you had to have a GTX 780M. However, the GTX 780 on desktop used (a partially enabled version of) the much larger GK110 GPU that debuted in the GTX Titan. Yes, it's confusing and frustrating. We didn't like this situation any more than you, and if you go back and read our articles we went to some lengths to consistently point out the truth among the confusion.

            That is just a marketing issue. The part existed. Does it matter that it was named 780M instead of 770M? I agree that it's confusing, but mobile lineups have always been confusing. I would hazard that Nvidia's new mobile lineup might be more confusing because of the overlapping naming, though I can respect their intentions.

            Overall, as I said before, TR isn't "wrong" with its current wording. But if TR's correctness lives & dies on the exact usage & interpretation of the generic word "chip," then I feel like it could easily be misleading to someone that doesn't really understand this stuff well. Past instances of this issue (particularly the "Vortex" article) show that these people definitely exist in TR's audience. It's so much easier to just stop claiming that desktop GPUs are going into laptops and SFFs.
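            The percentage claims above are easy to sanity-check. A quick sketch using only the clock and power figures quoted in this comment (as cited there, not independently verified):

            ```python
            def pct_vs(value, reference):
                """Percent difference of `value` relative to `reference`."""
                return (value - reference) / reference * 100

            # GTX 980, laptop vs. desktop base clock (MHz)
            print(f"laptop 980 base clock:   {pct_vs(1064, 1126):+.1f}%")  # ~-5.5%
            # Asus Strix 980 factory OC vs. reference (base and boost, MHz)
            print(f"Strix 980 base clock:    {pct_vs(1228, 1126):+.1f}%")  # ~+9.1%
            print(f"Strix 980 boost clock:   {pct_vs(1329, 1253):+.1f}%")  # ~+6.1%
            # Power limits: ~130 W laptop consensus vs. 180 W desktop limit
            print(f"laptop 980 power limit:  {pct_vs(130, 180):+.1f}%")    # ~-27.8%
            # GTX 1060, laptop vs. desktop (base and boost, MHz)
            print(f"laptop 1060 base clock:  {pct_vs(1404, 1506):+.1f}%")  # ~-6.8%
            print(f"laptop 1060 boost clock: {pct_vs(1670, 1708):+.1f}%")  # ~-2.2%
            ```

            The outputs match the "over 5% lower," "6-9% higher," "over 25%," and "2-7% lower" figures as stated.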

            • RAGEPRO
            • 3 years ago

            Your goalpost-shifting aside—I said in my first post, quite literally, “probably lower clocks”—ultimately it’s the same GPU. The difference is in the implementation; the bits surrounding the GPU. I’m not sure what to tell you. I’m not even sure what you would want us to say differently.

            We clearly state that these are going into laptops, and the performance of mobile GPUs now is closer than it has ever been to desktop. It really seems like you’re grasping at straws with which to pick nits.

            • ImSpartacus
            • 3 years ago

            I’m not shifting the goalposts, I’m just trying to be as cordial as possible after I was referred to as an “ass”.

            But if you’re interested in how I’d handle this story, I’m happy to share the perspective of a humble layman.
            1. No clickbait in the title.
               - The words "desktop" and "Pascal" don't add anything, while "Polaris" is unnecessarily broad in this case. All three are buzzwords in this context.
                 - These are mobile GPUs, full stop. No need to use "desktop" at all. I understand that your inbox is probably constantly full of press releases that are pretty convincing, but you have to be above that, or else you might as well just post those press releases.
                 - Pascal is incredible, and Nvidia has brought its mobile lineup closer to its desktop lineup than ever before, but TR has already covered mobile Pascal, and it's obvious that a big thick Alienware laptop will have the latest mobile GPUs. The inclusion of Pascal is a footnote in the body of the article, not the headline.
                 - Normally, AMD would've also debuted a mobile lineup, and "Polaris" would be a poor word choice for the same reasons stated for "Pascal". But in this case, it's too broad. We're just talking about the 470, and it *is* a moderately big deal because this is a new debut. Just use "RX 470" rather than the nebulous "Polaris" and you're golden.
            2. Dig beyond the press release.
               - I'm sure there was a convincing narrative from Alienware, but you gotta differentiate yourself. There's a story to tell here. I think it was good to start with the graphics; the other components are stale right now. Also a good idea to dig into the 470, since that's a new model. The problem is that there wasn't much actual info. I would've reached out to your AMD (and/or Alienware) contact to explicitly confirm a ton of stuff:
                 - Obviously you start with the typical spec table, and you plop the results right next to a "desktop 470" column (maybe it's a carbon copy, maybe not; useful no matter what): base clock, boost clock, memory clock, enabled CUs, ROPs, memory bus width, TDP, etc.
                 - Check whether OEMs are allowed to change clocks & power limits, and whether more than one chip config might be found on the streets. If the Alienware 15/17 config is a legitimate "copy" of the desktop 470, does that mean every laptop 470 will be? Right now, I don't know.
                 - Validate the implementation of the graphics. Are we talking MXM, or is this directly on the board? If it's MXM, what's the power limit on the laptop's implementation of MXM? If it's directly on the board, things get a little more boring, but it's still useful to mention.
            3. Don't be afraid to speculate.
               - As long as TR (or any blog) is exhaustively and painstakingly making it braindead clear that it is speculating, there's absolutely no harm in dropping some nuggets towards the end of an article. There are a ton of questions that come to my mind from just this tiny speck of news alone:
                 - Was the 470 designed for mobile from the start? I know a lot of smart people thought the 470 would only have 28 CUs so it could better bridge the gap between Polaris 10 & 11, but it's only barely cut down. It performs strangely similar to the 480, yet its official TDP is a hefty 20% lower. Also, the memory runs at 6.6 Gbps. Do any of the fabs even sell chips that run at that speed? The chips are probably rated at an even 7 Gbps. Sometimes other things can limit the clocks, but we know the controller & general memory subsystem can handle 8.0 Gbps, so 7.0 Gbps shouldn't be a challenge. A lot of mobile parts have slightly underclocked memory for the power savings, though. That doesn't make sense in a desktop card with a reference cooler built to dissipate 150W, but it would make things pretty seamless if you were intending to release this card as a mobile part from the very beginning.
                 - Why did AMD stop at the laptop 470? Polaris 11's desktop implementation squeaks under that important 75W limit, but a lot of folks assume that cards like the laptop 1060 also need 75W. So if Alienware is using the 75-ish-W laptop 1060, why not also do a carbon-copy transplant of the 460 to go head-to-head with it? Did AMD think the 460 couldn't hold its own against a power-starved GP106 (it probably can't)? They cut it down; we know that fully enabled chips at slightly lower clocks are generally more efficient (laptop 1070, anyone?), and they are using full-blown 7 Gbps memory, no underclocking there. I'd hazard a guess that the 470 was intended for mobile from the very beginning, but the 460 was not. AMD is probably stashing all of the efficient/complete Polaris 11 chips for the likes of Apple. Maybe that's right, maybe not.
                 - The 470 isn't officially VR-ready, and AMD probably wants a VR-ready option for laptops. If the 480 could've been used, why wasn't it? Too much power for that last bit of performance beyond what the 470 already had? Maybe, but that means AMD might not have a VR option for laptops. That's not good for AMD. Meanwhile, the <100W laptop 1060 probably doesn't perform like a desktop 1060, but it probably still meets the VR min spec, and the laptop 1070 & 1080 obviously qualify. So Nvidia has several VR options for laptops. That's really not good for AMD. Will Vega fix this? Assuming Little Vega matches GP104 and Big Vega matches GP102, that's too much GPU for a laptop in a market that doesn't even want a laptop 480. Navi is a long way away. This is getting spectacularly not good for AMD in a VR market that is primed for tremendous growth.
                 - If the Alienware 17 gets a laptop 1080 but the Alienware 15 doesn't, and both the 15 & 17 get a laptop 470, that means the laptop 1080 needs more than 120W to parch its thirst (if that turned out to be the laptop 470's TDP after confirmation). That might be obvious to some, but it's a great data point to mention, since Nvidia does not disclose the TDPs of its mobile parts. TR drops clever little data points like that and it'll have random theorycrafting schmucks sourcing its articles for that shit months down the road.
                 - Likewise, the Alienware 13 can only support a laptop 1060. That weakly suggests that the laptop 1070 and laptop 470 have similar power requirements. It's not remotely strong, but it's a reasonable speculation.

            All of the other specs (short of maybe the OLED panel) are pretty boring, but the laptop 470 debut provides a ton of angles for this article. TR doesn't need to regurgitate a press release about the magic of putting "desktop graphics" into a laptop. TR is better than that.

            • RAGEPRO
            • 3 years ago

            So you have a lot of problems in this post and frankly I’m sick and don’t have the energy to go through all of them.

            Overall you have some misconceptions about exactly how this site and tech blogging in general operate, both as a business and as writers. Just as one example, we generally don’t hear back from PR contacts on requests for info, and if we do, it’s usually a day or two later. While that’s fine or even great for reviews and longer articles, that isn’t very helpful for news.

            I understand you’re annoyed that other people do consider these to be “desktop” GPUs, but the fact of the matter is that they are NOT the “mobile” GPUs of generations past. It’s newsworthy and it’s the way we’re going to keep reporting it, just like everyone else.

            You spend a lot of time making accusations that we “regurgitate” press releases but that simply isn’t true. Go check sites that DO do that (Guru3D, TechPowerUp) and you’ll see that what we do is quite far from that. You may feel that the extra information we add beyond the press release isn’t valuable or useful, and that’s fine. I hope you’ll recognize though that we aren’t simply re-wording the press releases handed to us. A lot of thought and care goes into even simple posts like these new product announcements. By the way, we post less than ten percent of the releases that get sent to us.

            You also make a lot of assumptions about the information we have available to us. No, Bruno didn’t copy the spec table here from the press release; he had to generate it himself. Generally press releases don’t contain most of the information we post in our news pieces. We have to patch it together from product pages, previous releases, and simple inference. Note that I said ‘inference’ and not ‘speculation’, because speculation is dangerous. If you really want speculation, go read SemiAccurate or WCCF. TR is here to report on the facts, not to make wild-ass guesses about hardware configurations.

            I know our job seems easy and like we don’t do enough/very much, because I used to think the same thing. Now that I’ve been writing for TR for a while, this job is a lot harder than I ever imagined. I’m always open to fair criticism, but I’m sorry friend; you just don’t know what you’re talking about.

            • Andrew Lauritzen
            • 3 years ago

            Thermals are super-important though… in fact they are quickly becoming the primary determiner of overall performance.

            If calling it the same thing because it’s the “same chip” is okay here, it’s also okay to call a 3-5W Core M the same as a 25W Core i7 U… which of course they are blurring now by calling them both i7 (to much community ire :). But even there they aren’t literally calling them the exact same name.

            I can see both sides here, but if the silicon only is what determines the name now with no thought to TDPs, then the name is borderline meaningless in terms of what matters.

            • RAGEPRO
            • 3 years ago

            > Thermals are super-important though... in fact they are quickly becoming the primary determiner of overall performance.

            Oh, absolutely. Jeff and I have been talking about how the rated boost clocks of Pascal parts are almost irrelevant, given that most of the non-reference cards will boost to near or over 2GHz anyway. GPU Boost 3.0 has really changed the game. I understand the new Turbo logic on the Broadwell-E chips is similar.

            To be clear, I'm not saying that thermals (and by extension, clocks) aren't important. Obviously. And I'm not trying to say that they will have exactly the same performance. That's obviously false; anyone who knows the first thing about hardware would know that, and we don't write for The Verge's audience. All I'm saying is that his accusation that it's disingenuous to say "the same GPUs are going into laptops as in the desktops" is unfair and unfounded. As I told ImSpartacus, the difference is in the periphery, not in the GPUs. That is to say, if I tell you "my laptop has a GTX 1080," you should know immediately that the performance will be similar-to-but-worse-than your desktop with a GTX 1080. Perhaps most critically, all of the same features are supported, which is not always the case even within the same series (where the 700, 800, and 900 series of mobile GeForces were sometimes based on Maxwell, Kepler, or even Fermi, all with their own varying video decode blocks and DX feature levels).

            WRT the Core M vs. Core i7-U idea, I mean, I wouldn't want to use either. 😛 But seriously speaking, that's a MUCH bigger difference than desktop-Pascal vs. mobile-Pascal. For one thing, they don't necessarily have the same memory configurations; the Kaby i7-Y doesn't even support DDR4. More to the point, you're talking about a greater-than-500% difference in TDP. So it's pretty different. Ultimately, what it comes down to (why calling the mobile GeForces by the same name is fine but doing so for the Intel chips is not) is just that the difference is so much bigger.

            Beyond that, I would say what Intel is doing is a lot more disingenuous than what Nvidia is doing now; it's more comparable to the problematic situation concerning the mobile GeForce 700 series I described in my earlier post. Frankly speaking, it really wouldn't surprise me if Intel DID call them the same thing; both chips have configurable TDP both up and down, after all. (Has anyone *ever* sold a -U chip configured with TDP-up, though?) In any event, neither the Core i7-7Y75 nor the 7500U deserves the "Core i7" appellation. That's my opinion though, and a whole other discussion. 🙂

            • Andrew Lauritzen
            • 3 years ago

            > That is to say, if I tell you “my laptop has a GTX 1080”, you should know immediately that the performance will be similar-to-but-worse-than your desktop with a GTX 1080 … Ultimately what it comes down to (why calling the mobile GeForces by the same name is fine but the Intel chips is not) is just that the difference is so much bigger.

            That’s the point though – what you’re saying is fundamentally subjective in two aspects. First, how much is ‘too much of a difference’ for it to be called a GTX 1080 in both cases? What if the mobile 1080 is slower than a desktop 1070 (plausible for this to happen depending on cooling)? That doesn’t seem “okay” to me to name them exactly the same thing.

            And on the other front – as I said I won’t defend the “call everything an i5/i7” bit, but at least they are different model names. That puts it in a strictly different discussion IMO so I don’t necessarily buy the “NVIDIA’s is okay, Intel’s is not” conclusion.

            > (Has anyone ever sold a -U chip configured with TDP-up, though?)

            Yes, the Surface Pro 4 i7 can vary its TDP up to 25W depending on chassis thermals. There may be other examples as well. There have been a few up-config’d Core M’s as well as I recall (7-8W is doable fanless in laptop form factors).

            Anyways my point is not to split hairs here, but the question ultimately comes down to what should the model name represent. I don’t think most consumers care about the internal architecture of the GPU and how many SMs it has and if some are disabled, so I’d argue that the name still primarily has to tie to performance, and secondarily to feature set. NVIDIA certainly picks this logic when it suits them too (see GTX 860M situation where they sold two chips with different architectures under the same name with the argument that performance was close-ish).

            I think you’re being a bit naive if you think NVIDIA is doing this for any other reason than why Intel has an “i7” in each form factor as well. It’s clearly “legitimate” in that they are legally allowed to do it, but I think it’s a fair criticism to say that it is misleading to potential buyers who might consider a laptop vs. desktop setup. I don’t think that’s a huge audience in any case so I can’t get too bothered by it, but I get the complaint.

            • RAGEPRO
            • 3 years ago

            That’s interesting. I wouldn’t really have expected you (or anyone) to say that the nomenclature should be based on final performance rather than on the specifics of the hardware.

            The way I see it, if you have chip A and it has performance X, with name Alpha, then chip A with performance Y due to factor F should still be named Alpha. Because you have factor F in play. You remove F, it’ll have performance X, at least in theory. And either way, it’s still chip A.

            To put it another way (perhaps with more clarity), I would rather the name represent specifically what I’m purchasing in terms of physical goods rather than some ephemeral performance value. I want to know my potential performance rather than the performance as configured. I suppose that’s the mindset of a tweaker, but I think it makes sense for any educated consumer.

            Still, I think it makes more sense for chip names to be based on chip configuration than on their potential performance, especially because chip configuration *usually* includes varying feature support.

            I guess what I'm trying to say is that, in my mind, describing the chip with specificity is more important than trying to describe its performance value, especially since, as you said yourself, these days performance can vary with a lot of factors anyway, like thermals.

            I suppose we'll have to agree to disagree in the end. I think what NVIDIA is doing here is a lot more honest than calling a -Y chip a "Core i7", and I think it's certainly a lot more honest than what NV used to do with mobile parts (e.g. 970M).

            • Andrew Lauritzen
            • 3 years ago

            > I wouldn’t really have expected you (or anyone) to say that the nomenclature should be based on final performance rather than on the specifics of the hardware.

            When there are feature differences that are relevant to consumers, I definitely think they should be represented in the product numbers somewhere (leading digit as architecture or whatever). Things like the 860M situation are top of my list of stupidity no doubt.

            But the second bit is not as cut and dry as you seem to think it is. “Potential performance” is a nebulous, ill-defined notion that doesn’t even comprehend the realities of SKUs, fusing parts off, different chip/transistor layouts, different power delivery, etc. There are a myriad of variables that get varied between SKUs beyond simple stuff like how many cores or execution units it has in theory, or enabled.

            > I think what NVIDIA is doing here is a lot more honest than calling a -Y chip a “Core i7”, and I think it’s certainly a lot more honest than what NV used to do with mobile parts (e.g. 970M).

            You seem to be arguing against your own point here… Core M and U are effectively the same chips (at the level you are comparing them, at least), so by your logic they both *should* be called Core i7. In fact, your logic would say they should even have the same model number, which is something I strongly disagree with!

            To relate the same thing to NVIDIA, I have no problem with them calling them both “Geforce”, but calling them the exact same model name when they could perform fairly differently in practice is misleading. It doesn’t matter if they have a similar number of execution units or anything else – clocks matter a lot.

            To pick up another example… should Devil's Canyon have had the same model numbers as the previous Haswell chips despite being 500+ MHz faster? Pretty hard to argue that it should, based simply on the fact that they were similar silicon. In fact, if that's the argument, should basically all quad-core Intel client CPUs have the exact same name, with no way whatsoever for consumers to determine the performance level of each?

            I don’t think that’s reasonable. Broad architecture is definitely important to represent in a model number and we agree on that. But beyond that it has to be primarily motivated by some sort of realized performance or else you’re basically just making stuff up to fit whatever marketing agenda you have.

    • HERETIC
    • 3 years ago

    “While we’re curious to see what effect the thermal constraints of a laptop form factor have on the 120W RX 470’s performance, this is a big design win for AMD.”

    Pretty sure AMD will bin its chips (so it can charge more, just like NVGreedy),
    and we'll probably end up with a sub-100W GPU that can run at the same clocks
    and performance as the desktop part…

      • psuedonymous
      • 3 years ago

      Have AMD managed to actually achieve that with any of their previous mobile GPU releases?

        • HERETIC
        • 3 years ago

        With AMD and NV’s convoluted naming scheme for Mobile GPU’s in the past plus
        varying clocks,it’s almost been impossible to make a direct comparison to desktop.

        Have to laugh when looking at lower end lappy’s with 2GB VIDEO RAM in huge print
        then need a magnifying glass to find out what the GPU actually is.

        One has to believe some chips will be better than others and careful binning
        will give us lower power GPU’s.

    • BoilerGamer
    • 3 years ago

    Instead of a useless eye-tracker, how about a Windows Hello camera that is actually useful?

      • sweatshopking
      • 3 years ago

      yes, please.

      • Tirk
      • 3 years ago

      The second link in the article mentions that the camera is also compatible with Windows Hello. It just wasn’t mentioned in the TR article.

      [url<]http://en.community.dell.com/dell-blogs/direct2dell/b/direct2dell/archive/2016/09/01/alienware-blasts-into-pax-with-first-ever-vr-ready-notebooks[/url<]

        • psuedonymous
        • 3 years ago

        Yup. They’re using the Tobii EyeX module, which is one of the few consumer devices that you can currently use with Windows Hello (others indluce the Kinect 2, and the Realsense devkit cameras). My Steelseries Sentry (another Tobii EyeX module) works just fine with Windows Hello.

    • vargis14
    • 3 years ago

    Sooooooo ugly and pink…come on!!!!

    I would take the 2560 panel over 4K… besides that, the GPU/CPU config is nice for a desktop replacement.

    EDIT: thank the lord I do not need a replacement yet:)

      • Spunjji
      • 3 years ago

      Pink = RGB LEDs.

      That said, I'm really not a fan of their design language either.

    • Chrispy_
    • 3 years ago

    Where are the 45W mobile GPUs on 14nm?

    I don’t want more performance, I want quieter cooling solutions and longer gaming on batteries!

      • morphine
      • 3 years ago

      You shouldn’t be barking up Alienware’s tree, then? 🙂

        • Chrispy_
        • 3 years ago

        I dunno, Alienware have made some of the better-cooled, compact gaming laptops that don’t try and cram high-TDP chips where they don’t belong.

        The old Alienware 13 had a 65W GTX960 in it. Nothing particularly remarkable, but they didn’t skimp on cooling so it ran nice and quiet.

      • tipoo
      • 3 years ago

      Yep, that’ll be what’s in the next 15″ rMBP which I’d be interested in. Nvidia is likely still out due to OpenCL politics.

      • ImSpartacus
      • 3 years ago

      Can’t a beefy IGP accommodate that need?

        • Chrispy_
        • 3 years ago

        Wake me up when an IGP surpasses a dGPU!

        2012: GTX650m vs Intel HD4000 (about 5x faster)
        2014: GTX860m vs Intel HD5000 (about 4x faster, and that’s an expensive-as-hell IGP)
        2016: GTX950m vs Intel HD530 (about 6x faster)

        It’s the difference between 1080p60 on medium/high and 20-25 fps on lowest possible settings at a horrible non-native 720p

        If you want a blocky, blurry, slideshow approximating the game but not actually playable, then yes – IGP is fine.
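        As a rough illustration of what a 5x dGPU-to-IGP gap means in practice, here's a toy calculation. It assumes frame rate scales linearly with the quoted ratio and with pixel count, which is optimistic at best:

        ```python
        # Toy estimate of IGP frame rates from a quoted dGPU-vs-IGP ratio.
        # Linear scaling with the ratio and with pixel count is assumed;
        # real scaling is messier, so treat these as ballpark numbers only.
        dgpu_fps = 60   # e.g. 1080p60 on medium/high settings
        ratio = 5       # dGPU roughly 5x faster (the 2012 example above)

        igp_1080p = dgpu_fps / ratio
        print(f"IGP at 1080p, same settings: ~{igp_1080p:.0f} fps")  # ~12 fps

        # Dropping to 720p renders ~44% of the pixels, buying back ~2.25x.
        pixel_scale = (1920 * 1080) / (1280 * 720)
        print(f"IGP at 720p, same settings: ~{igp_1080p * pixel_scale:.0f} fps")  # ~27 fps
        ```

        The ~27 fps figure lines up roughly with the 20-25 fps at 720p quoted above, once lowered settings are factored in.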

          • tipoo
          • 3 years ago

          By replacing a 45W dGPU, I think Spartacus meant more Iris Pro 580. The Haswell Iris Pro 5200 traded blows with the 650M and wasn’t terribly far off from a 750M when it was new, and with 72EUs instead of 40 the 580 should be quite capable.

            • Chrispy_
            • 3 years ago

            That would be relevant if the Iris Pro models ever made it into laptop design wins.

            In theory there is a Skylake 6770HQ that was just like the 6700HQ I have in my laptop now, only it has Iris Pro 580 graphics instead of the Intel HD 530.

            Find me a laptop on sale with that processor, I dare you!
            I even Googled for 6870HQ and 6970HQ.
            Iris Pro in a laptop is a myth that exists for reviewers and perhaps Apple.

            • ImSpartacus
            • 3 years ago

            Yeah, it’s a shame that only Apple is capable enough to put those kinds of parts onto their machines. Everyone else can’t market themselves out of a paper bag unless they have a fancy high cpu clock speed on their spec sheet and the name “Nvidia” or “amd” (mostly the former lately…).

            • sweatshopking
            • 3 years ago

            why would you want iris for the cost vs NVidia?

            • Chrispy_
            • 3 years ago

            Actually, Iris Pro 580 costs zero extra in the 6700HQ vs the 6770HQ, [url=https://en.wikipedia.org/wiki/List_of_Intel_Core_i7_microprocessors#.22Skylake-H.22_.28MCP.2C_quad-core.2C_14_nm.29<]according to this[/url<]. As mentioned though, it's irrelevant because those chips are unicorns that never end up in laptops available to the public.

            • DrDominodog51
            • 3 years ago

            [url=http://ark.intel.com/compare/88967,93341<]Ark.intel says[/url<] otherwise.

            • Chrispy_
            • 3 years ago

            Interesting. I wonder if that $56 is cheaper than the Nvidia alternative. It’s gotta be cheaper to implement and cool a single chip, that’s for sure.

            I’m basically curious why there are so few laptops with Iris Pro when it doesn’t look like there’s any real reason for it.

            • Andrew Lauritzen
            • 3 years ago

            Lots of reasons, but your first post hit on one, and it has nothing to do with tech: consumers think "dGPU good, iGPU bad" regardless of the performance of either option. That's why China is still full of dGPUs that perform worse than the iGPUs already in the machines…

            But in all seriousness you’re not going to get what you want from a dGPU either here… no OEM is interested in the volume or battery use of “gaming” laptops at all. The assumption is you only game plugged in and you wear headphones. I realize that’s pretty narrow, but gaming laptops are a fairly narrow market to begin with so I can kind of empathize with the OEMs.

            • tipoo
            • 3 years ago

            Sounds similar to China also driving the trend to 8 weak cores in SoCs…I wonder if AMD FX has a small bump compared to the rest of the world in China because of the 8 = good luck thing.

            • ImSpartacus
            • 3 years ago

            I think most users, especially those frequenting this kind of site, would be better served with a dGPU (especially for the value, as mentioned). And I bet we’ll see some pretty potent stuff from Pascal.

            However, if you’re looking for something to be compact, then reducing the “hot” chip count down to just the CPU would make everything a lot easier. I know this is a critical thing for someone like Apple and several other high-end lines of laptops (where value is less of a concern).

          • ImSpartacus
          • 3 years ago

          I misread the “45W” part of your comment for “45nm”, as in, all of the garbage super low end stuff that keeps getting rebranded. That’s surely a different market segment from the 45W stuff that you mentioned.

          I don’t follow mobile graphics much, but I imagine Polaris 11 will be hitting mobile if it hasn’t already. That might tickle your fancy until we get gp107 to the rescue.

      • HERETIC
      • 3 years ago

      NOT 45 watts, BUT:

      [url<]http://videocardz.com/63605/nvidia-geforce-gtx-1050-specifications-leaked[/url<]

      I was expecting the 105 to be 1/2 a 106, with broken 106's (like the 3GB) filling the gap in between. Seems NV has different ideas and it's 2/3 of a 106. Clocks seem a bit low, but perhaps that's to keep it sub-75 watts. If they can bin the best of these to sub-60 watts, it might make a really nice lappy GPU…

        • Chrispy_
        • 3 years ago

        handful of salt etc, but good find.

        Those clocks and specs make me think GP107 is going to be a 65W drop-in replacement for the GM107 with performance in the ballpark of desktop GTX960 or somewhere thereabouts.

        That’s EXACTLY what I was hoping for, and to be honest with you, the “current” GTX960M is a glorified 750Ti. Getting a performance upgrade from 750Ti to GTX960 in the same 65W envelope (could even be as low as 45W given those conservative clocks) is perfect for portable gaming machines that don’t need hair-dryer cooling systems.

    • Mat3
    • 3 years ago

    > ...fully-enabled Radeon RX 470...

    Technically, isn't that a 480? 😛

    • Neutronbeam
    • 3 years ago

    Interesting. The front vents on the 17 are something I don’t recall seeing on a laptop before.

      • RAGEPRO
      • 3 years ago

      [url<]https://techreport.com/news/30019/acer-predator-17-x-notebook-and-g1-desktop-join-the-hunt[/url<]

      • Chrispy_
      • 3 years ago

      Loadsa laptops have vents there to provide airflow to the magnetic air pressure resonators.

      • Spunjji
      • 3 years ago

      I’m pretty sure (based on previous Alienware ownership) that those are speakers.

    • tipoo
    • 3 years ago

    Even the 1060 in laptop form was over 100 watts, right? Wonder what a battery test of the 68Wh one with a QHD screen and 1070 would look like, just for the lulz even if they’re not meant for that.

    edit: Nope, 75 watts
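    For the curious, the napkin math on that battery test is grim. A sketch; the ~75 W laptop-1060 figure is from the PCPer piece linked in the reply below, while the laptop-1070 TDP and the rest-of-system draw are guesses for illustration only:

    ```python
    # Napkin math: gaming runtime on the Alienware 15's stock 68 Wh battery.
    # The laptop-1070 TDP (~110 W) and ~50 W rest-of-system draw (CPU,
    # display, etc.) are guesses; laptops also throttle the GPU on battery,
    # so even these estimates are optimistic.
    battery_wh = 68

    for gpu, gpu_watts in [("laptop 1060", 75), ("laptop 1070 (guess)", 110)]:
        total_watts = gpu_watts + 50  # assumed CPU + display + misc
        minutes = battery_wh / total_watts * 60
        print(f"{gpu}: ~{minutes:.0f} minutes of gaming on battery")
    ```

    That works out to roughly half an hour per charge, give or take.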

      • chµck
      • 3 years ago

      Should be comparable to a maxwell GTX 970M

      • ImSpartacus
      • 3 years ago

      Where did you find that tdp? Aren’t mobile tdps notoriously difficult to ascertain?

        • tipoo
        • 3 years ago

        [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Pascal-Mobile-GTX-1080-1070-and-1060-Enter-Gaming-Notebooks[/url<]

        Its performance will probably depend on the laptop's cooling though, yeah.

          • ImSpartacus
          • 3 years ago

          Ok, I feel ya. Just wanted to make sure there wasn’t new official info out.
