Alienware’s Graphics Amplifier brings desktop GPUs to notebooks

For years, PC hardware makers have demoed external modules that pair full-sized desktop graphics cards with notebooks. The idea has always been appealing, but no one has turned the concept into a finished product—until now. Dell’s new Alienware 13 gaming notebook comes with a Graphics Amplifier sidekick that can accept high-end desktop cards like the GeForce GTX 980.

The Graphics Amplifier has its own PSU, and according to AnandTech, it can accept double-wide cards with TDP ratings up to 375W. GeForces and Radeons are both supported, though the pre-installed options are limited to Nvidia-based offerings right now. The external module is also available sans GPU, so users can add their own card.

In addition to housing a graphics card, the Graphics Amplifier has a four-port USB 3.0 hub. USB and PCI Express signals are combined on a single cable that connects to the Alienware 13. That connection is proprietary, and there’s no word on how much bandwidth is available. However, it’s worth noting that the link can pass display output back to the laptop, removing the need for an external monitor. I believe this is the first external graphics implementation to include such a capability. A reboot is required to activate the external GPU, though.

Even without the amplifier in tow, the Alienware 13 is a pretty formidable system. It pairs a Core i5-4210U processor with a GeForce GTX 860M graphics chip. The base display uses a 1366×768 TN panel, but there are options for IPS units with 1080p and 2560×1440 resolutions. Up to 16GB of RAM and 512GB of M.2-based storage can be added to the machine, which also has a Killer NIC, 802.11ac Wi-Fi, Bluetooth 4.0, Klipsch speakers, and a 2MP webcam. All of that is wrapped in a 1″ chassis that weighs 4.5 lbs.

Source: Dell

The Alienware 13 starts at $999, but you’ll need to shell out $1149 to get the 1080p display and $1299 for the QHD one. The Graphics Amplifier adds another $270 to the total, and that’s without a graphics card installed. You didn’t think the first external graphics unit since 2008’s ill-fated AMD External Graphics Platform would be cheap, did you?
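
If you’re keeping score, here’s the back-of-the-envelope math for a fully loaded setup. This is only a sketch; the GPU line item is an assumption (we’ve plugged in the GeForce GTX 980’s $549 launch price), since the Amplifier ships empty.

```python
# Rough all-in cost for a maxed-out Alienware 13 setup.
# The GPU price is an assumption: a GTX 980 at its $549 launch MSRP.
laptop_qhd = 1299   # Alienware 13 with the 2560x1440 panel
amplifier  = 270    # Graphics Amplifier, sold without a card
gpu        = 549    # hypothetical GeForce GTX 980

print(f"All-in: ${laptop_qhd + amplifier + gpu}")  # All-in: $2118
```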

As much as I dislike the proprietary connection and high asking price, I’ve gotta give it to Alienware for creating the Graphics Amplifier. Hopefully, this development will encourage other notebook makers to bring similar technology to market, ideally ushering in standardized connectors and lower prices.

Comments closed
    • Bensam123
    • 5 years ago

    Still waiting for a laptop daughter system you can simply plug into to ‘upgrade’ your system. Honestly, I still think this is the best answer to a desktop/laptop combo. Being able to drop your laptop into a docking station that would also power up the RAM and graphics and add expansion ports would be amazing.

    Making something like this work seamlessly, where you could plug it all in on the go, would be hard to do though. Something someone definitely could slap their name on.

    • HisDivineOrder
    • 5 years ago

    Too bad about that proprietary connection. Alienware was so close to making something the market would have gobbled up.

    Then they screwed it up, Alienware-style.

      • darc
      • 5 years ago

      That, and the fact that it’s paired with a laptop that already has a moderately powerful GPU, with the price, footprint, and thermals to go along with it. To make this attractive, you pair it with an ultrabook form factor at a reasonable price, so that you can have the best in mobile productivity on the go and a capable gaming system when you get home… as opposed to asking the consumer to shell out hundreds on redundant hardware.

      Still a good step. Someone will pick up the idea and get this right.

    • Antias
    • 5 years ago

    I’m curious…
    You’ve alluded to your distaste for a proprietary connection between the AW13 and the GA unit, but in people’s opinions, what would be the best non-proprietary connection to use instead?
    Just curious…

    P.S. The bezel width between those two monitors in the image is VERY small… yet large on the outside edges?

      • sweatshopking
      • 5 years ago

      People want Thunderbolt. Idc. I DON’T PRETEND TB IS STILL RELEVANT.

        • Chrispy_
        • 5 years ago

        What’s Thunderbolt? Was it that thing from 2011 that nobody wanted and nobody bought?

        I know Apple customers are afflicted with it but they have infinite money and limited choice already, so they don’t care.

        • Antias
        • 5 years ago

        Thanks SSK.
        Curious though, if TB is no longer relevant, then what IS relevant and ubiquitous?

          • sweatshopking
          • 5 years ago

          USB 3.0 and DisplayPort. DisplayPort is probably the future connection.

      • Chrispy_
      • 5 years ago

      Those bezels are legit. What you’re seeing on the left side is the depth of the screen, not the bezel.

      Those models look a lot like the U2414s I’m using right now, and the bezel on three sides is 1mm or so.

        • Antias
        • 5 years ago

        oooh – thanks for that – was wondering…

    • sweatshopking
    • 5 years ago

    Apparently the battery life on this machine is sub-3 hours…

    • Chrispy_
    • 5 years ago

    Wait a second – Isn’t the ONLY REASON you bought the expensive Alienware gaming laptop in the first place so that you could get a decent mobile GPU?

    Unless this proprietary connector also does SLI, you’re effectively paying extra to vendor-lock yourself into a solution that *disables* the primary reason you chose this vendor in the first place? /OMGFACEPALMSOHARDITLEAVESAREDMARK

      • slowriot
      • 5 years ago

      Weird false dichotomy you’ve created there.

      The expectation isn’t for you to carry the GPU Amplifier over to a friend’s house. It’s to sit on your home desk so you can enjoy better performance at home. You still have the luxury of a decent mobile GPU when you take the laptop over to a friend’s house or when traveling. I assume this product is aimed primarily at people who want only a single computer.

      I don’t like the execution on the Alienware 13 itself. It’s honestly quite large and heavy given the components. This tech combined with a laptop like the Razer Blade or MSI’s Ghost series would be pretty sweet, I think.

        • Chrispy_
        • 5 years ago

        But a GPU Amplifier and a GPU cost almost as much as a complete PC.

          • Pwnstar
          • 5 years ago

          No, it’s $300 cheaper.

    • f0d
    • 5 years ago

    if this sort of thing becomes mainstream i will finally have a reason to replace my 6 year old Core 2 Duo laptop (which still does everything i need)

    it would be good to have another gaming-capable machine in the house, most games don’t need crazy CPUs anyway so even a dual-core Pentium with one of these would be very handy to have

    im not gonna get an Alienware just for it but if something similar becomes common on cheapish laptops then i will be one of the first to get one 🙂

    • Rza79
    • 5 years ago

    Didn’t Fujitsu already do something similar like 6 years ago?

    http://www.engadget.com/2008/12/03/fujitsu-siemens-rolls-out-amilo-sa-3650-laptop-graphicbooster-a/

      • christos_thski
      • 5 years ago

      Not quite. Fujitsu’s thingy had an integrated GPU (a mobile-class one, at that) and you were stuck with that. Alienware’s is a different class of product altogether, as it allows you to use any desktop graphics card you choose, and upgrade as you go along.

    • cynan
    • 5 years ago

    “However, it’s worth noting that the link can pass display output back to the laptop, removing the need for an external monitor. I believe this is the first external graphics implementation to include such a capability.”

    The Sony Vaio Z from 3 years ago (http://www.anandtech.com/show/4474/sony-updates-vaio-z-thinner-lighter-light-peak-and-external-gpu) had an optional GPU dock that drove the laptop’s display (as well as external ones). Yes, it did not accept modular GPUs and only had a fixed mid-level (at the time) mobile GPU - which was a huge caveat and, I suppose, does not make it all that comparable. And it used Thunderbolt (or an immediate precursor thereof).

    • kamikaziechameleon
    • 5 years ago

    If Alienware wasn’t so ugly…

    I’m excited for a third-party offering of something like this that would house an HDD and a GPU plus other connection options. That is a very enticing pseudo-desktop supplemental option for mobile professionals who still need a workstation but don’t want to shell out for a full tower solution.

    • UnfriendlyFire
    • 5 years ago

    And meanwhile the Thunderbolt-to-PCIe adapter boxes will still cost you $300 to $500…

    • internetsandman
    • 5 years ago

    Of course it’s proprietary. Wouldn’t want anyone to be able to buy this revolutionary product, nope, you gotta force people to buy the laptop if they wanna use it at all. Thank you, Alienware, for providing value that almost nobody else can stoop to.

    • 1040am
    • 5 years ago

    There is this interesting alternative.
    But probably say goodbye to your wireless card…

    http://www.banggood.com/EXP-GDC-Laptop-External-PCI-E-Graphics-Card-p-934367.html

      • l33t-g4m3r
      • 5 years ago

      Not really. You can just get a wireless USB adapter.

        • Airmantharp
        • 5 years ago

        Which they could just as easily wire internally 🙂

    • Techgoudy
    • 5 years ago

    Can you guys at TechReport benchmark one of these things? I’d really like to know how well it performs as an external component.

    It sucks that the connector is proprietary and that this thing is so huge. I can’t see why you would really want to purchase one when you can’t even take it with you comfortably. It’ll be interesting to see how many units they push though.

    • Ninjitsu
    • 5 years ago

    Damn, I was hoping it would use Thunderbolt. Could have been so much more useful.

    EDIT: From AT…
    “Even if it’s not a full 8 lanes (the Haswell U series CPUs don’t support 16 lanes), given the severe bandwidth limitations of ExpressCard and Thunderbolt, even a 4 lane setup would represent a significant improvement in bandwidth. Like Thunderbolt however this interface does appear to use active cabling (it would be very difficult to carry that much bandwidth externally without it), in which case the cable is going to be an expensive part of the entire setup.”
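
    For perspective, here’s some rough bandwidth arithmetic on the interfaces AT mentions. A sketch only: the Amplifier’s actual lane count and PCIe generation are unconfirmed, so the x4 Gen3 line is an assumption, and ExpressCard is taken as its 2.0 revision’s PCIe 2.0 x1 link.

    ```python
    # Theoretical bandwidth of candidate external-GPU links, after
    # line-code overhead. The Amplifier's x4 Gen3 figure is an assumption.

    def pcie_gb_per_s(lanes, gt_per_s, encoding):
        # GT/s per lane * encoding efficiency / 8 bits -> GB/s
        return lanes * gt_per_s * encoding / 8

    expresscard = pcie_gb_per_s(1, 5.0, 8 / 10)      # PCIe 2.0 x1
    thunderbolt = 10 / 8                             # original 10 Gbps channel
    pcie3_x4    = pcie_gb_per_s(4, 8.0, 128 / 130)   # assumed Amplifier link

    print(f"ExpressCard: {expresscard:.2f} GB/s")    # 0.50 GB/s
    print(f"Thunderbolt: {thunderbolt:.2f} GB/s")    # 1.25 GB/s
    print(f"PCIe 3.0 x4: {pcie3_x4:.2f} GB/s")       # 3.94 GB/s
    ```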

      • Airmantharp
      • 5 years ago

      If only Thunderbolt was ‘there’…

    • slowriot
    • 5 years ago

    The Alienware 13 specs seem pretty poor, to be honest. It’s 1″ thick and weighs 4.5 lbs, but they only managed to get a GTX 860M inside? That’s bad. And the CPU choice isn’t confidence-inspiring either.

    • Andrew Lauritzen
    • 5 years ago

    I really do not get the point of these things, and doubt that I ever will. They seem predicated on the notion that a laptop has much to offer in terms of the components you would typically put into a gaming desktop, which really is not the case. A discrete GPU is usually a big chunk of both the cost and bulk of a desktop system these days (especially in something like mATX), so if you’re already buying one of those… and a PSU… and a case of sorts, you’re more than halfway to just building a complete machine, and one that would be significantly more powerful and easier to use.

    Laptops are great for mobility and for some folks the ability to pack them away when they aren’t in use is important as well. This sort of setup has none of those advantages, and introduces several more disadvantages.

    IMO there’s a good reason why no one has productized these things yet.

      • cynan
      • 5 years ago

      Totally agree. You may as well spend the extra $300-400 on a complete mini-ITX system and then have two systems. Heck, you could save some of this cost by going with a more pedestrian laptop and not paying the Alienware tax, or instead go with something sleeker and lighter.

      This product is really only for Alienware fanbois (especially since it’s proprietary).

      • Chrispy_
      • 5 years ago

      “IMO there’s a good reason why no one has productized these things yet.”

      This is it - external GPU housings are the solution to a problem that doesn’t exist.

      A desktop PC with a decent GPU these days is pretty much a GPU-support system; the GPU takes the lion’s share of the power requirements, physical space, transistor budget, financial budget, and available cooling to do 90% of the work. There are plenty of benchmarks proving that lowly processors with great graphics cards deliver decent gaming, whilst the opposite reinforces how unimportant everything else is; an i7 with gobs of RAM and the best SSD money can buy is still going to be hopeless at gaming if you pair it with a $60 GPU or use the IGP.

    • Firestarter
    • 5 years ago

    i5-4210U?

    I don’t think I’d want to pair a 200W+ GPU with a 15W CPU; that’s just asking for underwhelming performance bottlenecked by the CPU. It does sound like a great platform though, and it would finally offer an answer to the demands of college-aged gamers who want a laptop that does *everything*.

      • DragonDaddyBear
      • 5 years ago

      1) Games are rarely CPU limited. It helps, but the CPU typically is not the bottleneck.

      2) Just guessing here, but when you run games without the internal GPU doing anything, the cooling/thermal overhead should be available to pump the CPU to near its max turbo settings.

      I would love to see how the Graphics Amplifier affects the CPU speed. Perhaps they could get an external GPU as close as possible to the internal one (same silicon), clock them the same, and test the CPU.

        • slowriot
        • 5 years ago

        1) Numerous games are CPU limited these days. Especially when we’re talking CPUs of this speed.

        2) This CPU, inside this chassis, should be constantly at turbo. They’re not fitting much power inside a rather large chassis. (See what MSI, Gigabyte, Razer, etc. have all been able to put in thinner and lighter laptops: faster processors and faster graphics.)

          • Firestarter
          • 5 years ago

          If it’s constantly at max turbo then they should’ve picked a faster CPU, especially considering how much more cooling capacity is available when the mobile GPU is disabled (assuming they combined the cooling system for the CPU and GPU)

            • DragonDaddyBear
            • 5 years ago

            My theory is this: if the system could not keep max turbo without the Graphics Amplifier, it should with it. Often the cooling of the GPU and CPU is tied to the same chunk of copper.

          • DragonDaddyBear
          • 5 years ago

          I get it, you guys all think the latest CPU is needed. But let’s be honest here, the TR hardware survey shows that people throw the latest GPU behind the not-so-latest CPU. It does help having a fast CPU, but if it were such a large difference the practice would not be common.

          I’m guessing the CPU here is slightly faster (2c/4t vs 2c/2t and down on CPU clocks) than the G3258, which handles, at stock, games “just fine,” with the notable exception of the “CPU limited” games.

          Bottom line, show me the numbers and games where it makes such a large difference. Mantle is supposed to reduce CPU overhead and it hasn’t been the boon it was hyped to be, though it’s still young. I wager there is better frame latency with faster CPUs, but in the majority (say 75%) of games, a CPU like this, or an older Core i-2xxx, would be sufficient (little time spent beyond, say, 33 ms, or two display refresh intervals at 60Hz, the most common refresh rate with LCDs) for the needs of most people with the most common GPUs.
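
          (For reference, those frame-time thresholds convert to frame rates as below; this is simple arithmetic added for illustration, not from the comment itself.)

          ```python
          # Frame-time thresholds and their frame-rate equivalents.
          for ms in (50.0, 33.3, 16.7):
              print(f"{ms:>4} ms -> {1000 / ms:5.1f} FPS")
          # 50.0 ms ->  20.0 FPS
          # 33.3 ms ->  30.0 FPS
          # 16.7 ms ->  59.9 FPS

          # Two display refresh intervals at 60 Hz:
          print(f"{2 * 1000 / 60:.1f} ms")  # 33.3 ms
          ```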

            • slowriot
            • 5 years ago

            “I get it, you guys all think the latest CPU is needed.”

            Hard to have a discussion if you’re going to misrepresent what I said.

            “But let’s be honest here, the TR hardware survey shows that people throw the latest GPU behind the not-so-latest CPU. It does help having a fast CPU, but if it were such a large difference the practice would not be common.”

            Yes, because the TR survey is comprised of people with desktop computers who are progressively upgrading those piece by piece over time as their budgets allow. If the only option was to replace all the hardware at once (like in a laptop!) this wouldn’t be the case. If you were to ask me “Hey slowriot, what’s the single best bang for my buck upgrade, a GPU or CPU?” I’d say “What for?” and you’d say “Gaming” and I’d say “GPU.” But this isn’t the discussion we’re having. No one pairs a G3258 with a GTX 980 (or even a GTX 970 for that matter), which is the type of GPU you’d be putting in an enclosure like this.

            “I’m guessing the CPU here is slightly faster (2c/4t vs 2c/2t and down on CPU clocks) than the G3258, which handles, at stock, games ‘just fine,’ with the notable exception of the ‘CPU limited’ games.”

            What’s your point here? The i5-4210U is an expensive mobile CPU oriented more towards power savings than performance. The G3258 isn’t either of those. It’s more than fine for a budget-constrained person to buy a G3258 for their desktop and enjoy “good enough” performance. However, if you’re buying an expensive gaming laptop AND ALSO buying an expensive external GPU enclosure AND ALSO buying an expensive GPU to put in that enclosure, then I’m willing to bet you’re expecting more than “good enough” or “just fine.”

            “Bottom line, show me the numbers and games where it makes such a large difference. Mantle is supposed to reduce CPU overhead and it hasn’t been the boon it was hyped to be, though it’s still young. I wager there is better frame latency with faster CPUs, but in the majority (say 75%) of games, a CPU like this, or an older Core i-2xxx, would be sufficient for the needs of most people with the most common GPUs.”

            I’d need you to define what “large difference” means to you. Because it’s quite clear to me we’re just going to get into a circle of subjectivity on what’s a “large enough” difference or what’s “good enough” performance or what’s an “acceptable” level of detail. But just off the top of my head, games like Battlefield 4 and Civilization 5/Beyond Earth are very much CPU limited. But personally, the last thing I’d want to do after dropping close to $2000 or more on a setup like this is to settle for “just fine” or “good enough.” Which wouldn’t be the case if Alienware had put a more powerful CPU choice in the system. For crying out loud, Razer puts an i7-4702HQ inside the Blade and that’s a much smaller laptop. MSI managed to do it also in the Ghost series of laptops.

            • DragonDaddyBear
            • 5 years ago

            I wasn’t saying you, specifically, but the readership as a whole. I did not specifically call you out, hence the word choice of “guys.” In hindsight I could have used “forum members” or “readership.” But, in my eyes, the idea that I was singling you out seems far-fetched.

            I mentioned the performance of the Intel AE chip because I could not find a direct comparison of the one in this laptop that has been benchmarked before. I suppose I could have said i3.

            Good enough is purely subjective, but the frame latency series has tried to address when it becomes noticeable, hence the 50, 33.3, and 16.7 ms metrics. There are games, though, that are CPU bound. Of that I make no contest. But in GPU bound games, of which most are (I wager 75%), putting in a mid/high-end GPU that costs $300 would still make sense.

            For games that are highly CPU bound, it’s hard to see anything but a slab of a barely-mobile, lap-crushing/burning laptop being sufficient. But, in the majority of cases, I see this being a great addition to the laptop, not wasted because the CPU isn’t worth getting this additional chassis plus card.

            I would love to see in the TR review the point of diminishing returns with this device and just how much the CPU bottlenecks it. They could test GPUs against this device and a high-end super-clocked i7 K CPU. Hey, I could be wrong, but I don’t think so, not in the majority of games, and probably not in the near future. Not with the CPU benchmark for developers being the Jaguar cores in the Xbone and PS4.

            • slowriot
            • 5 years ago

            “I wasn’t saying you, specifically, but the readership as a whole. I did not specifically call you out, hence the word choice of ‘guys.’ In hindsight I could have used ‘forum members’ or ‘readership.’ But, in my eyes, the idea that I was singling you out seems far-fetched.”

            Then what was the point in saying it at all if you weren’t trying to lump me into this made-up group? The group you’re trying to paint doesn’t really exist either, as shown by the very TR survey you brought up… but anyway…

            “I mentioned the performance of the Intel AE chip because I could not find a direct comparison of the one in this laptop that has been benchmarked before. I suppose I could have said i3.”

            http://www.notebookcheck.net/Intel-Core-i5-4210U-Notebook-Processor.115081.0.html

            That’s the best you’re likely to find. No one is going to be comparing these chips, really. The best you’re going to get is indirect comparisons. By the way, the i5-4210U is much slower than you thought.

            “Good enough is purely subjective, but the frame latency series has tried to address when it becomes noticeable, hence the 50, 33.3, and 16.7 ms metrics. There are games, though, that are CPU bound. Of that I make no contest. But in GPU bound games, of which most are (I wager 75%), putting in a mid/high-end GPU that costs $300 would still make sense.”

            The frame latency benchmarks try to address when it becomes noticeable… at a given game setting. Further, are you talking about recent releases? Like ones in the last year? Or the games you play? Or all games? What is it? I don’t know. If it’s all games you’d be right (hell, it’s more like 99.99% then), but for recent releases? For games that are going to be coming out in the time you will own and use this laptop? I don’t agree with your 75% if that’s the case, but who knows what set of games you’re referring to.

            Just a quick Google brought up this article too: http://www.techspot.com/review/849-intel-pentium-anniversary-edition-overclock/page13.html

            It’s a review of the Pentium AE (again, noticeably faster than the i5-4210U), but this page gives us some insight into the impact lower CPU performance can have on game performance. Pretty eye-opening, right? Frankly, these results leave me even more negative towards Alienware’s CPU choice for the Alienware 13.

            “For games that are highly CPU bound, it’s hard to see anything but a slab of a barely-mobile, lap-crushing/burning laptop being sufficient. But, in the majority of cases, I see this being a great addition to the laptop, not wasted because the CPU isn’t worth getting this additional chassis plus card.”

            First, you can’t game with a laptop in your lap. At least not comfortably, and that goes for all laptops. Second, I already gave you two examples of SMALLER and FASTER laptops than this Alienware 13. Google “Razer Blade” or “MSI Ghost” or “Gigabyte P34W.” All of these are thinner, lighter, and better-performing laptops than the Alienware 13.

            “I would love to see in the TR review the point of diminishing returns with this device and just how much the CPU bottlenecks it. They could test GPUs against this device and a high-end super-clocked i7 K CPU.”

            I would be interested in an article like that too. But I think the inclusion of a CPU like the Core i7-4710HQ would be just as important as an overclocked desktop i7. The i7-4710HQ is the type of CPU that should be in the Alienware 13 at minimum. Actually, given that the Alienware 13 really isn’t especially small or light (1″ thick, 4.5 lbs), I think a faster CPU choice than even the i7-4710HQ should be available. I just don’t see why they’ve limited customers to such a slow CPU when their competitors are all able to deliver more in smaller packages.

        • Airmantharp
        • 5 years ago

        Games aren’t typically (but many times are) CPU limited with the fastest CPUs available; this is a 2C/4T version with a 2.7GHz max boost. Even a quad-core variant running at ~4.0GHz can be limiting.

    • GasBandit
    • 5 years ago

    Will it work with older laptops as well? I’ve got an old ASUS K50-AF laptop with an old AMD/ATI Mobility Radeon 5145 GPU and only a VGA out… would it work with that?

      • Firestarter
      • 5 years ago

      It uses a proprietary connector, so no, it will never work.

        • GasBandit
        • 5 years ago

        Rats. Wellp, so much for that idea.
