AMD XConnect makes living with an external Radeon easier

AMD took the wraps off a new driver feature this morning that it's calling XConnect. As part of its Radeon Software 16.3 drivers, the company has added external graphics hot-plugging support. XConnect also includes management features in Radeon drivers that allow gamers to choose the display they want to use and gracefully close applications running on an external GPU before unplugging it.

According to Ars Technica, Thunderbolt-equipped notebooks need to meet a few supporting hardware and software requirements to be XConnect-compatible. They'll need Windows 10 build 10586 or newer, Thunderbolt firmware v.16 or later, a Thunderbolt 3 port and an active Thunderbolt 3 cable, and the necessary BIOS extensions for Thunderbolt external graphics. AMD says companies can add external graphics BIOS extensions to systems that otherwise meet XConnect's requirements through a firmware update. 
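
For the Windows piece of that checklist, a quick sanity check is possible in software. The sketch below is purely illustrative (it is not an AMD utility): it reads the true OS build number through ntdll's RtlGetVersion and compares it against the build-10586 floor AMD cites. The Thunderbolt firmware version, the TB3 port and cable, and the BIOS extensions still have to be verified with the system vendor's own tools.

```cpp
// xconnect_oscheck.cpp -- illustrative sketch only, not an AMD utility.
// Reads the real Windows build number via ntdll's RtlGetVersion (unaffected by
// compatibility manifests, unlike GetVersionEx) and compares it with the
// build-10586 minimum AMD lists for XConnect.
#include <windows.h>
#include <iostream>

typedef LONG (WINAPI *RtlGetVersionFn)(OSVERSIONINFOW *);

int main() {
    OSVERSIONINFOW info = {};
    info.dwOSVersionInfoSize = sizeof(info);

    // RtlGetVersion ships in ntdll.dll without an import library, so resolve it at runtime.
    HMODULE ntdll = GetModuleHandleW(L"ntdll.dll");
    RtlGetVersionFn rtlGetVersion = ntdll
        ? reinterpret_cast<RtlGetVersionFn>(GetProcAddress(ntdll, "RtlGetVersion"))
        : nullptr;
    if (!rtlGetVersion || rtlGetVersion(&info) != 0) {
        std::cerr << "Could not query the OS version.\n";
        return 1;
    }

    std::cout << "Windows " << info.dwMajorVersion << ", build " << info.dwBuildNumber << "\n";
    if (info.dwMajorVersion >= 10 && info.dwBuildNumber >= 10586)
        std::cout << "Meets the Windows 10 build 10586 requirement.\n";
    else
        std::cout << "Older than Windows 10 build 10586; XConnect needs a newer OS.\n";
    return 0;
}
```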

On the graphics hardware side, gamers will need a compatible enclosure and one of the following Radeon graphics cards:

  • Radeon R9 Fury
  • Radeon R9 Nano
  • All Radeon R9 300 Series
  • Radeon R9 290X
  • Radeon R9 290
  • Radeon R9 285 

So far, the Razer Blade Stealth and Razer Core appear to be the only XConnect-compatible hardware. We'd expect the list of compatible systems to grow as time goes on, though.

Comments closed
    • thorz
    • 4 years ago

    Can this be an option for a Retina MacBook Pro 15" 2015 model? I love this machine but the AMD graphics card it has still sucks for serious gaming. The Core i7 it has is beefy enough for most games, so an external GPU and Windows 10 under Boot Camp could definitely be a solution.

      • Deanjo
      • 4 years ago

      It would likely work under OS X, not sure about Windows. Other Thunderbolt devices on a Mac running Windows require the device to be plugged in during boot (no hot-plug in Windows).

      • tipoo
      • 4 years ago

      Unfortunately, most of the external GPU action seems to be on TB3; being on TB2, we're just missing out. Kind of annoying, as even a 780 Ti got the vast majority of its performance when it was tested on TB2, on a MacBook Pro no less. 80-90% in most games, with only the very worst case dropping to 50%, but that would still be head and shoulders above my Iris Pro model.

      http://www.anandtech.com/show/7987/running-an-nvidia-gtx-780-ti-over-thunderbolt-2

      I hope some of the TB3 boxes are backwards compatible. I agree, I have the 4770HQ model and the CPU is more than enough for most modern games; it just needs a good GPU. And I also agree, the GCN 1.0 GPU in the rMBP 15 is such a letdown. With AMD's size, why not at least a Volcanic Islands part customized for the power draw? I know Nvidia was out due to OpenCL.

        • Deanjo
        • 4 years ago

        "I know Nvidia was out due to OpenCL."

        That's not the reason at all. There have been several models of Mac with Nvidia post-OpenCL. Apple simply switches vendors from time to time based on what is available in volume and at a price point. Also, Thunderbolt 3 devices will work on Thunderbolt 1/2 machines provided you are going through a dock that offers physical connection compatibility, just at Thunderbolt 1/2 rates. It is the same thing as PCIe in that regard.

        https://thunderbolttechnology.net/sites/default/files/tbt3presentation.pdf (see page 22)

          • tipoo
          • 4 years ago

          OpenCL performance of GCN vs. Maxwell. It's a professional laptop; they're not going to gimp its GPGPU abilities for a better gaming chip like the 960M, for instance. They've passed on Nvidia GPUs ever since Nvidia started cutting GPGPU performance out of the consumer line.

            • Deanjo
            • 4 years ago

            Maxwell actually increased Nvidia's OpenCL performance. And as I pointed out before, Apple flip-flops. AMD has arguably had the better OpenCL performance even when Apple was using Nvidia GPUs. Don't be surprised if the Mac Pro / iMac / MacBook Pros flip back to Nvidia on their next refreshes.

            OpenCL came out in Snow Leopard. From 2009-2011 it was AMD, 2012 to 2014 Nvidia, 2014 to present AMD.

            Does this really make it sound like OpenCL is a priority for Apple's choice of graphics?

            http://preta3d.com/os-x-users-unite/

            Apple only supports up to OpenCL 1.2 (a 2011 spec). I would say that OpenCL capability is actually pretty low on the list of needed features as far as Apple is concerned.

            • tipoo
            • 4 years ago

            A blog post about a bug from May 2015 singularly disproves that Apple cares about OpenCL? Say what you want about them, they don't want to sacrifice their AutoCAD, Mudbox, Maya, etc. users.

            Yes, Maxwell does better. But it doesn't close this gap. Have a look at this; this is why Apple was stuck on AMD for now.

            http://cdn.arstechnica.net/wp-content/uploads/2015/06/2015-Retina-MBP.010-980x720.png

            The *IRIS PRO* does better than Nvidia on compute (I'm actually a little proud of the little bugger, I have one :P [now if only driver updates didn't suck worse than AMD at their worst]). If Nvidia adds compute performance back, I'm sure they'll flip-flop again. Not until then.

            • Deanjo
            • 4 years ago

            "A blog post about a bug from May 2015 singularly disproves that Apple cares about OpenCL? Say what you want about them, they don't want to sacrifice their AutoCAD, Mudbox, Maya, etc. users."

            That bug is still present in the Apple-supplied drivers. Nvidia users are fortunate enough to be able to download the web drivers (and leverage the CUDA code path and post-2011 features) that correct the issue. AMD-equipped Macs... not so much.

            Let's put it this way: who here worked for Apple during OpenCL's development? /me raises hand. I'm pretty damn familiar with how Apple chooses components for its products and why.

            • tipoo
            • 4 years ago

            Oh yay, we're down to appeals to authority.

            We'll see if there's another flip-flop before Nvidia adds compute performance back to its mainstream lines.

        • tu2thepoo
        • 4 years ago

        StarTech will be making a TB3-to-TB2 adapter ( http://www.startech.com/Cables/thunderbolt-3-cables/thunderbolt-3-usb-c-thunderbolt-adapter~TBT3TBTADAP ). Theoretically you could hook something like the Razer Core to the adapter and then to a TB2 system; you'd only be losing out on TB3's increased bandwidth.

    • Mr Bill
    • 4 years ago

    Jeff, I'd like to see an article explaining the specifics of how this consumer solution compares to the sharing of $6000 server video cards between workstations at work. Like, how they are hosted and how they are connected to an individual workstation.

    • NeelyCam
    • 4 years ago

    Soo.... why didn't AMD use Lightning Bolt instead? Didn't they promote it as a superior and cheaper alternative to Thunderbolt?

      • chuckula
      • 4 years ago

      Let me point out your mistake there:

      "Soo.... why didn't AMD"

      Yes, but this wasn't AMD. This was the Radeon Technology Group (RTG). You know, the same group showing off racks full of developer systems with the R9 Fury X2 GPUs that are comfortably slotted into Skylake motherboards. The RTG is subtly distancing itself from AMD even though it's still part of AMD.

    • Deanjo
    • 4 years ago

    Amazing how many positive comments there are here, yet how many previous articles that mentioned Thunderbolt had the masses here calling it a dead-end tech and insisting USB was the only thing that was ever needed (especially if Apple was mentioned in the same article).

      • LostCat
      • 4 years ago

      I still haven't managed to get a system with Thunderbolt on it.

      I mean, FireWire was everywhere and I never even knew anyone who used that, but TB seems genuinely useful and...

        • Voldenuit
        • 4 years ago

        Laptops with Thunderbolt 3/USB-C are making the rounds now (and have been since around Xmas or so).

        So far, only Razer and ASUS seem to have models with the proper BIOS hooks to support eGPU that I'm aware of, though.

        • End User
        • 4 years ago

        My ASUS MB from 2012 has Thunderbolt on it.

        Thunderbolt 1 was ahead of its time as far as the PC was concerned, but I think Thunderbolt 3 with Type-C is the best chance Thunderbolt has ever had of becoming a mainstream PC tech. Perhaps they should have just called it PCIe-capable DisplayPort originally.

        I've had a Thunderbolt display since 2012 and it was/is a fantastic concept for a display. A Thunderbolt 3 display would kick ass.

      • Kougar
      • 4 years ago

      Thunderbolt is the best of the possible options for AMD here. Intel's backing has been strong enough to keep the standard alive for several years now. Apple is in love with it.

      AMD cannot create its own interface standard, as it will never get the support to be anything but dead on arrival. And if AMD created one, then NVIDIA would create its own to counter regardless.

      Unless Intel execs think Thunderbolt is doing well on its own, they should be helping standardize this for the industry just to give Thunderbolt a strong reason for broader adoption. Apple could certainly use some real GPUs hooked up to its diminutive Mac Pros, for that matter.

      • Voldenuit
      • 4 years ago

      Well, if Intel had not integrated USB 3.1, PCIe, and device charging with Thunderbolt 3, it would probably have died out as a standard.

      They've finally made it compelling enough for OEMs to bother including the chip and for end-users to want it as well.

        • Deanjo
        • 4 years ago

        Charging isn't a real concern for TB; it never has been, as the use cases for TB are not charging-oriented. USB 3.1 also isn't a real concern; sure, it is nice that the connector allows it to be used for either, but again it is not the "killer feature". PCIe connectivity is and always has been the killer feature of TB. I would venture that people who use TB would probably rather have a separate port for USB 3.1 connectivity instead of having to daisy-chain.

          • Krogoth
          • 4 years ago

          I would argue that the bigger killer app for Thunderbolt in its intended market is the bandwidth and headroom.

      • tipoo
      • 4 years ago

      Thunderbolt 1 and 2 didn't see much use, and external graphics wasn't a big use case with 1 or 2, so I don't see that contrast as fair. The interface provided something more with 3, which is now what's appealing. Plus, using the same connector as USB 3 Type-C is genius.

        • Deanjo
        • 4 years ago

        "Thunderbolt 1 and 2 didn't see much use, on PCs."

        FTFY. Using the same connector isn't the genius capability it is made out to be. An additional port on a system for USB-C and another for TB would have made little difference. It's not like a lot of people are going to be daisy-chaining their mouse onto their storage or their external video card. You also still need to use a Thunderbolt-compatible cable (a standard USB 3 Type-C cable won't do Thunderbolt). If you are going to plug multiple-protocol devices in, you are going to use a dock, and TB 1/2 already had tons of docks that offered multi-function ports.

          • tipoo
          • 4 years ago

          OK, but you talked about how "the masses" called it dead-end tech. For "the masses", TB 1 and 2 were pretty irrelevant. Now that TB3 is getting something really cool with external GPUs, people are coming around.

          There's nothing hypocritical about that, is my point. It wasn't overly useful to a lot of people before; now it's slowly getting more draw, so the opinion is changing with it. External GPUs are probably still only for a fairly small percentage of people, though.

            • Krogoth
            • 4 years ago

            External GPUs are going to be a big deal for DTR and laptop gamers, but normal desktop users are going to be indifferent.

            • tipoo
            • 4 years ago

            I largely agree, though even as a gaming PC enthusiast, I'd love to be able to consolidate my computer life into a single system. Quad-core i7 laptops have plenty of CPU grunt in an era where dual-core i3s can run most games well; it's just a matter of the GPU now. I have the CPU, plenty of RAM, and a really fast PCIe SSD all in a laptop; an external GPU sure would eliminate a lot of redundant hardware and space versus a dedicated gaming rig.

            But I don't disagree that this is still a relatively small percentage of the PC population that may use this; I said so myself.

      • Krogoth
      • 4 years ago

      Thunderbolt was never a dead-end tech. It was simply a niche solution that will fall into the same vein as FireWire: a technically superior but more complicated interface that commands higher price points.

      USB 2 and 3 are good and cheap enough for the needs of the masses. Prosumers will opt for the faster, but more expensive TB for their needs.

      • End User
      • 4 years ago

      Type-C is the secret sauce that will bring Thunderbolt 3 to the masses.

      • Chrispy_
      • 4 years ago

      It might still be a dead-end tech.

      The only thing that made Thunderbolt really worth having over USB was the promise of external GPUs for laptops.

      Five years later, it's still a pipe dream that companies are struggling to solve, and managing only with ridiculous caveats and high costs.

    • RdVi
    • 4 years ago

    How long can active Thunderbolt 3 cables be? Maybe I can put my 290X in another room...

      • Laykun
      • 4 years ago

      3 meters is currently the maximum length of copper Thunderbolt cables.

        • Airmantharp
        • 4 years ago

        So, long enough that the enclosure could be somewhere out of sight and possibly out of hearing range if in a dampened compartment?

          • Redocbew
          • 4 years ago

          Possibly, but you've still got cooling to consider. Dampening noise without restricting airflow can sometimes be a bit tricky.

    • Chrispy_
    • 4 years ago

    I am still upset by the size of most PCIe GPU enclosures.

    Most of them are big enough to match a full (mITX) PC. Where's the GTX 950 with a small, integrated DC power brick in an enclosure the size of your typical 1-bay NAS? Those would actually sell and still be ample GPU to turn your typical ultrabook into a gaming beast.

    • christos_thski
    • 4 years ago

    Is there ANY actual, real technical reason that AMD would limit this to these GPUs? A 7870 would be a tremendous graphical upgrade for most laptops out there; in fact, it is being sold by AMD (as an only slightly upgraded R270) as we speak.

      • Andrew Lauritzen
      • 4 years ago

      There are almost guaranteed to be technical reasons this only works with the latest WDDM2 stuff.

        • Flatland_Spider
        • 4 years ago

        Does WDDM2 add back the ability to mix video card drivers? At one point, WDDM couldn't run multiple video card drivers, so all video cards in a box had to be either Nvidia or AMD.

        At least, it was something like that. I don't remember the exact details.

          • Andrew Lauritzen
          • 4 years ago

          I don't actually remember a time when that has been the case... definitely pre-Win7. I mean, that would break pretty much every laptop out there :)

          So yes, you can definitely run multiple adapters from different vendors each with their own drivers.
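
          If you want to see that for yourself on a WDDM box, a quick way is to enumerate adapters through DXGI; an iGPU and an attached external Radeon should both show up, each backed by its own vendor driver. A minimal sketch, purely illustrative:

          ```cpp
          // list_adapters.cpp -- minimal illustrative sketch (not tied to XConnect).
          // Enumerates the display adapters WDDM exposes through DXGI; on a laptop with
          // an iGPU plus an external GPU attached, both appear here, each running its
          // own vendor driver.
          #include <windows.h>
          #include <dxgi.h>
          #include <iostream>
          #pragma comment(lib, "dxgi.lib")

          int main() {
              IDXGIFactory *factory = nullptr;
              if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                           reinterpret_cast<void **>(&factory)))) {
                  std::cerr << "CreateDXGIFactory failed.\n";
                  return 1;
              }

              IDXGIAdapter *adapter = nullptr;
              for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                  DXGI_ADAPTER_DESC desc = {};
                  adapter->GetDesc(&desc);
                  // PCI vendor IDs: 0x1002 = AMD, 0x10DE = NVIDIA, 0x8086 = Intel.
                  std::wcout << L"Adapter " << i << L": " << desc.Description
                             << L" (vendor 0x" << std::hex << desc.VendorId << std::dec << L")\n";
                  adapter->Release();
              }

              factory->Release();
              return 0;
          }
          ```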

            • Flatland_Spider
            • 4 years ago

            Which part? The mixed drivers or single driver? You couldn't run the cards in SLI or Crossfire mode if they weren't all AMD or all Nvidia, but you could run mixed cards with different drivers.

            WDDM 1.0 (Vista) was limited to a single driver. People were kind of mad at the time since XP could run multiple drivers, and it looks like WDDM 1.2 (Win7) fixed that, now that I look at it.

            I found a slide show from a joint AMD, Nvidia, and MS presentation (http://slideplayer.com/slide/3220231/) that suggests WDDM 2 has improvements to synchronization when multiple GPUs are present. Hmm...

            Anyway, I don't care enough to spend any more time on it.

            • Andrew Lauritzen
            • 4 years ago

            That slide deck is actually pretty much entirely unrelated to what WDDM2 became. Notice it is 9 years old :) The original plans for WDDM2 never really came to fruition as defined; the current version of WDDM2 is a somewhat different direction.

            • Klimax
            • 4 years ago

            You could not mix cards with different drivers under old XDDM. One of the major features of WDDM was the ability to mix GPU drivers. You have it strongly mixed up there. (You have it completely backwards.)

          • auxy
          • 4 years ago

          Hasn't been that way for some time. At least since Windows 7.

          NVIDIA still (illegally, imo) blocks many features of their GPUs if there is a Radeon card in the system, though.

      • tanker27
      • 4 years ago

      Grumble stoopid phone. Disregard.

    • southrncomfortjm
    • 4 years ago

    So, theoretically (or practically, at this point?) you could have your 15 or 17-inch laptop with whatever mobile GPU inside, and bring along your XConnect device with a high-end GPU on the road with you for top-notch graphics.

    Could you also connect the XConnect device to an external monitor and cut out the desktop altogether? Or would the XConnect only connect to the laptop, which then needs to connect to the monitor?

    Interesting, although rather expensive-sounding, solution. I'd obviously want to see what kind of extra latency all these other connections cause for frame delivery.

    • maxxcool
    • 4 years ago

    Love this... and that's coming from me, a huge Intel fan and NV fan. GJ, AMD.

      • anotherengineer
      • 4 years ago

      Fan^2

      ;)

    • ronch
    • 4 years ago

    Next up: NVConnect™.

    • derFunkenstein
    • 4 years ago

    This is great. My employer-owned work computer notwithstanding, I want my entire digital life to be on one PC, not strewn about between a mobile machine and a desktop machine. I look forward to the day that I can buy a relatively svelte quad-core Core i7 notebook with "good enough" integrated graphics that I can take with me wherever I want, and a beefy GPU to use at my desk when I want to get my game on.

    Thing is, nobody seems to be putting quad-core CPUs into 13" notebooks yet. Edit: I know they're 45W parts, so part of this conundrum is on Intel. If they could make a 35W part, even with a lower 4-core turbo, that would help considerably.

      • derFunkenstein
      • 4 years ago

      The other thing to consider, I guess, is price. The Tech Inferno forums have been abuzz with mods of a $200 Akitio Thunder 2 enclosure by adding a 200W power supply meant for some AIO Dell machine, or something of the like. That's Thunderbolt 2, btw. Someone in Russia took that idea and ran with it, and they sell a modded version of the Akitio for like $600 on eBay.

      If I have to pay an extra $250 or so for the convenience, I guess that's OK. If Razer prices themselves into failure territory with their enclosure (which $600 definitely is), this eGPU thing will tank yet again.

      • anotherengineer
      • 4 years ago

      Quad core in a 14"

      http://www.bestbuy.ca/en-CA/product/hewlett-packard-hp-pavilion-14-laptop-silver-amd-a8-7410-apu-750gb-hdd-8gb-ram-windows-10-14-ab168ca/10393550.aspx?path=8533633ea9a1ec86ad86b333df6305c3en02

      :D

        • derFunkenstein
        • 4 years ago

        I hate you for getting my hopes up.

          • drfish
          • 4 years ago

            My little Clevo 11.6" W110ER was bought 4 years ago and packs a quad core Ivy. Still love it.

            • derFunkenstein
            • 4 years ago

            Oh, man, forgot about that system. That's the dream. Would like something a little bigger (13" MBP size) with a quad, but that's close.

            • Bauxite
            • 4 years ago

            The MSI GS30 has a quad with Iris Pro even, 13", came out towards the end of the Haswell timeframe.

            It also has an x16 slot of sorts. Battery life... not so much.

          • anotherengineer
          • 4 years ago

          :D

          But it's a real true quad though :)

            • derFunkenstein
            • 4 years ago

            you're a real true quad. :p

        • cygnus1
        • 4 years ago

          Yeah, AMD marketing.... Too bad that's not really quad core. It's 2 modules/4 threads, and that does not equal real quad core. It's closer to a real quad than a 2-core/4-thread Intel CPU, but that doesn't make it a real quad core.

          • auxy
          • 4 years ago

          You are mistaken. This A8-7410 is a Carrizo-L unit; it is a real quad-core processor, with four Puma+ cores. A very nice APU for a low-power machine, but it is what it is and as such is not suitable for gaming.

            • MOSFET
            • 4 years ago

            Gaming in a 15W thermal envelope for CPU, GPU, MCU... sounds like it would take the fun right out of gaming!

            • auxy
            • 4 years ago

            Right? Hehehe. ( *´艸｀)

            But when you put it that way, the performance of these little APUs at 15W is nothing short of incredible: 15W for a chip that will actually run games like Warframe and Ultra Street Fighter IV at acceptable framerates.

            • Tirk
            • 4 years ago

            At low resolutions they game quite well for the TDP. Consider the fact that the A8-7410 has the same (slightly updated, in fact) cores as the Xbox One and PS4. Granted, it has 4 cores instead of 8, but it's still impressive.

            • DarkMikaru
            • 4 years ago

            You'd actually be wrong. ;) These little APUs are actually pretty darn good... they can play many games decently.

            https://www.youtube.com/watch?v=Vj5yTWlmkXw

            • cygnus1
            • 4 years ago

            Ahh... I didn't realize they had started mixing in the feline cores in the same product naming scheme as the dozer cores... I guess they're taking after Intel there

          • Tirk
          • 4 years ago

          Catch up, that is a Beema chip that in fact has 4 full cores. Its IPC is lower than Intel's chips, but considering it has 2 more cores than any Intel chip in its price range, it is often faster in overall performance.

            • cygnus1
            • 4 years ago

            Thanks for pointing it out the 2nd time...

            • Tirk
            • 4 years ago

            Yeah, posted before I saw the other response, sorry.

      • LostCat
      • 4 years ago

        I thought Alienware had a quad core 13" with an external GPU option already.

        • derFunkenstein
        • 4 years ago

          Alienware's Graphics Amplifier thingy uses a proprietary connector. I can't get onboard with it knowing that I'm tied to a specific notebook.

          edit: although it's "only" $300. That's pretty nice.

          edit again: the Alienware 13 only comes with dual-core CPUs. http://www.dell.com/us/p/alienware-13-r2/pd

          • LostCat
          • 4 years ago

          Hmm. Didn't even realize that. Sorreh. :) Still looks like a good machine tho.

            • derFunkenstein
            • 4 years ago

            yeah, definitely. My only concern is that certain games really do seem to want a quad for best performance.

      • tipoo
      • 4 years ago

      I seem to recall they did make a 28 or 35-watt quad, but no one seemed to use it. I guess at those sorts of power limits, most found it would be better to have a faster dual. Though that's also accounting for the iGPU's TDP, which could be switched off with an external GPU, and that could change things.

      Though I have a laptop with the 4770HQ, so I'm set for CPU power... if only it had TB3. Shame TB2 is missing out.

        • derFunkenstein
        • 4 years ago

          "shame TB2 is missing out"

          On the surface it seems like a completely arbitrary distinction, and it's frustrating. On the other hand, both AMD and Nvidia like money, and they're both only supporting TB3, so it must be out of their hands.

          • tipoo
          • 4 years ago

          Something about Intel's blessing for external GPUs and Thunderbolt licensing fees.

          Frustrating, as they paid lip service to external GPUs for every generation of it, but only now seem serious.

      • auxy
      • 4 years ago

        You'd be better off with a fast 2C/4T anyway. Better a 3.5GHz dual-core than a 2.5GHz quad-core, especially on modern Intels where Hyper-Threading is nearly as good as another pair of cores anyway. ('ω')

        • derFunkenstein
        • 4 years ago

        Well, maybe. There is such a thing as the i5-6300HQ, which has 4 cores and 4 threads. Can be configured down to 35W, too. Wowza.

          http://ark.intel.com/products/88959/Intel-Core-i5-6300HQ-Processor-6M-Cache-up-to-3_20-GHz

          • auxy
          • 4 years ago

          Configured down means lower clock rates... still better off with 2C/4T... stop wasting TDP on extra cores that are 5% utilized.

          For the overwhelming majority of things you will do on a computer that can realistically be done on a laptop, the only thing that really matters with respect to the CPU (since we can't select for cache sizes or a variety of architectures) is single-threaded CPU performance.

        • cygnus1
        • 4 years ago

        No version of Hyper-Threading is as good as another pair of cores. That's why the 4C/4T i5 always scores better than a 2C/4T i3 running at the same speed. But whether fewer, faster cores beat more, slightly slower cores will always depend on the workload.

        I personally prefer the real quads, mainly because while I game I like to leave all my normal crap running in the background, including my browsers, which typically eat up nearly a combined 10GB of RAM. A real quad won't stutter quite as much in that scenario. On a side note, now that games have started using a decent amount of RAM, I'll probably be bumping my 16GB up to 32GB for the very same reason. More cores are good and paging is bad.

          • auxy
          • 4 years ago

          Depends on what you're doing: in some tasks (http://www.techspot.com/articles-info/1087/bench/Encoding_02.png) Hyper-Threading can be a nearly 100% improvement over the lack thereof, and it CAN allow the Core i3 to compete with the desktop Core i5s in some tasks, particularly when the Core i3s have a higher clock rate, as with Skylake. Revise your thinking on Hyper-Threading. Ever since Haswell, the CPU cores themselves have a lot more execution resources to play with, so the extra thread per core has more space to fill in. These days, it's a real benefit.

            • cygnus1
            • 4 years ago

            I personally haven't seen a benchmark of something well-threaded that gets that much benefit. Like I said, and I think we're agreeing on this, it entirely depends on the workload. In my experience, poorly threaded tasks benefit from the higher speeds on the duallies, whereas anything well-threaded benefits significantly more from the real cores on a quad than from Hyper-Threading on a dual.

      • Andrew Lauritzen
      • 4 years ago

      > I want my entire digital life to be on one PC, not strewn about between a mobile machine and a desktop machine

      I'm completely the opposite: I've had too many bad experiences with single-point-of-failure setups like that! My policy is that I should be able to wipe any one machine at any time and lose minimal data (say a day or two at most). It's surprisingly not even that hard to achieve these days. Even my desktop is effectively a fast "cache" for data, Steam games, etc. Nothing lives *only* in one place.

      On the other point, I know for a fact there are 35W quad cores; pretty sure some of the SFF machines and Steam boxes use those ones. That said, 35W is still pushing it for 13".

        • derFunkenstein
        • 4 years ago

        To each his own. I use a NAS (well, a cheap consumer NAS) as a backup, and the really important stuff is also on Dropbox. With a 200Mbps internet connection, restoring stuff I actually want to play from Steam doesn't take long, either. I got an opportunity to test this out recently (and by opportunity, I mean my 2TB storage drive totally failed) and I didn't lose a thing that was important to me.

        The work machine also gets backed up, but it's backing up to enterprise resources over a VPN. Code goes in TFS, docs go in specific doc locations, and it's all either in St. Louis or Maine so a house fire doesn't kill me professionally.

          • Andrew Lauritzen
          • 4 years ago

          > I got an opportunity to test this out recently (and by opportunity, I mean my 2TB storage drive totally failed) and I didn't lose a thing that was important to me.

          Absolutely, but that's an argument for why it doesn't need to be all on one machine, really :) There are tons of good ways to share files seamlessly between machines these days. And yeah, all your files should be accessible from all your machines, but that doesn't imply you should only have one machine...

          I still think there are almost no advantages of this strange laptop + desktop-like-box-that-only-works-when-connected-to-laptop setup though. I get why people instinctively think it's "cool", but I think once you sit down and compare it to just having them be separate machines most folks will come to similar conclusions.

          And I'll again note that people massively overestimate how fast ultrabook chips are compared to desktops. These setups are going to be adequate for "moderate" gaming, but you're not going to reach up into the high end with this kind of setup.

      • Flatland_Spider
      • 4 years ago

      A dock with a GPU is the first thing I thought of when I first heard about this, and with Thunderbolt and USB merging, this might be pretty common. IBM used to have a ThinkPad dock with room to put a video card in, so the precedent is there.

      Someone was talking about MXM cards on another article, and this would be a good use for them.

      • tanker27
      • 4 years ago

        You make a compelling case, Derfunk. One I can get on board with... except for one thing: AMD/ATi. Nope, no way, no how. Ain't gonna happen.

        • derFunkenstein
        • 4 years ago

          You know that Nvidia's drivers from late January added TB3 support, too, right? AMD is making a big deal about it, but for whatever reason Nvidia chose not to.

          • tanker27
          • 4 years ago

            Hmm, did not know that or even notice.

      • Tirk
      • 4 years ago

      AMD puts quad cores into 13" notebooks at or below 35W TDPs. In fact, I have a quad-core 25W-TDP AMD CPU in my HP Pavilion x360 13z Touch. Its base clock is 2GHz, which makes it quite speedy for its size, but I'm sure there will be naysayers to the Beema chip.

      Even HP, who made the fricking laptop, gives it absolutely no advertising compared to the slower Intel x360 models. Ugh.

      • albundy
      • 4 years ago

      That's just waiting for a connector to break/bend/wear out/come loose/not work anymore. I'd rather keep work and play completely separate.

      • Zizy
      • 4 years ago

      I don't think there is any 35W one by default (without touching cTDP); there is this 25W thingy though:
      http://ark.intel.com/products/90615/Intel-Core-i7-6822EQ-Processor-8M-Cache-up-to-2_80-GHz
      Embedded. Haven't seen it in the wild yet though.

      • smilingcrow
      • 4 years ago

      They had 37W quads with Haswell, and the 45W Skylakes can be configured down to 35W, so it's just a matter of who wants to do that.
      But until the enclosures become mainstream, who's gonna put a quad core with only integrated graphics in a laptop?
      Apple and maybe Dell!

    • backwoods357
    • 4 years ago

    The Razer Blade Stealth looks like a wonderful laptop and I was yanking my card out of my wallet to throw money at them when I noticed there was no option for anything more than 8GB of RAM. Really? Not cool.

      • cygnus1
      • 4 years ago

      I was annoyed with this as well. It's not like the system is so small they can't have the soldered 8GB plus a single DIMM slot for upgrades.

        • backwoods357
        • 4 years ago

        I wouldn't even care if they didn't make it serviceable or allow memory upgrades (it comes with the ultra-thin territory), but at least put an acceptable amount of RAM into the box if I can't add any. It's a few dollars' worth of chips to make all the difference.

      • derFunkenstein
      • 4 years ago

      Wow, that's a total buzzkill.

    • ImSpartacus
    • 4 years ago

    I think this is actually a bigger deal for Razer.

    I've always thought that these early external GPU implementations were temporary and would be replaced by more cohesive Nvidia/AMD-backed solutions.

    But if Razer's solution was built to support XConnect, then that's a decent-sized win for Razer. Maybe the Core might have some staying power.

    • DragonDaddyBear
    • 4 years ago

    I might be alone in this, but a Surface Pro 5 that is compatible with this would be mighty tempting.

      • JalaleenRumi
      • 4 years ago

      No, you're not alone. An external graphics card has always appealed to me because you don't have to worry about your board getting fried with all the heat during intense hardcore gaming. My previous laptop had this issue. Every time I gamed, I would worry over how hot the laptop would get.

    • chuckula
    • 4 years ago

    AMD didn't try to reinvent the wheel but instead used the existing technology in a smart way: Props.

      • shank15217
      • 4 years ago

      Get ready for Nvidia's solution: GConnectXXX. Twice as powerful as XConnect, requires special silicon with Nvidia Hotplug 1.0 support, CUDA support will be supported in a future version. PhysX support on the 980 Ti only; dual active Thunderbolt cable with reversible connector required.

        • ImSpartacus
        • 4 years ago

        And ridiculously enough, it would probably still be popular because it's Nvidia. Look at G-Sync.

        • JalaleenRumi
        • 4 years ago

        If Nvidia names it GConnectXXX, then all the kids that go 'astray' over the internet will be Nvidia's fault. At least parents will know who to blame.

        • nanoflower
        • 4 years ago

        Nvidia already has a solution ( https://techreport.com/news/29649/nvidia-361-75-drivers-support-thunderbolt-3-external-graphics ) and they didn't even give it a special name.

        • christos_thski
        • 4 years ago

        “The company has added beta support for external graphics cards over Thunderbolt 3 connections. This new feature is supported on all GeForce cards in the 900 series, as well as the GeForce Titan X, GTX 750, and 750 Ti models. ”

        Seems like not only do they support it, but they have not excluded lower end GPUs, like AMD did.

      • ImSpartacus
      • 4 years ago

      But it's AMD, so it very well could end up like DockPort.
