Airmantharp wrote:
JohnC wrote:
Terra_Nocuus wrote:any SLI
Personally, I would stay away from any multi-GPU solution, regardless of the GPU manufacturer. With a single GPU you don't have to wait an indefinite amount of time for SLI/CrossFire profiles with good scaling, or for other fixes for issues caused by SLI/CrossFire.
Terra_Nocuus wrote:What about USB WiFi adapters? Are those decent nowadays?
No.
Terra_Nocuus wrote:The TP-Link WiFi adapter
I have that; it's a pretty good card and works well with my current router (an Asus RT-N66U). It supports both 802.11n bands and doesn't include useless 802.11ac support.
How did you get so negative?
I mean, really: there doesn't have to be a difference between USB and PCIe-based WiFi adapters. Sure, there have been plenty of crappy USB ones, but I assure you there have been plenty of crappy PCIe ones too. The only ones I truly consider solid are the mPCIe cards Intel makes, and they don't even make a 3x3 802.11ac adapter yet.
And how the hell is 802.11ac useless? At the very least, it guarantees a 2x2:2 solution (two spatial streams on both the 2.4GHz and 5GHz bands). And it uses a different kind of modulation than 802.11b/g/n, similar to what LTE uses. It works very well if you have a good 802.11ac router, which I read as implied in the OP, though I could be wrong. Still, if the OP is interested in .ac, why counter that?
And I do have that TP-Link card, with an external triple-mast antenna purchased separately, and it does work great, though I'm not currently using it at my new place since the router now sits next to my desktop. I also have a TP-Link 802.11n router that worked wonders too, but somehow AT&T's router is more than functional for now. I'll plug the TP-Link router back in when I start trying to load Lightroom catalogs remotely (and I'll consider an upgrade to 802.11ac across the board if needed).
As for SLI: lately, at least, Nvidia has substantially upped their game, while AMD has essentially been a non-participant, and the jury's still out on them. So we'll see after this next round of cards hits. But if one were to buy today, a pair of GTX 770 4GB cards would likely be the best price/performance bet, easily outperforming any single-GPU solution on the market, none of which is actually fast enough for 1440p/1600p.
Terra_Nocuus wrote:I'm also hoping to hear more about how G-Sync plays into things, which makes me think I might put off the 1440p display purchase for a while. I'm very excited to see how things will fall out, thanks to the 290 cards. It's a shame they run so hot and loud, but wow, the perf/dollar ratio is incredible.
superjawes wrote:AMD doesn't need to panic yet, and no one should pass on AMD GPUs just because of G-Sync. Early on, you're only going to see it implemented in "premium" monitors geared towards uber gamers (so 144 Hz TN displays). As the tech moves to ASICs, I believe it will become less proprietary to Nvidia, and AMD will learn, or be shown, how to tap into the functionality. Even with the FPGA implementation, it should be possible for any GPU to use it; they just don't know how yet.