The GeForce GTX 750 and 750 Ti
The GeForce GTX 750 Ti reference card.
Nvidia has produced a couple of graphics card models based on the GM107, the GeForce GTX 750 Ti and the GeForce GTX 750. Here are their base specifications.
| Card | Base clock (MHz) | Boost clock (MHz) | ROPs | Texture units | Shader ALUs | Memory speed (Gbps) | Memory bus (bits) | TDP | Price |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GTX 750 Ti | 1020 | 1085 | 16 | 40 | 640 | 5.4 | 128 | 60W | $149 |
Notice the low TDP ratings, which make possible things like the stubby little cooler on the GTX 750 Ti reference card pictured above. Neither card requires an auxiliary PCIe power connector, so they should be able to slot into a whole host of systems where other cards can't quite fit. The recommended PSU capacity is just 300W.
Nvidia pitches the GTX 750 series as ideal for a home-theater PC or a Steam box, many of which are squeezing into ultra-compact cases with smaller power supplies. And dude, if you've gotten a Dell that doesn't have game, one of these cards should be able to slide into a PCIe expansion slot and provide a substantial upgrade over integrated graphics of any sort.
There's not a tremendous amount of difference between these two products, as you can see in the table above. Essentially, the 750 uses a GM107 with one SM disabled, while the 750 Ti keeps all five SMs intact. Beyond that, GTX 750 cards will generally ship with 1GB of GDDR5 memory, while the 750 Ti will come with 2GB. Both of these cards should be available at online retailers right now. A third variant, a GTX 750 Ti with only 1GB of memory, is slated for release later this month for $139.99.
The GTX 750 series replaces several existing products, including the GeForce GTX 650 Ti and 650 Ti Boost. The GTX 750 Ti is stepping into some big shoes at $149, since the GTX 650 Ti Boost is based on the larger GK106 chip, has a 192-bit memory interface, and is a 110W part. That's an awful lot to overcome. Even with Maxwell's higher efficiency, the GTX 750 Ti may not be able to match the Boost's performance. Then again, the 650 Ti Boost has hit end-of-life and is already hard to find in stores. Only one GTX 600-series card, the GeForce GTX 650, will stick around to serve the under-$100 market.
Zotac has provided us with a couple of GTX 750-series cards to review, one each of the 750 and the 750 Ti. The two cards are nearly identical; in our case, the Ti card is the one with the clear fan. Zotac has set the base and boost clocks for both of these cards at 1033 and 1111MHz, respectively, which is a smidgen higher than stock. Even with the handsome coolers and faster speeds, though, Zotac doesn't ask anything more than Nvidia's suggested retail price.
One glance will tell you Asus has taken a more upscale approach with its GTX 750 Ti OC. This card's GPU base and boost clocks are a little higher than the Zotac's, at 1072 and 1150MHz, but its 2GB of GDDR5 memory remains at a stock 5.4Gbps. What you're getting here is an unnecessarily long circuit board that looks to be a custom design, backed up by a needlessly monstrous dual-fan cooler and an unnecessary six-pin aux power input. This card is clearly built for exceeding the GPU's intended specifications via some egregious overclocking. Asus has also upgraded the HDMI port to full-sized and has added—get this—ye olde VGA port, for that one dude who can't seem to let go of his Trinitron.
The extra goodness in this version of the GTX 750 Ti will set you back $10 more than Nvidia's list, or $159.99. One caveat here is the placement of that auxiliary power connector, which is weirdly on the "wrong" end of the card. Having the connector there may be a help or a hindrance, I suppose, depending on the layout of the PC case in question.