Affordable and slim GeForce GT 710 cards hit store shelves

Looking for a simple, affordable graphics card offering better performance than integrated graphics? Look no further than Nvidia's GeForce GT 710 models. This tiny card became available at retail today, and its small size and tame thermal requirements could make it a good choice for a number of systems.

EVGA GeForce GT 710 2GB (passive)

The GT 710 GPU hails from the Kepler clan and packs 192 CUDA cores clocked at 954 MHz. VRAM capacity is 1GB or 2GB of DDR3, accessed over a 64-bit bus and clocked around the 1600 to 1800 MT/s mark, depending on the exact model.
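
For context, that bus width and data rate put a hard ceiling on the card's memory bandwidth. Here's a quick back-of-the-envelope check in Python, using the 64-bit bus and DDR3 transfer rates quoted above:

# Peak memory bandwidth = bus width in bytes x effective transfer rate
bus_bytes = 64 // 8
for mts in (1600, 1800):
    print(f"{mts} MT/s -> {bus_bytes * mts / 1000:.1f} GB/s")
# 1600 MT/s -> 12.8 GB/s
# 1800 MT/s -> 14.4 GB/s

That's a small fraction of what even budget GDDR5 cards move, which is why this class of card is about outputs and decode blocks rather than framerates.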

While the card's purported performance of "80% faster than integrated graphics" won't win it any awards, there are a number of situations it's nevertheless handy for. GT 710 cards don't need a PCIe power connector, and several manufacturers have both passive and half-height versions available. The GeForce GT 710 offers support for DirectX 12 and three monitor outputs, too. These characteristics ought to make it an interesting option for HTPCs and desktops where physical space and thermal performance are at a premium.

EVGA GeForce GT 710 2GB

EVGA and other companies have offered no word on pricing, but these cards should be quite affordable. Similar GeForce cards from the 720 series are available from $45 and up, so the 710 should come in under that.

Comments closed
    • south side sammy
    • 4 years ago

    I still use “cheap” graphics cards once in a while. Am disappointed there’s no DisplayPort on it considering the time in history it’s being made… what do you call those little blue ports again…????

    • torquer
    • 4 years ago

    I’ve used some ultra low end Nvidia cards in Dell desktops at work for triple monitor setups. Believe it or not, the Intel HD 3000/4000 GPUs start to struggle when you’re running multiple 1440p displays in Windows 7. So, assuming this could handle multiple 1440p displays without struggling (or better yet, dual 4k displays) there is some real value here.

    For gaming? Nope.

    • JustAnEngineer
    • 4 years ago

    Question 1: Why is this useless crap worthy of a news post on a site like the Tech Report?

    Question 2: Where are your actual benchmark results compared to the integrated graphics in a Core i5-6600 and to a $99 GeForce GTX750Ti?

    Use Skylake’s integrated graphics or save up for a better card. Don’t waste money on something that’s not fit for purpose.

      • Krogoth
      • 4 years ago

      This unit is a lot faster than the integrated graphics across the Skylake line-up (most of them don’t even have the high-end Intel HD 530), and it’s a cheap 2D PCIe card for those who need more video ports, or a test card for troubleshooting.

      It can handle any HTPC need while only needing passive cooling (it has an HDMI 2.0 port).

        • JustAnEngineer
        • 4 years ago

        [url=http://pcpartpicker.com/parts/video-card/#c=163,164&sort=a8&page=1]$104.99[/url] Zotac ZT-70605-10M GeForce GTX 750Ti 2 GiB

        Let's see some actual benchmarks comparing the performance of the useless GeForce GT710 to the GeForce GTX750Ti 2GiB card above.

          • ronch
          • 4 years ago

          I once bought a cheap GeForce 7100GS out of curiosity to see how it would do against my X1950Pro back then (2007). Didn’t do much for me so I just shelved it, but it did come in handy when I had to replace an office PC’s graphics card that quit. For Word and Excel, you gotta love a 7100GS more than a GTX 960. Power ain’t free, you know.

            • JustAnEngineer
            • 4 years ago

            Newer gaming GPUs drop to very low power states when you’re not doing anything other than running Word or Excel. Have your gaming cake and eat the efficiency gains, too.

        • Dudeface
        • 4 years ago

        Does it have HDMI 2.0? Being Kepler-based, I don’t think so… EVGA’s spec page only lists HDMI 1.4a.

          • Krogoth
          • 4 years ago

          My bad, I thought it was a Maxwell chip.

          It is still a good choice for an HTPC build if you don’t need an HDMI 2.0 port.

            • JustAnEngineer
            • 4 years ago

            Using the integrated graphics instead would be a better choice.

            • Krogoth
            • 4 years ago

            Depending on the motherboard and platform in question, it may not be possible.

      • spiritwalker2222
      • 4 years ago

      The card would be perfect for my old Wolfdale PC that currently doesn’t have a video card or integrated graphics.

        • JustAnEngineer
        • 4 years ago

        Don’t spend [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814487219]$43[/url] on one of these useless cards.

          • ronch
          • 4 years ago

          Well, if he doesn’t play games, spending even just $100+ on a graphics card doesn’t make sense. Maybe he just wants to keep that Wolfdale rig usable.

          • Krogoth
          • 4 years ago

          It is $40.00 for a cheap drop-in 2D/basic 3D PCIe card that is supported on current OSes. It doesn’t need external power connectors either.

          It wasn’t designed with gaming in mind. Not everyone needs a full-blown 950/960 or R7 370 for simple 2D and 3D needs.

      • ronch
      • 4 years ago

      I guess high end just isn’t as exciting as it once was, given the pace of advancement. And many of us do have a place for cheap stuff that does its tasks well enough. I just don’t know many folks who would bother shelling out for a cheap graphics card to replace their slow IGPs and still end up not being able to play games that require serious pixel pushers.

        • Krogoth
        • 4 years ago

        It is meant to be a cheap add-in card for more ports (VGA, DVI, and HDMI 1.4), or for when you need a cheap card for basic troubleshooting.

        There’s a market for such cards. That’s why it is only ~$40 USD, for goodness sake. That’s the same as a run-of-the-mill PCIe USB 3 or FireWire controller card, or a decent gigabit Ethernet PCIe x1 NIC.

          • willmore
          • 4 years ago

          So, you’re impressed?

      • meerkt
      • 4 years ago

      Answer 1: Based on the number of comments, there’s higher-than-average interest. 🙂

        • willmore
        • 4 years ago

        That’s not how this works. The articles that get the most comments are the super trivial ones and the very serious ones. The middle ground tends to get few comments.

        I would assert that this one is towards the super trivial side.

        While we’re at it, my crazy conspiracy theory is that these are actually based on die-harvested K1 chips where the ARM processors were bad.

      • Theolendras
      • 4 years ago

      You’re visibly single-minded on your enthusiast needs. It could do to convert a relatively old PC into a decent HTPC, investing $50 compared to $500+ for a new system. Buying a new system is something else entirely though; in that context I would agree completely.

      • K-L-Waster
      • 4 years ago

      Not everyone needs to buy a new system — the use case for these is to provide upgraded video outputs to existing, *older* systems that are used for light computing duty.

      If you have Skylake, you don’t need this, but if you have an old Celeron or Duron system that you just want better video outs for, this is serviceable and saves you needing to replace the rest of the system.

    • ray890
    • 4 years ago

    I love how they re-use the same model number they already used back in 2014… Why not name this the GT 910?

      • crabjokeman
      • 4 years ago

      Because it’s not a Maxwell Gen2 chip.

    • Chrispy_
    • 4 years ago

    Kudos to EVGA for making these things physically tiny, the second one pictured in particular, because so many board vendors choose to make them double-wide and half-height, which is so, so wrong for most mITX and tiny cases that have only a single-width slot.

    What I don’t understand is why they’re still using Kepler (well, clearly I do: it’s for old inventory clearance reasons), when the GM108 used in the 830M, 840M, 930M, and 940M offers literally three times the performance (3 SMMs compared to 1 SMX) of the old Kepler chip yet has a similar 25-35W TDP. As others have mentioned, there are also desirable HTPC and connectivity features missing from Kepler that were added with Maxwell V1.

    I’m pretty sure there’s a market for a more expensive product, perhaps using the latest silicon but having a premium cost associated with small form-factor, and higher-quality passive cooling. People will pay a bit more if it means that the damn thing will actually fit!

      • MOSFET
      • 4 years ago

      [quote]Kudos to EVGA for making these things physically tiny, the second one pictured in particular -[/quote]

      Exactly - the second one is interesting from a multiple monitor perspective. The word framerate doesn't belong in the discussion.

    • Meadows
    • 4 years ago

    According to the specs, this is about one third of a 750 Ti, so you should expect framerates accordingly.

      • meerkt
      • 4 years ago

      If I got it right, the 720 (which I assume is slightly faster than the 710) vs. the 750 Ti is:
      30% CUDA cores
      80% clock speed
      50% ROPs
      17% memory bandwidth

      192 cores @ 950MHz for the 710 is strange, as that would make it faster than the 720 in this regard.
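
      (For what it’s worth, those ratios line up with the commonly listed reference specs. The figures below, 192 cores @ 797 MHz, 8 ROPs, and 14.4 GB/s for the GT 720 versus 640 cores @ 1020 MHz, 16 ROPs, and 86.4 GB/s for the 750 Ti, are an assumption on my part; a quick Python sketch:)

      # Ratio of GT 720 to GTX 750 Ti on commonly listed reference specs
      gt720 = {"cores": 192, "clock_mhz": 797, "rops": 8, "bw_gbps": 14.4}
      gtx750ti = {"cores": 640, "clock_mhz": 1020, "rops": 16, "bw_gbps": 86.4}
      for key in gt720:
          print(f"{key}: {gt720[key] / gtx750ti[key]:.0%}")
      # cores: 30%, clock_mhz: 78%, rops: 50%, bw_gbps: 17%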

        • Topinio
        • 4 years ago

        The retail GT 710 is the same chip as the retail GT 720, same number of cores enabled, and is clocked faster so TDP is up from 19 W to 25 W.

        It has the same RAM configs available too.

        I have to assume that NVIDIA has gimped it in the ROPs or TMUs or both, otherwise it would out-perform its bigger sibling (though nowhere seems to be listing this).

      • Hinton
      • 4 years ago

      The 750 Ti doesn’t require a power connector either.

    • LoneWolf15
    • 4 years ago

    Make me one of these with a cut-down Maxwell 2 and I’ll buy it immediately.

    I don’t want gaming performance for the product, but I do want 8-bit/10-bit H.265 HEVC hardware decode.

    I don’t think nVidia has anything lower-end than the GTX 950 that does it though. =(

      • egon
      • 4 years ago

      Noticed so many HTPC enthusiasts interested in the GTX960 (and later the GTX950) for that reason, but those cards are otherwise such overkill :-/

      It’d be a pity if the GT 710 doesn’t fill that niche.

      • crabjokeman
      • 4 years ago

      Exactly. If I’m going to buy an HTPC card, I want the latest PureVideo/VDPAU goodness instead of a last-gen retread. I’m not sure why Nvidia struggles so much with making good cards for HTPCs.

    • mkk
    • 4 years ago

    Having used the 720 for things like expanding low-budget OEM office machines purposely limited to a single VGA port (power-brick PSU limit included), I can say this meager level of TDP makes them little heroes.

    • K-L-Waster
    • 4 years ago

    Actually had a co-worker ask me recently about addressing a problem with his Dad’s system that would be a good fit for this type of card.

    The use case is an old system owned by a person who only does light computing and whose VGA monitor has finally bit the biscuit. The motherboard is old enough that it only has a VGA out, and any replacement monitor the user buys will of course not have a VGA in.

    Either of these cards would be an inexpensive way to get a usable video output to a more modern monitor.

    • wingless
    • 4 years ago

    This could be good for my P4 retro gaming system that has a PCIe slot. Windows XP games will be… probably still slow, but playable.

      • meerkt
      • 4 years ago

      Depends on how much retro.

      More likely, anything 3D will benefit greatly from a faster card. I wouldn’t be terribly surprised if even mid- to high-end cards from Nvidia’s 7000 or 8000 series (2006-2007) are quicker.

      I also wonder if compatibility will be better with older cards, especially when combined with older drivers.

      • DrCR
      • 4 years ago

      For that purpose and ~$40, I’d look for a 9800 GT or the like on eBay.

      Edit: You may be able to lurk and snag something like a 660 or 750 for not much more too.

      • Laykun
      • 4 years ago

      Good luck finding drivers.

      EDIT: What the hell, they still make modern drivers for Windows XP?!

        • meerkt
        • 4 years ago

        Yes, and good on them. XP still has a significant enough market share, though I don’t know how many of those users game. The XP drivers are stupidly missing some basic features, like full brightness range on HDMI at standard resolutions.

          • meerkt
          • 4 years ago

          Above post downvoted because, uh, someone’s unhappy that Nvidia supports XP?

          Sheesh.

      • BobbinThreadbare
      • 4 years ago

      You’d probably be better off with a faster DX9 card tbh.

    • barich
    • 4 years ago

    I have a GT 720 in my HTPC. It’s got a Sandy Bridge i3 in it, but the motherboard doesn’t have HDMI out.

      • Concupiscence
      • 4 years ago

      I was about to ask how the 710 stacked up against the 720…

        • willmore
        • 4 years ago

        Vertically.

        • jokinin
        • 4 years ago

        I bought a passively cooled ASUS GeForce GT730 that has twice the CUDA cores (384), which makes it fast enough for older Windows XP games. To my dismay, though, it only has 64-bit DDR3 memory, which severely cripples its performance due to the very low 16GB/s memory bandwidth.

        • Topinio
        • 4 years ago

        It’s the same chip clocked faster, so TDP is higher (25 W). Same RAM configs available. It supports resolutions up to 4096×2160, up from 2560×1600, though over HDMI it’s only up to 30 Hz.

        I have to assume it has fewer ROPs or TMUs or both, but spec sheets are not listing that.
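
        (The 30 Hz cap over HDMI follows from HDMI 1.4’s roughly 340 MHz TMDS clock ceiling. A rough sanity check in Python; the ~10% blanking overhead is an assumption in the ballpark of reduced-blanking timings:)

        # Approximate pixel clock a mode needs, including blanking overhead
        def pixel_clock_mhz(width, height, refresh_hz, blanking=1.10):
            return width * height * refresh_hz * blanking / 1e6

        for hz in (30, 60):
            print(f"3840x2160 @ {hz} Hz: ~{pixel_clock_mhz(3840, 2160, hz):.0f} MHz")
        # 30 Hz: ~274 MHz, fits under the ~340 MHz HDMI 1.4 limit
        # 60 Hz: ~548 MHz, does not, hence the 30 Hz cap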

          • EndlessWaves
          • 4 years ago

          The resolution change is probably just marketing unless these are shipping with DisplayPort as standard (oh please, oh please, oh please).

          So we appear to have a config that’s faster and cheaper but with lower end branding.

          Methinks this signals the death of the GT 720. Hopefully because nVidia’s going to make the (long overdue) move of launching GM108 (830M/930M, 840M/940M) on the desktop. If we’re really lucky it’ll be a new GM209 chip with HEVC Main10 support, although if they do that it may mean we don’t get VP9 for another three years.

            • ozzuneoj
            • 4 years ago

            Yeah, nvidia really should release low end Maxwell parts. They have to run out of Kepler chips eventually, right?

            … who am I kidding, they were still using Fermi chips at the low end with the GT 705 that was used in some OEM systems.

    • ronch
    • 4 years ago

    Reminds me of the value proposition of APUs.

    Serious Gamer? Get a GTX 960 or better.

    Non-gamer? Intel IGPs are fine.

    Gamer on a budget? Save up and get a GTX 960 or better.

      • NTMBK
      • 4 years ago

      Meh, the APUs made sense a few years ago: Llano and Trinity were “good enough” in their day. Problem is, they’ve stalled since then. Richland was just Trinity with a new sticker on it, and Kaveri is too memory-bottlenecked to be any faster.

      Hopefully DDR4/HBM will spice things up again.

      • tipoo
      • 4 years ago

      Hey now, the 750 Ti is a decent “slightly better than console” entry point. The DF budget box with an i3 and a 750 Ti consistently outpaces the PS4’s visuals AND framerate.

      Granted, since “serious” is a wobbly qualifier, you could cut the line higher. But if it can run 95% of new games acceptably, I’d lump it in.

        • Chrispy_
        • 4 years ago

        There are a whole bunch of ridiculously popular games that you can actually play on an IGP, and if an APU or low-end GPU is all that someone’s case can accommodate, it’s a decent improvement that saves them forking out for a whole new machine.

      • rechicero
      • 4 years ago

      I know several people playing on APUs and very happy with the choice. Not everybody plays FPSes or wants to pay for IQ. Good enough for free is the best option for them.

        • ronch
        • 4 years ago

        They’re happy with it because they’re (a) AMD fans, (b) not demanding, or (c) happy with how ‘capable’ APUs are given their price. Satisfaction almost always takes into account how much one paid for something.

        If I paid $8,000 for a small Suzuki vehicle and it did the job well enough (drove well enough, offered enough utility and quality given the price, etc.), I’d be pretty happy with it too. But if I paid $40,000 for a small Benz, even if it also did a good job, I wouldn’t be as satisfied with it as I would be with the cheap Suzuki. After all, I paid 5x more for it, so I expect more from it.

        I reckon people are happy with their APUs because they’re cheap and cheerful, but they’re not really that good.

          • NTMBK
          • 4 years ago

          From Steam, the 5 most popular games right now are:

          881,512 Dota 2
          589,465 Counter-Strike: Global Offensive
          61,354 Team Fortress 2
          47,761 Grand Theft Auto V
          70,271 Football Manager 2016

          [url]http://store.steampowered.com/stats/[/url]

          Out of those, there is only one game (GTA V) which I wouldn't happily play on an APU. There are plenty of fun games out there which don't need a Titan X to run.

            • JustAnEngineer
            • 4 years ago

            They don’t need a useless GT710 card, either. Run ’em on integrated Intel HD530 or step up to the GTX750Ti.

          • rechicero
          • 4 years ago

          (c) seems like a pretty damn good reason to be happy. APUs are really great for people who just want to play MOBAs, MMORPGs, and the like, and don’t care much about maximizing detail. I like more eye candy, and you do as well… but they don’t care about it, and for them APUs are the best option: exactly what they need… for free.

      • Theolendras
      • 4 years ago

      I mostly agree, although I admit I might go the other way. I’m definitely building a new PC this year (who would blame me, I’m still on Thuban), and if Kaby Lake does provide an interesting SKU with eDRAM I might do a dual-stage upgrade: skip the monitor and GPU purchase for another year and still game almost any 2015 title at 1080p.

      • jessterman21
      • 4 years ago

      [quote]Gamer on a budget? Save up and get a GTX [i]950[/i] or better.[/quote]

      FTFY

    • NTMBK
    • 4 years ago

    Hi Bruno, is that actually [b]G[/b]DDR3, or is it DDR3? Other sources claimed it was the latter.

    These look kind of useful for putting in ancient machines to bring them up to date. Get a few more years out of that Core 2 Duo.

    What sort of video decode block does it have? Hardware H.264 decode?

      • chuckula
      • 4 years ago

      It’s probably GDDR3, simply because I doubt Kepler is set up to actually use plain vanilla DDR3 (DDR3 is a much different beast than GDDR3, which is actually a derivative of DDR2 and not related to DDR3 at all).

      Edit: Whoops, I stand corrected. See below.

        • morphine
        • 4 years ago

        Oddly enough, it’s actually DDR3. The article has been updated to reflect this.

          • chuckula
          • 4 years ago

          Fair Enuff! I didn’t realize that plain DDR3 in GPUs was really a thing, but these are *low* end devices.

            • NTMBK
            • 4 years ago

            Well GDDR5 is based on DDR3, so it kind of makes sense that they would design a memory controller which can handle both.

            • tipoo
            • 4 years ago

            It’s fairly common below the midrange line. Heck, the Xbox One uses it.

      • meerkt
      • 4 years ago

      Graphics cards started supporting hardware decoding of H.264 about 10 years ago. The first generation or two may have been limited in some regards, but it shouldn’t be a concern nowadays. Today the things to worry about are 4K decode, 60Hz, H.265, and the like.

      NV:
      [url]https://en.wikipedia.org/wiki/Nvidia_PureVideo[/url]

      AMD:
      [url]https://en.wikipedia.org/wiki/Unified_Video_Decoder[/url]

      • morphine
      • 4 years ago

      That is indeed correct, it’s DDR3 and not GDDR3. Thanks for the heads-up.

      Nvidia’s spec sheet lists PureVideo HD as supported, which in turn [url=http://www.nvidia.com/object/IO_43029.html]includes an H.264 decode block[/url], so you should be covered there.

        • NTMBK
        • 4 years ago

        Thanks for the info!

      • Deanjo
      • 4 years ago

      Same video decode block as the GT 510, GT 610, etc. They all have VDPAU feature set D, which allows hardware decoding of H.264 at up to 4K resolutions (no H.265 hardware decoding support).
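
      (On Linux you can verify what the decode block accepts with the stock vdpauinfo utility; here’s a minimal sketch that filters its report for H.264 entries. It assumes vdpauinfo is installed and Nvidia’s VDPAU driver is active:)

      import subprocess

      # Print only the H.264 decoder capability lines from vdpauinfo's report
      report = subprocess.run(["vdpauinfo"], capture_output=True, text=True).stdout
      for line in report.splitlines():
          if "H264" in line:
              print(line)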

    • EzioAs
    • 4 years ago

    Do these things really offer better performance than newer iGPUs, or do they simply exist to provide more display outputs?

      • chuckula
      • 4 years ago

      Compared to a high-end IGP from AMD or Intel: I wouldn’t expect these things to win. Compared to a low-end IGP, they could win.

      But it kind of doesn’t matter, since high-performance graphics are like a step function: either your solution works well enough to be usable or it fails. A GT 710 getting 13 FPS vs. an integrated part at 10 FPS might “win” in a technical sense, but both parts fail in a real-world sense.

        • Pitabred
        • 4 years ago

        But they said it’s 80% faster! 18 FPS is totally playable, right? Even if you do have to run at a resolution and detail level where you can barely tell a building from a tree from an orc…

      • BobbinThreadbare
      • 4 years ago

      It gives you more ports, and it gives you modern drivers with modern features (video decoding, support for newer OpenGL and DirectX versions for things like AutoCAD, etc.).

      There are situations where you don’t need fast 3D acceleration, but you do need modern 3D acceleration.

        • ozzuneoj
        • 4 years ago

        Cards like these aren’t for brand new systems with new IGPs. They are for older systems with plenty of CPU power but outdated IGPs.

        A card like this will add a ton of multimedia capabilities over an Intel GMA or GeForce 6150 that is crippling an otherwise usable Core 2 or Athlon II system.

          • BloodSoul
          • 4 years ago

            Thank you for seeing how this card can actually be used… I’ve come across a number of people who seem to think this market segment is worthless, but they are failing to see that not everyone who needs a cheap graphics solution has a Core i7.

            • MathMan
            • 4 years ago

            I have an older i7 without an iGPU. This card would be good enough to drive a 4K monitor for programming. No need for anything fancy in 3D.

            • Krogoth
            • 4 years ago

            Exactly. It is so strange that a large number of people here assume that discrete cards are only good for gaming.
