The Skylake Core i3-6320 is the gamer’s new best friend

Today brings the official introduction of Intel's broad lineup of Skylake processors. This is the launch that brings us more affordable chips to go along with the Core i7-6700K and i5-6600K, which were released ahead of the rest of the lineup.

As a result, we now know how the entire lineup of socketed desktop Skylake processors looks. There are a bunch of them beyond the two K-series parts, and they all have power ratings of 65W or less. Here's a look at the Core i3, i5, and i7 models.

That's a lot to take in, and I'm not sure what to make of each and every product. My general sense is that Intel would be better off presenting a simpler set of choices to consumers rather than tailoring chips for so many different fine gradations of price, speed, and features.

I think many PC enthusiasts will find the unlocked Core i5-6600K to be a better option than anything in the table above. That quad-core chip has a 91W power envelope and a 3.9GHz Turbo peak. For $242, it's more attractive to me than any of the regular Core i5 or i7 offerings.

If you're a PC gamer on a tight budget, though, I think you should take a long, hard look at the Core i3-6320 for $157. This chip has two cores and four threads, which is probably optimal for most of today's games. We've seen games take advantage of four threads in many cases, but they rarely seem to care whether chips are "real" quad-cores or just dual-core parts with four hardware threads. And heck, Skylake has improved Hyper-Threading compared to past generations.
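For the curious, a chip like this presents itself to the OS as four logical processors backed by two physical cores. Here's a minimal sketch of how you could check that split yourself, using Python's psutil package (my illustration, not anything Intel ships):

    import psutil

    # On a 2C/4T part like the i3-6320, the OS schedules across four
    # logical CPUs, but only two physical cores sit underneath them.
    print("Logical CPUs:  ", psutil.cpu_count(logical=True))   # expect 4 on this chip
    print("Physical cores:", psutil.cpu_count(logical=False))  # expect 2 on this chip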

At 3.9GHz, the i3-6320's two Skylake cores should offer plenty of peak per-thread performance, which makes this chip my Amdahl's Law special of the month. This CPU looks to be ideal for a mid-range gaming build. Unless you're going to be doing video encoding or something like that, you'll probably rarely miss the additional cores.
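If you haven't run into Amdahl's Law before, the idea is simple: the speedup you get from adding cores is capped by whatever fraction of the work can't be parallelized, so a fast pair of cores can go a surprisingly long way. Here's a minimal sketch in Python; the 60% parallel fraction is a made-up figure purely for illustration:

    # Amdahl's Law: speedup on n cores when a fraction p of the work is parallel.
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    # If, say, 60% of a game's per-frame work parallelizes (again, made up),
    # doubling from two cores to four buys only ~27% more speed...
    print(amdahl_speedup(0.6, 2))  # ~1.43x over one core
    print(amdahl_speedup(0.6, 4))  # ~1.82x over one core
    # ...while a higher clock speeds up the serial 40% of the work, too.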

Since this is a Core i3, there's no Turbo involved, either. Both cores just run at 3.9GHz as needed. Also, the i3-6320 has a seriously skimpy 47W power envelope, so you could probably get by with a wimpy stock cooler and a lower-capacity power supply than you'd need for something with more cores.

For what it's worth, I'd like very much to back up my analysis with some testing, but Intel turned down our request for anything other than a 6700K for review. We'll have to consider doing a full review once the new Skylake desktop chips are available for purchase on the open market.

Speaking of good CPUs for gaming, you'll notice in the table above that there are no socketed Skylake processors with eDRAM onboard. That's a whole other story, which I've written about right here.

That said, there are more Skylake-derived parts coming out today. Here are the budget Pentiums:

These are interesting mainly in the sense that they might replace our beloved Pentium G3258 Anniversary Edition, yet no successor to that unlocked overclocking dynamo is being introduced today, and we have no indication of future plans for an unlocked Skylake Pentium, either.

Those folks building home-theater PCs or mini-ITX systems may be interested in the T-series parts, which are low-power variants of Intel's socketed desktop CPUs.

The 6700T squeezes an awful lot of CPU horsepower into a 35W envelope. It ain't cheap, though.

By the way, although the specs tables above come from Intel, I've clipped out some of the columns so that they'll fit our format. Just know that all of the socketed Skylake chips support two memory types: DDR4 at 2133 MT/s and DDR3L at 1600 MT/s.

Skylake processors and systems based on them are slated to be available in Asia starting today, and Intel expects Skylake products to become available in North America "over the next six weeks." Two socketed desktop parts have been shipping in limited volume in the U.S. for several weeks, but availability has been spotty. Hopefully, that situation will be remedied as the full range of Skylake parts floods the market.

Comments closed
    • bintsmok
    • 5 years ago

    Based on my tests, the Core i7 and Core i5 have much better 99th-percentile frame times than a highly-clocked Core i3 when it comes to CPU-intensive games. See the links below.

    http://www.overclock.net/a/intel-core-i3-vs-core-i5-vs-core-i7-gaming-performance-with-geforce-gtx970
    http://tipidpc.com/viewtopic.php?tid=299752&page=1

    • NapTownOC
    • 5 years ago

    Question about binning: the 6320 is clocked at 3.9GHz and the 6100 at 3.7GHz. Is one going to be theoretically more OCable than the other? Is there a reason the 6100s weren't sold as 6320s, like they just couldn't hit 3.9GHz stable? Any more info or theories on this would be appreciated!

    • raddude9
    • 5 years ago

    The "Core i3-6320 is the gamer's new best friend"... really? What is that based on? Where are the reviews? The i3-6100 is only 5% slower at peak clock speed, but the 6320 costs $40 (30%) more. Could that speed difference even be measured in fps? And wouldn't that $40 be better spent on a better GPU? On the other side, spend $50 more and you have a machine with 4 real cores. Wouldn't a real 4-core CPU give you much better longevity in the face of DirectX 12? Seriously, of all the chips that were released here, the i3-6320 would be well down the list of CPUs that I'd personally recommend people buy.

      • jessterman21
      • 5 years ago

      Uh oh, Damage, now you have to get an i3-6100 and a 6320 to compare, or readers will grab their pitchforks again. And don't you dare test Project Cars!

    • brothergc
    • 5 years ago

    I am looking very hard at that i3-6320 and maybe an H170 chipset board. What do you think? Good budget build?

    • brothergc
    • 5 years ago

    “For what it’s worth, I’d like very much to back up my analysis with some testing, but Intel turned down our request for anything other than a 6700K for review”
    Wonder why that is? Is Intel afraid that it would hurt their higher-end CPU sales? Hmmm, just wondering.

    • captaintrav
    • 5 years ago

    I've been using i3s for gaming on a budget since Sandy Bridge. You're not going to be CPU-bound if you skimp on the graphics card anyhow. Instead, save a few bucks by going i3 instead of i5, and spend that on an SSD and/or a better graphics card. On a tight budget, you'd be choosing between an i5 and a 750 Ti or an i3 and a GTX 960. Big difference, unless you want to game at lower resolution/settings. Of course, this assumes a tight budget and gaming as the primary use. Wish someone would do an honest comparison, but I suspect many games nowadays probably don't see that much performance gain from 4 cores; heck, some might even lose performance if you cheap out and get a slower-clocked i5.

    edit: here’s a quick result I found.

    http://www.techspot.com/review/972-intel-core-i3-vs-i5-vs-i7/page5.html

    TL;DR is, with a GTX 960, there's little to no difference between i3/i5/i7 Haswell chips in 1080p gaming at max details in most titles. A GTX 980 changes the picture. I'd like to see TR do inside-the-second testing like this, though, but the fps looks to change little.

      • Ninjitsu
      • 5 years ago

      The games TechSpot is testing aren't really CPU-bound, but I wholeheartedly agree with your TL;DR conclusion.

      • chuckula
      • 5 years ago

      To expand on this, one big purported draw of DX12 and Vulkan is that they reduce CPU overhead and include better threading. That should benefit both low-core-count parts like the i3 and high-core-count parts like quads with Hyper-Threading or the big six- and eight-core parts.

      From what we've seen so far, the biggest real-world gain has actually been in the lower-core-count parts like the i3, since the bigger CPUs were already not particularly constrained by DX11 anyway. Hence, despite all the noise about how you need moar coars to be future-proof, the ironic effect is that DX12 and Vulkan should make having a chip like the i3 *more* desirable going forward.

    • jessterman21
    • 5 years ago

    Thank you Scott. I mentioned a 4GHz Skylake i3 as the definitive budget gaming CPU when the i7 was reviewed and got downvoted off the page…

    That i3-6100 slots nicely into the basic $600 console-killer build I’ve been evangelizing to my friends and family.

    • windwalker
    • 5 years ago

    I’m curious as to how one would choose the Core i3-6320 as the best deal.
    It seems to me that the G4400, i3-6100 and i5-6600 should be the most cost effective options.

    • Flapdrol
    • 5 years ago

    Is BCLK overclocking possible on these? Or did they lock it down?

      • Damage
      • 5 years ago

      Pretty sure it’s locked down on anything that’s not a K-series.

        • ronch
        • 5 years ago

        And AFAIK Intel has never sold unlocked dual cores except for the Pentium Anniv Edition (PAE for short).

        • Flapdrol
        • 5 years ago

        too bad

        • Ninjitsu
        • 5 years ago

        Do check, though…just in case they let it through but didn’t talk about it.

        • sleepygeepy
        • 5 years ago

        Just curious…

        If we paired the non K-series Core i5-6400 with a Z170 motherboard, would overclocking the processor be possible by just raising the BCLK frequency?

        The multipliers used by the Core i5-6400 seem to have a wide gap, ranging from 27x to 33x, compared to the i5-6500 and i5-6600 models, which are next in line.

        I just find it strange that it has the lowest base multiplier among the desktop processors, including the Core i3 and Pentium models.

        • Krogoth
        • 5 years ago

        I thought BCLK overclocking was tied to the chipset, not the CPU.

      • DKL
      • 5 years ago

      It is possible, although it requires a very roundabout method!
      http://forums.elitegamingcomputers.com/index.php?/topic/4081-skylake-unlocked-with-undisclosed-blck-mod/

    • LoneWolf15
    • 5 years ago

    I was seriously lamenting the fact that I just bought a used i5-4590S/ASUS mITX mainboard combo to update my HTPC after reading this.

    Then I saw that none of them have iGPUs worth squat, and felt happy. If they don’t have Iris, no need for me to be interested. HD4600 has hybrid HEVC decode, which will be enough for the foreseeable future with four cores.

    • maroon1
    • 5 years ago

    The Pentium G4500 & G4520 have HD 530?! So that means the iGPU is comparable to the i3 parts and much better than the Haswell Pentium parts.

    The Core i3 also looks impressive, with higher clock speeds than Haswell and only a 47W TDP.

      • divide_by_zero
      • 5 years ago

      My thinking exactly. w00t, time for an HTPC refresh!

    • Ninjitsu
    • 5 years ago

    So I would have said that the i3-6100 is the real steal here, but they've lopped off 1MB of L3. Not sure how much of a difference that'll make, or whether it justifies the additional $40 premium for the 6320. The 6320 is also fairly close in price to the i5-6400; I wonder how it compares.

    Is BCLK overclocking available on these chips, I wonder?

    I don't fully understand the point of the 6400 either, compared to the 6600T, except that it's cheaper. Knowing Intel, that's probably the point! 😀

      • eofpi
      • 5 years ago

      Benchmarks will say how much the cut-down L3 matters, but if Intel hadn’t twiddled that knob (or if it doesn’t matter much in games), 25% less cost for 5% less speed would be quite a bargain.

      Funnily enough, the price difference between the 6600T and 6400 is the same as the electricity bill difference for running the chip full-tilt for a year at 10 cents per kW-hr.
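      Back-of-the-envelope, treating the TDP gap as sustained full-load draw (a rough assumption; real consumption varies):

          # 6400 (65W TDP) vs. 6600T (35W TDP), run flat-out all year at $0.10/kWh.
          watts_saved = 65 - 35
          kwh_per_year = watts_saved * 24 * 365 / 1000  # ~263 kWh
          print(kwh_per_year * 0.10)                    # ~$26 per year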

      But the chips that really make me wonder are the 6500, 6300, 6500T, and G4500. Each is close in performance and price to another chip (usually the next one up, but sometimes in both directions). Does Intel really move enough of them to justify their SKUs?

        • Ninjitsu
        • 5 years ago

        "Does Intel really move enough of them to justify their SKUs?"

        Probably OEMs look to save as much as possible, or have convoluted requirements that us mortals can't fathom. 😀

      • Krogoth
      • 5 years ago

      BCLK overclocking is possible with all Skylake silicon, since the clock generator isn't on the CPU die and isn't tied to anything else.

      The only caveats are how much overclocking the motherboard can handle, and that you'll need DDR4 sticks rated above stock speeds to keep up.

    • tsk
    • 5 years ago

    Now I want an Android tablet with a Core M. Make it happen, Dell, or suffer the consequences.

    • DarkMikaru
    • 5 years ago

    This ever-so-faithful AMD fanboy just threw down $249 bones on Newegg for a shiny new Intel Core i5-6600K, and boy do I feel dirty. See.... what had happened was.....

    I came home from a recent business trip, powered on my beloved 8350, and my APC battery backup alarm tripped on; Windows said it couldn't load because it was corrupted, and my baby wouldn't boot anymore. Since I travel for work constantly, I just didn't have time to troubleshoot, as I had to pack and leave first thing Monday morning. A few days earlier, as a Newegg EggXpert, I was offered the chance to review the Gigabyte GA-Z170X-Gaming 7 LGA 1151 motherboard. As long as we review it in time, we get to keep it. So with basically 1/3 of the cost of going Intel absorbed, I thought: hell, why not. I clearly need a new system (until I can figure out what the freak happened), and if I don't like it I can still sell it and make my money back.

    Now that I see these cheaper i3s on the horizon, I kinda wish I had waited, as all I needed to get was the CPU & RAM to review the board. But I might keep the 6600K; we'll see. Sorry Ronch, I wasn't planning on upgrading, it just kinda happened. I plan to find out what happened to the 8350, as it will bother me until I do. But if I like the 6600K, I'm not sure what I'd repurpose it for later.

      • ronch
      • 5 years ago

      It wasn't your fault, so there's no need to apologize (why do I get the funny feeling people associate me with the FX?). Enjoy your new CPU. 🙂

      • ronch
      • 5 years ago

      Probably a dying mobo. That’s what happened to me. MSI 990FXA-GD65. System started to show drive read errors. Thought it was my SSD. Turned out, the mobo was slowly dying until it failed to power on altogether. Replaced it with a Gigabyte 990FXA-UD3 rev. 4. Works like a champ.

    • guardianl
    • 5 years ago

    Given that both Dragon Age: Inquisition and Far Cry 4 fail to start up on dual-core CPUs (although they will work on the hyper-threaded i3s), I think the dual-core era is finally being shoved out. Intel might be holding on for market segmentation reasons, but I expect we'll see more games force the issue with this generation of consoles.

    Spending 19% more on the i5 seems like the better choice, although the high clock of the i3 is interesting.

      • DarkMikaru
      • 5 years ago

      Yeah, but you're forgetting that most PC users probably don't care whether Dragon Age will run or not. Dual core is still plenty for the masses, and that is what those chips are probably all about.

      Especially for the OEMs, who are just trying to crank out bang-for-the-buck systems. These i3s & Pentiums will sell like hotcakes.

      • Ninjitsu
      • 5 years ago

      Dual core is plenty for the non-gaming office-work crowd.

      • Flapdrol
      • 5 years ago

      Those games were patched; they work fine now.

    • ronch
    • 5 years ago

    That’s one fast dual-core CPU. However, at $157 boxed, it’s inching too close to Core i5 territory. And for those who also look on the AMD side, the FX-8350 at $170 looks a little tempting. Yes, yes, I know what you’re gonna say, but really, the choice between a 2-core/4-thread chip for $157 or an 8-core for $170 will probably make most buyers think for 5 minutes. With most newer games using 4 or more threads and with the FX surely able to handle any game that’s old enough to still be using 1 or 2 threads, the FX does present a credible alternative even though it’s getting old.

      • FuturePastNow
      • 5 years ago

      I think the FX would be a no-brainer for a budget build if it were more power efficient. And had more modern motherboard options.

        • ronch
        • 5 years ago

        Honestly though, if you're the kind of buyer who would go for an i3 instead of an i7, I don't think you'd go for a high-end graphics card such as a GTX 970 either. More likely, you'd be getting something like a 260X or 750 Ti. With that level of graphics, the FX should do about as well as an i3 or i5, and so PCIe 2.0 (instead of PCIe 3.0) probably won't hold you back.

        Also, while M.2 and all those new storage options look compelling, I don't think they're noticeably faster than SATA SSDs in real-world use. I'm not sure if they carry a premium in terms of cost per GB, but if they do, the budget buyer probably wouldn't go for them either. Of course, he can always go for a PCIe-based M.2 host card if he wants to get an M.2 drive today for future-proofing (like the Silverstone card recently featured here at TR), so not having an M.2 slot on the motherboard may not be such a big deal. Maybe he'll have some trouble booting off a PCIe card with an M.2 SSD on it, but personally I don't see getting a SATA SSD instead of an M.2 SSD as something to sulk over.

          • w76
          • 5 years ago

          The CPU and GPU stuff aside, I really have to disagree on the M.2 and other motherboard/platform features part. Seriously, look how long Sandy Bridge systems have remained viable — they still are today, frankly. CPU development has stopped moving forward in leaps and bounds, for <insert favorite reasons here>. Why on Earth would you start life with a platform that, on Day 1, is utterly obsolete? Why not get a platform that has some capacity to adopt upcoming standards like M.2 as time goes on? You're talking about a budget build, but ignoring the platform is being penny-wise, pound-foolish to a huge degree, IMO.

        • DarkMikaru
        • 5 years ago

        AMD vs Intel – Does AMD really wreck your power bill?

        https://www.youtube.com/watch?v=fBeeGHozSY0

        Interesting video.... in the end, AMD doesn't cost you more than Intel on power. Not enough to care at all. EVER.

          • ronch
          • 5 years ago

          That was hilarious!

          • Glix
          • 5 years ago

          And for those that don’t pay their electricity supplier in $?

          Power matters, don’t pretend that electricity is free.
          It makes a difference, maybe not for you, but for me it does.

          Budget builds have to take into account all aspects. Otherwise it isn’t a budget build?

    • anotherengineer
    • 5 years ago

    “The Skylake Core i3-6320 is the gamer’s new best friend”

    And the G4520 looks like the budget gamer’s new best friend at 60% less!!

      • mczak
      • 5 years ago

      Two threads is a bit limiting at times. Plus, the Pentiums still miss out on AVX/AVX2.
      That said, the big news (if it's true) with the Pentiums is that (except for the lowest model) they now seem to come with GT2 graphics, as evidenced by having the same HD 530 moniker attached to them as their higher-end siblings. That would be a big GPU performance increase for them (and quite a threat to AMD's chips, I might add).

        • anotherengineer
        • 5 years ago

        True, but if you’re on a tight budget, the graphics card still might end up being the bottleneck anyway.

    • Bensam123
    • 5 years ago

    The 8300 for $115. I've seen them as low as $85. They come with an unlocked multiplier, turbo to 3.6GHz, and easily hit 4.2GHz with a bit of lax OCing. Still my number-one recommended budget-build processor.

    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9494387&CatId=11857

    You can argue that it's antiquated, but for a budget build an effective 'quad'-core chip is pretty boss, especially when it can end up cheaper.

      • auxy
      • 5 years ago

      Platform latency is so high, though. It's difficult to suggest an FX chip when even a Pentium will give better performance in the overwhelming majority of games that don't care that it only has two threads. ( ;∀;)

        • Bensam123
        • 5 years ago

        Most games have been multithreaded for the last decade. Not sure what you're talking about as far as 'platform latency'.

        Overall use of a system isn't just playing games. The fact that a two-threaded processor can 'make do' and perform on par in games that aren't coded very well does not mean all games will be coded poorly. It also doesn't detract from the 8300.

          • chuckula
          • 5 years ago

          "Most games have been multithreaded for the last decade. Not sure what you're talking about as far as 'platform latency'."

          SURE! Let's look at how Ashes of the Singularity -- ya know, that DX12 masterpiece that Oxide made with AMD's direct involvement -- shows off AMD's amazing 8-core FX powers:

          http://www.pcper.com/image/view/60354?return=node%2F63600

          Yes kids, that's right! Using a GCN video card with AMD's full Mantle, er, DX12 stack, an i3-4330 *flat out beats* a 4.3GHz 8-core FX part. So games really are multi-threaded... in as much as they work a LOT better when the chip has two or more real cores.

            • Bensam123
            • 5 years ago

            As I mentioned below, an i3-4330 is $20-50 more expensive than an 8300 on a budget build. An overall system involves more than just games, and it involves more than one game.

            You're rocking the sensationalism tonight, though. It seems as though I'm supposed to ooh and aah at Ashes for some reason. Speaking of kids, I'm apparently not up on the new hotness. Am I supposed to be arguing about Ashes when it comes to budget builds? You really seem to want me to do that for some reason, and I'm not sure why.

            • chuckula
            • 5 years ago

            As I mention right now, the i3-4330 is easily $50-60 more *valuable* than the higher-end FX-8370 that it flat-out beat in an AMD-sponsored game benchmark. Let's not even get into the rest of the gaming world, where AMD didn't sponsor the development. Considering that the downclocked 8300 is even worse than the flagship 8370, the only logical conclusion is that the 8300 is an even worse purchasing idea than its more expensive sibling. You seem to be very keen to know the price of everything and the value of nothing.

            • Ninjitsu
            • 5 years ago

            But the i3-6100 is just $2 more expensive, now…

            • Bensam123
            • 5 years ago

            And you can find the 8300 even cheaper depending on when you check…

            Although it would be interesting to see a comparison of the two.

            • chuckula
            • 5 years ago

            Neither Newegg nor Amazon even directly carry this magical FX-8300. The best I could find were some sketchy affiliate links to used chips.

            [Edit: Oh yeah, a used FX-8300… for $140 after S&H… who's ripping people off exactly? http://www.amazon.com/AMD-FX-8300-FD8300WMW8KHK-Processor-938-pin/dp/B00P7OPODC/ref=sr_1_2?ie=UTF8&qid=1441208502&sr=8-2&keywords=FX-8300 ]

            But then again, you already knew that you were lying through your teeth, since you made that post about the magical prices on the magical 8300 but then intentionally failed to post a single link.

            If you want to look at used chips, I could find a great deal on a used Sandy Bridge part that would annihilate these toys.

            • Bensam123
            • 5 years ago

            It's from TigerDirect; I've posted the link about five times in the thread, including in the opening post. I don't know why it matters that there is only one supplier. TigerDirect is definitely a reputable site, and they constantly have them in supply (or the 8310, which is sometimes found for the same price).

            TR first pointed out this chip on one of their Friday deals.

          • auxy
          • 5 years ago

          "Most games"? Really? You are so divorced from reality it is brain-bending. (´Д⊂ヽ Maybe if we're talking about two threads…

          I was kinda sad when you got suspended or whatever but now I’m not sure it was worth the concern.

          David Kanter said it best on the TechReport livestream: "The only thing that matters for client workloads is latency."

          The FX series are server processors designed to handle wide, heavy loads. They have poor memory performance, poor floating-point throughput, and yes, high overall platform latency due to the aged HyperTransport interconnect and extremely long CPU pipeline. They're just not suited for desktop PC tasks, and especially unsuited for gaming, which overwhelmingly still prefers two fast cores over many slower ones: http://www.anandtech.com/show/8774/intel-haswell-low-power-cpu-review-core-i3-4130t-i5-4570s-and-i7-4790s-tested/5

          I love AMD as much as any rational person, but recommending the $115 FX-8300 just doesn't make sense unless someone is reeeally into video encoding and has a ludicrously tight budget. And even then, you've still got to buy a graphics card of some description, which puts you in Core i3 territory... and a modern Core i3 (like one of these Skylake chips) will do that as fast or faster anyway, whether CPU or QuickSync. Didn't you, yourself, move from an FX to an Intel rig to resolve issues with encoding while streaming?

          [edit] Oh, and here's the list of new games on Steam: http://store.steampowered.com/search/?sort_by=Released_DESC#sort_by=Released_DESC&category1=998&page=1 You wanna go through this list and tell me that 'most games' are multithreaded? This is stuff that came out today, mind you; there's a whole PAGE of stuff that came out today and yesterday. You think Ballistic Overkill uses four threads, with that Core 2 Duo CPU recommendation? What about Exile's End, with its 32-bit 2D retro aesthetic? Hmm? What a load of crap.

            • Bensam123
            • 5 years ago

            Is there a measure of this latency as an influence on video games?

            Budget builds are all about cutting corners and getting the most bang for your buck. There are tons of threads that run while you're in Windows. The whole OS is multithreaded.

            There are budget graphics cards as well. You could throw the extra $100 you'd spend upgrading to a quad core at a better graphics card, or less than that. We're operating under the assumption that an 8300 is a completely inferior chip, which it's not… If it holds its own against a two-core chip AND it's better at common desktop workloads and everyday use, it's win-win. Just because an i3 would be better at some games does not make it an overall better selection.

            Auxy, open up Resource Monitor > CPU tab, thread count. H1Z1, just as a game I have open, has 55 threads running. You can see the game being balanced across multiple processors as well if you select the process.
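            If you'd rather script it than click through Resource Monitor, here's a rough equivalent using Python's psutil (the target name is just whatever your game's executable happens to be called):

                import psutil

                # Print the OS thread count for any running process whose name matches.
                target = "H1Z1"  # substitute your game's executable name
                for proc in psutil.process_iter(['name', 'num_threads']):
                    name = proc.info['name'] or ''
                    if target.lower() in name.lower():
                        print(name, proc.info['num_threads'])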

            I switched from an 8350 to a 4690K as I was playing Planetside 2, which favors single-threaded performance quite a bit. The Forgelight engine is a pile of garbage, and right now it suffers from a massive memory leak in H1Z1 that DBG hasn't been able to plug for the last half a year. The only option was to throw more hardware at it.

            …and I STILL ended up having to buy a capture PC because it didn’t fix my problems completely.

            And please stick to mature comments. Anyone can insult someone else and act condescending.

            Not sure why you’re bringing up my suspension over a year ago unless this is some backhanded attempt at trying to twist my hand, in which case you can shove it.

            • Bensam123
            • 5 years ago

            Just to stipulate, as Laykun has pointed out, there is a difference in terminology being used here. Auxy hasn't once talked about frame time; she has just referred to 'latency', which I interpreted as time from input to execution.

            That aside all my prior arguments still apply.

          • travbrad
          • 5 years ago

          Most games have been multi-threaded, but that usually only means fully taking advantage of 2-4 threads, not 8. Even games that do use 8 threads often just offload small parts of the workload to most of those threads, while a few main threads do all of the real work.
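          Roughly the pattern I mean, as a toy Python sketch (not real engine code): one main thread does the heavy per-frame work while a small pool picks up bite-sized jobs.

              import time
              from concurrent.futures import ThreadPoolExecutor

              def small_job(i):
                  time.sleep(0.001)  # stand-in for a light task (audio, streaming, etc.)
                  return i

              with ThreadPoolExecutor(max_workers=3) as pool:
                  for frame in range(3):
                      futures = [pool.submit(small_job, i) for i in range(8)]
                      time.sleep(0.016)  # stand-in for the main thread's heavy frame work
                      results = [f.result() for f in futures]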

        • Krogoth
        • 5 years ago

        It makes no difference.

        The CPU hasn't mattered that much for gaming for a long time. The games that are typically CPU-bound (mainly strategy games in their late stages) crawl on both AMD and Intel platforms.

        The real reason to go to the Intel side is that their platform is more up to date (PCIe 3.0 and NVMe) and is easier on power consumption (cooler/quieter system).

        Latency is a non-issue for either chip. The human element and I/O (newer games tend to stream their data constantly) are far bigger factors in "latency" than the CPU/memory.

          • auxy
          • 5 years ago

          I like that you're posting on techreport.com, a website famous in large part for breaking the story that even minimum frame rates don't tell the whole story, and yet you're still going to sit here and tell me that CPUs don't matter.

          Even if there were no difference in average frame rate — and there is — tests on this very website have shown a dramatic difference between AMD FX processors and Intel's chips in latency-focused frame-time metrics, like the 99th-percentile charts in, oh, every article.

          Repeating some old canard about how the CPU doesn't matter for gaming on this website in the second half of 2015 is just kind of ridiculous. (·ヘ·)

            • Krogoth
            • 5 years ago

            They don't make a difference for the actual user experience. The GPU has a far larger influence than the CPU in the majority of modern games, unless you like to game at 1024×768 or lower, in which case even the "lowly" AMD chips are capable of pumping out 200+ FPS and still staying under 16.7ms in the worst case.

            The tests are purely academic exercises that are meant to show differences between the platforms on paper, but in practice you couldn't tell the difference in a double-blind test, assuming the application isn't GPU-bound.

            In the instances where the CPU is clearly the bottleneck, both platforms struggle under the same conditions.

            • auxy
            • 5 years ago

            Stop wasting my time with your idiotic and transparent mental gymnastics, you cretin. "The GPU has a far larger influence" is not the same as "the CPU does not matter". God, I hate people like you who try to twist words. CPU single-threaded performance is very important to a fluid gaming experience. I HAVE all this hardware myself; it's clear as day.

            Go read this article again, then come back and tell me the same crap: https://techreport.com/review_full/23246/inside-the-second-gaming-performance-with-today-cpus

            Better yet, tell Damage that same crap. That last five percent of frames being markedly worse makes a HUGE difference in subjective performance, as the game stutters and hitches visibly more often. Never mind that the average framerates are just total crap comparatively; the FX-8370 gets stomped on by an i7-2600K.

            You can drink the kool-aid all you want, but don't piss on my face and tell me it's raining, and I'm not gonna stand by and let you crap in somebody else's mouth and tell them it's chocolate. Whatever your agenda is, you're spewing misinformation everywhere. Take it somewhere where people don't know better already.

            • Laykun
            • 5 years ago

            Although your attitude is toxic (seriously, no need to call people names), as an engine programmer I can certainly attest to single-core performance being a real blocker in games, and I agree with you on that point. Sure, you can multi-thread a game, but usually you can only multi-thread something that is easily made parallel; sometimes there are tasks that just need to run in serial fashion, and those really choke on machines with low per-core performance. I think the worst example I have seen lately is the ArmA series / DayZ: the view-space culling is incredibly bad, and a simple increase in FOV can crush your frame rates as a single core chokes while it iterates an object list. From what I heard, Dying Light was a pretty bad example too.

            Semantics-wise, I think the use of the word "latency" is a poor choice in this scenario; you'd be looking more at "frame time", as a rendered frame may not be latent/behind input (given appropriate input prediction).

            • Krogoth
            • 5 years ago

            The problem is that in the situations where it does become an issue, both the latest Intel and AMD chips struggle. You have something that shudders and something that shudders a little more. It is jarring for those who want a smooth experience.

            Otherwise, the GPU is the bottleneck in most cases, especially if you want to game at 4 megapixels with all of the works.

            In summary, if you're going to budget a gaming rig, you should prioritize GPU performance over CPU performance.

            • Bensam123
            • 5 years ago

            Yup, I thought every time Auxy mentioned 'latency' she was referring to input-to-output on the PC, not frame time, as she hasn't once referred to it as frame time or pointed to frame-time benchmarks.

            That aside, there are plenty of games that aren't that dependent on single-threaded performance, like Battlefield for instance, and then it all comes down to the GPU.

            • Ninjitsu
            • 5 years ago

            Why so aggressive, auxy? Krogoth isn’t right but s/he’s not calling you names, either.

            • auxy
            • 5 years ago

            Because I'm TIRED of this crap, all over the internet! I'm tired of hearing the same old tired bull... Stierscheisse dragged out over and over and OVER! I have had it up to my sinuses with "common knowledge" and "conventional wisdom" based on crap from ten years ago, and I'm drowning in self-absorbed narcissists who can't even be bothered to admit when they've made a mistake! My patience is utterly exhausted with people making authoritative assertions based on half-truths and half-misinterpreted data. I am completely fed up with dishonesty and misinformation, and I have zero tolerance left for people who don't have a rational, balanced viewpoint. DON'T make judgments until you have ALL the facts and NEVER express your viewpoint authoritatively unless you can bring out the data to back it up, and if you CAN'T do that, then SHUT THE HELL UP! \( ; Д ;)ノシ

            I'm just really tired, Ninjitsu. I'm tired of being sick [I have chronic health issues], I'm sick of being tired. I'm tired and I'm cold, and I want to go to bed, but there's no-one here to tuck me in: https://www.youtube.com/watch?v=8eLvkqKijeQ

            (Don't worry, all the shotguns are downstairs in the armory. Listen to the song if this seems like a weirdly dark non-sequitur.)

            • Ninjitsu
            • 5 years ago

            😐

            • Krogoth
            • 5 years ago

            It is because it has been *true* for the last 10-15 years. The CPU's importance in the end-user experience for mainstream usage patterns and gaming has diminished. The GPU matters far more, which is why, when you are budgeting a gaming system, you always try to get the fastest GPU that your budget can permit before you start focusing on the CPU.

            Even when you get to the CPU, you only need a half-decent chip (lower- or mid-range) for the overwhelming majority of the games out there. High-end chips don't yield that much of an improvement. That's why Intel and AMD have shifted their CPU markets to reflect this. The performance chips from both camps are catered towards HPC, workstations, and servers. Extreme Edition chips are just rebranded workstation-tier stuff that removes ECC support but is completely unlocked.

            I've been around the scene for over 20 years. I remember back when the CPU was the absolute king of gaming performance and there was a massive world of difference between a Cyrix 6x86, a K6, and a Pentium II. I remember when the Athlon and Pentium III kept leap-frogging each other and completely smashing their predecessors in mainstream and gaming usage patterns.

            • Bensam123
            • 5 years ago

            I agree… I expected more from Auxy than another Chuckie asshole. So much for respect.

            • Krogoth
            • 5 years ago

            Great, another FPS-junkie that is absolutely convinced that 100FPS+ rendering matters for the end-user experience. >_<

            Placebo effect is strong with this one.

            • Freon
            • 5 years ago

            You didn’t follow his link, did you?

            • Krogoth
            • 5 years ago

            Already read it back when it was new.

            It pretty much confirms that CPUs don't matter as much as the GPU where gaming performance is concerned, and that the difference between the platforms isn't that earth-shattering unless you care about pumping out as many frames as possible and want to reduce the occasional shudder as much as possible.

            The only crowd who cares about that? FPS junkies that are still convinced that a 120FPS frame rate makes a noticeable difference. A vocal minority. I'm willing to wager that the majority of PC gamers wouldn't be able to tell the difference between platforms in a *double-blind test*. The story doesn't change with the current generation of silicon.

            • tfp
            • 5 years ago

            So a 50-100% increase in FPS from changing the CPU is not worthwhile?

            http://www.pcper.com/image/view/60354?return=node%2F63600

            The standard Krogoth response is only funny when it isn't completely wrong.

            • Krogoth
            • 5 years ago

            That's a strategy game in its end-game stages, which is where the CPU matters, but even on a Skylake it is not exactly a "smooth" experience. ~40FPS versus ~25-30FPS (on lower-end stuff) is hardly earth-shattering.

            You see similar results elsewhere in Supreme Commander, Civilization V, Galactic Civilizations II, Planetary Annihilation, etc.

            • Bensam123
            • 5 years ago

            Are we reading the same site? The i3 (which is what we're comparing this chip to) shows the same performance. An 8300 is a downclocked 8370 that sells at the same price as the i3 (or much cheaper, depending on where you're looking and whether you find it on sale).

            • f0d
            • 5 years ago

            "The only crowd who cares about that? FPS-Junkies that are still convinced that 120FPS frame-rate makes a noticeable difference. A vocal minority. I'm willing to wager that majority of PC gamers wouldn't be able to tell a difference between platforms in a double blind test. The story doesn't change with current generation of silicon."

            https://techreport.com/news/25051/blind-test-suggests-gamers-overwhelmingly-prefer-120hz-refresh-rates

            people CAN notice the difference

            • Krogoth
            • 5 years ago

            Blind test != double-blind test.

            A blind test can be distorted by bias on the testers'/experimenters' part….

            • f0d
            • 5 years ago

            whatever
            i can certainly tell the difference between 60>120 on my own monitor and obviously many others can too

            it seems like YOU are the only one denying that it makes a difference

            • Bensam123
            • 5 years ago

            It entirely depends on the game you're playing, Auxy. It's definitely not all about single-threaded performance. And it's entirely possible for a heavily multithreaded game to become 'bogged down' running on a one- or two-core processor.

            The article is definitely old; it doesn't even have an 83XX-series processor in it, and based on the Battlefield benchmark (which is a reasonably well-threaded game), an 8150 ties most of the i5s.

            (Insert some stupid condescending comments here I don’t have time to type in all the replies to you)

          • Bensam123
          • 5 years ago

          Keep in mind we’re talking about a budget build here. State of the art and budget don’t necessarily go hand in hand, especially when there are other options available.

            • Krogoth
            • 5 years ago

            Exactly.

            The Intel side offers newer platform specs and lower power consumption, but it commands a higher premium (DDR4 is still pricey).

            If you don't care for PCIe 3.0, NVMe support, or power consumption, AMD platforms are good for budget builds. They offer the same overall gaming experience as a Skylake chip. You may experience a "few" more hiccups here and there, but you can't really argue, since it is a budget-minded system.

            • Bensam123
            • 5 years ago

            Aye, but I think most people are departing from that and striving for the best solution. Sure, an i3 may beat out an 8300 in a couple of games, but it's about the overall system.

        • ronch
        • 5 years ago

        Presented with a real-world scenario though, most buyers don't buy PCs just to play games; they use them for other things as well. One does have to look at the entire picture. And in the real world, people do multitask and do a bunch of other things with their computers, plus the fact that more and more devs are learning to code for more cores. In this regard, at $115 as Ben says, I think the FX-8300 does present a real alternative. And hey, it's quite a bit cheaper too.

          • Ninjitsu
          • 5 years ago

          It’s not much cheaper anymore, though – look at the i3-6100…

          • auxy
          • 5 years ago

          Not really. You have to buy a graphics card. And the real-world performance in, say, booting up, or even in simple things like launching applications, is still slower. ( ;∀;)

          It’s a sad state and I lament where AMD has ended up but let’s not kid ourselves.

            • ronch
            • 5 years ago

            What about the video card? You’d need a discrete video card if you’re serious about gaming whether you get Intel or AMD.

            • auxy
            • 5 years ago

            "Presented with a real-world scenario though, most buyers don't buy PCs just to play games; they use it for other things as well."

            So are we buying a gaming machine or not? Σ( ゚Д゚;)

            If we're buying a gaming machine, then CPU performance is important and you should have an Intel chip. If we're not buying a gaming machine, then you don't need a GPU for the Intel box, while you would for an FX.

            • ronch
            • 5 years ago

            I said… people buy PCs not only for games. They may very well use them for other things. They'd still game on it, so they'd need to plug in a proper graphics card. I never said they either buy a PC for gaming or a PC that's strictly for office use with no gaming whatsoever.

            For gaming, sure, Intel rocks. But chances are the user will need it for other tasks in which having more cores could be better. Don't count the FX out just yet.

            Why so edgy today, Auxy?

            • Bensam123
            • 5 years ago

            I thought we were building a PC that does more than play games, as my original post was about a 'budget build PC', not just 'it only plays games', which I've stated about a dozen times in this thread, and all you do is go off on some sort of BS rant.

            You can build a budget machine that plays games AND does other things. It doesn't need to be one or the other. We're looking at the best overall chip. The world isn't so black and white.

            • auxy
            • 5 years ago

            You’re the one making it black and white.

            If you’re building a machine which doesn’t play games, an Intel processor is better.

            If you’re building a machine which sometimes plays games, an Intel processor is better.

            If you’re building a machine which mostly plays games, an Intel processor is better.

            It’s a better value for the money all the way around in every case.

            Is the AMD processor sufficient? Yeah, sure. It’s even a reasonably decent value for the money. That doesn’t make it a good choice, though. The comparable Intel processor performs better with less power for the same or very slightly more money and it has a ton of ancillary benefits too (like continued driver support, and all the benefits of a newer platform.)

            • Krogoth
            • 5 years ago

            Not really.

            AMD chips are better at multi-tasking, content creation, and VMs for their price point (~$100). Intel has nothing there other than the Pentium and i3, which are only good at casual computing and are somewhat better at gaming in games that are CPU-bound.

            The platform itself costs more if you want to get a low-end Skylake chip, since DDR4 still has the "new" premium on it and there is no budget-minded 1xx-series board yet. It is better with Haswell-era stuff, since you can still use DDR3 and there is a good array of budget-minded 9-series boards.

            • Bensam123
            • 5 years ago

            You aren't making a value comparison at all. You compare an i3 only in select games and then refer to some CPU graphs in those select games. Where are the system benchmarks? You seem to have neglected those… or any other sort of benchmarks, say encoding or Photoshop.

            It's sufficient in some games, on par in others, and much better than an i3 at everything else… That's the whole reason I recommend it. It's the package deal.

            A newer platform doesn't matter for a budget system. Almost no one is going to use an M.2 slot or benefit from PCIe 3.0 bandwidth on a budget build. OVERCLOCKING, on the other hand, I could definitely see people who don't have a lot of cash using… and the i3s can't do that, but we haven't even scratched the surface of that.

            Power usage matters for data centers and mobile.

            • Krogoth
            • 5 years ago

            There is no difference between the two platforms in mainstream applications for casual usage patterns. The Intel platform will load up a few demanding apps a few seconds quicker, but the AMD chip can handle loading multiple applications/VMs at the same time. Intel only comes back around if you were to get their workstation-tier stuff.

            For gaming, there is almost no difference between the two in situations where the CPU is clearly the bottleneck.

            They either breeze through or struggle under stress.

          • Laykun
          • 5 years ago

          With the way people "multi-task" in the real world, I'd wager that the i3 would still come out on top. Since the majority of people are multi-tasking Chrome tabs and maybe some music playback, there isn't going to be a need for a large multi-core array, and I doubt the thread context switching would make an i3 even begin to sweat. In this scenario, the amount of RAM a user has would affect the experience more than the number of dedicated cores or thread-scheduling hardware.

          • Kretschmer
          • 5 years ago

          It’s a real alternative for niche, niche workloads.

          • Bensam123
          • 5 years ago

          Yeah, we need an updated budget article that isn't just focused on chips that are 'budget' chips, but rather on chips that are in the price range.

        • anotherengineer
        • 5 years ago

        But as a renderer, though……………

        $115 + ECC RAM is pretty cheap.

      • chuckula
      • 5 years ago

      Yeah and given their obsolete platform, I’d take the Skylake part.

      P.S. --> And no, DX12 is not your miraculous saviour either. Behold DX12 AotS CPU scaling with the vaunted 390X, and note that an 8-core FX part [Edit: a freakin' 4.3GHz 8-core FX space-heater to boot] is *STILL SLOWER* than a practically throw-away Haswell i3:

      http://www.techpowerup.com/img/15-08-18/79b.jpg

      Please feel free to demonize AotS as not being a real benchmark; I'll copy & paste for the next time you pretend that only AMD can do DX12.

        • Bensam123
        • 5 years ago

        Yup… M.2 on a budget build, lol.

        Not sure what DX12 has to do with this. If anything, DX12 would make all lower-end processors look better… An i3-4330 is more expensive than the chip I pointed out by anywhere from $20 to $50 on a budget build.

          • chuckula
          • 5 years ago

          "Not sure what DX12 has to do with this."

          Oh rlly? Old Bensam123 didn't seem to have much of a problem spouting his own propaganda:

          "I think it'll be interesting definitely how the 8350 and the module variants perform in the next year or two. Mixing something like this with the direction consoles are going, which will also take games along for the ride, yielding significant results in mulithreading, may actually cause a paradigm shift. Imagine the 8350 being better then Haswell in two years? ...possibly"
          https://techreport.com/news/25545/amd-heterogeneous-queuing-aims-to-make-cpu-gpu-more-equal-partners?post=770098

          "Overall I'm pretty happy with mantle. Having encountered tons of CPU spikes while trying to stream BF4 with a 8350 pre-mantle, it has pretty much eliminated all of them. I originally had to turn down resolution to 540p/encoding setting/opencl in order to get semi-decent performance and even that I'd still get spikes with. One thing I noticed is Ambient Occlusion adds quite a bit of variation to frame times. Turning it off almost makes for a rock solid line, where as if you turn it on you get spiking. I'm guessing it's not fully optimized yet. Everything else you can leave on Ultra (tested on a R9-290) with little to no variance in frame times."
          https://techreport.com/discussion/25995/first-look-amd-mantle-cpu-performance-in-battlefield-4?post=798224

          Did New Bensam123 get a lobotomy? I believe that Robert Downey Jr. has a piece of advice for you: https://www.youtube.com/watch?v=1Y3FzVQi-R8

          If you can't take Mr. Downey's advice, I suggest an alternative solution that will make everyone else here very happy: try disconnecting your keyboard.

            • Ninjitsu
            • 5 years ago

            Do you keep a database of every post you found wrong? 😀

            • chuckula
            • 5 years ago

            No, I only have a few hundred GB of disk space available. 😉

            • Bensam123
            • 5 years ago

            Yeah, getting a kinda creepy and sadistic vibe here.

            He doesn't have anything better to do except troll the forums with keyword searches.

            • Bensam123
            • 5 years ago

            Once again, what does any of that have to do with the current argument about an 8300 vs. an i3? That's completely putting aside that you're talking about my streaming experiences. You know, quotes actually have to pertain to what we're talking about.

            • chuckula
            • 5 years ago

            1. You lied. Period. I've dealt with sociopathic little worms like you before who lie through their teeth, have narcissistic streaks a mile wide, and then scream about how oppressed they are the minute an honest person turns on the light and the little cockroaches flee into the corners.

            You aren't even special; you're a bad knock-off of the Lifetime movie of the week. I've been listening to you lie over and over and OVER again for years, and I'm tired of it. You literally flat-out lied in this freakin' thread by acting like the magical FX-8300 is growing on trees when you can't even find it at any major retailer!

            2. The i3 is a superior chip. Period. And that's a Haswell i3 with an older architecture. I don't give a rat's posterior that an unavailable FX-8300 is supposedly a few bucks cheaper; it's still an inferior product on an obsolete platform, and recommending it to anybody just shows how desperate you are to manipulate other people... which brings us back to point 1.

            • Bensam123
            • 5 years ago

            lol… So this has absolutely nothing to do with a budget build based around an 8300 or an i3?

            Do you even know what a sociopath is, Chuckula? I don't keep archives of your posts to randomly quote. Speaking of which, I'm surprised none of the moderators here find this a little bit disturbing.

            PS: TigerDirect.

      • Meadows
      • 5 years ago

      I’ve been known to support AMD whenever possible but even I find the idea of purchasing an FX in H2 2015 quite ridiculous.

      Either get the new platform or wait until next year.

        • NoOne ButMe
        • 5 years ago

        If you need maximum total CPU performance? It's currently pretty hard to beat at the prices the OP described.

        Personally, though, I think most budget gamers need high ST performance.

          • Ninjitsu
          • 5 years ago

          Most “gamers” need high ST performance in general!

            • Krogoth
            • 5 years ago

            You are grossly exaggerating the importance of single-threaded performance in mainstream applications.

            Most games are typically GPU-bound, especially if you want to game at 1440p or beyond.

            • Ninjitsu
            • 5 years ago

            I’m not, really. I have a Q8400, I know where the CPU matters.

            My own testing with Arma 3:
            https://dl.dropboxusercontent.com/u/45160510/arma3_obj_viewD_CPU.jpg
            https://dl.dropboxusercontent.com/u/45160510/arma3_obj_viewD_GPU.jpg

            Otherwise:
            http://www.techspot.com/review/712-arma-3-benchmarks/
            http://www.pcper.com/image/view/60353?return=node%2F63600

            AoS only appears to scale up to 4 ALUs, as I calculated here: https://techreport.com/forums/viewtopic.php?f=3&t=116175&start=60#p1272012

            So the only reasons I can see that the 6700K and 5960X are tied are the former's clock speeds and the latter's cache.

            • Krogoth
            • 5 years ago

            I've never stated that single-threaded performance was completely irrelevant. It just doesn't matter as much as before, except to FPS junkies that insist you must get 100FPS+ or bust.

            • Bensam123
            • 5 years ago

            Arma is another shit-engine game, just like Forgelight (H1Z1 and Planetside 2). It's been shown time and time again that throwing more hardware at it does next to nothing, depending on the build and the time of the week. Oh, and the phases of the moon.

            Try Battlefield 3/4/Hardline.

            • Ninjitsu
            • 5 years ago

            Frostbite is GPU-bound. And Battlefield doesn't do much compared to what's going on in Arma in terms of simulation, AI, or player count. BTW, BF multiplayer is supposed to be CPU-bound, but no one's ever tested it because of the variability.

            Battlefield is a terrible example, really.

            • Bensam123
            • 5 years ago

            Yuh… You basically made my point for me. If the games are GPU-bound… then there really is no benefit from single-threaded performance, is there? However… your OS itself has tons of threads running at any given time. Conversely, there is a lot more benefit from having really strong multi-threaded performance when you have to pick and choose.

            A niche-case game versus everything else you do in your daily life. It's a no-brainer. I'm actually sorta surprised I've gotten so much gruff for pointing out a really good deal when there is one.

        • Bensam123
        • 5 years ago

        So tell me, you're on a really tight budget… Does an M.2 disk or a PCIe 3.0 RAID card factor into this? Newer is always better?

          • chuckula
          • 5 years ago

          Oh look, it’s Bensam123 making strawman arguments* again.

          See, in Bensam-world, the fact that any offering from Intel features higher-end components that AMD can’t be bothered to put onto the market must mean that [b<]all[/b<] Intel products require high-end components. You’ve used this strawman over and over in the past to pretend that since some high-end Intel parts cost a bunch (although only a trifle more than the launch price of the FX-9590), ALL Intel parts must cost the same amount. Wrong. There are plenty of less expensive Intel parts out on the market, and Intel offers this thing called “choice” in platforms. You want proof? Try actually reading TR’s stories for a change:
          [url<]https://techreport.com/news/28960/h170-and-b150-chipsets-arrive-on-asus-mainstream-skylake-mobos[/url<]

          * See Bensam, there’s this logical fallacy that you are regularly guilty of called a “straw man,” where you invent a completely contrived target based on your irrational hatred of Intel and then proceed to attack that imaginary straw man instead of actually dealing with the fatal weaknesses in your own propaganda. Try going to Wikipedia and learning more before you post again:
          [url<]https://en.wikipedia.org/wiki/Straw_man[/url<]

            • Bensam123
            • 5 years ago

            lol… alright. So I first pointed out that a lot of your arguments do that, and it looks like you’ve been waiting a while to use it back on me. You know my argument actually has to be a strawman for it to count as one.

            In this case, I pointed out that a budget build isn’t going to use those features, which is entirely true, and which shows the corners that can be cut because they don’t matter.

            Then you make a response that my argument is a strawman… which, lo and behold, is actually a strawman in and of itself. You’re no longer actually talking about an i3 or an 8300; your entire argument is just here to make me look bad. Here, let’s use this as an example.

            [quote<]* See [b<]Chuckula[/b<], there’s this logical fallacy that you are regularly guilty of called a “straw man,” where you invent a completely contrived target based on your irrational hatred of [b<]Bensam123[/b<] and then proceed to attack that imaginary straw man instead of actually dealing with the fatal weaknesses in your own propaganda.[/quote<]

            See the changes? See how it directly reflects your post, while the opposite is not true? I hope this helps. ^^

      • DarkMikaru
      • 5 years ago

      Before our Tiger Direct closed here in Raleigh, NC, I picked one of those up for $109… it would have been $89 if I had bothered with the $20 rebate. I hate rebates! I was going to build an entry-level photo-editing PC for a friend, and at 100 bucks I couldn’t beat the multi-threaded performance for the money. Since this friend wanted a “cheap” machine for photo and video editing, this is probably by far the best bang-for-the-buck workstation you could buy. Especially if I hadn’t been lazy and had knocked it down to $89!

      Alas, this same friend ran out of funding, so here I sit with pretty much an entire 8300-based system waiting to be built! lol. Not mad; I’m sure I could sell it for at least 75% of what I spent. I guess my point is, as a productivity machine where lots of cores and threads matter, it can’t be beat at just over 100 bones. I’m honestly thinking of taking it to work as my new KML / Google Earth / Maps workhorse if I can’t sell it. It would beat the pants off the FM1 A8-3870 I’m running now.

      Can’t say I agree on gaming. But for productivity and everything else, it’s definitely a great little workstation build on a tight budget.

      • ronch
      • 5 years ago

      Guys, why the downthumbs? I made a similar post here but used the FX-8350 and I got upvotes. Ben here presented a much cheaper but overclockable SKU and got bashed for it.

      I know some folks here are more loved than others, but let’s try to be impartial.

        • Krogoth
        • 5 years ago

        It is the Intel defense force and FPS-junkies at work. >_<

      • Zizy
      • 5 years ago

      FX-63xx are very interesting indeed, but the 8xxx are a waste of money for home use, imo. Those two extra cores don’t offer much even when multitasking.
      As for builds, my progression is Celeron -> FX-6 -> i5 for systems with a dGPU, and Celeron -> i3 -> i5 without. Maybe an A8 for some specific scenarios, but the only person interested in something similar opted for a console instead.

        • Bensam123
        • 5 years ago

        That’s like saying a quad-core chip doesn’t do more than a dual-core (or a triple, if they still sold them). Everything in your OS is multithreaded. An 8300 obsoletes the 63XX series because it’s at the same price point… and you get two (one) more cores.

      • NoOne ButMe
      • 5 years ago

      SMT is nowhere near quad core. Compare an i3 with HT to a 4C i5 at the same clock. It would be a slaughter.

        • Krogoth
        • 5 years ago

        SMT on post-Haswell chips is almost as good as a real core in applications that are hilariously multi-threaded (not games or mainstream stuff).

        Otherwise, you actually suffer from a slight performance penalty in single-threaded applications due to thread-scheduling on the CPU end.
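
        To put rough numbers on this 2C/4T-versus-4C/4T debate, here is a minimal sketch that models Hyper-Threading’s second thread as a fractional extra core. The ~30% SMT yield and the equal per-core, per-clock throughput are illustrative assumptions, not measured figures:

[code<]
# Toy model: relative multi-threaded throughput, treating SMT's second
# thread as a fractional extra core. All figures are illustrative.
SMT_YIELD = 0.30  # assumed ~30% throughput gain from the second thread

def throughput(cores, smt=False):
    """Relative throughput for a fully parallel workload at equal clocks."""
    return cores * ((1.0 + SMT_YIELD) if smt else 1.0)

i3_like = throughput(cores=2, smt=True)   # 2C/4T -> 2.6
i5_like = throughput(cores=4)             # 4C/4T -> 4.0
print(f"2C/4T = {i3_like:.1f}, 4C/4T = {i5_like:.1f}, "
      f"ratio = {i3_like / i5_like:.0%}")  # ~65% at equal clocks
[/code<]

        Under these assumptions, a 2C/4T part lands at roughly two-thirds of a 4C/4T part in embarrassingly parallel work, which is consistent with both comments above.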

      • brothergc
      • 5 years ago

      “You can argue that it’s antiquated”
      How about antique!
      Let’s see: uses more power? Check.
      Slower single-threaded performance? Check.
      No native USB 3.0? Check.
      PCI Express 2.0 vs. Intel’s 3.0? Check.
      Poor SATA performance? Check.
      Yeah, right! Just what every new build should be. LMAO

      • Chrispy_
      • 5 years ago

      IPC is king for gaming, and the 8300 is barely competitive with Nehalem from 2009.

      There’s nothing shabby about having 8 threads of Nehalem performance, especially in this age of “good enough” computing, but gaming is one of those tasks that is held back by low IPC. The gaming benchmarks of the last four years have shown that you need to clock Piledriver and Bulldozer beyond 5GHz to become competitive with 3GHz Intel Sandy Bridge, regardless of how many cores the AMD chip has (see the rough arithmetic sketched below).

      A couple of 3.9GHz Skylake cores are going to run modern games so much faster than an 8300 that you’re barking up the wrong tree.
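
      As a back-of-the-envelope check on that claim, treat per-core gaming throughput as IPC × clock. A minimal sketch, assuming a ~1.7× IPC advantage for Sandy Bridge over Piledriver; that ratio is an illustrative guess, not a figure from the article:

[code<]
# Toy model: per-core throughput ~= IPC x clock.
# The 1.7x IPC ratio is an illustrative assumption, not a benchmark.
SANDY_IPC = 1.7       # relative IPC, Sandy Bridge (assumed)
PILEDRIVER_IPC = 1.0  # relative IPC, Piledriver (baseline)

sandy_clock_ghz = 3.0
# Clock Piledriver would need for equal per-core throughput:
needed_ghz = sandy_clock_ghz * SANDY_IPC / PILEDRIVER_IPC
print(f"Piledriver needs ~{needed_ghz:.1f} GHz")  # ~5.1 GHz
[/code<]

      Any IPC gap in that neighborhood puts the break-even clock right around the 5GHz mark cited above.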

        • Krogoth
        • 5 years ago

        Skylake is not really that much *faster* at gaming where the CPU is concerned. Actually, gaming is where you are not going to find Skylake’s strengths.

        It is in certain CPU-intensive tasks that it really pulls ahead, especially if the application takes advantage of the newer instruction sets introduced since Sandy Bridge. It is very efficient at SMT as well, on chips that have it.

        IPC may be king at gaming, but there hasn’t been that much improvement in that area since Nehalem. A far cry from the generations of chips that came prior to Nehalem.

        There’s a reason why performance computing made a paradigm shift towards parallelism once it became clear that you couldn’t crank up the “MEGAHURTZ” to the stratosphere to get more computing performance.

          • Ninjitsu
          • 5 years ago

          [url<]https://techreport.com/r.x/skylake/pcars-fps.gif[/url<]
          [url<]https://techreport.com/r.x/skylake/pcars-2.gif[/url<]
          [url<]https://techreport.com/r.x/skylake/pcars-1.gif[/url<]
          [url<]https://techreport.com/r.x/skylake/pcars-99th.gif[/url<]
          [url<]https://techreport.com/r.x/skylake/pcars-curve.gif[/url<]
          [url<]https://techreport.com/r.x/skylake/pcars-8ms.gif[/url<]

          Looks faster to me… Though I will admit, if you just care about 60 fps, then a top-tier FX is [url=https://techreport.com/r.x/skylake/pcars-16ms.gif<]good enough[/url<], but only when you consider the TR suite, which is arguably not very CPU-bound.

            • Krogoth
            • 5 years ago

            You are not looking at the big picture.

            The difference is trivial at best, and the majority of gamers would not be able to tell the difference in a double-blind test, especially when you are comparing anything post-Sandy Bridge on the Intel side.

            • Bensam123
            • 5 years ago

            You noticed that TR lowered the “beyond X” threshold to further home in on the results, right? 8.3 ms is perfectly delivered frames for a 120Hz display. Try looking at some of the older benchmarks on TR and then look at the beyond-X threshold.

            We’re talking about a budget build, not a perfect gaming PC. I do agree an i5 would be great, but it’s also $150 more.

            Compare an i3 to an 8300 (or an 8350, if you easily OC the 8300, which isn’t far-fetched for a budget build). There isn’t an i3 in the Skylake review.

            That aside, using your computer involves more than playing games, and the 8300 only gets beaten (and not badly) in a handful of games that are more dependent on single-threaded performance.

    • NoOne ButMe
    • 5 years ago

    Er, I would say most people are better off saving $30-40 towards a GPU/SSD and losing 0.2GHz of clock speed with the i3-6100…

      • VIIII
      • 5 years ago

      Depends on what these end up actually retailing for, but yeah, at those prices I completely agree.

        • JustAnEngineer
        • 5 years ago

        Those are the prices that Intel charges if you order 1000 processors at a time. Newegg will start out by gouging early adopters, but once supply is plentiful at their competitors, retail prices will settle to within $10 of Intel’s 1K pricing.

      • willmore
      • 5 years ago

      I was thinking the other way: wouldn’t they pick the i5-6500 instead? $44 more, and you get four full cores and 50% more cache.

        • NoOne ButMe
        • 5 years ago

        The clock speed is quite a bit lower: 0.6-1.2GHz slower, or 0.6-0.8GHz realistically, but that’s a 15-20% reduction. 0.2GHz is only 5% or so.
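
        To make those percentages concrete, here is a quick sketch. The 3.9GHz base clock is the i3-6320’s figure from Intel’s spec table; the deltas are the ones quoted above:

[code<]
# Percentage clock reduction relative to the i3-6320's 3.9 GHz.
BASE_GHZ = 3.9  # i3-6320 base clock, per Intel's spec table

for delta_ghz in (0.2, 0.6, 0.8, 1.2):
    print(f"-{delta_ghz} GHz -> {delta_ghz / BASE_GHZ:.0%} slower")
# -0.2 GHz -> 5% slower (i3-6100)
# -0.6 GHz -> 15% slower
# -0.8 GHz -> 21% slower
# -1.2 GHz -> 31% slower
[/code<]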

          • Ninjitsu
          • 5 years ago

          Yeah, but the extra cores do help enough to compensate. I’m somewhat torn on this, though; I remember the Pentium G3258 doing quite well at 4.5 GHz.

          I do feel that anyone willing to spend another $40 or so would have started by considering the i5s anyway, so I’m not sure that’s a good way to look at it.

        • BestJinjo
        • 5 years ago

        Exactly, or even the i5-6400, or try to find a deal on the i5-4690K; I’ve seen those on sale in bundles, too. Spending $40 on the i3-6320 over the i3-6100 for little to no tangible gaming benefit, while ignoring that for a bit more money over the i3-6320 one could get an i5, is shockingly bad advice from a tech reviewer. Then again, I’m not surprised, since the same person recommended the GTX 960 2GB for budget gaming, fully ignoring the 2GB VRAM bottleneck in modern games that will affect a great many gamers given the strategic $150-200 pricing of those cards, and yet did a full article on whether 4GB of VRAM is a limitation for $650+ GPUs that less than 5% of the GPU market is interested in purchasing.

      • BestJinjo
      • 5 years ago

      This post should be +1000. It is easily the most informed and concise post of the thread. There is absolutely no point in spending $40 extra on the i3-6320 over the i3-6100 for budget gaming, since that money is better spent on a faster GPU, such as moving up from a GTX 950/960 2GB to a GTX 960 4GB/R9 280X, or even better, a $230 R9 290. Another way to look at it: someone on a tight $120-130 GPU budget could move up from a GTX 750 Ti/R9 270X to a GTX 960 2GB/R9 285. In all of these scenarios, the money is better spent on the GPU for the budget gamer.

      Another benefit would be to use the $40 towards a larger SSD, such as moving from 256GB to 500GB. That $40 could go towards a better-quality or larger monitor, too. Finally, how can anyone recommend the i3-6320 at $157 when the i3-6100 is $117 for just a 200MHz-lower clock, while ignoring that for $187 a gamer can step up from the i3-6320 to the i5-6400? It’s shocking to see an editor of a professional review site who does this for a living make such an illogical recommendation as “The Skylake Core i3-6320 is the gamer’s new best friend,” especially on the cusp of DX12, which should allow games to benefit from multiple cores. Since Intel’s Skylake (and Haswell) CPUs have such excellent IPC, it’s actually a no-brainer for someone looking at the i3-6320 to step up to the i5-6400/6500, or to find a used Haswell i5/i7, because the i5s will suffice for 4-5 years while the i3 is outdated for many games right away.

      Pretty horrible advice overall from TR on this one; no budget gamer should follow it.

    • JustAnEngineer
    • 5 years ago

    It’s about time.

    These LGA1151 CPUs haven’t shown up at Micro Center yet, though.
    [url<]http://www.microcenter.com/site/brands/intel-processor-bundles.aspx[/url<]

      • flybywiretl
      • 5 years ago

      When they do, I hope they come with those mobo/CPU discounts. Those low-power chips are mighty tempting.
