In the lab: a pair of historical systems to broaden our frame of reference

TR's test labs are awash in the latest and greatest CPUs and graphics cards, but our ability to provide a frame of reference for anything older than Sandy Bridge is a little fuzzy. My P965 Express motherboard fell to a botched firmware update years ago, and I had stretched that Core 2 Duo E6400-powered system out for the better part of seven years before I hopped aboard the Haswell train. Plenty has happened in the intervening time, of course, and holdout enthusiasts with chips from the Nehalem era might want to see how their hardware is holding up.

Problem is, chips from that era are cheap, but contemporary motherboards still demand a pretty penny—often hundreds of dollars. It's hard to justify spending real money on those historical platforms out of curiosity alone. TR BBQ host and all-around excellent person drfish and I have been chatting for a while about getting a couple older systems out of his parts pile and into mine, and this week, we finally went through with it.

First up, we have a Core i5-750 sitting pretty on MSI's P55-GD65 motherboard. Lynnfield chips like this one brought the first-gen Core family to mainstream Intel desktop platforms, and the i5-750 quickly established itself as a value favorite of ours. That P55-GD65 board picked up a TR Recommended award way back when. In any case, this CPU fills an architectural hole in our library of testing systems, and I'm glad to know it's riding in on a high-quality motherboard.

Going even further back in time, we have an Intel Core 2 Quad Q6600 aboard an Asus P5N-E SLI. The Q6600 needs no introduction—as one of the world's first attainable single-socket quad-core CPUs, it was a legendary performer and another value favorite for enthusiasts. The P5N-E was a fine way to get those four cores up and running for not a lot of cash. We don't expect much from this 11-year-old chip, but it should be a useful historical reference for anybody still relying on CPUs of that era for day-to-day use.

Finally, we have a 16-GB kit of G.Skill's Trident X DDR3-2400 memory. As Sandy Bridge, Ivy Bridge, Haswell, and Broadwell chips get up there in years, enthusiasts have stretched their useful lives out a bit by turning to higher-clocked DDR3 than the run-of-the-mill 1600 MT/s stuff we usually recommended in contemporary System Guides. This fast, low-latency memory could help us put our older systems on a more even footing with our DDR4-3200-equipped Intel and AMD test rigs.

That's it for this quick walk down memory lane. Our thanks to drfish for digging these chips and motherboards out of his stash. Don't expect these processors to appear in every review, but for the curious, we might do some targeted testing soon to see just how these wizened systems are holding up.

Comments closed
    • cynan
    • 1 year ago

    No Cyrix 6×86? I mean, if you’re gonna do these historical comparisons half arsed, why even bother?

    • Forge
    • 1 year ago

    Just to chime in on the nostalgia, I personally don’t have anything running older than a Core i5-3570 or so, but my daughter computes to this very day on the Core 2 Duo E8400 rig I picked up at TR BBQ X. It’s just started crashing and acting weirdly, though, so the days are numbered.

    • smog
    • 1 year ago

    My I7 940 is still going strong. Clocks to 4 gig on air for about an hour before bluescreen which still makes me absurdly proud.

    If you think I should have got the 920, my buyers remorse is all used up on the gtx 770 I bought a month before 970 released.

    • Ikepuska
    • 1 year ago

    I’m still running a Mac Pro 3,1 cheese grater with a modified GTX 780 so that it will show the boot screen. I think it’s running the 2x E5472, but I don’t have access to it to check right at the moment. It’s finally getting long enough in the tooth that I’m having to think about what to do about it. Having said that, it still was able to run Final Fantasy XV, which I consider amazing, and I might just see about upgrading to the X5482 depending on price. The hardest part over the years has been the RAM; it’s got 24GB, but it’s a total mix of sizes and parts. I probably could increase RAM performance by replacing it with a more uniform set, but that’s still not cheap, last I checked.

    I honestly don’t know where it stands in the performance hierarchy, but for a secondary system it’s performed surprisingly well.

    • Mr Bill
    • 1 year ago

    Will you be benching them under Win10 or under the OS of that moment? Also, in your tweets you mention no issues with Spectre/Meltdown patches for these older systems. Would that still be true if running Linux?

    • Forge
    • 1 year ago

    For a second there I saw that SLI selector card and hoped you had a legendary A8N-SLI running. Would be fun to see that once-world-champion trying to keep up with today’s stuff.

    • derFunkenstein
    • 1 year ago

    Neat! I had a P5N-E SLI back in the day not because I ran SLI, but because I had some sort of weird loyalty to Nvidia chipsets after being an nForce 2 aficionado in the Athlon XP days. That system had a super-pimpin’ motherboard that was overcompensating for its Pentium E2130 overclocked to 3GHz. Can’t wait to see how the Q6600 gets roflstomped on modern benchmarks.

    • Klimax
    • 1 year ago

    Spoiler: Neither will fare well.

    • Wonders
    • 1 year ago

    Awesome! This is very exciting. For me personally, the stock-clocked 2600K will remain a mental reference point for a long time yet. So it would be cool to see that borne out quantitatively as well.

      • DancinJack
      • 1 year ago

      It’s time to move on. The 2600K, especially at stock, can’t keep up anymore.

      • DPete27
      • 1 year ago

      Your 2600K (stock and OC’d) is in the review below:
      [url<]https://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed/12[/url<]

    • G8torbyte
    • 1 year ago

    TR testers, is there a particular older motherboard series or CPU you would like to add to your inventory? I have Intel P/Z series chipsets and a few AMD ones going back 20+ years. All were working fine after I stored them. I’d be glad to donate to the cause.

    • Kretschmer
    • 1 year ago

    Would it also be possible to add a current i3 and Pentium to your comparisons? They appear to be a large blind spot in these reviews for users who are trying to optimize their CPU purchases.

      • Eversor
      • 1 year ago

      Second this.

    • tipoo
    • 1 year ago

    Ah, Nehalem. Not many launches since have approached the launch badassery.

    [url<]https://c2.staticflickr.com/2/1136/1403384017_e5ce9aa616_z.jpg?zz=1[/url<]

    • ronch
    • 1 year ago

    I think we should include the Intel 8086.

      • DeadOfKnight
      • 1 year ago

      From what I understand, it’s just an overpriced 8700k with a very minor overclock to “5GHz!”

    • fomamok
    • 1 year ago

    I still have my i7-920 as main PC.

    A few days ago I ran the CPU-Z benchmark, and it told me that the best processor in existence has just a 50% single-threaded advantage over my i7-920.

    I see no reason to upgrade.

      • K-L-Waster
      • 1 year ago

      So 1.5x the performance per thread + double the threads for a total of 3x performance increase isn’t enough for you? Sheesh, tough room…

      • psuedonymous
      • 1 year ago

      Is that by any chance comparing an OCed i7-920 to a stock CPU?

      • hiki
      • 1 year ago

      I had run the exact same benchmark, and actually, the 920 is 57% slower than the 8700K. Both at stock clocks.

      But that means that the 8700K is 132% faster than the 920.

      Still a disappointment for a decade of difference. The 920 was the best buy ever. There is nothing it can’t do.

      [url<]https://i.imgur.com/sU20qrm.png[/url<]
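      Both figures above check out: "57% slower" and "~132% faster" differ only in which chip serves as the baseline. A minimal sketch, using illustrative normalized scores (0.43 and 1.00 are stand-ins, not actual CPU-Z numbers):

```python
# Hypothetical normalized single-thread scores: if the i7-920 is "57%
# slower" than the i7-8700K, it scores 0.43 against the 8700K's 1.00.
old, new = 0.43, 1.00

percent_slower = (new - old) / new * 100  # baseline: the faster chip
percent_faster = (new - old) / old * 100  # baseline: the slower chip

print(round(percent_slower))  # 57
print(round(percent_faster))  # 133 (the ~132% quoted above)
```

      The same gap reads as a much bigger number when measured against the slower chip, which is why both statements are consistent.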

      • FuturePastNow
      • 1 year ago

      One of my friends was using an i7 920 until about a year ago, when he plopped a now-cheap Gulftown 6 core in there and probably bought four more years out of that PC.

      • fredsnotdead
      • 1 year ago

      I’m still running an i7-860 on that MSI P55-GD65.

      • moose17145
      • 1 year ago

      I too still have my i7-920 rig running (at stock speeds even). I relegated it to a Home theater PC role, so really all it has to do is stream Amazon Prime, netflix, etc. It works quite well in that role.

    • ermo
    • 1 year ago

    Jeff, if you can, would you mind including the newest quad-core Atom-derived chips such as the J5005? It can be found in e.g. the [url=https://www.intel.com/content/www/us/en/products/docs/boards-kits/nuc/nuc7cjyh-nuc7pjyh-nuc7cjys-brief.html<]Intel NUC7PJYH[/url<]. I have this feeling that it will compare favourably with the Q6600 at a tenth of its power draw.

    EDIT: Totally forgot to mention how cool it is to see these old platforms being included as part of the test battery where it makes sense!

      • psuedonymous
      • 1 year ago

      You may be interested in [url=https://www.techpowerup.com/reviews/Intel/Skull_Canyon_NUC/<]Skull Trail vs. Skull Canyon[/url<], an (admittedly rather beefy) NUC vs. a dual-quad-core rig with an extra GHz on the Q6600. Spoiler: the NUC comes out on top.

        • ermo
        • 1 year ago

        I’m specifically interested in the 10W Out-of-order Quad Core architecture employed by the Atom-derived family.

        AIUI, Skull Canyon is a full Skylake architecture chip aimed at high-end laptops.

        The interesting thing here is the juxtaposition of today’s presumably “low end” Pentium J5005 of Atom lineage vs. the king of the hill from 10 years ago.

        You’re certainly free to disagree, mind. =)

      • DPete27
      • 1 year ago

      +1 to this. I feel that nobody is covering the J5005. That’s a hole TR could fill and get some site traffic.
      Perhaps [url=https://www.asrock.com/mb/Intel/J4105B-ITX/index.asp<]this J4105 mini-ITX mobo[/url<] would allow for gaming testing!?!?

    • Klopsik206
    • 1 year ago

    Heh, my main PC is still the exact same C2Q Q6600 / Asus P5N rig!
    I’d love to see the level of improvement I can count on from a long-overdue upgrade.

      • Growler
      • 1 year ago

      I still have mine, but it’s in a box now. It was pretty sweet for quite a while, and I was rocking a Tahiti LE for a good portion of that.

      • RickyTick
      • 1 year ago

      I still have a Q6600 working perfectly well in the kid’s bonus room. They use it for basic web searches, Youtube, and playing some non-demanding games like Roblox and Minecraft. I used it to play Crysis way back when.
      Q6600, Abit IP35 Pro, 4 gb ram, GTX275

    • Anonymous Coward
    • 1 year ago

    This is excellent, I’m going to like seeing how that old quad stands up.

    The problem will perhaps be that it will be [i<]destroyed[/i<] to such an extent that we don't learn much, if you just bring out all the standard tests. It'll just sit there way on the edge of the graphs and look miserable.

    • Austin
    • 1 year ago

    😉 I have 2 running and in-use systems at each end of the spectrum in my home:

    1. 6C12T Westmere-EP based Intel Xeon X5650 (default 2.67-3.0GHz, o/c’d to 3.8GHz at stock voltage) with triple-channel DDR3-1600 (effectively similar to a “Xeon X5680/5690”, with memory bandwidth similar to dual-channel DDR4-2400)

    2. 6C12T Coffee-Lake i7 8700K (running default 3.7-4.7ghz).
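    The bandwidth parenthetical in item 1 holds up on paper. A quick sketch of peak theoretical bandwidth (channels × transfer rate × 8 bytes per 64-bit channel); real-world throughput will of course differ:

```python
# Peak theoretical memory bandwidth in GB/s: channels * MT/s * 8 bytes
# per transfer (each DDR channel is 64 bits wide). Illustrative only.
def peak_bw_gbs(channels, mt_per_s):
    return channels * mt_per_s * 8 / 1000

triple_ddr3_1600 = peak_bw_gbs(3, 1600)  # the X5650 rig above
dual_ddr4_2400 = peak_bw_gbs(2, 2400)    # a modern dual-channel rig

print(triple_ddr3_1600, dual_ddr4_2400)  # 38.4 GB/s apiece
```

    Three slower channels and two faster ones land on exactly the same 38.4 GB/s peak, which is why the comparison works.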

    • setaG_lliB
    • 1 year ago

    Oh god, the P5N-E SLI. The nvidia chipset ran super hot on those. In order to run my Q6600 at 3GHz, I actually had to provide additional cooling to the [i<]northbridge[/i<], not the CPU! The far superior i965 could take a Q6600 to 3GHz (and beyond) without even breaking a sweat.

    • DragonDaddyBear
    • 1 year ago

    Thank you, Dr. Fish. I really look forward to seeing these show up in an article. LTT just released a video about a $150 gaming PC with a similar chip, but I look forward to a more thorough investigation from TR of just how far hardware has come in 10 years.

    • ozzuneoj
    • 1 year ago

    I’m looking forward to seeing an article about this! Please PLEASE do some overclocking! Most people who have hung on to these older high end parts have been able to do so with the help of a good reliable overclock. Just find an easily attainable common speed for a chip and bench with that… no need to go for records.

    Also, does anyone else miss the practical and effective looking heatsinks used on these older boards? They look like they’d actually cool something if air passed over them, as opposed to the enclosed aluminum stealth fighter panels they glue on to modern boards.

      • BurntMyBacon
      • 1 year ago

      I think there is a group of people (myself included) who find the functional heatsinks aesthetically more pleasing than the heatsinks that sacrifice cooling capability for aesthetics.

    • Firestarter
    • 1 year ago

    ah Conroe

    together with Sandy Bridge, I feel like these two product launches were the last times you just had to have the new stuff because it made everything else obsolete

      • ozzuneoj
      • 1 year ago

      Agreed.

      • BurntMyBacon
      • 1 year ago

      AMD diehards that moved from Piledriver to Zen probably felt about the same way.

        • ermo
        • 1 year ago

        Hey! My two old Piledriver FX-8350 boxen w/16GB DDR3-1600 RAM and SSDs are doing just fine serving as Linux dev stations, thank-you-very-much!

        Believe it or not, parallel compilation and compression are just as quick if not slightly quicker than on the contemporary i7 3770K chips (and this was before the L1TF and Meltdown Spectres reared their ugly heads).

        I guess I have a thing for keeping old and obsolete hardware alive and kicking.

          • BurntMyBacon
          • 1 year ago

          You have not yet moved from Piledriver to Zen, so you don’t fall into this category … yet.

          Regardless of the proficiency of your Piledriver processors at your current task and how they compare to their Intel counterparts, it is likely you would notice if your dev stations suddenly sported high-end Zen processors in place of the Piledrivers.

            • ermo
            • 1 year ago

            I was initially planning to jump on the Zen+ bandwagon (to give board partners time to shake out the kinks and give AMD time to debug their new microarch).

            However, the high price of DDR4 plus the fact that Zen+ pretty much demands two sticks of low-latency RAM to work at its best means that I’m holding on to my (also obsolete) delidded i7-3770K@4.2 GHz w/4x8GB DDR3-2400 probably until the consumer version of Zen2 lands.

            I like the performance uplift of the i9-9900K, but I’m not overjoyed with the continuing exposure of new intel HT-related issues, so I’m not quite sold on that platform yet if I’m honest.

      • HERETIC
      • 1 year ago

      Spot on.
      2006 to 2010 was a real headrush for enthusiasts.
      Also, the end of an era. Perhaps a moment’s silence:
      Owners of Opterons, no longer thought of them as gods………………………

    • thedosbox
    • 1 year ago

    It’s fun to see how motherboard aesthetics have changed over the years. Compare these to the relatively sleek Gigabyte X299-WU8.

    • bthylafh
    • 1 year ago

    You’d better get a G5 Power Mac and compare how fast they all run Photoshop filters.

      • setaG_lliB
      • 1 year ago

      Hey, I remember playing UT2004 on a friend’s Power Mac G5 with a GeForce 6800.

      “WUH-HO! SO THIS IS HOW THE GAME IS SUPPOSED TO LOOK!” was my immediate reaction. At the time, I was used to seeing it on my crappy P4 1.8GHz and Radeon 9600 Pro.

      • Anonymous Coward
      • 1 year ago

      I still haven’t forgiven Apple for dropping PPC, they did it right at the point that major progress stalled anyway.

        • NTMBK
        • 1 year ago

        …I think the follow-up to this will prove you very wrong

          • Anonymous Coward
          • 1 year ago

          Yeah, you mean seeing a Q6600 cry at the feet of modern CPUs? Well, I won’t deny progress has been made, but not in the way progress was made previous to that point. A G4 was looking pretty sad next to the latest from x86-land, but there was [url<]https://en.wikipedia.org/wiki/PWRficient[/url<] which Apple actually bought, and let’s not overlook the way in which IBM has [i<]refused[/i<] to die. They can actually sell CPUs in head-to-head competition with Intel. Granted, it’s to big cloud players, but it’s not hiding in some big-iron niche. It’s rackmounts, where the competition is settled by watts and dollars.

          Edit: IBM Power9 [url<]https://en.wikipedia.org/wiki/POWER9[/url<]

            • Forge
            • 1 year ago

            I work with Power7-9 boxes at work all day (AIX, baby!), and while they are incredibly capable and powerful CPUs with some crazy features (selectable SMT widths from 1-4 on the fly? SURE!), I don’t think I’d want one in my office. I think Apple jumped off the cart just as it was turning up a path they didn’t want to follow (massive capability combined with insane thermal/power needs).

            • Anonymous Coward
            • 1 year ago

            I wonder to what extent IBM specialized on servers because the desktop vanished, or the desktop vanished because they specialized on servers. Either way, it seems like there are Power9 variants for sale right now that fall within the expected TDP range of something like Threadripper, with competitive core counts as well. The heatsinks look rather unsophisticated though. 😉

            If I was rich, I’d make sure to have some around just because.

        • Kretschmer
        • 1 year ago

        The G4 and G5 CPUs were being left in the dust by advances in clocks and design. Where would Apple be if they tried to pit IBM and Motorola’s leftovers against the Pentium Northwood or AMD Clawhammer?

          • Anonymous Coward
          • 1 year ago

          Meh, the G5 needed to get an on-die memory controller and maybe borrow the SMT-enabled core from IBM’s Power5, but that seems like less work than jumping ISA. Given how much they neglect their desktop computers even now, it seems that performance has never been important, and they could have always found [i<]something[/i<] IBM's designs were super good at.

            • Voldenuit
            • 1 year ago

            You pretty much spelled out the reasons there. Mac sales volumes were so low that neither IBM nor Motorola had any incentive to pour money into developing PowerPC for desktop/laptop, and it didn’t help that Apple kept trying to play IBM and Motorola off each other every generation, halving potential sales volume.

            PowerPC Reference Platform (PReP) could have saved both MacOS and PowerPC, but Apple saw how it cannibalized their sales and panicked into not licensing the BIOS for OSX to 3rd party OEMs.

            • Anonymous Coward
            • 1 year ago

            The lesson I draw from both Intel and AMD is that the server, desktop and laptop markets can be served by the same core and often by the same silicon. IBM was pretty interested in big chunks of silicon it has to be said, and efficiency wasn’t always their strong point, but apparently they’re sufficiently competitive these days to keep it going around on their own.

            Very interesting, I think, that Intel and AMD are shadowed by a mysterious, mostly unseen player in the CPU design game.

            • Kretschmer
            • 1 year ago

            IBM’s Power line is insanely expensive and it often has stayed competitive through use of exotic technologies (or exotic price points) that may not have equally panned out on the consumer side. Even if possible IBM was unlikely to “downscale” Power for Apple’s tiny sales base. And even if the G5 was ballpark competitive, there’s no guarantee that developers wouldn’t continue to jump ship to Windows.

            • Anonymous Coward
            • 1 year ago

            Yeah, it’s not super price-competitive, but $5k will buy you a workstation with a Power9 in it.

            Someone has even reviewed this stuff against Intel & AMD: [url<]https://www.phoronix.com/scan.php?page=article&item=power9-talos-2&num=1[/url<]

            Now I'm not so crazy as to sit here and claim anything about this being [i<]the best value[/i<], but I think it's [b<]really[/b<] interesting how competitive Power9 is against Xeon and Epyc. I guess at the end of the day, a decent team of designers who make no major mistakes and fab on similar processes arrive at similar places.

            I note that video & audio encoding is not optimized for Power9; it got beat badly in encoding. The question of optimization is huge for CPUs that don't even eat the same instructions. Fascinating stuff, I say.

            • Kretschmer
            • 1 year ago

            Intel (and thus Windows compatibility) was a big part in keeping OSX relevant; many users use Boot Camp for that one essential application or game. There’s no guarantee that a better G5 would be funded by Apple’s anemic sales volumes. The G4 was relegated to the back burner and languished at 500Mhz forever; Apple was desperate to swap to a vendor that existed to provide desktop consumer CPUs instead of one that might maybe get around to it someday.

        • derFunkenstein
        • 1 year ago

        IBM either couldn’t or wouldn’t design a G5 CPU to go into notebooks. PowerBooks and iBooks far outsold Power Macs and iMacs, and the portable computers were still running 5-year-old CPUs.

        Then again, up until last week, the Mac Mini was in the same spot. /rimshot

          • Anonymous Coward
          • 1 year ago

          Yeah, Apple has a pretty crappy track record of updating their Intel stuff, don’t they? So what did they really achieve? BTW, regarding the G4, there was that “PWRficient” processor, which looks like it was on track to be a fine laptop CPU, 64-bit and aimed at lower watts. It actually is a real product today, contracted to military applications. It was right there, and Apple cut it off at the knees.

          Sure Intel has generally been better than anyone else on desktops, but who really cares, apparently not Apple’s customers nor Apple itself.

          I will forever mourn the moment that desktop PCs became essentially single-platform.

          • FuturePastNow
          • 1 year ago

          Yeah, Apple attempted to build some prototype Powerbook G5s and it just wasn’t happening.

    • DeadOfKnight
    • 1 year ago

    How DARE you do things and write stuff not everyone is interested in.

    • Chrispy_
    • 1 year ago

    Awww yiss.

    These systems are still valid for most people who bought them. I’m looking forward to seeing how much or little they’ve aged.

      • Jeff Kampman
      • 1 year ago

      in the case of the Q6600 so far the answer is terribly

        • Voldenuit
        • 1 year ago

        Sounds like Sandy Bridge was the last big ‘tock’, and every Intel CPU since then has brought just tiny incremental process and IPC gains?

        Not surprised, although I *am* surprised AMD was never able to make up the difference with Intel sitting on its thumbs for 7 years running.

          • MOSFET
          • 1 year ago

          If you need an i7-920 (the OG “Core” overclocker) I’d be happy to loan/donate a power-hungry example. It hits 3.8 GHz with ease and stability, and can take 16GB DDR3-1600 (and even higher with OC), and sits in an Asus Rampage II Extreme mobo. It shows VERY little difference in quick benchmarks between it and Sandy Bridge 2600K (SB at stock).

          Come to think of it, if interested I have:
          i7-920 @ 3.8 – Bloomfield
          Xeon E3-1270 (3.4-3.8) – Sandy Bridge
          Xeon E3-1240v2 (3.4-3.8) – Ivy Bridge

          • DragonDaddyBear
          • 1 year ago

          Dr. Fish posted an article stating just how significant those gains have become over the years. When he wrote the article, it became very apparent that Sandy Bridge was a bridge too far for applications that rely heavily on a single thread.

          [url<]https://techreport.com/review/31410/a-bridge-too-far-migrating-from-sandy-to-kaby-lake[/url<]

          That said, there are big gains to be had going from dual to quad core, even if it's a Q6600. LTT's $150 PC video demonstrated that some games were bound by the dual core and that dropping a quad core into place made modern games just barely playable.

            • dodozoid
            • 1 year ago

            I guess Dr. Fish nicely demonstrated that there is more than IPC×clock to a modern CPU. It is how fast it responds to user input that makes it feel snappy. Yes, throughput (even single-threaded) is important, but so are latencies: how fast you can get from a standstill to that max throughput.

          • Eversor
          • 1 year ago

          Haswell was a pretty big bump in IPC, though I don’t recall the exact numbers – I think 10%+.
          Skylake+ has improved mostly on the back of higher frequencies and higher-frequency DDR4.

        • Anonymous Coward
        • 1 year ago

        I hope you are able to tease some value out of the tests, not just some hopeless beatdown. I’m thinking low resolution gaming, and with modest graphics cards.

        • notfred
        • 1 year ago

        Are you running them with everything up to date in terms of Meltdown / Spectre patches?

        I have a feeling that these older chips have been hit a lot harder as they are missing many of the instructions that help mitigate the performance impact of some of the workarounds.

        As someone with both a C2Q 6600 and a Core i7 920 in daily use (running Linux and not gaming) and considering an upgrade to a Ryzen 2600 based system, I’m very interested in your results and seeing what it would get me.

      • drfish
      • 1 year ago

      Long story short, I’ve occasionally filled large pressure vessels with cheap boxed wine to test its surfactant properties/the performance of various seals. One time I left a test running too long and the wine grew its own mother from local airborne bacteria. I have photos of the result, but they are too disgusting to share.

      I think that’s how Jeff feels about the Q6600 right now.

    • Takeshi7
    • 1 year ago

    I definitely recommend getting a cheap 8-thread Xeon chip for the P55 board. I built my friend a PC with a Xeon X3460 (equivalent to an i7-860) and that thing rocks.

    It would give a nice “i7” experience compared to the i5 750.

    • Voldenuit
    • 1 year ago

    Clicked on the link hoping to see TR show off an IBM PC 5150 with an 8087 co-processor, 1152K of RAM (512K onboard, 128K on an expansion card, and another 512K on a second expansion card), [s<]MCGA[/s<]* Hercules graphics card, and DR DOS 6.0 on a CompactFlash card. Am disappoint.

    * Turns out there were never any discrete MCGA cards, and MCGA controllers were only available on PS/2 systems, not PC XT (5160) or PC AT (5170) systems. Bummer.

      • Krogoth
      • 1 year ago

      VGA was superior to MCGA, so there was no reason for discrete video card manufacturers at the time to make solutions for it.

      MCGA was entirely IBM’s baby.

      • psuedonymous
      • 1 year ago

      Forget that PC rubbish, Amiga Forever!

        • KeillRandor
        • 1 year ago

        Don’t forget the Atari ST for the audio benchmarks…

      • ozzuneoj
      • 1 year ago

      Hey now, if you’re going to start throwing retro PC numbers out there, prepare for the weirdos to come out of the woodwork… The PC 5150 only supported 16-64K onboard on early revision motherboards and 64-256K on later ones. You could add 384K via an expansion card (mmm… nothing like having more than half your system RAM running over an 8-bit ISA bus). Conventional memory tops out at 640K (above that, the UMA is occupied by video and other memory up to the 1MB mark) and there is no support for XMS memory on a 5150. Any additional memory would be in the form of EMS expansion cards, and those are only useful for very limited situations, like Lotus or maybe as a RAM drive.

      With only five expansion slots to install floppy disk controllers, hard disk controllers, Parallel, serial, gameport, RAM, RTC (you want it to know the time, right?), and video, you definitely don’t want to be cramming unused memory expansion into a 5150.

      My 5150 is sitting right next to me and is running an IBM 5153 CGA monitor (supports most EGA 16 color modes as well), Orchid Tiny Turbo 286 upgrade card with a blazing fast AMD 7.16Mhz 286 (with a switch on the rear to get you back to the sane speeds of the original 4.77Mhz 8088), an Everex EV-659 Deluxe 256K EGA+Parallel card, IBM Floppy controller, WD MFM hard disk controller, AST SixPak Plus Card with 384K, RTC, Serial and gameport, 20MB Miniscribe 3.5″ MFM hard drive, Tandon TM100-2a 360K 5.25″ floppy, Sony 3.5″ floppy (cheated a bit here for compatibility), Keytronic “Model F clone” foam and foil keyboard and a parallel port CF card reader that works in DOS to facilitate moving larger amounts of data back and forth between the IBM and my main system.

      Old stuff for the win. (Do people still say that? Probably not…)

        • bthylafh
        • 1 year ago

        <points>

        Nerd!

        • faramir
        • 1 year ago

        “mmm… nothing like having more than half your system RAM running over an 8bit ISA bus”

        Where do you think those onboard chips were attached? Memory bus was hooked up directly to the CPU. In case of 5150 (with 8088 CPU) this meant 8 bit bus regardless of whether the memory was situated on the motherboard or hooked up via expansion slot; it was all the same bus … *ALL* your memory was hooked up via 8-bit bus on an 8088 system.

          • ozzuneoj
          • 1 year ago

          Right, it probably didn’t make any difference then, but in modern terms the thought of running your RAM from any expansion card, let alone an 8bit ISA card, seems almost comical. That’s all I was saying.

        • K-L-Waster
        • 1 year ago

        Wouldn’t worry about that 8-bit memory bus much — at the handful of megahertz that RAM ran at, it probably wasn’t saturating it anyway 😛

    • DPete27
    • 1 year ago

    For reference, the FX8350 review was the last one in my memory with a significant age range of CPUs tested. I still go back to that review from time to time to extrapolate.

    [url<]https://techreport.com/review/23750/amd-fx-8350-processor-reviewed/10[/url<]

      • DPete27
      • 1 year ago

      Oh, to extrapolate that conclusion graph to the [url=https://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed/16<]i7-8700K review[/url<], I think it's safe to say that the i7-8700K offers approximately double the framerates of the i5-750... in CPU-bound games. (Please show one test with Battlefield or a similarly GPU-bound game.)

    • DPete27
    • 1 year ago

    Great to see this. But please [b<]INCLUDE OVERCLOCKED RESULTS[/b<] (near-stock voltage) in any benchmarks since it's safe to assume that anyone still rocking a system this old that cares about benchmarks is pushing to get every last breath of usefulness out of it.

      • drfish
      • 1 year ago

      I don’t know what Jeff’s got planned, but I’ll say that both of these CPU/mobo combos spent their entire lives at stock. I have no idea how they’d behave overclocked.

      • YukaKun
      • 1 year ago

      I second this. Sandy CPUs would OC to 4.5Ghz with little to no effort. My i7 2700K was rock solid at 4.6Ghz for 6 years (!).

      Cheers!

        • Jeff Kampman
        • 1 year ago

        We did a look at the i7-2600K overclocked in the first Coffee Lake review: [url<]https://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed/12[/url<]

          • Srsly_Bro
          • 1 year ago

          Thank you for that, Jeff.

      • continuum
      • 1 year ago

      I wonder how well they’d STILL overclock at near-stock voltages.

      I know my Q6600 and i7-4770 have both felt electromigration’s hit. The Q6600 was mildly OC’ed to 3.2GHz and ended its life at 3.0GHz before I retired it when Ivy Bridge hit. I went through a few Haswell chips; this one wasn’t the best. It did 4.1GHz and degraded to 4.0GHz not too long after (to its credit, it still runs at 4.0GHz years later…).

      (foolishly sold the i5-4570K that did 4.1GHz more comfortably at near-stock voltages…)

        • Chz
        • 1 year ago

        My i5-750 ran at 4GHz single core, 3.6GHz all cores for six years with nary an incident. I did once have it crash while encoding video, but it turned out that the fan on the CPU cooler had been dead for months and I just hadn’t given it a 100% all cores load before then. I didn’t personally adjust the voltage, but I suspect the Gigabyte motherboard gave it a nudge or two. (FWIW, I no longer use Gigabyte kit based on some issues with that) It’s not Sandy Bridge speeds, but it was perfectly adequate for the time I owned it. If they’re going to have overclocked results, I’d suggest this as a good baseline.

          • zqw
          • 1 year ago

          And for the Q6600: mine was in use overclocked from 2.4GHz to 3.6GHz for about that long. (It could do 3.8-ish with extra voltage.) From what I remember, a 50% overclock on air was really easy and common once the G0 stepping came out.

            • Krogoth
            • 1 year ago

            B3s easily overclocked to 3.0-3.2GHz, provided you had solid cooling and were willing to pump a few volts into them.

      • SuperPanda
      • 1 year ago

      I think I’d rather see the modern processors underclocked. Not only is it easier to do, there’s less risk to the hardware. If you’re clock-limited you can always extrapolate an OC on the older stuff within a margin of error or two by contrasting how much performance the modern one loses by underclocking.

        • K-L-Waster
        • 1 year ago

        That would give you an idea of the performance per clock cycle, but it’s really only useful academically. In the real world, no one would underclock their CPU unless they were troubleshooting or fighting with thermals.

        OTOH, whether it is better to continue to overclock an old chip vs. buy a new chip is a question many users want to answer.

          • Krogoth
          • 1 year ago

          Some people do intentionally underclock (or limit turbo speed) their CPUs by a small amount for extra stability.

          The silent/near-silent PC crowd has been dabbling in the art of underclocking/undervolting, trying to find the sweet spot of performance combined with near or complete silence.

            • HERETIC
            • 1 year ago

            Most of us would choose a moderate OC, that sweet spot at the bottom of the bathtub curve.
            With both AMD and Intel seemingly pushing a notch past that point with modern CPUs,
            a small undervolt/underclock is probably now required to find that sweet spot.
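SuperPanda's underclock-and-extrapolate idea a few comments up can be sketched in a few lines. A toy example with entirely made-up numbers (the fps figures, the clocks, and the assumption that performance responds linearly to clock changes are all hypothetical):

```python
def clock_scaling(fps_stock, fps_under, clk_stock, clk_under):
    """Fraction of performance gained per fraction of clock gained."""
    perf_delta = fps_stock / fps_under - 1
    clk_delta = clk_stock / clk_under - 1
    return perf_delta / clk_delta

# Hypothetical: a modern CPU does 120 fps at 4.0 GHz and 100 fps
# underclocked to 3.2 GHz, so in this (made-up) workload the game
# gains 80% as much performance as clock.
s = clock_scaling(120, 100, 4.0, 3.2)

# Project an i5-750 overclock from 2.66 GHz (60 fps measured,
# hypothetically) to 3.6 GHz, assuming the same sensitivity holds.
fps_projected = 60 * (1 + s * (3.6 / 2.66 - 1))
print(f"clock sensitivity: {s:.2f}")
print(f"projected fps at 3.6 GHz: {fps_projected:.1f}")
```

The "margin of error or two" caveat is real: IPC, memory, and cache differences mean a sensitivity measured on one chip only loosely transfers to another.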

    • SecretMaster
    • 1 year ago

    I’m still rocking an i5-750 with a Gigabyte GA-P55-UD3R and a HD5850 as my main rig, so I’ll be very interested to see if/when older systems are included in tests.

      • Helmore
      • 1 year ago

      I’m also running the same CPU and GPU, though on an Asus mobo, so I’m also fairly interested to see how my system compares to current systems.

      • anotherengineer
      • 1 year ago

      I was using an AMD 955BE at stock speeds with 8GB of DDR3-1600 and an HD 6850 until about 10 months ago; now my nephew has it. The SSD helped lots. I got my money’s worth out of that system.

      • odizzido
      • 1 year ago

      i5-750 user here as well.

      • Grape Flavor
      • 1 year ago

      I’m curious, have you run into stability problems when playing newer games? Asking because AMD cut off driver support for TeraScale GPUs a few years ago.

      Obviously the performance is what it is, just wondering what effect not having GPU driver updates has had on crashes / glitches.

        • SecretMaster
        • 1 year ago

        I haven’t played many newer games (part of the reason for the longevity of my current system), so I can’t help you there. I’ve been using a monitor with a 1280 x 1024 resolution for ages as well, which also has stretched the usefulness of the graphics card.

    • chuckula
    • 1 year ago

    Only obsolete Intel systems?? Where’s the Phenom test rig??!?

    WHERE’S MUH QUADFATHER!!

    Wait, on second thought we’re Ok with being left out of this category.

    — AMD

      • Krogoth
      • 1 year ago

      “AMD, stop whining about it!” (In Arnold accent)

      — Cyrix

      PS: WE WILL BE BACK……

      • drfish
      • 1 year ago

      AMD fans will be glad to know that my wife’s old Phenom II X4 955 couldn’t be donated to the cause because it, and my old GTX 780, are still hard at work serving frames up to my sister-in-law as her primary gaming system.

      After that, it was all APUs for me, so I’ve got nothing left to give.

        • gerryg
        • 1 year ago

        Wow, wish I could upgrade to that level! I game with a Phenom II X3 and Radeon R7 260X. But I don’t play many modern games. I may get a Ryzen 2400G and build a new PC when the sales start kicking off soon.

          • maasenstodt
          • 1 year ago

          I’m currently finishing up a 2400G build – my first new PC in a LONG time – as I type this on my old Phenom II X6 1100T. Let me tell you something: it’s shocking how much quicker the 2400G feels. I highly recommend the upgrade!

        • BurntMyBacon
        • 1 year ago

        Similar story for my old Phenom II X4 955. It is sitting on an old Crosshair III motherboard, overclocked to 4.0GHz (no voltage bump required), and paired with an MSI GTX 780 Lightning. It is currently pulling gaming duty for my sister and does fairly well with her go-to game: Warframe.

        • Mr Bill
        • 1 year ago

        I still have mine along with the motherboard. I upgraded to the X6 1100T and then had to upgrade the motherboard because of power draw.

      • DPete27
      • 1 year ago

      Once you get either an AMD or Intel chip from that era in the benchmarks you can much more easily go back and extrapolate performance.
      It becomes a “one degree of separation” instead of 5+

      • Anonymous Coward
      • 1 year ago

        Yeah, I would have liked to see a Phenom X6 too. There’s been a cheap one for sale here; I almost went and bought it. I wonder if those six cores have aged better than a quad.

        • BurntMyBacon
        • 1 year ago

        The short answer is they don’t seem to have aged any better for games or basic usage. Encoding has some advantages, but the situation is pretty much the same as it was at launch.

        The long answer is that a Phenom II X4 955 can often (I went 5 for 5) be clocked to 4.0GHz before even giving it a voltage bump. If you don’t like overclocking, the Phenom II X4 980 is still 3.7GHz. The Phenom II X6 models maxed out at 3.3GHz (Turbo Core did not work well for me) and were terrible overclockers. Due to the lack of clock frequency, the single-threaded performance of my 1090T made it a worse gaming chip than the Phenom II X4 955 it was intended to replace. In my recent testing, this remains the case in Ashes of the Singularity, though to be fair, they both perform rather poorly there and other factors could be dominating. Still, given the evidence, it doesn’t seem like the Phenom II X6 processors have aged any better than their quad-core brethren for games. Encoding sees an appreciable advantage given 50% more cores at ~20% less frequency, but the advantage is no greater than it was at launch, so I’d say it aged about the same here as well.

          • Mr Bill
          • 1 year ago

          [quote<]The Phenom II X6 models maxed out at 3.3GHz [/quote<] yep

          • Anonymous Coward
          • 1 year ago

          That’s a complete response there. So in the end, the clock speeds were too low for the extra cores to compensate. Not an entirely dissimilar situation to Ryzen vs. i5/i7 today, though also not entirely similar.

      • MileageMayVary
      • 1 year ago

      I have my Phenom II X6 and motherboard mounted on my wall.

        • Anonymous Coward
        • 1 year ago

        I was debating buying a used 1090T here, pretty cheap of course. I have a C2D at 3.16GHz (stock) that has trouble scrolling web pages smoothly, even with a GF 1030 in it. So your wall art is actually faster than hardware that I use here (mostly for the kids to have a cheapo game box).

      • ermo
      • 1 year ago

      Actually, I have three Ph II systems (955BE X4, 940BE X4, 720BE X3 — all at stock speeds) that I’m setting up to use as an Exherbo* Linux build cluster utilising SUSE’s [url=https://github.com/icecc/icecream<]icecream distributed compiler daemon[/url<]. My custom kernel builds are hovering in the six-minute range with "make -j14" (each node will always have an extra job queued).

      The intent is to use the head node as a Wayland + Sway box w/ 3x old 1280x1024 19" TN monitors driven by a Radeon HD 6570 1GB, just for the fun** of it. I'm watching clang 6 compile as I write this.

      *: Exherbo is an awesome source-based Linux distribution with a super nice package manager called paludis that has Gentoo's portage beat by miles.

      **: Ok, maybe this is a very particular (peculiar?) definition of "fun".

    • Krogoth
    • 1 year ago

    Awesome sauce.

      • Dposcorp
      • 1 year ago

      That sounds close to you being … I am hesitant to use the other word, so let’s say you sound close to being dazzled.

        • Firestarter
        • 1 year ago

        krogothn’t

        • derFunkenstein
        • 1 year ago

        The awesome sauce he’s talking about is [url=https://www.target.com/p/sweet-baby-ray-s-174-barbecue-sauce-28oz/-/A-14779733<]Sweet Baby Ray's BBQ Sauce[/url<], which is a personal favorite of mine.

          • Gyromancer
          • 1 year ago

            Get your molasses away from me and bring on the KC spice.

            • derFunkenstein
            • 1 year ago

            Everybody thinks their bbq is the best, especially in KC lol

            • JustAnEngineer
            • 1 year ago

            That’s probably why they’re able to sell so many varieties:
            [url<]https://www.sweetbabyrays.com/Sauces[/url<]
