Core i3-530 processor listed in Canada

You might recall listings for upcoming Clarkdale processors showing up in Europe late last month. Well, the pre-launch listing bug has now bitten North America, too. Canadian e-tailer A-Power currently lists an unannounced Core i3-530 processor, complete with specifications and a box shot.

The processor has two cores clocked at 2.93GHz, 256KB of L2 cache per core, 4MB of L3 cache per chip, a 73W power envelope, and an LGA1156 package. If 73W sounds high to you for a 32-nm dual-core processor, keep in mind this CPU’s package also contains a 45-nm die with the memory controller and an integrated graphics processor.

The Canadian price of $152.19 CAD works out to around $144 USD, although judging by how much the same e-tailer charges for the Core i5-750, we can probably expect the Core i3-530 to cost more like $135 south of the border—a little more than last month’s European listings suggested.
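
(For the curious, here's a quick back-of-the-envelope check of that conversion, sketched in Python. The exchange rate is our own assumption of roughly 0.95 USD per CAD for late 2009, not a figure from the listing.)

    # Rough CAD-to-USD conversion for the listed price.
    # The exchange rate is an assumption (~0.95 USD per CAD, late 2009),
    # not a number quoted by the e-tailer.
    cad_price = 152.19
    usd_per_cad = 0.95
    print(f"{cad_price} CAD is about {cad_price * usd_per_cad:.2f} USD")
    # Prints: 152.19 CAD is about 144.58 USD, in line with the ~$144 estimate above.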

Based on those listings and other reports, it looks like the Clarkdale lineup will also include a faster Core i3-540 and three Core i5-600-series processors, which will have Hyper-Threading enabled. The i3-530 should be the cheapest member of the Clarkdale family, while the priciest should be the 3.46GHz Core i5-670 at around $300. (Thanks to PC World for the tip.)

Comments closed
    • NeelyCam
    • 10 years ago

    Crap, you’re right about the turbo.

    However, the game isn’t played in some enthusiast tech forums, but in the corporate profit report pressrooms, and there Intel is KILLING AMD. Anyone quoting that an equivalently performing AMD quad-core is being sold for $99 only proves my point.

    EDIT: Reply fail again.. this was for #18

      • OneArmedScissor
      • 10 years ago

      I’m still wondering about the die size of the 32nm part of Westmere, though. Considering the size of Nehalem chips, if it’s just cut down to 1/4, it wouldn’t be as small as Pentium dual-cores, and those also didn’t cost like $30+ billion to put into production. See where I’m going with this?

        • NeelyCam
        • 10 years ago

        Yeah, looking at the TweakTown picture, the 32nm part looks to be some 70mm^2 (eyeballin’), and that’s with the memory controller moved to the relatively large 45nm GPU chip. Then again, they can easily charge more for these than the pentium dual-cores, simply because of the higher performance.

        In the end, the profit margin on Core2Duos was great, and I think the profit margin on i3’s won’t be much different. I’m pretty sure AMD can’t make anything equally performing with similar margins until H2/2011.

        32nm process development cost won’t be prohibitive if it gets amortized over the millions and millions of SoCs they keep hyping.

          • OneArmedScissor
          • 10 years ago

          The Pentium dual-core version is just a 2.8 GHz part with 1MB of cache, HT, and turbo boost disabled, though, at the same price tag. Only the very expensive ones with all the features are really better, but they cost as much as quad-cores and won’t exactly be flying off the shelves.

          I think they’re taking a hit in a few places there, but they’ll probably just keep selling small Penryns, rather than intentionally producing large amounts of Westmere with less cache.

          What they’re really after is probably a new flagship midrange laptop CPU.

    • wira020
    • 10 years ago

    If everything is good and ready… why can’t they just release the damn thing… mobos have also shown up… it might not have reached some retailers or some countries yet, but why can’t they paper launch the CPUs while it’s on its way… the specifications are out anyway (from informal sources like the above, that is)… what’s with the hold-up?!…

      • NeelyCam
      • 10 years ago

      Gotta sell off the old stock of Core2Duos – Christmas sale madness is great for that. Same thing with old Atoms.

      Things would be different if AMD was competitive – Intel would be forced to release Pinetrail and the ChippenDales now to grab a good Xmas market share, but since old crap still sells well, why bother.

    • curls
    • 10 years ago

    Just got an i5-750… will I regret my purchase when the Clarkdales come out? Planning on pushing it to over 3.5GHz with some overclocking.

    • bdwilcox
    • 10 years ago

    Man, Intel’s naming convention is a mess. To see where this chip fits in the Intel roadmap, you basically have to become a detective to tell how it differs from sister i5 and i7 chips. AMD is suffering this to a degree as well, but not nearly as bad as Intel has mucked it up. Seems like they not only multiplied cores, but confusion as well.

    P.S. Anyone know a good, reliable spreadsheet/chart that lays out all the comparative specs of the most recent and proposed Intel chips?

      • NeelyCam
      • 10 years ago

      Reminds me of the ol’ Stars ratings… Why would ANYONE buy a one-star CPU?!?

      Intel marketing guys really need to get fired.

      • NeelyCam
      • 10 years ago

      This is pretty good:

      http://www.engadget.com/2009/11/29/intels-desktop-roadmap-leaked-with-faster-i5-and-i7-introduct/

      I’ve seen a better one somewhere (one that also shows the core counts) from HKEPC, but can’t find it now.

        • bdwilcox
        • 10 years ago

        Thanks. Appreciate it. The problem isn’t so much googling a chart. It’s making sure the chart provider is trustworthy. Engadget has been pretty trustworthy in the past.

    • geekl33tgamer
    • 10 years ago

    Hopefully I am not alone in saying this, but…

    I *really* can’t get excited about the whole idea of the CPU and GPU being saddled together in one happy, silicon-infused relationship.

    Call me a traditionalist, but I prefer each hardware part to be separate and do its job.

    From my point of view, I guess it means that if I want one of those CPUs, I have to pay more for it because Intel want to force GMA on you. Under the “old” way, I would intentionally choose a motherboard that was devoid of any on-board graphics because I would never use them, and therefore not have to pay for them either.

    It’s extra cash then for your AMD/nVidia card of choice. Are there also any compatibility issues with disabling that on-chip graphics cr@p if you have a discrete GPU instead?

    Hopefully this on-chip stuff does not find its way further up the Intel i5/7 line!

      • Spotpuff
      • 10 years ago

      What is the “job” of each “part”? With GPUs and CPUs converging, the line will blur as to what tasks each are supposed to accomplish. Would you not want the CPU to “help” with GPU tasks if it meant a 100% FPS increase in games? Solely to stick to the “traditional role” of each part?

      Further, who defines what the job of each part is? The memory controller’s supposed to be on the north bridge right? Sticking it on the CPU would be madness. And we should all have dedicated RAID 5/SATA/IDE controllers in PCI/PCI-E slots instead of merging all that onto the southbridge/MCH/whatever the hell they call it now.

        • MadManOriginal
        • 10 years ago

        I agree in general. This is just a stepping stone to the parts being on the same die. Unfortunately that means geeks in the know will say ‘Hey, get rid of that GPU die that’s on the same package but separate from my CPU!’

        • geekl33tgamer
        • 10 years ago

        While I fully support any boost to performance something like this could potentially bring, if on-chip graphics are to maybe one day replace our trusty PCI-E interface, then I need that Core i7 870 with an RV870 on board please 🙂

        On a side note, the placement of the memory controller on the CPU has shown a huge performance benefit, and is fully welcomed. Going back to my OP, there’s little to no performance advantage in what Intel are setting out to do. It’s fine as long as there’s no price premium over what the CPU would cost with the graphics left off, tho.

        I would have thought the main plus of doing this would be to suspend your meter-spinning X-Fire/SLI set-up from use if you’re on the web, for example? But, IIRC, Windows 7 won’t allow mixed display drivers, so that’s out, right?

        Finally (woah, getting typing cramp!), if all CPUs/GPUs of the future were on one die, upgrading your graphics card when it’s due to retire just got a whole lot more expensive if the CPU is still of a good enough speed.

        We may be years off, or it may never happen, but you never know with Intel.

        AMD planning anything like this?

          • DreadCthulhu
          • 10 years ago

          You are incorrect about Windows 7 not supporting mixed display drivers – it does.

          • zima
          • 10 years ago


      • OneArmedScissor
      • 10 years ago


        • MadManOriginal
        • 10 years ago

        YO DAWG WE PUT A COMPUTER IN YOUR COMPUTER SO YOU CAN GPU WHILE YOU CPU!

          • OneArmedScissor
          • 10 years ago

          lol I knew that was coming, and I chuckled, anyways.

          • 5150
          • 10 years ago

          Nice!

      • NeelyCam
      • 10 years ago


        • geekl33tgamer
        • 10 years ago

        +1 LOL

      • LaChupacabra
      • 10 years ago

      Progress is inevitable. CPUs today look completely different than they did 20 years ago. The floating point unit, L2 cache, memory controller, PCI-E lanes, and now the GPU: all of these at one point were completely separate from the CPU silicon. Imagine if you still had to buy L2 cache or a floating point unit separately. This is beneficial for everyone in the long run.

        • TO11MTM
        • 10 years ago

        If we’re lucky, with time this will be usable for more general-purpose computing tasks if you’ve got a discrete graphics card.

        Of course, we’ll be lucky to see something like that in the next 5 years…

      • SPOOFE
      • 10 years ago


      • ybf
      • 10 years ago

      i can’t see how you can tell whether it’s got separate cpu and gpu, or an integrated cpu/gpu

      except that it’s cheaper and smaller and weighs less

      which is what everyone but you thinks is good

      so guess how things will go from now on

      • Krogoth
      • 10 years ago

      MediaGX called, it wants its integrated GPU back.

        • geekl33tgamer
        • 10 years ago

        Woah, never thought I would hear of that CPU/GPU/soundcard contraption on a public forum, lol. It’s long since dead!

        Do you suppose, with it having everything from the cache (L1) to the GPU and sound on board, it could have run Crysis…

        Don’t mention its existence some 12-14 years ago to the Intel devs tho, or we will have an Intel Core i3 530 GMA 950 Turbo XF-I, erm, thing…

        Or not, didn’t Cyrix sell it to VIA, who sold it to AMD over a dodgy deal in a pub somewhere?

          • zima
          • 10 years ago

          It lives on in the OLPC XO-1.

    • sledgehammer
    • 10 years ago

    intel is a mess.

    why would someone pay 150 dollars for a dual core when they can get amd’s quad cores from 99 dollars?

    no more intel gma (graphics media acrappyzator).

    intel graphics are a real professional risk (beware):
    http://www.mombu.com/medicine/medicine/t-computer-screens-and-eye-strain-eye-glaucoma-retina-astigmatism-3780404.html

    what’s wrong with intel?

      • NeelyCam
      • 10 years ago

      Um… because the $150 dual-core is about 50% better than the $99 quad-core. AND IT COMES WITH A FREE IGP!!

      And a prettier logo.

    • DreadCthulhu
    • 10 years ago

    At that price this chip will be competing with the Phenom II x4 925. I would guess the Core i3 will be faster in single and dual threaded code, but slower on stuff with more threads.

      • NeelyCam
      • 10 years ago

      Don’t forget the turbo, hyperthreading, and the in-package IGP (-> cheaper mobos).

      These things will yield huge profits. AMD won’t make a profit. This game is o-vah!

        • Ryhadar
        • 10 years ago

        The Core i3 doesn’t have turbo mode, and from what’s been rumored, H55 and H57 motherboards are going to launch at between $100 and $130. The game ain’t over yet.

    • 5150
    • 10 years ago

    amd has faster processor

    intel has crappy human resources, cafeteria

    whats wrong with intel?

      • NeelyCam
      • 10 years ago

      Clock frequency doesn’t matter much; didn’t AMD teach you that back during the Age of Netburst?

      100% correct on the cafe, though.

      Sammy Hagar sucks.

        • 5150
        • 10 years ago

        Agreed, Hagar does indeed suck, but Ed’s live solo during that song kicks ass!

          • OneArmedScissor
          • 10 years ago

          On the subject of Hagar, I accidentally saw the beginning of Chickenfoot’s set when they played here a few months ago (I did not want to be there…), and that was quality entertainment in a way I never would have imagined.

          Few things in this world could be funnier than a 60 something year old man hopping around on stage, pumping his fists up in the air, and literally screaming at random breaks in the song. I was dying laughing and had to leave before I attracted any negative attention from the army of fanboys.

            • 5150
            • 10 years ago

            Guh, I’ve seen Sammy in concert a few times (solo and with VH) and it is nothing but crap. It’s like an infomercial for his tequila (which is good, but I won’t buy it on principle).

            • NeelyCam
            • 10 years ago

            When Van Halen had their reunion tour last (?) year with Diamond David on vocals, they didn’t play any Hagar era songs. Not a major surprise, but it’s still like pretending that the band didn’t exist after David Lee Roth…

            Eddie’s 15min solo was pure magic, though.

            • 5150
            • 10 years ago

            I got to go to that one in Portland, it was awesome! At least I didn’t have to hear Sammy drone through Panama and Jump again. *shudders*

            • NeelyCam
            • 10 years ago

            That’s the one I saw. Live in Portland?

            • 5150
            • 10 years ago

            That’s the one. Horrible roads to drive from Montana to Oregon in the middle of winter, but it was an awesome road trip.

            • 5150
            • 10 years ago

            His lyrics are awful too.

            • Buzzard44
            • 10 years ago

            I can’t overclock, 55 (GHz)!

    • khands
    • 10 years ago

    I believe they effectively took the GMA 4500 and stuck it in the same package.

    Edit: This was supposed to be a reply to #1 :/

      • MadManOriginal
      • 10 years ago

      Yup. Good enough for general desktop use and maybe some HD acceleration. It’s supposed to have a higher clock than NB-based IGPs at least, not that that will help for any gaming (IGPs in general are poor for gaming.)

      The less I’m into gaming, the more interesting this trend becomes to me. I’m looking forward to maybe a mITX Sandy Bridge. That will supposedly have a different, less rehashed IGP too.

        • OneArmedScissor
        • 10 years ago

        “It’s supposed to have a higher clock than NB-based IGPs at least, not that that will help for any gaming (IGPs in general are poor for gaming.)”

        Of course, now that Intel have dumped the FSB for integrated graphics, they will have more bandwidth and less latency, so increasing processing power may someday accomplish something…someday. That day is unfortunately not the release of this chip. It’s still pretty comparable to a Radeon 3200, which I don’t believe was even limited by DDR2 bandwidth…

        It will be interesting to see what AMD do with the 800 series chipsets, as they have not yet moved on from 55nm or standardized on DDR3. There is a very large amount of potential before they even move the GPU off the board.

          • Flying Fox
          • 10 years ago

          AM3 brings us DDR3 already, what are you talking about?

            • OneArmedScissor
            • 10 years ago

            Exactly what I said. It’s not standardized, and as such, they had no reason to bother making the IGPs more powerful with recent chipset updates. Lots of boards are still DDR2, even if they’re AM3. 760G doesn’t even support DDR3, period.

            • NeelyCam
            • 10 years ago

            What’s wrong with AMD?

        • NeelyCam
        • 10 years ago

        Amen, brotha!

        These things are HTPC gold nuggets. I’ve already got everything else; now I’m just waiting for the CPU+mobo combo.

    • Gerbil Jedidiah
    • 10 years ago

    What are the graphics supposed to be like on these? Typical Intel crappiness?

      • sweatshopking
      • 10 years ago

      Pretty much. A little better bandwidth, but that isn’t really their bottleneck…

        • derFunkenstein
        • 10 years ago

        On one- and two-threaded apps, I wonder how these will compare to Core 2 Duos?

          • MadManOriginal
          • 10 years ago

          I *think* Nehalem is about 10-15% faster per clock, likely from the integrated memory controller. Turbo Boost throws off any chance of a clock-for-clock comparison for overclockers in almost every mainstream review, though.

            • Flying Fox
            • 10 years ago

            Keep in mind that the Clarkdale memory controller will be on the IGP die, not integrated on the CPU die itself. So there will be some latency hits.

      • oMa
      • 10 years ago

      I think Intel will boost the GPU frequency pretty significantly. Maybe we will see 780G/8200-ish performance? These chips would be nice in an ultraportable. A whole system with an under-20W TDP should be possible.
