Corsair’s desktop DDR4 memory is primed for Haswell-E

We knew it was coming, and here it is. Corsair has announced two families of high-performance DDR4 memory for desktops: the Vengeance LPX and Dominator Platinum. Both are aimed at Intel's upcoming Haswell-E processors and matching X99 motherboards, and Corsair is showing them off in a dizzying video backed by what sounds suspiciously like dubstep:

These DIMMs have speed ratings of 2666MHz, 2800MHz, and 3000MHz, and they all sip only 1.2V. (That's DDR4 for ya.) Initially, Corsair will offer 4GB and 8GB modules in addition to 8GB, 16GB, 32GB, and 64GB kits. "By 2015," though, the company says we can expect 16GB modules, which will pave the way for 128GB kits.

What else? The modules have XMP 2.0 profiles, are validated to work with motherboards from a variety of vendors (Asus, ASRock, EVGA, Gigabyte, and MSI), and feature "user-swappable colored 'light pipes' for customizable downwash lighting." Also, thanks to Corsair Link, you can monitor DIMM temperatures in real time.

Look for these bad boys in stores at "the end of August"—but don't try to stick them in a non-Haswell-E system. They'll only fit in 288-pin DDR4 DIMM slots.

Comments closed
    • canoli
    • 5 years ago

    yay! finally some consumer boards with 128GB capacity. Of course nobody needs 128GB of RAM…

    …except when you do…work on a high-res video/animation in AfterFX (or Nuke) … it’s crazy how fast RAM is gobbled up. AfterFX has to cache the entire frame into a contiguous block of RAM before it’ll generate a preview…32-bit projects easily use ~150 MB per frame…at 29.97 fps a 10-sec timeline will need over 40GB of RAM for the preview. Add to that whatever the OS is using… and generally if you’re working on an animation you have Photoshop open, probably Bridge as well.

    Of course you can use proxies and a ton of other tricks to get your work done with as little as 8GB of RAM. But more is always welcome.

    Thankfully the DDR4 isn’t too expensive right now…only a couple grand to populate all 8 slots with 16G DIMMs //sarcasm Guess it’ll be awhile before I build my X99 dream machine.
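    The back-of-the-envelope math above, sketched out (the per-frame size, frame rate, and clip length are the figures quoted in the comment, not measured values):

```python
# RAM needed to hold an uncompressed RAM preview, per the figures above.
MB_PER_FRAME = 150        # ~150 MB per frame for a 32-bit project
FPS = 29.97               # NTSC frame rate
SECONDS = 10              # timeline length

frames = FPS * SECONDS                    # ~300 frames
total_gb = frames * MB_PER_FRAME / 1024   # MB -> GB
print(round(total_gb, 1))  # ~43.9, i.e. "over 40GB" before the OS and other apps
```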

    • Flapdrol
    • 5 years ago

    Pure aluminum, what’s that?

    • merrymarian
    • 5 years ago

    16 gigs (4×4) of Corsair Vengeance LPX would set you back about $400 if it were available today. I hope the price drops to something reasonable within six months or so.

    • Chrispy_
    • 5 years ago

    Okay, since I’m probably going to have to budget for five Haswell-E setups with 128GB of RAM each, any clues on what 40 sticks of 16GB DDR4 are going to cost when they arrive? (Samsung looks like the only one to have announced such a module in time for the Haswell-E launch.)

    If it’s insane, I don’t see why I shouldn’t just go for dual Xeon E5-2xxx with insane quantities of DDR3. No matter how expensive high-clock 2P Xeons are, RAM could well be the biggest single budgetary concern, given what I’ve seen of DDR4 pricing so far.

      • JustAnEngineer
      • 5 years ago

      The ones that I linked earlier in these comments are available for $249 each. That’s only 31.7% more than equivalent DDR3 modules.

        • Chrispy_
        • 5 years ago

        Or, £150, which isn’t a complete USD-GBP conversion ripoff for a change \o/

    • Klimax
    • 5 years ago

    Good. CPU and RAM known. Now a good E-ATX or bigger mainboard and a good new case. And the upgrade from SB-E is complete…

      • JustAnEngineer
      • 5 years ago

      Micro-ATX has everything that you need in a 9.6″ × 9.6″ = 92.2 in² package.
      https://techreport.com/news/26925/asrock-x99m-killer-brings-haswell-e-to-microatx
      How could you justify a 12″ × 13″ = 156 in² E-ATX motherboard?

        • Krogoth
        • 5 years ago

        Perhaps he is getting a dual-socket board and/or intends to throw in a large number of PCIe expansion cards.

        EATX still has a place in the prosumer and enterprise markets.

        • Jason181
        • 5 years ago

        You ask as if he only has a 3 square foot apartment. Just because you can go smaller doesn’t mean everyone wants to.

    • MadManOriginal
    • 5 years ago

    mmm…memory kits larger than the SSD I used a few short years ago! 😮

      • deepblueq
      • 5 years ago

      Larger than the SSD I’m still using now, even.

    • Chrispy_
    • 5 years ago

    Someone please clarify for me if this is right:

    • 128Gb (i.e. 16GB) is the largest module size in the JEDEC spec
    • Haswell-E has a quad-channel memory controller
    • DDR4 allows only a single module per channel.

    In other words, 64GB of RAM is the upper limit for Haswell-E if you’re using 16GB modules. Where is a “128GB kit” coming from, unless one of my three points above is wrong?
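    The arithmetic behind that 64GB ceiling, as a quick sketch (the one-module-per-channel figure is exactly the assumption under debate here):

```python
# Maximum capacity if all three premises above hold.
CHANNELS = 4           # Haswell-E quad-channel memory controller
DIMMS_PER_CHANNEL = 1  # assumed DDR4 point-to-point limit
MODULE_GB = 16         # largest module discussed (128Gb)

print(CHANNELS * DIMMS_PER_CHANNEL * MODULE_GB)  # 64
```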

      • JustAnEngineer
      • 5 years ago

      “Haswell-E has a quad-channel memory controller … DDR4 allows only a single module per channel.”

      Are we certain that the latter is not *also* a limitation of Intel’s Haswell-E processors and Wellsburg X99 chipset, rather than the JEDEC standard?

      • EzioAs
      • 5 years ago

      If it has 8 DIMM slots, then you can populate it up to 128GiB, right?

        • Chrispy_
        • 5 years ago

        How can it have 8 slots if everyone (google “haswell-e one dimm per channel”) is saying that you can only have four dimms?

        Either all the speculation is wrong, or 16GB dimms will limit Haswell-E to 64GB of RAM.

          • EzioAs
          • 5 years ago

          This one has 8 slots.

          http://www.techpowerup.com/204174/asrock-shows-off-its-x99-ws-lga2011-motherboard.html

          • chuckula
          • 5 years ago

          The speculation about one DIMM per channel is not wrong… but it’s incomplete.
          Without any additional motherboard hardware present, the limit for DDR4 is one slot per channel because of DDR4’s point-to-point topology. However, if there is an on-motherboard switching chip, that chip can support more than one DIMM (e.g. 2 DIMMs, theoretically more) on a single memory channel.

            • Chrispy_
            • 5 years ago

            Thanks, that’s what I was looking for. Without that information, product shots like this one (http://images.anandtech.com/doci/8133/IMGP2860_678x452.JPG) seemed to be in violation of both the JEDEC spec and Intel’s released info on their memory controller.

            • Wirko
            • 5 years ago

            What about buffered DDR4? Here’s ExtremeTech speculation on that.

            http://www.extremetech.com/computing/158824-haswell-e-to-offer-ddr4-support-up-to-eight-cores-in-2014

      • Ninjitsu
      • 5 years ago

      Take a look at this: http://images.anandtech.com/doci/8364/P900.png

      Lenovo have confused their numbers there*, but they’ll offer 512GB and 1TB of memory in some of their workstations.

      *It says 1024GB with 64GB DIMMs, but 8 × 64 = 512, so I’m not sure what they’re trying to say.

      EDIT: Okay, they’re talking about Xeons, sorry.

    • UnfriendlyFire
    • 5 years ago

    I could see this being useful for AMD’s K12 APU that comes after Carrizo (hopefully by then the DDR4 is more mature).

    Stacked DRAM cache + DDR4 memory = “Guys, we need more GPU cores because bandwidth is no longer a limit.”

    EDIT: And perhaps that will convince Nvidia to stop selling Fermi rebadges.

    GCN 1.2 or 2.0 would be a tough competitor for a 2010 architecture.

    • UnfriendlyFire
    • 5 years ago

    RAM manufacturers: “We need more early adopters other than the server market.”

    Intel: “I can require DDR4 with the upcoming Haswell-E. If they’re willing to pay through the nose for a 6 or 8 core consumer CPU, then they can also afford more expensive RAM…”

    • Ninjitsu
    • 5 years ago

    Lenovo seems to be promising 1TB of DDR4 in some of its new [s]Haswell-E[/s] Xeon workstations, so I guess we’ll see 128GB sticks as well, eventually.

    EDIT: http://www.anandtech.com/show/8364/lenovo-announces-new-thinkstation-p-series-desktop-workstations

      • Laykun
      • 5 years ago

      Unless they’re running Linux, I highly doubt it: http://msdn.microsoft.com/en-nz/library/windows/desktop/aa366778%28v=vs.85%29.aspx

        • cygnus1
        • 5 years ago

        4TB RAM limit on 2012 Server Standard. I know people who have run the server OS on a workstation to get more memory available; it’s definitely not unheard of.

          • Laykun
          • 5 years ago

          You’re asking for trouble using a server edition OS on a workstation. I’m not sure what the situation is like these days but I bet there are going to be driver problems (although the server OS these days does use the same kernel as the desktop OS). I’m sure people DO it, but it’s not advisable.

            • cygnus1
            • 5 years ago

            Not sure what you’re talking about. Even in the NT4 and 2000 days there was a desktop version of the OS running the same kernel and drivers.

            The server and consumer desktop OS have shared a kernel and drivers for over a decade.

            2003 server / xp
            2003 R2 server / xp sp2/3
            2008 server / vista
            2008 R2 server / 7
            2012 server / 8
            2012 R2 server / 8.1

            The only problem you really run into is software written expecting desktop components that aren’t installed on the server, or poorly coded OS version checks.

        • Farting Bob
        • 5 years ago

        Haswell-E workstations should be using the correct OS for their needs, and if those needs are enormous amounts of RAM it makes sense to get server editions of windows (or Linux).

    • dodozoid
    • 5 years ago

    ehm, what about timings?

    • Anovoca
    • 5 years ago

    Haswell-E and 128GB of 3000MHz+ DDR4; now all I need is an open-loop water cooler to run off my drool.

    • geekl33tgamer
    • 5 years ago

    I’ve got DDR3 modules running a pretty aggressive overclock, and they don’t need active cooling. Any idea why Corsair wants to sell you a fan to cool much-lower-voltage DDR4?

      • Major-Failure
      • 5 years ago

      I’m not privy to their business tactics, but I suspect it’s because this tactic makes them money, since people buy into it.

      • HisDivineOrder
      • 5 years ago

      Because they’re Corsair. Kings of the huge, honkin’ memory heatsinks that do nothing.

        • Neutronbeam
        • 5 years ago

        They don’t HAVE to do anything… they just have to LOOK bad*ss, with extra glowy lighty things for uber-bad*ssery. My first rig had Corsair; then I moved on to something cheaper with smaller heatsinks for better clearance. I admire Corsair’s quality, but there are alternatives out there at least as good.

    • deruberhanyok
    • 5 years ago

    I think the biggest gain most consumers will see from DDR4 is improved IGP performance, thanks to the extra bandwidth it will (eventually) provide. At the speeds they’re listing, it’s already close to double what DDR3-1600 offers, and once you start getting 50GB/s or more from a dual-channel config, we might see the sub-$100 video card market change dramatically.
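    A rough sketch of the peak-bandwidth comparison being made here (theoretical peak: transfer rate × 8 bytes per transfer × channels; real-world throughput is lower):

```python
# Peak theoretical DRAM bandwidth in GB/s.
def peak_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # 64-bit (8-byte) bus per channel

print(peak_gbs(1600))  # DDR3-1600 dual-channel: 25.6
print(peak_gbs(3000))  # DDR4-3000 dual-channel: 48.0, close to double
print(peak_gbs(3200))  # 51.2, past the 50GB/s mark
```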

      • HisDivineOrder
      • 5 years ago

      And ironically, Haswell-E has no IGP.

        • Klimax
        • 5 years ago

        As if IGP is the only thing requiring large bandwidth…

      • Chrispy_
      • 5 years ago

      LOL, IGPs and DDR4 are diametric opposites in the price/purpose/common-sense spectrum.

      Assuming you could actually get a DDR4-enabled processor with Iris Pro graphics, they’d still be significantly slower than a $69.99 R7 250. Probably not a good reason to pay an extra $300 for the new platform tbh 😉

      AMD might benefit once DDR4 replaces DDR3 at the budget end, but I’m not expecting that to happen in the next few years.

        • UberGerbil
        • 5 years ago

        If we’re talking about “consumers” (or at least consumer platforms) and we’re talking about the point in the future when DDR4 has taken over from DDR3, then it’s probably true we’re talking about IGPs — but the lower power draw of DDR4 will probably be at least as important, since most of those consumer platforms will be running off a battery at least part of the time.

    • Chrispy_
    • 5 years ago

    <insert obligatory cynical comment about price premiums, small module capacities, and there being absolutely no consumer need for more memory bandwidth on the desktop>

      • JustAnEngineer
      • 5 years ago

      Crucial is already selling 16 GiB DIMMs.
      http://www.crucial.com/usa/en/memory-ddr4/ct4k16g4rfd4213

      • Anovoca
      • 5 years ago

      <heavily thumbed-down rebuttal>

        • superjawes
        • 5 years ago

        <angry, hyperbole-ridden counterpoint>

          • Anovoca
          • 5 years ago

          <NOT VERY WELL THOUGHT OUT RANT IN CAPS>

            • derFunkenstein
            • 5 years ago

            who are you, ssk?

            • superjawes
            • 5 years ago

            Actually SSK: “YEAH STOP USING MY TRADEMARKED MOVE”

        • willmore
        • 5 years ago

        <Confusingly thumbed up post from Dammage>

      • divide_by_zero
      • 5 years ago

      Well done on the meta-commentary, everyone. I think my brain broke a little bit reading it. Maybe y’all caused a glitch in the Matrix or something.

      • MadManOriginal
      • 5 years ago

      tldr version:

      <insert Krogoth>

    • chuckula
    • 5 years ago

    “…and feature ‘user-swappable colored light pipes for customizable downwash lighting.’”

    Why would you swap out the red light pipe when EVERYONE knows the red [s]racing stripe[/s] “downwash light” makes it go FASTAR!!

      • Thresher
      • 5 years ago

      No, that requires speed holes.

    • Peter.Parker
    • 5 years ago

    DDR4! As if I didn’t have a reason to buy Haswell-E already! I’m saving up for this upgrade. My 4-year-old Athlon II X2 is already shaking in its socket.

      • geekl33tgamer
      • 5 years ago

      My 6-year-old C2Q is drawing its pension. This upgrade is long overdue, but worth the wait…

        • TravelMug
        • 5 years ago

        Same here. Looking forward to replacing my Q9550, 8GB DDR2-800, and Radeon 6850 combo with something potent that will last another couple of years. With Windows 9, if the timing works out nicely.

      • mesyn191
      • 5 years ago

      Performance-wise, these won’t do much, if anything, for you vs. DDR3-1600. You’ll need DDR4-3200 or faster to see a significant difference.

    • humannn
    • 5 years ago

    I can’t help but wonder if all this new tech (Haswell-E & DDR4) will only end up giving you a ~10% performance increase over 1 year old technology.

      • geekl33tgamer
      • 5 years ago

      Ya. Don’t benchmarks already show the current quad-channel X79 having a decent amount of memory bandwidth in synthetics, but not much real-world gain? There probably aren’t many programs in the consumer space that can leverage all that bandwidth, if any.

      • JustAnEngineer
      • 5 years ago

      The real killer is to keep comparing the new stuff to the Core i7-2600K Sandy Bridge that I bought on January 8, 2011.

        • Ochadd
        • 5 years ago

        Agree 100%. My i7-2600K overclocked to 4.6GHz appears to meet or beat the best Intel has to offer for gaming and other reasonably threaded workloads. I’ll probably go Haswell-E just for the platform update and for fun.

          • christos_thski
          • 5 years ago

          I wish Intel would focus more on acceptable integrated graphics performance instead of nominal CPU speed gains. Even their newest integrated graphics remain multiple generations behind discrete GPUs (and are simply laughable compared to equivalent-cost CPU plus discrete GPU combinations, as Intel insists on bundling its best graphics units, such as they are, with obscenely expensive top-shelf CPUs).

          Iris makes sense only if it can be affordably bundled with low-end CPUs. Splurging 500 dollars on a CPU for nominal gains and a lousy integrated graphics subsystem is a ridiculous proposition.

          Wasn’t the original brouhaha around integrated GPUs that there were supposedly no physical limits to bar them from competitive performance? Particularly on the desktop, they serve no conceivable purpose if you can assemble a system twice as fast on a lower budget using discrete graphics. Even laptops are better off with discrete mobile GPUs, so long as you’re not going for ultraportables.

            • Laykun
            • 5 years ago

            We’re talking high-end desktops here. Integrated graphics becomes a barrier to high CPU performance, particularly since people in this segment are going to have a dedicated graphics card. In my opinion, integrated graphics on high-end CPUs is a waste of silicon and TDP. And I do agree: it really should be relegated to lower-end CPUs.

            I’d prefer Intel focused on bigger CPU speed gains rather than integrated graphics, at least on the high end. You’ll NEVER get a competitive IGP, simply because it will always be limited to sharing TDP and memory bus bandwidth with the CPU.

    • Krogoth
    • 5 years ago

    On board the hype train!

      • smilingcrow
      • 5 years ago

      3GHz @ 1.2V seems good to me.
      DDR4 will be more useful for portable devices than for consumer desktops, which is par for the course nowadays. DDR3 is so Tiger Woods.
