G.Skill readies up Fortis and Flare X DIMMs for Ryzen

Many gerbils who didn't skip to the conclusion in our recent Ryzen review probably picked up on the notion that AMD's octa-core processors are sensitive to memory bandwidth in some applications. Ryzen's core and thread counts look ready to take on Intel's high-end Broadwell-E desktop chips, but its dual-channel memory configuration's peak theoretical bandwidth is more in line with that of Intel's more pedestrian Kaby Lake Core processors. G.Skill probably saw this scenario coming, and it has readied its Fortis and Flare X lines of memory, specifically designed and tested for AMD's AM4 platform.
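For reference, peak theoretical bandwidth scales linearly with transfer rate and channel count, since each DDR4 channel moves 8 bytes per transfer. A minimal back-of-the-envelope sketch in Python, using illustrative speeds rather than any official platform maximums:

    # Peak theoretical DDR4 bandwidth: transfers/s x 8 bytes/transfer x channels.
    # The speeds below are illustrative examples, not official platform maximums.
    def peak_bandwidth_gbs(mts: int, channels: int) -> float:
        """Peak theoretical bandwidth in GB/s for 64-bit-wide DDR4 channels."""
        return mts * 1e6 * 8 * channels / 1e9

    # Dual-channel AM4 vs. quad-channel Broadwell-E:
    print(peak_bandwidth_gbs(2666, 2))  # ~42.7 GB/s
    print(peak_bandwidth_gbs(2400, 4))  # ~76.8 GB/s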

G.Skill markets its Fortis memory modules as a cost-effective solution for builders assembling a gaming-focused AM4 system. Fortis modules come in pairs with a total capacity of 8GB or 16GB, allowing for four-DIMM configurations of up to 32GB. The 8GB kits are available at 2400 MT/s, and the 16GB kits are offered in 2133, 2400, and 2666 MT/s configurations. The 8GB and 16GB 2400 MT/s kits are available with 16-16-16-39 or 17-17-17-39 timings.

The company's Flare X modules are designed for maximum performance. G.Skill is currently offering 8GB and 16GB kits of paired modules, but the manufacturer says 32GB kits are on the way, allowing for Ryzen systems with up to 64GB of memory. The currently-shipping speed configurations match the Fortis series, but G.Skill says 3200 MT/s and 3466 MT/s models are coming. The top-speed modules will only be available in a maximum capacity of 8GB per DIMM, though. The 2133 MT/s kit runs 15-15-15-35 timings, the 2400 MT/s sets run 16-16-16-39 or 17-17-17-39, and the 2666 MT/s kit is set to 18-18-18-43.
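Transfer rate is only half the latency story, though: first-word latency in nanoseconds works out to CL × 2000 / (transfer rate in MT/s), since the DDR4 clock runs at half the MT/s figure. A quick sketch showing why a faster kit with proportionally looser timings is roughly a wash (the DDR4-3200 CL14 entry matches a kit quoted in the comments below):

    # First-word latency: CAS cycles x clock period.
    # The DDR4 clock (MHz) is half the MT/s figure, so period (ns) = 2000 / MT/s.
    def first_word_latency_ns(mts: int, cas: int) -> float:
        return cas * 2000 / mts

    for mts, cas in [(2133, 15), (2400, 16), (2666, 18), (3200, 14)]:
        print(f"DDR4-{mts} CL{cas}: {first_word_latency_ns(mts, cas):.2f} ns")
    # 14.07 ns, 13.33 ns, 13.50 ns, and 8.75 ns, respectively.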

We were unable to locate product pages for the Fortis and Flare X memory modules on G.Skill's website, but the modules are already selling at Newegg. The 8GB Fortis kits sell for $67, and the 16GB Flare X sets go for $130.

Comments closed
    • DrDominodog51
    • 3 years ago

    https://youtu.be/frw9HRwqODk

    G.Skill is overclocking the base clock to get 3466 MT/s. This wouldn't be an issue if all Ryzen motherboards had an external clock generator, but only high-end boards have one at present.
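    A minimal sketch of the arithmetic, assuming the memory multiplier tops out at 32x (DDR4-3200 from the stock 100 MHz reference clock); that ceiling is an assumption drawn from this comment, not a confirmed spec:

        # Hypothetical BCLK math: assumes a 32x memory-multiplier ceiling
        # (DDR4-3200 at the stock 100 MHz reference clock), per the comment above.
        STOCK_BCLK_MHZ = 100.0
        MAX_MEM_RATIO = 32  # assumed firmware ceiling, not a confirmed spec

        target_mts = 3466
        required_bclk_mhz = target_mts / MAX_MEM_RATIO  # ~108.3 MHz
        print(f"BCLK needed for DDR4-{target_mts}: {required_bclk_mhz:.1f} MHz")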

    • Chrispy_
    • 3 years ago

    [b<]"AMD COMPATIBLE DDR4"[/b<] ...because other JEDEC-spec RAM without the sticker will fail, obviously. /sarcastic :rolleyes: whilst golfclapping and looking down my nose at G.Skill.

      • BurntMyBacon
      • 3 years ago

      +1

    • JosiahBradley
    • 3 years ago

    "AMD-Tuned DDR4-3200MHz CL14 64GB (4x16GB)" Now that is what I'm talking about.

    • RickyTick
    • 3 years ago

    You have to go to the NEWS tab on their website to find info.
    https://www.gskill.com/en/press/view/g-skill-announces-flare-x-series-and-fortis-series-ddr4-memory-for-amd-ryzen

    • DPete27
    • 3 years ago

    I don’t get it. How exactly are these better for Ryzen? DDR4-2400 @ CL16 is pretty pedestrian. Just 'cause you slap an AMD sticker on a stick of RAM doesn’t mean it’s any better suited to the task than any of the products already on the market….

      • Concupiscence
      • 3 years ago

      Isn’t piggyback marketing the worst? I’m sure the RAM is fine, but c’mon.

      • Growler
      • 3 years ago

      Other DDR4-2400 DIMMs don’t have the “AMD Compatible” sticker. Sure, they *might* work, but can you really afford to take the chance? Also, regular RAM can't handle the buffalo stampede that is Ryzen.

        • Bobs_Your_Uncle
        • 3 years ago

        I’ve gotta ask for some help here …

        Despite being a pretty regular reader (almost every day) somehow I missed the joke that revolves around “buffalo” & it’s kind of driving me nuts. I’ve searched a few times (as free time would allow) but I haven’t been able to track it down.

        You might say “It’s got me completely Buffaloed!” … which is a usage I’m familiar with: confused or stymied.

        Can anyone clue me in???

        Edit: FWIW I also know that “Ya can’t roller skate in a buffalo herd!” … (so maybe I’m just too durned old to grok this hyar new-fangled contemporary buffalo tech humor!)

          • chuckula
          • 3 years ago

          Yeah.
          It’s kind of my fault.

          It’s a long story, but basically “Buffalo” is a code-word used for talking about RyZen pre-release (it was technically used for Skylake too but it got a lot bigger for RyZen).

          Now that RyZen is out we don’t need the Buffalo anymore (but it’s still fun!)

            • RickyTick
            • 3 years ago

            Yeah.
            It’s all my fault.

            FTFY

            • chuckula
            • 3 years ago

            YOU’RE WELCOME.

            • Bobs_Your_Uncle
            • 3 years ago

            Thanks for the explanation. I was thinking that I’d be reading references to “buffalo” well into the future & gnashing my teeth in frustration each time.

            It sounds like the brunt of the “buffalo stampede” is mostly behind us now, & I don’t need to sweat it any further. I can just relax & regard the “buffalo stampede” as passe, or essentially … BS!? 😉

            • Renko
            • 3 years ago

            I am actually glad you asked. I lurk in the shadows and didn’t want to admit that I had no idea where I missed it. In all honesty I thought it was a reference to the mythical “Great White Buffalo” because of all the Ryzen hype. Not that it didn’t live up to it, just seemed like it fit.

            • BurntMyBacon
            • 3 years ago

            "I am actually glad you asked. I lurk in the shadows and didn't want to admit that I had no idea where I missed it."

            Let me get this straight. You spent all that time lurking in the shadows, not knowing what the whole buffalo thing was about, and too embarrassed(?) to admit it. Along comes Bobs_Your_Uncle asking for the very explanation you desired, saving you the embarrassment and allowing you to keep your secret in perpetuity. Then your immediate response is to tell everyone. Perhaps next time you'd be better off just asking the question and saving yourself the worry. We (most of us) don't bite. :')
