Sapphire posts Radeon X1900 specs

Sapphire appears to have leaked details on ATI’s upcoming Radeon X1900 series graphics cards. According to card specs available on the company’s web site, the Radeon X1900 XTX will have 48 pixel pipelines and be clocked at 650MHz core and 1.5GHz memory. A Radeon X1900 XT will have the same number of pixel pipes, but slightly lower 625MHz core and 1.45GHz memory clocks. Interestingly, the Radeon X1900 CrossFire Edition will be clocked at XT rather than XTX speeds.

48 pixel pipelines sounds pretty ominous, but since ATI decoupled stages of the graphics pipeline for its Radeon X1000 series, it’s unlikely those are pixel pipelines in the traditional sense. All three cards are listed with 512MB of GDDR3 memory on a 256-bit memory bus. Thanks to TR regular Ricardo for the tip.

Comments closed
    • WaltC
    • 14 years ago

    Couple of things:

    *Why does “48 pixel pipelines” sound “ominous”…? As a consumer, it doesn’t sound particularly “ominous” to me. Now, if I was nVidia, well….;)

    *Why do people confuse what Sapphire says on its web site with ATi’s PR department? Last I checked Sapphire was its own company. I mean, I could see the point if the info was written on the ATi site…but…since it isn’t…

    *I followed both your links to the Sapphire site and all I see now is nothing…;) Perhaps you should update your article to reflect that? I’m sure the info was there at some point, but since it no longer is, shouldn’t that be reflected in a main-page update? Much ado about nothing?

    • MorgZ
    • 14 years ago

    Anyone else reckon that the clock speed differences between the X1900 XTX and X1900 XT (25MHz core, 50MHz memory) aren’t really going to give more than a 1-3% performance difference? Is it even worth releasing both chips at all?!

    With the BFG OC range and the other out-of-the-box overclocks we see on nVidia cards, they often offer an extra 25MHz core and more than 50MHz of extra memory speed.
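
    Back-of-the-envelope, using the clocks from the Sapphire listing (a minimal sketch; these are the rumored numbers, nothing confirmed):

    # Percentage gap between the leaked XT and XTX clocks
    xt_core, xtx_core = 625, 650    # MHz, from the Sapphire listing
    xt_mem, xtx_mem = 1450, 1500    # MHz effective

    core_gain = (xtx_core - xt_core) / xt_core * 100
    mem_gain = (xtx_mem - xt_mem) / xt_mem * 100

    print(f"Core: +{core_gain:.1f}%, Memory: +{mem_gain:.1f}%")
    # Core: +4.0%, Memory: +3.4% -- factory-overclock territory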

      • Freon
      • 14 years ago

      I agree, at these speeds, 4% clock and 3% memory is going to give zero subjective difference in speed. Not worth even a $50 premium on a $500 card.

    • Philldoe
    • 14 years ago

    awww nVidia got OWNED

    Seems things are switching around: nVidia just launched vaporware and ATI is puch-ing out a 48-pipe card.

    I can’t wait for my X1900XTX 🙂

    EDIT: Yeah, I know it’s not a “true” 48-pipe card, but can’t a fanboy dream? lol

      • Madman
      • 14 years ago

      Not a 48-pipe card; a 16-pipe card with 48 shader units, probably universal vertex/fragment…

      • UberGerbil
      • 14 years ago

      ATI hasn’t “puch”d anything out. They haven’t even announced it, let alone shipped it. Get back to me when either side has actually shipped a next-gen product.

    • 5150
    • 14 years ago

    I hope I never have to buy another video card again. These things are just too damn confusing these days.

    • BoBzeBuilder
    • 14 years ago

    ;);)

    • albundy
    • 14 years ago

    I don’t see any improvement until I see the 0.0000001% increase in FPS. Where are the benchies?

    • Shining Arcanine
    • 14 years ago

    Geoff Gasior, the traditional pipeline consists of a pixel shader processor, a TMU (texture mapping unit), and a ROP (which does rasterization and antialiasing), and data flows through in that order. In ATI’s R520, ATI made the pixel shader processors independent of their respective pipelines with a crossbar, much like Nvidia made the ROPs independent in the NV40. From what I have been reading, the figure of 48 pixel pipelines that you have heard is a count not of pixel pipelines but of pixel shader processors. So from what I have been reading, ATI’s R580 will have:

    48 Pixel Shader Processors
    16 Pipelines (each with a TMU and ROP)

    ATI’s R520 has:

    16 Pixel Shader Processors
    16 Pipelines (each with a TMU and ROP)

    Nvidia’s G71 will have:

    32 Pipelines (each with a Pixel Shader Processor and TMU)
    16 ROPs

    ATI’s R580 does not have 48 pixel pipelines. Any mention of 48 pixel pipelines is either misinformation or intentional on the part of ATI’s marketing department, as ATI’s R580 has 48 pixel shader processors, which, of course, process the assembly language that the driver converts game developers’ pixel shader programs into. And of course, no one will believe me, but this is how things are.
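
    To put rough numbers on the comparison, here is a minimal sketch in Python (the unit counts are the rumored figures above, so treat them as assumptions):

    # Rumored unit counts from this thread -- none of this is confirmed.
    chips = {
        "R520 (X1800)": {"ps": 16, "tmu": 16, "rop": 16},
        "R580 (X1900)": {"ps": 48, "tmu": 16, "rop": 16},
        "G71 (rumored)": {"ps": 32, "tmu": 32, "rop": 16},
    }

    for name, u in chips.items():
        ratio = u["ps"] / u["tmu"]
        print(f"{name}: {u['ps']} PS / {u['tmu']} TMU / {u['rop']} ROP "
              f"-> {ratio:.0f}:1 shader-to-texture ratio")
    # R580's 3:1 ratio is the real change: triple the math per texture
    # fetch, not triple the "pipelines".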

      • Dissonance
      • 14 years ago

      Shining Arcanine, I’m merely quoting directly from Sapphire’s web site, which refers to “48 pixel pipelines.” Obviously, there are reasons why I can’t do anything more than convey what Sapphire made public.

      But by all means assume I don’t know what I’m talking about 😉

        • Sargent Duck
        • 14 years ago

        Sooooooo, if I’m reading this right, you already have the card, have tested it, and are just waiting for the NDA to lift? Of course I realize you can’t say anything, but wink twice if the answer is yes.

        • CampinCarl
        • 14 years ago

        I think they nuked their site; all I get is ” ol 19 cf Specifications” :-/

      • DaveBaumann
      • 14 years ago

      The definitions of what is a pixel pipeline and what isn’t are a little fuzzy: what’s the difference between a pipeline processing pixels with a texture unit and one that is processing pixels without a direct texture unit?

      One thing you should consider is the organisation of the shaders and how many pixels…

        • Shining Arcanine
        • 14 years ago

        A pipeline is a chain of individual units in a chip. That is why ATI’s pixel-shader-independent pipeline consisting of a TMU and a ROP is a pipeline, and Nvidia’s ROP-independent pipeline consisting of a pixel shader processor and a TMU is a pipeline. If they were to place a crossbar between the two remaining units in the pixel pipeline, the pixel pipeline would cease to exist, as there would no longer be a definite path for the data to travel.
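
        A toy model of that distinction, purely illustrative (none of the names here correspond to real hardware):

        # A pipeline is a hard-wired chain: every pixel walks the same path.
        def shade(px):   return px + "->shaded"
        def texture(px): return px + "->textured"
        def blend(px):   return px + "->blended"

        def fixed_pipeline(px):
            # One definite path: shader -> TMU -> ROP.
            return blend(texture(shade(px)))

        # With a crossbar, shaded pixels land in a shared pool and whichever
        # back-end unit is free next picks one up -- no single path remains.
        from collections import deque

        def crossbar_dispatch(pixels, backends=2):
            pool = deque(shade(px) for px in pixels)
            done, unit = [], 0
            while pool:
                done.append(blend(texture(pool.popleft())) + f" (unit {unit})")
                unit = (unit + 1) % backends   # round-robin over the units
            return done

        print(fixed_pipeline("px0"))
        print(*crossbar_dispatch(["px0", "px1", "px2"]), sep="\n")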

    • BoBzeBuilder
    • 14 years ago

    Hopefully this will lower prices on the current ATI cards.

    • spuppy
    • 14 years ago

    Nice, the complete opposite of a paper launch!

    • PerfectCr
    • 14 years ago

    I can’t wait until Gainward releases the GAINWARD Powerpack /X1900XTX2MAXX XP Golden Sample VIVOOOM DELUXXE H0LY SHIXX RETAIL Edition.
    (Crossfire dongle sold separately)

    • Ricardo Dawkins
    • 14 years ago

    When is the official lineup launch…? I know I will not get one, but still… nice competition between these two teams…

      • Ricardo Dawkins
      • 14 years ago

      the X1900 XT is in stock :rolleyes:

      • bthylafh
      • 14 years ago

      Damage, could you please fix that annoying long-URL bug? Even at 1280×1024 resolution, I have to scroll horizontally to see the entire page.

    • SpotTheCat
    • 14 years ago

    All of those Xs and 1s and Ts make everything look the same

    X1TXXT1TXXTX1

    • Shobai
    • 14 years ago

    But why does a card need a “pecs” sheet, anyway?

    larfs

      • PerfectCr
      • 14 years ago

      Did you see all those pipes!?! That’s why: those aren’t mere specs, those are PECS, baby!

      edit, damn he fixed it lol.

    • deathBOB
    • 14 years ago

    Ahhh!! The Xs!!! My eyes!

    Am I the only one who doesn’t care? It’s not like I will ever own one of these. ATI making a good midrange part would give me a reason to be excited, but lately they have been disappointing in that regard.

      • Austin
      • 14 years ago

      :o( Yup, ATi’s last really great mid-range card was the 9500Pro, which IMHO only came about out of ATI’s fear of the awesome-sounding but delayed GeForce FX. Okay, the 9600 series was decent and the X800XL has its merits, but ATi is really losing ground in the all-important low- and mid-range sectors (and their high-end cards aren’t exactly mind-blowing either). They had a superb chance to gain significant market share during and just after the 9500/9700 era but simply missed the boat. Again, all IMHO.

        • d0g_p00p
        • 14 years ago

        Ahh, the 9500Pro. That and the 9700Pro were the two best video cards released (IMHO) since the GeForce 3 and 6600GT.

          • deathBOB
          • 14 years ago

          I think my GeForce4 Ti 4200 was pretty kickass.

          • Beomagi
          • 14 years ago

          Have to agree, though I won’t say the 9700, since 9500 overclocks were at or around the 9700 level. Just think: a card that old is still playing games today at the lower-middle end of performance. Just awesome.

          Before that there was the Ti 4200, just an awesome card. Like the 9500, no pipes or hardware changes, just lower, easily overclockable clocks. The 9600 IMO wasn’t as enthusiast-friendly: 4 pipes and a high clock made its gains lower than the 9500’s.

          What REALLY made these cards shine was the fact that, compared to the top end, they didn’t skimp on memory interface width. Like the 9700, the 9500 was 128-bit.

          For the current budget cards out there, the awesome ones today would be the $105 X800, the $200 6800GS and GTOs, and, I have to admit, the 7800GT is quite tempting. You can get deals from time to time with games, and adding in the value of the packaged games, the card’s boxed value can be driven down to $230 or less.

          #36: good info 🙂
          Now to see if ATI’s idea on ratios pays off. If games are bottlenecking on shaders and the vertex units are twiddling thumbs, then it’s a good move.

          Race race race, you two!! I’ve got another year or year and a half on my 6800GT before I feel like upgrading, so the faster those companies compete, the better off I’ll be then…
          Competition rocks 🙂

            • Austin
            • 14 years ago

            Well, the 9700 and 9700Pro actually used 256-bit DDR; the 9800 and 9800Pro were mostly just higher-clocked 9700s. So the 9500Pro’s main limiting factor was its memory bandwidth being halved vs. the 9700. It was amazing how relatively little this mattered, as an o/c’ed 9500Pro could nip at the heels of the 9700 despite the large deficit in memory bandwidth. All 9500/9700s tended to hit around 360MHz core, making them very close to the raw core power of the 9800Pro. The 8-piped 9500Pro had one majorly powerful core for the time and represented excellent value; IMHO it wasn’t until GeForce 6 that nVidia really had a definitive answer.

            9500 = 4 pipes, 128-bit DDR, 275MHz core, 540MHz memory.
            9500P = 8 pipes, 128-bit DDR, 275MHz core, 540MHz memory.
            9700 = 8 pipes, 256-bit DDR, 275MHz core, 540MHz memory.
            9700P = 8 pipes, 256-bit DDR, 325MHz core, 620MHz memory.
            9800P = 8 pipes, 256-bit DDR, 380MHz core, 680MHz memory.
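
            Rough fill-rate and bandwidth math on those numbers (a sketch; memory clocks taken as DDR-effective):

            # Fill rate = pipes * core clock; bandwidth = bus bits / 8 * effective MHz.
            cards = [
                # name, pipes, core MHz, bus bits, effective mem MHz
                ("9500Pro", 8, 275, 128, 540),
                ("9700",    8, 275, 256, 540),
                ("9700Pro", 8, 325, 256, 620),
            ]

            for name, pipes, core, bus, mem in cards:
                fill = pipes * core / 1000        # Gpixels/s
                bw = bus / 8 * mem / 1000         # GB/s
                print(f"{name}: {fill:.1f} Gpix/s fill, {bw:.2f} GB/s bandwidth")
            # The 9500Pro matches the 9700's raw fill rate with exactly half
            # the bandwidth -- which is why an overclocked one could keep up.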

      • 5150
      • 14 years ago

      The goggles, they do nothing!

    • Krogoth
    • 14 years ago

    In other words, this “X1900XT” is just a 7800GTX SLI or X1800XT Crossfire in one card from a performance standpoint?

      • Shintai
      • 14 years ago

      If your game is shader-intensive, yes; if not, no. With really weak or no shaders, it would be like an X1800XT.

    • Sargent Duck
    • 14 years ago

    It’s the same thing as the Xbox 360 graphics chip, isn’t it? They’re not traditional pixel shaders but more of a hybrid, I believe. They can switch between pixel shading and vertex shading.

    And #1, don’t forget Nvidia with their “G”s: GTX, GT, GS, GTX 512.

    And #2, Shintai mentioned the GeForce 7300 was just paper-launched, but don’t forget about the 7800 GTX 512. That’s non-existent.

      • stix
      • 14 years ago

      No, the Xbox 360 has eDRAM embedded.

        • Chryx
        • 14 years ago

        No, it doesn’t; the eDRAM is a separate die next to the GPU.

      • Jive
      • 14 years ago

      I don’t think the “G”s with nVidia are nearly as bad. A lot of new tech products have an X in their name somewhere; at least nVidia kinda left the X’s out and did something similarly catchy and less annoying.

      Last generation, ATi went “X” crazy: in every single model name there was at least one or two X’s, on both their hard and paper launches.

    • PerfectCr
    • 14 years ago

    650MHz, here comes another Dustbuster.

      • trintron
      • 14 years ago

      or maybe a 3-slot cooler 😉

    • madlemming
    • 14 years ago

    Is that really the only advantage the XTX has… 25MHz core and 50MHz memory? I can see manufacturers selling “overclocked in the box” XTs at least that fast.

    That just makes the price premium people will pay for the extra X even more absurd.

      • Shintai
      • 14 years ago

      Remember the old Ultra cards? Same thing.

      It’s never been economically sensible to buy the extreme/ultra cards.

    • UberGerbil
    • 14 years ago

    To be fair to ATI, this isn’t a “launch,” paper or otherwise. It’s not even an announcement. The specs may be right, they may be wrong, but this isn’t coming from ATI. We can roll them in petrochemical distillates and poultry byproducts when and if they actually announce the product without any actual, you know, product. Until then, they should be pilloried for the cards they’re still not shipping, not for one they haven’t even announced.

    • Wajo
    • 14 years ago

    It’s not “48 pixel pipelines”… the pipeline has been separated. The X1900 is going to have 48 pixel shader processors, 16 texture mapping units, and 16 ROPs, so it’s not 16 “full” pipelines… ATI believes a 3-to-1 shader-to-texture proportion works well for pixels…

    Edit: meant this as a reply to #5.

      • 1c3d0g
      • 14 years ago

      Yeah, all this is a bit misleading when they just put “48 pipes!!!” without stating the all-important details. I guess it’s all about shock and awe nowadays… :-/

    • willyolio
    • 14 years ago

    If it were possible to build computers out of paper, I would have such a kick-ass system…

    • Madman
    • 14 years ago

    1900XXX, anyone? Or maybe a 1900pr0n edition?

      • CampinCarl
      • 14 years ago

      I wonder if it’ll come with a free pron tape?

    • Jive
    • 14 years ago

    XTX? XT? X1900?

    I’m getting sick of seeing those damn X’s everywhere. What is ATI’s marketing department thinking? “Oh, the consumer is attracted to anything with the letter X, so let’s just put twenty of them in the model name.”

      • 1c3d0g
      • 14 years ago

      I think that’s a minor problem to worry about (for now)… they’ve got MUCH bigger problems to deal with, like how to hard-launch a product and not release (yet another) vapourware product. And ATI had better make the X1900XTX a killer GPU, ’cause I have an eerie feeling that the G71 will be ~2x faster than a GeForce 7800 GTX 512… 😮

        • Shintai
        • 14 years ago

        nVidia’s 7300 was just paper-launched 😉

        • shank15217
        • 14 years ago

        The NV50 generation might be 2x faster; I doubt the G71 will be. Remember, the G70 series is an incremental improvement over the NV40 and is still considered the same generation. Nvidia’s true next-generation part will be the NV50 series.

          • Ardrid
          • 14 years ago

          I have a feeling the G71 is going to be damn close, at least if we’re talking about the standard GTX. According to Hexus, it’s posting 13K in 3DMark05, whereas the standard GTX gets about 7800-8000 and the GTX 512 can post 9500 or so. I’d say we’re probably looking at about a 35-50% increase over the GTX 512 in certain apps.
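
          Quick sanity check on those rumored scores (ballpark figures from the post above, nothing official):

          # Rumored 3DMark05 scores -- treat every number here as hearsay.
          g71_rumor = 13000
          baselines = {"7800 GTX": 7900, "7800 GTX 512": 9500}

          for name, score in baselines.items():
              print(f"vs {name}: +{(g71_rumor / score - 1) * 100:.0f}%")
          # vs 7800 GTX: +65%
          # vs 7800 GTX 512: +37%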

      • Shintai
      • 14 years ago

      Is nVidia with the 7800GT/7800GTX/7800GTX 512 better? The last two are directly misleading to consumers, given the huge difference between them.

      • SpiffSpoo
      • 14 years ago

      Ya, going from 16 pipelines, which had trouble working at first, to 48 seems a bit improbable, and at those speeds as well. Considering that, it must be even more power-hungry. Yikes!

        • Shintai
        • 14 years ago

        It’s 16 pipes, 16 vertex shaders, 48 pixel shaders.

        Power would be near what the X1800XT uses, but performance will be massively higher in shader-intensive games (all new games).

      • A_Pickle
      • 14 years ago

      Yeah. Right. Tell that to people who buy that “performance RAM.”

        Crucial…

        • d0g_p00p
        • 14 years ago

        I thought the X in XDR was for eXtended, not eXtreme.

      • absinthexl
      • 14 years ago

      As bad as nVidia’s naming scheme is, ATI’s is about twice as horrible. That alone just makes me ignore half their lineup. Die die die die die die die die DIE!
