S3 unveils Chrome S20 series GPUs

S3 Graphics has just pulled back the curtains on its newest series of graphics chips, the Chrome S20 family. S3 turned to a new foundry partner, Fujitsu, to produce these chips, as we reported yesterday. Fujitsu manufactures the GPUs using a low-K, 90nm fab process, and S3 claims this advanced fabrication process, along with S3’s own Power Wise power management techniques brought over from its mobile products, has helped the Chrome S20 series achieve killer performance per watt.


The S20 series is made up of two variants, the Chrome S25 and the Chrome S27. From the looks of it, the S25 and S27 are probably the same basic silicon spun out into separate products for different markets.


Although these are new GPUs, the Chrome heritage of the S20 series is obvious. (Those of you who are curious about the architecture can read my review of one of the first DeltaChromes for more info.) The chips come with native PCI Express interfaces, and like the rest of the Chrome line, the graphics core supports DirectX 9’s Shader Model 2.0. The S20’s pixel shaders support a “full” 96 bits of floating-point precision per pixel, and they sport more registers than past Chrome GPUs, for faster, more efficient execution of code. Most notably for performance, the S20 series GPUs combine four vertex shaders, eight pixel shaders, and four texture units per GPU. That should give them a mix of attributes fairly similar to NVIDIA’s GeForce 6600 series, although real-world per-clock performance comparisons between different GPU architectures are tough to handicap based on specs alone.

The sexier of the two parts is the S27, intended as a competitor to the GeForce 6600 and the Radeon X1300 Pro. This chip will run its eight pixel shader pipes at a staggering 700MHz, and the fastest cards based on the S27 will use a 128-bit interface to talk to 128MB of GDDR3 memory, also clocked at 700MHz. S3 plans to sell these 700MHz/700MHz cards for about $100, which could make the S27 a reasonably appealing option in the so-called mainstream graphics market. NVIDIA’s recently announced GeForce 6600 with 256MB of DDR2 memory, for comparison, has reference clocks of 350MHz core and 400MHz memory—much less than the S27. The $99 version of the Radeon X1300 has 450/500MHz clocks and comes with 128MB of RAM. S3 believes the S27’s performance should be more than competitive in this segment of the market.
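For a rough sense of what those clocks imply, here’s a back-of-the-envelope sketch. The 128-bit bus widths assumed for the GeForce 6600 and Radeon X1300 are my assumptions rather than figures from the vendors, and whether each quoted memory clock is a base or effective (DDR) rate isn’t specified, so treat the absolute numbers as illustrative only.

```python
# Back-of-the-envelope numbers from the clocks quoted above.
# Assumptions: 128-bit memory buses for all three cards, and the quoted memory
# clocks taken as effective data rates. Neither assumption is confirmed here.

def memory_bandwidth_gbs(bus_width_bits, effective_mhz):
    """Peak theoretical memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * (effective_mhz * 1e6) / 1e9

cards = {
    # name: (assumed bus width in bits, quoted memory clock in MHz)
    "Chrome S27":        (128, 700),
    "GeForce 6600 DDR2": (128, 400),
    "Radeon X1300":      (128, 500),
}

for name, (width, mem_mhz) in cards.items():
    print(f"{name}: {memory_bandwidth_gbs(width, mem_mhz):.1f} GB/s memory bandwidth")

# Texel throughput for the S27, from the four texture units and 700MHz core
# clock mentioned above: 4 texture units x 0.7 GHz = 2.8 Gtexels/s peak.
print(f"Chrome S27 fill rate: {4 * 700e6 / 1e9:.1f} Gtexels/s")
```

On those assumptions, the S27’s clock advantage translates directly into a raw bandwidth edge over both competitors, which is presumably the comparison S3 wants buyers to make.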


The S27 has learned a new trick, too: a multiple graphics card technology that S3 calls… MultiChrome, of course. Like the current low-end SLI options from NVIDIA—and like ATI’s plans for Radeon X1300 CrossFire—MultiChrome will need no external connectors to operate, instead passing data over PCI Express lanes. To their great credit, S3 and VIA have decided to make MultiChrome a platform-independent affair; any capable PCI Express implementation should be able to work with MultiChrome, with no bogus driver-based lock-outs like NVIDIA and ATI impose with SLI and CrossFire.
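Since MultiChrome’s inter-card traffic rides over PCI Express rather than a dedicated bridge, the bandwidth available to it is simply a function of how many lanes the chipset gives each slot. Here’s a quick sketch using the standard first-generation per-lane rate of roughly 250MB/s in each direction; the lane counts shown are examples, not configurations S3 has specified.

```python
# PCI Express 1.x bandwidth scales linearly with lane count; MultiChrome's
# card-to-card traffic travels over these lanes rather than a dedicated bridge.

PER_LANE_MBS = 250  # nominal PCIe 1.x rate per lane, one direction

for lanes in (1, 4, 8, 16):
    print(f"x{lanes:<2} link: ~{lanes * PER_LANE_MBS / 1000:.1f} GB/s per direction")
```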


I somehow failed to ask how S3 handles load sharing between two cards and whether they’re planning to use a feature like application profiles to help determine which load-balancing method is best for a given application. They did say, though, that performance scales up to roughly 75% higher than a single S27 card, which isn’t bad given the connector-less implementation. I expect to get my hands on a dual S27 configuration before too long, so we’ll get those questions answered soon.
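To put that scaling figure in perspective, here’s a trivial sketch; the single-card frame rate is a made-up placeholder, and the 1.75x factor is just S3’s “up to roughly 75%” claim taken at face value.

```python
# What S3's "up to roughly 75% higher" MultiChrome claim works out to.
# The single-card frame rate is a hypothetical placeholder, not a benchmark result.

single_card_fps = 40.0       # hypothetical single-S27 result
claimed_scaling = 1.75       # S3's "up to ~75% higher" figure

dual_card_fps = single_card_fps * claimed_scaling
efficiency = claimed_scaling / 2.0   # fraction of a perfect 2x speedup

print(f"Dual S27 estimate: {dual_card_fps:.0f} fps "
      f"({efficiency:.0%} of ideal two-GPU scaling)")
```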


The S25 is the lower-end part, and with peak clock speeds of 600/400MHz for GPU core and memory, it will do battle with the Radeon X1300 HyperMemory and GeForce 6200 TurboCache cards of the world. Indeed, S3 has endowed the S25 with its own AcceleRAM tech, which is the same basic thing as TurboCache and HyperMemory: the S25 can read from and write to main system memory over the PCI Express interface, just as it can to local memory. S25 AcceleRAM cards will come with anywhere from 32 to 128MB of local memory, and they’ll allocate additional space in system memory to expand their effective storage.
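To make the AcceleRAM arithmetic concrete, here’s a small sketch. The specific sizes and the local-memory bandwidth figure are illustrative assumptions, and the PCI Express number is just the nominal one-way rate for a first-generation x16 link.

```python
# Sketch of how an AcceleRAM-style card extends its effective memory pool.
# All sizes and the local bandwidth figure below are illustrative assumptions.

local_memory_mb = 64     # hypothetical on-card memory (article says 32-128MB)
system_alloc_mb = 128    # hypothetical slice of main memory claimed by the driver

effective_mb = local_memory_mb + system_alloc_mb

# The trade-off: system memory is reached over PCI Express. A first-generation
# x16 link tops out around 4 GB/s each way, and real throughput to system RAM,
# which is shared with the CPU, falls well short of that nominal figure.
local_bw_gbs = 6.4       # e.g. a 128-bit bus at an effective 400MHz (assumption)
pcie_x16_bw_gbs = 4.0    # nominal PCIe 1.x x16 rate, one direction

print(f"Effective memory: {effective_mb} MB "
      f"({local_memory_mb} MB local + {system_alloc_mb} MB borrowed from system RAM)")
print(f"Local memory vs. PCIe path: {local_bw_gbs} GB/s vs. ~{pcie_x16_bw_gbs} GB/s")
```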


Both S20 series GPUs come with the latest version of S3’s Chromotion video engine, massaged and revamped from past Chrome products. Chromotion 3.0 will support WMV HD acceleration, but it stops short of H.264 decode acceleration, a compute-intensive task that will likely challenge the ATI and NVIDIA chips in this segment. The S20’s display engine can drive an HDTV in modes up to 1080p or a couple of LCD displays via a pair of DVI channels. S3 expects the S20, like most DX9 GPUs, to be capable of handling Windows Vista’s Aero Glass interface, and the S20 can even run in 10-bit-per-color-channel video modes for high-color photo editing and the like.


As one might expect given S3’s focus on performance per watt, mobile versions of the S20 series are in the works. Also, S3 has big plans for future GPU generations. Rather than aim for Shader Model 3.0, where the S20’s primary competitors are now, S3 is planning a jump to Shader Model 4.0 in its next-gen GPUs.

As ever, the Chrome S20’s success in North America will hinge on actual availability of the product. (We can talk about driver quality and game compatibility, as well, but not if the cards aren’t selling.) S3 says cards based on the Chrome S20 chips will be available today in China, and that it’s working on getting major, respected U.S. e-tailers to make cards available on these shores. We’ll be watching to see what develops.

Comments closed
    • Bensam123
    • 15 years ago

    Anyone else see an underdog appearing in the graphics market?

    Not just another company that also makes graphics chips; judging from the products S3 has released in the past versus what they’re releasing now, it looks like they’re slowly but surely catching up to ATi and Nvidia.

    I would never have even considered buying one of S3’s high-end products over a bargain-bin Radeon or GeForce, but now they’re at the point where they deliver spec-wise comparable cards to current ATi/Nvidia low-end cards. If the drivers deliver (which they never have in the past) and they perform well (again, which they haven’t in the past), they might end up grabbing a share of the bargain bin.

    I still wouldn’t consider buying one and probably won’t for a while, but next year or the year after they might be competing in the mid-grade graphics card market if things keep going this way.

    Wonder if one of the three will catch on and figure out that everyone’s system isn’t brimming with PCI-E x16 slots, and build lower-end ‘modules’ that you can plug into a spare PCI-E x1 slot to assist the main GPU.

    Everyone is all about half and half but no one considers assigning certain tasks to GPU helpers…

    So much bandwidth… so little utilized…

    • Pete
    • 15 years ago

    Scott, did they really decouple the texture units from the pixel pipes, or did you mean to say ROPs? Or do the cards have four actual pixel pipes (one quad) with two ALUs per pipe?

    • Delta9
    • 15 years ago

    Good. The last generation of S3 was crap at first, but the drivers improved. If they’ve finally designed a crossbar memory controller and a good texture compression algorithm, it should be a compelling product for an entry-level system. That is, if they can deliver at a killer price point. The drivers for this architecture are vastly improved since the Chrome VPU was first released. Still not perfect, but very usable.

    • LoneWolf15
    • 15 years ago

    When was the last time S3 truly got a GPU out the door in a part you could easily buy, on an actual card (not an embedded solution)?

    And when that last happened, how long did it take S3 to introduce reasonable drivers? And by the time those came out, wasn’t the price of the cards higher than that of the cards the GPU was actually meant to compete with, while the performance was lower?

    I will consider an S3-based card when the following happens:

    a) S3 actually partners with a vendor who actually releases cards
    b) The cards are actually released to the public on time, not six months after the planned release date
    c) The cards have mature enough drivers that they can be reviewed successfully by multiple tech sites, so we know end-users won’t be taking the shaft. These drivers must be available at product launch, not 6-10 months later; they don’t have to be perfect, but they can’t be the horrible bug-fest that previous S3 products (remember Savage?) have suffered.
    d) The product must perform, at time of release, within 5-10% of the products S3 claims to be competing with.
    e) The price needs to be competitive with (read: lower than) the competition’s, because S3 has done such a shoddy job over the past several years that nobody would choose an S3 card over an equivalent GeForce or Radeon, even if the performance was 5-10% better.

    Would I like to see S3 manage to pull this off? Darn tootin’ I would; it’s about time there was a serious third player in the graphics market. I’m not holding my breath, though; if I did every time S3 had a press release on the Chrome family of GPUs, I’d be six feet under by now.

      • Vrock
      • 15 years ago

      That would be the Savage2000. It was the little card that almost did. It went toe to toe with the original GeForce SDR card, and it was far cheaper. The only problem was that the silicon was buggy and, IIRC, the T&L unit didn’t work in D3D. This was back when hardware T&L was the big industry buzzword. The buggy silicon combined with S3’s horrendous driver support sent this card to its death. And that’s a shame, because if it had worked right, it could’ve been a serious contender.

        • LoneWolf15
        • 15 years ago

        The same could have been said of the Savage3D or Savage4. The Savage family started S3’s true downward spiral (though you could argue the same of the S3 ViRGE).

        I knew what chipset it was, but there has been more than one Savage (Savage3D, Savage2k, Savage4) so listing a specific model when all of them had similar issues would have been a moot point.

      • Shinare
      • 15 years ago

      heh I would also like to see ATi pull that off.

        • LoneWolf15
        • 15 years ago

        ATI has pulled most of it off. Multiple vendors are selling their GPUs. Their drivers are far more stable than they were back in ’98-2000, and everyday use is quite good (I should know; I sent back a Rage Fury in January of ’99 due to drivers, and own a Radeon X800XL now that is great).

        ATI’s one major flaw is in the availability of product, something that’s really only been an issue starting with the Radeon X-series cards. I agree that it’s a major issue, as it hurts ATI’s credibility, but unlike S3, ATI cards eventually ship.

    • genesisx
    • 15 years ago

    I really like the platform independent SLI thing. I wish ati and nvidia would adopt such a policy, but it’s unlikely given their desire to conquer the mobo chipset market.

    Is there any significant technological limitation stopping them from doing so? Couldn’t they get comparable performance to their specialized solutions by just throwing more pci-e lanes at the problem?

    • AmishRakeFight
    • 15 years ago

    I don’t trust S3… it used to be a pain in the ass to find drivers for their shite. Didn’t they change their name to Sonic Blue or some crap once to dodge people looking for support on their cards?

      • axeman
      • 15 years ago

      Oh, what a sordid saga this was. I believe Diamond Multimedia bought them out or something, changed their name to Sonic Blue. Sonic Blue was basically pushing MP3 players and stuff. Then S3 (the graphics division) got sold to VIA, and somehow you can buy Diamond Multimedia graphics cards again? Makes my head spin.

      edit: my bad. S3 bought Diamond Multimedia to acquire the Rio, changed their name to Sonic Blue, then sold off S3 Graphics to VIA. Now Sonic Blue is no more? But Rio is still around. And so is Diamond, but Diamond no longer makes the Rio. O_o

        • Third Eye
        • 15 years ago

        Around 1998, a couple of graphics chipset manufacturers acquired card manufacturers:
        3DFx acquired Simply The Best (STB)
        S3 acquired Diamond Multimedia
        They pissed off all of their existing customers, who ran to NVIDIA (after that, NVIDIA never looked back). They were trying to emulate the Canucks, ATI (at that time ATI was a card/chipset manufacturer) and Matrox.
        Look now. The most successful companies are NVIDIA and ATI (which has now diversified as a chipset provider for other card manufacturers).
        3DFx was swallowed by NV and S3 is now a subsidiary of VIA.

        VIA, meanwhile, was looking for a graphics partner, while S3 was trying to concentrate on consumer electronics devices. Intel was already pissed off at VIA for delivering the Apollo Pro 133 chipset, which sounded the death knell for the 820 RDRAM experiment, so as usual Intel was about to take action against VIA. Knowing Intel’s tactics, Wen Chi Chen, VIA’s CEO (and a former Intel employee), hit upon a double jackpot:
        a) He would have a stake in S3 Graphics.
        b) S3 had a 10-year patent cross-licensing deal with Intel because of its acquisition of a company called Exponential Semiconductor.
        c) So as long as any product came out of an S3 venture, Intel couldn’t do anything.
        So he was able to manufacture chipsets for the Pentium platform without caring a damn about Intel.

        • just brew it!
        • 15 years ago

        They also acquired (and killed off) Micronics, who used to be one of the better mobo makers “back in the day”.

        Starfalcon currently has two of my old Micronics mobos (a White Lightning Pentium Pro board and a C200 Socket 7 board). Both still work just fine.

    • dragmor
    • 15 years ago

    Sites are reporting 10-12W, so it has the same wattage as a 9600 (plain) and the performance of a 6600. Doesn’t sound like anything special.

    • Flying Fox
    • 15 years ago

    It’s hard not to root for this little guy, especially since they do keep trying. Let’s hope it works this time.

    • Sargent Duck
    • 15 years ago

    Heh, I thought I was reading a preview of it on the front page. I think that’s definitely one of the longer front-page posts in a while.

    But a good read nevertheless.

    • Hattig
    • 15 years ago

    Correction: “the graphics core supports DirectX 9’s Shader Model 3.0” should be Shader Model 2.0 I think.

    Supported by: “S3 has big plans for future GPU generations. Rather than aim for Shader Model 3.0, where the S20’s primary competitors are now, S3 is planning a jump to Shader Model 4.0”

      • Damage
      • 15 years ago

      Ack! Fixed. It’s 2.0, of course.

    • FireGryphon
    • 15 years ago

    S3 is slowly rising. We may be on the virge of a savage takeover 😛

    FP!

      • Chrispy_
      • 15 years ago

      That would make a Trio of competitors with the same Vision
      (to become the Twisting Aurora of the graphics world)

      😀

      All the others cannot be used because they were just numbers IIRC

        • FireGryphon
        • 15 years ago

        touché! 😀
