The 3D bits
Naturally, because 3D is the part of graphics that's so hard, I hit the S3 folks with a barrage of questions to see if I could understand exactly how well put-together this chip is, what its advantages are, and what its liabilities might be. I'll try to relate all the good bits, then I'll talk about the compromises S3 had to make, including any potential "gotchas" that might make the chip fall short of the mark.
DeltaChrome is a true DirectX 9-class design, with the ability to handle floating-point pixel data throughout its 3D pipelines, including texture formats and color/frame buffers. The chip's pixel and vertex shaders are intended to meet Microsoft's 2.0 shader specs, and they include a few extra features, as well. The 8-pipe DeltaChrome chip has four vertex shader units on it, although that doesn't tell you much, because vertex shader units vary widely in terms of performance, internal parallelism, and the like. Still, this chip has four of S3's vertex shader units.
For those of you unfamiliar with DirectX 9, all those things make a recipe for some very tasty eye candy. With 96 bits per pixel of floating-point precision, DeltaChrome should be able to pull off some killer effects like high-dynamic-range lighting.
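To make the precision point concrete, here is a minimal sketch (my own illustration, not S3's pipeline) of why floating-point color buffers matter for high-dynamic-range lighting: a traditional 8-bit integer buffer clamps bright values at 1.0 and destroys highlight detail, while a float buffer carries the full value through to a final tone-mapping pass.

```python
import math

def store_int8(value):
    """Store a color channel in a traditional 8-bit buffer (clamps at 1.0)."""
    clamped = min(max(value, 0.0), 1.0)
    return round(clamped * 255)

def load_int8(stored):
    """Read an 8-bit channel back as a float in [0, 1]."""
    return stored / 255

def tone_map(value, exposure=0.25):
    """Simple exposure-style tone mapping back into the displayable range."""
    return 1.0 - math.exp(-value * exposure)

bright = 8.0   # a sunlit highlight, far above the displayable range

# Integer path: the highlight is clamped to 1.0 before tone mapping.
int_result = tone_map(load_int8(store_int8(bright)))

# Float path: the full value survives until the final tone-mapping pass.
float_result = tone_map(bright)

print(float_result > int_result)  # True: the float buffer kept the highlight
```

The exposure curve and values here are arbitrary; the point is only that the integer path loses everything above 1.0 before the tone mapper ever sees it.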
Unlike Matrox's pokey Parhelia, the DeltaChrome includes robust on-chip facilities for occlusion detection and overdraw reduction. Most modern graphics chips include these facilities, because without them, GPUs would waste too much time processing pixels that will never be seen: objects behind other objects, occluded from view. Overdraw reduction matters more now that we have programmable shaders, because a single pixel may occupy many GPU clock cycles. S3's overdraw reduction scheme combines multiple techniques, including a low-res hierarchical Z buffer, Z analysis early in the rendering pipeline, and some unique methods that I'd love to know more about, but the sauce is too secret for S3 to discuss them. DeltaChrome also includes provisions to reduce Z-buffer reads and writes, including a zero-cycle Z clear.
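For readers who haven't seen early-Z rejection in action, here is a toy illustration (the details are my assumptions, not S3's secret sauce): test each incoming fragment's depth against the Z buffer before running the expensive pixel shader, so occluded fragments never consume shader cycles.

```python
def render(fragments, width, height):
    """fragments: list of (x, y, depth, color) tuples; nearer = smaller depth."""
    zbuffer = [[float("inf")] * width for _ in range(height)]
    framebuffer = [[None] * width for _ in range(height)]
    shaded = 0
    for x, y, depth, color in fragments:
        if depth >= zbuffer[y][x]:
            continue              # occluded: rejected before shading
        shaded += 1               # only visible fragments pay the shader cost
        zbuffer[y][x] = depth
        framebuffer[y][x] = color
    return framebuffer, shaded

# With the near object drawn first, early-Z rejects the far one outright.
frags = [(0, 0, 0.2, "near"), (0, 0, 0.8, "far")]
fb, shaded = render(frags, 1, 1)
print(fb[0][0], shaded)  # near 1
```

A hierarchical Z buffer extends the same idea by storing a coarse, tiled copy of the depth buffer, so whole blocks of occluded pixels can be rejected with a single comparison.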
All of this, so far, sounds pretty good to me. I questioned the S3 folks for hours, took pages of notes, and did my best to find chinks in the DeltaChrome's shiny, cubic-environment-mapped armor, but I didn't find much. The NV3x chips are rumored to be low on register space in their DX9 pixel shaders, so I asked about that. S3 was confident DeltaChrome had been properly engineered in this area. I also asked about S3's compiler development efforts, because compiling from high-level shading languages like Microsoft's HLSL will be crucial to next-gen DX9 apps and games. The answer: S3's compiler development efforts are ongoing, with 10 to 15 developers on the team.
Decent answers, all things considered. The patient people from S3 were exceptionally forthcoming as I quizzed them, too, which is a very good sign.
Design compromises and potential "gotchas"
S3's engineers did have to make a few compromises in designing DeltaChrome, and there are a few potential weaknesses that may keep the chip from becoming a titan of 3D performance. I'll look at each and try to assess its possible impact.
First, S3 didn't include support for any form of higher-order surfaces, like ATI's Truform. This is one feature I can't imagine most gamers will miss, because there's no single accepted standard for HOS support at present, anyhow. These issues will probably be worked out starting with the next generation of GPUs, with the pixel and vertex shader 3.0 standards in DX9. Omitting HOS support from DeltaChrome was probably a savvy choice for S3.
Next, the DeltaChrome pixel shaders do not have dynamic flow control, so pixel shader programs can't branch and loop at will like NVIDIA's NV3x pixel shaders. Dynamic flow control is not required in the DX9 pixel shader 2.0 spec, and as I understand it, the DeltaChrome shares this limitation with ATI's R3x0 chips, not bad company to keep, all things considered. For S3, the upshot is that DeltaChrome will support DX9 and OpenGL 1.3, but it will not support OpenGL 2.0. Since S3 is not aiming for the workstation market with DeltaChrome, that's probably a fair compromise to make. Still, I have to think that the company could deliver an OpenGL 2.0 driver if it really wanted to by clever use of multipass rendering.
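The distinction is worth a quick sketch (a generic illustration, not S3's compiler): without dynamic flow control, a shader loop's trip count must be known at compile time, so the driver can unroll it into straight-line code; with dynamic flow control, the loop bound can come from per-pixel data, which these shaders cannot branch on.

```python
def unroll_static_loop(body_ops, trip_count):
    """Compile-time unrolling: emit the loop body trip_count times, no branches."""
    return body_ops * trip_count

# Static case: four light sources known at compile time become four
# straight-line copies of the loop body, with no run-time branching.
program = unroll_static_loop(["sample", "mul", "add"], 4)
print(len(program))  # 12 instructions, all straight-line

# Dynamic flow control would instead branch at run time on shader data,
# e.g. looping once per light found in a texture, which a shader without
# dynamic branching cannot express in a single pass.
```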
A more severe limitation, at least to some, will be DeltaChrome's relatively weak support for edge antialiasing. The chip will do anisotropic filtering up to 16X strengths, but edge AA is limited to 2X supersampling at resolutions up to 1024x768. That's it. AA wasn't a big priority for S3, so the design team didn't blow its transistor budget on this feature. I'll be interested to see how gamers respond to this one. For all the attention AA has gotten over time, I'm not sure how much it will be missed.
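For the curious, here's roughly what 2X supersampling buys, sketched in miniature (my illustration, nothing S3-specific): take two samples per pixel instead of one and average them, which turns a hard polygon edge into an intermediate shade.

```python
def supersample_2x(sample_fn, width, height):
    """Average two subsamples per pixel, taken at x and x + 0.5."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            s1 = sample_fn(x, y)
            s2 = sample_fn(x + 0.5, y)
            row.append((s1 + s2) / 2)
        image.append(row)
    return image

# A hard vertical edge: full coverage left of x = 1.5, none to the right.
edge = lambda x, y: 1.0 if x < 1.5 else 0.0

img = supersample_2x(edge, 4, 1)
print(img[0])  # [1.0, 0.5, 0.0, 0.0]: the edge pixel is smoothed to 0.5
```

The cost is also visible here: every pixel is sampled (and shaded) twice, which is why supersampling above 1024x768 would chew through fill rate and memory so quickly.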
Performance-wise, one big concern is the memory controller on the DeltaChrome, which is not a crossbar design. The more sophisticated memory controllers from ATI and NVIDIA incorporate multiple ports, a switch fabric, and fancy load-balancing algorithms to take best advantage of a precious resource: memory bandwidth. S3 says the DeltaChrome has a sophisticated cache design, but all memory transfers will happen in 128-bit chunks. We'll have to see whether this compromise produces a fill rate bottleneck.
My last big concern is a simple one, which is that the DeltaChrome could turn out to perform unacceptably for some other reason. The only really competent performers in DirectX 9 right now are ATI's R3x0 chips, and those chips are the standard platforms for a great many game developers. The R3x0 series has a formidable amount of computation power onboard, and the DeltaChrome could fall short of the mark by delivering far fewer instructions per clock than the ATI chips. NVIDIA is experiencing this problem right now for reasons we only partially understand, and I don't believe the DeltaChrome has those same pitfalls. But perhaps it has others. Time will tell.
Barring that sort of unspecified catastrophe, S3's new graphics chip looks mighty decent on paper. DeltaChrome doesn't have to tear through benchmarks like an NVIDIA driver cheat in order to succeed, but it does need to provide a reasonably smooth gaming experience without too many compatibility headaches. If it can do that, I expect many of us would be willing to live with the compromises the S3 design team has made, with the possible exception of the rather weak edge antialiasing.