SiS’s Xabre 400 graphics processor

Manufacturer: SiS
Model: Xabre 400
Price (street): US$99
Availability: Now

VERTEX AND PIXEL SHADERS are where it’s at for 3D graphics. Everyone deserves a DirectX 8.1-compatible graphics card, and trickle-down is a wonderful thing. Last year’s high-end DirectX 8.1 features have now made their way down to value graphics cards that everyone can afford. SiS’s Xabre is the latest value GPU to feature a full DirectX 8.1 feature set, and it’s the first to support the new AGP 8X standard. Other than Xabre, only ATI’s high-end Radeon 9700 graphics card supports AGP 8X, and it’s only just now becoming available to consumers.

There’s more to SiS’s Xabre than support for DirectX 8.1 and AGP 8X, and only diving into the registry reveals the cunning conspiracy. Years ago, one of SiS’s integrated graphics chipsets was caught rendering only every other frame in Ziff-Davis’ then-popular 3D WinBench performance test. It was cheating, and a mark was made on SiS’s permanent record. Now, with Xabre, SiS is again using questionable tactics to deliver better frame rates in 3D applications.

Is SiS cheating again? What does the performance picture look like when we force a level playing field? Find out the answers to these questions and more as we take an in-depth look at SiS’s Xabre 400 graphics chip.

The chip
SiS hasn’t introduced a new GPU since its 315 chip, so the Xabre is a pretty big deal. Xabre currently comes in three flavors (the Xabre 400, 200, and 80), but we’ll be concentrating on the Xabre 400 for the purposes of this review. As you might expect, the Xabre 200 and 80 run at lower clock speeds. The Xabre 80 also has only a 64-bit DDR memory bus and is only AGP 4X-compliant.

Unlike ATI and NVIDIA, who outsource their chip production to specialists like TSMC and UMC, SiS fabs the Xabre chips itself.


The Xabre chip is hiding under there

Here are some of the Xabre’s key features:

  • A 4×2 rendering pipeline — SiS has given Xabre four pixel pipelines, each capable of laying down two textures per pass. As far as I’m aware, there’s no functionality that allows Xabre to “loop back” and lay down multiple textures without a second rendering pass. For complex shader calculations, Xabre will have to write to the frame buffer, read back the results, and possibly lose some color precision in the process.

    Today’s games don’t lay down that many textures per pass, but future titles using more complex shader programs will. Then again, this is a budget graphics part, and the only announced graphics card with any kind of really hard-core internal precision is ATI’s high-end Radeon 9700 Pro.

  • Shaders — Xabre features version 1.3 pixel shaders, which have a maximum instruction length of 12 instructions (8 arithmetic, 4 texture address). Pixel shaders are at the heart of DirectX 8’s advances, because they enable more complex pixel processing (and thus tastier eye candy) in 3D applications.

    What’s really interesting is that Xabre has no vertex shaders. DirectX doesn’t have any facility for emulating pixel shaders (they could be emulated via multi-pass rendering, but DX8 doesn’t break down pixel shader programs into multiple passes, probably for performance reasons). However, DX8 does have a software vertex shader implementation, so cards without vertex shaders can lean on the CPU for vertex shader handling. SiS kept Xabre’s transistor count low by leaving out vertex shaders. Clever, no? This is one of the benefits of using a clean-slate design for a budget chip instead of recycling an older graphics core, like NVIDIA did with the GeForce4 MX.

    There is a hitch, however. The lack of a vertex shader makes the Xabre look a lot more like the GeForce4 MX than a DirectX 8.1-compliant graphics card, and it could run into problems with some DX8 titles. Applications will have to be intelligent enough to recognize that the Xabre has pixel shaders but not vertex shaders (see the caps-checking sketch after this list). If applications don’t get that right, they may either turn off shader-specific features or fail to run at all.

    SiS would have been wise to include a software vertex shader in its driver software, so Xabre could expose both pixel and vertex shader capabilities to applications. A driver-level vertex shader would also present opportunities for tweaking, extensive SIMD support, and the like. Instead, SiS chose to rely on Microsoft’s DX8 vertex shader.

  • Bandwidth conservation — Xabre uses “Frictionless Memory Control” to conserve memory bandwidth, but details on just what makes the system frictionless and how it actually saves bandwidth are scarce. The most we were able to squeeze out of SiS’s engineers is that FMC is a two-channel architecture. The Xabre also attempts to conserve bandwidth with Z-compression, fast Z-clear, and SiS’s own proprietary hardware occlusion culling algorithm.

    Like every other graphics chip in its class, Xabre has a 128-bit memory bus, which means there should be a decent amount of memory bandwidth with DDR SDRAM and reasonable memory clock speeds.

  • AGP 8X — SiS is talking a lot of smack about AGP 8X with the Xabre, and to its credit, Xabre was the first AGP 8X graphics card available. However, with a shortage of AGP 8X motherboards on the market, the fact that Xabre was first is a bit of a moot point.

    As you may have already guessed, AGP 8X delivers twice the bandwidth of AGP 4X, which puts it right around 2.1GB/s (a 32-bit interface at an effective 533MT/s works out to roughly 2.1GB/s). Greater AGP bandwidth will theoretically let Xabre pull more data over the AGP bus, but what kind of impact this will have on performance is unclear. For AGP 8X to show any benefit, an application has to push more than the roughly 1GB/s that AGP 4X already provides, and I’m not sure applications are quite there yet.

  • Multi-monitor and video capabilities — Xabre features an integrated 375MHz RAMDAC and MPEG decoder, which means cards should support a good range of resolutions and refresh rates, plus DVD playback. There’s also a TV encoder on the chip, but Xabre needs the companion SiS 301 chip to output signals to a TV. The 301 chip also powers a DVI output or second VGA monitor if you want to use multiple monitors.

    SiS has programmed in some per-pixel motion detection de-interlacing that kicks in when you use the integrated MPEG decoder. This feature should improve the quality of incoming video streams.
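
Earlier in this list, we pointed out that applications will have to detect Xabre’s unusual capability set (hardware pixel shaders, no hardware vertex shaders) on their own. Here’s a minimal sketch of the kind of Direct3D 8 caps check involved; the fallback policy at the end is just one reasonable choice on our part, not something SiS or any particular game prescribes.

    #include <d3d8.h>
    #include <cstdio>

    // Minimal sketch: query the adapter's shader caps and pick a rendering path.
    // A part like Xabre reports pixel shaders but no hardware vertex shaders.
    void ChooseShaderPath(IDirect3D8* d3d)
    {
        D3DCAPS8 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        bool hasPixelShaders  = caps.PixelShaderVersion  >= D3DPS_VERSION(1, 1);
        bool hasVertexShaders = caps.VertexShaderVersion >= D3DVS_VERSION(1, 1);

        if (hasPixelShaders && !hasVertexShaders) {
            // Xabre's case: keep the pixel shader eye candy, but create the device
            // with D3DCREATE_SOFTWARE_VERTEXPROCESSING so DX8 runs vertex shader
            // programs on the CPU.
            std::printf("Hardware pixel shaders, software vertex shaders\n");
        } else if (hasPixelShaders && hasVertexShaders) {
            std::printf("Full hardware shader path\n");
        } else {
            // GeForce4 MX-style parts: no shader support, fall back to fixed function.
            std::printf("Fixed-function fallback\n");
        }
    }

A game that checks only VertexShaderVersion before enabling shader effects will treat the Xabre like a GeForce4 MX and skip the eye candy; one that checks only PixelShaderVersion and then demands hardware vertex processing may fail outright. That’s the detection problem in a nutshell.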


SiS’s 301 chip powers secondary displays
 

The specs
Now that we have an idea what Xabre is all about, let’s take a look at how it stacks up against its competition, theoretically speaking. Fill rates and memory bandwidth don’t guarantee real-world performance, but they will give us a good launching point for our benchmarks.

| Card | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz, effective) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
|---|---|---|---|---|---|---|---|---|
| GeForce4 MX 440 | 270 | 2 | 540 | 2 | 1080 | 400 | 128 | 6.4 |
| GeForce4 Ti 4200 128MB | 250 | 4 | 1000 | 2 | 2000 | 444 | 128 | 7.1 |
| Radeon 7500 | 290 | 2 | 580 | 3 | 1740 | 460 | 128 | 7.4 |
| GeForce4 Ti 4200 64MB | 250 | 4 | 1000 | 2 | 2000 | 500 | 128 | 8.0 |
| Radeon 8500LE | 250 | 4 | 1000 | 2 | 2000 | 500 | 128 | 8.0 |
| SiS Xabre 400 | 250 | 4 | 1000 | 2 | 2000 | 500 | 128 | 8.0 |
| Radeon 9000 Pro | 275 | 4 | 1100 | 1 | 1100 | 550 | 128 | 8.8 |
| GeForce4 MX 460 | 300 | 2 | 600 | 2 | 1200 | 550 | 128 | 8.8 |
| Radeon 8500 128MB | 275 | 4 | 1100 | 2 | 2200 | 550 | 128 | 8.8 |

With four pixel pipelines laying down two textures per pipe, the Xabre 400 lines up perfectly with ATI’s Radeon 8500LE and GeForce4 Ti 4200s in terms of single and multi-texturing fill rate. With 8GB/s of memory bandwidth, things remain competitive across the board. These are just theoretical peaks, so don’t get too excited just yet.
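
To show where the Xabre 400’s numbers in the table come from, here’s the arithmetic. The same formulas apply to every card in the chart, and the figures below simply restate the table’s theoretical peaks, not measured results.

    Peak pixel fill rate: 250MHz core clock × 4 pixel pipelines = 1,000 Mpixels/s
    Peak texel fill rate: 1,000 Mpixels/s × 2 texture units per pipeline = 2,000 Mtexels/s
    Peak memory bandwidth: 500MHz effective (250MHz DDR) × 128 bits ÷ 8 bits/byte = 8.0GB/s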

The card
Now let’s take a look at the reference card.


That’s one massive purple heat sink

I’m not a huge fan of purple, but I can live with just about any color on my GPU heat sink as long as the fan isn’t overly annoying. Unfortunately, the fan on our reference card’s heat sink was beyond annoying. Whenever you fire up a 3D application, the temperature-sensitive fan kicks into high gear and starts whining louder than a Morpheus user complaining about high CD prices. Really, it’s that loud.

So the fan is loud, but doesn’t the large size at least cool the memory?


No rear RAM heat sinks

Nope. As you can see, there are no heat sinks on the card’s rear memory chips. Normally this would mean that there’s an imbalance between the memory cooling on the front and rear memory chips, but that’s actually not the case here.


Notice the gap between the heat sink and RAM chip

It doesn’t really matter that there are no heat sinks on the rear RAM chips because the heat sink doesn’t actually make physical contact with memory chips on the front of the card. The gap between the heat sink and memory chips isn’t a result of a sloppy heat sink installation; the top of the GPU and memory chips simply don’t line up in a manner that lets the flat base of the heat sink make contact with both.


Standard back plate ports

Our Xabre reference card features all the usual ports. Given the low price targets of Xabre-based cards, I wouldn’t expect most manufacturers to include the DVI-to-VGA adapter necessary to run two VGA monitors.


DDR SDRAM chips rated to 300MHz

The Xabre 400’s memory spec only calls for a 250MHz memory bus, but the 64MB worth of chips on our reference card were rated at 300MHz. That should be one heck of an easy overclock, though I doubt third party cards will be so generous with their memory chips.

 

Driver and platform quirks
SiS’s drivers for the Xabre are about as lean as drivers come these days. There are no tuning options for any 3D features, multi-monitor control is especially weak, and there isn’t even a vsync toggle. In a world where NVIDIA and ATI keep raising the bar for driver software, SiS’s effort leaves much to be desired.

The driver CD for our review sample came with a copy of SiS’s 3D Wizard software. Although this utility isn’t available for download from SiS, I would expect it to ship with cards based on the Xabre chipset. 3D Wizard gives you control over antialiasing, 3D stereo glasses, and overclocking, but its support of controversial wireframe and transparency modes is sure to ruffle a few feathers.


Eagle-Eye: Wireframe


Eagle-Eye: Transparency

In the past, wireframe and transparency modes have been justified by manufacturers as enabling users to learn more about 3D rendering. However, SiS includes access to these features under the 3D Wizard’s “Eagle-Eye” tab, which doesn’t sound all that educational to me.

Normally we like to make sure the graphics cards we’re testing are displaying textures at the highest possible quality setting, because high benchmark scores mean nothing if rendering quality is poor. Since SiS’s drivers don’t include any texture sliders, we had to hack our way through the registry to access a setting called TexTurbo, which X-bit Labs uncovered in its Xabre review. On our review system, the SiS.3D.TexTurbo value was found in the following registry location:

\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{9396991D-4B21-9DCB-EFA46829E612}\0000.

In a nutshell, TexTurbo controls the quality of textures, which in turn affects frame rates and image quality. SiS’s latest WHQL drivers default to a TexTurbo setting of 3, and we’ve tested that. We’ve also run through a complete line of tests with TexTurbo set to 0, and we’ll be comparing the two along the way in both performance and image quality to see how they differ.
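
For readers who want to run the same comparison, here’s a minimal sketch of how the value can be flipped programmatically rather than through regedit. It assumes the registry path and SiS.3D.TexTurbo value name quoted above (the GUID portion will almost certainly differ from system to system), assumes the setting is stored as a DWORD (which the 0-3 range suggests), and needs administrator rights since it writes to HKEY_LOCAL_MACHINE. Back up your registry before poking at it, and you’ll probably need to reboot before the driver picks up the change.

    #include <windows.h>
    #include <cstdio>

    // Sketch: set the Xabre driver's TexTurbo value. The key path below is the one
    // we found on our review system; yours will have a different GUID and possibly
    // a different instance number than 0000.
    int main()
    {
        const char* keyPath =
            "SYSTEM\\CurrentControlSet\\Control\\Video\\"
            "{9396991D-4B21-9DCB-EFA46829E612}\\0000";
        const char* valueName = "SiS.3D.TexTurbo";
        DWORD texTurbo = 0; // 0 = full texture quality, 3 = SiS's faster default

        HKEY key;
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, keyPath, 0, KEY_SET_VALUE, &key) != ERROR_SUCCESS) {
            std::fprintf(stderr, "Couldn't open the display adapter's registry key.\n");
            return 1;
        }
        LONG result = RegSetValueExA(key, valueName, 0, REG_DWORD,
                                     reinterpret_cast<const BYTE*>(&texTurbo),
                                     sizeof(texTurbo));
        RegCloseKey(key);
        return (result == ERROR_SUCCESS) ? 0 : 1;
    }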

Having to use the registry to change a driver setting is worrisome, but we’ll have to go through some benchmarks and image quality tests before passing judgment. TexTurbo could be a big deal or a minor inconvenience.

Stability was a big problem with the Xabre on Abit’s AT7 motherboard, which uses VIA’s KT333 chipset. The card was stable on Abit’s KT266A-based KR7A-RAID, but I’m hearing there are issues with Xabre-based cards on VIA and nForce platforms in general. If problems plague both VIA and NVIDIA chipsets, chances are pretty good that the blame lies with SiS rather than with the chipsets themselves.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. All tests were run three times, and the results were averaged.

Motherboard: Abit BD7II-RAID
Processor: Intel Pentium 4 2.26GHz
Front-side bus: 4 x 133MHz (533MHz)
Chipset: Intel 845E
North bridge: Intel 82845E (MCH)
South bridge: Intel 82801DB (ICH4)
Memory size: 512MB (2 DIMMs)
Memory type: CAS 2.5 PC2700 DDR SDRAM
Graphics: SiS Xabre 400, GeForce4 Ti 4200 128MB, GeForce4 Ti 4200 64MB, GeForce4 MX 460, GeForce4 MX 440, Radeon 7500, Radeon 8500LE, 3D Prophet 8500 128MB, Radeon 9000 Pro
Graphics drivers: SiS 3.03, NVIDIA 30.82, ATI 7.74
Storage: IBM 60GXP 40GB 7200RPM ATA/100 hard drive
Operating system: Windows XP Professional

Today we’re comparing the Xabre 400 with the most recent crop of budget video cards from ATI and NVIDIA. Some of these cards are price-competitive with products based on the Xabre 400, while others will show just what kind of performance gains to expect if you’re willing to spend a little more.

The Abit BD7II-RAID motherboard we used for testing doesn’t support AGP 8X, which may put the Xabre at a bit of a disadvantage. However, considering how few applications saturate an AGP 4X bus, I think we’ll be alright.

We used the following versions of our test applications:

The test systems’ Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests. Most of the 3D gaming tests used the high detail image quality settings in 32-bit color.

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

3DMark2001 SE

In 3DMark2001 SE, the Xabre 400 doesn’t perform well at all; there’s also quite a performance hit when TexTurbo is set at 0. Though the Xabre 400 claims to be a fully DirectX 8.1-compliant product, it’s the slowest DirectX 8.1 part in 3DMark2001 SE. In fact, with TexTurbo set to 0, the Xabre 400 is slower than the GeForce MX cards and the Radeon 7500, neither of which implement the full DirectX 8.1 feature set.

It’s interesting to note that MadOnion had to issue a patch for 3DMark2001 SE to make it work correctly with the Xabre. Totally new GPUs like Matrox’s Parhelia haven’t required patching, so SiS must be doing something radically different from other graphics companies. My guess would be the “software” vertex shader, which is certainly a nonstandard way of fulfilling that particular DirectX 8.1 requirement.

3DMark2001 SE – Game benchmarks

In 3DMark2001 SE’s first game scene, the Xabre is only slower than the Radeon 8500 and GeForce4 Ti 4200 cards. With TexTurbo at 0, things get a little more dire, and performance drops considerably.

The Xabre 400 rides Dragothic’s dragon all the way to last place among DirectX 8.1-compatible cards in the second game test, and it’s the slowest of all the cards we’ve gathered with TexTurbo 0.

In the Lobby game scene, the GeForce4 MX 460 just squeaks by the Xabre 400, which continues to scrape the bottom of the barrel when TexTurbo is set to 0.

3DMark2001 SE’s Nature scene requires vertex and pixel shaders, so the GeForce4 MXs and Radeon 7500 are out of the picture. With them gone, the Xabre 400 brings up the rear with a performance that’s well off its competition. The Xabre 400 may be fully DirectX 8.1-compatible, but it doesn’t perform that well when DirectX 8.1’s advanced features are used.

 

3DMark2001 SE – Fill rate
Earlier we looked at the theoretical fill rates of each of the cards in our stable today, and now it’s time to see how much of those theoretical peaks each card is able to realize in 3DMark2001 SE’s fill rate tests.

The Xabre 400’s single-texturing fill rate is poor, falling well short of its theoretical peak regardless of which TexTurbo value is used.

With multiple textures, things get better for the Xabre with TexTurbo set to 3. Setting TexTurbo to 0, however, cripples multi-texturing performance, and the card realizes less than 25% of its theoretical multi-texturing fill rate.

3DMark2001 SE – Transform and lighting

SiS claims the Xabre has a third-generation T&L unit, and its performance isn’t too bad.

In a more complex scene with eight lights, the Xabre 400 moves up right behind both GeForce4 Ti 4200s and Radeon 8500s. Note here that TexTurbo has no impact on T&L performance; it’s just a texture thing.

3DMark2001 SE – Bump mapping

The Xabre 400 performs admirably in the DOT3 bump mapping test, at least until we set TexTurbo to 0. Notice the huge gap between the Xabre 400’s two scores: one nearly doubles the other.

Only the GeForce4 MXs can’t complete the environmental bump mapping test. The Xabre’s performance here is, well, pretty horrible. Even a Radeon 7500 is more than twice as fast.

 

3DMark2001 SE – Advanced features

In 3DMark2001 SE’s point sprite test, the Xabre 400 only manages to edge out the Radeon 7500, which barely produces a score.

The results of the vertex shader test are particularly interesting because both the Xabre and GeForce4 MX cards are emulating a vertex shader in software and doing all the work on the CPU. Xabre runs a little bit slower than a GeForce4 MX and quite a bit slower than the cards that have true hardware vertex units.

3DMark2001 SE – Pixel shaders

The Xabre 400 has pixel shaders, but its pixel shader performance is less than a third that of the shader-equipped competition.

Codecreatures
I would have liked to include Codecreatures scores because it’s a nifty benchmark that really stresses vertex and pixel shaders. Unfortunately, though the Xabre 400’s shaders are detected and the card generates a score, the rendered output doesn’t look anything like what it should. Entire sections of the benchmark scene are blank; if Codecreatures were a game, it would be completely unplayable.

3DMark2001 SE had to be patched to work with the Xabre, and maybe that’s what Codecreatures needs to render things properly. Software shouldn’t require an update just to run on a new graphics card, though. I’m not sure whether to blame the software developers for hard coding elements to certain graphics chips or SiS for doing something wrong with the Xabre.

 

Villagemark

Villagemark shows a huge disparity between the performances of the two TexTurbo modes we’re testing. With TexTurbo set at 3, the Xabre 400 is the fastest card of the lot. Setting TexTurbo to 0, however, drops the Xabre 400 to the bottom of the pile.

Let’s take a more condensed look at how the Xabre 400’s more direct competition fares over several different resolutions.

Here we can clearly see the extreme disparity between the performance of TexTurbo modes. The Xabre 400 is either solidly above or below its direct competition depending on which TexTurbo setting you use.

 

Quake III Arena

First off, you’ll notice that the Xabre 400 with TexTurbo 3 scores a zero for 640×480 in Quake III Arena. No amount of coaxing could get the timedemo to complete without crashing the system at that resolution.

In resolutions above 640×480, the Xabre 400 with TexTurbo 3 consistently finds itself behind the GeForce4 Ti 4200 cards. With TexTurbo set to 0, it drops to the bottom and effectively ties the Radeon 7500 for the lowest score at each resolution.

At TexTurbo 0, the Xabre 400 follows the Radeon 7500’s performance quite closely through all the resolutions. The GeForce4 MX 460 provides the best competition for the Xabre 400 with a TexTurbo setting of 3, but the Xabre 400 has a slight edge.

 

Jedi Outcast

Jedi Knight II showcases the disparity in TexTurbo scores in a Star Wars setting, but this time the Xabre 400 with TexTurbo set to 3 is able to beat out even the GeForce4 Ti 4200 cards. As you might have expected, setting TexTurbo to 0 throttles the Xabre 400’s performance and drops it near the bottom. Available video memory really matters here. Notice how the performance of the GeForce4 Ti 4200 64MB and 128MB cards diverge at high resolutions. A Xabre 400 with 128MB of DDR SDRAM should fare much better at high resolutions in Jedi Knight II.

Though the Xabre 400’s performance with TexTurbo set to 0 starts really dropping off after 800×600, setting TexTurbo to 3 lets us maintain more consistent frame rates until the highest resolution. We are using Jedi Knight II’s “Very High” texture quality setting here, which accounts for low scores at high resolutions. Once we’re done with the performance benchmarks, we’ll get into just what kind of image quality price we’re paying for these higher frame rates with TexTurbo set to 3.

 

Serious Sam SE

Even a TexTurbo setting of 3 can’t help the Xabre 400 in Serious Sam SE, and it brings up the rear at almost every resolution. As we’ve seen throughout testing so far, a TexTurbo setting of 0 is much slower, especially as the resolution increases.

It looks like the Xabre 400 is limited by driver overhead at Serious Sam SE’s 640×480 resolution, because beyond 800×600 both TexTurbo lines fall off with slopes similar to the competition’s.

With Serious Sam SE we get one more graph than usual: a plot of frame rates over time during a benchmark demo. This plot will show us where during the benchmark demo the cards excel, choke, and just how erratic the frame rates are.

The same pattern we’ve seen so far rears its head again, and the Xabre 400 with TexTurbo set to 0 peaks and dips lower than everything else. With TexTurbo set to 3, the Xabre 400 manages to keep itself relatively competitive, but really only against low end GeForce4 MX 440 and Radeon 7500 cards at this resolution.

Serious Sam SE also exhibited unusually jerky console performance at high resolutions on the Xabre 400. You have to drop into the console after each demo run to record average frame rates, and as I ran through each resolution several times, I noticed the console rendering getting slower and slower with each resolution increase. This problem occurred only on the Xabre; I didn’t observe any slowdown with the other cards.

Console rendering isn’t a huge deal, but applications that use the same kind of transparency seen in Serious Sam SE’s console window could run into problems.

 

Comanche 4

Comanche 4 didn’t like the Xabre much, and it didn’t recognize any shader support. This could be Novalogic’s problem, something that will just require a small patch like 3DMark2001 SE, but it illustrates how application support for the Xabre’s unique approach to shaders may be spotty.

The Xabre 400 is reduced to the same graphics settings as the GeForce4 MX in Comanche 4, and its performance is disappointing. Not even setting TexTurbo to 3 makes things respectable, and frame rates really slide as the resolution scales up.

The Xabre 400’s performance with TexTurbo set to 0 drops off right away, while the other cards have more gradual performance declines as the resolution increases. Setting TexTurbo to 3 keeps the Xabre 400 from falling too far behind the value competition, but it’s still getting beat.

Something the benchmark results don’t show, but that’s worth mentioning anyway, is a lot of choppiness in Comanche 4 when the benchmark fades between scenes. Only the Xabre 400 seemed to slow down during these transitions, where I would assume transparency is being used extensively.

 

SPECviewperf

SPECviewperf’s results vary a little, but the Xabre 400 is at or close to the bottom of the pile in each test; this is clearly not a card you want to use for 3D workstation applications. TexTurbo settings have no impact on the Xabre 400’s performance in SPECviewperf, further confirming that texture quality is the only thing being manipulated with that switch.

 

Antialiasing performance

The antialiasing graphs are pretty big, but they show the same kind of performance we’ve seen from Xabre throughout our benchmarks. Antialiasing on the Xabre 400 is faster when TexTurbo is set to 3, just like everything else, but it’s still not enough to vault the card to the top of the pile. With TexTurbo set to 0, performance drops dramatically and the Xabre 400 really only competes with the Radeon 7500—a card that’s far too old to be running with a new DirectX 8.1-compliant AGP 8X graphics card.

All the price-competitive value graphics cards show relatively similar drops in performance going from no antialiasing to 2X and 4X AA at 1024×768. The whole point of antialiasing is how it makes the scene look, so let’s take a look at some screen shots.


Xabre: no AA, 2X AA, and 4X AA


GeForce4 MX: no AA, 2X AA, and 4X AA


Radeon 9000 Pro: no AA, 2X AA, and 4X AA

ATI’s SMOOTHVISION continues to produce the best looking antialiasing, followed closely by the GeForce4 MX and finally SiS’s Xabre. 4X AA doesn’t look too bad on the Xabre 400, but the 2X AA sample shows a lot more jagged edges on the lamp than either ATI or NVIDIA. There doesn’t appear to be much difference between the Xabre 400’s no AA and 2X AA samples at all, and the very subtle reduction in jagged edges certainly doesn’t justify 2X AA’s performance hit.

Warning!
Image quality is up next, and we’ve got a bunch of uncompressed image files to illustrate a few of the Xabre 400’s texture quality shortcomings. If you’re on broadband, you’ll be fine, but modem users will have to wait quite a while for everything to download. You can skip past this next section if you’d like. We would have offered up compressed JPEGs, but compressing files designed to illustrate image quality sort of defeats the purpose.

 

Texture quality
This is where we uncover the price of all the extra performance that a TexTurbo setting of 3 gave us in our benchmarks. Pay special attention to Max’s hand and necklace as we look at some screenshots from 3DMark2001 SE.


Xabre, TexTurbo set to 3


Xabre, TexTurbo set to 0


Radeon 9000 Pro


GeForce4 MX

There you have it: the image quality price that comes along with a TexTurbo setting of 3. Max’s hand and necklace clearly illustrate that the Xabre 400 is using lower quality textures. Some users may be hard pressed to pinpoint these lower quality textures in a game setting, but things feel noticeably ‘off’ during gameplay.

TexTurbo doesn’t only affect the Xabre 400’s texture quality; it also limits texture filtering, as a little Quake III Arena r_colormipmaps fiddling shows.


TexTurbo 3: harsh, unfiltered mip map transitions


TexTurbo 0: mip map transitions the way they should be

The Xabre 400 can’t do trilinear filtering with TexTurbo set to 3, and that results in some ugly transitions between mip map levels. Again, you may not be able to notice these rough transitions during a heated deathmatch, but the image quality is noticeably poorer than on other cards, even when you’re frantically running around trying to evade rockets.

Quake III Arena’s sky doesn’t look too hot either, no matter which TexTurbo setting you use with the Xabre 400.


Xabre 400


Radeon 9000 Pro

The Xabre 400’s clouds are a lot rougher than what you’ll find on the other cards (represented here by the Radeon 9000 Pro).

 

Overclocking
The Xabre 400’s performance isn’t impressive, but maybe a little overclocking will help matters. In testing we were able to get the Xabre 400’s core clock speed all the way up to 295MHz without any artifacting or instability. Curiously, the memory chips rated to 300MHz topped out at only 260MHz, and the card crashed consistently with memory clocked any higher.

An almost 20% core clock speed boost helps the Xabre put in a better performance in 3DMark2001 SE, but it’s still the slowest DirectX 8.1-compliant card that we tested.

In Quake III Arena, there’s a small performance gain, but it’s not much. After seeing what setting TexTurbo to 3 does to image quality, I think it’s pretty safe to say that those scores aren’t all that important to consider unless you really don’t care about image quality.

 

So is SiS cheating?
Testing shows that SiS sacrifices image quality to gain higher performance in 3D applications with its TexTurbo setting. A TexTurbo setting of 3 yields higher frame rates at the expense of texture quality. But is that wrong? The answer isn’t quite as obvious as the differences in TexTurbo image quality and performance.

Many graphics drivers, including those from ATI, have a texture quality slider that lets you specify a range of values between “Quality” and “Performance” settings. It’s no secret that downgrading texture quality can increase performance, and some gamers don’t mind if things look horrible. If TexTurbo is really no different than ATI’s texture quality slider when it comes to its impact on image quality and performance, where’s the beef?

Here: The TexTurbo switch is buried in the registry, and its default setting is 3.

The fact that you have to wade through the registry to change the texture quality is bad enough, but since SiS has used a TexTurbo setting of 3 as the default, you’re stuck with poor image quality until you change the registry setting. Now hacking the registry isn’t a big deal for most enthusiasts, but the Xabre 400 is a low-end card that will find its way into the value market. Many users aren’t as comfortable with modifying the registry, and they shouldn’t have to.

I can’t justify calling TexTurbo cheating, but the way SiS has implemented this texture quality setting is sneaky and abhorrent, and the company has no plans to let users access the setting directly. Letting users adjust texture quality easily from the driver control panel is all SiS needs to do to correct the problem; doing so would give users a real choice between texture quality and performance. As far as I’m concerned, having to modify the registry doesn’t constitute choice.

Conclusion
I have to admire SiS’s novel approach to shaders. They save transistors by emulating a vertex shader in software, but they include hardware pixel shaders to achieve DirectX 8.1 compatibility. It’s an interesting way to do things, but it looks like a lot of software will need to be patched before the Xabre 400’s shaders are universally detected. Until then, you’re left with the performance picture we’ve painted today, and it ain’t pretty.

The Xabre 400 occasionally performs very well with TexTurbo set to 3, but if you’re at all concerned with image quality, you’d do well to just ignore those results. Performance with TexTurbo at 0 is ugly—as ugly as texture quality with TexTurbo set to 3—and the Xabre 400 is consistently the slowest DirectX 8.1-compatible card. Against ATI’s Radeon 9000 Pro, the Xabre 400’s closest competitor in terms of price and features, the Xabre is outclassed in every department.

Maybe the Xabre 400 would have performed a little bit better if we’d tested with an AGP 8X platform, but even if it did, any praise would be conditional. SiS is playing games with texture quality—don’t let them pull a fast one on you. TexTurbo needs to be in the driver control panel, or it needs to default to 0. Anything less is deceptive and unacceptable. 

Comments closed
    • Anonymous
    • 17 years ago

    I just got a 128MB Sis Xabre. I am very pleased with it and have some problems with some of the beefs people have on it.

    The biggest complaint I see is the image quality. I honestly don’t understand why people complain about image quality, when many were so quick to point out how fast the Geforce 2 GTS was compared to the Voodoo 5, despire the fact that the GTS didn’t have the Voodoo 5’s image quality. But, if SiS does this against nVidia in order to get more FPS, it’s wrong? I don’t seem to understand this double-standard.

    In terms of video quality / performance, you ought to know that the Xabre’s latest drivers include a slide-bar for this. Sure SiS didn’t have this to begin with, but remember that Xabre is still a maturing technology. Ever since 3Dfx went down, we flat out have had fewer and fewer people in the 3D market. With Matrox going downhill and nVidia swallowing the market in a very Intel-like mannor, we should be GLAD the Xabre is here, because the more people we have competing, the bloodier the price wars are going to get, and SiS has more than enough cash to play the price wars with nVidia – something that I candidly don’t see ATI as having.

    Give SiS a break. I don’t know about you guys but I’m going to be cheering on Xabre. It shows promise… just wait for it to mature.

    • Anonymous
    • 17 years ago

    I fart on your Xabre!

    • Pete
    • 17 years ago

    [q]Pete, but who is to judge what decent video quality is? I have my opinions on how decent something should be, yours may be different. Apparantly, SiS thought that decent was something below your level, but you cannot blame them for that. Asking for higher quality than SiS is originally providing is indeed asking for extra image quality. If SiS doesnt meet your image quality standards thats too bad, spend your money else where.[/q]Let’s put aside decent for the moment and focus on equal, my original intent.[q]You think that everyone should meet your standards? Hah, people think far too much of themselves…[/q]Thanks for the laugh. =)

    65, I’m assuming with 9000’s going for the same price as a Xabre, there’s no reason to go for the SiS card. I find it doubly ironic that the only Xabres listed on PriceWatch are 128MB–what’s the point of all that texture memory if you’re just going to lower the texture quality by default? An $85 64MB 9000 Pro is a far better deal than a $100 128MB Xabre 400. The OOBE will be a sight prettier, too.

    • Namarrgon
    • 17 years ago

    [quote]if the SIS spanked a 9700 if FPS because its default rendering – fixable only in the registry – was wireframe[/quote]
    Indeed. Or if it was limited by default to 640×480 (justified because its target market can only afford small monitors), and therefore produced a much higher 3DMark score.

    It’s pointless comparing performance unless the cards involved are doing the same thing. You might as well be comparing the results at different resolutions or depths. Performance is be a measure of the time taken to do a given amount of work – if one card is only moving half the texels of the other & thus has a lower workload, it cannot be said to be performing the same as the other card just because its frame rate is the same.

    If SiS claimed their card could achieve 80% of the [i]framerate[/i] of a Ti4600, it’d be sneaky terminology but hard to argue with 🙂

    • Anonymous
    • 17 years ago

    #73 again…

    [q]spanked a 9700 if FPS[/q]That should be [b]in[/b] FPS, of course.

    • Anonymous
    • 17 years ago

    I wonder what people would say if Kia were to introduce a new car that “offers the same performance as the BMW M5” yet has only 2 seats.

    And would this thread be so long if the SIS spanked a 9700 if FPS because its default rendering – fixable only in the registry – was wireframe?


    • droopy1592
    • 17 years ago

    No! I am all knowing. I am right. It’s a bus. What’s todays date?

    • shaker
    • 17 years ago

    Aphasia is right- it’s the implementation rather than the basic design that determines “port” or “bus”. Since the AGP “bus” (forgive me) was implemented solely for graphics data, and is designed to be saturated with said data, it is in reality a “port”. Minor changes in architechture could make it a “bus”.

    It’s all semantics, anyway.

    As to SiS, Joe Sixpack is likely to enjoy the speed of his new Xabre (Zaber, like Xylophone?) until one of his geek buddies informs him of the inferior textures. Then, he’ll feel a bit betrayed (although he’s not sure why) and that’s SiS’s little sin.

    • Aphasia
    • 17 years ago

    I did some looking in that spec for the last discussion we had in the thread i linked in post #10.

    Well.. AGP is per definition a port, but it also uses some signaling schemes that corresponds to a bus. As it uses pci commands to initialize etc. But it seems that the bus is of secondary importance when considering the AGP as the databus is just a very small part of what makes up the AGP.

    But the AGP in whole is implemented as a port interface while the PCI is implemented as a bus interface.

    Which means that if you want more then one device on the AGP you need a bridge or Fan-out device. While the pci is open as long as you dont want more then the amount of devices that a specific implementation is done for, in which case you also need to use a PCI-bridge device.

    cheers

    • Anonymous
    • 17 years ago

    (high compression, that is)

    • Anonymous
    • 17 years ago

    The point is, the SiS Xabre 400 is reducing the quality of the textures by default. No other card does that. That means they’re doing it for a reason no other card maker found necessary. And that is because it was known by SiS to be a poor performer. But they (as you can see them advertise on places like Anandtech) pretend it is a high performer with their AGP 8x! crap, and DX9 crap.

    So they’re trying to get away with something. You’ll reply again, of course, that they’re targeting a lower quality market, or some such thing. The GF4 MX cards are priced similarly, but don’t have the same hacks to try to stay competitive. It would be like a car company telling you its car has 180 hp, but they get it by a compression ratio and setting the timing far advanced and requiring aviation gas. You try to run it on regular gas, and the timing (assuming computerized electronics as is done these days) goes so far back that it really is giving you 100 hp. That’s basically what SiS has done.

    • dmitriylm
    • 17 years ago

    exactly….

    • pwdrhnd23
    • 17 years ago

    I agree with dmitriylm. The out of the box experience is going to be tex turbo = 3.

    [q]These benchmarks assume every card is producing the same output.[/q]

    These cards are not all priced the same, see you get what you pay for. When the Riva 128 had shoddy visuals it was a problem because the cards were competetively priced.

    I think SiS said the card would have about the same performance as a GF4 Ti4200 whick is a slight stretch. Quality was not really a big point. If ATI said the 9700 was gonna have twice the performance of a Ti4600 I would expect the quality to be better as well since the price is going to reflect high end card.

    Nobody is going to buy this card because of the visual quality even before diss reviewed it, people will buy it because it is cheap. Those people will get what they paid for a low end budget 3D-card that will be better than most integrated solutions.

    • dmitriylm
    • 17 years ago

    Pete, but who is to judge what decent video quality is? I have my opinions on how decent something should be, yours may be different. Apparantly, SiS thought that decent was something below your level, but you cannot blame them for that. Asking for higher quality than SiS is originally providing is indeed asking for extra image quality. If SiS doesnt meet your image quality standards thats too bad, spend your money else where. You think that everyone should meet your standards? Hah, people think far too much of themselves…

    • LiamC
    • 17 years ago

    Thanks Droopy. From the same page:

    [q][b]3.1.2 AGP3.0 Signal Definitions[/b]
    [i]AGP3.0 is a point-to-point interconnect[/i] that contains three types of signals. The two primary sets of signals are the source synchronous signals for data transfer and common clock signals for arbitration and control. The third type of signals (referred to as

    • droopy1592
    • 17 years ago

    LiamC, Damage

    Pg 45 says:

    The AGP 3.0 data bus provieds a peak theoretical bandwidth of 2.1 GB/s (32 bits per transfer at 533 MT/s)… AGP 3.0 specifies a parallel-terminated bus with a fixed nominal voltage swing of 800mV peak-to-peak…

    If you look at the diagram on page 101, it defines the agp standard as a bus, or secondary pci bus. And the glossary defines MT/s and uses the words AGP 3.0 data bus.

    §[<ftp://download.intel.com/technology/agp/downloads/SpecUpdate06-21.pdf<]§

    • LiamC
    • 17 years ago

    As for the PCI bus and only having one card in it and calling it a PCI port – I’m afraid you’ve missed the “bus” 🙂

    The “PCI bus” was [b]designed[/b] from the outset to connect multiple devices. If you choose not to connect the devices, that’s your perogative (no relation to Bobby brown).

    The AGP OTOH was not designed to accomodate more than one device, though Intel has not precluded this. THe AGP has only recently been used to have more than one device, though I believe it is in an either or fashion – only one can be active. This too may change.

    Why so pedantic? If you want to have a (more than a casual) conversation about these topics, if people use two different words meaning the same thing, or use the same word and have different meanings for them, then you can’t have a discussion – because you end up arguing about different things. Stupidity.

    • Pete
    • 17 years ago

    [q]The assumption is that the “preferred” or “best” default setting for a video card is highest picture quality. When did that vote get passed?[/q]When nV passed 3dfx on the strength of 32-bit rendering and Kingpin shots?

    [q]Getting EXTRA is getting any more than what you should originally get. Youre not happy with the Xabre’s texture quality so you want extra quality by using a registry hack. Some people arent happy with nvidia’s default clock speeds so they get extra speed by using a registy hack to add an overclocking slider. How is that not EXTRA?[/q]Decent image quality is not extra. These benchmarks assume every card is producing the same output. When one cuts quality corners for speed (as with ATi’s quack fiasco), you’re deceiving the consumer. As the benches show, Xabre can’t compete at the same IQ as other cards. If people are annoyed by ATi’s MIP-map boundaries, you can bet they’ll be more annoyed with blurry textures.

    Let’s not get into including hacks as driver features.

    • LiamC
    • 17 years ago

    Damage,

    thanks for the link. When Intel mention the AGP there is no mention of bus. When they talk of the electrical/clock timings on the parallel connection between core logic and the port, the sometimes (but not always) use the term bus.

    As for more than one device connected:
    [q]As with AGP2.0, the AGP3.0 interface is a logical point-to-point network. The AC timings and electrical loading on the AGP3.0 interface are optimized for one active host component on the motherboard and one active AGP3.0 agent. More than two physical connections to the interconnect are not recommended since there is little timing margin for the added load, stubs and other signaling discontinuities. If the interconnect is comprised of more than two loads and/or branching in the topology, [i]it is the system designer


    • Anonymous
    • 17 years ago

    Dissonance –

    What were the texture quality settings with which you tested the ATI and nVidia cards? Are the texture quality settings in the control panels always enforced in all the games? What are their defaults?

    This will help us assure an “apples to apples” comparison.

    Thanks!

    • Anonymous
    • 17 years ago

    ahhhh. I guess.

    • R2P2
    • 17 years ago

    dissonance (#43) — I hope you meant TexTurbo [b]3[/b] is asstastic, since TexTurbo 0 is apparently the best the Xabre can do.

    AG#48 — I think you missed the point. AGP Port = Accelerated Graphics Port Port, NIC Card = Network Interface Card Card. AGP Card wouldn’t be redundant like those.

    • dmitriylm
    • 17 years ago

    Ryu, well to spare anyone else even more confusion, ill just say yes…

    • Ryu Connor
    • 17 years ago

    [quote]Ryu, mobo’s that have integrated graphics solutions but still have AGP ports just disable the onboard graphics while the AGP port is in use. Both of those chips are still connected to the AGP “line”. If the onboard solution was actually using the PCI bus, you could be using the onboard solution and an additional AGP card at the same time. I was also a little confused by your post, so maybe thats what you meant as well. Oh well….[/quote]

    You can’t very well graft a device to a port.

    Does that make more sense?

    • dmitriylm
    • 17 years ago

    Ryu, mobo’s that have integrated graphics solutions but still have AGP ports just disable the onboard graphics while the AGP port is in use. Both of those chips are still connected to the AGP “line”. If the onboard solution was actually using the PCI bus, you could be using the onboard solution and an additional AGP card at the same time. I was also a little confused by your post, so maybe thats what you meant as well. Oh well….

    • Ryu Connor
    • 17 years ago

    [quote]Maybe I’m subcaffinated today but I didn’t quite get the meaning of your message.[/quote]

    Drink more coffee.

    • Unanimous Hamster
    • 17 years ago

    Ryu –

    Maybe I’m subcaffinated today but I didn’t quite get the meaning of your message.

    • Unanimous Hamster
    • 17 years ago

    I agree with the review 100%.

    It’s clear from the screenshots that the registry hack is required to obtain image quality comparable to the “regular” quality of other cards. SIS deliberately set their image quality to a substandard level to get better benchmark scores. Having normal image quality (comparable to other video cards) is not a special feature like overclocking – it’s EXPECTED. SIS should have included a slider and allowed customers to decide on the tradeoff of speed vs. image quality.

    SIS = Substandard Image Scam

    • Anonymous
    • 17 years ago

    I thought AGP card would be like saying NIC Card

    • dmitriylm
    • 17 years ago

    Getting EXTRA is getting any more than what you should originally get. Youre not happy with the Xabre’s texture quality so you want extra quality by using a registry hack. Some people arent happy with nvidia’s default clock speeds so they get extra speed by using a registy hack to add an overclocking slider. How is that not EXTRA?

    • dmitriylm
    • 17 years ago

    My standards are high, which is exactly why im using a Geforce4 Ti4200 card. Not the most expensive card, but turns in frame rates and visual quality that are far above decent. How can you blame a cheap video card for giving cheap video quality.

    • corrosive23
    • 17 years ago

    AGP PORT is like saying NIC Card.

    • droopy1592
    • 17 years ago

    Hamster, I just figured he meant PCI cards when he said no devices.


    • Damage
    • 17 years ago

    Look at the quality of the Xabre’s output with its default texture setting versus the other cards. The screenshots are in the review. The Xabre is–and I’ve seen it myself when I tested the card before sending it to Diss–using much lower quality textures than the app developers, API, etc. intended in order to boost its performance. That’s the problem.

    • dmitriylm
    • 17 years ago

    Wait, so I can blame nvidia for not putting an overclocking slider into it’s control panel without using some kind of registry hack? I mean the purpose of overclocking is to get something extra out of the graphics card, so youre saying that SiS should have a texture quality slider to get extra visual quality from the card. I dont see how thats fair.


    • dmitriylm
    • 17 years ago

    Uh, there is nothing wrong with SiS setting the default texture quality setting at it’s lowest value. If thats what SiS considers as the setting that meets their quality point then thats what they have to do. Would you also blame them for having 2X FSAA that sucks ass? With that kind of thinking you can punish them more by saying that you have to use 4xFSAA on the Xabre to meet the 2xFSAA quality of ATI and NVIDIA cards. Its a bunch of bull, the default settings are just that, nvidia’s image quality slider isnt all the way up at default, can you blame them for that? Civics suck until you spend a couple thousand dollars on shitty spoilers, intakes, and NOS. Would you blame Honda for selling a cheap car that doesnt meet the same quality standards of Ferraris, Porche’s, and all other high end cars? SiS doesnt owe you anything, if you bought one, it’s your fault for being cheap (“Damn, my Celica cant beat my friends Salene…”).

    • Ryu Connor
    • 17 years ago

    [quote]Many “integrated” components are still on the PCI bus. They may not be separate add-in cards, but they still interface internally to the PCI bus.[/quote]

    You mean sort of like how integrated graphics attach to the AGP bus, but still give you an option of a port attached to that bus so you can upgrade?

    [i]cough, cough[/i]

    Sorry, just clearing some phlegm.

    • Anonymous
    • 17 years ago

    ALL of you nervous Schoolgirls—-relax.
    SIS did nothing wrong.

    Here’s the Word……

    A Mfr who sets up his grafix card to make it look super—-no matter how sneakily he does so—is engaging in legitimate warfare because the name of this grafix card war is “Sell”.
    It’s a jungle out there Girls; you’ll find that out soon enough when you graduate next year from Miss Pringle’s School.

    Let me Summarize for you girls the basic strategy employed by both sides, in this grafix card “Total War”….

    Grafix card Mfr does his sneaky dang best to make his card look super and Website reviewer do his sneaky dang best to smoke him out.
    Both are to be commended.
    Why?
    These activities keep both of them busy. happy, prosperous and outta mischief.

    • Unanimous Hamster
    • 17 years ago

    droopy –

    Many “integrated” components are still on the PCI bus. They may not be separate add-in cards, but they still interface internally to the PCI bus.

    • pwdrhnd23
    • 17 years ago

    What is the target price for this card?

    Are we as enthusiasts the only ones that would even notice the visual quality? I would think the target audience for such a budget card will be those individuals that got roped into an integrated solution form Intel and would like to cheaply use the AGP slot on their mobo. That is provided they even want to open that <insert top 3 pc seller> box and risk voiding their warranty.

    • droopy1592
    • 17 years ago

    Nah, you would just have integrated crap. And pray you still have an AGP Pus (bus+port=pus)

    • Unanimous Hamster
    • 17 years ago

    [quote]
    Besides, if I only had one PCI device, it’d still be a PCI BUS, not a PCI port.
    [/quote]

    Q: If I had NO PCI devices on my computer, what would I have?
    A: A non-functional computer.

    SIS = Sucky Image Scam

    • Anonymous
    • 17 years ago

    so what about the other tex turbo settings? 1 and 2? 4? 127?

    • Anonymous
    • 17 years ago

    Like a ARM Mortgage.
    Adjustable Rate Mortgage Mortgage?

    It’s a bus fool.

    • Anonymous
    • 17 years ago

    [i]It’s AGP PORT![/i]

    right… just like “LCD Display”

    LCD Display – (Liquid Crystal Display Display??)
    AGP Port – (Accellerated Graphics Port Port ??)

    :-p

    • droopy1592
    • 17 years ago

    Hey, I was right all along. Thanks Damage. I knew all those years working on antiquated military technology meant something.

    • Anonymous
    • 17 years ago

    Yeah, I got this AGP bus on my motherboard and I don’t know what to put in it. Anyone got any suggestions?

    • TheCollective
    • 17 years ago

    I believe it is pronounced sabre or “say-ber.”

    • Kilroy1231
    • 17 years ago

    All I can say is…
    Ouch

    Good job there exposing the Xabre (how is that pronounced by the way?) SIS has some explaining to do.

    Wireframe and transparency may be cool for somethings, but when it turns into another Asus transparency deal it becomes uncool

    • R2P2
    • 17 years ago

    …And Damage said “Smack!” and the Gerbils were quiet, and there was much rejoicing.

    • Damage
    • 17 years ago

    #23: You’re close to getting nuked. Play nice or go home.

    As for the bus/port thing, please see here:

    §[<ftp://download.intel.com/technology/agp/downloads/SpecUpdate06-21.pdf<]§ You'll find discussion of the AGP bus in the official specification documents. Nobody said a port couldn't have an associated bus.

    • Anonymous
    • 17 years ago

    Hey assholes. It’s AGP PORT. Yeah, we’ve already been through this, and will be through this a thousand more times until the reviewer wakes up and sees the difference between a PORT and a BUS.

    And for that nitwit that said an AGP Port connects to the system via an AGP BUS…STFU, please.

    If a reviewer strives to attain an air of legitimacy, AT LEAST get the terms down correctly or don’t even bother. UNDERSTAND the reasons why things are labelled PORT and BUS.

    ACCELLERATED GRAPHICS PORT, or AGP for short. If it was a BUS, it would be ACCELLERATED GRAPHICS BUS or AGB. AGB would be a bit of a misnomer in and of itself, since not all devices sharing a BUS on the AGB bit would be GRAPHICS related.

    Next, you assholes will try to loopy-dream engineer a TeraWatt Accellerated Graphics Port Bus and devices like a GF5 and a winmodem and your EBrater share the same data path.

    Wheee! PORT! BUS!


    • TheCollective
    • 17 years ago

    Thanks for playing. Good bye.

    • Anonymous
    • 17 years ago

    Get off the bus and put the petal to the meddle


    • Freon
    • 17 years ago

    #13,

    I tihnk all he’s asking for is a level playing field. The default TexTurbo setting on the Xabre is not comparible to competing video cards’ default settings.

    Comparing default settings on a GF4MX to a Xabre would be deceptive because the GF4MX would have superior image quality. Should the GF4 MX be allowed to run lower res textures? r_picmip 2 for the MX while the Xabre gets r_picmip 1 ? Ponder on that point.

    • Steel
    • 17 years ago

    [q]AGP is a PORT, not a BUS.[/q]Didn’t we already go through this?

    • shaker
    • 17 years ago

    A “bus” is a communication bridge (either serial or parallel) where multiple devices can communicate with a controller/processor via interrupts or addresses over the same data path. (At least that’s the way that I learned it, way back when). I imagine that the AGP port could be configured as a bus if the Intel spec has an addressing scheme.

    • LiamC
    • 17 years ago

    Fejji, I believe you are mistaken. Intel defined and developed the AGP specification.

    §[<http://www.intel.com/home/glossary/body.htm<]§ Check out AGP: [q]an accelerated graphics port (AGP) is a dedicated high-speed port for moving large blocks of data between a PC graphics controller and the system memory. [/q] Seeing as they defined and developed it, they'd know. No mention of a "bus" anywhere. A bus is neither serial nor parallel by nature. It depends upon how the "bits" are shuffled back and forth.

    • fejji
    • 17 years ago

    Liam C: A point-to-point connection is serial, a points-to-points connection is parallel AKA a BUS.

    • Anonymous
    • 17 years ago

    [q]TexTurbo needs to be in the driver control panel, or it needs to default to 0. Anything less is deceptive and unacceptable. [/q]

    [i][where 0 equals highest texture quality/lowest frame rate][/i]

    Dissonance, you are either making an assumption here, one I’m not sure applies, or you have sunk to placing the value of your test scores above the wishes of the average consumer, or even enthusiasts.

    The assumption is that the “preferred” or “best” default setting for a video card is highest picture quality. When did that vote get passed? Even if not a majority, there are a lot of users out there that want it to go as fast as it will go, right out of the box, and what really matters is how good it looks pedal to the metal.

    I don’t know what the max registry number is for texture control, but let’s say its 5. Producing a card with a default setting in perfect compromise between looks and speed seems among the fairest decisions the manufacturer could have made; the fact that modifying this setting is not made easy makes it no less judicious.

    Basing the tone of your entire review on what you decided for the rest of us the default should be, though, seems unfair, at best.

    • Forge
    • 17 years ago

    AG #2 – Not sure what you’re going on about there. ATI cheated in the beginning, got caught, and immediately fixed it. Since no detectable mipmap/texture hackery has been detected since, I’d be inclined to let ATI’s excuse/explanation of a misset optimization stand, and just watch them more closely in the future. SiS has released more than one driver version with their texture muddying enabled, meaning that they are aware of the hackery and condone it. Whole different enchilada, in my book.

    FWIW, ATI scores very, very nicely on Jedi Knight 2 (since the texture upload bug was fixed in the second or third driver, way back when), and no one has ever detected any texture/mipmap messing in that game. How do you explain that?

    Whole different situation. I’d be more concerned with the wireframe/transparent modes SiS is allowing, shades of Asus SeeThrough, anyone?

    • Pete
    • 17 years ago

    Next up: Trident. >:)

    😉

    ROFL, #8. 🙂

    • Aphasia
    • 17 years ago

    Soon it will be possible as an AGP Bridge or Fan-Out device can be developed. Its not fully speced out but only mentioned in short in the AGP 3.0 spec. But one of those devices work just the way a pci bridge works. It has a controller for each subsequent port(kindof).

    But this has already been discussed here…
    §[<http://www.tech-report.com/news_reply.x/3977/<]§

    • BooTs
    • 17 years ago

    Besides, if I only had one PCI device, it’d still be a PCI BUS, not a PCI port.

    • BooTs
    • 17 years ago

    You’re all wrong. A BUS has 4 WHEELS and groes VROOM. Just because the ‘net and tech community has bastardised the name doesn’t make it so.

    • LiamC
    • 17 years ago

    Fejji, AG#4 is right. A bus has more than one device connected. The Accelerated Graphics Port is a point-to-point link. Ever see more than 1 graphics card on the “AGP bus”?

    Just because the web has bastardised the name by calling it a “bus” don’t make it so.

    • fejji
    • 17 years ago

    [q]AGP is a PORT, not a BUS. [/q]

    The AGP card plugs into the AGP port and connects via the AGP bus to the northbridge.

    Optimized drivers for 3dMark? My concern is how often will the drivers be updated – are we talking Nvidia, ATI, or Matrox-style updates?

    • atidriverssuck
    • 17 years ago

    girls always cheat

    • Anonymous
    • 17 years ago

    AGP is a PORT, not a BUS.

    • Anonymous
    • 17 years ago

    I expect this review was done to make sure readers know that SiS’s Xabre 400 is no value, even though the “value segment” is its target.

    Makes me mistrust their chipsets, too.

    • Anonymous
    • 17 years ago

    Well, obviously ATI’s success has created a model for the industry. We’re sheep who don’t have the balls to punish a company for such behavior. Seeing how successful ATI has been, can you blame them for this behavior?

    • Anonymous
    • 17 years ago

    Hmm thats a nasty spanking.Better luck next time SIS. If there is a next time considering the prices of budget video cards.
