When tecChannel recently reviewed the Parhelia, they compared it against cards from a bevy of manufacturers in terms of analog VGA signal quality. Most importantly, tecChannel used instruments to capture the waveforms produced by each card. I'll post links below to their result pages. You'll have to click through to get their full-sized waveform pics.
On the whole, it appears Radeon and GeForce4 cards are reasonably similar in their signal outputs. They are not identical, but close. As with anything, there are exceptions where one particular card is significantly worse or better than the others. There is also the matter of price. On average, a Ti 4600 performs better than the lower-priced cards. That's one of those little hobgoblins we all hate.
The Matrox cards have superior output, though not uniformly so. The G550 puts out a solid signal (especially considering its low price), but it's rivaled by some of the better ATI and NVIDIA implementations. The Parhelia, on the other hand, is very impressive. Its waveform looks almost pristine.
To help illustrate the differences between the various cards, I recommend using the Parhelia signal image as the reference. The closer a card comes to matching the shape, height, and width of the Parhelia signal, the better its image quality. Also, notice that the RGB signals of the cards are shown as colored lines. Ideally, these lines should stay tightly together; keep an eye out for colors that sag or spike. Here are the results for the various cards:
These results don't really deviate from those in Matrox's own white paper on the subject, which is nice. The tests do leave a few areas untouched that I am curious about. I'd like to see testing of the secondary output on the cards. I'd also like to see results obtained from two samples of the same card, since I suspect signal quality can vary from unit to unit with some manufacturers. There is also the question of subjective impressions. We can see which waveform's shape is theoretically superior, but that doesn't negate the potential value of some blind test opinions.
Video signal quality is on the rise, but it would still be fair to say ATI and NVIDIA need to invest more effort into improving things through better reference designs and recommended parts. Video cards are getting faster and monitors cheaper with every passing day, and higher resolutions are becoming commonplace and usable at an ever-quickening rate. More than a few people have called current antialiasing methods snake oil and said the real goal should be resolutions like 2048x1536. Some of these cards might be able to produce an image at that resolution, but most of them wouldn't provide a very pleasant image in 2D or 3D.