HD HQV video image quality
We've seen how these cards compare in terms of CPU utilization and power consumption during HD video playback, but what about image quality? That's where the HD HQV test comes in. This HD DVD disc presents a series of test scenes and asks the observer to score the device's performance in dealing with specific types of potential artifacts or image quality degradation. The scoring system is somewhat subjective, but generally, the differences are fairly easy to spot. If a device fails a test, it usually does so in obvious fashion. I conducted these tests at 1920x1080 resolution. Here's how the cards scored.
|HD noise reduction|0|25|25|0|0|0|
|Video resolution loss|20|20|20|20|20|20|
|Film resolution loss|25|25|25|0|0|0|
|Film resolution loss - Stadium|10|10|10|0|0|0|
The Radeon HDs may have good reason for consuming a few more CPU cycles and a little more power than the GeForces in H.264 playback: they're doing quite a bit more work in post-processing. Both of the RV630-based cards post perfect scores of 100, and their competition from Nvidia flunks out of the noise reduction and film resolution loss tests.
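HD HQV's overall score is just the sum of the points awarded in each test category. A minimal sketch of that bookkeeping, using only the categories listed in the table above (the full HD HQV rubric includes additional tests, so these totals are illustrative, not the cards' complete scores):

```python
# Tally per-test HD HQV points into a total. Category weights here come
# from the table in this review; the complete HQV rubric has more tests.
def hqv_total(card_scores):
    """Sum a card's per-category points into an overall score."""
    return sum(card_scores.values())

# Example: a card that passes every test shown in the table.
radeon_scores = {
    "HD noise reduction": 25,
    "Video resolution loss": 20,
    "Film resolution loss": 25,
    "Film resolution loss - Stadium": 10,
}

print(hqv_total(radeon_scores))  # prints 80 for the categories listed here
```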
We could chalk up the GeForce cards' poor scores here to immature drivers. Obviously, the current drivers aren't doing the post-processing needed for noise reduction and the like. However, on the eve of this review's release, I received some pre-release ForceWare 162.19 drivers from Nvidia. Nvidia claimed these drivers could produce a perfect score of 100 in HQV, and I dutifully tried them out.
Initially, I gave these new drivers a shot at 2560x1600, our display's native resolution. With noise reduction and inverse telecine enabled, I found that our GeForce 8600 GT stumbled badly in HD HQV, dropping far too many frames to maintain the illusion of fluid motion. After some futzing around, I discovered that the card performed better if I didn't ask it to scale the video to 2560x1600. At 1920x1080, the 8600 GT was much better, but it still noticeably dropped frames during some HQV tests. Ignoring that problem, the 8600 GT managed to score 95 points in HD HQV. I deducted five points because its noise reduction seemed to reduce detail somewhat.
The faster GeForce 8600 GTS scored 95 points in HD HQV without dropping frames, even at 2560x1600. That's good news, but it raises a disturbing possibility. I believe Nvidia is doing its post-processing in the GPU's shader core, and the 8600 GT may simply lack the shader power to handle proper HD video noise reduction. If so, Nvidia might not be able to fix this problem entirely with a driver update.
Also, even on the 8600 GTS, Nvidia's noise reduction filter isn't anywhere near ready for prime time. This routine may produce a solid score in HQV, but it introduces visible color banding during HD movie playback. AMD's algorithms quite clearly perform better.
Update 7/14/07: We originally said we tested HD HQV primarily at 2560x1600 resolution, but that's inaccurate. We were unable to do so because of a bug in either ATI's drivers or PowerDVD that prevented the Radeon HD cards from scaling video beyond 1920x1080. Due to this limitation, we tested all cards at 1920x1080. We've updated this page to reflect that fact. We have also inquired with ATI about the cause of the video upscaling problem and are awaiting an answer.