When Nvidia unveiled its next-gen Tegra, code-named Kal-El, a couple of days ago, it showed the upcoming chip outperforming a Core 2 Duo T7200 in a multi-threaded benchmark running on Android. While the result appeared impressive, I cautioned that the test had been picked and run by Nvidia. Well, the folks at Ilsistemista.net have done a little testing of their own, and they've concluded that the Nvidia numbers are seriously skewed.
The main issue, as small print in Nvidia's video reveals, is that the benchmark run on Kal-El was compiled with a recent version of GCC (4.4.1) and aggressive optimizations, while the Core 2 was given a benchmark compiled with looser optimizations using the much older GCC 3.4.4. Ilsistemista attempted to reproduce the results, grabbing a Core 2 Duo T7200-powered Dell laptop and running the same benchmark twice: first compiled with GCC 3.4.6 using the "normal" -O2 flag, then with GCC 4.4.4 and the -O3 flag.
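To see why this kind of comparison matters, here's a minimal sketch of the methodology: compile one source file at the two optimization levels and time each binary. The benchmark itself isn't named in the article, so a trivial compute loop stands in for it, and whatever `gcc` is installed stands in for the specific 3.4.x and 4.4.x releases Ilsistemista used.

```shell
# Stand-in "benchmark": a simple floating-point loop (hypothetical,
# not the benchmark Nvidia or Ilsistemista actually ran).
cat > bench.c <<'EOF'
#include <stdio.h>
int main(void) {
    volatile double acc = 0.0;
    for (long i = 1; i < 5000000; i++)
        acc += 1.0 / (double)i;
    printf("%.4f\n", acc);
    return 0;
}
EOF

# Same source, two optimization levels -- the variable Nvidia's
# comparison left uncontrolled (on top of the compiler version).
gcc -O2 -o bench_o2 bench.c
gcc -O3 -o bench_o3 bench.c

# Both binaries compute the same result; only their speed differs.
./bench_o2
./bench_o3
```

In a real reproduction you'd wrap each run in `time` (or a proper benchmark harness) and average several runs; the point is simply that the only fair comparison holds compiler version and flags constant across both machines.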
The results speak for themselves. Performance apparently surged by 41% with the newer GCC release and the more aggressive optimizations enabled. Sticking that result in Nvidia's chart, as Ilsistemista did, shows the Core 2 Duo T7200 actually outpacing Kal-El by a decent margin.
Now, these new numbers don't necessarily detract from Kal-El's appeal, especially since the Core 2 probably consumes many times more power. Nevertheless, while we've come to expect hardware makers to hand-pick tests that show their products in a positive light, the evidence suggests Nvidia went a step further here by not showing the best possible performance of the Intel config. Disappointing.