Since then, as in Dick Gephardt's campaign headquarters, all the activity has been elsewhere. Intel has upgraded its lineup of Pentium 4 processors and chipsets with an 800MHz bus, dual-channel DDR400 memory, ubiquitous Hyper-Threading, AGP 8X, and Serial ATA, to name just some of the improvements. The Pentium 4 platform practically pulses with bandwidth everywhere, and performance is up as a result.
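A quick back-of-the-envelope check shows why the platform is so flush with bandwidth (these are the standard figures for 64-bit data paths, not numbers from our own testing): the 800MHz quad-pumped front-side bus moves 800 MT/s x 8 bytes, or 6.4GB/s, and dual-channel DDR400 matches it exactly at 2 x (400 MT/s x 8 bytes) = 6.4GB/s. Bus and memory bandwidth are perfectly balanced, which is no accident.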
We found the Pentium 4 3.0GHz chip to be a little bit faster overall than AMD's latest, the Athlon XP 3200+, in our last round of tests. Still, with a new 400MHz front-side bus and its own dual-DDR400 chipset in the nForce2 Ultra 400, the Athlon XP 3200+ is no slouch. The Athlon turned in the highest scores in many tests, and put up a heck of a fight for the overall crown.
Now we come to the new 3.2GHz version of the Pentium 4. Imagine with me, if you will, what might happen when the manufacturer of the world's fastest desktop processor turns up the clock speed from 3GHz to 3.2GHz. Lower interest rates? The scent of almonds? (What the devil does one say about 200 more megahertz?)
Um, sorry about that. As I was saying, we're expecting the Pentium 4 3.2GHz to take its rightful place at the top of the x86 pecking order. The P4 3.2GHz may be more of the same, but like faithful patrons of the local Luby's, we're generally in favor of getting more of a good thing. As always, we've loaded up our test bench with a gaggle of the new P4's competitors and forebears. The results follow, so read on.