Dell & NVIDIA unveil Quad SLI-certified PC

Dell and NVIDIA’s press release is here, and PC Perspective covers their presentation from CES. Imagine a card with 1.3 billion transistors and 96 pixel pipes, able to push 41 gigapixels per second, with an incredible 2GB of memory, all on one PCB.

Initially, these cards will only be available in Dell systems scheduled to ship this spring. With any luck, the card will eventually be available as a stand-alone component, but there is no official word on that yet. You will, however, need a motherboard based on NVIDIA’s nForce®4 SLI X16 chipset (with its two full x16 slots) to run this card if and when it is sold separately.
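
As a rough sanity check, those headline figures work out to almost exactly four times the numbers quoted for a single GeForce 7800 GTX. A minimal sketch, assuming approximate per-GPU figures of roughly 300 million transistors, 24 pixel pipes, ~10.3 gigapixels per second, and 512MB of memory (these per-GPU numbers are assumptions, not taken from the press release):

```python
# Rough sanity check: the quad-SLI headline specs are ~4x one GeForce 7800 GTX.
# Per-GPU figures below are approximate assumptions, not press-release numbers.
per_gpu = {
    "transistors (millions)": 302,
    "pixel pipes": 24,
    "fill rate (Gpixels/s)": 10.3,
    "memory (MB)": 512,
}

for name, value in per_gpu.items():
    print(f"{name}: {value} per GPU x 4 = {value * 4}")
# -> roughly 1.2 billion transistors, 96 pipes, ~41 Gpixels/s, 2048MB
```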

Comments closed
    • --k
    • 15 years ago

    I wish they would go after sub-pixel rendering of graphics with HDR. That would be revolutionary. Imagine 96-bit, 300 dpi monitors showing life-like graphics. VR wouldn’t be just in our imaginations anymore.

    • wierdo
    • 15 years ago

    what…the….funk…. is this a joke!? What next, a 500-horsepower dune buggy?

    The video card picture itself makes me think it’s a joke. Please tell me it is; that thing belongs in Lowe’s, in the tree-chopping tools dept 😀

      • ripfire
      • 15 years ago

      “a 500hp dune buggy” Hey, that would be cool!

    • maxxcool
    • 15 years ago

    Bwahahahahaha, I was right! Ha! Nine months earlier than predicted, we have 4x SLI and 4x Crossfire… 😛

    • Ruiner
    • 15 years ago

    but it makes your penOr bigger.

    • Bensam123
    • 15 years ago

    I guess it sorta makes sense. Anyone who would actually think about purchasing one of these won’t actually be able to build a computer on their own. It’s the same thing with Intel Extreme Edition chips and, for the most part, FX chips as well. Hence why they’re getting sold to Dell.

    Too much money and not enough know how.

    This is a step away from purchasing cheap x1 booster cards 😛

    • PerfectCr
    • 15 years ago

    I cringe when I see a gaming machine sold with an Intel processor. Sizzler won’t be any better.

    • zqw
    • 15 years ago

    The “superlinear” numbers are kinda fishy. Who measures MAX fps in FEAR!?
    http://images.visualwebcaster.com/31754/59353/Slide26.jpg

    • crose
    • 15 years ago

    From Anandtech: “a hand painted chassis and Dell will only produce a limited number of them.”

    Why do people think that computers are like cars? A super-powerful PC goes out of style after one year, two at the most, while some cars are timeless.

      • 1c3d0g
      • 15 years ago

      Just like some people like to customize their gun(s)…

      Why won’t you let everyone decide what to do with their damn money instead of dictating what should be done with it? :-/

    • fishyuk
    • 15 years ago

    Yeah, but you wouldn’t need central heating so there would be some money saved!

      • 1c3d0g
      • 15 years ago

      It’s getting old… :-/

      • 5150
      • 15 years ago

      Oh gosh that’s funny! That’s really funny! Do you write your own material? Do you? Because that is so fresh. Central heating. You know, I’ve, I’ve never heard anyone make that joke before. Hmm. You’re the first. I’ve never heard anyone reference, reference that outside of this thread before. Because that’s what it says, right? Isn’t it? Central heating. God, what a clever, smart boy you must be, to come up with a joke like that all by yourself. That’s so fresh too. Any, any Titanic jokes you want to throw at me too, as long as we’re hitting these phenomena at the height of their popularity? God, you’re so funny.
      ——–
      That was too much work for what it’s worth.

        • PerfectCr
        • 15 years ago

        breathe, breathe 😀

          • 5150
          • 15 years ago

          I can’t take credit.

          http://tinyurl.com/72gsu

            • ripfire
            • 15 years ago

            As I was going to say… 🙂 Oh man that’s funny.

      • fatpipes
      • 15 years ago

      Jeez, these two guys are so defensive. This cooling solution should rightfully be mocked. Does anybody else see that “internal” GPU fan completely blocked by the PCB? Yeah… that’s either a Photoshop or an ODM design that has not been thermally validated.

      It’s one thing to pull some “OOOOOO YOUR FUNNY LOLOL” crap when the guy is just being stupid. But come on dudes, open your nose and smell the searing PCB.

      I’ve built rigs with two of these cards side by side. Do you know what happens when you have 1mm of clearance on a 40mm fan? Absolutely nothing. That GPU roasts under a big metal frying pan. And a single 80mm fan isn’t pushing much CFM by the time it hits the front of that perpendicular-oriented, air-deflecting curved fan enclosure that’s sandwiched between two PCBs. Unless they’ve got a mylar bra to strap around all 4 cards, air-tight to the edges of that 80mm fan, you’ll be seeing artifacts in a matter of months, followed by an extremely difficult RMA process (oh, we ran out of those cards, give us a couple of months to put in a special order with the ODM who made them for DELL).

      Those problems are just for 1 of those cards. Now you’ve got two of them stacked on TOP of each other (bear in mind heat rises), and guess who gets to intake that buildup of hot-assed ambient air? The OC’d P4EE!

      Wow, all I’m saying is don’t point the ass end of this thing at your LCD, or a wall with paint or wallpaper on it.

        • 5150
        • 15 years ago

        I didn’t really care, I just had an itching to use that quote for some reason this morning. Intel sucks, or blows, or pumps, or whatever kind of cooling solution you need nowadays, agreed.

    • TREE
    • 15 years ago

    I think, in all honesty, the power usage is lower, the heat output is lower, the PCI Express usage is different… come on, this has got to be using a smaller die size, cough, maybe the 90nm process that’s supposed to arrive in February… hence the spring release.

      • Lord.Blue
      • 15 years ago

      That would make sense. But the OCed EE proc will be eating power juice like no tomorrow.

    • HiggsBoson
    • 15 years ago

    http://www.slizone.com/object/slizone_quadsli.html Nifty Flash animation there.

    Also, I’m not sure I agree with PCPer; they do a bunch of hand-waving about a bit of logic on the video cards that splits a 16x PCIe connection in two, giving 16x PCIe to each daughtercard, and then the same logic re-merges them. Anyone (else) find that an unnecessary amount of work, and kind of in violation of reasonableness? I mean, OK, maybe the data coming over the 16x PCIe slot to the cards is copied and split in two, but how do you pump double the same data back over the same bus from two different cards without throwing half of it out? Wouldn’t it make more sense (given the history of SLI) for them to split the 16 lanes into two 8x links? Data coming in over the bus would then have to be fiddled with in the driver to sort out which 8x link came from which card, but it wouldn’t be like trying to pump 32x PCIe worth of data over a 16x PCIe slot with some finagling by magic “special logic” on the board.

    Also, I find it really remarkable that NVIDIA managed to design or acquire a thermal solution for a GeForce 7-series chip with 512MB of DRAM that fits in such a narrow space. OK, so the photos do make it look absurdly long, but with two cards side by side, only the last card on the bottom has any breathing room. I hope this thing isn’t the Dustbuster revisited, although in this case, with four 7-series GPUs, one might actually call that level of noise a worthwhile price to pay.
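
    For what it’s worth, a minimal back-of-the-envelope sketch of the bandwidth question, assuming first-generation PCIe at roughly 250MB/s per lane per direction (an assumption, not a figure from PCPer): however the on-board logic labels the links, two daughtercards hanging off one x16 slot still share that slot’s bandwidth upstream.

```python
# Back-of-the-envelope PCIe bandwidth sketch.
# Assumes PCIe 1.x: ~250 MB/s per lane, per direction (2.5 GT/s, 8b/10b).
MB_PER_LANE = 250

def link_bandwidth_mb_s(lanes: int) -> int:
    """One-direction bandwidth in MB/s for a link that is `lanes` wide."""
    return lanes * MB_PER_LANE

slot_x16 = link_bandwidth_mb_s(16)    # what the physical x16 slot can carry
per_card_x8 = link_bandwidth_mb_s(8)  # each daughtercard on an x8 split

print(f"x16 slot, one direction: {slot_x16} MB/s")
print(f"per card on an x8 split: {per_card_x8} MB/s")
# Downstream, data can simply be duplicated to both cards; upstream, the two
# cards' traffic still has to fit in the single x16 slot's 4000 MB/s.
```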

      • Palek
      • 15 years ago


        • HiggsBoson
        • 15 years ago

        Point taken. Asymmetrical I/O would explain why it’s not required to push 16x PCIe lanes upstream from each card after they’re “split”, but the point still stands that you can’t even have 16 real PCIe lanes of data for each sister/daughter card downstream unless you have 32 lanes to begin with. You’d either have to split the slot into two 8x links, or just copy the data. Although, upon further reflection, I guess in SLI a simple copy could make sense.

      • TREE
      • 15 years ago

      I think that’s a showcase of the new 90nm process.

        • lindy01
        • 15 years ago

        or the newer 65nm process :)

    • acejj26
    • 15 years ago

    I’m surprised no one has mentioned that this beast would consume about 700 watts under load… it would be a nice little space heater in the winter.

      • BoBzeBuilder
      • 15 years ago

      700??? I’d better build a power plant in my backyard.

        • Shintai
        • 15 years ago

        Yeah, the GFX cards use insane amounts. The EE is the least power-hungry part in that system. I never understood why people nagged about the P4’s huge power usage when their high-end card(s) used more.

          • Anomymous Gerbil
          • 15 years ago

          It’s because they were comparing CPUs, not systems.

          A hotter CPU is a less efficient CPU, and/or a poorer design. So for comparing (say) Intel and AMD CPUs, their point is that AMD’s designs are (were?) better than Intel’s in that regard.

      • Anomymous Gerbil
      • 15 years ago

      700W? Do you have any support for that figure, or did you pluck it out of your underpants?

      For comparison, the similar quad-SLI setup reviewed by Tom’s only used 377W for the whole system: http://www.tomshardware.com/2005/12/14/sneak_preview_of_the_nvidia_quad_gpu_setup/page7.html

        • acejj26
        • 15 years ago

        That used four 7800 GTs; this uses four 7800 GTX 512s. Their processor was an Athlon FX running at 2.8 GHz, which uses much less power than an overclocked P4 EE. Dell’s computer will also feature two Raptor hard drives, which use quite a bit of power.

        Now, according to X-bit Labs, the P4 EE in question uses 83 more watts (156 vs. 73) than an FX-57, and that’s under non-overclocked conditions. Let’s say that no voltage increase is necessary to pull off this overclock. Then, using a linear relationship between power draw and frequency, this chip would use 192 watts. If a voltage increase is necessary, it would use over 200 watts. X-bit Labs needed a voltage increase to make this speed stable.

        Now, according to another X-bit Labs article, one 7800 GTX 512 uses 94.7 watts under load compared to 56.7 watts for the 7800 GT. Multiply the difference by 4 and you get 38 × 4 = 152.

        Now, you said that Tom’s Hardware posted 377 watts under load for their system. Fine. However, the motherboards differ in that the nF4 for the Athlon doesn’t need a separate northbridge, whereas the Intel version does. There’s 20 more watts. Let’s add in the extra power that this system uses.

        processor (192 – 73) = 119 watts
        difference between four GTXs and four GTs = 152 watts
        northbridge = 20 watts

        I’d say that two Raptors running at 10,000 RPM will add an additional 30 watts to the system.

        So, let’s add it all up: 377 + 119 + 152 + 20 + 30 = 698 watts.

        there you go, chief….right out of my ass
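
        For anyone following along, here is the estimate above as a minimal sketch; every delta in it (CPU, GPU swap, northbridge, drives) is the poster’s assumption, not a measured figure.

```python
# acejj26's rough estimate, reproduced step by step.
# All deltas are assumptions from the comment above, not measurements.
toms_baseline_w = 377              # Tom's system (4x 7800 GT, FX @ 2.8 GHz) under load

cpu_delta_w = 192 - 73             # overclocked P4 EE estimate vs. FX-57 (linear-with-clock assumption)
gpu_delta_w = (94.7 - 56.7) * 4    # 7800 GTX 512 vs. 7800 GT, times four cards
northbridge_w = 20                 # extra Intel northbridge (assumed)
raptors_w = 30                     # two 10,000 RPM Raptors (assumed)

total_w = toms_baseline_w + cpu_delta_w + gpu_delta_w + northbridge_w + raptors_w
print(f"estimated load draw: {total_w:.0f} W")   # ~698 W
```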

          • Anomymous Gerbil
          • 15 years ago

          Not bad, but no cigar. So it might be more than 377W, but if we ignore your worst-case assumptions for the increase in GPU power consumption, and some highish assumptions for northbridge and HDD power, we might be at 500W or so, which is high, but it’s hardly the end of the world. (For example, it seems apparent from Tom’s review of the quad-SLI Asus setup that strapping 4 GPUs onto two cards does NOT consume 4 times as much power as a single GPU/card setup, etc.)

          • Shintai
          • 15 years ago

          20W for a northbridge? Anything higher than 8-13W basically means ACTIVE cooling, and I don’t see Intel with active cooling on their northbridges.

          20W is about the same as a mobile CPU uses.

          The 975X chipset is rated at 13.5W, two-thirds of your 20W. And that’s the peak power, for the most premium chipset.

          For reference, the Yonah chipset uses 3.0W with a 667MHz FSB, dual-channel memory, graphics, southbridge, and Wi-Fi. Dothan’s chipset used about 3.5W.

          A 4.26GHz EE uses 2W more than a 3.46GHz one in recent tests. It’s only once you go beyond the design spec that power consumption scales upward fast. A 90nm AMD64 at 2.9GHz would quickly use 125-150W, but under 100W at 2.8GHz.

          The 4.5GHz Cedar Mill OC review also showed a “minor” increase in power usage.

          Also, check, say, some 3000+, 3200+ and 3500+ chips and you see basically the same power consumption. WHERE IS TAH SCALING?

          But again, you already know CPU power consumption scales much faster than linearly once you pass the design spec, not linearly across the board.

          And how can you take X-bit Labs’ power consumption numbers seriously?
          “As always, we used a special S&M utility to measure the maximum power consumption”

          Yes, yes… a piece of software.
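
          The disagreement over scaling mostly comes down to voltage. A hedged sketch of the usual dynamic-power approximation P ≈ C·V²·f: at a fixed voltage, power grows roughly linearly with clock, but any extra voltage an overclock needs gets squared (the 156W / 3.46GHz / 1.30V baseline below is an illustrative assumption, not a measurement).

```python
# Sketch of the dynamic-power approximation P ~ C * V^2 * f.
# Baseline numbers (156 W at 3.46 GHz and 1.30 V) are illustrative assumptions.
def scaled_power_w(base_w: float, base_ghz: float, base_v: float,
                   new_ghz: float, new_v: float) -> float:
    """Scale a baseline power figure by clock (linear) and voltage (squared)."""
    return base_w * (new_ghz / base_ghz) * (new_v / base_v) ** 2

print(scaled_power_w(156, 3.46, 1.30, 4.26, 1.30))  # same voltage: ~192 W
print(scaled_power_w(156, 3.46, 1.30, 4.26, 1.40))  # +0.10 V:      ~223 W
```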

            • d0g_p00p
            • 15 years ago

            Where is this 4.5GHz OC Cedar Mill review? People have been posting all this info from reviews but never seem to link the reviews. I would be interested in reading about the CM @ 4.5GHz, thanks.

            • Shintai
            • 15 years ago

            http://www.hardforum.com/showthread.php?s=4c4174268c618462485eda08e87dd9d9&t=989605&page=1&pp=20
            http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2578&p=3

            And so on… Google is your friend. Oh yeah, and the Anandtech one is on an engineering sample 2-3 months from launch, and total system power is measured, so the chipset, GFX, etc. add a little due to the higher FSB.

            • d0g_p00p
            • 15 years ago

            thanks for the links. I am too lazy to google the results. That is why I asked for them

            • Krogoth
            • 15 years ago

            Remember that the 4.5GHz Cedar Mill is a cherry-picked engineering sample. A real production-grade part will probably overclock to around 4.0-4.5GHz with a little more voltage than that cherry-picked part needed.

            • Shintai
            • 15 years ago

            I don’t think so… Intel doesn’t send out OC-friendly CPUs on purpose. And from what I’m told about retail parts… 4.5GHz is easier, and without a voltage increase.

            • Krogoth
            • 15 years ago

            Engineering samples are very overclocking-friendly, as they come from cherry-picked yields and are completely unlocked. Factory-grade samples are not so lucky. The same thing is true on the AMD side too. I remember when enthusiasts were excited that review samples of Venices and San Diegos were hitting 2.7+GHz with minimal or no extra voltage. However, that was a rarity among production-grade stuff, which usually required more voltage and could not handle as high a clock speed.

            Intel only multiplier-locks their CPUs and eliminates tweaking options from their reference chipset BIOS. That does not stop DFI, ABIT and Gigabyte from developing their own versions that allow those options. 90% of K8s are half-locked to allow Cool’n’Quiet to work. The FX series are the only K8s that are factory unlocked.

          • Shining Arcanine
          • 15 years ago

          I am curious, did you take into account the AC to DC conversion in your calculations? I realize that the GeForce 7800 GTX uses less power than the GeForce 6800 Ultra (roughly 60 Watts) at full load, and the Pentium 4 Extreme Edition probably uses 130 Watts at full load. The Raptor uses about 10 Watts at seek. This is all in DC power of course. You could easily double these figures by mistakenly measuring it in AC power with a horribly inefficient PSU. So anyway…

          4 * 60 + 130 + 2 * 10 = 390

          Throw in the motherboard, sound card and dual-channel memory at 10 watts apiece and you get 430 watts of DC power. Then, after an 80% efficient AC-to-DC conversion, you are looking at 537.5 watts (430 * 100 / 80) of AC power at full load, quite a bit less than your 698-watt figure (which was AC or DC?). And of course, PSUs are rated by the maximum amount of power they provide rather than the maximum amount they consume, so all you need is a 430-watt PSU with the proper amperage ratings for each rail and everything will run fine. And let’s not forget that this is only under full load; at idle, DC power consumption should drop by over 200 watts, significantly lowering the AC power figures.
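
          The same estimate as a minimal sketch, keeping DC component draw separate from AC draw at the wall; the per-part wattages and the 80% PSU efficiency are the poster’s assumptions, with memory counted as two 10W DIMMs to reach the quoted 430W.

```python
# DC component draw vs. AC draw at the wall, per the estimate above.
# Per-part wattages and 80% PSU efficiency are assumptions from the comment;
# memory is counted as two 10 W DIMMs to reach the quoted 430 W DC figure.
gpus_w  = 4 * 60            # four 7800-class GPUs at ~60 W each under load
cpu_w   = 130               # Pentium 4 Extreme Edition at full load
disks_w = 2 * 10            # two Raptors at seek
misc_w  = 10 + 10 + 2 * 10  # motherboard, sound card, two DIMMs

dc_load_w = gpus_w + cpu_w + disks_w + misc_w
ac_load_w = dc_load_w / 0.80    # assumed 80% efficient PSU

print(f"DC load: {dc_load_w} W, AC at the wall: {ac_load_w:.1f} W")  # 430 W / 537.5 W
```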

        • bovehein
        • 15 years ago

        http://www.legitreviews.com/article/285/2/

        Here it says that the system needs 857W…

    • Baddle Tech
    • 15 years ago

    But holy shit, look at the CPU on that thing! An Intel Pentium EE Dual-Core 955 with HT, OCed to 4.26 GHz!? That’s a theoretical quad-core CPU too!

    • Ryu Connor
    • 15 years ago

    More than a little surprised they didn’t use Yonah. Guess there are no full size BTX motherboards to do that. You’d think Dell could swing that though.

    Dell’s laptop lines have been better gaming machines than the desktop lines for a while now.

    • verpissdich
    • 15 years ago

    all to power that nice shiny new 30″ dell LCD you just picked up eh…. since running at that high res you’re going to need this…

    • DrDillyBar
    • 15 years ago

    edit: uh .. oops … nm

    • Chrispy_
    • 15 years ago

    Jesus, how long can Dell get away with ramming Intel down the throat of unknowing gamers?

    Don’t get me wrong, we couldn’t live without Intels at work, but this is where I just wave my hands in despair at Dell, again (again).

      • DrDillyBar
      • 15 years ago

      IMHO: AMD can’t meet availability/stock demands. … not long.

        • Baddle Tech
        • 15 years ago

        I don’t think Dell would sell too many Gaming machines? So if they only put AMD in them then AMD wouldn’t have too much trouble keeping up?

          • lindy01
          • 15 years ago

          I bet Dell sells more gaming machines than any other vendor. Dell sells more PCs than anyone else.

            • PLASTIC SURGEON
            • 15 years ago

            However, that in itself does not make Dell systems any better.

            • Chrispy_
            • 15 years ago

            Best Dell gaming machine I’ve ever seen is an XPS laptop because at least that has a Pentium M in it (and a 7800GTX).

    • albundy
    • 15 years ago

    What board will they be using? I more than doubt it’s an Intel board. I have seen quad SLI from Gigabyte, so this news is not really news to me. It would be nice to make a longer board with more PCI slots as quad SLI becomes more popular. This would really be a nice idea with el cheapo cards from the GeForce 6600 to 6800 series. You don’t have to buy them all at once, which allows you to purchase them as their prices drop in the future, so as to maintain some performance standard for future games.

      • sativa
      • 15 years ago

      It’s already confirmed that it’s Intel. Check the link I posted previously in this topic.

        • Lord.Blue
        • 15 years ago

        Then you did not read the article. It is NVIDIA’s SLI X16. You have to have two 16-lane PCIe slots for this to work. No Intel board has that yet.

          • sativa
          • 15 years ago

          Huh? Like I said, LOOK AT THE LINK I ALREADY POSTED. It’s GOING TO BE USING AN INTEL PROCESSOR.

          Also, it’s going to be 2x dual-GTX cards, not 4x single GTX cards.

          http://www.dell.com/html/us/products/ces/

          From the link:
          - Quad NVIDIA SLI Technology
          - Dual 1GB NVIDIA GeForce 7800 SLI graphics cards
          - NVIDIA nForce SLI X16
          - Intel Pentium Extreme Edition Dual-Core 955 w/ HT overclocked to 4.26 GHz
          - etc.

          Like I said, the OP should have read the link I posted… just like you should have too.

            • hmmm
            • 15 years ago

            Apples and oranges. The OP asked about the motherboard. You responded by talking about the CPU. The CPU will be Intel. The motherboard chipset is NVIDIA. You’re both right about the things you each addressed, but they’re right about the question the OP actually asked.

    • DrDillyBar
    • 15 years ago

    *Insert Physics Processor here, to offset Pentium*

    • leor
    • 15 years ago

    Ha! They’ll work in my K8WE!

    And I was right about to get CrossFire . . . :-))

    • black6jack9
    • 15 years ago

    The performance is not going to be that great because of the CPU bottleneck. All 4 cards are going to be waiting on the CPU to process the data. So what’s the point? It is just going to make your wallet lighter!!

    • BobbinThreadbare
    • 15 years ago

    With this card, those dual-core patches might become useful.

    I still don’t see how gaming at 1000 fps is an improvement over 100 though.

      • BoBzeBuilder
      • 15 years ago

      True.
      Besides, by the time games actually require that much graphics power, there will be better cards with superior technology and features.
      Quad SLI is a waste of money.

        • lindy01
        • 15 years ago

        Try driving a game at high FPS on that 30-inch LCD at that crazy resolution. Say, FEAR with everything maxed.

      • Anomymous Gerbil
      • 15 years ago

      Whilst the card is probably ludicrous, I think you miss the point – the real benefit of bigger/faster graphics cards is more about boosting minimum frame rates than average ones.

        • stix
        • 15 years ago

        Until 6 months later, when you can have 2 cards do the work of the four you spent stupid amounts of money on. This is just a test to see how dumb customers are.

        A lot seem to pass this test. 4 cards now = wow, I am the coolest and have the fastest GPU setup, for $2800.

        6 months later: new cards are out, and 2 in SLI/CrossFire beat the quad-card setup for 1/3 the cost.

        The 4-card setup guy kicks his own ass for buying into quad SLI.

        The only winner here is the seller. End of story.

    • ThelvynD
    • 15 years ago

    Damn, and I thought the Voodoo 5 6000 was a big card!

    • arb_npx
    • 15 years ago

    A gaming machine not based on an A64?


      • draksia
      • 15 years ago

      It wouldn’t surprise me if this is where Dell makes its AMD debut.

        • A_Pickle
        • 15 years ago

        It’s a Pentium D 955, factory overclocked to 4.26 GHz.

    • CasbahBoy
    • 15 years ago

    “…Quad SLI features four of NVIDIA’s flagship GeForce® 7800 GTX GPUs with an NVIDIA nForce®4 SLI X16 motherboard.”

    It sounds to me as if this is not one card, but four of them all in the same system, using an SLI X16 motherboard with (probably) four PCI-E x16 slots, each with 8 connected lanes. As much as I salivate waiting for numbers from a…

      • DreadCthulhu
      • 15 years ago

      Well, a poor CPU might hurt the minimum frame rate if you have a game that is heavily CPU-bound (like massive physics work in HL2 – think Garry’s Mod and what you can do there).

      • crabjokeman
      • 15 years ago

      “It sounds to me as if this is not one card, but four of them all in the same system, using an SLI X16 motherboard with (probably) four PCI-E X16 slots, each with 8 connected lanes.”

      Obviously, you didn’t read the article.

        • Lord.Blue
        • 15 years ago

        Or look at the pictures…..damn that’s a long card!

          • sativa
          • 15 years ago

          The pictures do not represent the final product. Again, read the freaking link I posted earlier.

    • robg1701
    • 15 years ago

    Comes with an overclocked processor…. 😮

      • A_Pickle
      • 15 years ago

      Yeah…. uh…. to 4.26 GHz…

      ….wow…

    • sativa
    • 15 years ago

    http://www.dell.com/html/us/products/ces/

    This shows a little bit more about it, in addition to their 30″ monitor & some kind of laptop prototype.

      • jobodaho
      • 15 years ago

      I think that laptop looks pretty badass. My favorite part would be the detachable keyboard/mouse, very cool idea.

    • Krogoth
    • 15 years ago

    I guess NVIDIA picked up on Gigabyte’s idea, but 4-way SLI is massive overkill. Hell, a single 7800 GTX is overkill for most games!

      • ripfire
      • 15 years ago

      That’s what I used to say, but only to get “BLASPHEMY” thrown at my face.

        • eitje
        • 15 years ago

        BLASPHEMY!

    • Deathright
    • 15 years ago

    It will be held back by the Intel processor. Dell really should put an Athlon FX in these things; whoever pays for one of them is retarded.

      • A_Pickle
      • 15 years ago

      Um… yeah. Just ’cause it’s an Intel it’ll somehow get lower scores in gaming.

      …le sigh.

        • Vrock
        • 15 years ago

        If it’s a P4 Prescott-ish core, then it most assuredly will. Fanboy.

          • Chrispy_
          • 15 years ago

          I mean, if my goddamn Sempron* sees EEs in its rear-view mirror, then what chance does any Intel have against a “proper” AMD?

          *2.8GHz, 256K L2, SSE3, AMD-64, Venice(?) core, £50 heatsink, £45 chip.

            • Shintai
            • 15 years ago

            Like a 4.5GHz Cedar Mill that costs nothing as well. Point being?

            Or a 2.5-3GHz Yonah.

          • A_Pickle
          • 15 years ago

          There’s no denying AMDs outperform Intels in gaming… but… seriously, it isn’t going to suck at playing games. Not with four GPUs behind it. I play on a 3.0 GHz Northwood just fine.

            • wierdo
            • 15 years ago

            You could play a game at higher resolutions with a video card like this, but the processor will still need to do the number crunching for the AI and physics either way – barring physics cards catching on and taking up some of the slack – so if a game is processor-intensive, then one or four video cards won’t help much if the processor can’t keep up.

            That aside, considering that the machine will have a severely overclocked processor inside it – well, for an actual product, that is – I don’t think the processor will do badly. It’s just that I would love to see an Athlon FX in that setup; that would round off all the edges, IMHO.

            • ScythedBlade
            • 15 years ago

            Guys… you’re being biased here. A 4.26 GHz EE was surprisingly fast, able to surpass a 4800+ at stock clocks. Since a 4800+ is essentially two 4000+s, Intel did pretty well on AMD’s supposed xx00+ rating scale. Plus, it handles 5 threads more nicely than a 4800+ at stock clocks… check the benchmarks~

            Nevertheless, it’s an energy drainer. But we all know that … so stop pointing out the obvious and point out some of the surprising facts that not everyone knows…

            [wrong reply but ehh]

    • robg1701
    • 15 years ago

    oh my……

      • Sargent Duck
      • 15 years ago

      God. That is insane. And the worst part? It’ll only be released on a Dell initially, which means gaming with an Intel processor. Gag

        • robg1701
        • 15 years ago

        I don’t disagree… but at least SLI tends to mean high-res gaming, where the CPU becomes less of a factor.

        • A_Pickle
        • 15 years ago

        Nyehehehe. I already game on a Pentium 4 at 3.0 GHz (…northy). Works great.
