Chipset development is on hold at Nvidia

Nvidia made quite a splash with its GeForce 9400M chipset last October. Several laptop makers—including Apple—found the 9400M’s speedy integrated graphics, single-chip design, and compatibility with Core 2 processors too tempting to resist, choosing it over Intel’s own mobile chipsets.

The 9400M and its ilk may soon be but a distant memory, though. Both PC Magazine and PC Perspective report that Nvidia has essentially put chipset development on hold, and we won’t be seeing GeForce or nForce chipsets made to work with Nehalem-based Intel processors anytime soon, if ever. Nvidia spokesman Robert Sherbin explained why to PC Magazine:

“We have said that we will continue to innovate integrated solutions for Intel’s FSB architecture,” Sherbin said in an email. “We firmly believe that this market has a long healthy life ahead. But because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we’ll postpone further chipset investments.”

The chipset licensing brawl between Nvidia and Intel erupted in February, although back then, Nvidia said it was “aggressively developing” chipsets for Intel’s DMI interface. DMI is what Nehalem-derived processors on LGA1156 and mobile sockets use to talk to their chipsets, as opposed to the QuickPath interconnect of LGA1366 Core i7 CPUs.

This situation doesn’t bode well for Ion, the GeForce 9400M’s netbook (and nettop) iteration. Future Atom processors will have on-die graphics, so PC Perspective says Nvidia simply plans to offer a discrete, Ion-branded GPU without chipset elements. System makers will be able to combine that product with Intel’s Pine Trail platform, which includes Atom SoC and I/O hub chips.

On the AMD front, Nvidia told PC Perspective it’s still seeing “a high volume of MCP 61-based sales” at the low end, although demand for enthusiast, AMD-bound nForce chipsets has withered. AMD doesn’t plan to switch platforms until 2011, so Nvidia chipsets may survive in that niche for a little while longer.

Comments closed
    • thermistor
    • 10 years ago

    #58…Well, Opteron was King Kong in 2005-2006…Opteron is still competitive depending on the niche, but not as excellent and triumphant as it once was.

    Finally, someone above mentioned that AMD really wanted Nvidia as part of their business, not ATI. It was the business equivalent of settling for hamburger when you really wanted the steak. I’m surprised that it has worked out as well as it has for AMD, ATI, and Nvidia. But now the whole thing is cracking up, as it inevitably would.

    • Silus
    • 10 years ago

    My stance is the same as it was in regards to Intel vs AMD anti-competitive “measures”. If Intel or AMD are trying to push NVIDIA out of this business, they should be punished as Intel was not long ago.

      • Anonymous Coward
      • 10 years ago

      Progress towards a single-chip solution for mainstream computing needs is completely natural and not anti-competitive. While it is dirty (possibly worthy of a big fat fine) to deny nVidia the right to build an external “integrated” GPU, it won’t be long before the market for that disappears entirely.

        • Silus
        • 10 years ago

        That’s quite an irrelevant argument, when we haven’t reached that “single chip” time yet…
        If either Intel or AMD are doing it now, they need to be punished for it.

          • flip-mode
          • 10 years ago

          Anyone can develop a chipset for AMD as far as I know. This one is Nvidia’s choice. Either they’re trying to hurt AMD/ATI by not continuing to provide an SLI solution for the platform, or they don’t see a large enough market for AMD motherboards for them to bother with, or people are actually choosing AMD’s solutions over theirs to such an extent that Nvidia does not feel it can compete.

          Furthermore, AMD/ATI does not lock out CrossFire from any platform, but Nvidia chooses not to enable it, as far as I know. Actually, I’m shooting from the hip on this one, but I think this is true.

          If Nvidia had balls, they’d have blocked Intel from SLI on the X58 and P55. The problem is, with the 4870 X2 and the GTX 295, SLI isn’t really necessary anymore. SLI is a weak bargaining chip now – nowhere near as strong as it used to be.

    • jensend
    • 10 years ago

    The demise of the 3rd-party chipset market is *very* bad news for consumers. If I were an antitrust regulator, I’d be giving Intel and AMD (especially Intel) a hard look over anticompetitive measures and trying to put together some legal remedies, especially licenses, very fast, before it’s too late to do any good.

      • compuguy1088
      • 10 years ago

      I agree. With NVIDIA out of the picture, it’s just the individual CPU makers who are making the chipsets.

      • Anonymous Coward
      • 10 years ago

      While I’m a fan of choice, I think you’re exaggerating a bit to call it “very bad”. There has been an endless march of integration (network, sound, graphics) which has seen the number of companies involved in a complete computer decrease for quite some time.

      The only monopolistic problem is that x86 is a patent fortress sitting on top of a software mountain. It would be ideal if there was no legal barrier to stop nVidia from entering the x86+GPU market (presumably at the low end).

    • kc77
    • 10 years ago

    I think Nvidia did this to themselves. First, their AMD-platform chipsets have needed a refresh for some time. AMD’s own chipsets are “OK” but their southbridge leaves much to be desired. Still, AMD now has the preferred chipset, because finding a decent Nvidia AM3-socket motherboard is difficult.

    They thought that Intel could make them more money. Well, when was the last time Intel played nice in the chipset business? Never. So now they are blowing in the wind with chipsets that are aging in all respects. Maybe instead of trying to play hardball they would do well to work with other CPU manufacturers that still need enthusiast motherboards.

      • eitje
      • 10 years ago

      like VIA! 😀

    • sledgehammer
    • 10 years ago

    intel platform looks plain and boring.
    their graphics are a disgrace: pitiful, shameful, and a professional risk for the people that use them.
    intel processors are power-hungry, expensive, and crippled to artificially segment the market, with no upgrade path.

    amd graphics are top notch
    amd processors are really good and cheap.
    amd platform is interesting.
    amd innovates.

    wow intel is in trouble

      • wibeasley
      • 10 years ago

      * permanently collapses snakeoil’s thread *

      • eitje
      • 10 years ago

      well well, a new & refreshing voice!

        • MadManOriginal
        • 10 years ago

        Haha. Somehow ‘wow intel is in trouble’ just doesn’t have the ring of the original phrase though 🙁

          • DrDillyBar
          • 10 years ago

          true

      • maxxcool
      • 10 years ago

      What’s wrong with sledge hammer?

        • fpsduck
        • 10 years ago

        Peter Gabriel’s Sledgehammer is the Best Music Video of ALL TIME.

          • DrDillyBar
          • 10 years ago

          good tune. 🙂

      • DrDillyBar
      • 10 years ago

      50%. Even I try harder.

    • Farting Bob
    • 10 years ago

    Don’t worry, now that every i5/i7 board is SLI-enabled you’re paying licence fees even if you have zero intention of using any Nvidia hardware. Everybody wins, except the consumer, and who cares about them?!?!

    • shank15217
    • 10 years ago

    Intel is up to its usual tactics, anybody thinking Intel is a pillar of innovation is a delusional idiot. They will slowly lock out all competitors from the market, then tier their offerings in such a way that flexibility will fly out the window. The IT world will be a very boring place indeed.

    • Phimac10
    • 10 years ago

    I don’t like this at all. I am distastefully disappointed, in all manner of speaking. Can we all get to one agreement which, in fact, produces a win-win type of deal?

    • MadManOriginal
    • 10 years ago

    For desktop I could care less; in fact, I say good riddance, because NV Intel chipsets have never been very good. The only reason to use them was SLI. For mobile it’s worse: even if the chipset might have had minor problems (manufacturing defects aside), it was a compelling choice. I guess we’ll see discrete laptop graphics quite a bit more now; hopefully they work out the power options to be able to completely shut down discrete cards.

    • Forge
    • 10 years ago

    This leaves Apple in a very bad spot, too.

    Apple is pretty much the ONLY OEM who adopted 9400M in any meaningful way. I’ve been Macbook/Hackbook shopping, and I can’t find any models, past or present, with the 9400M that aren’t Apple machines. I know they must exist, but I can’t find them.

    Apple also burned a lot of bridges with Intel when they went to NV graphics as standard, much mud was slung about Intel IGPs having crap performance.

    This means that Apple is stuck combining ATI/Intel for GPU/CPU (Apple didn’t bash ATI graphics) or crawling back to Intel for the whole package, with much grovelling.

      • MadManOriginal
      • 10 years ago

      Well at least they won’t have a problem marketing to the Macheads. I remember all their PowerPC ‘supercomputer’ bs that bashed other CPUs hard, granted it was during the P4 era but there were A64s too, then when they switched to Intel CPUs the spin changed.

      • Skrying
      • 10 years ago

      I think even Intel is aware of how crappy their graphics are. Also, I don’t think the bridges are burnt that badly, considering the CPU relationship between Apple and Intel. Intel is not going to burn one of the largest manufacturers because of this. Companies have big egos but they’ll get over it for more money.

      Also, Dell’s Studio 14z and Studio XPS 13 both have the 9400M as the default graphics option, off the top of my head.

      • Convert
      • 10 years ago

      Just off the top of my head a current one from Dell is the Studio XPS 13.

      *edit* As I practically repeat Skrying verbatim, sorry!

      • data8504
      • 10 years ago

      Where on earth did you come up with this?

      Skrying is right. We’re all painfully aware of how underpowered our integrated solutions are, comparatively speaking. And to be quite frank, we don’t hold grudges against anyone: just ’cause someone leaves us for a better solution doesn’t mean we dislike them. Please cease the FUD.

      • OneArmedScissor
      • 10 years ago

      I bet Apple could care less. They’ll just replace their entire product line with Westmere derivatives and go along their merry way.

      • Anonymous Coward
      • 10 years ago

      I think that if Apple cared, they would have pressured Intel into allowing nVidia to build the GPU, maybe exclusively for Apple.

    • anotherengineer
    • 10 years ago

    No SSID………maybe his date kidnapped him and is holding him ransom for leet PC uber gear LOL

    • StashTheVampede
    • 10 years ago

    Not … too long ago, a number of TR users ended up asking why AMD bought ATI out. This should be yet another pointer as to why: AMD’s only hope of surviving the long haul was to be a complete OEM player (like Intel) for chipsets.

    I’m positive AMD and Nvidia talked, but the buyout couldn’t possibly happen because of Nvidia’s price. ATI’s acquisition has positioned AMD well (yes, they are still bleeding, but not as fast) for the longer haul in the consumer space. Nvidia is effectively left out of desktop/server/mobile chipsets and they don’t have any other choice but to go embedded (not a bad space to be in, imho).

      • TheEmrys
      • 10 years ago

      AMD made its own chipsets long before it acquired ATI. And ATI’s chipsets really weren’t much to write home about.

        • eitje
        • 10 years ago

        Yes, the AMD 760 was badass…?

        http://www.techreport.com/articles.x/3461

        • Saribro
        • 10 years ago

        AMD only produced chipsets for (relatively) short times to fill the gap until other players (back then mostly VIA, somewhat SiS) got their parts out. It was never meant as more than a stopgap.
        They noticed that didn’t always work out that great, and after VIA’s retreat and SiS’s slow fade, they figured they needed to provide a default platform of their own, just like Intel.

      • indeego
      • 10 years ago

      Buying ATI was good for AMD if only because ATI was a much better run company. Were I on the board, I would have made ATI’s managers lead AMD.

      • Suspenders
      • 10 years ago

      I’m still not convinced of that. Pretty much everything so far that AMD’s managed to get from the ATI acquisition it could have got through aggressive partnerships with either Nvidia or ATI. And at a much, much cheaper price to boot.

      Not to mention the bad blood between Nvidia and AMD, now that AMD owns Nvidia’s biggest competitor. If you remember, Nvidia and AMD were pretty buddy-buddy back before the acquisition.

        • StashTheVampede
        • 10 years ago

        AMD and Nvidia didn’t have much of a choice but work together, right?

        AMD needed chipsets, since they weren’t big on making their own (Intel was, by comparison). VIA started them off well (but also made Intel chipsets), but weren’t that great (one reason why OEM adoption was poor). Nvidia was a serious kick in the pants with good chips and lots of functionality, but they too went for the Intel market ASAP and somewhat left AMD behind with Socket 939. Nvidia was fine with leaving AMD — they were all over the more profitable Intel chipsets!

        Partnering up is only as good as the contract and their management. ATI could easily have sold to someone else (maybe difficult for Nvidia to buy) and let their partnership expire. AMD needed to be a top to bottom player in the market and it would have taken much longer to obtain all the GPU knowledge that ATI already had.

          • Suspenders
          • 10 years ago

          I just don’t buy the assumption that they needed to be a “top to bottom player” in the market, and certainly not at the price that they paid to do it (in money, assets, and relationships in the industry). AMD almost bankrupted itself to become a “top to bottom player”, and it didn’t need to, because it could have gotten everything it was looking for (save for iron-fisted commitment to them) without putting the company on the brink of bankruptcy.

          Nvidia’s Opteron chipsets showed that the sort of partnership model they had going worked very well for both AMD and Nvidia. Also, had ATI “sold to someone else”, that would have only served to put Nvidia more firmly in AMD’s camp. Instead, AMD alienated one of their best partners, nearly bankrupted the company, destroyed ATI’s value, and generally crippled themselves, all to get hold of a chipset business and graphics know-how that they didn’t really need, and certainly not at the company-breaking price they paid for it.

            • kc77
            • 10 years ago

            You are forgetting that AMD went to Nvidia first with the merger offer. ATI was the second choice. Nvidia wanted to control the CPU side of AMD. AMD balked and went over to ATI. Whose fault is that?

            • Suspenders
            • 10 years ago

            Well, my understanding of it was that AMD simply couldn’t afford a merger with Nvidia.

            I think that would have been a bad idea also, in any case.

            • ludi
            • 10 years ago

            Actually, the Opteron chipset example doesn’t support your claim. Platform support for that kind of setup required that buyers validate two major platform products, and then receive support for them, from two companies, which is always less desirable than dealing with one company.

            The fact that it worked out anyway was merely a testament to how good the products were, relative to the competition.

            • Suspenders
            • 10 years ago

            Possibly.

            Anyway, my point was that it could be done, and it could be done economically (quite profitably, in fact), without the need for an expensive, potentially company-breaking buyout of ATI by AMD. The fact that potential buyers would have to deal with two companies rather than one is a snag, but that could have been mitigated through closer co-operation between AMD and Nvidia. There was certainly a lot of room for co-operation on various projects between the two firms (e.g. imagine AMD fabbing Nvidia GPUs).

            Would it be better if AMD had its own chipset division a la Intel? Probably. But not at the price they paid to acquire it, and certainly not if they could have got what they were looking for without the back-breaking expense of acquiring ATI. AMD isn’t Intel; they just can’t afford to throw around billions of dollars like Intel and run the business as if they were Intel. They have to do things smarter, through partnerships with other companies. Or they should have, at any rate.

            Just imagine where AMD would have been today had it not acquired ATI. Do you honestly think they’re better off? I really don’t think so.

            • StashTheVampede
            • 10 years ago

            Do you realize that at the time AMD bought ATI, Intel was shipping more GPUs than ATI and Nvidia? That’s right, Intel’s integrated solutions (as bad as they were) were outshipping dedicated cards.

            Let’s combine a few things, shall we? To put together a basic laptop, you will need:
            – Processor
            – Chipset
            – Motherboard
            – Video

            Intel could deliver EVERY item on that overly simplified list. As time goes on, all of those will get cheaper to buy from Intel and they will continue to consolidate their parts. AMD knew this and we’ll start to see more consolidation next year.

            AMD could deliver a processor, but couldn’t deliver the rest. What’s an OEM going to do when presented with the “AMD solution”? They would have to get at least two vendors into the mix for their laptop/desktop/server or they could go with Intel and be done with it.

            ATI was purchased before someone else could do it. Intel is making a killing off selling all the chips in your desktop, and AMD knew that if they didn’t strike when they could, there wouldn’t be a chipset/GPU player left they could buy or merge with. Had AMD waited (say, for Nvidia to be low on funds), they wouldn’t have reaped the cash from console revenue, notebooks, servers, etc.

            • Suspenders
            • 10 years ago

            Intel can ship millions of crappy GPUs with their offerings because they have the power to cram them down vendors’ throats. Hell, they were shipping inferior CPUs along with inferior GPUs for years, handily beating the competition with barely a hiccup. They can do it because they’re a behemoth. AMD is no such behemoth, so trying to do things the Intel way is much less likely to work out.

            As to the “platform problem”, my guess is that partnerships with other vendors would have been a better solution than pouring $5 billion into an acquisition. Especially when you consider what AMD has today that it wouldn’t have had without the ATI buyout. The only meaningful things I can think of are the restriction of ATI integrated graphics to AMD CPUs, along with an in-house chipset platform. Without the ATI acquisition, they’d still have their own fabs, a few billion dollars of extra spending power, and Nvidia and most likely ATI chipsets for their CPUs, but no in-house platform. They paid a damn high price for that platform…

            • StashTheVampede
            • 10 years ago

            They paid a high price to be in the market with Intel, yup. If they didn’t get a complete chipset and GPU solution, they could easily have been squeezed out by either of those two companies.

            It’s A LOT of money to get all this tech, but they didn’t have much of a choice — Nvidia was too expensive and there was really no other player.

            From the looks of things, it could be better, but it’s far from bleak. AMD is getting plenty of embedded sales with every Wii and 360 sold (low margins, I know) and their desktop/server solutions are doing very well for OEMs.

            I know you have named going with a partnership or merger to get what they needed out of this, but what if Intel (or someone else) bought them instead? They had little time to act in this space and needed to make a play.

            • Suspenders
            • 10 years ago

            Sure, there was certainly a danger that Intel might go for gobbling up ATI. I don’t think they would have, though, as Intel has very much been keener on pimping their own technology (x86 and Larrabee), and there was too much overlap between what ATI would have brought them (chipsets and integrated graphics especially) and Intel’s own products. The price was obviously another factor.

            Still, even if Intel did buy ATI, that would only have served to force Nvidia and AMD into each other’s arms. It’s certainly not clear that it would have been the end of the world for either company. In many ways, AMD would have been better off without the debt load and with Nvidia in its orbit, along with the technologies Nvidia’s been trying to develop (CUDA, PhysX, Tegra, etc.). I certainly am not one who subscribes to the “they HAD to do it to stay competitive” theory.

            • zima
            • 10 years ago

            SiS? ;p Their chipsets were better than ATI’s at the time of the AMD acquisition…and SiS even had graphics better than Intel’s. Plus – AMD would also get a cut from X360 sales 😉

            • zima
            • 10 years ago

            But AMD is not trying to do things Intel way. AMD integrated graphics are actually quite good, can stand on their own merits.

            • MadManOriginal
            • 10 years ago

            The biggest problem was timing. They bought ATi well into a bull market, so they paid more than what was really the fair price for the company. Then not long after, NV launched the 8 series and ATi had the HD 2000 series, not a good reversal from X1K vs the 7 series, and it took a while to get good chipsets, and there are still quirks with them (AHCI).

            • SubSeven
            • 10 years ago

            Absolutely spot on. When they acquired ATI, it seemed the perfect future strategic move. What they had no clue about is that the market would soon tank like it hadn’t in a very long time. Coupled with Nvidia kicking their behind in the GPU segment and Intel introducing Conroe in the CPU segment, AMD was ripe for a massive collapse thanks to that wonderful debt load. I believe it was Indeego that said that ATI was a well run company. Well sir, I must respectfully disagree. ATI was well known on Wall Street for its relatively poor management. It is this poor management that allowed Nvidia to kick the living daylights out of them since the introduction of the 6000 series of cards.

            • Suspenders
            • 10 years ago

            Yeah, a good part of it was timing also. But then again, ATI would have always been worth more on its own than hooked up with AMD, mostly for the fact that they served more markets (Intel chipsets and mobile platforms), so any buy-out would have had to include a loss of value. AMD also wasn’t too keen on some of ATI’s other businesses (the Imageon product line, Xilleon). That all has to be priced in as well.

            Makes the buy seem even less appealing to me. I think we’ll only know whether or not the ATI buy was the right move when we see how the “Fusion” idea pans out, and whether or not integration of CPUs and GPUs deserves the importance that was placed on it. If it turns out that this was a bad idea, then I really think the ATI purchase was a waste of money for AMD. If it’s the way of the future, then it was a good buy. 2011/2012.

    • YeuEmMaiMai
    • 10 years ago

    with all their chipset problems, this might be a good thing…..I used an Nvidia chipset for 2 AMD builds and both boards (ECS and ABIT) croaked…..replaced with ATi chipsets and both PCs are still running strong 2 years later.

    edit: also this is what happens when you repeatedly try to screw companies over…..lol (MS with Xbox and countless manufacturers with the mobile gfx issues)

      • compuguy1088
      • 10 years ago

      My current rig has an NVIDIA 680i motherboard. Just replaced my 8800 GT with a 5850, and it hasn’t croaked. It has its odd glitches, but it hasn’t croaked…

    • flip-mode
    • 10 years ago

    CHARLIE WAS RIGHT YOU FANBOY BIZOTCHES!

    Heh, just teasing.

      • jdaven
      • 10 years ago

      Lol! Way to rile up the fanbois, you troll. Lol! J/K

        • flip-mode
        • 10 years ago

        I has teh troll 4 hollowpeen

    • ssidbroadcast
    • 10 years ago

    Yeah, this is pretty problematic for nVidia. I have a hard time believing management would just sh–can R&D because of a legal hurdle. Think of the valuable time lost if chipset engineers had to sit idle and twiddle their thumbs all because of billable hours.

    Not having a chipset roadmap altogether would be a pretty sizeable setback.

    • DrDillyBar
    • 10 years ago

    How are they going to charge people a premium to use SLI now? oh noes!

      • yuhong
      • 10 years ago

      Motherboard licensing

    • adisor19
    • 10 years ago

    This blows. It can only mean that future iterations of the MacBook Air will come with BS Intel integrated graphics >:(

    Adi

      • derFunkenstein
      • 10 years ago

      not until they go Core i7 or i5, which is going to take some time yet – they require too much power and generate too much heat for the Air.

        • jdaven
        • 10 years ago

        Not necessarily. The CPU inside the Macbook Air is rated at 17W. Intel is coming out with an 18W version of Arrandale this year in Q4.

        http://en.wikipedia.org/wiki/List_of_future_Intel_microprocessors#.22Arrandale.22_.28ultra-low_voltage.3B_32_nm.29

        It is clocked slower, but the Nehalem architecture is better than Core 2, and hyperthreading and turbo clock are there. At least the integrated GPU in Arrandale is 50% faster than the old GMA chipset.

          • tfp
          • 10 years ago

          That and the northbridge is pretty much gone so that will save some power.

          • maxxcool
          • 10 years ago

          18 watts? In a freezer, maybe, in suspended mode. I am still a bit suspicious of i7 as a mobile part. I don’t think it’s ready yet for that space, even as a dual-core, 4-thread part.

            • MadManOriginal
            • 10 years ago

            Arrandale is 32nm, that should help decrease your suspicions.

            • jdaven
            • 10 years ago

            Intel was able to take a 95W Prescott and make it dual core for just 130W at the same speed when they went to 65 nm from 90 nm. When you shrink the process, you shrink the power. What’s so weird about that? We all know that to be true. The 18 W 32 nm Arrandale chip is a dual core (not quad core) running at 1.2 GHz (turbo to 2.26 GHz). I think this is perfectly reasonable.

      • designerfx
      • 10 years ago

      what? lots of macs have ATI graphics cards in them.

      • anotherengineer
      • 10 years ago

      LOL Too bad they couldn’t use ATI integrated gfx/chipsets, but wait, if Macs offered AMD CPUs they could.

      So does this mean Mac = fail??

        • Kurotetsu
        • 10 years ago

        They could use AMD’s discrete graphics chips (like the HD4330) like the Acer Timeline notebooks do.

      • Skrying
      • 10 years ago

      They could easily switch to a… switchable graphics system in the 13″. It would kick-start Intel graphics when on battery and Nvidia (or AMD) dedicated graphics when connected to wall power, hopefully with a manual user option as well. That would still maintain excellent battery life (potentially even better) while providing the graphical oomph we all want.

      “Easy” assuming OS X wouldn’t throw a fit, though since they make both hardware and software it shouldn’t be out of the realm of possibility.

        • jdaven
        • 10 years ago

        The main problem is the number of chips for their small systems. Arrandale chips are not totally SoC. There still needs to be an I/O chip for ports. So that would make 1 Nvidia GPU, 1 Intel CPU and 1 Intel north/south bridge. The current MacBook Air design only has two chips. Maybe they would all fit. Time will tell.

          • Skrying
          • 10 years ago

          Only has two chips? Are you sure? There’s an Intel CPU, I would assume an Intel chipset, and an Nvidia graphics chip. I wasn’t under the impression that Apple went with a Nvidia chipset fully, simply a very low power Nvidia graphics chip that communicated to main memory.

            • jdaven
            • 10 years ago

            They went with the Nvidia chipset only. No Intel chipset at all, just the CPU. That is what the 9400M is: a northbridge, southbridge and GPU all in one. This is the reason Intel is so on fire about Nvidia. They are losing their chipset business to them. But instead of releasing a better product, they are using their monopoly to force Nvidia out of the business. OEMs love the 9400M. A single chip combining the chipset and a great mobile GPU makes for compact mobile motherboard designs. So what does Intel do? Stop licensing to Nvidia.

            Don’t ever be fooled into thinking Intel does not abuse its monopoly position. This is not AMD fanboi talk or other such nonsense. Just the pure power of having a dominating position over a market. Who loses the most?…us!

            • Skrying
            • 10 years ago

            Ahh. I figured otherwise.

            I’m not exactly sure what this has to do with anti-competitive behavior from Intel though.

            • jdaven
            • 10 years ago

            Because Intel was mad that they lost chipset business to Nvidia. They were especially mad that they lost the business from their newest partner, Apple. Apple was using Intel northbridges with GMA graphics. But Apple did not like the graphics performance and switched to Nvidia. So instead of making a better chipset to compete, Intel revoked Nvidia’s license to make chipsets for Core i7 CPUs. Therefore, if Apple wants to use Core i7s, they are forced to drop Nvidia and use Intel chipsets, which are inferior when it comes to graphics. When a large company denies a smaller one the ability to make products just because it lost business to that smaller company, and doesn’t make a better product to compete, then we ALL lose. This is anti-competitive behavior at its finest. I don’t know how to explain it any clearer.

            • Kurotetsu
            • 10 years ago


            • jdaven
            • 10 years ago

            All FSB CPUs except Atom are being discontinued in favor of DMI and QPI. Are you saying that Apple, Dell, HP, etc. should migrate all of their products to Atom? Of course not!

            Once Intel finishes migrating to Core i3/i5/i7 and only uses LGA1156 and LGA1366, Nvidia is out of luck. This is going to happen at the end of this year or the beginning of next. The reason Nvidia still sees use out of the FSB is the Ion/Atom platform only. This is still a relatively small market compared to all of Intel’s other CPUs.

            If you want a desktop, laptop or workstation with integrated graphics, your ONLY choice is going to be Intel for Intel CPUs. Why is this so hard for some of you guys to understand? This is a major, major development.

            • jdaven
            • 10 years ago

            Here is a link to the iFixit teardown of the MacBook Pro 13″. This is similar to the MacBook Air.

            http://s1.guide-images.ifixit.com/igi/XaPYhlwqukefDI4j.large

            Notice the small size of the motherboard and only two chips: the Nvidia 9400M and the Intel CPU. There are no other chips on the other side. Just imagine trying to fit a third chip on that thing.

            • adisor19
            • 10 years ago

            Wrong. There’s the Intel CPU, the Nvidia 9400M chip and the soldered RAM. That’s it. And that’s pretty much what can fit in the small space while keeping battery life in check.

            Adi

    • thermistor
    • 10 years ago

    Several years ago, NV was the preferred chipset provider for the then-competitive Opteron, and NV chipsets were present on most retail systems. Yeah, there are NV chipset options for Intel and have been for 775 for years (I even own one mobo); however, the bulk of their sales had to have been on AMD stuff.

    This lowers consumer choice, but the channel and enthusiast market are not big enough to support their Intel offerings IMO.

      • shank15217
      • 10 years ago

      Lol, so you are saying the Opteron isn’t competitive any more? Check again..

    • sweatshopking
    • 10 years ago

    seems like probably the best idea. don’t spend any more money designing chipsets for systems you aren’t allowed to.

      • HighTech4US
      • 10 years ago

      I thought the court case wasn’t until next year!!! Your statement seems to draw the conclusion that the court has already decided in Intel’s favor.

      If nVidia prevails (and they believe strongly they will), expect development to resume and a hefty check from Intel (nVidia has counter-sued Intel) for interference with nVidia’s business.

      And as a side note, if Intel prevails, the chipset cross-license agreement then becomes null and void, and Larrabee will then infringe on nVidia’s graphics patents.

      I really don’t see how Intel expects to come out ahead on this. If they win, nVidia won’t make chipsets, but Intel won’t be able to make Larrabee.
