Nvidia posts record quarterly revenue

Maybe it’s the sheer popularity of the firm’s Tegra 3 chips. Maybe it’s all the new 6-series GeForces. Most likely, it’s both. In any case, Nvidia enjoyed some very strong results this past fiscal quarter. Revenue hit $1.2 billion, a new record that represents double-digit percentage growth from both the previous quarter and the same quarter a year ago. Gross margin also reached a record high, at 52.9%.

Things are looking so good that Nvidia has issued a dividend to shareholders, to the tune of 7.5 cents per share.

Here’s a quick run-down of the firm’s earnings compared to previous quarters. Keep in mind the quarter we’re referring to was the third of Nvidia’s 2013 fiscal year, and it ended on October 28.

              Q3 FY'12          Q2 FY'13          Q3 FY'13
Revenue       $1,066.2 million  $1,044.3 million  $1,204.1 million
Net income    $178.3 million    $119.0 million    $209.1 million
Gross margin  52.2%             51.8%             52.9%
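For the curious, the "double-digit percentage growth" claim checks out against the revenue figures in the table above. A quick back-of-the-envelope calculation:

```python
# Sanity check of the growth figures cited above, using the table's revenue numbers.
q3_fy12 = 1066.2  # revenue in $ millions, Q3 FY'12
q2_fy13 = 1044.3  # Q2 FY'13
q3_fy13 = 1204.1  # Q3 FY'13

yoy = (q3_fy13 - q3_fy12) / q3_fy12 * 100  # vs. same quarter a year ago
seq = (q3_fy13 - q2_fy13) / q2_fy13 * 100  # vs. previous quarter

print(f"Year-over-year growth: {yoy:.1f}%")  # 12.9%
print(f"Sequential growth: {seq:.1f}%")      # 15.3%
```

Both figures clear 10%, which is where the "double-digit" wording comes from.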

Here’s some commentary from Nvidia CEO Jen-Hsun Huang. It sounds like all of the company’s key businesses performed well:

"Investments in our new growth strategies paid off this quarter in record revenues and margins," said Jen-Hsun Huang, president and chief executive officer of NVIDIA. "Kepler GPUs are winning across the special-purpose PC markets we serve, from gaming to design to supercomputing. And Tegra is powering some of the most innovative tablets, phones and cars in the market."

Things are shaping up nicely this quarter, too. Nvidia predicts revenue will reach $1.025-1.175 billion, a potentially sizable jump from the $953.2 million the firm posted a year before. Gross margin is expected to fall between 52.9% and 53.1%, roughly flat sequentially but up slightly from the 51.4% posted in the fourth quarter of Nvidia's previous fiscal year.
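To put that guidance range in perspective, here is the implied year-over-year growth at both ends of the range, computed from the figures in the paragraph above:

```python
# Back-of-the-envelope check of Nvidia's revenue guidance vs. the year-ago quarter.
guidance_low, guidance_high = 1025.0, 1175.0  # $ millions
year_ago = 953.2                              # year-ago quarterly revenue, $ millions

low_growth = (guidance_low - year_ago) / year_ago * 100
high_growth = (guidance_high - year_ago) / year_ago * 100

print(f"Implied year-over-year growth: {low_growth:.1f}% to {high_growth:.1f}%")
# roughly 7.5% to 23.3%
```

Even the low end of guidance would represent solid growth over the year-ago quarter.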

Comments closed
    • Asbestos
    • 7 years ago

    They don’t need all that money. Time for a windfall profits tax. A lot of people helped Nvidia. Let’s spread the wealth around.

      • bb420
      • 7 years ago

      That sounds like a bad idea. If we decided to tax butt hurt, irrelevant political comments, however, I firmly believe that we’d have the deficit whittled down to nothing in under a year.

        • MadManOriginal
        • 7 years ago

        People making those posts would just be sure to host them on off-shore servers so the tax man can’t get to them 🙁

      • l33t-g4m3r
      • 7 years ago

      I agree. They didn’t build that, so let’s spread the wealth around from the workers to the non-workers and watch society react. Hilarity ensues. Good times, good times. Obamafone! Keep Obama in president you know, he’ll do more. ]:-P

        • NeelyCam
        • 7 years ago

        R&P for destruction, thx

      • liquidsquid
      • 7 years ago

      I could swear that is what happens when you invest in a company. You invest, you are rewarded. Simple. The problem is getting the money to invest.

    • sschaem
    • 7 years ago

    Ok, so nvidia now has more cash than AMD is worth on the open market….

    And it's baffling to see that nvidia makes the most money on technology ATI had pioneered and AMD killed after the acquisition.

    Way, way back, ATI was working on the SoC business (2002) and was working with Stanford research (BrookGPU) on enabling GPGPU compute (an early compiler for shader HW, ~2005).
    It was only a year after ATI released their GP GPU SDK that nvidia released CUDA.

    ATI was a true technology pioneer.

    While AMD was busy mutilating ATI, nvidia saw what ATI was doing… and went full on.
    nvidia gave us CUDA & Tegra, AMD gave us bulldozer.

    Well done nVidia.

      • MathMan
      • 7 years ago

      All ATI ever introduced until it came up with its own OpenCL was an 'SDK' with an assembler. Only a month or so later, Nvidia released CUDA, with a full compiler, dev tools, libraries, etc.

      Now think for a minute: which one takes the longest to create in terms of development time?

      It took AMD years to release their compiler, but by then the train had already left the station.

    • Krogoth
    • 7 years ago

    I attribute a considerable portion of the record profits to the growing GPGPU market in HPC systems.

    Nvidia's brand name is still very strong, and there are enough "fanboys" out there who are willing to get Nvidia solutions despite the fact that the competition has "better" deals at some price points.

    It will be interesting to see how AMD's graphics division fares in this fiscal quarter.

      • Silus
      • 7 years ago

      Look at the numbers…Tegra business is quite big already. In just two years they went from almost no revenue to more than 200 million.
      That’s where the “record” comes from, because this is a newer market than all the others NVIDIA is in. All the other segments are basically the same as the previous quarters. Some increase here and a decrease there, but Tegra grew quite a bit which made the record quarter what it was.

      • brute
      • 7 years ago

      one doesnt have to be a fanboy to prefer a particular brand.

      are there shampoo fanboys who only buy heads and shoulders over the walmart brand? how about cereal fanboys who refuse to buy frosted mini spooners? brand preference != fanboyism

        • alwayssts
        • 7 years ago

        Your example makes an interesting, if subliminal, inference. The ratio of crushed wheat bites, stray strands, and deformities in mini spooners is often higher, hence why it is cheaper. To be a fanboy, I would argue, is when you will buy/support anything relating to that brand regardless of its objective, let alone relative, quality.

        Those people do exist, and are hard to parse from the vast majority who are uninformed, and yes, nvidia does bank on both. It's called winning the marketing war and associating the brand with a better product regardless of what the competition offers. It's Intel Inside. It's THE WAY IT'S MEANT TO BE PLAYED. It's the Kraft Mac and Cheese because I've seen the commercial 500 times.

        The comparison you are making relegates AMD to a generic value brand (ie because of drivers etc)? I disagree with that classification, but perhaps that is indeed the problem. There are certainly people who take that stance as well.

        Perception is reality… and one cannot argue it is on AMD's side. If it were, we would not be having this conversation. It doesn't really matter if it's founded or not; the results speak for themselves.

          • brute
          • 7 years ago

          i know it isn’t a good comparison really. my point was mostly that people have brand preferences. it doesnt make anyone a fanboy or whatever.

          i know guys who know little to nothing about cars, but associate certain brands with quality, and others with a lack of it. it may well be the case that hyundai/kia make alright automobiles, but i know i wont own one. that doesnt make me a fanboy.

            • Washer
            • 7 years ago

            Lol?

      • MathMan
      • 7 years ago

      A different way to look at it: in terms of pure GPU performance, there is not a lot of difference between AMD and Nvidia. You’d do well buying either one.

      But AMD has little in the way of flashy extras to offer:
      No PhysX, no 3D Vision, no meaningful GPGPU tech (other than the silicon, which is undeniably great), a reputation for crappy drivers, no high-quality Linux drivers, etc.

      These are all things that are not really important in terms of revenue, but they do wonders to build a brand. AMD doesn’t have that, other than Eyefinity.

      You can’t ignore brand building and hope that people will figure things out by themselves: they don’t.

        • Krogoth
        • 7 years ago

        PhysX is already dead.

        3D Vision = Eyefinity.

        AMD’s Northern Island GPUs are potent GPGPU in their own right.

        *nix advantage is a moot point here. The vast majority of the *nix userbase use integrated graphical solutions (Intel).

        “crappy” drivers is just a myth that has gone on far too long. It was only valid back when the Rage line was ATI’s top platform…..

        AMD and Nvidia are about equal to each other in the driver department. They each have their own set of nagging issues, which the fanboys blow out of proportion.

          • Deanjo
          • 7 years ago

          [quote<]PhysX is already dead. [/quote<]

          Doesn't appear that way with the games that come out.

          [quote<]*nix advantage is a moot point here. The vast majority of the *nix userbase use integrated graphical solutions (Intel).[/quote<]

          Not even close. [url<]http://www.phoronix.com/scan.php?page=article&item=lgs_2011_results&num=4[/url<]

            • Krogoth
            • 7 years ago

            Name me a title that has PhysX support and isn't practically sponsored by Nvidia. Good luck. The last time I took a look at the list, the overwhelming majority of the titles were UE3-based games. The final nail in PhysX's coffin will happen when developers no longer use the UE3 engine.

            Most of the already "small" *nix userbase and systems are running servers and workstations. They don't need fancy discrete cards to get the job done. *nix gamers are a tiny minority of the *nix userbase. The link's poll speaks for itself: only a few thousand users at most, and that website is the hub for *nix "gamers". Take that and go against the millions of non-*nix gamers that exist within the PC gaming community. I'm not even mentioning the even larger console crowd.

            Like I said, the “supposed” *nix advantage is a moot point in terms of marketshare and profitability for Nvidia shareholders.

            • Deanjo
            • 7 years ago

            [quote<]Most of the already "small" *nix userbase and systems are running servers and workstations. They don't need fancy discrete cards to get the job done. *nix gamers are a tiny minority out of the *nix userbase. The link's poll speaks for itself, only a few thousand users at most and the website it is a the hub for *nix "gamers". Take that and go against the millions of non-*nix gamers that exist within the PC gaming community. I'm not even mentioning the even larger console crowd.[/quote<]

            That's only one survey; smolt results back up those results, as do the openSUSE surveys that they do about every year. Servers usually run headless, and if they do have a graphics solution, it is usually an old ATI Rage chip. Workstations are usually using Quadro cards. Desktops are usually running GeForce cards. Why? Because despite all the opensource efforts, nvidia cards give the best experience in linux. They are the standard against which all other video solutions are judged.

            BTW, Phoronix caters more to users and developers than "gamers". Yes, they do cover linux game news as well, but it is a mere pittance compared to everything else discussed there. It is, however, probably the leading site for linux graphics in all aspects.

          • kc77
          • 7 years ago

          [quote<]AMD and Nvidia are about equal to each other in the driver department. They each have own set of aching issues which the fanboys blow out of proportion. [/quote<]

          Only in Windows is this the case. For Windows, AMD is pretty much OK; its drivers are no worse than Nvidia's. I've had my fair share of Nvidia driver bugs.

          [quote<]*nix advantage is a moot point here. The vast majority of the *nix userbase use integrated graphical solutions (Intel). [/quote<]

          For Linux, it's so not the case. A vast majority of people have Nvidia cards when it comes to Linux. Why? Because for whatever reason AMD refuses to take Linux seriously in any kind of way. Releasing some source code and saying "Here, you do it" isn't exactly support. AMD's biggest problem is that it cedes markets it has no business ceding. Linux is the one area where you don't have to pay developers or make back-room deals to get your code accepted in compilers or the kernel.

          AMD just got around to backing an API so that people could do video acceleration only a scant two to three years ago. How long has Nvidia had it? Try like eight. Now we have Steam coming to Linux. Where's Nvidia? Right there. Where's AMD? Who the hell knows. The sad part is all of Nvidia's work with Linux will pay off BIG TIME when people start releasing benchmarks, and then AMD is going to pretend like someone else is making it perform badly. Not having proper Linux support is so stupid I just can't fathom it. It makes no sense, as Linux is everywhere Nvidia is and AMD isn't.

            • Krogoth
            • 7 years ago

            Nvidia's drivers aren't "perfect" in the *nix world, and neither are AMD's or Intel's. They are half-closed, which goes against a certain element within *nix that believes everything should be completely "open". Nvidia drivers just work better than their AMD/Intel counterparts for gaming and other graphically intensive tasks. For general stuff, they are all equal. Remember, most of the *nix userbase is running servers/workstations with integrated solutions; Intel is the king here in terms of marketshare. You just need drivers that can render a no-frills GUI and a CLI and you are golden. AMD and Intel drivers can handle this task as well as their Nvidia counterparts.

            *nix for Steam is just a proof of concept. I doubt it will help the *nix community gain any significant marketshare. The problems with *nix not getting any mainstream marketshare run far deeper than the lack of mainstream gaming support. The only people who are cheering about it are existing *nix gamers and some anti-"M$" zealots.

            • MathMan
            • 7 years ago

            If you say that Nvidia drivers ‘only’ work better for games and graphics programs then I think we are pretty much 100% in agreement on the Linux front! 🙂

            Well, that, and the fact that AMD is nowhere in terms of GPGPU tool/library support in general, so also for Linux…

          • MathMan
          • 7 years ago

          Yes, PhysX is only supported on TWIMTBP games. That’s the whole point. How does that matter? Sounds to me like an extra incentive to buy Nvidia cards if you like that particular game, no? (Borderlands 2?) AMD does not have compelling features that can’t be run on Nvidia.

          Eyefinity is multiple monitors, which Nvidia now has too. 3D Vision is stereo, for which there is a decent Nvidia monitor ecosystem. AMD only pays lip service, but, hey, it's open!

          There are tons of Unix graphics workstations out there. Don't even think of going AMD there. Same thing for CUDA. AMD has always thought that HW was sufficient but neglected SW.

          The Windows drivers are probably around equal now? The reputation is not. Sucks for AMD.

          It cannot be surprising that a company of 7,000 (?) people dedicated almost exclusively to graphics can outdo a company of 11,000 where graphics is decidedly not the only focus.

      • jihadjoe
      • 7 years ago

      18,688 GK110s sold in one go! If they sold those to Cray at $3199 each (retail for K20 I think) that’s close to $60M from one customer alone.

      The bulk of their revenue increase probably did come from Tegra, but you can't deny the margins and profitability of the HPC segment.

    • GatoRat
    • 7 years ago

    I like the lower power requirements of AMD's graphics cards, but prefer NVidia in all other respects.

      • Silus
      • 7 years ago

      NVIDIA’s power requirements are better than AMD’s in this generation.

        • GatoRat
        • 7 years ago

        I stand corrected. The last card I looked at (an MSI 660) was the exception, at least in the review I read. Checking the other cards, I found that NVidia has indeed beaten AMD on power in general.

        • Farting Bob
        • 7 years ago

        True, the two are similar at idle (close enough that it doesn't make a real-world difference; NV maybe a few watts less), and while AMD saves a bit when the monitor is off with their ZeroCore stuff, Nvidia uses quite a bit less at load against most comparable cards.

        I've always gone for ATI/AMD, but if I were to buy one now it would be 50/50. Probably whichever happens to be on sale at the time.

    • jdaven
    • 7 years ago

    Good job, Nvidia.

    Now AMD needs to adopt your business model as fast as possible and get away from x86 and Intel.

      • chuckula
      • 7 years ago

      Oh Totally! I just heard from Rory and here’s a scoop on the latest AMD project code-name:

      [b<]M[/b<]aking [b<]E[/b<]xtreme [b<]T[/b<]echnology [b<]O[/b<]bnoxiously [b<]O[/b<]utrageous!

      Project [b<]METOO[/b<] will combine all of the latest buzzwords into a synergistic cloud-enabled megatasking platformance system. It will finally unify all of the raw number crunching power of ARM chips in smartphones with the [b<]EXTREME[/b<] power efficiency of AMD's own Bulldozer architecture. Project [b<]METOO[/b<] will be completely implemented inside of a virtualized emulated GPU for the ultimate in completely nonsensical technological convergence.

      All of this [b<]METOO[/b<] technology will be so eerily similar to the products produced by Samsung, Qualcomm, TI, Apple, and Nvidia that they will immediately join a new PowerPoint based consortium to boost AMD's profits to almost break-even levels! PowerPoints describing project [b<]METOO[/b<] indicate that it will run just as well in a smartphone as in the latest supercomputers. Project [b<]METOO[/b<] silicon will be implemented on the latest and greatest 8 nm SharkFINFet silicon which is already shipping from Global Foundries according to some slides we dug up out of their dumpster.

      No matter what market segment or application some so-called "competitor" has a product for, AMD has a resounding response: [b<]METOO[/b<]!

        • vaultboy101
        • 7 years ago

        That made me laugh hard chuckula!

        Truly if AMD had the vision and leadership of Jen Hsun Huang they would be in a powerful position today.

        Lol SharkFINFet reminds me of that 80’s guy in Futurama who asks Fry and the gang who is a shark!

    • Forge
    • 7 years ago

    Bugs me, calling the current GeForces "6 series". The GeForce 6 series I remember fought the Radeon 9000 series, and comprised Nvidia's first PCIe (and later native-PCIe) parts.

    • Omniman
    • 7 years ago

    Glad I bought in recently! It’s to be expected since they’re a very solid company with no debt and actually have lots of hard cash.

    • alwayssts
    • 7 years ago

    I swear I say this not from a fanboy perspective, but as a consumer trying to figure it all out.

    How do I say this in a noninflammatory way… Ok, here goes.

    This goes to show (in accordance with other things, obviously… like huge margins on Tesla) the 'gross margin' on GK104 parts we all knew was happening. It may not be 50% in the consumer space, but I'd be willing to bet it's darn close. Good on 'em for making it work, but… insert Jimmy McMillan quote approximation here.

    No doubt their margin/price will stay the same, probably adjusted with the curve of how TSMC production allows/improves (ie prices will remain flat or drop very slightly) until there is a comparable stack from AMD.

    Better soak up those margins now, as prices will finally adjust come that refresh in Q1, and common sense says they will be big. Even if you disregard the 8800 series rumors (200/270 for OG 7950/7970ish performance) we all know AMD will flood the market, as they typically do, and their record is long-standing on aiming for unit sales over margins.

    Then there is HSA, and even Intel with MIC. While obviously young and not anywhere near the establishment of CUDA, a comparable stack to g104/gk110 derivatives, if not something more flexible may actually gain a foothold next gen. I’m very curious if they will pay dividends this early or not, but I still think nvidia is at their contemporary peak because of the current makeup of options. I highly doubt this current situation is sustainable when other options in the market (be it T3/T4 versus exynos et al) are right on the cusp and they are all more-or-less all-in against nvidia.

    Not saying nvidia won’t continue to do well, but it will be a much more diversified market moving forward with stiff competition establishing itself on all fronts.

      • chuckula
      • 7 years ago

      [quote<]Even if you disregard the 8800 series rumors [/quote<]

      You know, AMD got the 7970 out several months before Nvidia got the GTX 680 out and charged higher prices initially that had to come down later. Guess what? Given that situation, Nvidia still made a boatload more money. So the 8XXX Radeons launch first (they likely will launch first)… how will things be any different than the last round? Another point: the 8XXX Radeons will not be anywhere near as big a jump in performance since they are still using 28nm, and the same will be true for Nvidia, so first-to-market is not as big a deal for these upcoming chips.

      [quote<]Then there is HSA[/quote<]

      Which is nothing more than a marketing buzzword until we see something concrete. OpenCL vs. CUDA in the HPC world? The answer is CUDA, and the Top-500 list bears this out.

      [quote<]even Intel with MIC[/quote<]

      MIC is a legitimate competitor for a subset of HPC tasks, but it is not a 1:1 replacement for a compute-GPU, and Nvidia has a big infrastructure advantage with CUDA. Not to mention that MIC is not a competitor at all in the graphics arena.

      • tviceman
      • 7 years ago

      Drops in 600-series GeForce prices will be offset by Quadro recovering and Tesla ramping up more.

      • jjj
      • 7 years ago

      Well, you are way off on many counts here:
      - Those are overall margins. Pro has huge margins, then comes Tegra. On what we buy for our desktops they do have solid margins, and then there is what they sell to OEMs, where margins are quite low.
      - AMD messed up big time this year with high prices, and they have the lesser product this year, losing a lot of share. In Q3, AMD's GPU segment (which includes consoles) was 342 mil, while Nvidia's Q3 (offset by one month) was 739.6 mil, and that includes 66 mil from Intel but not consoles or Pro (Pro was 220 mil, and there is some GPU revenue in the consumer segment too, from consoles). Point being that AMD doesn't have the scale to beat them on price; as it is, AMD's GPU business makes little or no profit.
      - For the Pro segment, they managed to push it for now; AMD has to first show some results and some market share gains, and those might not come soon enough.
      - Exynos is a really bad example as a competitor for Tegra. All other phone makers avoid it: they don't want to help Samsung be even stronger, and they don't want Samsung to know what devices they are making a year ahead of launch. In this space Qualcomm is huge and doing great, Mediatek is pretty big but mostly targeting China, TI is getting out, ST-E is in trouble, Broadcom wants to grow but is not yet able to get into the high end, and Marvell gave up, for now, on the high end and mostly focuses on China. Overall, Nvidia is doing pretty well compared to the competition. Ofc this is Nvidia, and they will have bad cycles and good cycles; they'll be surprisingly good at times and yet mess up often enough.

    • Silus
    • 7 years ago

    NVIDIA has good management and people that know a good investment when they see one, and these results are glaring proof of that.

    They entered a highly competitive market a few years ago and many were already claiming their doom, because their first Tegra wasn’t on many products (as if any company in the world enters a highly competitive market and makes a killing on all the other competitors…). But as a company that knows how to execute and execute well, NVIDIA did what it needed to and now has a very successful chip in very popular products and will bring even more to the mobile market in the form of Tegra 4.
    The Tesla business is also doing very well, although it should have more competition from now on, especially from Intel, and the professional market is doing great also, though there they have always dominated.
    The GeForce products are the only ones where NVIDIA doesn't have a clear upper hand. They are tied with the competition, but the 600 series are formidable graphics cards, which, coupled with a stronger brand, makes NVIDIA quite comfortable in this segment.

      • travbrad
      • 7 years ago

      [quote<]The GeForce products are the only ones where NVIDIA doesn't have a clear upper-hand. They are tied with the competition, but the 600 series are formidable graphics cards, which coupled with a stronger brand makes NVIDIA quite comfortable in this segment.[/quote<]

      Yep, and it also helps that their die sizes are a bit smaller than AMD's this generation, which presumably makes them cheaper to produce. There is one area where AMD will soon have the upper hand though, and that is next-gen consoles. Nvidia will have no presence in that market at all anymore. I'm not sure what the profits/margins are for console chips though. For AMD's sake I hope they are decent.

        • shaq_mobile
        • 7 years ago

        I don't know how true this is, but I heard that the hardware for consoles sells at regular price from day one, so most of the consoles are sold at a considerable loss. However, as time passes, they still contain the same old hardware. So AMD will see profit for most of it; Microsoft and Sony will foot the bill for the first year or two. Dunno if that's how it works, but that's what I heard for the 360 and PS3.

    • ronch
    • 7 years ago

    Perhaps AMD should have agreed to let Jensen be CEO when they approached Nvidia for an acquisition.

    • flip-mode
    • 7 years ago

    AMD’s response:

    [url<]http://www.quickmeme.com/meme/3rpn6m/[/url<]

    • Chrispy_
    • 7 years ago

    I know AMD is making big improvements but Optimus is still superior.

    Outside of a Trinity APU, I would choose Nvidia for a laptop.

      • beck2448
      • 7 years ago

      Totally agree. There is a reason professionals use Nvidia 85% of the time.

    • chuckula
    • 7 years ago

    Hey, I’ll have you know that I totally drink the SemiAccurate Hate-Koolaid and I know for a fact that all of Nvidia’s products are just relabeled GT8800s. These numbers are just lies spewed by Nvidia to hurt Charlie’s feelings!

      • Deanjo
      • 7 years ago

      I don't understand, how can this be? Charlie has been saying for years that AMD and intel would snuff out nvidia in no time….. ;D

      • BestJinjo
      • 7 years ago

      You gotta take Fudzilla, Semi-accurate, OBR and similar websites with a grain of salt. GTX600 is a very successful series from a performance/watt point of view and is a money maker for NV since it’s a sub 300mm2 die for $300-500 GPUs.

      At the same time, looking at NV's revenue/earnings, they don't make that much in their desktop division. This round, most of the growth is coming from notebook Kepler design wins, professional graphics (Quadro and Tesla) and mobile chips for smartphones/tablets (Tegra 3). When most people look at NV's earnings, they automatically correlate them to desktop GPU success. If you look at the details:

      The “professional solutions” division saw revenue rise 12.4%. And consumer products, which includes the “Tegra” microprocessor line for tablets and smartphones, saw revenue jump 36%, and 28% year over year, to $243.9 million.

      The professional solutions makes a lot of $$ even compared to the desktop GPUs. They have done very well securing lots of wins for Cray and Oak Ridge as well as places like Amazon I bet will use K10 Tesla products.

      This generation, AMD delivered better price/performance across almost all price segments since early summer 2012, while offering constant game bundles throughout the generation and higher overclocking, and yet NV continues to do well. To me this says a lot more about NV's customer loyalty than about how much better NV products are than the competition. Take into account that it took NV more than 6 months to launch sub-$300 desktop GPUs. If AMD had been 6 months late with its GPU launch or had worse price/performance than NV for the majority of this generation, they would have lost millions. NV doesn't even need to beat AMD to sell more GPUs; most people would buy them for drivers, PhysX and brand name over AMD in the first place. Next gen, I think NV will regain the performance crown as they have more die size to work with. That's going to put a hurt on AMD's graphics division even more.
