Epic Games founder discusses the PC as a gaming platform

The folks at TG Daily have had a little chat with Epic Games big daddy Tim Sweeney about the current state of the PC as a gaming platform. As the founder of Epic Games and the mind behind the Unreal Engine, Sweeney is well-placed to discuss the situation, and he has some interesting insight.

According to Sweeney, the hardware industry’s fascination with outrageously expensive solutions like three-way SLI is a “terrible mistake,” and industry marketing people are misguided in believing that PC gamers are a small population with lots of disposable income. He adds that “it is very important not to leave the masses behind,” especially since PCs are more popular today than they’ve ever been in the past.

With that said, Sweeney thinks the true problem affecting the PC industry today is scaling:

You cannot go and design a game for a high end PC and downscale it to mainstream PCs. The performance difference between high-end and low-end PC is something like 100x. . . . If we go back 10 years ago, the difference between the high end and the lowest end may have been a factor of 10. We could have scaled games between those two.

60% of PCs around today don’t have a game-worthy graphics processor, Sweeney explains, which puts PC gaming in a “weird position.” He believes Intel integrated graphics have never worked and will never work for games, but that the blurring line between graphics processors and general-purpose processors could be the answer:

If you look into the past, CPU makers are learning more and more how to take advantage of GPU-like architectures. Internally, they accept larger data and they have wider vector units: CPUs went from a single-threaded product to multiple cores. And who knows, we might find the way to get the software rendering back into fashion.

Then, every PC, even the lowest performing ones will have excellent CPUs. If we could get software rendering going again, that might be just the solution we all need.
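
To make "software rendering" concrete: it simply means the CPU, rather than a dedicated GPU, computes the color of every pixel. The toy rasterizer below is an illustrative sketch (not anything from Epic's engines) that fills a single triangle on the CPU using a barycentric inside-test, one pixel at a time; the wider vector units Sweeney alludes to would process many pixels per instruction instead.

```python
# Toy software rasterizer: fill one triangle into a CPU-side framebuffer.
# Scalar, one-pixel-at-a-time version for clarity; a SIMD renderer would
# evaluate the same edge functions for several pixels per instruction.

def edge(ax, ay, bx, by, px, py):
    """Signed area of edge (a -> b) vs. point p; the sign says which side p is on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(width, height, v0, v1, v2, color):
    """Return a width x height framebuffer (rows of ints) with the triangle filled."""
    fb = [[0 for _ in range(width)] for _ in range(height)]
    if edge(*v0, *v1, *v2) == 0:
        return fb  # degenerate triangle covers no area
    for y in range(height):
        for x in range(width):
            # Inside if the point lies on the same side of all three edges.
            w0 = edge(*v1, *v2, x, y)
            w1 = edge(*v2, *v0, x, y)
            w2 = edge(*v0, *v1, x, y)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                fb[y][x] = color
    return fb

# Fill the upper-left half of an 8x8 buffer.
fb = rasterize_triangle(8, 8, (0, 0), (7, 0), (0, 7), 1)
```

Real renderers add perspective division, depth buffering, and texture sampling on top of this loop, which is exactly where the 100x CPU/GPU gap opens up.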

AMD plans to integrate a graphics processor core into its next-generation “Fusion” processor in 2009, and Intel’s next-gen Nehalem architecture, due to show up later this year, will also feature a built-in GPU. Of course, Intel is also expected to break into the discrete graphics market with its “Larrabee” product in 2009, which could eventually lead to more potent integrated graphics solutions from the company. Meanwhile, the newly formed PC Gaming Alliance wants to push for the democratization of game-worthy hardware, and its members include both AMD and Intel.

Comments closed
    • A_Pickle
    • 12 years ago

    Frankly, what he says is a little bit of BS. The PC would BE the gaming platform if devs started going for intelligent and effective piracy counter-measures, and ease of use in their games.

      • l33t-g4m3r
      • 12 years ago

      consumers need DRM like Americans need the Patriot Act.
      …….
      shameless plug: Takeover: The Return of the Imperial Presidency and the Subversion of American Democracy is a very excellent book.
      won the Pulitzer Prize.
      must read.

      • Krogoth
      • 12 years ago

      Again, piracy is not the primary culprit.

      Sure, it always has been and always will be a problem. But the developers and publishers have vastly overblown its impact. They do it because it is a great scapegoat for the real issues.

      In a nutshell, the PC is becoming marginalized as a gaming platform, while gaming consoles are evolving into gaming PCs with standardized hardware. Demographics and times have vastly changed from what they were back in the 1980s to the early 2000s.

    • YvonneJean
    • 12 years ago

    So many people in these comments are saying that, even for a PC gamer, consoles would make more sense. Hello? Any of you remember that genre called MMOs, like World of Warcraft? How about RTS? When those two types show up on consoles, with a keyboard/mouse, I will abandon my computer as a gaming platform. Until that day, remember that there is more out there than FPS and RPG games, and that some of us have no choice but to game on the computer.

      • Vrock
      • 12 years ago

      There’s no reason MMO games can’t be played on consoles, and they don’t need a mouse. RTS, that’s another story, but it’s still doable with some kind of custom controller.

        • swaaye
        • 12 years ago

        ever look at an MMO’s GUI? Check out all that tiny text and all those itsy-bitsy icons. These games have more information in their UI than Windows does. TVs are going to have a tough time with that, especially since the console market is absolutely not 100% HDTV-equipped. The first sign of consolitis is 16pt fonts in the UI.

          • l33t-g4m3r
          • 12 years ago

          Phantasy Star Online. More MMOs should be like that.

          • green
          • 12 years ago

          that would be more a function of the game designer taking advantage of the number of keys/buttons available
          it’s definitely not as effective a platform for involved mmo gaming, but it’s still passable.
          eg. you might have items bound 0-9. on console you instead use a button to ‘scroll’ through those items to select. the other possibility being sub-menus
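
The scroll scheme green describes can be sketched in a few lines. The class and method names below are invented for illustration; they are not from any real game's input code:

```python
# PC vs. console item selection: direct key bindings vs. one cycle button.

class ItemBar:
    """A hotbar of usable items, selectable two different ways."""

    def __init__(self, items):
        self.items = items
        self.selected = 0

    def press_key(self, slot):
        """PC path: keys 0-9 each map directly to one slot."""
        if 0 <= slot < len(self.items):
            self.selected = slot
        return self.items[self.selected]

    def press_cycle_button(self):
        """Console path: a single button advances through the slots, wrapping."""
        self.selected = (self.selected + 1) % len(self.items)
        return self.items[self.selected]

bar = ItemBar(["potion", "scroll", "bomb"])
```

Direct binding is O(1) and needs no UI; the cycle button costs up to N presses per selection, which is exactly the usability tax green is pointing at.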

    • quarantined
    • 12 years ago

    I’ve been playing games on the PC this whole decade and the story is always the same: things are smooth for a short while until a new game or group of games shows up and chokes any piece of hardware you throw at them. This happens on the PC and not so much on the consoles because on the PC, the platform isn’t standardized. There is no clear-cut way to define the next generation from the last one, apart from API versions. There’s always a lot of grey area in there.

    But since the Radeon 9700 Pro, I thought GPUs would eventually advance to the point that they would last far longer than what seems like a 12-24 month upgrade path. Now I’m convinced I was dead wrong about that. I can’t imagine the hardware ever staying out in front of the software long enough for what is considered “high-end” to become mainstream.

    So who’s the culprit in this? Is the software really that demanding, or is it that a lot of software these days is developed with the reassurance that hardware is always moving forward, so less money can be put towards developing heavily optimized software? I think the latter plays a bigger part in it more often than not.

    I can look at 3 recent games and take note of how they perform at 1920×1080 with high settings on an 8800GTS 640MB card. COD4 looks amazing and runs well at these settings. UT3 doesn’t run nearly as well or look nearly as amazing. Crysis can barely move. I mean, I find it peculiar that these three games are roughly in the same technological class, yet there is such a wide range in performance. I just have a feeling that sloppy programming is at play because it’s “too expensive” and there’s never enough time to expect anything better.

      • MattMojo
      • 12 years ago

      I would mostly agree with what you are saying, but the problem as a whole for the PC is that, starting with the 1st-gen GeForce, hardware manufacturers create “new” features to set their product apart — yet the hardware won’t even perform those features very well. I mention the 1st-gen GeForce because that is when I first saw it happen (AA) — it forced AA on to get “better looking graphics” and completely killed any effective gameplay, because the card was not powerful enough to run with AA at ANY resolution — that is when driver hacks/tweaks really came to the front. The PC has always been the “next gen” platform and it will always be that way, but what happens all too often is the “next gen” hardware touts “I can do x, y & z,” and then programmers like John C. call them out with the next engine and guess what… it plays like shit — that is, until the next “next gen” hardware comes out. I place the blame solely on hardware makers (nVidia, when are we going to have complete Vista drivers????!!!!)… stop pushing your crap and make a better product. Most would say “what about the 8800 series?” — and I respond, “those were meant for DX10, and guess what? It plays DX10 like SHIT!”

      Mojo

    • green
    • 12 years ago

    so basically he’s complaining that UE3 isn’t able to run on the 10s/100s of millions of el-cheapo computers that are in or headed for places like libraries, or dumbed-down office computers/terminals, due to the graphics hardware not being high-powered enough…
    anyone else see a flaw in that?…

    i’m not saying he’s completely wrong though
    when you think about it, top end titles now take 2 years+ to develop
    during that time 2 generations of video cards can pass by
    when development time stretches out to 5 years you’re essentially getting left behind

    i’m thinking the gpu performance curve is gonna get more shallow soon
    the power vs performance curve is now a lot on the ‘uncomfortable’ side
    gpu performance growth per generation used to increase a lot
    it’s slowing down a fair bit and using up a lot more power
    kind of like the way cpus were going for a while

    • Krogoth
    • 12 years ago

    I agree 100% with Sweeney on the current state of PC as a gaming platform.

    It has outgrown its market to the point that it can no longer sustain itself. Gaming consoles have caught up in many areas. The last piece is just a standardized keyboard/mouse combo.

    It is no surprise that big developers and publishers are favoring gaming consoles.

    I do heavily disagree that developers should use software rendering for 3D graphics. CPUs are so much weaker than modern performance GPUs at graphics. It is like trying to make a 486DX66 perform as well as an overclocked QX9770 at 4.0GHz. Even low-end, integrated GPU solutions still put modern CPUs to shame. The games that do not require fancy graphics already use software rendering. So the whole argument is a moot point.

    My thought on the bitterness over UT3’s lackluster sales: it didn’t do well because the freaking game was pulled out of the oven ten minutes before completion. The game mechanics are down and solid. The other 20% of what matters to the end-user (UI, MP implementation and tweakability) is completely unpolished.

    It does not help that UT3 had some competition at its launch.

    • ish718
    • 12 years ago

    Console gaming will always be way bigger than PC gaming.
    PC gaming is just too expensive and you have to upgrade often.
    PC games most of the time are unoptimized and run crappy with average graphics. Here’s my conclusion: if you only use a PC for gaming then it is a waste, unless you got the cash to burn, and unfortunately most don’t.
    Since I personally use my PC for gaming, net surfing, work, music, and movies, it somewhat justifies paying lots of cash for PC parts.

      • Krogoth
      • 12 years ago

      If you do nothing but game, then a performance gaming PC is not quite as much value as it used to be almost a decade ago. Gaming consoles have finally caught up and gained most of the advantages that PCs had enjoyed for years.

      IMHO, a gaming PC is only worth it if you want greater versatility.

        • ish718
        • 12 years ago

        Hmm, I never played PC games back then, I was all consoles. I recently got into PC gaming like 2 years ago, but I know almost everything about PC gaming now.
        PC gaming is a bitch though, cuz the games are more likely to have problems like driver issues or just bad optimization, period. At least with consoles, developers have a specific platform to design games for, and thus better optimization with console games.

      • Meadows
      • 12 years ago

      Run crappy with average graphics? PC games have had the best graphics available for more than a decade now.

        • ish718
        • 12 years ago

        Never said anything about PCs not having the best graphics. Of course they have the best graphics, Sherlock; they have the latest GPU technology.
        I was talking about unoptimized games running crappy even with average graphics.

    • FireGryphon
    • 12 years ago

    Software rendering… I still think 256-color games (Jazz Jackrabbit!) were more fun to look at than the junkers out these days.

    • no51
    • 12 years ago

    They’re just pissed off UT3 tanked. First it was pirates, now it’s hardware.

      • floodo1
      • 12 years ago

      #20, surprised you’re the first to see right through Sweeney and realize that he’s just bitter.

      EVERY reason that their game failed is due to them 🙁

      How can you believe this guy when he’s talking about software rendering? Is he smoking crack? Why would we want to use a general purpose processor (CPU) for something as specific as rendering?

        • Kharnellius
        • 12 years ago

        Actually UT3 is outpacing the previous UTs. Not sure why you think UT3 is a failure. ???

          • Krogoth
          • 12 years ago

          That is very wrong.

          UT99 and UT2004 have done far better than UT3 in terms of sales. The current online population on UT3 versus its predecessors speak volumes.

          FYI, my private UT3 server almost never sees any activity, even on weekends!

    • clone
    • 12 years ago

    I have to agree the original Unreal looked great for its time and, overall, had quite a long-lived experience behind it.

    I still include Unreal Tournament on every PC I sell, and most of the buyers will try it and continue to play it… Integrated graphics up until just recently have been despicable, worthy of little more than contempt.

    I have managed to play Warcraft III, which was never really a demanding game even in 2002, just barely on the 690G chipset.

    The new ATI 780 shows the most promise, but to be honest I believe ATI should have gone higher, to the 3650 XT, and raised the price of the motherboard, at least as an option… I know for certain I would have been able to sell it if it were within $25 of the 780.

    I also believe, on the dealer level, there would eventually have been a push for these motherboards as an inexpensive gaming option that has been missing for the past 7 years in integrated solutions.

    The 780’s hybrid graphics option, and the improvements it receives directly from better CPUs, has actually piqued my interest in getting a quad-core AMD, which just prior to the 780’s release wasn’t even a mild consideration while I waited for the next revision in hopes of more speed to compete more directly with Intel’s quad.

    For the first time in a year I’m seriously interested in AMD, only because of the 780 motherboard.

      • Chaos-Storm
      • 12 years ago

      I think they need to release a second version of the motherboard, with x3650xt graphics, and somewhat higher cost. This way, people have a choice.

    • l33t-g4m3r
    • 12 years ago

    a quick google search shows that there are numerous software rendering engines out there.

    §[<http://www.radgametools.com/pixomain.htm<]§ §[<http://www.transgaming.com/products/swiftshader/<]§ Even though they exist, I haven't really seen them implemented in games. Maybe nvidia's mafia-like control of the market has something to do with it. ("the way it was meant to be played") The industry also seem to be shoving SLI and super high end CPU's down our throats recently too. (crysis/ut3) I'm glad they are slowly coming back to reality, and realized that they are only hurting themselves by marketing games/hardware to such a small demography. Before now they've only been blaming their poor sales on piracy: UT3/crysis/(insert game/crappy tech demo here) sales are suffering from piracy !!!1111one1elventy!! more like nobody can play it, its only a graphics tech demo, game play sucks, drm is too draconian, people are sick of beta quality games with little or no future support, nobody cares anymore, pc gaming is dead/moved to consoles, etc. Another thing that I think has recently started to hit games is the 2GB memory limitation, and instead of creating massively bloated games, developers have to start being more efficient. Anyway, it really seems like this gaming alliance could actually revive pc gaming, and bring it back to reality where it belongs.

      • AliceCooper
      • 12 years ago

      I agree that Crysis is a bit too far ahead. I’ve got 2x NV8800 GTS in SLI and frame rate sucks @1680×1050. This is why sales of Crysis are crap.

      TBH COD4 is head and shoulders above Crysis in playability and frame rates and is a great game to play.

      With regards to consoles it takes nearly 5 years for the developers to make games that actually look good enough to play. TBH I’ve seen some crappy games on the 360 and PS3 not to mention the lack of keyboard controls.

      I buy consoles for my kids but for me PC gaming is the only way to go.

      Finally look at the price of console games compared to PC games. Even the Wii games are £40 and in Wii sports you get cartoon images with no arms and legs!

    • swaaye
    • 12 years ago

    Well, I can say that I don’t know anyone who is interested in putting down the $$ to get a triple-SLI setup for gaming.

    But let’s not forget that the IHVs have a very sweet mid-range assortment right now. These cards aren’t any more expensive than a Voodoo2 was back in the day. I think some folks get caught up on not having the fastest. There’s nothing wrong with there being an extreme luxury high-end product. They exist in every other industry.

    • ChronoReverse
    • 12 years ago

    Well, before this, the GPUs integrated into chipsets were lower than low-end. The 780G is at least equal to a discrete low-end card.

      • Peldor
      • 12 years ago

      The 780G is close but not quite as fast as the current low-end offering, the 3450.

      Sweeney’s point though is that Intel completely rules the integrated market and he’s right. He’s also right that Intel integrated is terrible at games.

      But he’s dead wrong that Intel’s integrated stuff always will be broken for games.

    • pluscard
    • 12 years ago

    Actually, didn’t AMD just put a full gpu in the 780 chipset?

      • titan
      • 12 years ago

      Yes, it’s a whole working GPU. Nobody in their right mind would put half a GPU on a board. It wouldn’t work. Lol!

      In all seriousness, most on board graphics are just powerful enough to push the desktop processing. That’s the case with the 780G as well. It does, however, support Hybrid CrossFire, which allows the use of a discrete graphics card and the on board graphics in a CrossFire configuration. The drawback here is that both the discrete and on board have to be in the same class. So, no Hybrid CrossFire with a 3870×2 and the 780G.

      • Flying Fox
      • 12 years ago

      Yeah, but wake me up when they put anything other than a low end GPU in there.

        • cegras
        • 12 years ago

        I thought ‘hybrid-fire’ was supposed to remedy this issue?

          • Flying Fox
          • 12 years ago

          Only a little. If you put that low end IGP with another low end card, you get slightly less than the midrange card. If you want high end graphics, you get a high end GPU and the IGP doesn’t help much (but it can help 2d mode on the power consumption front), might as well just get a chipset with no IGP and stick with a discrete high end card?

    • Vrock
    • 12 years ago

    It’s not so strange to see Sweeney talking about software rendering….IIRC, the original Unreal had a full-featured software renderer that performed quite well and looked very nice.

    Software renderers allowed people who didn’t have the latest gee-whiz graphics card to at least play the game. That’s not such a bad idea, is it?

      • Meadows
      • 12 years ago

      It’s why integrated graphics have gotten stronger over time, but even then, they’re rarely enough to “just play”. One needs an entry-level video card or a hefty CPU advancement for this.

      • bdwilcox
      • 12 years ago

      Do you remember the 3dfx Voodoo Anubis demo where you could flip back and forth between software and hardware rendering? It was literally like night and day. I would be rather skeptical that any current or near-future CPU could handle bi-linear/anisotropic filtering/anti-aliasing at high-resolutions in real-time, especially with high-poly scenes.

      That’s the second time today that Meadows got the jump on me when I didn’t hit refresh.

        • Vrock
        • 12 years ago

        Again, I point to Unreal, a game which still looks pretty nifty even today. The software renderer, while not as snazzy as running in Glide, did a fantastic job and made it possible for those without $300 Voodoo2 cards but with some CPU power to enjoy the game.

        That’s really what Sweeney is talking about here: the ability to provide a decent gaming experience on PCs without 3D graphics hardware. Since games have gone 3D-only, they don’t run at all on some systems, or run very poorly. Why exclude a huge segment of the market this way?

          • trinibwoy
          • 12 years ago

          The problem is that 3D engines and our expectations have far exceeded the increase in CPU performance since the days of Unreal. Heck, even the latest and greatest GPU hardware hasn’t lived up to our expectations.

          It’s going to be a long road to CPU rendering if it ever comes. We’re getting an 8-core Nehalem next year. We have a 320 “core” GPU today. Normalize that for clock speed and you still have a vast head start in parallelism on the GPU. In the end a CPU that would make a useful GPU will look a lot more like a GPU than a CPU. I really want to see where Intel is going with Larrabee.
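
trinibwoy's clock-speed normalization can be put into rough numbers. The clocks and SIMD width below are assumed round figures for the sake of the exercise, not vendor specs:

```python
# Back-of-envelope parallelism comparison: clock-normalized "lanes" on each side.
# All four constants are assumptions chosen as plausible round numbers.

cpu_cores, cpu_simd_lanes, cpu_clock_ghz = 8, 4, 3.0   # 8-core CPU, 4-wide SSE, ~3 GHz
gpu_sps, gpu_clock_ghz = 320, 1.5                      # 320 stream processors, ~1.5 GHz

# Aggregate throughput proxy: scalar lanes times clock, in "lane-GHz".
cpu_lane_ghz = cpu_cores * cpu_simd_lanes * cpu_clock_ghz   # 8 * 4 * 3.0 = 96
gpu_lane_ghz = gpu_sps * gpu_clock_ghz                      # 320 * 1.5  = 480

ratio = gpu_lane_ghz / cpu_lane_ghz   # GPU lead after normalizing for clock speed
```

Even granting the CPU full 4-wide SIMD on every core, the GPU keeps roughly a 5x lead in raw lane throughput under these assumptions, which is the "vast head start" being described.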

          • WaltC
          • 12 years ago

          I agree with you about Unreal being a very nice-looking game in its day–and in terms of the game itself I think it is head-and-shoulders above any sequel made to Unreal. I found Unreal refreshingly original and delightful to play, but unfortunately I don’t think Epic agreed with me as every succeeding game bearing the Unreal title has seemed to be little more than a warmed-over love fest prostrating itself in worship to the spirit of id’s Quake 3…;) Obviously, the guys at Epic thought imitating id’s path as opposed to forging their own to be the better way. IMO, the only Unreal game for me is still Unreal, and Epic has yet to match the experience or the depth of that game.

          But I well remember my experimentation with Unreal’s software renderer, and I surely can say a bit more than “it wasn’t as snazzy” as the Glide version…;) The software renderer absolutely sucked visually compared to the 3d Glide version, and more importantly than that it ran far, far slower, such that there was really no comparison graphically between the two versions of the game, and of course no comparison as to how the two versions actually allowed you to play and enjoy the game.

          Speaking of that–cpu power is exponentially greater now than it was then. So why did Epic stop doing software renderers? It wasn’t from a lack of cpu power, was it? No–I think the decision to drop software rendering came when Epic realized that 3d rasterizers were advancing exponentially, too, and that the quality and playability of 3d just blew the doors off of the fastest cpu software renderer Epic could contrive. I don’t see that changing–ever.

            • Vrock
            • 12 years ago

            Hmm, my experiences with Unreal’s software renderer were different. Tell you what: I’ve a retro box with Unreal installed, running a Pentium 3 650E and a Voodoo3. I’ll run flybys on both renderers and compare scores and subjective quality.

            I don’t mean to imply that software rendering is just as good as 3D hardware rendering, only that it could be a viable way of getting games to run on lower-end machines with decent quality.

            • Peldor
            • 12 years ago

            Also try UT2004 (which also comes with a software renderer) on the system of your choice. It’s awful.

            • Vrock
            • 12 years ago

            Does the demo come with the software renderer? If so I’ll try it.

          • MattMojo
          • 12 years ago

          I believe the original Unreal engine relied heavily on MMX for its software renderer. I remember reading a snippet in MaximumPC about how it was able to handle a 4×4 square of pixels in the same pass vs. a 1×1 with a non-MMX-capable processor. But it did nothing for lighting — which was one of the primary focuses of 3D accelerators back then — and it (MMX) did nothing for larger texture maps (better image quality).

          Mojo
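
The batching described above is the core of the MMX win: touch several pixels per step instead of one. This sketch only mimics the batching pattern in plain Python (real MMX packs pixels into 64-bit registers and operates on them with single instructions); the function names are made up for illustration:

```python
# Scalar vs. batched pixel processing, using a simple darken filter as the work.

def darken_scalar(pixels, amount):
    """One pixel per step: the non-MMX path."""
    return [max(0, p - amount) for p in pixels]

def darken_batched(pixels, amount, width=4):
    """Process pixels a small batch at a time: the MMX-style path.
    Same result; the speedup on real hardware comes from one instruction
    covering the whole batch, which Python cannot show directly."""
    out = []
    for i in range(0, len(pixels), width):
        batch = pixels[i:i + width]          # grab the next 4-pixel group
        out.extend(max(0, p - amount) for p in batch)
    return out

row = [10, 200, 5, 130, 90, 40, 255, 0]
assert darken_scalar(row, 20) == darken_batched(row, 20)  # identical output
```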

            • swaaye
            • 12 years ago

            It actually does a sort of texture filtering. It looks better than what a Matrox Mystique could do. LOL. But, I believe it doesn’t do alpha transparency well at all.

            • Vrock
            • 12 years ago

            Okay, it turns out the software renderer on Unreal doesn’t perform quite as well as I remembered. Still, I was able to get an average of 34fps at 640×480 in software mode, running on a Pentium 3 650E. Very playable, and it looked pretty good.

            For giggles I tried Unreal Tournament 99 on my main machine (Athlon 4600+ X2 on WinXP) and I was able to run it at 1280×960, 32 bit color, all details high, and average 35-40fps using the software renderer. And again, it looked pretty darn good.

            • Krogoth
            • 12 years ago

            I already did something similar back earlier to see how far CPUs have gone.

            UT99 runs software rendering at 1280×1024 at 60-100 FPS on my Q6600 @ 2.93GHz. The graphics look OK, but even UT99’s crappy Direct3D renderer had superior graphical quality to software mode.

            • Vrock
            • 12 years ago

            I was pretty impressed with the software rendering graphics. Sure, they’re a bit pixelated, but not ugly by any stretch of the imagination.

        • Meadows
        • 12 years ago

        Yep, that’s me. 😉

    • bdwilcox
    • 12 years ago

    He wants to go back to software rendering… I can’t believe I just heard that. Who knows, maybe voxels will make a comeback. You listening, NovaLogic? LOL

      • mortifiedPenguin
      • 12 years ago

      I think the idea is that it’s software rendering with a “twist,” like AMD’s Fusion project. I gotta admit though, the very idea that it’s software rendering is rather… disconcerting, to say the least.

        • tfp
        • 12 years ago

        If the CPU could handle software rendering with the same performance and quality, and I wouldn’t have to shell out 200+ bucks for a good graphics card, I would be all for it.

    • Meadows
    • 12 years ago

    This is the hybrid rendering we need, “software” and hardware, not raster and ray-trace.

    They also need a way to figure out what to do with idle GPU shader processors, if there are any, much like they should figure out what to do with free CPU cycles.
