What’s next for PC gaming?

If you’re reading this, you’re probably a PC gamer. You’ve likely invested a decent amount of money in a fast graphics card, a decent-sized monitor, and more cheap RAM than you really needed. I’m willing to bet you’ve also played some of the latest shooters on that gaming rig of yours.

If my description fits you, then you must have realized that your PC can carry much bigger loads than the lightweight Modern Warfare engine and its ilk. The sad truth is that today’s games are developed with six-year-old consoles in mind, and they look the part, too. High-end gaming PCs are roughly an order of magnitude more powerful than the Xbox 360 and PlayStation 3. Playing Modern Warfare 3 on the PC is a bit like taking a Ferrari to go grocery shopping; as flashy as it might look, the resources at hand are being woefully underused.

None of that should be news to you. The question is, what happens next?

Epic Games Technical Director Tim Sweeney said in September that Unreal Engine 4 won’t be ready ’til “probably around 2014.” Speaking to Develop the following month, Epic President Mike Capps noted, “I want Unreal Engine 4 to be ready far earlier than UE3 was; not a year after the consoles are released. I think a year from a console’s launch is perfectly fine for releasing a game, but not for releasing new tech. We need to be there day one or very early.”

Unless there’s some miscommunication inside Epic, those two statements tell us the successors to the Xbox 360 and PlayStation 3 won’t be out until late 2013 or early 2014. That’s a long time to wait with PCs getting more powerful and game developers still forced to target the same old platforms. However, I don’t think that means we have to suffer continuing stagnation in PC graphics for the next two years. There’s plenty that can be done to improve visual fidelity without tessellating everything and soaking images in photorealistic shader effects.

Mainly, I’m talking about five little visual eccentricities we’ve been living with for far too long—eccentricities that fast PC hardware could eradicate while we wait for the next generation of games.

  • Jaggies. That’s a big one, and some games have already made good progress toward eliminating jagged edges, but we ain’t there yet. Ideally, multisampling and post-process techniques like FXAA need to be combined to offer accurately smoothed polygon edges and to eliminate jaggies inside and around transparent textures. Some games already do that, but it needs to become the norm rather than the exception.
  • Shimmering. That one is trickier but just as bothersome. In all too many titles, distant textures shimmer as you move around the game world. Sometimes, it’s because of inadequate antialiasing. Sometimes, even cranking antialiasing and anisotropic filtering all the way up doesn’t help. It needs to be fixed. Can you imagine watching a Pixar movie where some parts of the scene shimmer as the camera pans around? Unthinkable.
  • Vsync. Unfortunately, vsync is the red-headed stepchild of the PC gaming world. Hardcore gamers leave it off because it can induce low frame rates and input lag, and we reviewers keep it disabled in our benchmarks because, if we didn’t, we’d be obscuring the differences between graphics solutions. I’m not saying those aren’t good reasons to leave vsync off, but the fact remains: screen tearing is an ugly thing that needs to go away. Again, imagine a Pixar movie; now think how it would look with screen tearing in action scenes.
  • Frame rates below a constant 60 Hz. Playing id Software’s Rage earlier this year was something of a revelation. Here was a game that prioritized fluidity of motion above all else, adjusting graphical detail dynamically in order to offer consistent smoothness. And it was good. There’s nothing more frustrating than walking into a wide outdoor scene and having your frame rate plummet all of a sudden. I saw that all too much in Skyrim, which was buttery smooth almost everywhere and frustratingly choppy when I was trying to take in the view from a mountaintop. That’s no good. Movies don’t have smooth parts and stuttery parts. Neither does real life, and neither should video games. Some of you might interject that 30 Hz is a good enough target, since that’s the norm for TV, but I disagree. If you’re playing a fast-paced action game, you want quick-twitch responsiveness in the blink of an eye, or quicker.
  • Microstuttering. This kinda ties into the 60 Hz thing, but it’s a distinct problem. Even if a game does manage to pull off a 60 FPS average frame rate, that’s no guarantee the illusion of motion will be preserved. Multi-GPU solutions especially have problems with microstuttering, but single-GPU solutions aren’t immune, either. You’ll want to peruse Scott’s Inside the second article for all the specifics and watch the video here to see the real-world implications. In a nutshell, microstuttering is difficult to measure but easy to see; it damages the illusion of motion by having objects skip across the screen instead of gliding smoothly. It needs to disappear one way or another (see the sketch just after this list for how frame-time spikes hide behind a healthy average). I hate to bring up the Pixar movie analogy again, but I don’t recall any skipping or jittering in The Incredibles.
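Curious readers can get a feel for those last two problems with a few lines of code. Below is a minimal sketch of the frame-time bookkeeping that exposes microstuttering; renderFrame() is a hypothetical stand-in for a game's render loop, and the tools behind Scott's article capture the same per-frame timestamps far more rigorously. The telltale sign is a comfortable average coexisting with ugly worst-case frame times:

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    using clock = std::chrono::steady_clock;
    std::vector<double> frameMs;  // duration of each frame, in milliseconds

    auto prev = clock::now();
    for (int frame = 0; frame < 600; ++frame) {
        // renderFrame();  // hypothetical stand-in for the game's render call
        auto now = clock::now();
        frameMs.push_back(
            std::chrono::duration<double, std::milli>(now - prev).count());
        prev = now;
    }

    std::sort(frameMs.begin(), frameMs.end());
    double sum = 0;
    for (double ms : frameMs) sum += ms;
    double avg = sum / frameMs.size();
    double p99 = frameMs[frameMs.size() * 99 / 100];  // 99th-percentile frame time

    std::printf("avg: %.2f ms (%.1f FPS), 99th percentile: %.2f ms\n",
                avg, 1000.0 / avg, p99);
    // A 99th percentile far above the average is microstuttering in a nutshell:
    // the FPS counter looks healthy while motion visibly skips.
}
```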

I think those are the big ones. Rage already got us part of the way there with a hard 60 Hz target and beautifully effective vsync. Now, other games need to follow suit and iron out the other kinks mentioned above. I certainly hope AMD and Nvidia will push developers in that direction, too. After all, extra graphics horsepower can be put to good use making games look smoother, cleaner, and more seamless—graphics horsepower that would otherwise go unused… or, more crucially, un-purchased. Yes, I know about PhysX, stereoscopic 3D, and PC-only DirectX 11 eye candy, but the GPUs that come out next year and the year after that will no doubt have the brawn to handle those things with cycles to spare.

Of course, if my wishes are fulfilled, then we’ll be in an interesting position when the next-gen consoles do come out. If Epic’s Samaritan demo is any indication, future titles will take another step toward photorealism. I expect hardware requirements will suddenly spike up, but does that mean we’ll be forced to trade silky smooth, shimmer-free graphics just for a taste of all the eye candy future games can throw at us? I certainly hope not. I hope next-gen titles will manage to offer smooth, distraction-free imagery with an added dose of realism. Otherwise, what would be the point? Photorealism with screen tearing, shimmering textures, and microstuttering wouldn’t be photorealism at all.

Comments closed
    • kamikaziechameleon
    • 8 years ago

    *How about a better, more unified platform?*

    I think publisher-based distribution networks are bound to fail and fracture the consumer market too much. EA might make a better margin on each sale with Origin, but if that service is bad for the community, who cares? Short-term gains at long-term detriment to the user base will drive people away from the service and the platform. Money made today but lost tomorrow.

    We can at best have two major rallying entities; since Impulse fell to GameStop and GFWL is complete crap, we are left with Steam. Yes, hardcore audiences will continue to exist in fractured communities and forums, but those of us who game at most 20 hrs a week appreciate having all our friends and games and community features in one place.

    • kamikaziechameleon
    • 8 years ago

    I agree; the fact that the PC has so much hardware yet these things still aren't a given in console ports is annoying. Native 1080p on the PC is pretty nice, but the rest of these should be a given as well. Microstuttering might be the most difficult and annoying of them, while the rest have relatively simple solutions that have existed for a long time.

    • focusedthirdeye
    • 8 years ago

    “bit like taking a Ferrari to go grocery shopping”
    Awesome analogy! I have 16GB of 1600MHz RAM... I bet nothing has ever used even 6GB. Is there anything I can do that will use 16GB? Will games ever be able to use this much RAM? – Doug Rochford

    • Beomagi
    • 8 years ago

    I think we need to stop looking at 60fps as a holy grail. While human reaction times are slow, you can time key presses much, much more precisely than that pokey frame rate allows.

    We need to start pushing 120fps and higher as a standard, including on the monitor. Vsync on such displays, given enough GPU power, should net no tearing at a full 120fps, with much-improved input lag. Visually, it's an indescribably smooth experience. Your eyes may not fully react to contrast changes at 120fps, but your brain does appreciate it.

    • ish718
    • 8 years ago

    What would happen to PC gaming if there were no console titles being ported to PC?

    TR’s “Best Game of 2011” poll consists of almost all console ports…

      • yogibbear
      • 8 years ago

      Then The Witcher 2, Deus Ex: HR, and Battlefield 3 would have been more dominant, and we would have had to add options like Minecraft, Terraria, The Stanley Parable, etc.

    • clone
    • 8 years ago

    What’s next for PC gaming?

    A continued decline, of course, for various reasons: initial cost, lack of titles, the forced upgrade path, alpha-to-beta title launches, and DRM, pretty much in that order.

    4 years ago I gave my nephews my computer, a dual-core Opteron 170 with 2GB of RAM and a 6600GT video card. They bought DarkSpore this year and needed an upgrade.

    I didn't want to give my nephews a new system, but DarkSpore needed it... that's piss-off number 1.
    Piss-off number 2 is that DarkSpore was sold in beta form and didn't work at all; it gave a message that the game wasn't ready to launch and that I should come back later, instead of saying it needed a patch. Two tries later, I looked on the web and saw "ahh, beta bullshit"... "stupid game." That's piss-off number 3. Then, for piss-off number 4, the patching was huge: 43 separate patches that took more than an hour to complete. And then came the DRM and registration, which obviously is piss-off number 5. The whole time, I was wishing they had just bought it for the Xbox 360 or PS3 so that I wouldn't have to be involved at all. Essentially 5 hours of a day of my vacation lost getting 1 stupid video game to work on the PC.

    So here I sit using my old Opteron 170 with a 6600GT and 2GB of RAM, while my nephews have my system for a stupid $50 game they will likely stop playing in a week.

    Yes, I do understand why PC gaming is on the decline, and I am finding it harder and harder to stay interested in it.

      • Dashak
      • 8 years ago

      I’ve never had a problem remotely close to that. But then, I purchase games well after their launch dates.

      • BobbinThreadbare
      • 8 years ago

      You got pissed off because a midrange card from 2004 couldn’t run a recent-ish game?

      Are you also mad a PlayStation 2 can’t run it?

      • homerdog
      • 8 years ago

      I’m sorry, but a 6600GT? PC gaming will be just fine without you, since you obviously haven’t been playing any PC games on that thing since around 2005 anyway.

      • lokii_
      • 8 years ago

      “so here I sit using my old Opteron 170 with 6600 gt and 2gb’s of ram while my Nephews have my system for a stupid $50 game they will likely stop playing in a week.”

      http://cdn.epicski.com/a/a0/a0560b4d_NotSureIfSerious.jpg

    • Geonerd
    • 8 years ago

    Umpteen-core CPUs seem to be the future.

    I’d like to see much improved multi-threading, and opponent AI that is able to take full advantage of this resource.

    Every realtime game I’ve seen that features any sort of computer-controlled opponent could use some help. Make the Cacodemons and enemy soldiers smarter – much smarter. Remember Doom 3? The fantastic environment was utterly wasted because the stupid monsters would jump out at the SAME place each and every time. (I know – this is an old example!) IMO, the FPS experience should be much closer to being dropped into an Alien movie. The bad guys should stalk, ambush, attack, and retreat if wounded. Do the same for race and flight sims, where each driver/pilot has a distinct set of skills and ‘personality,’ and enough CPU cycles to exploit – in a context-correct manner – changing conditions or a specific hardware advantage he may hold at the time.

    All this will take a lot of programming. Maybe someone will write a killer AI library and happily license it, in much the same way that id made zillions renting the engine-of-the-week to 3rd-party FPS developers.

    • Bensam123
    • 8 years ago

    Not entirely sure I agree with all your points. I don't think developers will continue to develop exclusively for consoles, with things like BF3 teasing that little bit of innovation that makes people want more. People and developers are getting tired of the same old crap, and I'm sure they'll start developing for computers, within reason. That will only last as long as it takes to get the next consoles out, though.

    What will change the ball game is when the next evolution of video games comes out and ROFLstomps all the competition into non-existence. We aren’t talking about social networking games, we aren’t talking about WoW 2.0, we aren’t talking about CoD 9, we aren’t talking about pro gaming; We’re talking about something that encompasses, incorporates, engages, and moves the entire genre forward.

    • JLW777
    • 8 years ago

    Back in the P3 800MHz / Riva TNT2 days, I'd get that feeling of inadequacy when my machine wouldn't run the title I just bought, and I'd be forced to look at the game box until I had upgraded the necessary part. I miss it. Upgrading back then was more fun, as the essential upgrade would visibly improve the experience. I game at 1920x1080 until I decide to go multi-screen or 2560x1600. Upgrading now is relatively 'unexciting' compared to back in the day, as many games are ports that don't require cutting-edge hardware to be playable. (I'm not a benchmark freak.)

    • mike941
    • 8 years ago

    As a 25-year-old that grew up with a SNES, Genesis, N64, and PS2, I have to say I'm really satisfied with the current graphics. I look at them and think of how amazing they look. I don't really care about having a blazing-fast gaming rig anymore that costs a lot of money. I want graphics to continue to get better and everything, but for now they look good enough to me. I imagine in 10 years a $100 GPU will be able to handle photorealistic graphics, so I don't intend to waste my money right now.

    • tfp
    • 8 years ago

    If “graphics” fixes (jaggies, shimmering, vsync, frame rates below a constant 60 Hz, and microstuttering) are all they can really improve in PC games (not stories, game content, or anything that actually makes a good game), there is no reason to stay with PCs for gaming.

    How many of the “same” FPSes do people really need to play?

    • otherstuff
    • 8 years ago

    This can be confusing, but NTSC TV is actually perceived as 60 fps.

    NTSC TV is around 30 interlaced fps, but each frame contains two “fields”. A field is half a frame. Every second line in the frame belongs to the “odd” or “even” field. They are shown one after another. So it’s perceived as 60 Hz. PAL TV does the same with 25 interlaced fps, being perceived as 50 fps.

    Films are shot at 24 fps, non-interlaced, and are a lot more jerky; that jerkiness is part of the characteristic film look. If you're looking to make your home video look film-like, just convert it to progressive frames (full frames, the opposite of interlaced) and perhaps adjust the contrast. The film-like effect is dramatic.

    In a film, try looking at a horizontal camera movement and watch the jerkiness; it's very easy to spot. (Remember to turn off your TV's "100+ Hz" feature, since it ruins this, and the film effect as well. Looks awful if you ask me, but maybe that will change in time.) For TV, films are slightly altered to 25-30 fps, but the two fields in a frame still contain the same picture.

      • BobbinThreadbare
      • 8 years ago

      NTSC doesn’t exist anymore, digital TV is 30 fps.

        • sschaem
        • 8 years ago

        No digital TV is 30fps. Try again. NTSC was replaced with ATSC for over-the-air broadcasts, but most of that is 1080i (interlaced) at 60 fields or 720p at 60 frames per second.
        30 FPS broadcast doesn’t exist.

          • BobbinThreadbare
          • 8 years ago

          I was thinking of cable transmission in addition to over-the-air; since we're talking about watching TV, I assume it doesn't really matter how it's received. 1080p at 60fps is not an ATSC standard at all.

          http://en.wikipedia.org/wiki/ATSC_standards#Video

          As far as I know, pretty much all "p" TV is 30 FPS. 60 FPS content is almost unheard of. So even if it's transmitted at 60 FPS, it's probably just duplicating frames.

            • sschaem
            • 8 years ago

            You are correct in saying that ATSC doesn’t define a 1080p 60hz format (though it does define 720p at 60hz).

            Yet 1080i is defined and, like 480i NTSC, includes 60 images per second and needs to be refreshed at 60hz to show them all.

            The difference between 1080p 60hz and 1080i 30hz is that with 1080i the vertical resolution is halved (decimation of odd/even lines).

            People get stuck on the nomenclature of “30hz frame rate” and think the signal is made of 30 images per second.
            30hz interlaced is not a frame-rate reduction but a resolution reduction, relative to 60hz progressive.
            Both signals include the same number of video images, each displayed at 1/60-second intervals.

            Take a 1080p 60hz video, blank out the even rows in even frames and the odd rows in odd frames, and what do you have?
            1080i: a video with 60 images a second that requires a 60hz display to show them all correctly. The only difference: half the vertical resolution.
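In code, the decimation described above is tiny. Here is a minimal sketch, with made-up Row/Frame/Field types standing in for real video buffers; the point is that the output still carries 60 distinct images per second, each with half the rows:

```cpp
#include <cstddef>
#include <vector>

using Row   = std::vector<int>;   // one scanline of pixels
using Frame = std::vector<Row>;   // full-resolution progressive frame
using Field = std::vector<Row>;   // half the rows of a frame

// 60 progressive frames/s in, 60 interlaced fields/s out: field N keeps only
// the even or odd rows of source frame N. The temporal rate is unchanged
// (still 60 distinct images per second); vertical resolution is halved.
std::vector<Field> interlace(const std::vector<Frame>& frames60p) {
    std::vector<Field> fields;
    for (std::size_t n = 0; n < frames60p.size(); ++n) {
        Field f;
        for (std::size_t row = n % 2; row < frames60p[n].size(); row += 2)
            f.push_back(frames60p[n][row]);  // alternate even/odd rows per frame
        fields.push_back(f);
    }
    return fields;
}
```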

            • BobbinThreadbare
            • 8 years ago

            TV shows are filmed at 30 FPS; they might afterward interlace that into 60, or they might not, but it's filmed at 30. That's what really matters. Lots of cable and satellite providers offer 1080p-at-30FPS channels.

            • sschaem
            • 8 years ago

            Most scripted TV shows are likely shot at 24fps for that film-look exposure.
            But the majority of TV content is shot at 60fps. From the morning shows to live events, all captured at 60fps.
            ALL sporting events are shot in 60hz mode in the US (as true today as it was in the ’40s).
            Today it’s either 1080i at 60 fields a second or 720p at 60 frames a second. (Back then it was 480i.)
            Camera CCDs are sampled at 60 frames a second, in interlaced ’30hz’ or progressive 60hz mode,
            always resulting in 60 discrete images displayed per second.

            The reason the industry uses 60fps and not 30fps is mainly that 30fps looks really bad for sports.
            (Hence it’s also bad for action gaming; it’s not just a latency issue.)

            Most networks use the 1080i 60hz format for acquisition.

            You seem to have a misconception of what a ’30hz frame rate’ interlaced signal is.

    • odizzido
    • 8 years ago

    I think the future of PC gaming is indie devs. If you look at a great time in PC gaming, when we had Raptor: Call of the Shadows, X-COM: UFO Defense, and other great games like that, you will notice the indie devs of today are much more similar to that era than the big studios we have now.

    Game companies just grew too much and became scared to release anything creative or different. Indie devs don’t have that problem, and they also don’t use DRM a lot of the time. I think it’s great.

    • travbrad
    • 8 years ago

    Rage was the worst game I played this year, so all its “smoothness” was wasted on a boring, short, generic shooter. The inconsistent graphics quality, blurry textures, and 20GB+ install size weren’t exactly great either. Borderlands ran at 60FPS and that was a great game.

    I’ve also played L4D2 for hundreds of hours at over 60FPS. Admittedly Rage generally has a higher graphics quality than L4D2 (although not always), but what’s the point if the game sucks?

    I agree 30FPS is a horrible experience (especially in a shooter/racing/etc), but I think the biggest problem with games is generic gameplay, not graphics/framerates. Most of them already run at 60FPS if you have a decent video card. The only real exceptions are BF3 and Skyrim (because it only uses 2 CPU cores).

    EDIT: I guess someone prefers graphics to gameplay and is too embarrassed to admit it, huh?

    • streagle27
    • 8 years ago

    It seems the newly-released X-Plane 10 pushes graphics limits, explicitly.

    From my understanding, X-Plane 9 (and prior) were more CPU-dependent and did not take full advantage of GPU capabilities, to the point where it was difficult if not impossible to determine how much GPU RAM was actually being used so as to optimize performance.

    It should be noted that the developers do recommend as many cores as possible and have indicated that each core can run an additional simulated plane with no impact on performance. They've indicated that Macs with 12 cores can easily run 20 planes: your own, plus others in the virtual sky. So it seems this sim does take full advantage of as many cores as are available, not just the standard 3 or 4 that some do.

    From my reading of X-Plane developer blogs, the latest version (10) is now actually programmed to take full advantage of the installed GPU, and seems to emphasize the GPU more than games/apps have in the past. I recall reading somewhere that they recommend a GPU with 2GB of RAM to optimize performance.

    It should also be noted that the developers have indicated that transition to 64-bit code is next on the agenda after some code stabilization. (http://www.x-plane.com/blog/) Here is a quote:

    ” The long term solution is clear – we need to migrate X-Plane to 64 bits. This is not going to be particularly easy or fun.

    On Mac, we have to port all of our platform specific Window/UI setup code from Carbon/C++ to ObjectiveC/Cocoa because Apple doesn’t provide our APIs in a 64-bit variant. Thanks Steve.
    On Windows we have to change compilers, as the one we use now is old and doesn’t build x64 apps.
    Sandy and I will have to make a 64-bit variant of plugins.
    32-bit plugins won’t run in a 64-bit app. As far as I know none of the three operating systems provides a way to bridge DLLs like this, so plugin authors will need to create 64-bit compiled versions of their plugins.

    It’s not a trivial amount of work. But it is what we need to do. It’s crazy to have a video card with 3 GB of VRAM and an app with only 3 GB of address space – in that situation by definition an app could only fill VRAM with textures if it had no code (let alone data). For years the cards, CPUs, buses, everything has been getting faster, bigger, more, except for virtual address space limit, which we have finally smashed into face first.

    There have been a bunch of posts in which I have said “we know we need to do 64 bit” but now we’re saying “it really needs to be on the next train that leaves the station.” We will probably start the porting work once we get the 10.0.x series of patches stabilized.” (end quote)

    Microsoft’s “Flight”, which is the successor to MS Flight Simulator (X) is also being developed but indications are that it requires fewer resources than X-plane for similar frame rates. That would be interesting to see. X-plane seems to have the market cornered on flight realism (an advanced version is available and certified for flight training) while MS seemed to have the market on eye candy, to a point.

    It would be interesting if online GPU and CPU reviewers could somehow use the latest version of X-Plane to test performance in medium and extreme resolution modes.

    If people are wondering about a game to push CPU and GPU limits, for now, getting the free demo of X-plane 10 looks like the way to go.

      • streagle27
      • 8 years ago

      And no, I have no connection with X-Plane other than as a customer. I like doing barrel rolls in jumbo jets just above the runway, and seeing if I can land the shuttle without burning up on re-entry. I do own a few versions of MS Flight Simulator, including the latest, X, as well as X-Plane.

      X-Plane and its demo are available for Macs, as well as Linux and PCs.

      I also have an iPhone and a Droid cell phone, and USED to run Linux.

      The point being, I like what works and am not a fanboi of a particular tech, even though I don't own a Mac... but I used to...

        • dpaus
        • 8 years ago

        “I don't own a Mac... But I used to...”

        And then you took an arrow to the knee?

      • lycium
      • 8 years ago

      lol x-plane, are you kidding me? i had to google that. have you played any of the games that came out in the last, oh, 4/5 years?

      some of those renderings look like the default art assets and examples that shipped with Unity 3D.

        • streagle27
        • 8 years ago

        Actually, no. I have NOT played any games that have come out in the last 4-5 years, unless you include rFactor. I don't usually play 'games', unless you count solitaire or the occasional iPhone racing or flight game. I play realistic racing, flight, or air-combat simulators. X-Plane and FSX fall in the flight-simulator category, while Falcon F-16 and its variants, along with Ubisoft products like IL-2, fall in the air-combat SIMULATION category.

        I might get COD and BF2, but WOW is out.

        To be honest, I'd really like to see an FPS that uses some mechanism allowing a gun controller to be used with a shooter running on a huge HDTV, 37 inches or larger, similar to what you would find at an arcade. Using a mouse to point a gun on the screen seems a bit unrealistic to me.

        Trying to land a jet or another plane on a carrier, racing a car using realistic physics, or having to learn air-combat maneuvering to be able to dogfight another fighter in the sky are what I prefer, just as others prefer using a mouse to maneuver a pointer on a screen to shoot and blow things up.

        I used to like that too, and still do. But I'd prefer holding something close to a real weapon, with the game requiring my aim and reflexes to be good enough to hit a target and survive, just as in real life, i.e., a simulator.

      • clone
      • 8 years ago

      I just watched the trailer for X-Plane 10, and it looks terrible... brutally dated. I've always thought that flight sims would be hard on graphics because of the FOV, and man, does X-Plane 10 prove that theory... if it's actually pushing graphics hardware, that is.

      It looks 7 years behind the latest tech.

      As for PC gaming, I see it declining as more lucrative opportunities crowd it out of the developer space.

      They released that Spyro game that allows consumers to buy avatars to play in the game... I see games selling for $20 or less but recouping all the coin and much more in accessories bought later... the PC doesn't do well in that field.

      The PC's last advantage is keyboard/mouse, and I'd expect that to become almost standard on the next consoles as they push harder for permanence in the living room and as secondary devices.

        • streagle27
        • 8 years ago

        If you really thought that X-Plane version 10 looked ‘terrible’, then Microsoft would shudder to think what you thought of their graphics in Flight Simulator X.

        The world in X-Plane is pretty generic except for airport locations, etc., while FSX focused on the world being as it is, with airports and buildings where and how they really are in real life. X-Plane focuses on flight realism, while the graphics have actually improved. I use v9, and the graphics are not bad, certainly better than FSX's.

        As far as I know, X-Plane 10 is the only 'game' out there that actually uses as many cores as are available on the system, as well as as much GPU power as is available, though CrossFire and SLI are currently not supported. It doesn't make much sense to support two or more video cards when a program can't even take advantage of all the memory on a single one.

        Personally, I'd like a gun-type interface to become standard for FPSes now that people are gaming on displays a lot bigger than 21 inches diagonal.

        Isn’t this tech available right now?

    • bthylafh
    • 8 years ago

    64-bit executables, and at least DirectX 10 support for all new games; it’s annoying that all my new Steam purchases have to attempt installing the latest DX9 runtime.

    Someone below mentioned the extra RAM cost of 64-bit executables. It’s not double; ISTR it’s closer to 30%, and besides, RAM is /cheap/. The big stumbling block to 64-bit IMO is Microsoft itself, for having made Win7 and now Win8 come in 32-bit versions, and OEMs for having shipped the stupid thing.

    • lycium
    • 8 years ago

    I wish they’d do a modern Ultima game… Ultima IX was just awful.

    • ish718
    • 8 years ago

    Jaggies, shimmering, vsync, frame rates below a constant 60 Hz, microstuttering…

    Welcome to Unreal Engine?

    • Chrispy_
    • 8 years ago

    We need better content-creation tools.

    We cannot push PC gaming hard enough because getting pretty art into high tech game engines takes up a phenomenal amount of time and money.

    • TEAMSWITCHER
    • 8 years ago

    I want stereoscopic 3D standard on all PC games, and a real 3D standard for hardware in the industry. Every PC game I have ever played was a 2D projection of a 3D game. The Nintendo 3DS, for all its faults, is the best gaming platform available today. Every jump to a moving platform, plasma dodge, and race curve is far more enjoyable in 3D. There is a real sense of position that allows games to be more challenging, and simply better. The PC needs to move in this direction to deliver the next great advance in gaming.

    • blitzy
    • 8 years ago

    I would like VR to go ahead another step; there are already semi-decent headsets with six-degrees-of-freedom head tracking. They just need higher-resolution LCDs and higher refresh rates, and it would be really awesome. Combine that with controls similar to PS3 motion controls or the Wii's: awesome immersion.

    I'm sure the next-gen Unreal is going to look killer... but mouse and keyboard can only go so far toward realism, so I am looking further past that.

      • --k
      • 8 years ago

      Razer Hydra would have been a good next step beyond mouse/kb but dev support is lacking.

      I'd like to see better AI and higher PPI. Something like QFHD resolution combined with natural language spoken and understood on the fly (not scripted) would be an amazing leap forward for games. Imagine walking around in an RPG and the NPCs actually changing topics on the fly without it being canned.

      There was a scene in Deus Ex: HR that stuck out: I went through the police station taking out a few of the people inside, left and came back, and people were acting like nothing had happened.

      http://www.rockpapershotgun.com/2011/11/21/game-logic-vs-choice-consequence/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+rockpapershotgun%2Fsteam+%28Rock%2C+Paper%2C+Shotgun%3A+Steam+RSS%29

      A lot of new ideas are coming out of academia, but sadly game companies usually go for the tried-and-true IPs.

    • joselillo_25
    • 8 years ago

    A better and more fun control system than WASD+mouse.

      • Firestarter
      • 8 years ago

      Well, you could switch to ESDF :p

        • Generic
        • 8 years ago

        Gasp!

        Been using it since Q2 to gain another column of buttons usable without moving my hand. The little bump on the home key F is nice to return to as well.

          • Meadows
          • 8 years ago

          Why you need the bump is beyond me.

            • Generic
            • 8 years ago

            Your bewilderment regarding the bump’s utility is beyond me.

            • derFunkenstein
            • 8 years ago

            so you don’t have to look down.

            • bthylafh
            • 8 years ago

            Precisely. I use the little nubs all the time to get my fingers on home row while I’m still reading what’s on screen.

            • Meadows
            • 8 years ago

            I don't have to look down either way. By habitually aligning my thumb with left Alt and, from there, my middle finger on W, I can start either playing or typing without ever looking.

            I know precisely what the marks on the F and J keys are for; I just could never understand *how* anyone needed them.

            • yogibbear
            • 8 years ago

            Agreed.

            • derFunkenstein
            • 8 years ago

            my thumb goes to the space bar on my keyboard, but I guess it’s just a matter of “to each his (or in your case, her) own”. 😉

            • Meadows
            • 8 years ago

            The correct term is “his”.

            My thumb does alternate between space and left Alt, and presses space most of the time, but I still need that Alt key to *position* my hand correctly. (And to switch to my umpteenth weapon in weapon-crazy shooter games, or umpteenth ability in role-playing games, or simply to switch tasks in Windows.) Space is a rather spacious key, and I'd be full of errors if I reached straight for it like you seem to recommend.

            • Firestarter
            • 8 years ago

            This, resting my thumb on the alt-key feels needlessly cramped. I do use that key for a weapon from time to time, specifically the ‘nade-launcher in Q3 🙂

            • Meadows
            • 8 years ago

            I repeat once more, I said nothing about “resting”. I’m trying to get through to all you people, try giving me credit for that.

            • kamikaziechameleon
            • 8 years ago

            Silly; it's so you can type without moving your left hand from home, then game from the same position. It's great, actually. Wish I'd thought of that a long time ago.

      • Krogoth
      • 8 years ago

      Because key and mouse button bindings are serious business!

        • Meadows
        • 8 years ago

        Stop commenting. You’re a waste.

          • Krogoth
          • 8 years ago

          Still mad bro?

    • ShadowTiger
    • 8 years ago

    I appreciate your wishful thinking, but I don’t see anything in the current incentive structure for video game developers to work on difficult problems like these.

    More of the same… here we come!

    • dragosmp
    • 8 years ago

    PC gaming is just fine. There have been many interesting games this year, just as before. There were many bad games, but also a lot of fun games that were not AAA. Things that happened before will happen again.

    What PC games need is a new way to make money in order to reduce the purchase price, which is frankly ridiculous at over $20. It has gotten so expensive to buy a PC game that we actually think of it as an "investment" and count the number of hours it lasts to see if the time/$ ratio is worthwhile; Angry Birds is free if some non-annoying banners run in a corner. I would enjoy a GRID game where cars, pits, and stands are full of banners like on any race day. It should at least be an option.

    • Vrock
    • 8 years ago

    What’s next for PC gaming? Increased marginalization and a slow decline, sadly. Bring on the thumbs downs.

      • yogibbear
      • 8 years ago

      You’re welcome.

      • Vrock
      • 8 years ago

      Oh, I forgot. Lots more casual games like Farmville and Bejeweled, and lots more PC fanboys whining about DRM and 'consolitis'. Occasionally there will be some new eye-candy feature that will make the fanboys ooh and aah, and they will scream "MOAR GRAFFIX!!!!" across the internets.

      • paulWTAMU
      • 8 years ago

      I don’t know, the last 2-3 years have been, in many ways, awesome for PC gaming (as well as console games). More quality titles than I’ve had time to play. About the only real complaint is the increased focus on multiplayer when it comes to FPS games and a degree of graphics stagnation. But two complaints don’t equal marginalization and decline to me.

      • derFunkenstein
      • 8 years ago

      The truth, she cuts deep.

        • Vrock
        • 8 years ago

        They aren't willing to accept the truth, for accepting that PC gaming is in decline means the PC as they love it is in decline. Without gaming, there's little need for anyone to upgrade a PC made in the last decade. On the extreme end, my dad gets by just fine on a 1.33GHz Athlon with 768MB of RAM and a GF 2 MX; on the more normal end, my dual-core Athlon at 3.1GHz with 4GB of RAM has served me well for almost 5 years.

        Without gaming, the PC is just another appliance, and appliances aren't exciting. How many refrigerator or washing machine enthusiast websites do you see out there, after all? Yeah, I miss the late '90s, back when PCs were exciting and it took some passion and knowledge to get the most out of your box, but those days are gone, and the novelty of the games they spawned has worn thin.

          • Krogoth
          • 8 years ago

          You can argue the same about gaming consoles. The Wii is really the only "gaming" console left; the 360 and PS3 are just dedicated gaming PCs with a predictable hardware platform.

          IMO, what is happening is that gaming in general has become the new "Hollywood": rehashes of major franchises that follow the same formulas, with a few flicks that try to break the mold but are either fiscally lackluster or outright flops.

          We are past the golden ages of PC and console gaming.

            • Geistbar
            • 8 years ago

            I'd agree with this, with the caveat of it applying to AAA games. Indie developers are still making some very interesting and fun games. Most of them have much more limited appeal, but of course, that's the whole point! If you appeal to everyone, your game ends up quite bland. I've also enjoyed some Eastern European games, such as The Witcher 1 & 2 and the STALKER series (though I suppose that one, at least, is done for). I guess those would fall into "A" games: well budgeted, but not "Hollywood" budgets. And I can't forget Paradox's stuff either.

            There are lots of amazing games still being made, but you can't look to the Biowares or ids or even the Blizzards of the industry to get them anymore. Those developers will still make popular and successful games, but they won't be like the golden age either. The big publishers are going to be stuck playing it safe, and maybe they'll even make bucketloads of money doing so, but those of us with more specific tastes are still able to get games that appeal to us; we just need to hunt a bit harder. I know I was quite pleased with finding Avadon to scratch my RPG itch, and Europa Universalis is much more in-depth and entertaining than the Total War games have been since Rome.

            I can’t speak for console games though, all I have of the modern era is a Wii, and that’s mostly because I love Smash Bros, Zelda and Metroid. I believe indie Xbox Arcade games have been fairly successful though, and I expect it’s not too different for the PS3 equivalent. AAA games are a bigger deal on the console end though, so it’s entirely possible they will get more “blandness” than PCs, if only because of market forces.

            tl;dr The AAA golden age has ended, but fun games will still be made for PCs and consoles.

          • Geistbar
          • 8 years ago

          I think you just have a bitter dislike for people who enjoy PCs, both in general and for gaming. Which seems rather odd, considering the website we’re on, but who am I to judge? I don’t hugely disagree with your basic ideas, but you seem to be intentionally phrasing them to be inflammatory, which is rather needless I think.

          Yes, PC gaming is probably going to see less representation from the AAA publishers; and yes, casual games like Farmville or $1 iOS games are going to become more and more popular, taking up a larger part of the market. And even yes, the need for growth in PC hardware performance is going to decline, and has been for a while.

          None of that means the hobbies associated with PC enthusiasm will become pointless. I can't remember if it's Damage or Cyril (or perhaps even someone else; truth be told, I don't usually check the author tag for articles), but one of the main authors for TR will occasionally talk about an HTPC he has. PC enthusiasts will just move from tweaking their setups for maximum gaming performance to making them the best for dedicated tasks. You can build an HTPC, or a NAS, or plenty of other setups that I haven't thought of.

          If PCs are going to drift into the same appliance niche as a toaster or refrigerator, then they'll do so, or would have done so, regardless of PC games. Do you even think that the entirety of PC enthusiasm is gaming-focused? I'm sure there are plenty of posters here that don't care about games; you probably won't see them in this topic (why would they view it?), but they exist. I might use my PC for games, but that's not why I'm interested in this stuff. I love learning about all the various hardware (I even chose my degree to learn even *more*!), and I enjoy picking out computer parts as an optimization problem: how do I get the best computer for a set amount of money? Not everyone likes looking at problems and trying to find the optimal outcome, but it won't be just me either.

          To take the overused car analogy: there are still tons of car enthusiasts out there, but not only is their hobby even more expensive, it's already gone through all the cycles where a lot of the changes you can make aren't really necessary. They're still car enthusiasts all the same; they just found different things within their cars to toy around with.

          In essence, I think you have some right ideas, but use them to reach the wrong conclusion.

            • Vrock
            • 8 years ago

            Right, I just dislike people who like PCs. You pegged it.

            I was building PCs before this website we're reading existed. Some people are quasi-religious about their hobby. This hobby has always been that way, though, which is one of the many things that's turned me off about it.

          • paulWTAMU
          • 8 years ago

          Are you kidding? I look forward to the day when PCs are entirely an appliance. They're expensive (at least expensive-ish), and I LIKE the fact that they can last longer now; my gaming rig is several years old and going strong with just a GPU upgrade.

    • Arclight
    • 8 years ago

    The article feels empty; it's like it has been said a thousand times before, and it's not like you said it better. We PC gamers all know the situation has gotten very dire in the last few years. I, for one, have reverted to old games I used to enjoy instead of playing these console ports.

    But I wanted to ask: what about color correction? To me it seems pretty awesome and desirable enough to have in any game. For example, I saw a mod for BF3 that turns on color correction in-game (I'm not talking about Sony Vegas effects).

    Here:
    http://www.youtube.com/watch?v=Z01j0RxdUkQ

    Disclaimer: I'm not affiliated in any way with the person(s) that did this mod. I don't know if the mod is legal or not; I will not be held responsible for any negative repercussions of using it.

    Here is another, with the same disclaimer:
    http://www.youtube.com/watch?v=pVSDlveS9P0&feature=plcp&context=C3775dabUDOEgsToPDskL9GUxzYeFMeGFmGOKj6MfO

    BTW: Happy New Year!

      • Cyril
      • 8 years ago

      “But i wanted to ask, what about color correction? To me it seems pretty awesome and desirable enough to have in any game. For example i saw a mod for BF 3 that turns on color correction in game (i’m not talking about Sony Vegas effects).”

      I believe color correction is a standard feature in a lot of modern game engines. Battlefield 3 already does it out of the box; from what I can tell, the mod you linked to just tweaks the parameters.

      See step 18 in the rendering pipeline described here:
      http://youtu.be/20ZYkYiCdrg?t=4m50s
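For the curious, the saturation knob of such a color-correction pass boils down to very little code. Here's a minimal CPU-side sketch of what a GPU pass of this sort does per pixel; real engines typically bake grades like this into lookup tables, so treat the function below as an illustration, not any particular engine's API:

```cpp
struct RGB { float r, g, b; };

// One knob of a color-correction pass: lerp each pixel between its luma
// (grayscale) and its original color. sat = 0 gives grayscale, 1 leaves
// the image untouched, and values above 1 oversaturate.
RGB adjustSaturation(RGB c, float sat) {
    float luma = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;  // Rec. 709 weights
    return { luma + (c.r - luma) * sat,
             luma + (c.g - luma) * sat,
             luma + (c.b - luma) * sat };
}
```

A full grade layers contrast, lift/gamma/gain, and the like on top, but they're all per-pixel transforms of the same shape, which is why engines can fold the whole stack into a single lookup table.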

        • Arclight
        • 8 years ago

        “I believe color correction is a standard feature in a lot of modern game engines.”

        That may be, but major triple-A titles still lack it (cough, CoD, cough).

          • yogibbear
          • 8 years ago

          COD isn’t a AAA title on the PC platform. It may be so on consoles but it is definitely B grade at best on the PC.

          • sschaem
          • 8 years ago

          Seems silly to me... "Man, LOTR looks so gritty in some scenes; I wish I could boost the color saturation and brighten the whole thing."

          But this is an option I could see driver developers adding to please whoever finds it desirable...
          They already do it for video playback; they could use a similar interface for game playback.

      • Meadows
      • 8 years ago

      Why would a Saturation knob be illegal?

        • Arclight
        • 8 years ago

        When it comes to EA/DICE, you start to expect the unexpected (I mean that in a bad way). It still baffles me that they are the ones coming up with awesome games (BF2, BF2142, BFBC2, BF3), all plagued by bugs, flaws, and poor design decisions (no in-game server browser, WTF?), and yet we soldier on and play their games for the good parts they offer.

    • Hurstmeister
    • 8 years ago

    I'm an old-timer going back to the 286/10MHz days. I started to get obsessive about graphics and upgrades around the time the Voodoo1, and then Voodoo2s in pairs, could be had. I was spending on average $1500-$2000 a year, upgrading every 6 months with CPUs, GPUs, memory, main boards, and whatever the fastest HDD that could be had. This went on up until Intel's Core 2 Duo chips and the Nvidia 8800 cards came out.

    Video games drove this obsession because I was addicted to the eye candy and couldn't bear to not play a game with everything on full tilt. I can still remember fondly overclocking my P166MMX to 266MHz and taking the 233MMX to 300MHz. Then, when the 300A Celeron came out, I thought 450MHz was flying. I even had a Voodoo3 3500 with a TV tuner. The Nvidia Ti300, the ATI 9800... man, I loved those cards. When the AMD Athlon 700 could do 1GHz with the pencil mod, I was hooked on AMD chips right up until the Core 2 Duo chips came out. Whatever the flavor of the week was, I was hooked on it. My systems were strictly business. I didn't care about flashy cases too much but always made sure I had a PSU that could handle things inside.

    I am now finally on the verge of upgrading this system, which originally started out 5 years ago as a Core 2 Duo 6600, an Abit X38 Max main board, an Nvidia GTS 512 (G92), 2x 75GB WD Raptors in RAID, and 4GB of 1066 DDR2. It's now sporting a Core 2 Quad 9650 at 3.8GHz, a GTX 580 GPU, and 8GB of 1066 DDR2 CL5 RAM. I believe this is the final stage of this system before I need to upgrade.

    5 years is a long time for me to still be on the same main board, but I honestly haven't had a reason to really upgrade. The video games haven't pushed me to step up to an i7. I honestly wish developers would step it up a notch. I would gladly build another system to keep pace.

    • malicious
    • 8 years ago

    The game industry's obsession with eye candy is a big reason why PC games have made depressingly little progress over the past 10+ years, with ever-increasing portions of new titles' budgets going to making pretty pictures instead of better AI, interesting new gameplay, etc.

    Are people seriously playing through the same tired FPS game modes and having more fun because of higher graphics fidelity? Better graphics are nice, but they're well down the list of problems causing PC gaming to stagnate.

    I think this article provides a more relevant list of issues, especially with regards to developing games for PC and consoles. Notice how the relative rendering power of PCs versus consoles is absent.

    http://arstechnica.com/gaming/news/2011/08/ars-guide-how-to-ruin-your-pc-port-in-five-easy-steps.ars

    • indeego
    • 8 years ago

    Meh. I pay a "middle" price for a machine that does everything I want ($1500 every 3 years seems about right). Gaming is a plus, but the days of rushing out to buy A+++ titles are long gone. A+++ titles are either released buggy or full of 14-year-olds I wouldn't want souring my gaming experience anyway.

    I'm fine playing indie titles for $3 and yesterday's A+ titles for $5-10. They are patched and ready to go. I'm paying mid-range prices for incredible systems, and I will again 3 years from now. I don't care if the games catch up to me or not: they will eventually get on my machine.

      • derFunkenstein
      • 8 years ago

      That’s fine until you want online multiplayer and the player pool is long gone.

        • paulWTAMU
        • 8 years ago

        Not all of us do though, and it sounds like indeego is one that doesn’t (same here mostly).

          • derFunkenstein
          • 8 years ago

          For the most part I agree, I don’t do multiplayer a ton but that’s the draw for a lot of those new releases.

            • paulWTAMU
            • 8 years ago

            which sucks; I'd like more focus on the single-player portion of FPSes 🙁 I like an FPS or two a year, but I don't dig multiplayer.

            • indeego
            • 8 years ago

            Hopefully Skyrim and Deus Ex start an SP revolution. Doubtful, though, given the BF and CoD franchise opportunities...

            • paulWTAMU
            • 8 years ago

            Well, single-player RPG-style games aren't uncommon. But good single-player FPSes? That's a different story.

    • yogibbear
    • 8 years ago

    If the intent of this article was to postulate the next big developments for PC gaming in the next 5 years, as per my interpretation of the opening paragraphs and the title, then I think, Cyril, you've framed it quite narrowly. The beauty is that if you had asked someone into gaming 5 years ago whether they would have an SSD in their PC today, and whether it would make a HUGE difference in the performance of various tasks and in some specific instances within gaming, you probably would have found very few who even knew what an SSD was.

    The beauty of PC gaming is that someone somewhere will come up with another brilliant idea, and it will happen. If you had told me 5 years ago I'd have over 200 games on Steam, an SSD, access to GPU-supported Eyefinity/3D Vision (rather than, say, a TH2GO solution), 3D gaming, PhysX that actually does something (Batman: Arkham City, I'm looking at you) rather than being a gimmick, GOG.com, Skype on my iPhone 4, the ability to stream 1080p videos from my PC to my phone and my TV (yeah, I know this was possible prior to the 5-year limit, but it is basically plug-and-play today, whereas before it wasn't), and a whole bunch of other stuff I'm forgetting and can't be bothered to think about while I'm extremely hung over...

    Anyway, I think your bullet-point list is extremely limited in vision, and there will be way more progress made than you think in terms of additional features that we didn't even know we needed or would want, but they'll come. Making a list this limiting is scary. I would have thought there would be some lofty goals around making use of bigger HDD capacities (i.e., bigger textures or something), making use of all the power we have in multi-core CPUs (i.e., designing game engines with better threading capabilities), DX12/13 and beyond, the death of the UE3 engine and its FOV and mouse-accel abuse, standardized networking/better-designed networking (even just rolling back to how it used to be done would be better than most of the slapped-together ported networking that we get from so-called "blockbuster" games), etc.

    TLDR: If you can think of it, it can probably happen. Aim higher.

      • Vivaldi
      • 8 years ago

      This is one of the more insightful posts made on this article; not sure why it's being down-voted.

      It's vital that innovation in all areas continues to be pushed, as we have no idea how future systems, the widening of performance envelopes, and the expansion of available resources will be utilized.

      Warning, tangent:

      The idea of limited vision yogibbear mentioned made me think of the various arguments ISPs use in the United States, namely, "customers don't need anything faster to check email and surf the web," "5Mbps should be good enough for anyone," "it's too costly to deploy FTTH," etc. Meanwhile, cable providers continue to milk their customers (whose only other choice is a crappier DSL provider, for the most part), while speeds haven't really changed for anyone and fiber-to-the-home rollout is still virtually non-existent. Instead of growth, we have stagnation in the form of data caps, and we continue to fall further down the global speed rankings.

      Who knows what kinds of emergent technologies would surface if 1000Mbps connections were being deployed? Likewise, who knows what the impact on, in this case, gaming will be with faster, bigger, stronger, more efficient software and hardware?

      I am *not* trying to hijack the thread into the merits/failures of capitalism, ISP quality, etc, just simply:

      “If you build it, they [it] will come.”

      • Geistbar
      • 8 years ago

      Cyril’s post (as well as one of my comments below) seems to be aimed at fixing structural or institutionalized problems with the gaming industry. Basically all of the issues he mentioned have been around for at least the better part of a decade, and show no signs of going away, because developers aren’t interested in solving those problems.

      Your post seems to be aiming more at the "what can we add that's awesome?" factor, and not the "why haven't we fixed this yet?" factor. Maybe I'm reading you wrong, but I do feel you're trying to answer a different question.

        • yogibbear
        • 8 years ago

        I think his bullet list is crazy, though. If you say the requirements of future games will not outpace PC tech resource availability, then okay, let's say they work on these things...

        We've already got triple buffering, 16x CSAA, etc. I wouldn't mind them looking into how proper triple buffering (not render-ahead) affects microstuttering behaviour, and whether designing a game with triple buffering as the norm (i.e., allowing use of the VRAM available on your GPU for triple buffering rather than higher-resolution textures) removes the issue of microstuttering AND the issue of vsync AND the issue of shimmering. (Which nicely ties into the 64-bit address-space issue also being discussed in the comments.)
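For reference, here's a single-threaded sketch of the policy "proper triple buffering" refers to, as opposed to a render-ahead queue. The bookkeeping is purely illustrative; real swap chains live in the driver and aren't exposed like this:

```cpp
#include <cstdio>

int main() {
    int frameId[3] = {-1, -1, -1};      // what each buffer currently holds
    int drawing = 0, ready = -1, displayed = -1;

    for (int tick = 0; tick < 8; ++tick) {
        // The renderer finishes a frame into 'drawing'; it never waits.
        frameId[drawing] = tick;
        ready = drawing;
        for (int i = 0; i < 3; ++i)     // next, grab whichever buffer is free
            if (i != ready && i != displayed) { drawing = i; break; }

        if (tick % 2 == 1) {            // pretend vsync fires every other tick
            displayed = ready;          // newest complete frame wins...
            std::printf("vsync shows frame %d\n", frameId[displayed]);
        }                               // ...and stale frames are simply dropped
    }
}
```

Because vsync always grabs the newest complete frame and drops the rest, the renderer never blocks, which is how this scheme sidesteps the input lag usually blamed on vsync with a render-ahead queue.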

        Yes, maybe I leaned a little too hard on the "what happens next?" segment.

        To me, the bullet list is extremely game-engine dependent; putting all the blame on the GPU developers is a little unfair.

    • squeeb
    • 8 years ago

    Why mention that crap MW3 and not BF3? Unbelievable visuals... we still have at least a few devs who push the limits.

      • curls
      • 8 years ago

      +1

    • sschaem
    • 8 years ago

    “Some of you might interject that 30 Hz is a good enough target, since that’s the norm for TV, but I disagree”

    TV is 60hz in the US. It's been this way for the past 75 years. Pong was 60hz; games on the Atari 2600 were 60hz.

    Also, we have to remember that a real-world camera uses an 'infinite' frame rate:
    to build one frame or field, the CCD or film is exposed for the duration of the frame.
    Games today render the photons at a single moment in time, not over the exposure time from one frame to the next.
    High-end video cards could expose virtual refresh rates and emulate film exposure. (Interestingly, this can reduce CGI shimmer effects.)
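A minimal sketch of that idea, accumulating sub-frame renders across one exposure interval. renderAt() is a hypothetical stand-in for "render the scene as it looks at time t," not a real API, and real engines approximate the same effect more cheaply with velocity-based motion blur:

```cpp
#include <cstddef>
#include <vector>

struct Pixel { float r, g, b; };
using Image = std::vector<Pixel>;

// Average several sub-frame renders across one frame's exposure interval
// (e.g. t0 to t0 + 1/60 s). Moving objects smear naturally, much like film.
Image exposeFrame(Image (*renderAt)(double), double t0, double dt, int samples) {
    Image acc = renderAt(t0);
    for (int s = 1; s < samples; ++s) {
        Image sub = renderAt(t0 + dt * s / samples);  // later instant in the interval
        for (std::size_t i = 0; i < acc.size(); ++i) {
            acc[i].r += sub[i].r;
            acc[i].g += sub[i].g;
            acc[i].b += sub[i].b;
        }
    }
    for (Pixel& p : acc) {                            // normalize the accumulation
        p.r /= samples; p.g /= samples; p.b /= samples;
    }
    return acc;
}
```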

    Back to 30hz: it can cause some bad image blurring during smooth motion.
    I personally think 60hz is the sweet spot, but modern TVs even go as far as 120hz refresh using synthesized, motion-estimated images.

    So besides latency, 60hz looks crisper. Not always needed, but nice for action games.

    The same is true for sports. Basketball and hockey at 24fps (or 30fps) look so wrong; almost everything in motion becomes a blur when the camera pans left and right to cover the plays.

    All in all, 60hz with a 1/60-second exposure would be the pinnacle.

    … The thing I see as needed going forward on consoles: better texture compression and paging.

    I think this would enable a new level of game engines:

    BD drive + 32 GB SSD (16 GB used as a dedicated BD cache) + 4 GB RAM + 1 GB VRAM (becoming four levels of texture cache)
    Better HW texture compression support (using virtual texture memory)
    FX-8150-class CPU (with some HW acceleration for data decompression, used when loading real-time data)
    7950-class GPU (including special HW decompression for vertex data)

      • havanu
      • 8 years ago

      I’m sorry, but what are you smoking? 30 FPS is what the man said. Hz refers to how your screen is powered.
      Oh, and the broadcast standard has been 30 FPS in the US (25 FPS in Europe) for ages.
      For film it was 24 FPS in Europe as well as in the States.
      Only now are they starting to double it, with the arrival of HD content (50p and 60p).

        • yogibbear
        • 8 years ago

        +1, how can someone be on a tech site and not understand the difference between power socket frequency and screen image refresh rate?

          • Thrashdog
          • 8 years ago

          …except he’s right, sort of. An NTSC analog signal contains 30 full frames per second, but those frames are interlaced so that there are 60 discrete updates to the image every second. With progressive-scan digital signals that distinction has pretty much gone away, but before the digital switch-over, televisions did indeed update at 60 Hz.

            • sschaem
            • 8 years ago

            Thanks for having some sense 🙂

            For the rest, let me explain how a typical camera sends NTSC video, to maybe shed some light on the process of 30 Hz frame encoding.

            The CCD is 720×480 and refreshes at 60 Hz, with the exposure set to 1/60 of a second.
            60 frames are sampled per second, but to reduce signal bandwidth, the CCD is read out on odd and even scanlines alternately.
            The signal stream thus includes 60 images a second, each containing only the odd or the even lines.
            The result is that a new image is displayed every 1/60 of a second.
            In effect, NTSC is a method of compressing 720×480 60 FPS video.
            Static images look like a full-res 720×480 image, but you also get 60 distinct images during motion.

            All in all, the only real difference from 60 Hz progressive is that the camera sends each frame at full resolution.

            No matter what, 60 discrete images are captured, sent, and displayed, whether in 480i, 480p, or 1080i…
            The difference is spatial, not temporal.

            edit: this is not limited to the analog NTSC of old; it also applies to material encoded on DVD and Blu-ray, as most US networks still broadcast an interlaced digital signal. It’s all 60 images a second, but field-encoded.
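
            A toy sketch of that odd/even decimation, just to show each field is a distinct instant in time at half vertical resolution:

            [code<]
            // Toy illustration of NTSC-style field encoding: a 720x480 source
            // sampled 60 times a second, keeping odd or even scanlines alternately.
            // Each field is a distinct instant in time at 720x240.
            #include <cstdio>
            #include <vector>

            int main() {
                const int W = 720, H = 480;
                for (int field = 0; field < 4; ++field) {   // first 4 of 60 fields/sec
                    bool odd = (field % 2) != 0;            // alternate odd/even lines
                    std::vector<int> kept;
                    for (int y = odd ? 1 : 0; y < H; y += 2)
                        kept.push_back(y);                  // decimate: keep half the lines
                    std::printf("field %d (t = %.4f s): %zu lines of %d px, %s scanlines\n",
                                field, field / 60.0, kept.size(), W, odd ? "odd" : "even");
                }
            }
            [/code<]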

        • sschaem
        • 8 years ago

        How uneducated… TVs for the past 75 years have displayed transmitted video with 60 unique images a second, be it on a CRT or an LCD.
        For ‘TV’ material encoded on DVD, it’s 720×480i: 60 images per second, each 720×240 in resolution.
        So an object in motion will be represented by 60 UNIQUE frames shown on your LCD or CRT per second.

        30 Hz would look HORRIBLE. No sporting events are transmitted at 30 Hz, for good reason.

        It’s the same reason gaming at 30 Hz looks like a blur during smooth motion.

          • --k
          • 8 years ago

          Please educate yourself first. Interlaced images are shown at 29.97 frames a second. It takes two field refreshes to update a whole frame.

          “NTSC color encoding is used with the system M television signal, which consists of 29.97 interlaced frames of video per second…
          …to yield a flicker-free image at the field refresh frequency of approximately 59.94 Hertz”

          [url<]http://en.wikipedia.org/wiki/NTSC[/url<]

            • sschaem
            • 8 years ago

            You are talking about the frame *encoding*.
            NTSC refreshes at 59.94 Hz, with 59.94 distinct images displayed per second.
            As with color, NTSC uses signal data-rate reduction to achieve the display of 60 video images a second.
            The 60 distinct images are encoded as fields that compose a ‘frame.’

            The difference between 480i and 480p is only the vertical resolution.
            480i includes 60 images a second at 720×240 (digital).
            480p includes 60 images a second at 720×480 (digital).

            The difference is how the frames get encoded: with 480p you get 60 frames; with 480i you get 60 frames at half vertical resolution (called fields).
            A field is the encoding of a frame at half vertical resolution by decimation of the odd or even lines. It’s a spatial process, not a temporal one.
            CRT phosphors are designed to decay according to this specification.
            LCDs (or any modern non-CRT progressive display) need to emulate this, or use digital de-interlacing, to correctly display 60 images a second.
            But whatever is done to process a 480i signal, the result is 60 discrete and distinct images displayed at 60 Hz.

            Sigh, you can’t even comprehend what you are linking to…

            Can’t stand people talking out of their arse!

      • otherstuff
      • 8 years ago

      You’re right about the 60 Hz.

      NTSC TV is perceived as 60 fps, since it’s interlaced. I just tried to explain this in another comment.

      🙂

    • bcronce
    • 8 years ago

    True DX11, with multithreading and 64-bit support.

    • 1love4all
    • 8 years ago

    I strongly believe that game piracy is what made developers suck up to the consoles for cash to hold on through the recession. We PC users need an aggressive configuration standard and to stick to it: for example, a quad core and a GPU like a 560 or 6950 as a baseline, so titles can actually utilize that processing power. And we’re moving toward 120 Hz displays so slowly; the industry and leading brands need to push good trends like DX11 and tessellation as a must in all PC games. I remember the good times when we could reduce the resolution to play at full eye candy on CRTs; it’s a real pain in the ass when the choices get fewer with every innovation.

      • 1love4all
      • 8 years ago

      Oops, did I mention that consoles will no longer depend on local hardware for future processing and will instead rely on cloud computing for those super-high-def TVs yet to come? It’s all money from your pocket, baby; they will take the last penny out of you if you stick with consoles! :D

        • wierdo
        • 8 years ago

        Simply put, piracy is the excuse of dinosaur industries, used to buy regulations to control the market with instead of innovating to stay relevant.

        This isn’t a new complaint from incumbent industries; hell, Universal Pictures itself wouldn’t have existed if it hadn’t survived being on the receiving end of the same thing in the 1900s.

        (http://arstechnica.com/tech-policy/news/2010/09/thomas-edisons-plot-to-destroy-the-movies.ars)

        If you want good games with new concepts, instead of sequels with new gun skins, then look at indie stuff. Indie games don’t have DRM BS, and their prices are fair. They may not have the funding of the big houses, but they’ll deliver something genuinely interesting to try much more often.

          • Chrispy_
          • 8 years ago

          Yep.

          I have spent plenty on indie games since the fall Steam Sale.

          I can play those games on any PC I choose, and download them as many times as I want.

          DRM is irrelevant; find me a single game with DRM that wasn’t cracked within a week, if not within 24 hours.

          edit:
          For the record, I support gamecopyworld.com. I used to crack games that needed the DVD in the drive. Some of us don’t like the hassle of constantly fetching and inserting easily scratched discs into our noisy, vibrating, slow, and obsolete optical drives. I paid for your damn game; why should I have to suffer for supporting you, you stupid asses?

    • Parallax
    • 8 years ago

    Wouldn’t 2013 and 2014 be around the time 4K TVs start getting some market penetration? Perhaps the new consoles are waiting for this (long overdue) feature?

    [quote<]Frame rates below a constant 60 Hz[/quote<] But be sure to support 120 Hz and up! If my next-gen PC can play a game at 120 Hz, I expect to be able to, not be locked at 60 Hz.

      • TurtlePerson2
      • 8 years ago

      I can’t really imagine next-gen consoles doing much with 4K resolution. PC hardware can’t quite handle 4K very well at this point, and the next-gen consoles will only be about as good as the best PCs right now.

      After the last generation where Sony and Microsoft lost a lot of money on their consoles initially, I would expect that the consoles will not push the envelope as much as they did before. There certainly won’t be another Cell-type chip in any of them. I wouldn’t be too surprised if we see something fairly close to off-the-shelf computer hardware in the consoles.

      • Xenolith
      • 8 years ago

      4K monitors won’t become mainstream until 4K content arrives. We need something to replace Blu-ray for physical media, and an order of magnitude more bandwidth for streaming 4K video. PC games aren’t enough of a market driver.

    • Geistbar
    • 8 years ago

    Separate from my other comment, my biggest wish here is very simple: 64-bit executables. This isn’t [i<]that[/i<] hard; the hardware and software support is there. It’s forgivable in many games, but strategy games, such as Civ V or the Total War games, have no excuse: they’re PC exclusives and would benefit tremendously from the extra memory space.

      • Krogoth
      • 8 years ago

      It’s more than just the executable.

      You also need the libraries and all the rest in 64-bit format.

        • Geistbar
        • 8 years ago

        Poor phrasing on my behalf; I didn’t want to say “64-bit support,” as that’d imply being able to run on 64-bit platforms, which we obviously can. I want the games to run in 64-bit and be able to access its memory space. People first started talking about this with the launch of the Athlon 64; I know Epic showed one of the Unreal Tournaments running in 64-bit and getting a notable speedup.

        That it’s been so long and we’re still not getting anywhere on this front is horrible. Even Microsoft Office has 64-bit variants now. And Photoshop, and MATLAB. Big programs have moved there, as have many small ones (I know 7-Zip has), yet games are completely dropping the ball here. Sword of the Stars 2 has it, but it’s had a troublesome launch; I can’t think of any other games. Hmm, did Crysis?

          • evilpaul
          • 8 years ago

          Aside from being able to use more memory, I don’t think there’s really much benefit for games. You get twice the integer registers, but CPUs have been doing behind-the-scenes register renaming for years. You also double the amount of memory space and bandwidth used by pointers (I think). I seem to recall John Carmack saying, years ago, that Doom 3 would have run slower as a 64-bit binary.
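
          The pointer-size point is easy to check for yourself; here’s a tiny sketch (SceneNode is just a made-up, pointer-heavy struct) of how the same code gets fatter when built 64-bit:

          [code<]
          // Quick check of the pointer-size point: pointers, and any structure
          // built around them, double in size in a 64-bit build.
          #include <cstdio>

          struct SceneNode {
              SceneNode* parent;    // 4 bytes on x86, 8 bytes on x86-64
              SceneNode* children;
              float position[3];
          };

          int main() {
              std::printf("sizeof(void*)     = %zu\n", sizeof(void*));
              std::printf("sizeof(SceneNode) = %zu\n", sizeof(SceneNode));
              // Typically 4 and 20 for a 32-bit build, 8 and 32 for 64-bit
              // (padding included), so pointer-heavy data really does get bigger.
          }
          [/code<]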

            • bcronce
            • 8 years ago

            There are quite a few science-related, compute-heavy applications that get a few percent speed bonus from going 64-bit because of the extra registers. Larger pointers are only an issue if one has highly optimized structures where the increase in pointer size causes bad alignment with cache lines. Pretty much every modern CPU supports asynchronous prefetching, so one could request the next cache line before it’s needed, depending on the logic flow.
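
            For what that can look like in code, here’s a minimal sketch using GCC/Clang’s __builtin_prefetch; the prefetch distance is purely illustrative, and real gains depend on the access pattern and the hardware prefetcher:

            [code<]
            // Sketch of software prefetching: ask for a cache line a little
            // ahead of where the loop is currently reading.
            #include <cstddef>
            #include <cstdio>
            #include <vector>

            double sum_with_prefetch(const double* data, std::size_t n) {
                double sum = 0.0;
                const std::size_t dist = 16; // ~2 cache lines ahead (16 doubles)
                for (std::size_t i = 0; i < n; ++i) {
                    if (i + dist < n)
                        __builtin_prefetch(&data[i + dist], /*rw=*/0, /*locality=*/3);
                    sum += data[i];
                }
                return sum;
            }

            int main() {
                std::vector<double> v(1 << 20, 1.0);
                std::printf("sum = %.0f\n", sum_with_prefetch(v.data(), v.size()));
            }
            [/code<]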

            There’s no reason to still use 32-bit except that some people still have 32-bit machines, library support (as others have pointed out), you’re targeting a 32-bit platform, or memory/bandwidth is tight and you need every little bit of performance; and probably a few other things I can’t think of.

            I think it’s mostly a mix of the first two.

            I only have one game that runs out of memory… World of Warcraft. I’ve played a few games that have gotten close. 64-bit should start to become standard soon, as games get more complicated, with larger textures and so on.

            • Geistbar
            • 8 years ago

            [quote<]Aside from being able to use more memory I don't think there's really much benefit for games.[/quote<] I consider being able to use more memory a rather hefty benefit, one I wish they’d make available to themselves.

            • derFunkenstein
            • 8 years ago

            x86-64 also gives you more registers, which should help in theory, too.

            But I agree the big thing is RAM availability. SupCom should never crash because it crossed 2 GB, and the fact that it does is flat-out stupid. Come on, devs!

            • Geistbar
            • 8 years ago

            Supreme Commander is a wonderful example of a game that could benefit from a larger memory space. Strategy games are the easiest example, I guess, but even things like many of Bethesda’s works, which rely on having thousands and thousands of minor objects throughout the world, would easily benefit as well.

            I’ve never encountered a game crashing due to a memory limit, but that of course isn’t the only way the address space rears its head. A lot of design limitations get placed on a game or engine when you take modern content and try to stuff it into 2 GB or less. Not to mention the loading times; it’ll be nice just to have some of those reduced, everything else aside.

      • cygnus1
      • 8 years ago

      This seems like a no-brainer to me. They already ship dual executables for many games to support DX9 and DX10/11, so why not provide x86 and x64 executables instead, and just drop DX9? Unfortunately, they keep DX9 around because it’s the code path most similar to what the consoles use.

      I would love to see a 64-bit game that can manage to load all of its assets into RAM.

      • sschaem
      • 8 years ago

      Can you prove that they would benefit tremendously from 64-bit?

        • cygnus1
        • 8 years ago

        12 or 16 GB of RAM is pretty cheap and easy to get these days, and 24 and 32 GB are just around the corner. Wouldn’t it be nice if games could use more than the 2 GB-per-process limit imposed by 32-bit Windows? I think it’d be really nice for a game to load its entire set of assets into RAM and say goodbye to level loading.
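
        That ceiling is easy to see first-hand. A rough sketch (with a deliberate safety cap, since a 64-bit build will otherwise happily eat all your RAM and swap) that just allocates until the allocator gives up:

        [code<]
        // Allocate 64 MB blocks until malloc fails. Built as a 32-bit Windows
        // executable this tops out around 2 GB (more with the LAA flag set);
        // a 64-bit build keeps going, hence the safety cap below.
        #include <cstdio>
        #include <cstdlib>
        #include <cstring>
        #include <vector>

        int main() {
            const std::size_t block = 64u * 1024 * 1024;
            std::vector<void*> blocks;
            while (blocks.size() * 64 < 8192) {   // safety cap: stop at 8 GB
                void* p = std::malloc(block);
                if (!p) break;
                std::memset(p, 1, block);         // touch pages so they're really committed
                blocks.push_back(p);
            }
            std::printf("stopped after %zu MB\n", blocks.size() * 64);
            for (void* p : blocks) std::free(p);
        }
        [/code<]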

        • Geistbar
        • 8 years ago

        Many of them have to go through various technical wizardry to keep from crashing when they hit a memory limit, and that slows the game down. Ever played a long game of Civilization? The whole thing slows to a crawl at the very end, because there isn’t enough room in memory to store all the assets being processed.

        Even ignoring any potential speedup, as cygnus1 has noted, memory amounts keep going up. I have 8 GB of RAM, and my next computer will almost certainly improve on that; using all of that available memory to, at a bare minimum, decrease load times would be wonderful.

        • kamikaziechameleon
        • 8 years ago

        Do you know what RAM is??? It’s basically your buffer. Though SSDs are great, they’re so expensive that I wouldn’t waste space on one for games. I’d put games on a massive media HDD and productivity apps on my SSD with my OS. Until you can get an SSD for a similar price to an HDD, I don’t think we’ll see RAM become any less important. The more RAM you have, the more can potentially be in a game: more objects, more AIs, more textures, art assets, etc.

      • willyolio
      • 8 years ago

      For PCs, at least, there’s no excuse on the hardware side: the minimum requirements of every game out there today are high enough that no processor capable of running them lacks 64-bit support.

      • eitje
      • 8 years ago

      My guess on “why not”: it costs more to test both executables.

      However, I *did* upvote your comment; it’s a cost that some manufacturers SHOULD incur! 🙂

        • Geistbar
        • 8 years ago

        I agree that that’s the main reason they haven’t. Even if it only cost $10, they’d still avoid it, because it’d be $10 they didn’t want to spend. They’ll need to eventually, at least… or so I hope. Also, laziness.

        I still find it inexcusable to be half a decade into the 64-bit era and still have so many programs that would benefit from the extra memory not written to use it. But you seem to vaguely agree with that yourself.

      • DrDillyBar
      • 8 years ago

      Agreed. I remember Hellgate: London was a buggy piece of garbage when you ran it 32-bit with DX10, to the point that it was only playable in DX9 mode. But with the x64 executable, it was using at least 3.5 GB and ran fine.

      • dashbarron
      • 8 years ago

      ^ This. What was it, 2004, when AMD released a consumer processor with 64-bit support? There was probably one before that, even. We’re going on a good ten years of development with this technology in the consumer space.

      • Bensam123
      • 8 years ago

      It’s not as simple as flipping to a 64-bit executable, though. There are tools that mark an executable Large Address Aware (LAA) in case it has a memory leak and needs to take advantage of more memory than it was programmed to use.

    • Geistbar
    • 8 years ago

    My guess for the future of PC games is that, between the current long-standing lull and the upcoming big boosts in integrated graphics, the graphical prowess of games is going to stay limited. It could even cause a resurgence, actually: if every computer sold could play the latest games at low-to-medium settings with good quality, PC games would be a HUGE market again. Right now the market is still there, but the practical market is limited by sales of graphics cards.

    I’m more concerned with texture quality and consistency, though. Many AAA games of today look ugly, not because they lack high-resolution textures or fancy effects, but because the art is so generic. No exaggeration: I find games like [url=http://www.diygamer.com/wp-content/uploads/2011/05/avadon.jpg<]Avadon[/url<] better-looking than games like [url=http://ui01.gamespot.com/1621/me1-42118_2.jpg<]Mass Effect[/url<]. The texture quality is 100% consistent, there’s no army of jaggies assaulting you, and the visuals have their own style.

    I can’t find a screenshot of it*, but in Mass Effect 1, Tali had most of her suit as a high-resolution texture like every other character, except for a ring around the base of her neck that was much lower resolution. It destroyed the visual immersion.

    So, I’m hoping more games move down that route: high-quality, consistent, and unique textures. And it can all be accomplished on lower-end hardware; it’s exactly what Blizzard does, for instance.

    * And looking was quite terrifying! Why is there so much erotic fan art? Gah.

    • Krogoth
    • 8 years ago

    I want a return to simulation genres.

    Technology has caught up in enough ways that you can pull off convincing, realistic sims. It’s one of the few genres that actually takes advantage of graphical gimmicks like Eyefinity and 3D Vision.

    • ish718
    • 8 years ago

    We’ll have to wait until the next consoles come out to get our upgraded console ports.

    • Kaleid
    • 8 years ago

    Better AI, please.

      • Farting Bob
      • 8 years ago

      Programming realistic AI is probably the hardest thing you could do in a game. It’s not like nobody is trying; it’s just that even in big-budget games, it’s impossible to make an AI that will play the game, or blend in so well, that you’d think it might be a human-controlled character. It’s not going to happen. AI is far better than it was (far less predictable, with far fewer set paths and animations), but you aren’t going to see dramatic improvements anytime soon.

    • evilpaul
    • 8 years ago

    I’d really like to see improved AI from NPCs. Imagine it improved to the point where an “Escort Mission” couldn’t spark rage killings.

    • TurtlePerson2
    • 8 years ago

    What about draw distance? No game has impressed me like Crysis, because no game since has managed to make a draw distance that long look so good.

    • Umbragen
    • 8 years ago

    I can’t say for sure until Nvidia makes a showing, but from what I’ve seen of ‘current pricing trends in the PC space’, ‘What’s Next’ looks a hell of a lot like a console.

    • CasbahBoy
    • 8 years ago

    I kinda feel we should head this one off at the pass: yes, we can agree that RAGE had all sorts of technical issues, and there’s plenty of blame that should rightfully be laid at the feet of various groups for that. But I think the point made above is that [i<]in configurations where it worked properly[/i<], it was an impossibly silky-smooth experience, and it’s a shame more games aren’t designed to play like that from the ground up.
