Bethesda shows off Fallout 4 graphics tech and eye candy

Fallout 4 will arrive on November 10, but Bethesda is giving us a sneak peek at the eye candy afforded by the game's improved Creation Engine and its physically based deferred renderer. If your PC is up to the challenge, a romp in the Wasteland looks like it has the potential to be gorgeous.

This iteration of the Creation Engine is based on the one used in The Elder Scrolls V: Skyrim, but Bethesda says it wanted more realistic materials and more dynamic lighting this time around. To that end, Fallout 4's engine gains a materials system, and each material reacts differently to lighting and the environment. For example, when a storm rolls in, the developer says surfaces will get wet. The game's cloth simulation purportedly lets flexible materials like clothing, hair, and vegetation interact realistically with the wind and the environment, too.
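The wetness behavior described above is commonly implemented as a parameter blend on the material: a wetness factor darkens the base color and lowers roughness so the surface reads as shiny. Here's a minimal Python sketch of the idea; the names and blend constants are my own illustration, not anything from the Creation Engine.

```python
from dataclasses import dataclass

@dataclass
class Material:
    albedo: float     # brightness of the base color, 0..1 (scalar for simplicity)
    roughness: float  # 0 = mirror-smooth, 1 = fully diffuse

def apply_wetness(mat: Material, wetness: float) -> Material:
    """Blend a material toward a 'wet' look: darker albedo, lower roughness.

    wetness runs from 0 (dry) to 1 (soaked). The blend targets are
    arbitrary illustrative constants, not engine values.
    """
    wet_albedo = mat.albedo * (1.0 - 0.3 * wetness)        # water darkens surfaces
    wet_roughness = mat.roughness * (1.0 - 0.8 * wetness)  # a water film smooths them
    return Material(wet_albedo, wet_roughness)

concrete = Material(albedo=0.6, roughness=0.9)
soaked = apply_wetness(concrete, wetness=1.0)
```

Because every material carries its own parameters, a single storm-driven wetness value can change how each surface responds to light without swapping any textures.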

Bethesda says it worked with Nvidia to improve Fallout 4's volumetric lighting. This technique uses the GPU's hardware tessellation capabilities to produce effects like "god rays" and other atmosphere-enhancing lighting effects. From what we can see, the lighting looks far better than it does in previous Fallout games. Bethesda says the new lighting system (and the other effects it's demonstrating) will run well on both GeForces and Radeons, despite the Nvidia collaboration.
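Bethesda hasn't detailed its implementation beyond the tessellation mention, but the flavor of a "god rays" effect is easy to see in the classic screen-space approximation: march from each pixel toward the light's screen position, accumulating light visibility with a decay factor. A toy one-dimensional Python sketch, purely illustrative:

```python
def god_rays_1d(visibility, pixel, light, num_samples=8, decay=0.9, weight=0.2):
    """Accumulate in-scattered light along the path from `pixel` toward `light`.

    visibility: list of 0/1 samples (1 = light visible at that position).
    Returns the extra brightness to add at `pixel`. A real implementation
    samples a screen-space occlusion buffer in a pixel shader; this 1-D toy
    just shows the accumulation loop.
    """
    step = (light - pixel) / num_samples
    pos, falloff, total = float(pixel), 1.0, 0.0
    for _ in range(num_samples):
        pos += step                                    # march toward the light
        sample = visibility[int(round(pos)) % len(visibility)]
        total += sample * weight * falloff
        falloff *= decay                               # farther samples count less
    return total

open_sky = god_rays_1d([1] * 32, pixel=30, light=0)    # nothing blocks the light
occluded = god_rays_1d([0] * 32, pixel=30, light=0)    # fully blocked: no rays
```

Pixels whose path to the light is unobstructed pick up a bright streak, while occluded pixels gain nothing, which is what carves the visible shafts out of the atmosphere.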

The company didn't want to give away all of its secrets in advance, but the blog post's laundry list of features hints at many changes under the hood. Here's a full accounting of the features and improvements Bethesda publicly disclosed today:

  • Tiled Deferred Lighting
  • Temporal Anti-Aliasing
  • Screen Space Reflections
  • Bokeh Depth of Field
  • Screen Space Ambient Occlusion
  • Height Fog
  • Motion Blur
  • Filmic Tonemapping
  • Custom Skin and Hair Shading
  • Dynamic Dismemberment using Hardware Tessellation
  • Volumetric Lighting
  • Gamma Correct Physically Based Shading

We're definitely ready to see these effects and techniques in action.

Ben Funk

Sega nerd and guitar lover

Comments closed
    • CaptTomato
    • 5 years ago

    FO4 WILL BE GOTY, AND IMO, HAS THE POTENTIAL TO BE THE BEST GAME EVER….
    Most of the negativity in this thread is based on extraordinary ignorance.

    • dmjifn
    • 5 years ago

    Thematically and stylistically, those screens look straight out of Gears of War.

    • ET3D
    • 5 years ago

    The mention of NVIDIA in game tech always makes me cringe. It’s a guarantee of something that would work terribly on AMD parts.

    • dikowexeyu
    • 5 years ago

    I can live with Fallout 3 graphics. As soon as I get immersed in a game, I forget about graphics quality.

    What I really want is better physics, better mechanics, more content, more things to manipulate in the game. I want to be able to manipulate the AI, to develop complex tactics and strategies.

    I want to leave a footprint in that world. If I leave an object somewhere, it should stay there every time I return, instead of everything being reset. Dead bodies should not disappear magically. They should slowly rot over the days and be eaten by wildlife. I should be able to find the bones and take one as a weapon. I should be able to bait and hunt predators and enemies with the carcasses.

    I want to drag and pile objects to obstruct passageways. I want to cut energy lines to disable defenses of a fortress, and I want to make traps with the falling electrified cables.

    I want to disrupt trade and transport lines to impoverish and make enemies weaker.

    I want all of that to be unscripted.

      • jessterman21
      • 5 years ago

      You could always move to Iraq and live out your fantasy… [url<]https://en.wikipedia.org/wiki/Abu_Azrael[/url<]

    • LoneWolf15
    • 5 years ago

    I don’t know that I’ll immediately buy Fallout 4 (as much as I loved Fallout 3).

    However, I do plan on buying some Nuka-Cola from Target on release day.

    • auxy
    • 5 years ago

    Still not using physically-based rendering… ;つД`)

    [sub<]Nevermind that (as another user pointed out) the leaked shots so far don't look anywhere near this good...[/sub<]

      • Laykun
      • 5 years ago

      Nothing uses full-on physically based rendering. Each engine has its own subset that checks most of the tick boxes while staying performant. The only fully compliant implementation of what you’d call physically based rendering would be Disney’s offline rendering pipeline.

        • auxy
        • 5 years ago

        “compliant” to what? You’re really grasping at straws, desperate to start an argument here, it seems. ( *´艸`) Desperate to defend Bethesda and Todd-boy?

        I don’t see anywhere that I said anything about “full PBR”.

          • Laykun
          • 5 years ago

          I suppose what I mean is the parameters used to achieve the final effect. The most complete PBR BRDF I know of is the one put forth by Disney. In something like Unreal Engine 4’s BRDF you have parameters like base color, roughness, and metallic: a very small subset of the parameters you get in the Disney BRDF, which include subsurface, metallic, specular, specular tint, roughness, anisotropic, sheen, sheen tint, clearcoat, and clearcoat gloss. I’m not trying to start an argument, just saying that no real-time engine really does a complete job of physically based rendering.

          Sources :
          [url<]https://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf[/url<] [url<]https://de45xmedrsdbp.cloudfront.net/Resources/files/2013SiggraphPresentationsNotes-26915738.pdf[/url<]
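For readers following along: the "roughness" knob these BRDFs share typically drives a microfacet normal-distribution term such as GGX/Trowbridge-Reitz, which both the Disney and UE4 notes linked above build on. A minimal Python version of just that one term (the full BRDFs add Fresnel and geometry terms; squaring roughness to get alpha is the common Disney/UE4 remapping):

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX / Trowbridge-Reitz microfacet normal-distribution function.

    n_dot_h:   cosine of the angle between surface normal and half-vector.
    roughness: perceptual roughness in (0, 1].
    """
    alpha = roughness ** 2          # Disney/UE4 perceptual remapping
    a2 = alpha ** 2
    denom = n_dot_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom ** 2)

# Rough surface: energy spread evenly. Smooth surface: a tight highlight.
uniform = ggx_ndf(1.0, 1.0)   # equals 1/pi regardless of n_dot_h
peak = ggx_ndf(1.0, 0.1)      # sharply peaked at the surface normal
```

With roughness at 1.0 the lobe is uniform (1/π everywhere); dialing roughness down concentrates the distribution around the normal, which is why this single parameter does so much of the work in real-time engines.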

            • auxy
            • 5 years ago

            Yeah, sure, of course, obviously.

            I’m pointing out that I don’t get the impression they’re doing ANY kind of PBR here.

            • Laykun
            • 5 years ago

            Fair enough. When I look at it, it does look to me like it uses a version of PBR. I think what you’re really referring to is the art style: whether devs use the properties of actual real-life materials or just wing it and let the artists set up the materials (a bad idea).

      • Krogoth
      • 5 years ago

      Still too expensive from a coding and computing standpoint to justify the cost for gaming.

        • auxy
        • 5 years ago

        Warframe uses it extensively. Unreal engine 4 does too.

    • Krogoth
    • 5 years ago

    Who cares about the freaking graphics? I have been jaded by the constant epenis jerking for more graphics over the past decade. It has become so tiresome. The only consistent result so far is uncanny valley (animation failure, lighting failure, shadow failure, etc.)

    CRPGs are not showcases for eye candy. This does not bode well for FO4. I think it is going to be a repeat of FO3: an unoriginal title that fails to live up to its predecessors. It is going to be Skyrim with Guns. >_<

      • Vhalidictes
      • 5 years ago

      You’re right… but in all fairness, “Skyrim with Guns” is pretty much what I’m looking for, here.

      A good story would be nice, but not required. The Black Isle game will be the next Fallout after this one, anyway.

        • Krogoth
        • 5 years ago

        That’s the problem with Bethesda.

        They haven’t known how to make a compelling CRPG experience for over a decade. They just know how to make barebones, open-world environments laced with copy-and-paste eye candy.

      • trek205
      • 5 years ago

      do you play all your games on low settings at 640×480? after all “Who cares about the freaking graphics” right?

        • Krogoth
        • 5 years ago

        It is the silly obsession with eye candy that results in nothing more than forgettable interactive tech demos.

        If I wanted eye candy, I’d simply go outside to some rural area.

          • trek205
          • 5 years ago

          Then go play Pong. I want graphics to move forward and I want games to look better. Instead we get crappy animations, low-res textures, and horrible-looking faces that look like they are from 2005. But as usual we will get a few graphics features that tank the framerate so they can sell the high-end cards.

            • Krogoth
            • 5 years ago

            Nah, I’d rather play something that is actually fun and interesting than deal with an interactive tech demo whose budget was mostly spent on the eye candy while the gameplay is barely touched: easily forgettable, short (less than 10 hours), and riddled with more bugs than a gas station restroom.

            It is a sad state of affairs that games from over a decade ago are more interesting and complicated than the rehashed stuff floating around these days.

            There’s a reason why gamers keep going back to the classics while most of the current stuff is quickly forgotten within a year of release.

          • Sabresiberian
          • 5 years ago

          Yep, plenty of space in those rural areas to build your own fort, and bad guys to get your first-person shooter fun off of.

          Oh, wait . . .

            • Krogoth
            • 5 years ago

            I was referring to eye candy not gameplay mechanics.

    • tipoo
    • 5 years ago

    The published minimum requirements are weird: a 550 Ti or a 7850, iirc? There’s almost a 2-3x performance spread there, a roughly 700-Gflop card versus one nearing 2 Tflops, yet they’re treated as equivalent.

    Their released pics seem a bit… optimistic compared to the leaked pictures. They’re in areas where lighting is covering up some of the poor textures shown. I’m not making a fuss over it if it means low requirements; heck, Skyrim looked like a DX9 title and I put a lot of time into that. But the leaks definitely showed a lot worse, and this response to them is borderline dishonest.

      • Vhalidictes
      • 5 years ago

      The textures need to be crap; it’s a console port. There’s just no way around that.

      Fortunately mods are supported, and if this is anything like the Skyrim release Bethsoft will release better texture packs on Steam.

        • tipoo
        • 5 years ago

        I don’t buy that excuse. Do The Witcher 3 or The Phantom Pain suffer as much? TPP even supports a generation back, consoles with 512MB of RAM, while this game has 8GB as a lower limit and is 8th-gen only. Those two games on the 8th-gen consoles themselves have better texturing than the PC version of FO4. Blaming consoles is just hiding a development decision.

        • brucethemoose
        • 5 years ago

        Both consoles have 8GB of memory. They can handle higher-res textures.

          • auxy
          • 5 years ago

          Consoles don’t have hUMA, so assets have to be duplicated for CPU and GPU addressing. ( *´艸`)

            • tipoo
            • 5 years ago

            That’s not right. They have unified address spaces and no need to swap things back between GPU and CPU memory, since it’s all addressable with the same pointers, and physically the same. And in fact, the architectures are further modified over PCs to allow easy sharing with additional bypass busses and such so the CPU/GPU can communicate better than over PCI-E or even than APUs. Read this.

            [url<]http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?print=1[/url<]

            • auxy
            • 5 years ago

            Did you read the article you linked? Nowhere does it dispute what I said.

            The very fact that they took additional measures (such as adding a separate bus) to improve CPU-GPU communication all but confirms what I said.

            You shouldn’t eat Mark’s lies.

            • tipoo
            • 5 years ago

            You’re citing Mark’s words while accusing him of “lies”, lol. These would certainly have no more duplication than desktop PCs with physically separate GPU and system memory. They have one memory controller and one physical and abstract memory space for all of it. You can have the GPU work on something and the CPU access it right away with no swapping between memory pools, since there are none.

            Did *you* read the article? The CPU-to-GPU bus is there so that you don’t *have* to go through the memory bus at all; it’s not because one or the other lacks access to all the space. It’s still higher latency to put something in RAM and have the other side access it than to just swap things back and forth through a direct bus. That doesn’t mean there’s no unified memory space.

            That was the design principle for the whole thing; he mentions developers naming unified memory as their primary desire.

            hUMA by another name is pretty much where the consoles are. Only a small subset of lower-end PCs supports that, and only if you limit yourself to APU gaming. Your memory has some wires crossed if you think PCs don’t have to duplicate anything while 8GB consoles do.

            • auxy
            • 5 years ago

            Where did I say anything about PCs not duplicating assets? Your entire post reads like a crying console fanboy. [b<]I was only pointing out that the memory pool in the consoles is smaller than it appears.[/b<] And you're over here going "b-but PCs...!" Please. Obviously a desktop PC has the same problem. Now run off back to your troll hole with your tail between your legs like a good little fan boy. (ΦωΦ)

            • Laykun
            • 5 years ago

            Seems someone has spun you 180 degrees. hUMA is EXACTLY what the consoles have. PCs can have it but in general don’t use it as it’s a small subset of hardware that’s barely capable of playing games in the first place.

            • auxy
            • 5 years ago

            It was highly anticipated and expected that the consoles would have hUMA before release due to having Kaveri-like APUs (x86-64 CPUs + GCN graphics) but in the end it turns out they don’t.

            I’m willing to admit I might be mistaken but everything I have ever read confirms this. There’s a lot of talking around it and hemming-and-hawing (like the article tipoo linked) but ultimately they just don’t.

            • Laykun
            • 5 years ago

            Ok, well I don’t have any facts to back up my side either so I’ll have to research this more. I get the feeling you’re probably right though.

            • Laykun
            • 5 years ago

            Here’s some snippets from the PS4 developer documentation :

            “Both programs that run on the CPU and shaders that run on the GPU will run in the same virtual address space”

            There are functions for mapping address space to both the GPU and CPU, so if this isn’t hUMA it’s very close and they certainly aren’t separate pools of memory.

            • auxy
            • 5 years ago

            Hmm! Can you link that for me? (Or send a PM!) I’m very interested in checking out a few other things.

            • Laykun
            • 5 years ago

            I’m afraid you need to be a PS4 developer to access the content, I don’t think it’d be proper for me to give you full disclosure of the document.

            • auxy
            • 5 years ago

            I can’t really take you at your word, then.

            • Laykun
            • 5 years ago

            To be honest I really don’t care, take it or leave it. I feel like I probably shouldn’t have even quoted that much of the documentation in public.

            • Ninjitsu
            • 5 years ago

            Screenshot, maybe? (I believe you but I’m just trying to help the discussion!)

        • fhohj
        • 5 years ago

        Textures are a strong point of the consoles. The large amount of fast video memory means they can store and access quite detailed textures.

        The consoles are stream-processor limited before they are texture limited: under normal circumstances they would struggle to draw and shade enough surfaces to keep the TMUs fed before the TMUs themselves hit any limit servicing the geometry and scene.

      • Seicho
      • 5 years ago

      I’m callin’ bullshit on the leaks, honestly. There is absolutely no scrap of proof that they are legitimately the PC version on ultra settings except for the guy’s word. PC images that look almost exactly like the console images, in a Bethesda game.

      Game has been confirmed to have ambient occlusion; the screenshots do not show it. Not even a little. TAA anti-aliasing is not in the screenies either (that one could just be poorly implemented AA, though even Skyrim had really good AA). Tessellation is also completely missing in action. Skyrim also got a high-quality texture pack after it released, and I’m willing to bet Fallout will too, and that pack is impossible to acquire right now unless you are Bethesda. He may have gotten the game early, but Steam doesn’t just unlock that stuff right away.

      TL;DR: Don’t jump the gun and say Bethesda is full of crap until you see it. People in general are full of crap; there’s no way to say this person isn’t just trying to stir the pot.

    • Sargent Duck
    • 5 years ago

    I feel like my Radeon 7870 (min requirement) is going to be hurting. I should probably buy a new video card so I can take advantage of all that this game has to offer for eye candy, but oh, the $$’s.

      • DoomGuy64
      • 5 years ago

      I’d wait for a tweakguide. This is a console port, and your card is more powerful than a console. Just disable whichever useless feature is sapping performance. A lot of that stuff is junk anyways, and does nothing but blur the screen.

      [i<]DoF: disable. Does nothing but remove background detail. Faster to disable AF if you like that.
      Motion Blur: disable. People get enough blur from their LCDs. Games don't need to add more.
      SSAO: use the fastest option; barely noticeable, saves performance.
      TAA: isn't this that blurry AA method?
      Tessellation: use the control panel override if necessary.[/i<]

      The other things you can leave on. There's a reason why this stuff wasn't added to the console versions: it adds nothing and saps performance. The only thing I'd actually be concerned about is that the game recommends a 4GB card, insinuating there might be HD textures that don't fit on a 2GB card. Probably not a problem if you're using a 1080p monitor.

        • auxy
        • 5 years ago

        >there’s a reason why this stuff wasn’t added to the console versions. it adds nothing, and saps performance.
        >it adds nothing, and saps performance.
        >it adds nothing

        So sez “DoomGuy64”. ( *´艸`)

        Excuse me if I don’t take your opinion as gospel on graphical matters, Mr. Taggart.

          • DoomGuy64
          • 5 years ago

          What? Did you read what settings I recommended to disable?

          DoF and blur are the first options I turn off on any game that I play, simply because those options detract from the experience. I’m not watching a movie, I’m playing a game. The only time those features are useful are in cutscenes, and not the game itself. Other features like SSAO are nice, but a lot of it is overdone and unnecessary.

          These aren’t effects that the developer wanted in the base game anyway, which is why consoles don’t have it. Nvidia partnered with Bethesda to add this stuff, and it’s there to slow down mid-range gpus, and encourage upgrading.

          Also, I don’t know why I’m replying to a poster whose rebuttal is, “[i<]So sez "DoomGuy64". ( *´艸`)[/i<]”. Excuse me if I don't take a personal attack from Mr/s. Emoticon seriously. /sarc

          The only thing of substance I can [i<]infer[/i<] from your post is that you like to enable screen-blurring effects in your games. If so, good for you, but I don't, especially when they sap performance. Crysis 3 had several blurry effects and AA methods from Nvidia. They effectively halved the game resolution, so I switched all of them to the alternatives. Faster performance, no blur.

            • auxy
            • 5 years ago

            It’s cute how you think my post was a personal attack! Pffhaha. ( *´艸`)

            I turn Motion Blur off always. Haven’t you seen me evangelizing for Lightboost/ULMB in these very comment threads? Not much point in that if you’re going to artificially add it back in anyway. And TAA is more blurry garbage, you’re right there of course.

            I like DoF when it’s implemented well. A good Bokeh DoF can really help reduce aliasing (shimmer) at a distance. Surely a lot of games don’t implement it well, but it’s just something you have to deal with on a case-by-case basis.

            SSAO is just a net gain on image quality … as long as it’s not glitching out. But modern SSAO implementations are pretty good; HBAO+ and VSSAO2 both look great most of the time. Complaining about SSAO performance hit is just silly.

            And even more silly is saying that these techniques “add nothing”. You might not like them, but you call yourself DoomGuy64. You probably still PLAY classic Doom. I don’t think your opinion on graphical fidelity is very relevant to anyone besides other OLD MEN like yourself.

            So that’s all, really. Just teasing you about being old. Kyahaha. OLD! ( `ー´)ノシ

        • f0d
        • 5 years ago

        I do pretty much the exact same with every game even though I have enough grunt for those effects: they are flat-out annoying, especially motion blur.

        • odizzido
        • 5 years ago

        I really hate blur. It’s my most hated graphics option.

      • Demetri
      • 5 years ago

      At least hold out until Black Friday before upgrading. Newegg has an R9 290 for $200 AR, but it’s a reference cooler. I bet there will be even better deals in a few weeks.

      • tipoo
      • 5 years ago

      But a 550 Ti is also listed as a minimum card… It’s weird; there’s a 2-3x performance spread there in your card’s favor.

        • Vhalidictes
        • 5 years ago

        I agree with DoomGuy – If a 550Ti is good enough, there should be some settings that can be edited to let 7000-series AMD cards perform well at 1080.

    • mctylr
    • 5 years ago

    I’m surprised to hear that the game/graphics engine is derived from the version used in Skyrim (2011) as opposed to The Elder Scrolls Online (2014, PC), since both were developed by studios owned by ZeniMax Media and both are customized versions of the Gamebryo game engine.

      • Vhalidictes
      • 5 years ago

      My understanding is that ESO is using the Hero engine.

      Also, the core rendering problem with the in-house engine is lots of NPCs rendered on the same screen, so there’s no way “standard” Gamebryo could handle ESO.

        • Preyfar
        • 5 years ago

        ESO actually doesn’t use the Hero engine. While ZOS licensed it, it was only used in the prototyping phase. The rest of the engine was custom-made afterwards.

        [url<]http://www.gameinformer.com/b/features/archive/2012/05/25/why-the-elder-scrolls-online-isn-39-t-using-heroengine.aspx[/url<]

          • Vhalidictes
          • 5 years ago

          Thanks for the link, Preyfar! I think that id Tech is slowly being folded into Gamebryo; I’m not sure why Bethsoft isn’t using the id engine for everything. My guess is that it would take the devs too long to get comfortable with it.

      • Horshu
      • 5 years ago

      Forgive my ignorance since I haven’t been following 3d engines for several years, but why wouldn’t they use idTech for this stuff?

      • auxy
      • 5 years ago

      I don’t know why you’re surprised.

      Bethesda has been using essentially the same engine since Morrowind. It got a visual update with Oblivion, and that game engine was used almost unmodified (save for the bolted-on VATS system) for Fallout 3 and New Vegas. That same engine was given another visual update and used for Skyrim (the quick-kill animations of Skyrim are done using the VATS code.) And now that same engine is being used again for Fallout 4.

      Bethesda does as little work behind the scenes as possible for each game, which is admirable in the sense of conservation of effort, but frustrating as a gamer. This means Fallout 4 will have the same jacked-up physics, the same weak gunplay, the same awful movement, and the same disconnection between mechanics and immersion, just like the rest of Bethesda’s games.

      Bethesda’s best game was SkyNET. ( ;∀;)

        • Preyfar
        • 5 years ago

        Skyrim used the Creation Engine while Oblivion used Gamebryo. Bethesda made quite a big deal about it at the time due to the major issues with Gamebryo in previous games. I don’t doubt the VATS code was somehow modified to work with the Skyrim engine.

          • auxy
          • 5 years ago

          “Creation” engine is Gamebryo. Speaking as an expert on the topic.

    • brucethemoose
    • 5 years ago

    Thank god, they got GPU shadow rendering working again.

    For some inexplicable reason, unlike Oblivion or Fallout 3, Skyrim rendered shadows on the CPU… That game could eat a high-end Xeon E5 for breakfast, yet shadows would still look like a blocky Minecraft tree. Blurring + ambient occlusion + brute-force CPU power helps, but it’s not nearly enough.

      • swaaye
      • 5 years ago

      Skyrim had shadowing that FO3 and Oblivion didn’t though. It was blocky and ugly, and shifted weirdly periodically, but at least it was there. 😉

    • DarkUltra
    • 5 years ago

    Skyrim engine huh. Does that mean a 60fps limit? Many 100hz+ monitors have been released lately so I hope Fallout 4 is following with the times.

      • aspect
      • 5 years ago

      Hey, it’s a 64 fps limit, thank you very much. /s

      There’s probably not much they can do about it until they finally switch to a modern engine. The physics is tied to fps (which is pretty common in games), but in this case, if the game runs at anything higher than 64 fps, everything goes out of whack.
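The fps-tied physics being described is the classic fixed-versus-variable timestep problem: if the simulation steps by each frame's delta time, integration error (and thus behavior) changes with frame rate, while a decoupled fixed-timestep loop produces identical physics at any fps. A Python sketch using exact fractions to sidestep float noise (illustrative only, not Creation Engine code):

```python
from fractions import Fraction

GRAVITY = Fraction(-10)
PHYS_DT = Fraction(1, 60)  # physics always advances in 1/60 s increments

def step(pos, vel, dt):
    """One explicit-Euler physics step."""
    return pos + vel * dt, vel + GRAVITY * dt

def run_variable(frame_dt, n_frames):
    """Physics stepped once per frame with the frame's own dt (fps-dependent)."""
    pos, vel = Fraction(0), Fraction(0)
    for _ in range(n_frames):
        pos, vel = step(pos, vel, frame_dt)
    return pos

def run_fixed(frame_dt, n_frames):
    """Accumulator pattern: render at any fps, step physics at PHYS_DT."""
    pos, vel = Fraction(0), Fraction(0)
    acc = Fraction(0)
    for _ in range(n_frames):
        acc += frame_dt              # bank the frame's elapsed time
        while acc >= PHYS_DT:        # drain it in fixed-size physics steps
            pos, vel = step(pos, vel, PHYS_DT)
            acc -= PHYS_DT
    return pos

# One simulated second rendered at 30 fps versus 144 fps:
at_30 = run_fixed(Fraction(1, 30), 30)
at_144 = run_fixed(Fraction(1, 144), 144)
```

With the accumulator, both frame rates execute exactly the same sixty 1/60 s physics steps, so the results match; with the variable version, the same second of gameplay lands in different places, which is the kind of divergence that shows up as ragdoll and physics glitches above an engine's tuned frame rate.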

        • UberGerbil
        • 5 years ago

        “64 fps should be enough for anybody”

          • cynan
          • 5 years ago

          With smooth, even frame times, it really should be.

            • Meadows
            • 5 years ago

            No.

            • DrCR
            • 5 years ago

            Redacted – would like to avoid a potentially horrendously incorrect iirc

            • auxy
            • 5 years ago

            You don’t … you don’t really BELIEVE that … do you? (・へ・)

        • orik
        • 5 years ago

        I feel like neither of you is very familiar with Bethesda or the history of their game engine.

        Bethesda will never ‘switch’ in a traditional sense to a ‘modern’ engine, and they don’t need to. In fact they haven’t switched engines since Morrowind.

        Morrowind, you ask? You mean like Gamebryo? Skyrim’s Creation Engine is Gamebryo. They do development themselves, update their engine and add features. Why would they abandon a codebase they’ve been working on for years and are intimately familiar with?

        It’s too soon to tell if there’s a 60fps limit. Bethesda has officially stated that there’s no FPS limit on PC, but we don’t know what that means as far as world physics are concerned. We don’t know how the Creation Engine has evolved from 2011 to now (except for the parts outlined in the above blog post).

          • aspect
          • 5 years ago

          I think many people already know the reason they’ll continue with Gamebryo, but we also want their games to look as good as other modern titles.

            • orik
            • 5 years ago

            They could make a really good-looking game if they switched to CryEngine, but they wouldn’t be able to do any of the things that make a Bethesda game a Bethesda game.

            I don’t think Bethesda is willfully mishandling their development time. I think the status quo is for the best; we wouldn’t get very good modding support anywhere else.

      • superjawes
      • 5 years ago

      Looks like there will be a ~64 FPS limit….but hey, at least it’s not [s<]cinematic[/s<] 30 FPS.

      • Krogoth
      • 5 years ago

      It is because of the physics engine.

      The Gamebryo engine suffers from weird physics and ragdoll glitches if you force it to run beyond 60 FPS. Besides, most games are hard-coded to render at a maximum frame rate for such reasons anyway.

        • Vhalidictes
        • 5 years ago

        Even limited to 64 FPS, the game should be plenty smooth enough, assuming that Freesync/Gsync is supported.

          • Krogoth
          • 5 years ago

          Yep, it is not a bloody twitch shooter.

          Framerate depends on the source material. 100FPS+ animation is pointless for an RPG and most other gaming genres.

          • rahulahl
          • 5 years ago

            70-odd frames in The Witcher 3 seems stuttery to me. I doubt this will be much different, at least if it’s first- or third-person.
            Once you’re used to 144 FPS, 64 will always look and feel inadequate.

            • swaaye
            • 5 years ago

            lol man. Life is hard. 🙂

            • Krogoth
            • 5 years ago

            People need to learn animation 101 to really understand framerate.

    • lilbuddhaman
    • 5 years ago

    See perfectly fine looking car in distance.
    Walk up to it to see that it’s a low-poly mess with a nice texture over it.
    Walk around it looking for a way to enter/use/interact with it.
    Walk away remembering the core engine is old as hell, and so are the underlying mechanics.

      • sweatshopking
      • 5 years ago

      Yeah. There is much Bethesda does right, but much they need to update.

    • bthylafh
    • 5 years ago

    Oh, that’s why it needs a much more powerful Radeon to run: they collaborated with Nvidia and it uses tessellation.

    Bah.

      • DoomGuy64
      • 5 years ago

      It’s basically a list of all the features you need to turn off to get an acceptable experience, aside from 980 Ti users.

        • LovermanOwens
        • 5 years ago

        As a 980 Ti user, I can’t wait to enjoy all the extra eye candy. Mmmmmm.

          • DancinJack
          • 5 years ago

          hahah me too!

          I do think it sucks they’re confined to Nvidia hardware (unless you have a far more powerful Radeon), but that’s also one of the reasons I have a Geforce.
