I got my hands on DOOM 3 today, and I've discovered a few helpful tweaks for the game.
Careful with the dual displays

Now, I'm not sure if this problem is something everyone already knows about, but in the middle of tweaking my system for DOOM 3, I did something I hadn't done for quite a while: put an NVIDIA graphics card in it. (Yes, folks, the Radeon 9800 Pro 256MB card in there was running a little slower than I wanted, so I thought I'd go for the latest. I put in a GeForce 6800 Ultra OC.) I was shocked to see performance drop massively with the move to the NVIDIA card. The thing would run OK in some places, but in others, it was literally a slideshow and wholly unplayable. After lots of checking settings, updating chipset drivers, and futzing about, I figured out the problem: I have two monitors on my system. Turning off the output for the second monitor (by unchecking "Extend my Windows desktop onto this monitor") immediately fixed the problem.
Several readers have also suggested a more elegant fix where you don't have to turn off your second monitor. You can go to the "Advanced" config menu for the NVIDIA driver, go to the "Performance" section, and check the "Show advanced settings" box. Scroll down to "Hardware acceleration" and pick "Single display mode." That should do the trick. If you get an all-white screen (or any single-color screen) when you start up DOOM 3, though, you have more work to do. Kill DOOM 3 in the Task Manager. Then switch the monitor inputs. The monitor showing DOOM 3 must be both the "primary monitor" in Windows and "monitor 1" on the graphics card. I went through this exact process myself, and it worked for me. (Thanks to Shane and AOEU for the tips.)
Some readers have suggested that ATI cards show similar performance problems with dual monitors enabled, which could explain my initial reaction to performance on the Radeon 9800 Pro. If you have an ATI card, you might try turning off the second monitor, as well.
Turn on vertical refresh sync

When you get into the game, the first thing you're going to want to do is enable vertical refresh sync, or vsync. Without vsync enabled (and it's disabled by default), you'll see lots of tearing (a sort of screen fragmentation) in DOOM 3. In my experience, setting vsync to "always on" in the ATI and NVIDIA drivers doesn't help. The game's default setting overrides the drivers, no matter what.
To turn on vsync, use the Advanced Options menu. Alternatively, you can hit ctrl-alt-~ to bring down a game console. At the console, type "r_swapinterval 1". Then hit ~ to close the console. You should be all set, and the tearing should be gone. You can thank me later.
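If you'd rather not retype the command every session, DOOM 3 reads an autoexec.cfg from its "base" folder at startup, so you can make the setting stick there. A minimal sketch (the install path is just an example for a default installation):

```
// autoexec.cfg -- place in DOOM 3's "base" folder
// (e.g. C:\Program Files\Doom 3\base on a default install)
seta r_swapinterval "1"   // enable vsync; eliminates tearing
```

The "seta" form writes the value as an archived cvar, so it survives restarts.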
Tweak the gamma

If the game is too dark for you and cranking up the brightness slider doesn't suffice, you might try playing with the gamma setting from the console. Again, pull down a console with ctrl-alt-~. Then type "r_gamma 1.2" and see what you think. The game's default is "r_gamma 1", and I've found that something between 1.2 and 1.4 works for me. Any more than that, though, and you really risk destroying the realistic lighting and the look the game's designers intended, so be careful with this setting.
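Like the vsync tweak, this one can be made persistent via an autoexec.cfg in the game's "base" folder rather than typed at the console each time. A sketch, assuming a default install:

```
// autoexec.cfg -- DOOM 3 reads this from the "base" folder at startup
seta r_gamma "1.2"   // default is 1; much above 1.4 washes out the lighting
```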
A note about antialiasing

You may have read at HardOCP and elsewhere that antialiasing isn't necessary in this game, and that seems to be a near-universal opinion amongst those who have seen it. I don't totally disagree. In DOOM 3, object edges often have jaggies, but interior edges usually do not. Of course, the antialiasing used in most graphics cards today is multisampling, which typically only affects object boundaries. That's why you'll see annoying jaggies on interior edges in games like Far Cry, even with AA enabled. DOOM 3 doesn't have that problem, and I believe it may be because id's pixel shader programs do some antialiasing of their own, at least in the game's "high quality" mode. So leave AA turned off, if you must. DOOM seems to kill interior edge jaggies anyhow.
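For the record, multisampling can also be controlled from the game's own console rather than the driver control panel. I believe the relevant cvar is r_multiSamples (you can confirm the name with the console's tab auto-complete before trusting it):

```
seta r_multiSamples "0"   // 0 disables multisample AA; 2, 4, etc. enable it
```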
Push your system

The game's built-in routine for detecting the best settings for your system appears to be very conservative. For instance, it chose "medium quality" for a GeForce 6800 on a test rig, while the GF6800 will run the game just fine at "high quality" at 1152x864 resolution. I suggest moving up to the "high quality" settings if your graphics card has 256MB of memory, regardless of what it picks. Also, the auto-detect routine picks some very low resolutions: 640x480 for medium quality, and 800x600 for high quality. You may be able to get away with much higher screen resolutions than what the game has picked. I'd suggest cranking it up to 1024x768 or higher to see how it runs.
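The resolution can also be pushed past the menu's presets from the console. DOOM 3 selects resolution via an r_mode index, and it supports an explicit custom width and height when r_mode is set to -1. A sketch; treat the specifics as assumptions and verify the cvar values with the console's auto-complete on your own system:

```
// Option 1: pick a preset resolution by index
seta r_mode "5"            // on my system this corresponds to 1024x768

// Option 2: force an arbitrary resolution
seta r_mode "-1"           // -1 tells the engine to use the custom values below
seta r_customWidth "1152"
seta r_customHeight "864"
```

Restart the game (or do a vid_restart from the console) after changing these for them to take effect.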
That's all for now. If you have other suggestions, post 'em below.