Personal computing discussed
derFunkenstein wrote:If you're interested in even one of those games, I think the Radeon may be the way to go. The more of that bundle you actually want to play, the better the deal is for you. The 7950 is a fast card - way better than the 500-series card you initially picked, and also faster than a vanilla GTX 660. The 660 Ti is its closest competitor performance-wise, though relative performance between AMD and Nvidia varies widely depending on the games you play. At 1080p, both cards are more than enough.
Also, AMD's drivers have a reputation for being buggy and unstable, and their GPUs have compatibility problems with some games.
yogibbear wrote:I doubt the mobo lets you use the CPU's onboard GPU (i.e. the HD 4000) while a dedicated card is plugged in, i.e. there is ZERO reason to install that VGA_64 driver.
Just go to nvidia.com and install their drivers for the 670. The default resolution when you first build a PC is usually bottom of the barrel, so if the picture looks fuzzy, just right-click the desktop and raise the resolution to 1080p in the display settings area of Control Panel.
Why have you built an $1100 rig without an SSD? Seriously, go buy a 256 GB Samsung 830 in the Black Friday sales right now! (If you can't afford one, opt for a sub-$80 128 GB 830.)
Sound over HDMI to a TV is a PAIN IN THE ASS with some older GPUs, though I was pretty sure both Nvidia and AMD had sorted those problems out. Are you plugging it into an LED LCD TV? They typically have two different types of HDMI-in ports: one accepts PC video plus sound, while the other will only accept a VGA-quality picture over HDMI with no sound. Check your TV's manual if the ports aren't labelled - you want the "active" HDMI one, or whatever it's called. Then I think you need to go into the sound properties and, in the Playback tab, set the default device to whatever the Nvidia High Definition Audio device is called, instead of the (usually default) onboard mobo sound, to get audio to transmit over the GPU's HDMI output.
QuantumInteger wrote, in reply to yogibbear:
I do have an SSD. It's only a 64 GB Crucial m4, so I'm only installing essential applications on it, like my video editing software, Photoshop, and my drivers. Everything else, like Steam games, goes on the 1 TB Toshiba drive. Both are plugged into SATA 6 Gb/s ports on my motherboard. I don't do sound over HDMI because I have a speaker that plugs in using conventional jacks. I'm not an audiophile, and the sound quality and clarity it gives me is enough, so I disabled sound over DVI and routed it to the case's jack instead. If I had sent the sound over DVI to my LED TV, it would have come out of the TV's speakers instead - not ideal, since they're subpar, and if I then plugged my speaker into the TV, the setup would be the same as it is now.
I did HDMI out to the same TV with the same speaker and DVI cable setup before, from a Mid-2011 MacBook Pro, and the picture quality was phenomenal. Granted, I bought a cheap Mini DisplayPort-to-DVI adapter that didn't carry sound (I think), but the picture was good and I plugged my speakers into the MacBook Pro for audio. The MacBook Pro had Intel HD 4000 graphics and it provided really good visual quality. The GTX's DVI output is a bit fussier, and the glare/contrast of the screen bothers my eyes compared to my old setup, which had more vibrant colors. I'm new to computer building, so I'm just wondering if I messed anything up along the way. Also, the rig makes a single beep after it passes the BIOS - not sure if that's any cause for concern.
QuantumInteger wrote:Thanks for your reply. I finally figured out what was going on. I have a DVI-to-HDMI cable, and my GPU has two DVI ports, DVI-I and DVI-D. I plugged my connector into DVI-I, assuming that since it supports both analog and digital, I wouldn't have any issues. What followed was washed-out colors, black text with a bluish hue, incorrect color schemes, and uncalibrated gamma and saturation. After switching to DVI-D, my picture improved drastically. Now I'm having a hard time juggling the two different drives. I wish there were a way to make the SSD and HDD function as one drive, but RAID isn't practical across an SSD and an HDD. Anybody got any general suggestions, tips, etc.?
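Short of RAID, one common workaround for the "function as one drive" problem is a directory junction: keep the bulky game library physically on the HDD, but link it into a path on the SSD so everything appears under a single tree. On Windows the native tool is mklink /J; the sketch below is a hypothetical layout (made-up temp paths, not your actual Steam folders) that uses Python's os.symlink to show the same idea in a cross-platform way:

```python
import os
import tempfile

# Hypothetical layout for illustration: the "HDD" holds the real data,
# the "SSD" path just contains a link to it.
root = tempfile.mkdtemp()
hdd_games = os.path.join(root, "hdd", "SteamApps")   # real folder on the big drive
ssd_link = os.path.join(root, "ssd", "steamapps")    # link inside the SSD's tree
os.makedirs(hdd_games)
os.makedirs(os.path.dirname(ssd_link))

# On Windows you would run:  mklink /J "C:\...\steamapps" "D:\SteamApps"
# os.symlink is the cross-platform stand-in for this sketch.
os.symlink(hdd_games, ssd_link, target_is_directory=True)

# Anything written "to the SSD path" actually lands on the HDD.
with open(os.path.join(ssd_link, "game.dat"), "w") as f:
    f.write("data")

mirrored = os.path.exists(os.path.join(hdd_games, "game.dat"))
print(mirrored)  # True - both paths point at the same data
```

The payoff is that applications only ever see the SSD path, while the space is consumed on the HDD; no RAID, no reinstalling.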