So, I got a really good deal on a used 61" plasma TV (Sony KDE61XBR950) off of Craigslist for $450, in amazing shape with no burn-in. I currently have two monitors hooked up to my Radeon 6850. The first is a 23" DVI LCD; the second is a 19" VGA LCD that needs a VGA-to-DVI adapter. Apparently only ONE of the two DVI ports on the back of a 6850 (and, from what I understand, the entire 6xxx series) supports VGA over DVI. You simply can't use the other one; it won't even detect the monitor. The one that works is the bottom one.
Now, the TV has a DVI-D port I wanted to use, so I bought an HDMI cable for the HDMI port on the back of the 6850 plus an HDMI-to-DVI adapter. To my dismay, I couldn't have the HDMI port active while the top DVI port was active, and from what I've read they share bandwidth with each other. That means YOU CANNOT have an HDMI device and a DVI device plugged in at the same time as a VGA device.
...so I went and bought a DisplayPort-to-HDMI adapter, which would then be converted to DVI-D at the end (something I hadn't intended). This didn't work either. The top row of ports share bandwidth with each other for some obscure reason, and you NEED to purchase an active DisplayPort-to-HDMI adapter in order to get any of those ports to work alongside two DVI inputs on the other two ports.
Now, what I question, after reading the DisplayPort article on Wikipedia, is why I can't use a passive adapter. I'm not entirely sure how an active adapter magically enables DVI signals to be sent to a single-link device. The TV only requires a single-link DVI connection, which DisplayPort can provide. (Nobody even makes active HDMI adapters, so I don't know how you'd physically run two DVI devices and an HDMI device off those ports, a limitation shared by the 5xxx series as well.)
According to the article I should only need a passive adapter to make it work, yet AMD requires you to use an active adapter... a requirement that seems like complete hogwash. They even have an entire page dedicated to it, in which they tell you that only two "legacy" devices work at one time. What exactly qualifies a device as legacy? Apparently all the ports are legacy unless you buy an overpriced adapter to make them work. You can plug in a maximum of two legacy devices, and then you need to start buying expensive adapters. Of course, this fails to mention that it's also impossible to run an HDMI device and two DVI devices off those same ports without adapters at all.
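As best I can tell, the rule boils down to this: the card only has two clock sources for "legacy" (TMDS/VGA-style) outputs, native DisplayPort doesn't need one, and a passive DP adapter just makes the DP port behave like another legacy output while an active adapter does the conversion on its own clock. Here's a toy sketch of that mental model (mine, not anything official from AMD; all the names are made up):

```python
# Toy model of the Radeon 5xxx/6xxx output limit as I understand it:
# at most two "legacy" outputs may be active at once, because they
# compete for the GPU's two legacy clock sources. Native DisplayPort
# (including active adapters, which generate their own clock) is exempt.

LEGACY = {"vga", "dvi", "hdmi", "dp_passive_adapter"}
NATIVE = {"dp", "dp_active_adapter"}

def config_supported(outputs):
    """Return True if this list of active outputs fits the two-legacy-clock limit."""
    unknown = [o for o in outputs if o not in LEGACY | NATIVE]
    if unknown:
        raise ValueError(f"unknown output type(s): {unknown}")
    legacy_count = sum(1 for o in outputs if o in LEGACY)
    return legacy_count <= 2

# My setup: two DVI monitors plus the TV over HDMI-to-DVI -- three legacy clocks.
print(config_supported(["dvi", "dvi", "hdmi"]))               # False
# Swap the TV to an active DisplayPort adapter and it fits.
print(config_supported(["dvi", "dvi", "dp_active_adapter"]))  # True
```

Which, if that model is right, explains the observed behavior, even if it doesn't make the $30 adapter sting any less.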
It's strange, really. After all the testing tech websites have done on this, even TR had their own Eyefinity setup (hell, I even read the article), they failed to mention that you need to start buying $30 adapters to add extra monitors once you reach a certain point. Interesting in itself: if AMD went to such lengths to assist with those setups, could they have been trying to mask this?
It really is, quite honestly, a huge chore, and mine is a typical setup for most people. Most users, I'd guess, are running a cheap VGA or DVI monitor (used or procured somewhere) as their secondary and a good DVI LCD as their primary, which is exactly what I'm doing. And, exactly as I'm doing, many of them will be adding a TV through HDMI or HDMI-to-DVI, since DisplayPort isn't nearly as widely adopted as either DVI or HDMI. Coincidentally, that's precisely the combination that can't be done.
I'm rather pissed off, as I'm going on about three weeks of shipping items around only to find out I've once again got the wrong item. I wish TR would've mentioned something about this.
Page I talked about for reference: http://support.amd.com/us/eyefinity/Pag ... ngles.aspx