Using monitors via USB: Pipe dream or reality?
— 1:34 PM on February 11, 2008


Last year, Samsung released a 19" LCD monitor that can be hooked up via a simple USB connection. To work its magic, the display uses an embedded video chip made by a company called DisplayLink that acts as a sort of self-contained graphics processor. That's neat, but what if you already have a spare monitor you'd like to use via USB? Is there a cheaper alternative to coughing up the $300 or so for a USB display? As it turns out, yes, there is. DisplayLink's video chip is also available in standalone USB-to-DVI adapters, which let you connect any monitor with a resolution of up to 1600 x 1200 via USB.

The prospect of connecting pretty much any display without an extra monitor output has obvious appeal, especially for laptops. USB has only a fraction of the bandwidth of the DVI interface, but DisplayLink nonetheless claims on its website that its chip is fast enough to allow "flawless DVD playback" and to enhance one's gaming experience in "most games." I was curious to put these claims to the test, so I got in touch with DisplayLink, and the company graciously offered to send me a sample USB-to-DVI adapter.
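To put that bandwidth gap in rough perspective, here's a quick back-of-the-envelope calculation comparing USB 2.0's raw signaling rate against the uncompressed pixel data rate of a 1600 x 1200 display. The figures are my own assumptions (60Hz refresh, 24-bit color), not numbers from DisplayLink, and real-world USB throughput is lower still once protocol overhead is accounted for:

```python
# Rough comparison: USB 2.0 bandwidth vs. raw pixel data for a 1600x1200 display.
# All figures are assumptions for illustration, not DisplayLink's numbers.

USB2_BPS = 480_000_000       # USB 2.0 signaling rate: 480 Mbit/s (less in practice)
WIDTH, HEIGHT = 1600, 1200   # maximum resolution the adapter supports
REFRESH_HZ = 60              # typical LCD refresh rate
BITS_PER_PIXEL = 24          # 24-bit color

raw_bps = WIDTH * HEIGHT * REFRESH_HZ * BITS_PER_PIXEL
print(f"Uncompressed pixel data: {raw_bps / 1e9:.2f} Gbit/s")   # ~2.76 Gbit/s
print(f"Ratio vs. USB 2.0:       {raw_bps / USB2_BPS:.1f}x")    # ~5.8x too much
```

In other words, the raw pixel stream is several times larger than USB 2.0 can carry, so the chip has to compress the image data on the host side before sending it over the wire, which helps explain the CPU overhead noted later in this article.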

DisplayLink's USB-to-DVI adapter.

You won't find this exact device anywhere commercially. Instead, you'll need to look for an adapter built by one of DisplayLink's partners, such as this Sewell model that's available for $149.95. The drivers are the same, though, and the hardware should be identical in both adapters.

Trying the adapter in Windows XP
Since DisplayLink's adapter ought to be most useful to laptop users, I first tried it on my now-aging IBM ThinkPad T41 (yes, it's so old it's not Lenovo-branded). The machine runs Windows XP Professional on a modest 1.7GHz Intel Pentium M processor coupled with a Mobility Radeon 9000 graphics processor and 512MB of RAM. Getting the adapter to work on this system was a breeze: I ran the installer, rebooted, and was good to go.

I tried three different LCD monitors—a 20.1" ViewSonic wide-screen display, a 19" LG wide-screen display, and a standard 17" LG—and they all worked flawlessly as soon as I connected them, just as they would through a DVI or VGA connection. DisplayLink lets you configure multi-monitor settings right from Windows' built-in display control panel, although the company's own tray application provides a quicker way to change some of the same parameters, like the extra display's resolution and behavior. Everyday desktop tasks ran smoothly, so I popped in my Raiders of the Lost Ark DVD to test full-screen movie playback. The DVD played smoothly on the 17" display at 1280 x 1024, but playback on the 20.1" model pegged my laptop's CPU at 100% and caused dropped frames. Normally, CPU utilization while playing a DVD on the machine's 1400 x 1050 internal display is quite low, so there's evidently some overhead involved with the DisplayLink adapter.

Moving to Windows Vista
Now eager to test DisplayLink's gaming claim on a worthy machine, I installed the adapter on my main desktop PC, which runs Windows Vista Home Premium and in whose entrails tick a Core 2 Duo E6400 processor, a GeForce 7900 GTO graphics card, and 2GB of DDR2-667 RAM. Unfortunately, this is where the magic wore off. DisplayLink's drivers installed and behaved as diligently on Vista as they did on XP, but performance was noticeably worse right off the bat—even on the 17" display. Windows moved choppily, scrolling in Firefox on any site with Flash ads caused very noticeable slowdowns, and all videos dropped frames. Turning off Vista's Aero interface helped performance, but it caused Windows Media Player to fail to render anything on the DisplayLink device. VideoLAN and Media Player Classic weren't any help, either.

DisplayLink's adapter driving my ViewSonic VX2025wm in Vista. The other display is connected via DVI.

I was still determined to at least try gaming, so I kicked off a game of Team Fortress 2. As expected, the game was unplayable even with the resolution turned down and Aero disabled. Puzzled by this mediocre display performance, I dropped DisplayLink a line to ask if what I was experiencing was normal. After a few days, the company's PR representative got back to me with the following information:

- DisplayLink performance in gaming environments is much better in resolutions less than 1024 x 768. This will be improving in the spring timeframe with the next major update to the software.
- The primary focus of the technology until now is office applications. Your readers should know that this is not yet optimized for gaming apps, but that the company is working on that. The most appropriate use - for now - is chat screens or strategy screens.

So much for enhancing your gaming experience. The next morning, I was awakened by a phone call from one of DisplayLink's UK-based staffers, who quickly informed me that my GeForce 7900 GTO graphics card was to blame for the issues I had encountered. GeForce 7900- and 7300-series models suffer from a "CPU copy bug," he said, and other GPUs should perform "ten times faster." However, he went on to say that even moving windows around in Vista is supposed to be "generally a little slower" than in XP. After probing around a little more, I got the impression that Vista performance is still very much a work in progress. I wanted to give DisplayLink the benefit of the doubt, though, so I took out my 7900 GTO and popped in my old GeForce 6800 GS in order to give the adapter another go.

Despite DisplayLink's assurances, performance with the 6800 GS was very similar to what I had witnessed on the 7900 GTO. Moving windows around was admittedly a little smoother, and videos dropped fewer frames, but there was still noticeable choppiness everywhere. Even scrolling TR's front page was slow and sent my CPU usage through the roof. Someone using the adapter to do office work on a 17" or 19" display probably wouldn't notice those types of slowdowns too much, but anyone else most likely would.