My Samsung SyncMaster 215TW is a fabulous display that has proven challenging to configure under Ubuntu. At the very top of the list is the DVI connector’s inability to do anything at all: even when the computer starts in text mode, there is nothing onscreen. After plugging in a Windows laptop produced the same blank screen, the only conclusion seemed to be that DVI is effectively unsupported on this display.

What’s wrong with this picture? Like just about everything with computers, my monitor stopped working after something changed. After using an extremely low-end ATI video board for many years, I decided to upgrade to a mid-range NVidia card with excellent 3D acceleration support and dual DVI outputs. Imagine the possibilities!

At this point, imagine if DVI actually worked. The ATI Radeon’s DVI connection drove the Samsung display without complaint; the upgrade to NVidia is what broke it. So, rather than running a direct DVI feed, I am relegated to a VGA converter hanging off the card’s DVI output. The analog connection still delivers the full 1680×1050 resolution, but I can’t help feeling shortchanged by the loss of a digital signal.

From what I have read, there are two schools of thought on why my DVI connection no longer works.

One possibility is that the pixel clock rate presented by the video adapter exceeds the maximum pixel clock rate supported by the 215TW. That rate is a function of the total resolution (including blanking intervals) and the refresh rate of the screen. In practical terms, a single-link DVI cable tops out at a 165 MHz pixel clock; at 60 Hz that is about 2.75 million pixels per frame, which for an 8:5 widescreen like this one works out to a maximum resolution of 2098×1311, so 1680×1050 is well within the boundary. On top of all this, I know the screen can work over DVI, since my ATI Radeon was able to generate output.
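For the curious, the arithmetic is simple enough to sanity-check. Below is a minimal Python sketch; the h_total and v_total figures are the standard CVT blanking totals for 1680×1050 at 60 Hz (assumed here, not read from my monitor’s EDID):

    # Pixel clock = total pixels per frame (including blanking) * refresh rate.
    # 2240 and 1089 are the CVT horizontal/vertical totals for 1680x1050 @ 60 Hz.
    h_total, v_total, refresh_hz = 2240, 1089, 60
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    print(f"pixel clock: {pixel_clock_mhz:.2f} MHz")              # ~146 MHz
    print(f"fits single-link DVI (165 MHz)? {pixel_clock_mhz <= 165.0}")

At roughly 146 MHz, the mode sits comfortably under the 165 MHz single-link ceiling, which is why the pixel clock theory never fully convinced me.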

The other possibility, and this seems more likely, is that the HDCP (High-bandwidth Digital Content Protection) features of the Samsung display are somehow incompatible with the NVidia card’s implementation. What could be wrong?
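If the driver is rejecting the mode during its validation pass (the reason is usually logged in /var/log/Xorg.0.log), one experiment is to relax that validation in the Device section of xorg.conf. The ModeValidation tokens below are documented in the NVidia proprietary driver’s README; whether they help with an HDCP handshake problem is an open question, so treat this as a sketch to try rather than a known fix:

    Section "Device"
        Identifier "NVidia Card"
        Driver     "nvidia"
        # Skip the driver's maximum pixel clock check and accept
        # modes that do not appear in the monitor's EDID.
        Option     "ModeValidation" "NoMaxPClkCheck, AllowNonEdidModes"
    EndSection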

Let me relate another DVI pitfall from about a year ago. While on vacation, I discovered that our hotel room had a TV with a DVI plug on the back. I happened to have a laptop computer with some movies on it, along with some family photos we were going to show to some friends. How convenient – I thought – the widescreen TV would be a perfect venue for that.

This is when I discovered that DVI is not a unified standard per se; there are actually a number of variants, including DVI-I, DVI-D and DVI-A among others. DVI-A carries only an analog signal, DVI-D only a digital one, and DVI-I carries both. So unlike VGA or the old RGB standard, a DVI cable is not necessarily a DVI cable for the device you are trying to connect it to: the connectors are keyed differently, and the extra analog pins on a DVI-I plug will not physically fit a DVI-D socket. In this particular case, my cable was DVI-I and the TV would only accept DVI-D.

While it is an excellent format when it works, DVI is clearly not for everyone, and in my case it hasn’t worked well at all. At this point, my Samsung is fed from the card’s DVI port by way of a VGA converter. If I ever choose to upgrade my Samsung, my next display must support some non-DRM-restricted output mechanism – VGA would be fine.