USB connectors appeared on computer monitors as far back as the CRT era. They played a secondary role, though, usually serving the monitor’s integrated USB hub. That hub was often passive, with no extra power, and had nothing to do with the monitor’s own electronics; you might as well buy a separate hub and place it next to the monitor. Monitors that could be configured from the PC were an example of deeper USB integration, but they soon disappeared with the arrival of DDC/CI (Display Data Channel / Command Interface), which allows controlling a monitor in software without an extra USB connection.
However, there is no fundamental obstacle to transferring video over USB. True, the bandwidth of USB 2.0 (480Mbps) is only sufficient for uncompressed video at a resolution of 640x480 and a frame rate of 60Hz, but why not compress the video stream? Remote-control programs have long been able to transfer a PC’s image not only over Ethernet but even over ordinary modems. Granted, they can only transfer elements of the standard Windows GUI (because what is actually transferred is not the image itself but the GDI commands that construct it), so watching movies or playing games this way is out of the question, but it proves that graphical information can be carried across low-speed links.
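The arithmetic behind that 640x480 figure is easy to check. A minimal sketch (the function name and 24-bit color assumption are ours, not from the article):

```python
# Raw bandwidth needed for uncompressed video at a given display mode.
def raw_mbps(width, height, refresh_hz, bits_per_pixel=24):
    """Megabits per second for an uncompressed video stream."""
    return width * height * bits_per_pixel * refresh_hz / 1e6

USB2_MBPS = 480  # theoretical USB 2.0 high-speed signaling rate

print(raw_mbps(640, 480, 60))    # ~442 Mbps: just squeezes into USB 2.0
print(raw_mbps(1280, 1024, 60))  # ~1887 Mbps: far beyond USB 2.0
```

Note that 480Mbps is the raw signaling rate; real-world USB 2.0 throughput is lower still, which makes the case for compression even stronger.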
So the problem is not the bandwidth of the USB interface but the lack of a chip that can decode a video stream compressed tightly enough to fit into the available bandwidth.
DisplayLink, formerly Newnham Research, began developing such chips some time ago. It currently offers two of them, the DL-120 and DL-160:
These chips receive a compressed video stream over USB, Ethernet or WiMedia (a wireless data-transfer standard with a throughput of up to 480Mbps), decompress it into video at resolutions up to 1400x1050 (DL-120) or 1600x1200 (DL-160) in full 24-bit color, and output it in RGB (for an analog connection to a monitor) or LVDS (for a digital connection) format.
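These numbers imply a substantial compression ratio. A rough estimate, assuming a 60Hz refresh rate (the article does not state one) and the nominal 480Mbps link rate:

```python
# How much the stream must be compressed to fit the link.
def required_ratio(width, height, refresh_hz, link_mbps, bits_per_pixel=24):
    """Minimum compression ratio for the mode to fit into link_mbps."""
    raw = width * height * bits_per_pixel * refresh_hz / 1e6
    return raw / link_mbps

# DL-160's maximum resolution over USB 2.0 at an assumed 60Hz:
print(required_ratio(1600, 1200, 60, 480))  # roughly 5.8x at the raw link rate
```

Since practical USB 2.0 throughput is well below 480Mbps, the real ratio the chip must sustain is higher still.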
The chips use lossless compression, which is to be expected. Ordinary video codecs such as MPEG-1 or MPEG-2 could be used too, but they only work well for movies. Such lossy algorithms would turn OS and application interfaces, with all their thin sharp lines, into a jumble of pixels.
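Desktop content also compresses well losslessly because it contains long runs of identical pixels. DisplayLink does not disclose its algorithm, but a simple run-length encoder illustrates the principle (this sketch is purely illustrative, not the chip's actual method):

```python
def rle_encode(pixels):
    """Run-length encode a row of pixel values, losslessly."""
    runs = []
    for p in pixels:
        if runs and runs[-1][1] == p:
            runs[-1][0] += 1  # extend the current run
        else:
            runs.append([1, p])  # start a new run
    return runs

def rle_decode(runs):
    """Reverse the encoding exactly; no information is lost."""
    out = []
    for count, p in runs:
        out.extend([p] * count)
    return out

# A typical UI scanline: white background crossed by a thin black line.
row = [0xFFFFFF] * 100 + [0x000000] * 2 + [0xFFFFFF] * 100
encoded = rle_encode(row)
assert rle_decode(encoded) == row  # perfectly reversible
print(len(encoded))                # 3 runs describe all 202 pixels
```

A lossy codec would smear that two-pixel black line; the lossless scheme reproduces it exactly, which is precisely what interface graphics demand.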
The developer’s website offers no information about the compression algorithm employed, so we can only guess at the bandwidth it requires and other technicalities. Let’s turn instead to a practical implementation of the concept: the first monitor with an integrated DisplayLink chip, Samsung’s SyncMaster 940UX.