Original Link: https://www.anandtech.com/show/1449
ATI and NVIDIA: Quick Look at HDTV Overscan Compensation
by Andrew Ku on August 25, 2004 12:00 PM EST
It has been a while since ATI released their HDTV dongles, which provided HDTV output support for most of their Radeon line. In fact, we probably first experimented with their HDTV dongle back in July or August of 2002. Back then, HDTV output support was plagued by the overscan issue.
And for those of you unfamiliar with "overscan", it is simply the part of the picture that is cropped off at the edges. Depending on whom you ask, it is also described as the space that bleeds or "scans" beyond the edges of the visible area of the screen. Typical televisions can lose up to 20% of the image to cropping, and this lost portion is what is commonly known as overscan. Technically speaking, the "lost" picture information is not actually lost; it simply falls outside the visible area of your TV screen. A similar situation on the computer side is viewing a picture at 100% scaling on a monitor set to a lower resolution than the picture, i.e. a 1600 x 1200 picture on a 1280 x 1024 desktop. The difference is that on a computer, you can move the picture around to see the portions cut off by the visible area of the monitor.
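To put those numbers in perspective, here is a quick back-of-the-envelope sketch (our simplification, not a broadcast standard) of how much of a frame survives when each edge is cropped by the same percentage; at 5% off every edge, roughly 19% of the total image is lost, in line with the up-to-20% figure above.

```python
# Hypothetical sketch: visible resolution after each edge of the
# frame is cropped by the same percentage (our simplification).
def visible_area(width, height, crop_pct_per_edge):
    """Return (visible_width, visible_height) after cropping."""
    scale = 1 - 2 * crop_pct_per_edge / 100   # two edges per axis
    return int(width * scale), int(height * scale)

w, h = visible_area(1280, 720, 5)             # 5% off every edge
lost = 1 - (w * h) / (1280 * 720)
print(w, h, f"{lost:.0%} of the image lost")  # 1152 648 19% of the image lost
```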
We should clarify that overscan is not necessarily a bad thing. It is implemented deliberately on TV sets because of the different video input formats - composite, s-video, etc. - all of which the TV needs to support. If overscan were not implemented to account for these different formats, there would likely be underscanning of varying degrees on different TV sets, due to the different protocols and inherently different signals that the TV needs to handle. (Underscanning is the opposite of overscanning: the image is smaller than the area on which it is being displayed.) It would be tantamount to zooming out when you look at a picture, though in the case of TV sets, the space that doesn't reach the edges of the display would be black. The deliberate use of overscanning allows the screen to be completely filled, as opposed to underscanning, which could leave varying degrees of underscanned margins.
The reason why we notice overscan more on a computer is that we already know what the signal is supposed to look like: where the start menu button should be, where the clock should be, where the corners of the screen should be in relation to our desktop shortcuts. A DVD played on a DTV or even a regular TV usually encounters some measure of overscan, though we hardly notice it because we aren't used to its native signal. One way that DVD player or TV manufacturers compensate for this is to provide a zoom-out function, where you essentially tell the system to underscan. This is also why we go crazy when we notice overscan from an Xbox rather than from a DVD signal: we know what the game menu is supposed to look like.
In theory, if an HDTV were designed only for HDTV signals, there would be no overscan from component computer video output. The main issue is that DTVs are built to handle more than just DTV signals. They accept many legacy sources: camcorders, s-video, composite, etc. All of this means that there must be cross-format support, and the only way for that to occur is to either overscan or underscan. Underscanning would be more frustrating to the consumer, since the signal would be smaller than the displayed area, with black bars surrounding the image. Overscan ensures that the video signal always fills the screen, though this gets to be increasingly frustrating when you hook up an Xbox or output video from your computer and the signal is overscanned.
And as Keith Rochford (Chief Engineer of eVGA) explained, when you switch to a DTV, you are talking about a high resolution display, and mapping a pixel-based technology back to a line-scan technology isn't a simple task. This backwards conversion is what leads to the large 10% to 15% overscan margins that we have become accustomed to when outputting from a computer to a DTV. For those who own something like a plasma display that accepts direct computer video input via VGA or DVI, this obviously isn't an issue, since no backwards conversion is needed. The display essentially behaves like a really big computer monitor, since it keeps the video card's native output.
In the most practical sense, overscan is something you don't want to have, or at least want to minimize. Using your HDTV set as a substitute for your monitor can be awesome, but having part of the picture cropped gets to be a major deterrent, especially when you want to play games, surf the web, or watch videos on that nice big screen.
There are more than just ATI and NVIDIA cards on the market, but most of us are still going to end up with one or the other, in which case you are most likely going to get some degree of overscan. Keep in mind that we can't track down every or even most DTV sets and check the degree of overscan, and even if we could, overscan varies between TV sets because of the manufacturer's design, which has no bearing on the video card. For these practical reasons, we are going to focus primarily on how ATI and NVIDIA approach HDTV overscan compensation.
Video Cards - HDTV support
At the moment, ATI provides HDTV output support via HDTV blocks and dongles:
- VGA to Component HDTV block for the Radeon 8500 and All-in-Wonder 9600 series
- DVI to Component HDTV block for the Radeon 9800, Radeon 9700, Radeon 9600, and Radeon 9500 series as well as the All-in-Wonder 8500/8500DV
- Those who own an AIW 9800 or AIW 9700 don't need a dongle, since a separate YPrPb component video adapter connects to the video out port.
NVIDIA plans to provide HDTV output via DVI with its upcoming drivers (ForceWare 65.xx). The issue remains that some HDTV sets only accept component input, which is why it would be nice for NVIDIA to produce a dongle or some other option for those who need it.
ATI - Custom Timings
For those who need a history lesson, overscan compensation (for video cards) has always come by way of newer, better drivers, which is why we are using the Catalyst 4.7 drivers (the latest from ATI at the moment). Before ATI implemented its current method of overscan compensation, it came out with "optimized" HDTV settings, which brought the default output from 10% to 15% overscan margins down to 3% to 5% margins (these are average numbers provided by ATI). The optimized HDTV settings arrived a few months after ATI released the component-to-VGA/DVI blocks, in response to the relatively large amount of overscanning, and were meant as an intermediary solution until ATI could develop something more concrete.

Since we are only dealing with how overscan compensation is addressed, we are going to test a single scenario for both NVIDIA and ATI cards: a 1280 x 720 desktop resolution under 720p HDTV output. All other scenarios for HDTV output, whether on an NVIDIA or ATI card, follow the same principles employed in the driver.
1280 x 720 desktop resolution @ 720p output
The picture shown above is the standard output when you just turn on HDTV output via the display page within the ATI control panel. Notice how the picture is partially cropped, and thus, there is some overscanning. While our InFocus X1 gets a smaller margin of overscanning than many of the other DTV sets out there, we still get a noticeable degree of it.
As we mentioned, ATI's overscan compensation method is to simply create a custom timing based on the desktop resolution currently in use; in this case, it would be 1280 x 720. This is done in the advanced subpage of the YPbPr properties page within the ATI control panel.
When you hit "add" on the advanced page, the DTV will display an image similar to the one seen below. The idea is that the grey space represents the image output from the computer, while the blue space is the area that the projector/DTV will display. Using the arrows, you change the size of the grey space so that it is as large as possible while still contained within the blue space. Red arrows enlarge the image, and blue arrows shrink it.
Our custom timing based on the 1280 x 720 resolution ended up being 1248 x 704, which ATI programs into the available screen resolution track within Windows' display properties. Simply changing the slider puts the desktop into the "customized timing" resolution.
Looking at the DTV, we get an image that doesn't encounter the initial overscan problem. While it is hard to see, the projector shows a small area of black bars on each side of the displayed image, because there was minimal blue space left when we created the 1248 x 704 custom timing, as shown in the third-to-last picture. We tried to show the amount of underscanning by marking the margins that the projector actually displays.
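For a rough sense of what ATI's approach costs, a quick calculation (our own arithmetic, not an ATI figure) shows how many desktop pixels the 1248 x 704 custom timing gives up relative to native 720p:

```python
# Our back-of-the-envelope check, not an ATI figure: pixel cost of
# running the 1248x704 custom timing instead of native 1280x720.
native = 1280 * 720   # 921,600 pixels
custom = 1248 * 704   # 878,592 pixels
loss_pct = (1 - custom / native) * 100
print(f"Custom timing gives up {loss_pct:.1f}% of the native pixels")  # ~4.7%
```

In other words, the custom timing trades under 5% of the desktop area for a picture whose edges are all actually visible.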
NVIDIA - Underscan
We went over NVIDIA's overscan compensation method, called "overscan shift", in our eVGA GeForce FX 5700 Personal Cinema review, but hopefully, this gives you a clearer sense of what is going on. We used the ForceWare 56.72 drivers and simply followed the instructions under "configuring HDTV" in the release notes to set up NVIDIA's overscan compensation.
1280 x 720 desktop resolution @ 720p output
As we mentioned in a previous review, overscan (as seen in the above) seems to be limited to only three sides of the signal: left, right, and bottom. The upper portion of the image seems to encounter no degree of overscanning.
There are technically two overscan compensation methods that NVIDIA implements. The first is underscanning, where NVIDIA outputs the picture at a lower resolution so that the TV displays everything. The problem with underscanning is that you can get a lot of wasted space.
It is a fairly simple process, as you just move the track slider on the "change resolution" page of NVIDIA's display control panel. In this case, we are outputting 1088 x 612 in a 720p environment, which runs natively at 1280 x 720. Basically, you are looking at a 1088 x 612 image on a 1280 x 720 screen.
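The wasted space is easy to quantify. By our own arithmetic (not an NVIDIA figure), the 1088 x 612 underscanned desktop covers only about 72% of the 720p raster:

```python
# Our estimate, not NVIDIA's: fraction of the 1280x720 raster that
# the underscanned 1088x612 desktop actually fills.
screen  = 1280 * 720   # 921,600 pixels
desktop = 1088 * 612   # 665,856 pixels
used_pct = desktop / screen * 100
print(f"Desktop covers {used_pct:.0f}% of the screen; the rest is black")
```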
As you can see, this accomplishes the goal of having zero overscan, but in many cases, a significant portion of the screen isn't being utilized by the computer's HDTV output, which brings us to overscan shift...
NVIDIA - Overscan Shift
Overscan shift is similar to ATI's custom timings, but with a bit of a twist. While NVIDIA adds additional margins to compensate for the overscanning, it is implemented in the same way that current monitors react when they have to deal with a resolution outside their native support range.

Enabling this second method of overscan compensation is as simple as using the track slider on the "device adjustments" page of NVIDIA's display control panel. The more you slide the track to the right, the more black space NVIDIA adds to "shift" the overscan over, allowing you to see the entire image. The benefit is that you don't modify the native output of the computer, but the drawback is that you can't see everything at once.
As you move the cursor to the edge of the displayed image on the HDTV, the image will "shift" so that you see the rest of the image. Ideally, the overscan shift should be set to the lowest possible value, but at least enough so that there is sufficient "shift" available to see the overscanned area.
In the picture above, we set the overscan shift value to its highest possible position and moved the cursor so that you can see the largest amount of shift that the driver supports. The red lines marked in the picture indicate the area displayed by our projector.
Final Words
Quite frankly, neither method, in our opinion, is necessarily better or worse than the other. They both do the overscan compensation job. Granted, NVIDIA's overscan shift doesn't allow you to see everything at once and is dependent on cursor position, while ATI's method manipulates the timings for a specific resolution optimized for a specific HDTV resolution. On the other hand, NVIDIA's solution is much easier to set up and interact with at the settings level.

However, we should note that neither solution provides a decent way to fix the overscan problem as it relates to gaming. Either way you go, you won't be able to play games locally or online comfortably, because you either can't see the whole screen or the image output is too small, since games run at predetermined resolutions (i.e. running Jedi Academy at 1024 x 768 for a 720p output gets overscan).
We should note that overscan in games relates to ATI's HDTV output. With NVIDIA, there is no overscan, but you get really poor resolution handling: 1024 x 768 in Jedi Academy under a 720p output produces an image that takes up less than half of the screen compared to desktop mode (enlarging the overall image size doesn't change the issue). This doesn't help to put NVIDIA at the head of the pack. If you are outputting to an HDTV for other purposes, such as PVR use or web browsing, this isn't so much of an issue, since you are dealing with the overscan of the desktop resolution rather than that of the program. Basically, when it comes to gaming, we are going to recommend using native computer resolution output rather than toying around for hours with HDTV output settings that you can only somewhat tolerate.
If you are shopping around for a video card that can do HDTV output, ATI provides more purchasing options and is obviously the cheapest way to go should you already own a supported card, but their HDTV output support doesn't provide the fix that everyone is going to be happy with. If you can wait a bit, NVIDIA is supposed to support HDTV output via DVI in ForceWare 65.xx, which won't cost a dime (minus internet associated costs) to download and try out if you already own a GeForce FX or 6800 card. Meanwhile, ATI has no comment on future driver support for HDTV output via DVI.
What NVIDIA does need is a component output dongle or some sort of a separate attachment to provide a more comprehensive HDTV output support for its various cards on the market. Until that happens, eVGA's GeForce FX 5700 Personal Cinema is going to have to do, which isn't saying much considering that it is NVIDIA's fastest multimedia card that does HDTV output via component cables.