13 Comments
skearns2 - Monday, February 2, 2009 - link
I have an XFX GeForce 8800 GS graphics card and a 24-inch ViewSonic monitor, a VX2435wm (native resolution 1920x1200). When I install the Nvidia driver, a major issue occurs: windows, programs, games, the start menu, taskbar, clock, everything opens off screen. It acts as if the monitor were bigger than it is. I have set the correct resolution through both the computer properties and the Nvidia control panel. Lowering the resolution does lower it, but everything is still off screen.
It's terrible because I cannot see the start menu, taskbar, or desktop icons; they are off screen. I have tried a 19-inch monitor and have no problems with it; everything is normal. It seems to be an interaction problem between the Nvidia driver and the 24-inch monitor, but the 24-inch is the only one I use... When I uninstall the Nvidia driver, things work fine on the 24-inch monitor, but slowly, and thus it's pointless. I even installed the original Nvidia CD driver that came with the card, and the same problem occurs: things open off screen. Also, I used to have an overclocked BFG 8800 GT AGP card, and when I first got the VX2435wm monitor I had this issue initially, but it somehow went away. Since I have used different drivers, I fear it may be the monitor itself. Please give me a solution to this problem; I imagine it is some sort of special driver I need.
I contacted ViewSonic, and they were not much help on this issue :(
Anyone know the solution? Thanks!
djtonium - Monday, August 30, 2004 - link
Eww... correction, it's 1152x648 (720p timings).

As for LCDs, anything other than the native resolution results in interpolation of the pixels. Worse yet, Sony's LCD TVs have a 1366x768 native resolution. At such a nonstandard resolution, the set interpolates everything.
djtonium - Monday, August 30, 2004 - link
#9 - The new ATI cards that support HDTV output use a custom connector that carries both S-video and HDTV output. Standard S-video has a 4-pin layout, while the extra pins on the video card's connector are meant for HDTV out. I love this idea, but it would be nice if the component video cables ATI supplied were thicker :-) I need to get away from my AIW card since I have no use for the extra bells 'n' whistles.

#10 - Thanks for the response, Andrew. If you want some screenshots of my 4:3 CRT TV (a good candidate for overscan testing) running at 640x432 (full screen) and 1152x768 (widescreen), I'll be more than happy to e-mail you photos.
AndrewKu - Sunday, August 29, 2004 - link
#7 - This article was written using the Catalyst 4.8 drivers.

#8 - This article was more about the out-of-box experience and the approaches taken by each company. Keep in mind that overscan margins vary from HDTV to HDTV.
We mentioned that there are ways to "unofficially" customize things in our Personal Cinema 5700 review. I hope that when I have a bit more free time, I can take a look at the games and share our experience with tweaking games for HDTV.
#9 - We were testing and explaining the approaches taken by the two companies. Technically, it doesn't matter which card you use. Overscan is going to vary more from HDTV set to HDTV set than from video card to video card. Drivers are the issue on the computer side.
And since the hardware supports HDTV, there are certain cases where you don't even need to use drivers to get HDTV output.
As for the weird output scenario, the difference is that most LCD monitors aren't in the same size class as plasma or HDTV sets.
magnusr - Sunday, August 29, 2004 - link
Why did they only test old cards? Why didn't they include the ATI X800 series and the Nvidia 6800 series?

My ATI X800 came with an S-VHS -> RGB output adapter (I think it's for HDTV).
I use the DVI out to my Optoma H56 XGA projector. It supports HDTV, and I seem to get a perfect picture when playing HDTV content. Do you mean that I am not getting HDTV out of my X800's DVI plug to my projector?
That sounds weird to me. It would be the same as connecting an LCD screen using a DVI cable...
Can't computer monitors show HDTV?
djtonium - Thursday, August 26, 2004 - link
I'm confused. I've had my AIW 9700 Pro for over a year, and I rarely (only a few times) have issues running games on my HDTV set.

AFAIK, it is extremely easy to run DirectX games on HDTV because DirectX easily supports modified resolutions (with driver support). None of the OpenGL games I have (like RTCW) fully support custom resolutions; at best, an attempt causes an OpenGL rendering error.
Just for information (in case someone needs it) for ATI owners with HDTV output:
Unreal Tournament 2k3/2k4: It's been a long time since I've done this, but assuming you know what resolution you were running before, you can edit the UT2k3/2k4 INI file (search for the resolution value, and it'll take you to the width/height section of the file). For 4:3 TVs, the lowest resolution you can run is 640x432 (480p). I recommend 864x648 (I forget the details on this one; I don't have a need for it) or 1152x648 (16:9) if the TV supports 720p/1080i.
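As a concrete sketch of the INI edit described above: in UT2004 the fullscreen resolution lives in UT2004.ini. The section and key names below are from memory and should be treated as an assumption; search your own file for the current width/height values rather than trusting this verbatim.

```ini
; UT2004.ini - hypothetical sketch of forcing a 720p-safe resolution
; (section/key names from memory; verify against your own INI)
[WinDrv.WindowsClient]
FullscreenViewportX=1152
FullscreenViewportY=648
```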
Max Payne 2: If you output to TV as your primary display, the game will display a list of supported resolutions - even custom ones defined by the ATI drivers! Again, you can run at 640x432, 864x648, and 1152x648.
Halo: It's been a while since I've messed with this as well, but you can edit the INI file and put in custom resolutions. Custom resolutions aren't listed in the game even when outputting to HDTV.
Doom 3: There's advice out there regarding console commands for custom resolutions. You'll need to add these to your autoexec.cfg file because custom resolutions won't be listed in the game.
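For what it's worth, the commonly circulated Doom 3 commands look like the sketch below. The cvar names are from memory, so double-check them against your version of the game before relying on this.

```cfg
// autoexec.cfg - hypothetical sketch of a custom 16:9 HDTV mode
seta r_mode "-1"            // -1 = use the custom width/height below
seta r_customWidth "1152"
seta r_customHeight "648"
seta r_aspectRatio "1"      // 1 = 16:9 in the retail release
```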
I'm happy ATI has integrated HDTV output into their video cards because it'll drive up sales significantly. Every time I play games or watch movies (which requires a DVD region-free program to bypass Macrovision; otherwise, TV output is disabled), I always use my HDTV.
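As an aside, the odd-looking resolutions in this comment (640x432, 864x648, 1152x648) fall out of a simple overscan calculation. The sketch below assumes a ~10% vertical overscan margin; that margin matches these numbers but is my assumption, not something taken from ATI's documentation.

```python
# Sketch: deriving overscan-safe game resolutions from the ATSC modes.
# Assumption: trim ~10% of the vertical resolution for overscan, then
# pick a width that matches the desired aspect ratio.

def safe_height(mode_height, margin=0.10):
    """Height left after trimming the assumed overscan margin."""
    return round(mode_height * (1 - margin))

def width_for(height, aspect_w, aspect_h):
    """Width matching the requested aspect ratio at this height."""
    return round(height * aspect_w / aspect_h)

print(safe_height(480))        # 432  (the height of the 640x432 mode)
print(safe_height(720))        # 648
print(width_for(648, 4, 3))    # 864  -> 864x648 (4:3 within 720p)
print(width_for(648, 16, 9))   # 1152 -> 1152x648 (16:9)
```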
Daleon - Thursday, August 26, 2004 - link
I don't get it. Was this article written before the 4.8s were released with HD support? If so, why not update it before posting?

Wrath0fb0b - Wednesday, August 25, 2004 - link
This entire setup is just a lie from ATI to sell more high-end video cards.

Hear me out.
I have an original AIW Radeon (7500 core) with DVI-I (DVI + VGA) out. Unfortunately, I quickly found out that when I plugged the DVI into my Toshiba HDTV (H83), the ATI driver was shutting off the DVI port on bootup (it would display during the BIOS and boot, but cut out before getting to the Windows login... it's the only explanation).
ATI told me that there is no way to force the DVI port open if the monitor doesn't provide a proper EDID, which is total BS because I was going to set the resolution in PowerStrip anyway.
After a bit of research, I bought the DVI Detective (google it), which fools the ATI driver into keeping the port open, and I now have a perfect image running in both 480p and 1080i.
It's a total ripoff that ATI would cripple their driver like that.
Glad I didn't buy a newer card just for HDTV; saved myself more than $100.
forcemac101 - Wednesday, August 25, 2004 - link
What I can't figure out is that when using S-video (standard 480i) I get very little overscan, yet I can change the resolution via the standard right-click -> Properties route. I can use anything from 640x480 up to 1024x768 pixel resolutions, yet the HDTV stays at 480i. Icons and text get bigger and smaller with the respective resolution, but the overscan doesn't change.

It's as if the card sees the pixel resolution but somehow overlays it on the 480i signal from the S-video. Totally transparent.
It seems to me that it shouldn't be that hard to "program" the component out (I use ATI) to a custom resolution that produces no overscan yet can fit any pixel resolution.
I probably don't make sense, but anyone with TV out and an HDTV, try it: use the S-video output, use ATI's TV-out tab in the driver to center the picture (eliminating the 480i overscan), then just change your regular pixel resolution. Games run fine at whatever resolution because the pixel resolution is translated to a 480i signal via the S-video.
PrinceGaz - Wednesday, August 25, 2004 - link
It should be a doddle for them to add those resolutions if they're already included in the Xbox versions; after all, some ports from the Xbox seem like little more than a quick recompilation of the source code, as they didn't bother improving controller options, etc.

In any case, it's no problem at all for any game developer to add custom resolutions if there were demand. I just wonder how many PC gamers really want to play games on an HDTV, or on an ordinary TV for that matter (there are a lot more of those about), unless they're using a gamepad while sitting on the sofa, as you can't really use a keyboard, mouse, or joystick very well there. In which case they may as well just use the Xbox or PS2 instead.
AndrewKu - Wednesday, August 25, 2004 - link
#2 - True. Hopefully, there will be more convergence in the spec.

aw - Wednesday, August 25, 2004 - link
Keep in mind that for HDTV gaming, consoles are actually ahead of computers. A lot of games on the Xbox and a few on the PS2 come in 480p/720p/1080i flavors, so they look great on an HDTV without any fiddling. Hopefully, computer game makers will start offering standard HDTV resolutions in all games soon... I have no idea how practical that is, but if consoles can do it, I assume PCs can too...

Questar - Wednesday, August 25, 2004 - link
"It is implemented deliberately on TV sets because of the different video input formats: composite, s-video, etc., all of which the TV needs for which to provide support. If overscan was not implemented as a factor of these different formats, there would likely be underscanning of different degrees on different TV sets. This is due to the different protocols and inherently different signals that the TV needs to handle."

Where do you come up with this crap?
Overscanning is to eliminate the black bars around the picture. This was done long, long before there was s-video, component inputs, etc.
The type of input has nothing to do with overscan.