39 Comments

  • darklight0tr - Monday, April 9, 2007 - link

    On Page 4 it says that Screen Scaling doesn't work when using DVI, but that is incorrect. I have the 2707WFP, and the Screen Scaling menu isn't enabled in DVI UNTIL you switch the screen resolution to one that doesn't use the 16:10 aspect, such as 1600x1200.

    It confused me at first, but I guess it doesn't make sense to enable that menu if you are already running the monitor at 1920x1200.
  • JarredWalton - Tuesday, April 10, 2007 - link

    My screen scaling menu is definitely disabled with a DVI connection. I've tried several resolutions (1680x1050 and 1600x1200 for sure) and it never became adjustable. That said, the LCD is revision A00, and I know on the 2407WFP this was apparently something they addressed in a later revision of the firmware; perhaps they will do the same again. I felt this OSD was a step back relative to the latest 2407WFP firmware options, which is odd as it should be essentially the same menu. If they "fixed" something with Rev. A03 on the 24" model, why "unfix" it for A00 of the 27"? Oh well... I just report on what I have. :)
  • Amuro - Saturday, April 14, 2007 - link

    Did you use an NVIDIA 8 series video card for the review? There are scaling issues with Forceware drivers later than 97.XX that force adapter scaling at all times. If you try to change the scaling to something else in the NVIDIA control panel, it will automatically go back to adapter scaling after you click Apply. The scaling menu on the LCD is disabled because the video card automatically stretches everything to 1920x1200. I have a Rev A00 2707WFP and I'm able to use monitor scaling before loading into Windows, and I'm also able to do 1:1 scaling when using it with my PS3 at 1080p via DVI.
  • AnnonymousCoward - Friday, April 6, 2007 - link

    The article said "Furthermore, while nearly everyone will agree that running your LCD at its native resolution is the best solution, gaming on a 30" LCD at 2560x1600 requires some serious graphics horsepower." Well, there is a second native res that requires much less horsepower: 1280x800.

    I'm not sure how accurate your pixel pitch chart is, because if you go to tvcalculator.com, the results are different. Use that site to compare a 30", 16:10, 2560x1600 versus a 27", 16:10, 1920x1200 --> 10126/7030 gives 1.44x the pixel density on the 30". Using your pitch numbers (converting distance per pixel to pixels per distance): 1/0.250mm = 4 pixels/mm, 1/0.303mm = 3.3 pixels/mm, and 4/3.3 = 1.21x the pixel density on the 30". I'd trust the physical size per resolution before I'd trust some published pixel pitch numbers (a quick calculation at the end of this comment works it out from the physical size).

    btw, if you're comparing CRTs on the site, don't forget to subtract an inch from the CRT's quoted size, since LCDs are measured by viewable inches.
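
    For reference, here's a rough check of those pitch numbers worked out from the diagonal and resolution alone (a quick sketch; the only inputs are the panel sizes discussed above, everything else is just geometry):

        # Sketch: derive pixel pitch and pixel density from a panel's
        # diagonal size and native resolution.
        import math

        def panel_stats(diagonal_in, h_px, v_px):
            diag_px = math.hypot(h_px, v_px)   # pixels along the diagonal
            ppi = diag_px / diagonal_in        # pixels per inch (linear)
            pitch_mm = 25.4 / ppi              # pixel pitch in mm
            density = ppi ** 2                 # pixels per square inch (area)
            return pitch_mm, ppi, density

        for name, d, h, v in [("30in 2560x1600", 30, 2560, 1600),
                              ("27in 1920x1200", 27, 1920, 1200)]:
            pitch, ppi, dens = panel_stats(d, h, v)
            print(f"{name}: {pitch:.3f} mm pitch, {ppi:.1f} ppi, {dens:.0f} px/sq.in")
        # 30in 2560x1600: 0.252 mm pitch, 100.6 ppi, ~10126 px/sq.in
        # 27in 1920x1200: 0.303 mm pitch, 83.9 ppi,  ~7032 px/sq.in

    The 1.44x from tvcalculator.com is a ratio of pixels per square inch (area), while the 1.21x from the pitch numbers is a linear ratio; 1.2 squared is about 1.44, so the two figures actually agree.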
  • JarredWalton - Friday, April 6, 2007 - link

    The table is composed of approximate values. The main point was that various sizes and resolutions will have a larger/smaller pixel pitch. So if you find a 19" 1280x1024 to be "good" and 17" to be "too small", you might like a pixel pitch similar to that of the 19". The HDTV values in particular could be way off, since I couldn't find any reasonable sources.
  • AnnonymousCoward - Friday, April 6, 2007 - link

    Yeah... and does pixel pitch mean the distance between the center points of adjacent pixels? If so, and if the size of the pixels varies per screen, then the pitch isn't the whole story, since the gap between pixels probably also varies.

    The tvcalculator.com site was useful for me before I bought a 30". I had a 21" CRT at my disposal, and it turns out that its pixel density at 1600x1200 is nearly identical to the 30" at 2560x1600.
  • TA152H - Thursday, April 5, 2007 - link

    Jarred,

    Do you guys have any intention of reviewing Eizo monitors sometime in the future? I'm not knocking the inferior stuff like Dell, Viewsonic, etc..., and I do understand that the masses buy this stuff in droves and the price makes it much more accessible. But it wouldn't hurt to review the top quality stuff either, even if most people can't afford it. After all, you guys review $1000 processors and high end graphics stuff. It's like a car magazine reviewing a Porsche: most of its readers can't afford it, but it still makes for interesting reading. We might end up with a Volkswagen (or Dell, or Viewsonic, etc...), but it does get readers. Since they sell so few monitors, I'm guessing they would love the exposure and you wouldn't have too hard of a time getting one. It can't hurt to ask. I bought Nanao/Eizo until they stopped making CRTs, and they were much better than the other rubbish sold (and twice as expensive :P), and I think a review would be informative and open a few eyes (no pun intended).

    Also, are you guys done reviewing CRTs? I finally bought an LCD about four months ago because I kept reading how much better they were. Not only did I hate it, but I almost hit my cat with it (by accident) when I threw it out the window. I know they take some getting used to, but they have so many compromises compared to CRTs that I had to destroy it (NewEgg won't take them back). I'm not alone in this either, although I think we're a minority. Still, some reviews of the new Samsung monitors would be nice, and maybe the aforementioned Eizos are decent enough LCD monitors that they don't have to factor flight attributes into the design for when someone throws them out the window. It's a pity too; they weigh so little and take up so little desk space in comparison. I just can't stand the poor quality of the display. Why do they make them so bright anyway? Even turned down they can be really tough to look at.

    Also, if you do review CRTs, you might want to skip Viewsonic. They've managed to combine the worst of both worlds; their CRTs have dead pixels. I have no idea how this is possible, but they managed to do it, and not only on the one I bought. It's apparently a common problem. I have to give them credit for having the talent to create such a thing, though; I didn't even know it was possible. Probably the shadow mask isn't transparent in that spot? I have no idea, really.
  • TA152H - Thursday, April 5, 2007 - link

    Hmmm, I just went to Samsung's web site and I don't even see their CRTs listed anymore. So maybe CRTs are finally dead, unless you want dead pixels. Oh brother, what a choice.
  • JarredWalton - Thursday, April 5, 2007 - link

    I think we are generally at the point now where CRTs truly are dead in terms of reviewing units. If I had a good late-model CRT available for comparison purposes, I would have included it in the test results already. Unfortunately, the best CRT that I had (note the use of past tense) was a 19" NEC (FE-991SB). I honestly don't even want to bother with CRTs anymore. LCDs certainly aren't perfect, but I will take the sharp image quality (no need to stretch/rotate/center your image like you have to do with CRTs) and the smaller size any day given a choice between CRT and LCD.

    I realize there are still CRT holdouts, and I do miss the higher refresh rates that CRTs offered, but hopefully we will see improvements in that area of LCDs in the not-too-distant future. If you already have a good CRT, you can probably continue to use that for a while longer. If on the other hand you are looking to go out and purchase a new display, I doubt many people would be interested in the current CRT offerings.
  • TA152H - Thursday, April 5, 2007 - link

    Actually, I did some research after I wrote that, and Viewsonic, Philips, and CTX are still making them.

    Do you actually prefer the image on an LCD? I can't imagine anyone does, but I suppose some people do. It is probably your poor vision (no offense), because I could see each pixel very clearly on my LCD and it's just not as lush a picture (incidentally, my vision sucks too, but only at long distance; short distance I see all too well). Also, the viewing angles aren't quite right. The top looks lighter than the bottom, for example, unless you move your head up. It's altogether an extremely flawed technology, and it hasn't really gotten much better in 20 years. I know the numbers look better, but the old ThinkPads I had at IBM didn't look that much worse than these monitors, and in 20 years they should have. I'm not saying that if you put them side by side you couldn't tell the difference, you probably could, and they are a lot bigger now, but they aren't nearly as improved as I have seen claimed. For that matter, neither are CRTs. My old Princeton Ultrasync had a fantastic picture, but it was only 12 inches.

    The most painful part of chucking the LCD was giving up the small footprint and low weight. CRTs are awful in terms of creating heat too. In the summer, I want to chuck them out the window, but perversely their weight saves them from this form of destruction. (Even the illustrious dead-pixel Viewsonic was discarded at a dump, although kicked and sledgehammered rather than chucked out the window.) The image quality is so hard to ignore though. For a TV, where size matters a lot and you generally watch from a distance, I think LCDs are great. But being able to see each pixel, the terrible brightness, the poor viewing angles, the terrible ghosting, the weird bright green pixel that won't take no for an answer, and the poor color saturation leave me cold. But the handwriting is on the wall, although you'll probably have one company making CRTs for a long, long time, because even if 5% of people can't stand LCDs, it's still a huge market. I think the number is higher than that too. Luckily, the Trinitron is dead. Another really badly flawed technology that people bought a lot of. How could people stand those lines or the asymmetric nature of the Trinitron? I never understood it. I'm equally baffled now by LCDs. Incidentally, the one I tossed was a ViewSonic VG2230wm. It was well reviewed, but it was a huge disappointment compared to a CRT.
  • JarredWalton - Thursday, April 5, 2007 - link

    And here I preferred aperture grille over invar shadow mask in the CRT days. Again, it was the colors and overall brightness. Most CRTs look "fuzzy" to me, and I like the sharpness of LCDs. You look at black text on a white background and you can clearly see that every pixel is a direct mapping of the digital information. With CRTs, you can get artifacts due to stretching and centering, not to mention rotation and pincushion... that stuff always irritated me. So yeah, I really do prefer LCDs.

    As for the "bright top and darker bottom", I don't notice that at all, even on 30" displays. It's probably there to a certain extent, but it's nowhere near as bad as any of the older TN+film panels (which are still common on notebooks). Turn down the brightness to a more moderate level, and everything is great for me. I truly do prefer the image quality on my 30" (or 24") LCD over any CRT I've used.

    The final issue is that many (most? all?) developers are now using LCDs as well. I remember thinking Doom 3 was way too dark to the point of being almost unplayable. Then I got my LCD (a 2405FPW at the time) and suddenly the whole game changed. It still had flaws, but darkness was no longer one of them. I think id must have been using LCDs, so they never realized what the game would look like on a typical CRT. (Of course some CRTs are brighter than others; mine was not one of those.)

    I can understand that some still prefer CRTs, but I'm definitely not one of them. Given the choice, I would always take a good LCD over a good CRT these days. I just hope stuff like OLED and some of the other technologies in development can make it out sooner rather than later. Give us a nice 120 Hz refresh rate with a bit faster response times, and I'd be very happy indeed.
  • TA152H - Thursday, April 5, 2007 - link

    Jarred,

    Even with the brightness turned down all the way, I found the LCD gave me headaches. That's why it became one with the air, and then the ground. I couldn't get used to it. I think people's vision has a lot to do with it. I remember when IBM came out with the 8514/A and we got them, I raised Hell because I got headaches from it (it was interlaced). They thought I was whining, but I could see the flicker really vividly and it gave me the worst headaches (incidentally, I just bought one on eBay for $5 :P). LCDs are not only too bright, even turned down all the way, but it's also like looking through a screen door at a picture, because I can very easily see the boundaries between pixels. It's not as smooth or rich as a CRT.

    I don't play shoot 'em ups anymore; they bore me terribly. I prefer strategy games, and even games like Civilization. I guess LCDs have become the norm for shoot 'em ups now, which is kind of funny because they are so poor at them due to slow response times. But developers have to target what people have, and LCDs are definitely more common. Why they are so bright is a mystery to me though. Maybe because they are so poor with respect to viewing angles and saturation, they try to make them extra bright to compensate.

    I can see the differences in brightness so easily on LCDs that it is surprising you don't have a problem with it. I think some of it is what you are used to. If I had used these for years, my eyes would have adjusted and I wouldn't really see it. Remember the first time you saw a flat-screen CRT? It seemed screwed up because your brain had adjusted to the curved screens. A lot of stuff we adapt to without even knowing it. Again, I do think we all see differently, and it's not just a matter of opinion; it's got a lot to do with how we actually see. I know that's true with respect to refresh rates, because I'd ask people to look at stuff and they didn't see the flicker (normally it's easier to catch out of the corner of your eyes, probably because we adapted to see movement well in peripheral vision so we could notice things sneaking up on us).

    The thing about CRTs I can't stand is the moire problems. CRTs seem to be extremely accurate in most dimensions at 800x600, but when you start going higher there seems to be more distortion. And the size and weight. Ugggh. The mainstream CRTs are pretty foul too, but luckily I loaded up on the Eizos and they are much, much better than the Viewsonic, Sony, Dell, etc... rubbish. I was considering buying an Eizo LCD, but I don't know if all LCDs suck this badly, or if it's just the mainstream rubbish again. Eizo has a way of changing the rules, at least they did with CRTs. That's why I'd love to see a review of them, because at this point they are the only ones I'd consider. I guess I could put it on a server if I didn't like it, so I wouldn't have to look at it for long. But they are so expensive, I'm a little concerned I'd be throwing good money after bad, because the technology is so weak even they can't get it to work. Dell, et al, obviously aren't going to put out a very good product since the cost would preclude them selling enough. But Nanao, well, they might. Any chance of you reviewing one?
  • JarredWalton - Friday, April 6, 2007 - link

    I can ask them and see if they're interested in sending one for testing. Honestly, though, I'm not sure my review would be of any use to you. I'm already happy with current LCDs, and you're not. This is one of those cases where you might need to see about testing one in person to see if the results are acceptable or not.

    I've always had a problem with low refresh rates on CRTs as well - at anything less than 85 Hz I could detect flicker, and while 75 Hz was tolerable, 60 Hz displays gave me headaches. That's one of the things I like about LCDs: the backlight is always on, and rather than having a beam scanning across the monitor at 60 Hz, the light is evenly distributed and there's no chance for a pixel to start to darken and then light up again.

    I do know what you mean about adapting, though. The first time I used a 30" LCD I was like, "WTF!? It's too big!" Now I'm quite happy with the size and it no longer bothers me at all. When I go back to a 24" it seems small, relatively speaking. Heh.
  • TA152H - Friday, April 6, 2007 - link

    Jarred,

    It might be helpful. I want to like LCDs, mainly because of the footprint and the power/heat issues with CRTs. But I couldn't stand the one I had. I might have gotten used to its flaws and then hated the CRT's flaws, but the brightness was such that I got really painful headaches and couldn't spend enough time in front of it to adjust.

    So, if you get an Eizo and your review says it's much better than existing LCDs, it would at least be useful. On the other hand, if you say it's nice but pretty much the same, well, I'd probably pass and wait another couple of years before looking at one. Also, I think it would be very, very interesting for your readers, and probably even for you, to have an absolutely top quality display to review. It would be informative, because I think a lot of people don't even know they exist and think these mainstream brands offer the best displays. Who knows, maybe they do, and a review might point that out. If they are willing to send you one, and I'd have to think they are crazy if they don't (because they need the visibility since they are very poorly known), I think it would be a really interesting review for all those reasons, and not just to me.
  • strikeback03 - Friday, April 6, 2007 - link

    Have you tried looking at a good LCD in a store? Most of the commonly available ones are cheap and not very good - they show the contrast gradient top-to-bottom you referred to. IIRC the review of the HP said they accept them back for pretty much any reason within the return period, so you could toss it back in the box instead of out a window if you don't like it. Just be sure to get a decent one, not a cheapie.

    I too would be interested in seeing the results of a test on an Eizo or an NEC W-series. I'd like to see how the calibration results of what are pretty much considered the top professional displays compare to the mainstream/gaming displays.
  • LoneWolf15 - Thursday, April 5, 2007 - link

    Having purchased the 2407WFP in mid-November at under $600 (not including the tax and the price for extending out the warranty to 5 years), the price on the 27" seems kind of steep. Maybe if Dell had upped the ante by adding HDMI, or a second set of component ins, it would put it over the top.

    I'm also a little disappointed that their card readers don't support xD Picture Card format, which is what my Fuji FinePix camera uses. Minor nitpick, but one that shouldn't be all that hard to fix.
  • anandtech02148 - Wednesday, April 4, 2007 - link

    The Gateway is spec'd at 125 watts and this one is 95 watts... that's an efficient build for a 27in LCD.
  • JarredWalton - Wednesday, April 4, 2007 - link

    The Gateway lists 125W maximum I think, so typical power use is probably less. The 95W for the 2707WFP is after calibration (i.e. with lowered brightness levels).
  • anandtech02148 - Wednesday, April 4, 2007 - link

    Is it exciting to review all these fresh new monitors, or do you get sore eyes after all the calibration work?
    Again, thanks for the calibrated profiles you upload.

  • JarredWalton - Wednesday, April 4, 2007 - link

    Speaking of which, if anyone is interested, here are the 2707WFP profiles used in this article: http://images.anandtech.com/reviews/monitor/2007/d... (2707WFP Profiles.rar). Standard "your LCD is not the same as the tested LCD" disclaimer applies.

    As for sore eyes, no, that's not a problem. The calibration isn't all that difficult to perform in most cases. It's the writing, graph generation, and photo editing that takes the most effort.
  • AnnonymousCoward - Friday, April 6, 2007 - link

    Slightly off topic, but what's the easiest way to get color profiles to apply in games, and not just Windows?
  • JarredWalton - Friday, April 6, 2007 - link

    If you set a color profile, it applies to everything but overlay. So games automatically use it, AFAIK. It's only video content that has problems.
  • AnnonymousCoward - Friday, April 6, 2007 - link

    You're probably right, since I tried changing the color profile to make everything hot pink, and the game also looked that way.

    Whenever Windows is booting up, the desktop first looks slightly lighter, and after a second it seems like the color profile kicks in. When I run the game Dark Messiah, right before the screen switches to the game, the desktop switches back to that lighter appearance, so it doesn't look like it's using the profile. I've also seen a few sites indicate that profiles don't apply to games: http://www.hex2bit.com/products/product_mcw.asp says "...to prevent other programs from changing the color profile Windows uses. This is especially important to gamers as most games will change the color profile Windows uses." and in http://www.hardforum.com/showthread.php?t=1064124&... someone said "Also, that color profile won't effect videos, games, or your mouse cursor. I calibrated through my spyder2..."
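
    If it helps explain what you're seeing: as I understand it, on Windows the "profile kicking in" is a calibration loader writing the profile's tone curves into the video card's gamma ramp, and games with their own brightness/gamma sliders typically overwrite that ramp when they start, which would match the desktop snapping back to the lighter look. Here's a minimal sketch (assuming a Windows box with Python available; it only uses standard ctypes calls) that reads the ramp currently loaded, so you can check whether your profile is actually active:

        import ctypes
        from ctypes import wintypes

        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32
        user32.GetDC.restype = wintypes.HDC           # avoid handle truncation on 64-bit
        user32.ReleaseDC.argtypes = [wintypes.HWND, wintypes.HDC]

        Ramp = ctypes.c_ushort * (3 * 256)            # 256 16-bit entries each for R, G, B

        hdc = user32.GetDC(None)                      # device context for the whole screen
        ramp = Ramp()
        if gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
            # An uncalibrated ramp is linear (0, 257, 514, ... 65535); a loaded
            # calibration profile shows up here as a bent curve.
            print("first red entries:", list(ramp[0:4]))
        user32.ReleaseDC(None, hdc)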
  • sm8000 - Wednesday, April 4, 2007 - link

    "single-link with a very limiting 1280x800 resolution"

    Isn't single link's max res 1920x1200? I'm pretty sure it is. Is the article saying dual link panels by design won't display more than 1280x800 on single link?
  • JarredWalton - Wednesday, April 4, 2007 - link

    Right. There are no scaler ICs for 2560x1600 right now, but apparently the panels can manage a simple doubling of resolution. If you use a 30" LCD with a single-link DVI connection, it will only support up to 1280x800. In the case of the HP LP3065, any other resolution ends up being garbled (i.e. the BIOS, POST, and boot sequence are illegible). Within Windows, you can change the resolution and apparently the GPU will handle the scaling, but outside of Windows you're basically out of luck unless you're running 1280x800.
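
    The rough bandwidth math works out like this (a back-of-the-envelope sketch, assuming roughly 10% blanking overhead; exact CVT timings differ a bit):

        # Approximate pixel clock needed for a given mode at 60 Hz.
        # Single-link DVI tops out at a 165 MHz pixel clock.
        def approx_pixel_clock_mhz(h, v, refresh=60, blanking_overhead=1.10):
            return h * v * refresh * blanking_overhead / 1e6

        for mode in [(1920, 1200), (2560, 1600), (1280, 800)]:
            print(mode, f"~{approx_pixel_clock_mhz(*mode):.0f} MHz")
        # (1920, 1200) ~152 MHz -- fits under single-link's 165 MHz limit
        # (2560, 1600) ~270 MHz -- needs dual-link
        # (1280, 800)   ~68 MHz -- exactly half of 2560x1600 in each dimension,
        #                          so the panel can double every pixel without
        #                          needing a scaler chip

    So yes, single-link can carry 1920x1200 in general; it's the scaler-less 30" panels that restrict single-link input to 1280x800, since that's the only mode they can map to the panel by simple pixel doubling.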
  • jc44 - Wednesday, April 4, 2007 - link

    I feel the need to take issue with the assumption in the article that a denser pixel pitch must lead to smaller text. OK - that certainly happens by default, but it is possible to increase the number of dpi that Windows associates with a monitor, and that should increase the size of the displayed text (some example numbers in the P.S. at the end of this comment). I'll admit that support is somewhat patchy, with web pages being amongst the greatest offenders - but in general it works.

    Personally I'm a dpi junkie and normally use a 204dpi monitor which can lead to somewhat interesting results on applications & web pages that are convinced that all monitors in the world run at 96dpi!

    These days you don't need to spend a lot on a graphics card to get a dual-link DVI connector - I'm not sure where the bottom of the range is, but an NVIDIA 7600 costs less than £100 and can be found with one dual-link plus one single-link DVI connector.

    JC
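
    P.S. To put some numbers on the dpi point (a tiny sketch; the font size and dpi values are just examples, using the standard points-to-pixels formula):

        # How tall a 10 pt font is in pixels at different Windows DPI settings.
        # pixels = points * dpi / 72
        def font_height_px(points, dpi):
            return points * dpi / 72

        for dpi in (96, 120, 204):
            print(f"10 pt at {dpi} dpi -> {font_height_px(10, dpi):.0f} px")
        # 96 dpi  -> ~13 px
        # 120 dpi -> ~17 px
        # 204 dpi -> ~28 px   (apps that hard-code 96 dpi keep drawing ~13 px
        #                      text, which is why they look tiny on dense panels)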
  • JarredWalton - Wednesday, April 4, 2007 - link

    Adjusting DPI is certainly possible, and I believe this is one of the areas that Vista is supposed to be a lot better than XP. (Anyone able to confirm that?) However, my personal experience with modifying the DPI has been less than stellar. I usually end up just increasing the font size in Firefox, using the magnification in Word, etc. There are plenty of other applications that have no respect for the Windows DPI setting.
  • nullpointerus - Wednesday, April 4, 2007 - link

    Vista is definitely better than XP in this regard, but there are still many areas that could use some polish. For example, Vista still appears to use tiny bitmapped icons, which do not scale very well on the high-dpi title bar and task bar. Moreover, many third-party applications and even many Microsoft applications still have icons and images that scale horribly without the standard 96-dpi setting.

    Nonetheless, font-handling and layout for non-Aero-native applications has improved dramatically since the early Vista RC1 release; instead of merely upscaling the fonts and controls into a blurry mess, the layout engine does proper spacing and the font engine draws crisp, high resolution fonts. Visual Studio 2005 shows *major* progress in this regard.

    For anyone interested in getting a higher density display and using the Vista DPI setting, I definitely recommend trying it first. You could enable 120 dpi on your old monitor and stand back an extra foot or so to mimic the effect of a lower pixel pitch. Or get a friend to do this if you do not have Vista on your own computer.
  • strikeback03 - Wednesday, April 4, 2007 - link

    I always reduce the size of my Windows icons anyway. They are huge in the stock setting.

    On a related note, does anyone know how to change desktop icon size and spacing in Gnome/Ubuntu? Do you need a whole new theme? Icons for mounted drives are way large.
  • JarredWalton - Wednesday, April 4, 2007 - link

    A perfect example of stuff that doesn't look right with a higher DPI setting is anything that uses a bitmap. All of the icons at 120dpi tend to look like crud in XP. There are just far too many areas of Windows and the applications that run on it that are built around pixel sizes, so changing DPI settings only sort of affects them.

    Anyway, the point isn't whether or not higher DPI is good or bad. You like it, others don't. That's the main idea behind that introduction: an explanation of why higher pixel pitch can be a good thing. I really do have poor vision (an irregular astigmatism that can't be corrected without a retina transplant, so I live with slight double vision). I find many of the high DPI screens to be undesirable, although I do like higher resolutions for image work.
  • kalrith - Wednesday, April 4, 2007 - link

    Since we're discussing pixel pitch and poor eyesight, I thought I'd mention that one of my coworkers has such poor vision that he's using a 21" LCD at 800x600 resolution and thinks it's "just right".

    Also, out of the 10 19" LCDs we have, only one person runs hers at the native res. Everyone else uses 1024x768.
  • LoneWolf15 - Thursday, April 5, 2007 - link

    This is one reason why I "downgraded" (the rest of the specs are similar, other than that I also shaved 2 pounds of weight) from a laptop with a 15" 1600x1200 UXGA display to a 14" 1024x768 XGA display. At 15", picture detail was incredible, but text for web browsing was giving me sore eyes and headaches. I wouldn't mind having 1280x1024 at 14" or 15", but since I'm not paying for it, beggars can't be choosers.

    It's also why I returned my Dell 2007WFP and exchanged it for a 2407WFP. Higher resolution, but larger pixel pitch as well.
  • kmmatney - Wednesday, April 4, 2007 - link

    I'm another person who likes big pixels. Work tried to give me a 17" LCD, but I would have none of that. I then tried a 21" Samsung at 1600x1200, but it was still too small. Now I have a 20" LCD running native at 1400x1050 and it's really nice. I have a laptop with small pixels that I use when I travel, but I'm much more productive when I can see everything clearly.

    I would love to have this display, but it really needs to come down in price.
  • strikeback03 - Wednesday, April 4, 2007 - link

    my vision is awful uncorrected - way beyond not being able to see the big "E". But since I'm always wearing glasses or contacts anyway I like high-DPI displays. Love my thinkpad with the SXGA 15" display. The UXGA 15" would probably be hard to read though.

    My boss has a ~20" CRT that he runs at either 800x600 or 1024x768.
  • jc44 - Wednesday, April 4, 2007 - link

    OK - I admit it - I'm stunned. With the exception of your colleague with the poor eyesight I find it hard to conceive how anyone would prefer (presumably) a slightly fuzzy (due to scaling artifacts) 1024x768 to a sharp 1280x1024 on a 19" LCD. I could simply not put enough information on the screen to be able to do my job at that resolution without resorting to a lot of printouts.

    Well horses for courses I guess - thanks

    JC
  • xsilver - Friday, April 6, 2007 - link

    lol - the number of people that have their LCD monitors set to non-native resolutions is insanely funny.
    But even more insanely funny is how many people say they can't see anything wrong with the scaling artifacts and fuzziness.

    I haven't done much (any) testing on this in gaming though - is the distortion just as bad in games when running a non-native res? Getting a 20" LCD or above these days pretty much requires buying a high-end graphics card if you want to do any gaming at native res.

    I still prefer CRT at the moment, but I realize it's inevitable that I'll have to make the switch, and I need to figure out some options.
  • Tommyguns - Wednesday, April 4, 2007 - link

    19" Viewsonic lcd here. you guessed it. 1024x768 and it suits me just fine. not that i have bad eyes at age 22 or anything, i just like being able to clearly see everything. I game hard as well and it works out just fine. i do have it in clone mode going to an aux 17inch crt thats about 20 feet away. higher res. is nice, but i prefer big letters, with out the squints sometimes.

    It would be nice to know what counts as an average GPU for driving these larger LCDs. Average wasn't always a super high-end 8xxx series card.
