78 Comments

  • BoFox - Wednesday, March 31, 2010 - link

    If I wanted huge screen real estate, I'd definitely go for a 1080p projector that can do anywhere from 100" to 20'. Of course, a top-of-the-line one would cost upwards of $10000, but a really nice one would only be a bit over $1000. Give me this over "jail" bars of bezels anytime!

    I'm a bit puzzled as to why ATI is doing a 2GB version to counter the GTX 480, and not a slightly faster version. Right now is AMD/ATI's real chance to grab the bull by the horns with a death grip. By all means they should release a 950-1000MHz version of the 5870, named the 5890! Even if the power consumption is 25-50W higher, it would still be considerably lower than the GTX 480's, while actually pwning it in nearly all game benchmarks. Even better would be to release a 512-bit version, just like they did 4 generations ago with the HD 2900XT. With up to 100% greater memory bandwidth, there would be roughly 20% more performance at a 1000MHz core clock across all benchmarks, if not more.

    I say this with all sincerity: if AMD does not truly seize the moment and grab the bull by the horns with a death grip, it will regret it for a long time, if not forever.



  • bunnyfubbles - Wednesday, March 31, 2010 - link

    Why not go with 3 cheaper projectors and use them with Eyefinity? One of the oft-neglected advantages of Eyefinity is that a properly supported game can actually provide a player with a FOV advantage - they can see more of the game world than other players without distorting the image (rough numbers in the sketch at the end of this post).

    This was never a counter to the GTX 480; the E6 edition card had been planned long before we knew anything concrete about Fermi. And considering the benches, it's quite obvious that 2GB is not needed for today's games. If ATI were going to introduce a counter to Fermi it would simply be a higher-clocked 5870, but even that's not necessary save for bragging rights.

    And a 512bit memory interface is the last thing I'd expect. It's actually bizarre you bring up the HD 2900XT as if it was something ATI should look back on for inspiration. If anything the HD 2900XT was ATI's own GTX480 debacle.
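
    Expanding on the FOV point above: in a Hor+ game the vertical FOV stays fixed and the horizontal FOV grows with the aspect ratio, which is where the "see more of the world" advantage comes from. A minimal sketch of the math (the 60-degree vertical FOV is just an assumed example):

    ```python
    import math

    def horizontal_fov(vertical_fov_deg, width, height):
        """Hor+ scaling: horizontal FOV derived from a fixed vertical FOV."""
        v = math.radians(vertical_fov_deg)
        return math.degrees(2 * math.atan(math.tan(v / 2) * (width / height)))

    # One 1920x1080 monitor vs. a 3x1 Eyefinity group at 5760x1080.
    print(horizontal_fov(60, 1920, 1080))   # ~91.5 degrees
    print(horizontal_fov(60, 5760, 1080))   # ~144 degrees
    ```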
  • 1reader - Wednesday, March 31, 2010 - link

    I hadn't thought about it that way, but the 2900XT situation was very similar to nVidia's GTX 480 situation now. Like you said, definitely something ATI doesn't need to look back on for inspiration. That's why (I believe) ATI switched to GDDR5 as quickly as possible, to get as much throughput as possible through that 256-bit memory interface.

    On the other hand though, I have a 2600XT with GDDR3 that makes a perfectly satisfying backup card. It definitely wouldn't have enough power to drive 6 displays though.

    Also, what's up with AnandTech? I don't check back for two days, and the site disappears, only to be replaced by this sexy tech website. ;)
  • brysoncg - Wednesday, March 31, 2010 - link

    Here's a thought: get a theater room with 6 hi-def projectors, and set them up in the Eyefinity 6 configuration. If you spent a little bit of time with it, you would be able to perfectly line up the edges of the projections from each projector, and you would then have the Eyefinity 6 setup without the need for bezel correction (no bezels!), and therefore no crosshair problem. The only problem would be the cost....
  • erple2 - Friday, April 2, 2010 - link

    I think that the other problem would be the space. If a 1080p projector can comfortably drive a 100" screen, having a large enough wall to put a 3x2 arrangement on would become problematic, I'd think. I don't know too many people that have a 21' wide by 8' tall wall they could reasonably project onto...

    Plus the screen for that would be ... pricey.

    However, some cheaper 720p projectors would be an interesting proposition, particularly projecting on a smaller wall - maybe 1/2 the size? so about 11' wide by 4' tall?
  • Xpl1c1t - Wednesday, March 31, 2010 - link

    the ring bus is definitely worth looking back upon
  • Calin - Thursday, April 1, 2010 - link

    Also, unlike the computing units (which you can mostly disable at will in a finished product), any bad transistor in that ring bus would brick the entire chip
  • BoFox - Thursday, April 1, 2010 - link

    True... 3 cheaper projectors with Eyefinity would be an ideal solution... and the screen could be a bit curved, like at many movie theaters today!

    On the same day Nvidia released GTX 480, AMD released this 2GB version to counter Nvidia's offering. Of course, AMD promised this 2GB version a long while ago, so it's about time. Perhaps it won't be long before AMD releases the faster 5890.

    About the 512-bit bus: it is certainly doable on a 40nm process, compared to when it was done on an 80nm process with a 1024-bit ring bus a while ago on the HD 2900XT (I will agree with you here that it was redundant for the 2900XT).

    ____
    "Q: Does a 512-bit bus require a die size that's going to be in the neighbourhood (or bigger) of R600 going forward?

    A: No, through multiple layers of pads, or through distributed pads or even through stacked dies, large memory bit widths are certainly possible. Certainly a certain size and a minimum number of consumers is required to enable this technology, but it's not required to have a large die."
    - Sir Eric Demers, architecture lead on R600, which is still the basis of today's 5870s
    http://www.beyond3d.com/content/interviews/39/5

    If a 4890 simply performs around 19% better overall than a 5770 in all games except when using DX11, what shall we point to as the cause of the difference? The GPU cores are nearly identical in terms of clock speed, shaders, ROPs, etc., with perhaps slightly better optimization in the R800 architecture and better drivers. The main "obvious" difference is a 62.5% increase in memory bandwidth over the 5770 (quick numbers at the end of this post). A 5870 is basically 2x 5770s in one GPU with everything doubled. It has been shown that a 5870 certainly does benefit from greater memory bandwidth... let's say about a 0.2% increase in performance per 1% increase in bandwidth.

    By the way, Nvidia made quite an interesting statement on the memory bus a short while ago:

    "With 3-D interconnects, it can vertically connect two much smaller die. Graphics performance depends in part on the bandwidth for uploading from a buffer to a DRAM. "If we could put the DRAM on top of the GPU, that would be wonderful," Chen said. "Instead of by-32 or by-64 bandwidth, we could increase the bandwidth to more than a thousand and load the buffer in one shot."

    Based on any defect density model, yield is a strong function of die size for a complicated manufacturing process, Chen said. A larger die normally yields much worse than the combined yield of two die with each at one-half of the large die size. "Assuming a 3-D die stacking process can yield reasonably well, the net yield and the associated cost can be a significant advantage," he said. "This is particularly true in the case of hybrid integration of different chips such as DRAM and logic, which are manufactured by very different processes.""
    http://www.semiconductor.net/article/print/438968-...

    Nvidia's own John Chen mentioned increasing the bandwidth from "by-32 or by-64" per chip to "more than a thousand". This translates to 8x1024, which is an 8192-bit bus. Hopefully vertically stacked dies are the future. It would effectively reduce the need for ever-larger buffer sizes, and act just like embedded RAM that can load the buffer in one shot... a bit like SSDs today (small, but "instant"), and thought to be a pipe dream a few years ago.
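
    A quick sanity check on the bandwidth numbers above, assuming the usual reference memory clocks (256-bit at 3.9 Gbps for the 4890, 128-bit at 4.8 Gbps for the 5770) - a rough sketch, not measured data:

    ```python
    def bandwidth_gbs(bus_width_bits, data_rate_gbps):
        """Peak memory bandwidth in GB/s = bus width in bytes * per-pin data rate."""
        return bus_width_bits / 8 * data_rate_gbps

    hd4890 = bandwidth_gbs(256, 3.9)         # 124.8 GB/s
    hd5770 = bandwidth_gbs(128, 4.8)         # 76.8 GB/s
    extra_bw = (hd4890 / hd5770 - 1) * 100   # ~62.5% more bandwidth
    perf_gap = 19.0                          # the ~19% performance gap cited above
    print(extra_bw, perf_gap / extra_bw)     # ~0.3% of performance per 1% of bandwidth
    ```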
  • Ramon Zarat - Wednesday, March 31, 2010 - link

    The 2900XT used a dual 512-bit ring bus topology. The fact that neither ATI nor Nvidia uses this technology today is a hint that it was not efficient enough, or too complex to be commercially viable. In that sense, it was not a classic 512-bit wide bus, as used by Nvidia's previous generation or the 256/384/448-bit buses in use today.

    A 512-bit bus would be impossible to implement on the 5000 series simply because the memory controller is physically limited in hardware to "talk" to a 256-bit bus. You need twice the traces on the PCB to go from 256-bit to 512-bit, and those traces must be, one way or another, physically linked to the GPU. The only way to speed up memory access on the 5000 series would be to use faster GDDR5 chips.
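
    To put numbers on that: peak bandwidth is just bus width (in bytes) times the effective per-pin data rate, so on a fixed 256-bit controller the memory clock is the only lever left. The 6.0 Gbps figure below is purely illustrative, not an announced part:

    ```python
    def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
        """Peak bandwidth = bus width in bytes * effective per-pin data rate."""
        return bus_width_bits / 8 * data_rate_gbps

    stock_5870 = peak_bandwidth_gbs(256, 4.8)   # 153.6 GB/s at reference clocks
    faster_mem = peak_bandwidth_gbs(256, 6.0)   # 192.0 GB/s with hypothetical quicker GDDR5
    wider_bus  = peak_bandwidth_gbs(512, 4.8)   # 307.2 GB/s, but needs a new chip and PCB
    print(stock_5870, faster_mem, wider_bus)
    ```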
  • SoCalBoomer - Wednesday, March 31, 2010 - link

    Unfortunately, a 1080p projector just won't get you the pixels that this thing will.

    I use a 2x2 setup on my desk at work and it has far more pixels (for far, FAR less money) than a 1080p projector (which is what, 1920x1080? something like that? - I'm working at 2560x2048; quick math below).

    My question would be whether you can set these up as individual monitors, just extending the desktop, or if you HAVE to use Eyefinity. I'd love to be able to do this instead of running dual cards, with the limitations on the motherboard that brings...
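
    Just to quantify the pixel gap (assuming the 2x2 grid is four 1280x1024 panels, which is what 2560x2048 implies):

    ```python
    pixels_2x2 = 2560 * 2048      # four 1280x1024 panels in a 2x2 grid: ~5.2 MP
    pixels_1080p = 1920 * 1080    # a single 1080p projector: ~2.1 MP
    print(pixels_2x2 / pixels_1080p)   # ~2.5x the pixels
    ```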
  • Paulman - Thursday, April 1, 2010 - link

    I agree that they should release a higher clocked (binned) version of the HD 5870, if only to steal NVIDIA's thunder. They wouldn't need mass availability. Even just a few hundred, or ideally 10,000+ units would be enough to dethrone NVIDIA from being able to claim "the fastest single-GPU card". And I think such claims form the bulk of what NVIDIA has to work with right now.

    A 512-bit version would require a redesign of the chip, though, which would require a lot of manpower including design verification, etc. I don't think it would be worth it for ATI/AMD. Again, releasing a higher-clocked part - now that would be super easy and super effective.
  • Calin - Thursday, April 1, 2010 - link

    A redesigned 512-bit memory interface card wouldn't come much earlier (if at all) than the next generation. It would also use a lot of design/test/silicon resources and time (financially, manpower, ...) for what would be a couple thousand cards sold (when AMD cannot produce enough graphics chips as it is).
    Keep up availability and low prices instead of chasing the absolute top. NVidia will be in the "enviable" position of having the top-performance card that nobody can find, and nothing else in the performance and mainstream segments.
  • Hargak - Monday, April 5, 2010 - link

    I would think they already have a working dual-GPU card that will beat it, yet stays close to the single card's output.
  • Griswold - Wednesday, March 31, 2010 - link

    I think 3 displays, or better yet 5 displays with the middle one in pivot mode to counter any crosshair issues, is the route to go. Should be possible, no?
  • bigboxes - Wednesday, March 31, 2010 - link

    No one says you have to add monitors in pairs. This card just has the connections for UP TO 6 monitors. So hook up five and then configure Eyefinity.
  • Anand Lal Shimpi - Wednesday, March 31, 2010 - link

    3 is currently possible, however 5 isn't supported by the drivers yet. AMD is apparently working on it though.

    Take care,
    Anand
  • Quidam67 - Wednesday, March 31, 2010 - link

    Devil's advocate here, but it seems to me that you actually need 9 displays to deal with the crosshair issue.

    And wouldn't that also allow you to maintain aspect ratio?

    Imagine the fun Anand could have putting 9 monitors together :O)

  • Granseth - Wednesday, March 31, 2010 - link

    Why not buy 3 cheap projectors and set them up with Eyefinity?
    No bezels, high resolution, and a large screen.

    I would very much like to see somebody doing this and giving feedback on how well this solution works. Although the problem would probably be finding a cheap projector with DisplayPort.
  • AmbroseAthan - Wednesday, March 31, 2010 - link

    Though not Eyefinity, this could give you an idea of what it is like:

    http://nthusim.com/setup/bhawthorne-triple-circula...
  • Makaveli - Wednesday, March 31, 2010 - link

    From this review it looks like the 2GB is needed for a 6-screen setup... I don't see how that is supposed to be a counter to the GTX 480.

    Graphics card prices aren't going down as quickly as people would have liked. Based on this review, just wait for the refresh card; it's coming.
  • cactusdog - Wednesday, March 31, 2010 - link

    OK, it might not be ideal for gaming right now, but I could see ATI selling heaps of these cards for commercial purposes. Should be good for security people, the finance sector, research and education, advertising displays, etc.; the list goes on.

    OMG, I just had the thought of connecting 6 HDTVs - you could make your own billboard, lol.

  • eduardoandradeiturribarria - Monday, May 3, 2010 - link

    I have 4 HDTVs to watch football games. Of course Monday nights or the Super Bowl would be my main objective. Do you think it can be done through Eyefinity?
  • nuudles - Wednesday, March 31, 2010 - link

    Hi Anand,

    I suppose it is not possible, but would it be possible to crossfire a normal 5870 and a 5870 E6?

    If AMD can enable that, then I think they will sell quite a lot more: people who bought a single 5870 and use Eyefinity might want an even more immersive experience, and if they could add a 5870 E6 and CrossFire it with their normal 5870, they might be a lot more tempted to buy one, even if they lose 1 or 2 frames compared with 2 5870 E6s in CrossFire (due to 1GB+2GB vs 2x2GB, the driver would probably need to treat both cards as 1GB models) - still a lot more performance than a single 5870 E6.

    Thanks!

    Kind regards,
    Morne
  • GiantPandaMan - Wednesday, March 31, 2010 - link

    Just wondering if you guys have tried putting 3 projectors in portrait mode and seeing how that worked. Figured 3 1280x720 projectors would make a pretty sweet wall of gaming...then use the other 3 display ports for your actual desktop monitors. Anything in the works for that? Would be a fun little project to put in Anand's theater room. :)
  • Anand Lal Shimpi - Wednesday, March 31, 2010 - link

    This is unbelievably tempting however I foresee two hurdles:

    1) Wall space. My theater has a 2.35:1 screen, I'd need something much wider (or end up with a really skinny display) for a 3x1 projector setup. I don't think I even have a room that has enough uninterrupted wall space for this to work well at a good size. Perhaps I'm thinking too big though. I could just stitch together three 80" screens or something like that.

    2) Inputs. Most 16x9 projectors don't use DisplayPort, although a quick Google search reveals a few options.

    I'll give it some more thought :)

    Take care,
    Anand
  • GiantPandaMan - Monday, April 5, 2010 - link

    How about 3 projectors, 3 screens stitched together, and just hang them from the ceiling so you can create a curved screen? That's the beauty of using 3 projectors anyway. Figure a 5970 could drive a 2160x1280 curved screen perfectly.
  • Patrick Wolf - Wednesday, March 31, 2010 - link

    This is just craziness. Dunno how someone couldn't just be happy with a single big 1080p TV. OK, you can see the pixels - so what? You can also see the entire image. I'd like to see a video showing a nice (60"?) set right next to this E6 display, showing the same game or video, and do a poll: "Which would you choose?"
  • DanNeely - Wednesday, March 31, 2010 - link

    To show the difference between 1080p and 5760x2160 in a video you'd need a greater than 1080p video and display to keep it from just being down sampled away.
  • Calin - Thursday, April 1, 2010 - link

    This isn't for video - it's for things like, let's say, playing a warplane simulator and seeing actual planes in the distance instead of a black dot, or for reading text at decent quality from several large sources (like several of the very large Excel spreadsheets some of the financial people use). FPS gaming still has issues; I'd say using 3 old 1600x1200 displays in portrait mode would be best for FPS (a 2.25:1 aspect ratio). Even with 5 very wide monitors in portrait, you'd end up with almost a 3:1 view ratio (which might be good or bad).
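
    The aspect-ratio arithmetic behind those figures, for anyone curious (common panel resolutions assumed, bezels ignored):

    ```python
    def grid_aspect(cols, rows, panel_w, panel_h, portrait=False):
        """Combined aspect ratio of a cols x rows grid of identical panels, ignoring bezels."""
        w, h = (panel_h, panel_w) if portrait else (panel_w, panel_h)
        return (cols * w) / (rows * h)

    print(grid_aspect(3, 1, 1600, 1200, portrait=True))   # 3600x1600 -> 2.25:1
    print(grid_aspect(5, 1, 1920, 1080, portrait=True))   # 5400x1920 -> ~2.81:1, "almost 3:1"
    print(grid_aspect(3, 2, 1920, 1080))                  # 5760x2160 -> 2.67:1, same as 16:9
    ```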
  • Roland00 - Wednesday, March 31, 2010 - link

    Does the extra memory make a difference in CrossFire benches? I am curious because each frame buffer has to keep track of what the other frame buffer is doing, so having a larger frame buffer would make sense. Is there any chance we can see these results?
  • cfaalm - Wednesday, March 31, 2010 - link

    Why doesn't AMD go talk to display manufacturers about thinning out or even totally forgoing bezels on Eyefinity-compatible displays? In dual/multiple-screen situations other than Eyefinity it can still be desirable to have really thin or no bezels, so it wouldn't be that far out.
  • Aclough - Wednesday, March 31, 2010 - link

    They're working on them now, but they aren't out yet. Be warned that they'll probably cost more than normal monitors though.
  • cfaalm - Wednesday, March 31, 2010 - link

    I expected they would cost a bit more. Though I don't have any figures on the premium, I guess it would be worth it compared to the gains that can be had in these special situations where you'd be spending a small fortune anyway.
  • mjrpes3 - Wednesday, March 31, 2010 - link

    It's greatly in their interest to develop this technology: lowering the barrier and increasing the incentive to buy 3x or 6x quantity of a product.
  • Calin - Thursday, April 1, 2010 - link

    Sell more cheap monitors instead of fewer expensive monitors? I don't think so.
  • Calin - Thursday, April 1, 2010 - link

    The bezels are there for a purpose (strength, if nothing else).
    There are monitors with thin bezels - what we might need now could be pre-built monitors in a 6x configuration, reducing the bezel size as much as possible (they could do it better in the factory). Maybe some boutique industry could spring from this? Something like the tuning shops in the auto industry.
  • behrouz - Wednesday, March 31, 2010 - link

    Hi Anand.
    This is very beautiful, and the logo at the top of the page is better than the previous one.

    good luck.
  • Manuel1975 - Wednesday, March 31, 2010 - link

    Does anybody know if there will be any Mac drivers? This would be a formidable beast in combination with a Mac Pro and media server software...
  • erple2 - Friday, April 2, 2010 - link

    I think this tech would be wasted on a media server implementation, unless you're talking about something different than what I'm thinking of. Streaming media to these devices would be essentially pointless, as little, if any, media is available at any resolution beyond 1080p.

    Putting it on a Mac makes even less sense, given that what makes this unique is the ability to run solid 3D games titles. And last I checked, there were few, if any, 3D games available on the Mac Platform.

    While Apple does offer multiple graphics cards in their Mac Pro systems, they're generally very low-end graphics products (currently NVidia GeForce GT 120 based), meant to drive CAD or other applications that aren't 3D games. Those can easily handle any media server load you could throw at them. I suppose you could make the argument that it could upscale video to 2160p (doubling 1080p), but that seems pointless to me - just run a larger 1080p projector.
  • Manuel1975 - Tuesday, August 3, 2010 - link

    Hi Erple,

    The media I use to drive multiple displays normally requires anything from 640x480 up to 4K-type resolutions. Although these high resolutions are not mainstream, YouTube for instance does allow you to upload videos in 4K resolution. The future of HD+ video is very, very near.
    And nowadays OS X ships with something called Quartz Composer. It's something you can compare with Processing: OpenGL-based image synthesis. Truly amazing stuff: 4K+ resolutions rendered at 60Hz. Easily.

    Tip for your next post: try to think outside your box before posting.
  • [email protected] - Wednesday, March 31, 2010 - link

    Hi, Anand.

    Did you have a chance to try (or ask the AMD guys about) 12 screens? Crossfiring 2 E6 cards makes you wonder about that possibility.

    I once had a chance to put up a 4x3 screen, 2 years ago, with absolutely no bezels whatsoever, but that setup cost my company an insane amount of money. Each screen cost $7,000, for starters.

    I see this E6 as an alternative to keep an eye on. I couldn't care less about the bezel problems, as my company usually sets up multiscreen displays either with projectors or with bezel-less LCDs, but 6 displays might not be enough for our line of work.

    So, can it be crossfired to a 12-screen 3D-accelerated output? (I'm not concerned about performance, as our apps usually don't stress GPUs much.)
  • [email protected] - Wednesday, March 31, 2010 - link



    When I wrote "a 4x3 screen", I meant "a 4x3 multi-screen display, as per 12 x 720p displays arranged in a 4x3 grid"


    PS: damn, we should be able to edit our own posts!
  • Ryan Smith - Wednesday, March 31, 2010 - link

    It can be done. They had it working under Linux using X-Plane back at their September launch. However it's not even close to being in a shipping state, and I don't have the foggiest idea when it would be.
  • [email protected] - Wednesday, March 31, 2010 - link

    Thanks for the quick reply, Ryan.

    Can you tell me how that works in practice? Now that Windows 7 (and Vista) ditched horizontal spanning, I can't just set my company's apps to 3072x768 (3 screens), because that resolution is not even made available by the driver anymore.

    From my previous experience, I can say that as long as I have the taskbar spanned across 3 screens, it is safe to assume that I will be able to accelerate 3D apps at the same resolution as the desktop, at least.

    Eyefinity must be somewhat different. If Catalyst 10.3 is still compliant with WDDM 1.1, then I'm guessing our 3D engine must be 'approved', or at least able to acknowledge the availability of Eyefinity.

    Back in the XP days, spanning through 2 screens was transparent - the 3D app didn't even know it was outputting to 2 or more screens - but with Eyefinity, compatibility must be achieved at a much lower level. Is that right?

    I'm sorry to bother you, but I don't have access to a hands-on approach.

    Thanks,

    Fernando
  • [email protected] - Wednesday, March 31, 2010 - link

    I'm sorry for the long post, guys.

    Shorter version:

    - Does the desktop look like XP in span mode? (I think I see a taskbar stretched along 3 screens in Anand's video.)
    - When you run a 3D app (one that is NOT officially compatible), will an 'awkward' resolution (say 5760x2160) be available as a choice?

    If you answer YES to these 2, then I'm saved - and I'm in trouble as well, as my company only supports Nvidia (and changing this standard will cost a lot; testing in Quality Assurance will be havoc). Time for a change, I guess. :)

    Fernando

  • Ryan Smith - Wednesday, March 31, 2010 - link

    Background info: Eyefinity is the trade name for what AMD calls Single Large Surface technology. SLS operates pretty much as the name implies: the drivers provide a very large resolution option for applications to work with, and then AMD's hardware takes care of chopping up the image for multiple monitors. Providing the OS/software with a very large resolution is the fundamental aspect behind Eyefinity.

    So to answer your questions, yes, 3D games see the large resolution. As for the desktop, I've never tried it (and Anand currently has all of our Eyefinity gear). To applications/games at least, this is completely transparent. Eyefinity support basically amounts to being able to handle the oddball aspect ratios and the higher resolutions, in cases where resolutions are hard-coded in.
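
    A minimal sketch of the single-large-surface idea, just to make the "chopping" step concrete - this is illustrative logic, not AMD's driver code:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Viewport:
        monitor: int
        x: int      # offset into the single large surface
        y: int
        w: int
        h: int

    def slice_surface(surface_w, surface_h, cols, rows):
        """Chop one large render surface into per-monitor rectangles (no bezel correction)."""
        mon_w, mon_h = surface_w // cols, surface_h // rows
        return [Viewport(r * cols + c, c * mon_w, r * mon_h, mon_w, mon_h)
                for r in range(rows) for c in range(cols)]

    # The game renders once at 5760x2160; each of the six monitors scans out its own slice.
    for vp in slice_surface(5760, 2160, cols=3, rows=2):
        print(vp)
    ```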
  • vgdarkstar - Wednesday, March 31, 2010 - link

    I would like to point out to all the naysayers of multimon that you stop seeing the bezels after using it for a bit. Do you notice your nose? It's the same effect.
  • Anand Lal Shimpi - Wednesday, March 31, 2010 - link

    I agree with that completely which is why I said it wasn't an issue in a 3x1 setup. However in the case of a 3x2 configuration you always notice the bezels in the center of your screen because they often occlude important information (e.g. dialog boxes, crosshairs, etc...).

    Take care,
    Anand
  • Zstream - Wednesday, March 31, 2010 - link

    Why is it that people believe this setup is meant just for gaming? What about people who could have six LCD screens monitoring all sorts of devices in the corporate world? What about those who use Photoshop and can use other multi-monitor software?

    Just a thought, not all geeks are gamers you know ;)
  • Anand Lal Shimpi - Wednesday, March 31, 2010 - link

    Oh I agree, and that's why I mentioned dialog boxes as being a problem. Honestly I think the biggest application for a 6-display setup right now is for more than just gaming. It's just a shame that you have to buy a $479 gaming card to enable it :)

    Take care,
    Anand
  • Zstream - Wednesday, March 31, 2010 - link

    Hmm, wonder if they have a snap to grid function in Windows 7 for each LCD. That would be nice...
  • Taft12 - Wednesday, March 31, 2010 - link

    Those of us with 6 or more screens have been using multiple low-end video cards (including PCI where necessary) for many years now. It's still much cheaper than this.
  • Guspaz - Wednesday, March 31, 2010 - link

    Perhaps because such solutions have been available for non-gaming uses for VERY many years. You could grab Matrox graphics cards (still hanging around in the business world), or use two GXM products to get six displays on a standard ATI/nVidia graphics card.

    There are also other solutions from other companies, or just the possibility of sticking three dual-head graphics cards in a system, which can be pretty cheap if you don't need powerful 3D performance.

    In short, for productivity, this is old news; Eyefinity is really only of note for gaming, and it doesn't look very useful or practical for that either. Triple-head gaming has merit since it avoids many of the problems, and that can be done fairly easily (some graphics cards support three outputs, or you can use a GXM product).
  • lwatcdr - Wednesday, March 31, 2010 - link

    I could see this being used for Simulators and military systems as well.
  • Maroon - Wednesday, March 31, 2010 - link

    I hope AMD didn't spend a lot of time on this because this is an unbelievably niche market they are after with this card.

    What needs to be done (if possible) is to take the monitors apart, piece together just the screens, and somehow relocate the hardware behind everything.

  • sparkuss - Wednesday, March 31, 2010 - link

    Anand wrote:

    The larger frame buffer did help raise minimum frame rates, but not enough to positively impact the average frame rates in our tests.

    I thought there had been review comments before about the 5870 being memory limited in some tests. Does this mean that the added 1GB doesn't solve any of that without further hardware/software changes from ATI?

    The 2GB cards, standard not Eyefinity, are starting to be listed at manufacturers' sites. I was going to wait for them, but maybe it's not worth it after all? I'm not interested in Eyefinity, just CF for gaming at 1920 and above.

    Do you have any of the base 2GB cards planned for review shortly?


    And P.S.: did the new comments system lose all the advanced editing?
  • poohbear - Wednesday, March 31, 2010 - link

    6-monitor support is good and all, but seriously, how much demand is there for a card that can support 6 monitors? It's a niche product; not sure what all the hoopla is about. :P
  • notty22 - Wednesday, March 31, 2010 - link

    From Review "It's worth mentioning that these power numbers were obtained in a benchmark that showed no real advantage to the extra 1GB of frame buffer. It is possible that under a more memory intensive workload (say for example, driving 6 displays) the 5870 E6 would draw much more power than a hypothetical 6-display 1GB 5870."

    I think a tester should run FurMark in 3-way or 6-way Eyefinity. I have a sinking feeling this would break the card. The tests prove out that you need the CrossFire grunt of two 5870s to do any gaming. So the extra cost of this card, trying to do it all in one, is a failure.
  • catalysts17TX - Wednesday, March 31, 2010 - link

    I have a 4870 - great card, but it runs hot and sucks up a lot of wattage at idle. Both the 5850 and 5870 do much better in heat and wattage, but at load the 5870 uses more wattage than my 4870 does. The 5870 is NICE with the Eyefinity 6 Edition. My question is: will there be a 5850 Eyefinity 6 Edition? Don't care about price, just performance and wattage.

  • Taft12 - Wednesday, March 31, 2010 - link

    Will DisplayPort ever start catching on as a monitor connector? The majority of displays still don't include a DP input, and if it hasn't started by now, I wonder if it ever will...
  • frenchfrog - Wednesday, March 31, 2010 - link

    It would be so nice:

    -3 monitors for left-center-right views
    -1 monitor for rear view
    -2 monitors for gauges/GPS/map/flight controls
  • vol7ron - Wednesday, March 31, 2010 - link

    I'm not sure why a "wall" was created. Your problem with FOV is the fact that you have too much of a 2D setup, rather than an easier-to-view 3D one.

    Suggestion: 3 stands.

    Center the middle pair on your seat.
    Adjust the right and left pairs so they're at a 15-25 degree slant, as if you were forming a hexadecagon (a 16-sided polygon @ 22.5 degrees). Rough numbers below.

    vol7ron
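
    If you want a number for that slant: aim each side stand straight at your head, and the toe-in angle is roughly atan(offset / viewing distance). The panel width and distances below are made-up example figures:

    ```python
    import math

    def toe_in_deg(panel_width_in, viewing_distance_in):
        """Angle to rotate a side monitor so its center faces the viewer,
        assuming the side panel starts where the center panel ends."""
        offset = panel_width_in  # side-panel center sits ~one panel width off-axis
        return math.degrees(math.atan(offset / viewing_distance_in))

    print(toe_in_deg(20.0, 28.0))   # ~35.5 degrees at typical desk distance
    print(toe_in_deg(20.0, 60.0))   # ~18.4 degrees from further back, near the 15-25 range
    ```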
  • cubeli - Wednesday, March 31, 2010 - link

    I cannot print your reviews anymore.. Any help would be greatly appreciated!
  • WarlordSmoke - Wednesday, March 31, 2010 - link

    I still don't understand the point of this card by itself, as a single card(no CF).

    It's too expensive and too gaming oriented to be used in the workplace where, as someone else already mentioned, there have been cheaper and more effective solutions for multi-display setups for years.
    It's too weak to drive the 6 displays it's designed for when gaming. Crysis (I know it's not a great example of an optimized engine, but give me a break here), which is a 3-year-old game, isn't playable at under 25 fps, and I can't imagine the next generation of games around the corner being more forgiving.

    My point is, why build a card to drive 6 displays when you could have 2 cards that can drive 3 displays each and be more effective for gaming. I know this isn't currently possible, but that's my point, it should be, it's the next logical step.

    Instead of having 2 cards in CrossFire, where only one card has display output and the other just tags along as extra horsepower, why not use the cards in parallel: split the scene in two and use two framebuffers (one card driving the upper 3 screens and the other card the lower 3 screens), and practically make CrossFire redundant (or just use it for synchronizing the rendering).

    This should be more efficient on so many levels. First, the obvious: half the screens => half the area to render => better performance. Second, if the scene is split in two, each card could load different textures, so less memory should be wasted than in CrossFire mode where all cards need to load the same textures.
    I'm probably not taking the synchronization issues that could appear between them seriously enough, but they should be less obvious when they fall between distinct rows of displays, especially if they have bezels.

    Anyway, this idea of 2 cards driving 3 screens each would have been beneficial both to ATI (sales of more cards) and to gamers: buy a card and three screens now, and maybe later, if you can afford it, buy another card and another three screens. Not to mention the fact that ATI has several distinct models of cards that support 3 displays, so they could have made 6-display setups possible even for lower budgets.

    To keep a long story short(er), I believe ATI should have worked to make this possible in their drivers and just scrapped this niche 6-display card idea from the start.
  • Bigginz - Wednesday, March 31, 2010 - link

    I have an idea for the monitor manufacturers (Samsung). Just bolt a magnifying glass to the front of the monitor that is the same width and height (bezel included). I vaguely remember some products similar to this for the Nintendo Gameboy & DS.

    Dell came out with their Crystal LCD monitor at CES 2008. Just replace the tempered glass with a magnifying glass and your bezel problem is fixed.
    http://hothardware.com/News/Dell_Crystal_LCD_Monit...
  • Calin - Thursday, April 1, 2010 - link

    A magnifying glass for such a large surface would be thick and heavy (and probably prone to cracking), and "thin" variations have image artifacts (I've seen a magnifying "glass" usable as a bookmark, and the image was good, but it definitely had issues).
  • imaheadcase - Wednesday, March 31, 2010 - link

    With as much R&D as they invested in this, it seems better to use it toward making their own monitors that don't have bezels. The extra black line is a major downside to this card.

    ATI monitors + video setup would be ideal. After all, when you are going to drop $1500 plus a video card on a setup, what is a little more in price for streamlined monitors?
  • yacoub - Wednesday, March 31, 2010 - link

    "the combined thickness of two bezels was annoying when actually using the system"

    Absolutely!
  • CarrellK - Thursday, April 1, 2010 - link

    There are a fair number of comments to the effect of "Why did ATI/AMD build the Six? They could have spent their money better elsewhere..." To those who made those posts, I respectfully suggest that your thoughts are too near-term, that you look a bit further into the future.

    The answers are:

    (1) To showcase the technology. We wanted to make the point that the world is changing. Three displays wasn't enough to make that point, four was obvious but still not enough. Six was non-obvious and definitely made the point that the world is changing.

    (2) To stimulate thinking about the future of gaming, all applications, how interfaces *will* change, how operating systems *will* change, and how computing itself is about to change and change dramatically. Think Holodeck folks. Seriously.

    (3) We wanted a learning vehicle for ourselves as well as everyone else.

    (4) And probably the biggest reason of all: BECAUSE WE THOUGHT IT WOULD BE FUN. Not just for ourselves, but for those souls who want to play around and experiment at the edges of the possible. You never know what you don't know, and finding that out is a lot of fun.

    Almost every day I tell myself and anyone who'll listen: If you didn't have fun at work today, maybe it is time to do something else. Go have some fun folks.
  • Anand Lal Shimpi - Thursday, April 1, 2010 - link

    Thanks for posting Carrell :) I agree with the having fun part, if that's a motivation then by all means go for it!

    Take care,
    Anand
  • XiZeL - Thursday, April 1, 2010 - link

    Nice review. The real shame is the bezel; hopefully display vendors will start making some extremely thin-bezel models for this kind of use.
    As for Battlefield, I saw you used a chase bench and a waterfall bench... are these sequences created by you, or in-game benchmarks you just have to run?

    Thanks for the answer.
  • GullLars - Thursday, April 1, 2010 - link

    One thing I've been thinking about since the bezel problem came up: why doesn't anyone make a setup of 3x2 22" monitors in a single frame? I've seen DIY people take the frames off monitors for embedding them in walls, custom frames, or computer chassis. It should be doable to take out the panels and mount them in a new frame with tape or glue or something on the backside. I would easily consider buying such a setup. You would end up with a monitor roughly around 50" (maybe 55"?) with 5040x2100 or 5760x2160.

    For a 3-panel setup, 3 22" screens in portrait mode in a single frame would also be nice. 3150x1680 or 3240x1920.
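
    For reference, the combined numbers work out roughly like this (assuming 16:10 1680x1050 or 16:9 1920x1080 panels in the 22"/21.5" class, bezels ignored) - the bezel-less diagonal actually lands closer to 60" than 50":

    ```python
    import math

    def combined(cols, rows, px_w, px_h, diag_in, aspect):
        """Total resolution and approximate diagonal of a bezel-less cols x rows panel grid."""
        panel_w = diag_in * aspect / math.hypot(aspect, 1)   # physical panel width
        panel_h = panel_w / aspect
        total_diag = math.hypot(cols * panel_w, rows * panel_h)
        return (cols * px_w, rows * px_h, round(total_diag, 1))

    print(combined(3, 2, 1680, 1050, 22.0, 16 / 10))   # (5040, 2100, ~60.6")
    print(combined(3, 2, 1920, 1080, 21.5, 16 / 9))    # (5760, 2160, ~60.0")
    print(combined(3, 1, 1080, 1920, 21.5, 9 / 16))    # 3 portrait panels: (3240, 1920, ~36.8")
    ```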
  • Zorro3740 - Thursday, April 1, 2010 - link

    How can anybody who is serious about image quality fall for this obvious sham? How can the black bars that separate the monitors be anything less than unacceptable? You have to be crazy to waste your money on this tech. 3D is way more appealing than this pseudo-high-res garbage. If you want real high resolution you simply get a WQXGA monitor like the HP LP3065 I'm using right now and call it a day. If you want something actually interesting, then you get anything that might be 3D capable. It seems to be the next cool gadget feature in video.

    The cost of projectors and a screen, plus necessary features like lens shift, would be so damn expensive - not to mention the heat generated by 3 or 6 LCD projectors would be ridiculous - just to avoid the "black bar" effect. I really don't understand where AMD/ATI is going with this tech.

    Hell, I can't even get multiple displays to work properly with some of my 4850 crossfire setups and they come up with the idea to make a video card capable of up to six displays. How about fixing the Gray Screen of Death with multiple displays on the 4800 series? Eyefinity, yeah whatever.........

    Ludicrous Speed!!!! Go!!!!!!
  • phantazy - Sunday, April 4, 2010 - link

    I have a 4850 X2 driving 4 22" screens in a 4200x1680 config (all 4 in portrait mode). Running my 4 (or even adding another 2 screens) from one GPU is much more interesting now...

    Have you tried running the new card in CrossFire just to see what the AA performance in games is? I mean CrossFire with 58xx cards and CrossFire with 48xx cards, just to see the support/scalability and so on. If you're showing the performance of the new 480 in SLI, why not show the 5870 w/ 6 outputs in CrossFire with one 5970 or even two 5970s... some people actually have the money and interest for this... not to mention you can buy the cards in 6-9 months and get them at half the price compared to today.

    And by the way, regarding the monitor stands, AMD looks to be choosing a "budget" alternative when showing them off; my Ergotron LX Dual Side-by-Side Arm stands got me up and running in about 30 minutes from opening their boxes and clearing my desk, and I got my screens 99.9% perfectly aligned.
  • Hargak - Monday, April 5, 2010 - link

    For someone wanting to simply set up an extreme-resolution display, the ideal route (setting cost aside) is using 6 1080p projectors; they don't project a bezel. Otherwise, go buy a 55" LED LCD, or wait until they have double-res (denser pixel) displays for larger-scale monitors. The 30" is a good balance of size, immersion, price, setup, and resolution at much higher than standard high-def. This is bleeding edge, which means many will bleed money to get it right for the rest of us. This is simply not something you will see often. Hope the rambling came together as a thought.
  • Necrosaro420 - Saturday, April 10, 2010 - link

    I consider myself a pretty hardcore gamer, but I don't see why on earth someone would need 6 displays.
  • Etern205 - Sunday, May 2, 2010 - link

    Anyone seen this yet?

    http://www.engadget.com/2010/04/30/powercolor-hd59...
  • eduardoandradeiturribarria - Monday, May 3, 2010 - link

    Can I split a TV signal through Eyefinity? Say it's football season; I already have four 42" HDTV sets. Could I use Eyefinity to project a single split TV feed across my TV sets?
    Regards
