  • CodeFuNinja - Thursday, February 11, 2010 - link

    I'm personally looking forward to seeing what this new hardware can actually do for gaming and performance.

    As much as I prefer nVidia GPUs over AMD's, I can't say anything bad about ATI/AMD's previous GPUs. I think of the 360's GPU and what it can actually do, and outpacing the RSX by such a wide margin is impressive, especially when you're working on a limited but standardized platform, as compared to the PC with its wide-open platform and unlimited system configurations.

    Although that's kind of funny, as I've always preferred AMD CPUs and nVidia GPUs.
  • YpoCaramel - Sunday, February 7, 2010 - link

    nVidia does have a GT215 series called 3xx, from low end to mid/high end, so they also want to avoid confusion with that... for once.
  • prophet001 - Tuesday, February 2, 2010 - link

    This is pretty exciting :)

    I'm running an old 8800 GTX and I'm ready for some new bleeding-edge nVidia goodness.
  • ambientmf - Friday, February 5, 2010 - link

    You'll be waiting a long time then.
  • DanNeely - Tuesday, February 2, 2010 - link

    If you remember the nVidia rumors from last spring/summer, the GT300 chip was supposed to be a 400-500 SP scale-up of the existing architecture from the 8xxx/9xxx/2xx-series cards, done on a 40nm process (possibly with the DX10.1 update that the GT240 is sporting). I suspect that nVidia's 40nm woes resulted in this chip being canceled in favor of putting effort into bringing Fermi forward in time, although they still weren't able to beat ATI to the punch with a DX11 part.
  • CaioKK - Tuesday, February 2, 2010 - link

    Why on Earth did they name their cards like the previous-generation AMD cards? There are going to be a lot of confused consumers...
  • Stas - Tuesday, February 2, 2010 - link

    HD 4000 and GTX 400? Not much confusion there. I would be more concerned about uninformed customers being ripped off with the same 8800 GT three years running now (9800/GTS 250/GT?300).
  • Per Hansson - Tuesday, February 2, 2010 - link

    It's actually 8800 GTS 512MB > 9800 GTX > GeForce GTS 250
  • Mr Perfect - Tuesday, February 2, 2010 - link

    Didn't they do 8800 GT -> 9800 GT -> GTS 240 too? Or was the 240 different than the other two?
  • DanNeely - Wednesday, February 3, 2010 - link

    The 8800/9800 GT had 112 SPs; the 240 only has 96, making it comparable to the best of the several 9600 models they offered.
  • mindless1 - Wednesday, February 10, 2010 - link

    There are a GT 240 and a GTS 240. The GTS is not just an overclocked GT; it's on the larger 55(?)nm process, versus 40nm for the GT.

    The GT 240 is what you're referring to; the GTS 240 is what the earlier post refers to.

    http://www.nvidia.com/object/product_geforce_gts_2...
  • marc1000 - Tuesday, February 2, 2010 - link

    Anyway, they always try their best to confuse the customer... and they did it again! Haha!
  • jeromekwok - Tuesday, February 2, 2010 - link

    Is Nvidia going to rename the good old 8800GT yet again as a GT300-series part?
  • AcydRaine - Wednesday, February 3, 2010 - link

    Haha, maybe as a low-to-mid GT300-series part. I would expect something like the GTX 260/275 chips to be the upper GT300 parts. Though the good ole G92 chip still has some pull. LoL
  • MrSpadge - Tuesday, February 2, 2010 - link

    The 8800GT will probably skip the 300 series and go straight to the 400 series, as it's such a successful card!
  • aegisofrime - Tuesday, February 2, 2010 - link

    Perhaps a 40nm version? AFAIK the GTS 250 is still 55nm. A 40nm version could be a low-mainstream card.
  • Bob Smith - Tuesday, February 2, 2010 - link

    What a pity!

    ATi/AMD delivered the DirectX 11 experience to its fans months ago. nVidia couldn’t make it. It’s real, and it’s a tremendous defeat! That’s what happened, that’s the truth.

    It’s called “opportunity cost”. Being the first to experience an ATi Radeon HD 5870 in all its glory is priceless: a single card crushing nVidia’s dual-GPU GTX 295. And we’re talking about a heavy title such as Crytek’s Crysis @ 2560×1600.

    According to Tom’s Hardware, the nVidia GTX 295 simply didn’t work at that resolution. Pity again! Please, see for yourself.

    http://www.tomshardware.com/reviews/radeon-hd-5870...

    One of the paragraphs from this review says:

    “Notice the missing result for Nvidia’s GeForce GTX 295 at 2560×1600 with 8xAA? That’s due to the card not having ample on-board memory to run that configuration (and the game not knowing any better than to keep it from trying). Grand Theft Auto gets around this by simply making resolutions unavailable if a graphics card doesn’t have a large enough frame buffer. Crysis crashes instead.”
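
    As an aside, here is a rough sketch of the kind of frame-buffer guard the quote describes; the function name, memory formula, and numbers are illustrative assumptions, not the game’s real accounting:

        # Hypothetical check in the spirit of the quoted behavior: offer a
        # resolution only if the estimated render targets fit in video memory.
        def fits_in_vram(width, height, msaa_samples, vram_bytes):
            bytes_per_pixel = 4   # assume 32-bit color
            targets = 2           # color + depth buffers
            needed = width * height * bytes_per_pixel * targets * max(1, msaa_samples)
            return needed <= vram_bytes

        MB = 1024 ** 2
        print(fits_in_vram(2560, 1600, 8, 128 * MB))   # False: ~250MB needed
        print(fits_in_vram(1280, 1024, 4, 128 * MB))   # True: ~40MB needed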

    GTX 295 not being able to run Crysis @ 2560×1600? Pity!

    Fermi has got to be better and faster than Cypress. It’s an obligation for nVidia to build it that way, since they’ve had, at the very least, more time to conceive it.

    And, as always, don’t be fooled: it’s going to hurt your pocket to have Fermi installed in your rig. Be prepared to pay the price. It happened with Cypress; it’s going to be the same with Fermi. And since nVidia cards are always much more expensive than ATi/AMD’s, one Fermi card could reach as much as 750 bucks. Wait and see.

    Take this weekend to go to your favorite retail store and grab your ATi Radeon HD 5870. It’s there; it’s real. Just take it.

    Fermi? Humpf… maybe in Q3 2010 you’ll get one. It’s just an illusion… a dream (that hasn’t come true… hehehehe…)

    Cheers!

    ATi/AMD vs nVidia

    ATi/AMD offers 3-monitor output from a single card. You don’t need CrossFire (CF).
    nVidia: the new Fermi delivers only 2-monitor output. If you want to experience a 3-monitor gaming setup, you must buy two nVidia cards, which is a lot more expensive than one single ATi Radeon HD 5970, for instance. Although it’s a bit expensive to afford a 3-monitor rig, that’s what gamers are starting to look at.

    ATi/AMD is delivering the ultimate gaming experience with its latest DX11 card.
    nVidia is moving away from the gaming industry, bringing horrible products that definitely didn’t make it.

    NVIDIA 3D Vision Surround – Expensive and A Huge Mess?
    http://en.expreview.com/2010/01/23/nvidia-3d-visio...

    We’re talking about gaming here, not working at the office with business solutions, CUDA, GPGPU, etc.

    @those_who_said_about_dx11_titles
    What the hell are CUDA and PhysX, anyway?
    We may have just a couple of titles now, but it’s the beginning of a new gaming generation with Windows 7 and DirectX 11; it’s a trend, it’s the future.

    Does anybody REALLY have any titles which benefit from PhysX? How many are available? Did you know that PhysX is proprietary? nVidia does not offer it as an open standard. Guess why there are so few (or no) PhysX titles, huh?

    ATi/AMD is working on a new generation of its current line of products, to be available in the second half of the year.
    nVidia: when exactly will Fermi be available? hehehehe… pity!

    ATi/AMD is completely committed to the gaming industry.
    nVidia is a big company, but it’s not working for the gaming industry anymore.

    ATi/AMD won last year. And it will win again this year.
    nVidia was a big FIASCO last year. And it will be again in 2010! Pity!

    Cheers!
  • wizzlewiz - Wednesday, February 3, 2010 - link

    What is this I don't even
  • osmosum - Tuesday, February 2, 2010 - link

    What a pity you are just picking and choosing results to suit your argument.

    There's actually a benchmark many sites use to convey what you are talking about: it consists of anywhere from dozens up to 100 games, and the FPS score of every game is added together. Nvidia always wins. SLI always comes in last. Get a clue.

    Oz
  • osmosum - Tuesday, February 2, 2010 - link

    The Sum-FPS benchmark gives 0 points for a game if it cannot play at that resolution. ATI cards wind up playing the fewest games when compared to Nvidia. That's all I need to know to decide what to buy. I don't have time to fiddle with settings to make *any* video card work; it had just better work, right off the bat. (A sketch of the scoring rule follows below.)

    Oz
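
    For illustration, a minimal sketch of that Sum-FPS scoring rule, with made-up game names and FPS numbers (no real benchmark data):

        # Hypothetical sketch of the Sum-FPS rule described above: add each
        # game's FPS, counting 0 for any game that fails at the resolution.
        def sum_fps(results):
            """results maps game name -> FPS, or None if the game wouldn't run."""
            return sum(fps or 0 for fps in results.values())

        card_a = {"Crysis": None, "Game B": 42.0, "Game C": 55.5}  # one failure
        card_b = {"Crysis": 21.5, "Game B": 38.5, "Game C": 49.0}

        print(sum_fps(card_a))  # 97.5 -- the failed game contributes nothing
        print(sum_fps(card_b))  # 109.0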
  • vidianess - Tuesday, February 2, 2010 - link

    When Fermi comes out, all ATI/AMD DX11 cards will be obsolete.
  • Galid - Tuesday, February 23, 2010 - link

    When Fermi comes out, it will itself be obsolete, Mr. Fanboy.
  • cactusdog - Friday, February 5, 2010 - link

    Wrong. When Fermi comes out, ATI won't be obsolete; it will still be a hellishly fast DX11 card that will play any game at the highest resolutions for the next couple of years. Fermi being slightly faster doesn't make ATI obsolete. Obsolete is what ATI did to Nvidia when they released a DX11 card: Nvidia had nothing, so their GTX 2xx series became obsolete.
  • mindless1 - Wednesday, February 10, 2010 - link

    You couldn't be more wrong. Show us titles using significant DX11 features; that's what it would take for DX10 parts, ATI's included, to be obsolete. Only a handful of games have any DX11 features at all.

    Instead, what matters is performance in the games people are playing.

    What is worse for nVidia is the lower performance per dollar.
  • AdamB5000 - Tuesday, February 2, 2010 - link

    Isn't everything in the computer world obsolete after about five minutes? I buy obsolete parts every time I upgrade my PC. My HD 5750 just arrived today! Yay!
  • StevoLincolnite - Tuesday, February 2, 2010 - link

    Could you be any more of a fanboy? At least most people try to hide it...

    I like ATI as much as the next guy... (Who doesn't like an underdog?)

    However, ATI has outdone themselves this time around. I just hope nVidia doesn't do another GeForce FX on us, otherwise there won't be the competition to drop prices further. :(
  • GeorgeH - Tuesday, February 2, 2010 - link

    The only real pity here is that your ridiculous fanboi manifesto isn’t more succinct.
  • ratbert1 - Tuesday, February 2, 2010 - link

    move along, nothing to see here...
  • Bob Smith - Tuesday, February 2, 2010 - link

    move along, nothing to see here...
  • Bob Smith - Tuesday, February 2, 2010 - link

    Agree with that...
  • LeftSide - Tuesday, February 2, 2010 - link

    Hmm... Just wondering... Does AMD write the check out to Robert, Bobby, or Bob???
  • Nfarce - Tuesday, February 2, 2010 - link

    Nah, the douche bag just has no life, like an AMD-loving, Intel-hating fangirl (and I do say "girl" because little zit-faced punks like this couldn't get a girl in the real world - or a hot one, anyway).

    What the limpdix like "Bob" don't understand is that competition is good. One year ATi wins; another year nVidia wins. When both take turns winning, we consumers all win. I've had GeForce and ATi cards, and have both in two current gaming rigs (an HD 4870 in one and a GTX 275 in another). And for the record, my GTX 275, overclocked and tuned with programs like nHancer, smokes the HD 4870 with the best it can do from ATi's CCC. But you'll never hear an ATi fangirl bring up overclocking facts or ATi driver and CCC snafus.

    In any event, I'll be selling one of those rigs this year and building another (probably the HD 4870 rig). If ATi still has the better card then, as it does currently, I'll buy ATi. If nVidia has better performance with the new card, however, even for more bucks (like I spent on the GTX 275 over the HD 4870), then I'll probably go nVidia. Finally, I game a lot in Microsoft FSX, and nVidia cards smoke the ATi cards in that program (one reason I have two rigs).

    Fangirls need not apply. Nothing to see here, right Bob?
  • Galidou - Wednesday, February 3, 2010 - link

    ATI owner here, not a fanboy; system builder, tester of every card on the market. On the ATI overclocking front: I never used CCC (it sucks), but ATI Tray Tools overclocks well. Comparing an overclocked GTX 275 with an ATI 4870? C'mon, look at the prices; the 4870 fares well against the GTX 260 (which is priced higher than the 4870). Bob, Fermi will overpower ATI and that's for sure; they developed something that was technically a challenge. Sure, it's gonna be overpriced, but it's gonna be for those who want performance over anything else, like price.

    The problem nowadays is that most games are developed for consoles and PCs, so they run amazingly on something like a GTX 260/4870 at high/max details at 1920x1080, except Crysis. The best games are going to come out on consoles and PC. So until an XboxCube or PlayStation 4 comes out, titles won't need an amazing video card; keep your CPU and RAM at a good level and everything will be alright. Performance-wise, these video cards are for benchers, extreme overclockers, or someone who owns a 30-inch monitor. That represents 0.01% (if that) of the video card market. Welcome!
  • coldpower27 - Friday, February 5, 2010 - link

    Agreed with the part about the GTX 275 and the 4870 being in two different price categories. I would say the 4890 and the GTX 275 are at even parity.

    Fermi is designed to push technical limits and have a halo effect on Nvidia's lineup, to help sell their lower offerings. It wasn't designed to be truly cost-effective, like ATI's price/performance parts are. Nvidia has a different goal in mind, and that is totally alright.

    With the GPU horsepower we have now, we can continue pushing boundaries: you can run a multi-monitor setup, or have GTX 295 performance levels in a single-GPU configuration at lower power consumption.

    If you want adequate performance, you can get something in the 9800GTX+ range that plays fine at 1680x1050.

    Take a look at Steam's survey results: even now, 1280x1024 (an older 17-inch CRT or 17-19-inch LCD resolution) and 1680x1050 (20-22-inch LCDs) represent the majority.

    These video cards were never developed with "adequate" in mind, Fermi is truly about pushing boundaries and being bold.

  • Targon - Monday, February 8, 2010 - link

    There is the fact that Fermi still isn't available, and AMD has had a good amount of time to improve things on the 5870. As a result, we may very well see a 5890 released by the time Fermi even hits the shelves. In addition to that, work is obviously going on toward the Radeon 6000 series while NVIDIA has to keep working to get Fermi out the door.

    Fermi may end up taking the performance crown by the time it is released, but we may be looking at another Radeon 9700 vs. GeForce 5800 situation here, where nothing NVIDIA does in this generation will let them catch up, due to all the "extra abilities" that their products are trying to offer.

    The Radeon 5870 (and a number of other 5000-series parts) is already capable of 3-monitor output, and not just in a few titles designed to support it. Eyefinity really does let you combine multiple monitors so that applications only see one display with the higher resolution available. Fermi, no matter the horsepower, may not be able to offer that.
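
    As a rough sketch of the "one big display" arithmetic (the panel sizes below are hypothetical examples, not anything Eyefinity prescribes):

        # Side-by-side panels get presented to applications as one wide surface.
        panels = [(1920, 1200)] * 3          # three identical monitors
        width = sum(w for w, h in panels)    # widths add up horizontally
        height = max(h for w, h in panels)   # height is the tallest panel
        print(f"{width}x{height}")           # prints 5760x1200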

    That's really about it: Fermi just isn't out yet, and until it is, the big question for most won't be about how POWERFUL it is; it will be about DirectX 11 support and, going forward, OpenCL support and other standards. Yes, PhysX support will matter in a handful of applications/games, but how long will it be before it is replaced by an open standard by application and game developers?
  • Darkness Flame - Tuesday, February 2, 2010 - link

    nVidia has already announced, and possibly launched, the GeForce 300M series, along with the G 310 OEM desktop card. They could be skipping the rest of the 300 series so that consumers will not be confused as to which cards carry the new architecture and which do not. Just my guess, however.
  • Taft12 - Tuesday, February 2, 2010 - link

    There have been 8800 rebadge jokes already, but I really think they are reserving the 300 namespace for rebadges of current products since the die size and price of Fermi will be incredibly high.
  • Mr Perfect - Tuesday, February 2, 2010 - link

    "But we had expected that NVIDIA would fill the rest of the 300 series with GF100 and GF100-derrived parts, similar to how the 200 series is organized with a mix of DX10 and DX10.1 capable parts."

    I, for one, am glad they arn't pulling that this time around.
  • shabby - Tuesday, February 2, 2010 - link

    Just wait, they'll rename those low end 300 cards into 400 ones in no time.
  • JarredWalton - Tuesday, February 2, 2010 - link

    I was going to point this out... given that the various 300M parts are all currently DX10/10.1, it would really be good to see all the 300 series parts follow that feature set. Then 400 series can be reserved for true DX11 parts. Kind of makes you wonder when we'll see 400M, eh? If the next mobile architecture out of NVIDIA is only DX10, they're going to have a tough battle against AMD/ATI's Mobility 5000 parts!
  • mindless1 - Wednesday, February 10, 2010 - link

    Really? The last thing I'd care about on mobile graphics is whether it supports DX11. Performance per watt on the other hand...
  • bunty - Thursday, February 4, 2010 - link

    Hey, maybe they are manufacturing the 275 through 295, all those DX10.1 cards, on the latest 40nm fabrication process, and they don't want to mix them up with their brand-new architecture.

    Like for onboard graphics or something for these mobile platforms...
  • breathlesstao - Tuesday, February 2, 2010 - link

    It's posted on nVidia's official Facebook page too. So I'd say it's solid.
  • vol7ron - Tuesday, February 2, 2010 - link

    agreed
