60 Comments
tareyza - Wednesday, January 27, 2016 - link
No coverage on Intel's new high end Iris iGPU? (The ones with the 2+3e, for example.) It seems they were just released; here's a video of a laptop with one: https://www.youtube.com/watch?v=4-uIC-K0vFs&ab...
Ryan Smith - Wednesday, January 27, 2016 - link
In fact we just had a story on the subject yesterday: http://www.anandtech.com/show/9990/skylake-iris-pr...
tareyza - Wednesday, January 27, 2016 - link
Sorry, I should clarify, I meant the ones with consumer chips (like the i series). Although that article does detail the 4e variant, it doesn't mention the 3e variant or pricing/details on the consumer (i.e. non-Xeon) 4e variants.
Ryan Smith - Wednesday, January 27, 2016 - link
2+3e won't be coming for the desktop. As for mobile, we'll be looking at those once we have review units in.
tareyza - Wednesday, January 27, 2016 - link
Cool, looking forward to it then.
Byte - Wednesday, January 27, 2016 - link
I think they have a nice market by making video cards with 1 of every output or 4 of 1 type of output.
Anonymous Blowhard - Thursday, January 28, 2016 - link
Shortcut to the "gaming" part - 16 minutes in, Bioshock Infinite gets briefly played at 1080p Medium, and seems to be pretty stable in the low 30fps ballpark.
Samus - Wednesday, January 27, 2016 - link
Kepler? Really? Are they just using up old inventory at this point or something? I don't see any reason NVidia should be launching non-Maxwell based parts in 2016.
yannigr2 - Wednesday, January 27, 2016 - link
GT 930 will have three versions. One Maxwell 1, one Kepler and one Fermi.
silverblue - Thursday, January 28, 2016 - link
I'm waiting for a certain ch- person to waltz in here saying this isn't a rebrand. And that there is a market for this.
yannigr2 - Thursday, January 28, 2016 - link
I think he stopped posting everywhere. One more indication that he was on a payroll. A temporary job. JMO.
MrSpadge - Thursday, January 28, 2016 - link
If they made a new low end chip based on Maxwell they'd probably lose money on this, given these cards are probably extremely cheap for OEMs. And Maxwell would be of hardly any benefit to those using such cards: they don't really care about their GPU, otherwise they'd be using at least the GT 720.
10101010 - Wednesday, January 27, 2016 - link
Are all of these cards PCIe 2.0 x16? Seems like another niche market for low end discrete graphics would be DIY compute/deep learning systems that don't have processor/motherboard graphics, but do have an empty x1 or x4 PCIe slot.
extide - Wednesday, January 27, 2016 - link
Sometimes cards like these are released as PCIe x1 cards -- of course this is just the OEM building it that way. I know Zotac has done this in the past.
Mr Perfect - Thursday, January 28, 2016 - link
I guess you could always cut out the plastic on the end of the PCIe 1x slot and let the card hang off the end of it. The card should negotiate whatever link is available.
yannigr2 - Wednesday, January 27, 2016 - link
On Nvidia low end cards, that memory clock means ABSOLUTELY NOTHING. It is ABSOLUTELY MISLEADING AND A LIE. Even many GT 710 models that were just announced are coming with slower 1600MHz DDR3, and if you look in the market at, for example, the GT 610, you will have a hard time finding models with 1800MHz DDR3. Most come with memory at frequencies from 1066MHz to 1400MHz. Looking also at how the GT 720 is slower in the GPU compared to the GT 710, I don't expect the GT 710 to keep that memory frequency at 1600-1800MHz for too long.
MonkeyPaw - Wednesday, January 27, 2016 - link
I fail to see the point of this thing. It's not fast enough for games, so why would you add complexity to a system where the IGP is good enough? Sandy Bridge is about 5 years old now, and that iGPU can handle basic Windows.
10101010 - Wednesday, January 27, 2016 - link
If it came in a PCIe x4 format, I might buy one for my compute box.
For mainstream uses, I think many people forget that even a slow 2GB discrete video card is nice to have on a slower/memory-constrained system.
BrokenCrayons - Thursday, January 28, 2016 - link
It is hard to see much value in a card like this. The 64-bit DDR3 memory is going to inhibit performance for a 192 CUDA core part. However, for someone who might own a pre-Sandy Bridge system and want a little more graphics power without a lot of expense or worry over power supply upgrades and the like, this might not be a bad option. I personally wouldn't suggest anything less than a GT 730 with GDDR5 on a 64-bit interface for about 40 GB/s, but those sorts of cards retail for north of $60 which might be more than someone wants to spend on a graphics card.
xthetenth - Thursday, January 28, 2016 - link
The only use I could see for it is as an adaptor card for running three monitors when the iGPU has awkward outputs.
JimmiG - Thursday, February 4, 2016 - link
Yeah, I mean adding this thing to an IGP system isn't going to change anything, so what's actually the point? The IGP is already good enough for Windows desktop compositing. Anything beyond that, neither the IGP nor this GT 710 are going to be enough. You might go from 3 FPS to 6 FPS... still unplayable.
I know some people can't afford high-end GPUs etc., but buying this is so pointless you're just pouring money down the drain. At least if you buy something like a GTX 950 you can actually run games on it.
Forthrast - Saturday, February 27, 2016 - link
I just bought one of these for my wife's PC. It has an A8-3850 processor, which with its quad-core CPU and HD 6550D IGP was absolutely fine for web surfing, photo processing, MS Office and solitaire, which makes up 99.9% of what she does. However, after upgrading to Windows 10, we discovered that the IGP had been relegated to "legacy" status by AMD and the only drivers were "beta" and debilitatingly buggy. After researching the matter, I discovered that the GT 710 cards were the cheapest, lowest-power, recently new option with full, overt Windows 10 support listed on the box. I paid the $40 for a Zotac card at Microcenter, popped it in, installed the drivers: problem solved.
WorldWithoutMadness - Wednesday, January 27, 2016 - link
The only use case I can think of is salesmen deliberately selling these to tech-illiterate customers buying a new system, for more money.
I mean, for $30-50 you can get a better heatsink, mobo, or a processor with a higher performance iGPU.
Or, like Anton said, for an old system with low iGPU performance.
SkipPerk - Friday, February 5, 2016 - link
I buy cards like this all the time. I usually get the AMD 6450 or 5450. They are great when you need graphics with Xeon workstations or to get more monitors, not to mention other monitor cables (if you have HDMI-only monitors and all the PCs are DVI or VGA). I routinely head over to Microcenter and buy a half dozen of these at a time. They usually cost $30.
DanNeely - Wednesday, January 27, 2016 - link
The comment about all of nVidia's partners launching 710s makes me wonder: do they still require them to buy a full set of GPU parts all the way down the stack to maintain access to the high performance/high margin parts at the top of it?
az060693 - Wednesday, January 27, 2016 - link
Having HDMI 2.0 would've made it viable for an HTPC build (over Intel integrated), but it doesn't have HDMI 2.0 either. Either way, you'd have to use a DP to HDMI 2.0 adapter, so there's not much difference there.
tviceman - Wednesday, January 27, 2016 - link
I remember when Fermi was delayed and Charlie accused Nvidia of forcing their AIBs to buy a certain number of their first 40nm product (a very small die, very low end chip... can't remember the code name). I don't know if there was any truth to Charlie's ramblings, but who knows, we could be looking at a similar situation... Nvidia granting priority access to Pascal if AIBs help move old inventory.
extide - Wednesday, January 27, 2016 - link
Hrmm, that does sort of make sense...
extide - Wednesday, January 27, 2016 - link
It is SOOO weird why they are releasing GK208 parts ... now. There is a GM208!! Why not release that..?
extide - Wednesday, January 27, 2016 - link
Erm, I meant GM108 -- but still same difference!
benzosaurus - Thursday, January 28, 2016 - link
The awkward moment when Nvidia releases new desktop cards that are slower than their current-gen tablet SoCs.
stardude82 - Thursday, January 28, 2016 - link
Hey, I might buy a couple. These should slot around the HD 4670/GT 240 performance tier and will play Source based games (L4D, DotA2) and Minecraft just fine. As AMD is axing W10 drivers for legacy cards, I might be in the market.
yannigr2 - Thursday, January 28, 2016 - link
GT 240 Memory Bandwidth: 28.8 GB/sec or 54.4 GB/sec
HD 4670 Memory Bandwidth: 32 GB/sec or 27.94 GB/sec
GT 710 Memory Bandwidth: 14.4 GB/sec BEST case scenario.
The GT 240 and HD 4670 will totally destroy the GT 710 in every 3D benchmark/game out there. ALWAYS check the available bandwidth first, then the number of cores and everything else. With a bandwidth of only 14.4GB/sec or even lower, the GT 710 is going to be 2-3 times slower than those cards in 3D games. If it had at least 30-40GB/sec it could perform better than those two old cards, but that 14.4GB/sec is a HUGE bottleneck.
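Those figures follow from simple arithmetic: peak bandwidth is bus width (in bytes) times the effective transfer rate. A minimal Python sketch of the math; the effective memory clocks below are assumptions based on typical board specs, which, as noted above, vary by model:
```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * effective
# transfer rate in MT/s. Effective clocks are assumed from typical board
# configurations; actual boards vary.
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * mts / 1000

cards = [
    ("GT 710, 64-bit DDR3-1800 (best case)", 64, 1800),   # 14.4 GB/s
    ("GT 240, 128-bit DDR3-1800",            128, 1800),  # 28.8 GB/s
    ("GT 240, 128-bit GDDR5-3400",           128, 3400),  # 54.4 GB/s
    ("HD 4670, 128-bit DDR3-2000",           128, 2000),  # 32.0 GB/s
]
for name, bus, mts in cards:
    print(f"{name}: {bandwidth_gbs(bus, mts):.1f} GB/s")
```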
Klimax - Thursday, January 28, 2016 - link
Will they? You are awfully certain about an untested part. You are abusing one set of numbers and ignoring the rest. (As if we haven't learned from other cases, like the Kepler/GCN comparison.) Since there is no evidence or data to back your assertions up, this post is nothing more than baseless supposition.
MrMilli - Thursday, January 28, 2016 - link
LOL
yannigr2 - Thursday, January 28, 2016 - link
Yes they will. That bandwidth is far too low to have any 3D performance at all. I know because I have a GT 620 (Fermi based, 96 CUDA cores) and was comparing it with an old 9800GT (112 cores, if I remember correctly). The 9800GT was many times faster than the GT 620. At CUDA encoding, yes, the GT 620 was better, but in Borderlands, for example, at 720p, where the 9800GT was scoring 30fps, the GT 620 at the same settings was producing a laughable slideshow.
The bandwidth is very, very low. With such limited bandwidth, architecture differences or CUDA core count are not the main factor.
xthetenth - Thursday, January 28, 2016 - link
Memory bandwidth is crucial, as shown by every review of AMD APUs that can't perform up to the specs of their GPU because of bandwidth limitations. It's a nice reliable upper bound and a good indicator of when the worst stuttering type problems will occur. It wouldn't be a superstar if it had HBM, but similarly that bandwidth is going to keep performance very low even if the chip were a Titan X OC.
stardude82 - Thursday, January 28, 2016 - link
Thanks for that point. Then this is a horrible deal compared to the GT 730, which is still only a 25W part and can be had for $50.
yannigr2 - Friday, January 29, 2016 - link
Be careful with the GT 730. There are three versions of it:
http://www.geforce.com/hardware/desktop-gpus/gefor...
One Kepler with 64bit GDDR5, one Kepler with 64bit DDR3, and one Fermi with 128bit DDR3. Of those three, only the GDDR5 version can be considered a good option. With 40GB/sec of bandwidth it can offer performance between a GT 640 and a GTX 650. The other two versions, especially the Kepler with the 64bit bus and DDR3, are not good options.
Buy only GDDR5 cards.
If you buy a DDR3 card, check the memory bus first and don't buy anything with a DDR3 and 64bit data bus combination. When comparing such low end cards, always buy the one with the fastest RAM, NOT the most RAM. 4GB of DDR3 at 1200MHz is much worse than 1GB of DDR3 at 1800MHz.
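The same bus-width-times-transfer-rate arithmetic backs up both recommendations; a quick sketch, where the effective clocks are assumptions for common board configurations:
```python
# Bandwidth = bus width in bytes * effective transfer rate.
def bandwidth_gbs(bus_bits, mts):
    return bus_bits / 8 * mts / 1000

# The three GT 730 variants (assumed effective memory clocks):
print(bandwidth_gbs(64, 5000))   # Kepler, 64-bit GDDR5: ~40 GB/s - the good one
print(bandwidth_gbs(128, 1800))  # Fermi, 128-bit DDR3:  28.8 GB/s
print(bandwidth_gbs(64, 1800))   # Kepler, 64-bit DDR3:  14.4 GB/s - avoid

# Fastest RAM beats most RAM: capacity adds nothing to bandwidth.
print(bandwidth_gbs(64, 1200))   # 4GB DDR3-1200:  9.6 GB/s
print(bandwidth_gbs(64, 1800))   # 1GB DDR3-1800: 14.4 GB/s
```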
stardude82 - Friday, January 29, 2016 - link
The Fermi version is garbage. A quarter of the cores and twice the heat.
yannigr2 - Saturday, January 30, 2016 - link
The 400/500 series were competitive with 600/700 series cards that had twice the cores:
http://www.anandtech.com/bench/product/1337?vs=112...
Also, when dealing with bandwidth this low, you can end up with that "garbage" performing better than the newer version, just because of its 20-28GB/sec of bandwidth. The worst option here is the 64bit+DDR3 Kepler card, not the Fermi one.
SkipPerk - Friday, February 5, 2016 - link
This is a good point. Even the GT 640 can be pretty awful for the price. You are usually better off going AMD below $100 (I currently have a GTX 970, so I'm no AMD fanboy).
The old 7750 and 6750 were great cards for cheap, as was the 7790. I recall seeing a 7750 on sale at Best Buy once for $50. The GT 640 was $100.
BrokenCrayons - Thursday, January 28, 2016 - link
Typo in the 5th paragraph, 1st sentence which reads "Meanwhile from a technical perspective, as the GeForce GTX 710..." Shouldn't the GTX be a GT instead?
ImSpartacus - Thursday, January 28, 2016 - link
Definitely not. This is a high end gtx graphics card for sure. This is how you get all the fast colors.
BrokenCrayons - Friday, January 29, 2016 - link
Ah, my mistake. You're completely correct. The new GTX 710 will make all other GPUs released before it look shamefully slow by comparison. I don't know what I was thinking. ^.^
Anato - Thursday, January 28, 2016 - link
<15W, passively cooled, HDMI 2.0, HDCP 2.2 and DP, and I'd buy 2.
I don't know why these low end cards are always 2+ years behind. They are well suited to letting old computers drive big screens like TVs, etc. I don't need performance, I need new interfaces and low power.
A5 - Thursday, January 28, 2016 - link
Amortizing development costs, and old chips are cheaper to make. From a business view, it is pretty simple.
And no one who "needs" HDMI 2.0 to drive a 4K display is going to buy a $30 video card.
SkipPerk - Friday, February 5, 2016 - link
You might be surprised. I met a guy with a high-end graphics card who did not game. He was using the DVI off the motherboard to the monitor. He had no idea. He just had a large, expensive graphics card as well as the power supply to drive it for no other reason than looks.
qlum - Thursday, January 28, 2016 - link
The selling point for me of these low end GPUs has generally been to bring triple monitor setups to low end office PCs, and for the people who need Adobe stuff (it crashes when running displays on both the iGPU and the dGPU). I use Sapphire Flex ones that are also rather inexpensive. Not everyone really needs good GPU performance.
firerod1 - Thursday, January 28, 2016 - link
This card is good because of the 3 different display connections. I've used plenty of low end cards like this at the places I work at as their IT guy. Works great for workstations that want dual monitors and such.
Anonymous Blowhard - Thursday, January 28, 2016 - link
> fight integrated graphics
Fight HD 3000 and 4000 maybe; anything newer than that should give it a good fight, and I imagine the HD 530 will beat it handily, never mind any eDRAM setups.
Shadowmaster625 - Thursday, January 28, 2016 - link
Never buy a card that has DDR3. Just don't do it. The cheapest card on eBay that has GDDR5 is going to smoke this thing and cost less.
versesuvius - Thursday, January 28, 2016 - link
Why isn't Intel making dGPUs?
stardude82 - Friday, January 29, 2016 - link
It took them only 18 years to make a competitive GPU...
Alexey291 - Sunday, January 31, 2016 - link
You're making it sound incredibly simple.
stardude82 - Monday, February 1, 2016 - link
They last tried with the i740 in 1998 and got crushed by nVidia and ATI. I don't think it's a coincidence that performance has shot up around the time patents from that era are expiring.
moozoo - Thursday, January 28, 2016 - link
Why would anyone buy a GT 710 over the Kepler version of the GT 730? See the specs: http://www.anandtech.com/show/8225/best-video-card...
As far as I can tell the GT 730 uses 4W more but doubles the cores and texture units.
Anyway, I have a GT 730 (Kepler) in a low profile, low power PC (Intel Core 2) and it runs great. Games like Ark: Survival Evolved, Battlefield, Minecraft, etc. play great on low settings.
stardude82 - Friday, January 29, 2016 - link
As above, it would make sense if it ran on a 128 bit memory bus or GDDR5. I'd take one at half the performance of a GT 730 but 60% of the price and 74% of the heat.
junky77 - Thursday, February 4, 2016 - link
Even more powerful GPUs are not much better than current midrange iGPUs like the HD 530. For example, the GT 730M with 384 CUDA cores:
http://www.notebookcheck.com/NVIDIA-GeForce-GT-730...
nucc1 - Sunday, February 28, 2016 - link
I wonder why none of the GPU manufacturers seem to think there is a market for non-gamers who need a GPU with a low TDP (passive cooling) that can drive 2x 4K monitors at 60Hz (2x DisplayPort 1.2a).
I think there are a lot of programmers and professionals out there who would buy something like this, but of course, I'm only speaking for myself.
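For what it's worth, the link budget works out: a single DisplayPort 1.2 connection already has headroom for 4K at 60Hz with 8-bit color, so two ports would cover this request. A rough feasibility check, assuming CVT-R2 reduced-blanking timing and 8b/10b link encoding:
```python
# Can one DP 1.2 port drive 3840x2160 @ 60 Hz, 8 bpc RGB?
# Assumptions: ~533 MHz pixel clock (CVT-R2 reduced blanking) and
# 8b/10b encoding overhead on the link; real margins vary with timing.
pixel_clock_mhz = 533.25              # assumed CVT-R2 timing for 4K60
bits_per_pixel = 24                   # 8 bits per channel, RGB
needed_gbps = pixel_clock_mhz * bits_per_pixel / 1000  # ~12.8 Gbps

raw_gbps = 4 * 5.4                    # 4 lanes at HBR2 (DP 1.2)
payload_gbps = raw_gbps * 8 / 10      # 8b/10b -> 17.28 Gbps usable

print(f"needed: ~{needed_gbps:.1f} Gbps, available: {payload_gbps:.2f} Gbps")
print("one port per 4K60 monitor fits:", needed_gbps < payload_gbps)
```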