nathanddrews - Friday, April 22, 2016
I love these niche products. I don't even need or want it, but I'm already trying to think of a reason to install it in something. Any chance AT will test it? Has AT ever reviewed GK208 under a different name? I couldn't find anything.

domboy - Friday, April 22, 2016
Yeah, me too. This reminds me of the PCI (not Express) GeForce 430 they released a few years ago, which I did buy. There was a similar 5xx-series card if I'm not mistaken.
http://www.newegg.com/Product/Product.aspx?Item=N8...
I kind of do want this PCIe x1 card; I just don't know what I'd do with it. Folding@home, perhaps?

Thatguy97 - Friday, April 22, 2016
Nah, they'll say they'll review something, but it'll be six months and they'll never have reviewed it, just like the GTX 950.

Ian Cutress - Sunday, April 24, 2016
If you hadn't noticed, the GTX 950 and 960 numbers have been in Bench for a long while now.
http://anandtech.com/bench/product/1670?vs=1596

kwrzesien - Sunday, April 24, 2016
Maybe there should be weekly pipeline summaries of the products added to Bench?

Ian Cutress - Monday, April 25, 2016
For CPUs, I've recently been tweeting updates to Bench as they're made live. But it's an idea worth looking into for sure.

descendency - Saturday, April 23, 2016
https://www.youtube.com/watch?v=sph6cjJeRdI

Save yourself the pain and either use the onboard ("free") graphics or spend an extra few dollars and get a real graphics card.

Samus - Sunday, April 24, 2016
These cards are useful for giving a server PC with weak embedded graphics (Matrox, in the case of most Xeons) or an underperforming iGPU a boost for certain applications like HTPC, where only the most modern Intel CPUs (Haswell+) might actually perform as well as this card. Sure, Skylake with Iris Pro or the rare Haswell/Broadwell with Crystal Well will crush it, as will most AMD A8/A10 APUs, but really, that's about it. Lots of applications where an x1 card is useful.

Lythieus - Sunday, April 24, 2016
That is the exact application I would use it for. I have an X5670-based server that I use as a game and file server, running Windows. The onboard Matrox GPU has 16 MB of RAM; the Windows environment is almost unusable with that. Luckily I found an old GeForce 7300 LE with a 20 W TDP in a drawer to throw in, but my server is exactly what that 710 is built for.

trevdawg94 - Friday, April 29, 2016
Same here, with a dual Xeon E5-2670 server. I can't even use Splashtop to remote into the server currently due to the underpowered iGPU, and I can't even set the resolution to 1920x1080. This will save me from taking up one of the four PCIe x8 slots on the SuperMicro motherboard I have.

When I searched for this card, it didn't show up as available on any US-based website on Google, but Newegg does have it in stock. Here's the link: http://www.newegg.com/Product/Product.aspx?Item=N8...

Tom Braider - Monday, April 25, 2016
Speaking from personal experience, this is not always a matter of 'pain'.

If you want to use a hi-res display with a Celeron- or Pentium-class iGPU, you will quickly discover that the onboard HDMI/DVI outputs will not support anything over 1920x1080, and running a WQHD monitor over VGA is every bit as crappy as you can imagine.
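For anyone curious why those ports stop at 1920x1080: single-link DVI, and the HDMI ports on many budget boards, top out around a 165 MHz TMDS pixel clock. A minimal sketch of the arithmetic, assuming standard CEA-861 timing for 1080p and approximate CVT-RB timing for WQHD:

```python
# Pixel clock = total pixels per frame (including blanking) x refresh rate.
# Single-link DVI and older HDMI TMDS links top out around 165 MHz, which is
# why these outputs handle 1080p60 but not 2560x1440 at 60 Hz.
modes = {
    "1920x1080@60 (CEA-861 timing)": (2200, 1125, 60),  # (htotal, vtotal, Hz)
    "2560x1440@60 (CVT-RB timing)": (2720, 1481, 60),
}
SINGLE_LINK_LIMIT_MHZ = 165.0

for name, (htotal, vtotal, hz) in modes.items():
    clock_mhz = htotal * vtotal * hz / 1e6
    verdict = "fits" if clock_mhz <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{name}: {clock_mhz:.1f} MHz -> {verdict} the 165 MHz single-link limit")
```

1080p60 comes out to 148.5 MHz and fits; WQHD at 60 Hz needs roughly 242 MHz and doesn't, hence the fallback to analog VGA.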

pugster - Tuesday, April 26, 2016
The users of these video cards are mostly companies that are using outdated PCs and need two or more displays. Some people I know are using two video cards to drive three or more displays.

ShieTar - Sunday, April 24, 2016
It's just about a GT 640 cut in half. Testing won't tell you much; it is obviously too slow for any kind of relevant gaming or compute application, but it will handle 2D graphics and Full HD video playback just fine. It may or may not handle HEVC-coded 4K video, but it doesn't have HDMI 2.0, so it doesn't make sense to use it in a 4K video player anyway.

Also, this card has been available to OEMs for over two years (though at first with only 512 MB of memory). I'm a little surprised they're releasing this now as an end-customer version; I would rather have expected them to be getting ready to produce low-performance versions of Maxwell now that Pascal is about to be released.

Barilla - Sunday, April 24, 2016
First thing that comes to mind is any Xeon E5 machine, as there are no E5s with an iGPU as far as I know. You need some kind of video output, but not necessarily the graphics processing power, so why waste money on a fancy GPU?

ddriver - Friday, April 22, 2016
I wonder if we're going to see a USB 3.1 graphics card soon. The bandwidth is pretty much the same as PCIe x1. The latency is higher, but as long as individual transfers are combined into bulk transfers, latency shouldn't be that much of an issue.

JeffFlanagan - Friday, April 22, 2016
I use a USB 2.0 graphics adapter for my third monitor at work. While it's not for gaming, I don't perceive any latency issues when using that display; application windows move as smoothly on it as on my other displays. Win 7 with Aero enabled. I expect you could do a lot with a USB 3.1 graphics card.

nightbringer57 - Friday, April 22, 2016
USB-to-HDMI/DVI/VGA adapters don't typically contain a GPU. Their drivers usually "hack" the existing graphics card and create a sort of virtual additional monitor, rendered by the existing GPU; they then compress the frame (in the case of USB2 adapters, compress harshly) and send it to the adapter for output. This compression causes the usual latency noticed on such adapters, albeit barely noticeable in some cases.

You probably couldn't use your USB adapter on a completely GPU-less computer.
AFAIK, due to overall latency issues, USB3 adapters do the same thing, except with much better results, as the bandwidth and latency are both so much better than on USB2.
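To put rough numbers on the "compress harshly" point above (the throughput figures here are ballpark real-world values, not spec maxima):

```python
# Raw 1080p60 desktop video versus what each USB generation can realistically
# move, to estimate the compression ratio a USB display adapter must apply.
width, height, bytes_per_pixel, fps = 1920, 1080, 4, 60
raw_mb_s = width * height * bytes_per_pixel * fps / 1e6  # ~498 MB/s uncompressed

practical_mb_s = {"USB 2.0": 35, "USB 3.0": 400}  # rough effective throughput

for bus, mb_s in practical_mb_s.items():
    # USB 3.0 comes out near 1:1, i.e. almost no compression needed.
    print(f"{bus}: raw 1080p60 is {raw_mb_s:.0f} MB/s, bus moves ~{mb_s} MB/s "
          f"-> needs ~{raw_mb_s / mb_s:.0f}:1 compression")
```

USB 2.0 needs something like 14:1 compression to keep up with a full-motion desktop, which is where the visible latency and artifacts come from; USB 3.0 barely needs any.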

descendency - Saturday, April 23, 2016
You might see a Type-C Thunderbolt-backed one, but not a USB 3.1-based card, due to the latency issues of using the USB bus.

QuantumPion - Friday, April 22, 2016
This seems like it would be a perfect HTPC video card, except that it lacks HDMI 2.0, making it useless for that purpose.

JeffFlanagan - Friday, April 22, 2016
As long as it's a 1080p theater, HDMI 1.4 will get the job done. Anyone with a 4K theater can afford a better video card, and may have a pair of at least 970s so they can game.

QuantumPion - Friday, April 22, 2016
lol what? If someone has a <$1000 4K TV, they probably don't want to spend $300 more on a video card just to output 4K. And the idea of putting a high-end, high-power gaming card in a mini-ITX media enclosure used only for watching streaming 4K content is absurd. Many people use HTPCs to save money vs. consumer media players. There are still no cheap HDMI 2.0 video cards that I am aware of.

JeffFlanagan - Friday, April 22, 2016
I point out that you're incorrect, and your response is "lol, what?" I should know better than to respond to people who aren't making any sense.

For your post to seem to make any sense, one has to mistake a 4K TV for a home theater, and assume that most people with home theaters don't game. 4K projectors are still very expensive, so most of us do still have 1080p home theaters, with the 4K TV in the living room.
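For what it's worth, the HDMI limitation both sides are arguing around comes down to pixel clock: HDMI 1.4's TMDS link tops out near 340 MHz, HDMI 2.0's near 600 MHz. A quick check, assuming standard CTA-861 UHD blanking totals:

```python
# UHD pixel-clock demands versus HDMI TMDS limits: 4K30 fits within HDMI 1.4,
# but 4K60 needs HDMI 2.0 -- hence the complaint about this card for 4K HTPC use.
modes = {"3840x2160@30": (4400, 2250, 30), "3840x2160@60": (4400, 2250, 60)}
limits_mhz = {"HDMI 1.4": 340.0, "HDMI 2.0": 600.0}

for name, (htotal, vtotal, hz) in modes.items():
    clock_mhz = htotal * vtotal * hz / 1e6
    ok = [std for std, lim in limits_mhz.items() if clock_mhz <= lim]
    print(f"{name}: {clock_mhz:.0f} MHz pixel clock -> OK on {', '.join(ok)}")
```

So a GT 710 can drive a 4K display over HDMI 1.4, but only at 30 Hz, which is fine for some video content and painful for a desktop.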

Samus - Sunday, April 24, 2016
People don't need to spend $300 for 4K unless they're gaming. For HTPC duty you would need no more than a three-year-old 750 Ti.

abhaxus - Friday, April 22, 2016
One excellent use for such a card is in a server. I have an FX-8320-based server at home that has no iGPU. I use all of my PCIe x16 slots (quad gigabit NIC, 10GbE NIC, SAS card, and SAS expander), so I use an AMD PCIe x1 card for console output.

bill.rookard - Friday, April 22, 2016
Exactly. I have an Asus 2U rackmount, and the standard VGA was soooo pathetically slow as to be near useless. I dropped a GT 630-based card in there (25 W TDP) and it's no comparison. Plus, the higher-TDP cards require PEG power connectors, which sometimes aren't available with a server power supply.

But as you mentioned, these work perfectly well in these circumstances.
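The PEG-connector point follows from the PCIe power budget: a slot alone supplies at most about 75 W, so anything hotter needs auxiliary 6-pin (+75 W) or 8-pin (+150 W) connectors. A rough rule-of-thumb sketch (real cards vary, and the TDP figures below are approximate):

```python
# Estimate which auxiliary PEG connectors a card needs, given that a PCIe
# slot alone provides at most ~75 W. Purely a rule of thumb.
def aux_connectors(tdp_w, slot_w=75):
    deficit = tdp_w - slot_w
    if deficit <= 0:
        return "none (slot-powered, like this GT 710)"
    if deficit <= 75:
        return "one 6-pin"
    if deficit <= 150:
        return "one 8-pin (or two 6-pin)"
    return "8-pin + 6-pin or more"

for card, tdp in [("GT 710", 19), ("GT 630", 25), ("GTX 960", 120)]:
    print(f"{card} (~{tdp} W TDP): {aux_connectors(tdp)}")
```

Which is exactly why low-TDP cards like these are the only practical option in servers whose PSUs expose no PEG leads at all.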

JeffFlanagan - Friday, April 22, 2016
Can this be used as a PhysX card? If so, a lot of mid-range game machines have a spare x1 slot available.

BrokenCrayons - Friday, April 22, 2016
Based on the PhysX FAQ on NV's site (http://www.nvidia.com/object/physx_faq.html#q3), it looks like this GPU easily meets the fairly modest minimum requirements. In summary, you need an 8-series GeForce (think 8800 GTS era) with at least 32 cores and at least 256MB of VRAM.

I think my only question is whether there are enough PhysX titles out there and whether dedicating a GPU to PhysX would make it worth doing. I was under the possibly mistaken impression that there weren't a lot of PhysX games in the wild.

hansmuff - Friday, April 22, 2016
There are a bunch of considerations around your question. First off: yes, this should work.

Here's a decent article on the matter: http://www.volnapc.com/all-posts/how-much-differen...
They used a GTX 650 as the PhysX card, and if I see this correctly, the GT 710 will be quite a bit slower. It would be a very cool test to do.
The other consideration is games, and which ones use PhysX:
http://www.geforce.com/hardware/technology/physx/g...
If you play those a lot, I would definitely give it a shot. The card is cheap enough.

fazalmajid - Friday, April 22, 2016
These cards could also be useful for people who are trying to drive massive numbers of monitors, like digital signage, where performance is not critical.

fanofanand - Friday, April 22, 2016
Throughout the article, x4 and x1 are used interchangeably. Was this designed for x1 or x4? It seems that with the low performance and bandwidth it wouldn't need more than x1, but the very first paragraph says "The new GeForce GT 710 graphics card with PCIe 3.0 x4 interface is not going to outperform modern higher-end iGPUs in games, but it will help owners of very low-cost systems". Scroll a little further and it says "The main selling points of the ZOTAC GT 710 are its PCIe 3.0 x1 interface". So which is it?

magreen - Friday, April 22, 2016
Definitely confusing. But the photo shows an x1 interface.

nightbringer57 - Friday, April 22, 2016
Clearly an x1 card. The x4 is a typo.

Ryan Smith - Friday, April 22, 2016
Sorry about that. Fixed.

BrokenCrayons - Friday, April 22, 2016
x1 - but your comment brings up an interesting thought about our perspective on performance. I know the numbers aren't quite comparable because of the cross-generational nature of the comparison, but bear with me. A relatively high-end GeForce 8800 GTS munched up over 100 W to feed 92 stream processors clocked at 500 MHz and 320 or 640 MB of VRAM over a first-generation PCIe x16 slot that offered about 4 GB/s of bandwidth. A PCIe 3.0 x1 slot has only about 1 GB/s of throughput, but it is going to be burdened with 192 much more modern CUDA cores at almost double the clock speed. True, the 710 has a lot less memory bandwidth (the 8800 had a 320-bit wide bus), but I would think that if it were given access to a PCIe x16 slot, it could most certainly put the added throughput to use. It's just a pity this thing isn't on GDDR5, as it probably has enough GPU power to take advantage of a lot more memory bandwidth, if comparing it roughly to an 8800 GTS is any indication.

nightbringer57 - Friday, April 22, 2016
The 8800 GTS's shaders were clocked at 1200 MHz ;)

And even at that time, on PCI Express 1.0/1.1, you could barely see the difference between x4 and x16 slots, let alone x8 and x16 slots.
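The bandwidth figures being traded in this sub-thread fall straight out of the PCIe line rates and encoding overhead (Gen 1/2 use 8b/10b encoding, Gen 3 uses 128b/130b); a small sketch:

```python
# Per-lane PCIe payload bandwidth by generation, confirming the ~4 GB/s for
# a Gen1 x16 slot and ~1 GB/s for a Gen3 x1 slot quoted above.
generations = {  # name: (line rate in GT/s, encoding efficiency)
    "PCIe 1.0": (2.5, 8 / 10),
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),
}

for gen, (gt_s, eff) in generations.items():
    gb_s_per_lane = gt_s * eff / 8  # bits -> bytes
    print(f"{gen}: {gb_s_per_lane:.3f} GB/s per lane "
          f"(x1 = {gb_s_per_lane:.2f} GB/s, x16 = {16 * gb_s_per_lane:.1f} GB/s)")
```

So a Gen3 x1 link (~0.98 GB/s) lands right between a Gen1 x4 and a Gen1 x8, which, per the comment above, was already enough for cards of this class back then.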

BrokenCrayons - Friday, April 22, 2016
Yeah, I admit I didn't look more than briefly at the 8800 spec chart when I was putting together my previous post. That whole difference between core and shader clock didn't cross my mind. I'd still be pretty interested in an AnandTech analysis measuring the difference between a PCIe x16 and x1 GT 710.

fanofanand - Friday, April 22, 2016
I was just going to say: I had the second-gen 8800 GTS and I remember clocking it over 1200. Fantastic card while it lasted.

extide - Friday, April 22, 2016
It's x1; the x4 is a typo.

ingwe - Friday, April 22, 2016
Please tell me I am not the only one who laughed at the pun in the title.

xthetenth - Friday, April 22, 2016
Isn't part of the idea of PCIe that you can plug whatever size connector into whatever size slot? I'm pretty sure that PCIe slots have an open back just for that sort of scenario.

DanNeely - Friday, April 22, 2016
Closed-back slots seem to have gone out of style (when PCIe first came into being they were relatively common in x1 and sometimes even x4), but OEMs are still putting stuff in the space where an x16 card's connector would go. From the four most recently reviewed mobos:

http://anandtech.com/show/10236/the-msi-z170a-sli-...
2 of 3 x1 slots are obstructed for x16 cards (1 battery, 1 M.2 screw post)
http://anandtech.com/show/10041/gigabyte-z170x-ud5...
2 of 3 x1 slots are obstructed for x16 cards (1 battery, 1 M.2 screw post). This one also has closed-back slots.
http://anandtech.com/show/10264/the-gigabyte-mw31-...
The lone x4 slot is obstructed by the BIOS battery.
The fourth has four x16 slots, making it mostly a moot point. Its two x1 slots don't have any crap in the way of an x16 card's connector, but both would be blocked by a 2-slot GPU.
http://anandtech.com/show/10206/the-asus-maximus-v...

LuxZg - Saturday, April 23, 2016
Not all motherboards have open slots, and even when they do, sometimes other components get in the way.

And besides, PCIe was designed for compatibility "the other way around": putting x1, x4, or x8 cards into an x16 slot. Using an x16 card in an x1 slot wasn't by design. There is a design for an x16 physical connector with only x4 or x1 lanes wired, and this is also what PCIe switching allows for slots that can go from x16 to x8 or x4.
Anyway, long story short: if you want to use an x16 graphics card in an x1 slot on a motherboard, you have a much greater chance of failing than succeeding... while putting an x1 card in an x16 slot has to work by the specification of the standard.

Bateluer - Friday, April 22, 2016
Looks like the 710 is the same as the 720. Unless the 720 comes with GDDR5 more often than DDR3?

watzupken - Friday, April 22, 2016
I wonder how much better this will perform than current-gen iGPUs. I don't think there is any incremental benefit at all.

LuxZg - Saturday, April 23, 2016
Please - review this if you can get one! This is not just for people with old computers; it can be used in servers for some special cases, in PCs for, again, special uses where many displays are needed, and so on. These are all rare situations, but many such niche markets combined make a relatively large market for such a device. Just see the number of comments on this "niche product" news article!

Ian Cutress - Sunday, April 24, 2016
What benchmarks? Just IGP gaming plus OpenCL?

Ian Cutress - Sunday, April 24, 2016
Inb4 folding. There's a new FAH benchmark, IIRC.

Denithor - Tuesday, April 26, 2016
Actually, that would be fine, but I would prefer to see how these work for general HTPC duties: image quality, 1080p30/1080p60 streaming, etc. I'm in a situation right now of needing a video card for my dad's PC to enable HDMI output to his TV. I'm curious whether this would fit the bill for basic needs in this regard. Thanks! :)

asmian - Thursday, April 28, 2016
>does not require active cooling (which means, it is also whisper quiet)

Well, if it doesn't have any fan, doesn't that mean it's actually SILENT? Whispers, however quiet, by definition involve noise. Fanless tech makes none. If the article is factually inaccurate about basic science, what hope should we have for the technical details? :(

ArdWar - Saturday, May 14, 2016
Nah, air convection created by the thermosiphon effect does move air, and there's bound to be some turbulence between the fins, therefore creating a tiny amount of noise. It counts, heh... :D

AbRASiON - Tuesday, May 10, 2016
Is this a viable HTPC card? I have to wonder, with HBM tech, just how impressive a basic slimline HTPC card will be in such a small space in 18 to 24 months.

jeffry - Sunday, May 22, 2016
I love these Zotac cards. Cheap, low power, and multi-display. I buy them to compile CUDA code. I had the GT 610 'Fermi' and now the GT 630 'Kepler', both from Zotac. I posted some pictures and specs in the forum linked below:

http://www.computerbase.de/forum/showthread.php?t=...

herbgo - Wednesday, May 25, 2016
Hmm... would the GT 710 or GT 720 work in an old Dell SFF OptiPlex 980 to replace an ATI Radeon 3450? It would need to drive two displays but also run Adobe Creative Cloud.

TheWolfHowling - Saturday, July 2, 2016
This x1 GPU looks perfect for my unRAID NAS/HTPC build, as I need a dGPU for the HTPC VM since the HD 530 graphics will be utilized by unRAID's GUI. Because the Skylake i5-6400 has only 16 Gen 3.0 lanes, after an x8 HBA card and an x4 NVMe SSD I have just 4 lanes left for the GPU. This card will allow me to install the required dGPU and a USB controller card and still have a couple of spare lanes for future expansion.

desparate - Saturday, August 27, 2016
Would anyone know of a bricks-and-mortar shop in London (UK, central London if possible) that might just have such a puppy available for purchase TODAY? I have an HP ProLiant ML370 G5 server which I would LOVE to get up and running TODAY. I have successfully got Win 10 onto the thing with only ONE little problem: it will not load any driver for the onboard ATI ES1000 video except for the Microsoft Basic Display driver. I actually have a PCI-E x16 GTX 680 card lying around, and I'm desperate enough that I took a soldering iron and melted off the back of the last PCI-E slot so I could insert the (huge) card -- but it's not even recognized. (I'm quite sure I did a careful job, but ??) At this point I'll take ANY PCI-E x1 graphics card -- but it'd be awesome to be able to find one of these. Ordering online will get me a card on Tuesday at best, too late to be of use. If you have one personally, or if you might know what magic I need to do to get the large NVIDIA card recognized, send mail to [email protected] and I'll reply immediately. Thanks!