Lolimaster - Tuesday, September 11, 2018 - link
When GPU mining is dead. If I were them I would release GPUs with no fan, just a big aluminum heatsink, so people can add fans from whatever vendor they want and get better cooling.
A big chunk of metal + a 120mm fan is way better than those triple 85mm fans.
Dragonstongue - Tuesday, September 11, 2018 - link
Damn right, a well-made full aluminum heatsink per GPU would allow a crap ton of cooling compared to most of the blower-style or even the vast majority of dual- or triple-fan "custom" coolers that blow the heat everywhere. Put a slight incline on the heatsink and one or two heatpipes and you get crazy good cooling. Where there is a will there is a way.
But I totally agree, chasing a "miner" card now is basically pointless; might as well just fork out the bigger bucks and go with a dedicated ASIC, which buries the best of the best GPUs out there at a fraction of the energy and heat output.
No wonder the Radeons are still seeing bloated pricing.
gfkBill - Tuesday, September 11, 2018 - link
The Ethereum ASIC that's been touted at this stage doesn't "bury" current GPU solutions at all. It's less efficient than a well-setup GTX 1060 rig.
mitchelpt - Wednesday, January 5, 2022 - link
Biostar's new iMiner A578X8D is specifically designed for cryptocurrency mining. It has an 8-core CPU that can handle up to 8 GPU cards. This performance is optimized for mining MTP, ETH, and ZEC at the same time. Most powerful Bitcoin miner https://aicrypto.tips/what-is-the-most-powerful-bi...
The Biostar A578X8D supports both AMD and NVIDIA graphics cards for multi-GPU mining rigs. The motherboard comes with 8 built-in PCIe x16 slots, which can be used by up to 8 GPU cards in a single system at the same time. Furthermore, it features 4 PCIe x1 slots to provide video outputs from the GPU cards for all your monitors or even TV screens.
Valantar - Wednesday, September 12, 2018 - link
Your first paragraph there doesn't add up. Aren't these custom coolers mostly just massive aluminium fin arrays with heat pipes? Sure, they have small-ish fans, but that's mainly due to the requirement of fitting within a 2 (or even 3) slot thickness, card and all. Also, how would a "full aluminium heatsink" (whatever that means) not "blow the heat everywhere"? Does it include ducting? Where is the fan mounted?
Also, the thermal conductivity of aluminium is garbage compared to heatpipes, so the bigger your full-metal heatsink is, the worse it'll perform compared to its size. Beyond moving heat a few cm in each direction from its source (like the standard cheapo GTX 1050 Ti-style coolers), you'd either need crazy thick aluminium (which goes against adding surface area for dissipation) or heatpipes for effective thermal transfer. The point being: beyond small sizes and low thermal loads, monolithic metal heatsinks are not good at all - there's a reason for the proliferation of "fins-threaded-on-heatpipes" designs in both the CPU and GPU space. They're far superior.
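For a rough sense of the scale involved, here is a back-of-envelope 1-D conduction estimate with assumed numbers (a 150 W load, a 10 cm lateral run, a 1 cm² solid cross-section, and an order-of-magnitude effective conductivity for a heat pipe) - illustrative only, not measurements of any cooler discussed here:

# Rough 1-D conduction estimate: temperature drop across a solid aluminium path
# vs. a heat pipe over the same distance. All inputs are illustrative assumptions.

K_ALUMINIUM = 205.0      # W/(m*K), typical for aluminium
K_HEATPIPE = 20000.0     # W/(m*K), assumed order-of-magnitude effective value for a heat pipe

heat_watts = 150.0       # GPU heat load, assumed
length_m = 0.10          # distance the heat must travel sideways, assumed
area_m2 = 1e-4           # 1 cm^2 of solid cross-section, assumed

def delta_t(k):
    """Temperature drop for 1-D conduction: dT = Q * L / (k * A)."""
    return heat_watts * length_m / (k * area_m2)

print(f"Solid aluminium: ~{delta_t(K_ALUMINIUM):.0f} K drop over {length_m * 100:.0f} cm")
print(f"Heat pipe:       ~{delta_t(K_HEATPIPE):.0f} K drop over the same distance")

With those assumed numbers, the solid-aluminium path would need hundreds of kelvin of temperature difference to move a GPU-class heat load through a slim cross-section, while the heat pipe does it in single digits - which is exactly why the fins-on-heatpipes designs mentioned above scale to big coolers and monolithic aluminium doesn't.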
Jaybus - Wednesday, September 12, 2018 - link
It is true that ASIC-resistant algorithms are not really feasible. However, currencies certainly can fork in fairly trivial ways that nevertheless render the current ASIC useless. An updated ASIC can readily be designed to match the new algorithm, but it is cheaper and easier to install new software for the GPUs than it is to buy a new ASIC. Some currencies make it known that they will fork anytime an ASIC is released/discovered. So GPU is dead for some, but not all, currencies.
kmi187 - Monday, September 17, 2018 - link
I like your idea, but I'd drop the heatsink too. Just sell bare cards with an industry-standard mounting system. Then let aftermarket brands that already make CPU coolers have some of the cake and design some kickass GPU coolers we can all pick from. What was lost in profit would easily be offset by being able to drop their cooling R&D budget.
Valantar - Tuesday, September 11, 2018 - link
... why?
JoeyJoJo123 - Tuesday, September 11, 2018 - link
To all those asking why: they already spent possibly over a year on R&D and manufacturing for this product. Sometime between the initial green light for making such a product (when GPU mining was at a high) and now, GPU mining took a steep turn down. But they're still out several months of work on designing a product that was probably almost finished when the downturn took place. Biostar's probably well aware. They, just like many GPU manufacturers, got burned the moment they tried to spin up production or new products to meet excess mining demand, only for it to falter and leave them with a large amount of mining products that miners don't want. (This is exactly why NVIDIA and AMD didn't want to increase GPU production during that year or so of increased GPU demand.)
It's unfortunate for Biostar, but this product would've gotten more buyers if it had launched a while back.
Valantar - Wednesday, September 12, 2018 - link
... so a classic case of the sunk cost fallacy making people make stupid decisions, then. These probably entered volume production a month or two back, at which time crypto had already plummeted. You're of course right that the R&D must have started long before that, and preparations for production would also have been in the pipeline for a while. But the idea that they "have to" make it just because they've already spent money on R&D and preparations is silly. They're not going to recoup any costs here. Best case scenario, unless crypto makes an unheard-of comeback, they break even on the production costs - but even that's unlikely. In all likelihood, they'd be better off scrapping the project wholesale (and thus eating the R&D cost) before starting production, instead of spending even more spinning up a production line and getting distribution in order for a product nobody wants.
DanNeely - Wednesday, September 12, 2018 - link
Depends how large a line they spun up. Sales are almost certain to be much lower than when they started the project, but the frothy history of crypto suggests it's only a matter of time until the next bubble starts. At that point, ramping volume on the existing model or a new v2 product will be faster than restarting from scratch after cancelling; and by putting something out in even low volumes they can gather real-world data on how well their cooling and other choices work, and be able to make a better v2 product when the demand for it exists.
verl - Wednesday, September 12, 2018 - link
If you could swap out the AMD GPUs and the PSU to fit NVIDIA GPUs, it might work for deep learning applications.
Gc - Tuesday, September 11, 2018 - link
So as it has eight x16 connectors, is high-bandwidth communication possible between the cards, so that suitably designed GPUs could fetch each other's memory? Might that make it attractive for some kinds of machine learning/training?
Almost certainly not. Mining barely requires any IO. Bigger boards only feed an x1 to each slot because that's plenty for the job. It's possible this one might have x2 or a mix of x2/x4 electrical slots, but I wouldn't be surprised if it's all x1 to save on cost.
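As a rough sanity check, here is a sketch comparing PCIe link bandwidth with typical mining bus traffic. All the workload numbers (job size, job rate, share rate) are assumptions for an Ethash-style miner, not measurements from this board; the point is that once the multi-gigabyte DAG sits in VRAM, the host mostly just ships job headers out and nonces back:

# Back-of-envelope: PCIe link bandwidth vs. steady-state mining bus traffic.
# Every workload number below is an assumption, not a measurement.

PCIE3_X1_GBS = 0.985     # ~0.985 GB/s per direction (8 GT/s, 128b/130b encoding)
PCIE3_X16_GBS = 15.75    # ~15.75 GB/s per direction

job_bytes = 256          # new work unit (header + target), assumed
jobs_per_second = 10     # generous job-refresh rate, assumed
share_bytes = 32         # a returned nonce/share, assumed
shares_per_second = 1    # assumed share rate

traffic_bytes = job_bytes * jobs_per_second + share_bytes * shares_per_second
traffic_gbs = traffic_bytes / 1e9

print(f"Estimated steady-state bus traffic: {traffic_bytes} B/s")
print(f"Fraction of a PCIe 3.0 x1 link:  {traffic_gbs / PCIE3_X1_GBS:.1e}")
print(f"Fraction of a PCIe 3.0 x16 link: {traffic_gbs / PCIE3_X16_GBS:.1e}")

Even with generous assumptions the traffic is many orders of magnitude below what an x1 link provides, which is also why peer-to-peer GPU memory access for training workloads (the question above) would need far wider electrical links than mining boards bother to wire up.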
boozed - Tuesday, September 11, 2018 - link
Hahaha
Hxx - Tuesday, September 11, 2018 - link
Just in time for the new iPhone unveiling. Two questions though: 1) does it have Face ID, and 2) can I charge it wirelessly?
RU482 - Tuesday, September 11, 2018 - link
Gives you a very interesting insight into how quickly Biostar can bring a product to market.
edzieba - Wednesday, September 12, 2018 - link
The various cryptocurrencies' prices peaked and then crashed around the end of last year. GPU mining was viable before then, remains viable now, and will remain viable until more efficient hardware (e.g. ASICs, or future more efficient GPUs) is available.
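What "viable" means here boils down to daily revenue versus daily power cost; a minimal sketch with made-up placeholder numbers (none of them from the article or the comment above) looks something like this:

# Toy GPU-mining profitability check: daily revenue vs. daily electricity cost.
# Every number below is a placeholder assumption, not real market data.

hashrate_mhs = 30.0             # card hashrate in MH/s, assumed
revenue_per_mhs_day = 0.05      # USD earned per MH/s per day, assumed
power_watts = 150.0             # card power draw, assumed
electricity_usd_per_kwh = 0.10  # electricity price, assumed

revenue_per_day = hashrate_mhs * revenue_per_mhs_day
power_cost_per_day = power_watts / 1000.0 * 24.0 * electricity_usd_per_kwh
profit_per_day = revenue_per_day - power_cost_per_day

print(f"Revenue/day: ${revenue_per_day:.2f}, power/day: ${power_cost_per_day:.2f}, "
      f"profit/day: ${profit_per_day:.2f}")
# Mining stays "viable" while profit_per_day > 0; more efficient hardware (ASICs,
# newer GPUs) raises network difficulty and pushes revenue per MH/s down until
# less efficient cards fall below break-even.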
jardows2 - Wednesday, September 12, 2018 - link
I wonder if this could be useful for purposes other than mining, say a render farm? Or anything else that can leverage multiple GPUs for calculations?
BigDragon - Wednesday, September 12, 2018 - link
Just look at that notch! So bold! So revolutionary!
Unfortunately, the mining market seems to go through booms and busts. I think we're in a bust phase right now. At some point things will swing back around to boom. Biostar may look late and foolish at the moment, but things could turn back around at any time.
DanNeely - Wednesday, September 12, 2018 - link
You're trolling the wrong article...
cuvtixo - Wednesday, September 19, 2018 - link
You've got it right, unlike the previous posters. One even says "Best case scenario unless crypto makes an unheard-of comeback..." Like the rise from $1 to $1000, the drop to $400, and the rise back to $8000? Crypto has done nothing but make "unheard of" comebacks. It's quite reasonable to think it will rise again 800% or more, especially if Bitcoin continues to dominate. When the first drop came, I held my breath for another crypto to replace it, which would have been logical. But markets are never logical. In fact, I'd go as far as to say they only look late and foolish to short-termers who never understood crypto from the start, just as their cooling solution only looks foolish to those who don't really understand the real-world physics of cooling.
I guess this is mainly an elaborate troll. My biggest gripe is that Newegg is selling these but does not accept crypto as payment. Sellers should put their "money where their mouths are," so to speak. I'd be tempted to invest in one, only because crypto is what has enabled me to even think of doing so in the first place.
nunya112 - Thursday, September 13, 2018 - link
AMD and Intel etc. need to step up and start building ASICs for mining. Make dedicated SHA-256 chips and whatever else. It's a new avenue for these companies to use excess fab space that wouldn't otherwise be used; I know some fabs at Intel are still at 22 and 40nm, TSMC would be the same, and I know GloFo would be like this too. Why aren't these guys doing this and generating BILLIONS in revenue for their companies? Heck, I might even start a mining chip company. I'll be fabless just like NV and AMD and I will thrive!!!
Yakku - Monday, September 17, 2018 - link
Noooo! Another week without having my life raytraced.....
Yakku - Monday, September 17, 2018 - link
Aha, wrong article.