39 Comments
dstarr3 - Wednesday, April 23, 2014 - link
Second attempt at a Steambox, by the looks of it. If it's priced reasonably and doesn't sound like a hair dryer, it may not be a bad deal.
hughlle - Wednesday, April 23, 2014 - link
I have to agree. It could have a Titan in there and cost £600, but I wouldn't buy it as long as it looks like that. Maybe this is going to be marketed at teenagers, because I can't see many adults liking the idea of a bright green box next to their TV; it would just look like poop.
dstarr3 - Wednesday, April 23, 2014 - link
Indeed. If I got saddled with one of these, I'd have to paint it black so it's at least somewhat subtle.
nathanddrews - Wednesday, April 23, 2014 - link
Your first mistake is having it visible at all. ;-)
hughlle - Wednesday, April 23, 2014 - link
Personally I like to see my stuff; that's why I don't have a case for my phone or tablets. The looks of a product are hugely important to me, not because of other people seeing them, but because I just like my possessions to be aesthetically pleasing to me. Not to mention I guess I am lazy: I hate having to open cabinets or doors in order to turn on AV equipment or a computer. It's such a trivial thing that it can't be laziness, just one of those things that irks me. I'm like that in most of my life. I like to have everything in view so it is instantly accessible; I hate having to open drawers or boxes for something I need.
EnzoFX - Wednesday, April 23, 2014 - link
Why does the PC itself need to be instantly accessible? Peripherals would be wireless, no? Aesthetics make sense for something you're physically interacting with constantly. With something that just serves a purpose and is almost entirely controlled through a remote or wireless peripheral, things look much sleeker when they're out of sight.
hughlle - Thursday, April 24, 2014 - link
As I say, it's just the way I do things. Be it the power button or the DVD drive, the former pressed once a day, the latter used maybe once every six months, it just irks me when they are behind a door. It is absurd, I know that, but it's just how I am. But as I say, I also like to see the things I buy. I'm just quirky :) I know I'm not alone, though most others want other people to see the things they buy.
Gigaplex - Wednesday, April 23, 2014 - link
My HTPC goes to sleep, and is "turned on" (i.e. woken) via USB wake on my wireless keyboard dongle. You don't need to see the chassis in most circumstances.
damianrobertjones - Thursday, April 24, 2014 - link
Why limit such a nice machine by making it a Steambox? At this time Windows offers FAR MORE. Baffles me... Or is the word Steambox cool these days?
dstarr3 - Thursday, April 24, 2014 - link
Windows has been a platform for decades, whereas SteamOS is in beta, so of course Windows offers more. But SteamOS has a lot of potential and I'd like to see what they can do with it. Besides, they utilize the same hardware, so the choice is yours. I'm personally interested in giving SteamOS an honest shot at being something.
Flunk - Wednesday, April 23, 2014 - link
This is a neat idea, but I would have to say "warm to the touch" is a best case for how this will perform thermally. I'd be impressed if it doesn't sound like a jet plane and burn down into the core of the earth.

Gigabyte's got some really neat stuff going on in the Brix line. I'm not sure they're necessary for any specific application and I'm certainly not replacing my current desktop with one, but they are really neat.
LordOfTheBoired - Wednesday, April 23, 2014 - link
"sound like a jet plane and burn down into the core of the earth"

THAT experience would make it worth buying. How many people can honestly say their computer excavated a volcano in their living room when they loaded it up?
DryAir - Thursday, April 24, 2014 - link
Well, the cooling solution has clearly changed in this version. The left vent and the heat spreader you can see through it are much bigger than on the original Brix Pro. Plus, even though the heat output is increased, the total die area is also a lot bigger (CPU+GPU), so it will be easier to keep temps down.
Anonymous Blowhard - Wednesday, April 23, 2014 - link
Given the thermal constraints this has to operate under, this is begging for a Maxwell chip inside (GTX850M/GTX860M?).
fokka - Wednesday, April 23, 2014 - link
My thoughts exactly. I can't imagine they can dissipate 200 W+ in a case as small as this. With Maxwell we would be in the 100 W ballpark, which is much more manageable.
schizoide - Wednesday, April 23, 2014 - link
Was just coming here to say exactly that. It's clearly a mobile 760, and that's not good enough. A desktop 750 Ti would barely be sufficient for me.

Definitely getting there, though.
coder543 - Wednesday, April 23, 2014 - link
This appears to be targeting casual-to-mid gamers. If a 750 Ti is barely sufficient for you, you're probably not the target market for this. For me, something like the Adreno 330 packed inside a Snapdragon 801 would be more than enough, in terms of sheer processing power, to do everything I want my computer to do 9 days out of 10. It would be able to handle programming, web surfing, and the occasional casual game.
EnzoFX - Wednesday, April 23, 2014 - link
Same. Why can't they just make it a little wider? Nothing wrong with it being 10x more powerful and 2x the size.
Ortanon - Wednesday, April 23, 2014 - link
I'm pretty sure each person who reads this article is going to think that, lol.
yannigr - Wednesday, April 23, 2014 - link
Why not a 750 Ti?
Guspaz - Wednesday, April 23, 2014 - link
The Brix machines are neat, but they're generally overpriced for what they are. And this thing in particular is, well, really stupidly designed. They've taken a 170W desktop GPU (yes, it's a desktop 760) and paired it with a 47W dual-core desktop CPU, despite quad-core desktop i7 chips being available at 35W, let alone using a mobile CPU...

That's a max TDP of over 200W... any guess how loud those fans will be?
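That worst-case figure is simply the two quoted TDPs added together; a quick sketch, using the wattages claimed in this thread rather than confirmed specs for this machine:

```python
# Worst-case combined thermal budget, using the figures claimed in this
# thread (desktop GTX 760 + Core i5-4200H). Neither number is a
# confirmed spec for this BRIX; both are assumptions from the comments.
gpu_tdp_w = 170  # desktop GTX 760 board power, per the comment above
cpu_tdp_w = 47   # Core i5-4200H TDP

total_w = gpu_tdp_w + cpu_tdp_w
print(f"worst-case combined TDP: {total_w} W")  # 217 W
```

In practice both chips rarely hit full TDP simultaneously, but a cooler for a chassis this small still has to be sized for something near that sum.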
Casper42 - Wednesday, April 23, 2014 - link
The dual-core part has a higher clock speed per core though, which is generally better for games.

Not to mention a cost savings of $100 to $150, depending on which quad core you meant.
schizoide - Wednesday, April 23, 2014 - link
Is it really a desktop GPU? You have proof of that? A desktop 760 is fast enough for a real Steambox.

Noise is likely a problem, but if it's only loud while playing games I'm not sure that I care. Guess it depends how loud we're talking about.
Guspaz - Thursday, April 24, 2014 - link
I'm going on the assumption that the article linked has it as a desktop part; in the comments someone confirmed this by talking to NVIDIA reps (at PAX, I believe) who repeatedly insisted it was a desktop part, and lots of other articles are also reporting that it's a desktop part (with many expressing confusion about why somebody would put such a high-power part in an NUC).

It's entirely possible that they (the journalists) are wrong... A desktop GPU in an NUC-style machine makes zero sense to me, but at the same time I've seen their Brix product lineup first-hand at PAX myself, and it's kind of nuts. It's like they're throwing everything at the wall and seeing what sticks; one model, the GB-BXPi3-4010, even has a bloody projector built into it (with the whole thing running off an external battery). They have so many different models with different hardware that I wouldn't put it past them to try cramming something insane like a mid-power desktop GPU in there (although perhaps with a restricted TDP).
Gigaplex - Wednesday, April 23, 2014 - link
It's a mobile CPU, not a desktop one.
Death666Angel - Wednesday, April 23, 2014 - link
"sporting a 47W Core i5-4200H mobile CPU (dual core, hyperthreading, 2.8 GHz / 3.4 GHz turbo)"

Um, what? Is that thing super cheap? I can get an i5-4670T with a 45W TDP, 4 real cores, and 2.3 to 3.3 GHz clock speeds for 180€. That should eat the 4200H alive. Can someone explain that? Because I don't see it. Unless it is just so they can use the same underlying platform.
ptmmac - Wednesday, April 23, 2014 - link
This is one of those designs that is going to market too soon. Broadwell plus Maxwell on 20 nm will produce a much better design for this machine, so it will be a stopgap until the next generation is available. I wonder when everyone is going to start admitting that Moore's Law is dead. We haven't seen an increase in processor frequency since 2004; if we had actually been seeing those increases, we would have 128 GHz processors by now. Silicon can't go above 4 GHz without melting down. Given all the other improvements in computer technology, we would otherwise be seeing more progress in many areas of tech that are calculation dependent. Now we are going to see new nodes spreading apart for longer than 2 years each. During the 90's we saw node changes every 18 months and matching frequency improvements. At that pace we would have been closing in on terahertz processors in the next year.

We need to move on to a new semiconductor and fiber-optic data connections in the next 5 to 10 years.
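The 128 GHz figure above is just compound doubling; a sketch of the extrapolation, assuming a round 4 GHz peak in 2004 and one doubling every two years (both deliberate simplifications, not real trend data):

```python
# Hypothetical extrapolation behind the "128 GHz by now" remark:
# roughly 4 GHz peak desktop clocks in 2004, doubling every 2 years
# through 2014. Purely illustrative; real clocks plateaued instead.
start_ghz = 4.0
years = 2014 - 2004
doublings = years / 2                  # one doubling per two years
projected_ghz = start_ghz * 2 ** doublings
print(f"projected clock: {projected_ghz:.0f} GHz")  # 128 GHz
```

At the 1990s cadence of a doubling every 18 months, the same arithmetic gives roughly 4 × 2^(10/1.5) ≈ 400 GHz, which is where the "closing in on terahertz" remark comes from.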
MrSpadge - Wednesday, April 23, 2014 - link
Moore's Law talks about transistor density, not frequency. What you do with those transistors is totally up to you, but normally they're spent in a wise way to improve performance. Otherwise the company making those chips won't last long.

What we really need to do is make even better use of those transistors, as in hardware/software co-design. And a slower pace in the hardware world gives the software guys some time to actually optimize for the current technology, rather than being 1 or 2 generations behind.
tviceman - Wednesday, April 23, 2014 - link
Not a GTX 750 Ti? Gigabyte, you are retarded.
jb14 - Wednesday, April 23, 2014 - link
Agreed, this thing is going to melt a hole through the earth, screaming like a banshee while it does so. Much cleverer cooling engineering required, IMO.
wintermute000 - Wednesday, April 23, 2014 - link
I don't get it. If they can get laptop-class components to run in a laptop chassis, why can't they get them to run in a BRIX that's physically bigger and has more cooling?
irusun - Wednesday, April 23, 2014 - link
This is getting closer, but still waiting for a 2013 Mac Pro-like form factor (size-wise and quiet-wise) with an i7-4770, GTX850M (and better), upgradeable to 32GB RAM, SSD... and that's it. That may not be a "huge" market, but I'm positive there's enough of a market for some smart OEM to make money on it.
Hrel - Wednesday, April 23, 2014 - link
"Given how loud the BRIX Pro seemed to be, one would assume that GIGABYTE has aimed towards a mobile GPU." - Do you mean that since BRIX systems are usually extremely quiet, it must be the mobile GPU? If so, using the word "loud" to express that is very strange.

However, if your implication is that the system is very loud, then why would it be the mobile GPU? Wouldn't that be indicative of the full desktop GPU?
Gigaplex - Wednesday, April 23, 2014 - link
Given how loud the older system (BRIX Pro) with integrated Intel graphics was, it's unlikely they'd use an even more power-hungry desktop GPU without blowing the thermal budget in this new BRIX.
NZLion - Wednesday, April 23, 2014 - link
As someone who just bought a Brix and is using it in part as a Steam machine... god DAMMIT. I wish I'd known this was right on the horizon.
Laststop311 - Thursday, April 24, 2014 - link
This version of the Brix is almost like a public beta phase of the design. When we finally get to second-generation 14 nm Skylake, with its more efficient lower-voltage DDR4 and the many other power-efficiency features planned, and to 20 nm GPUs with the GM104 chip (Maxwell's GTX 680/770 replacement) or possibly even GM110 (Maxwell's GTX 780/780 Ti/Titan replacement, if Maxwell's 20 nm power efficiency gets a large enough boost), these tiny little boxes will truly be able to replace all 1920x1080 gaming PCs, almost all 2560x1440 gaming PCs, and possibly even some 4K gaming PCs if GM110's power usage is cut enough by the 20 nm Maxwell architecture. And do it without sounding like a raging vacuum cleaner and without much heat buildup.

All we need is another 2 years, and a 14 nm Skylake + 20 nm Maxwell BRIX mini PC could be the new standard for 1080p gamers. Everyone will just want these little tiny boxes to play games at 60 fps and 1080p with ultra details instead of a big massive tower, since 60 fps at ultra 1080p is basically maxed-out performance already. The only gamers that will want a big tower will be 120 fps ultra 1080p gamers, 60-120 fps 2560x1440/1600 gamers, and 60 fps 4K UHD gamers.
Qwertilot - Thursday, April 24, 2014 - link
Suspect not quite. The previous version of this was struggling to cool a 65 W processor properly, with no dGPU. If you go just a little bit bigger, you can get cases which will cool that sort of processor purely passively, or the far-from-huge Mac Pro mentioned earlier, which is very happily cooling about 5 times that amount.

Some of that is better cooling design of course, but a lot of it is simply the amount of space there (isn't) for cooling designs in machines quite this small.
jpeich - Monday, April 28, 2014 - link
Does someone know which interface communicates with the GTX card? Is it mini-PCIe? Is it PCIe? Which speed? The NUC standard doesn't include a PCIe interface as far as I know... I am terribly curious about the guts of this little machine.
fteoath64 - Monday, April 28, 2014 - link
I wish they would make the discrete-graphics version twice as long so it becomes a shoebox form factor; then the thermal and sound issues could really be solved, plus there would be more space for a couple of two-terabyte drives alongside the mSATA SSD. Not that users would mount these on VESA behind the monitor; they are too bulky for such mounting as they are anyway.
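On the interface question a few comments up: the practical difference between the candidate links comes down to generation and lane count. A rough sketch using standard per-lane PCIe figures (generic numbers; which link, if any, this BRIX actually uses is not known here):

```python
# Approximate one-direction PCIe bandwidth per lane, by generation.
# These are standard PCIe figures, not confirmed details of this BRIX.
MB_PER_LANE = {"1.x": 250, "2.0": 500, "3.0": 985}  # MB/s per lane

def link_bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Approximate one-direction bandwidth of a PCIe link in MB/s."""
    return MB_PER_LANE[gen] * lanes

# A mini-PCIe slot carries a single lane; a desktop GPU slot is x16.
print(link_bandwidth_mb_s("2.0", 1))   # mini-PCIe-class link: 500 MB/s
print(link_bandwidth_mb_s("3.0", 16))  # full x16 Gen3 slot: 15760 MB/s
```

The gap between a single-lane mini-PCIe link and a full x16 slot is why the question matters: a GTX-class card on a x1 link would be noticeably starved in some workloads.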