duploxxx - Friday, August 8, 2014 - link
Great marketing; what doesn't fit in the name: Gaming G1 WiFi... high-end gaming, an expensive CPU and GPU, and then a WiFi network connection :) No thanks, I'll continue with 1Gb wired for now...
Fallen Kell - Friday, August 8, 2014 - link
Yeah, I agree with you on wired vs. wireless, especially in a product like this. I would much rather see two good 1 Gb/s network connections (i.e. good Intel NICs with teaming) than one wired plus wireless.
danjw - Friday, August 8, 2014 - link
Why not? Most modern WiFi connections can beat the speed of most people's internet connection.
jordanclock - Friday, August 8, 2014 - link
Sure, but intranet speeds are very important. If I'm pushing high-bitrate 1080p videos, or even 4K eventually, I will want my home network running gigabit to avoid saturation.
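For a rough sense of where saturation actually bites, here is a back-of-envelope sketch; the stream bitrates and "real-world" link throughputs are assumptions for illustration, not measurements.

```python
# Back-of-envelope only: the stream bitrates and real-world link
# throughputs below are rough assumptions, not measurements.
streams_mbps = {"Blu-ray-class 1080p": 40, "high-bitrate 4K": 100}
links_mbps = {
    "Gigabit Ethernet": 940,
    "802.11n 2.4 GHz": 80,
    "802.11ac 80 MHz": 400,
}

for stream, s in streams_mbps.items():
    for link, l in links_mbps.items():
        print(f"{stream:>20} over {link:<18}: {100 * s / l:5.1f}% of the link")
```

A single high-bitrate stream barely dents gigabit Ethernet but can eat a large slice of what 2.4 GHz Wi-Fi really delivers, which is the saturation concern.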
frag85 - Wednesday, August 13, 2014 - link
It also comes down to connection quality (and reliability). WiFi is pretty shoddy for certain types of gaming.
dgingeri - Friday, August 8, 2014 - link
Given my past experience with Gigabyte boards, a P43, an AMD 970, and an X79 board, from the quality of their BIOS and driver support to the quality of their boards in general, this just doesn't interest me. I just want to warn people away from these guys. They are the Chrysler of the motherboard industry.
RussianSensation - Friday, August 8, 2014 - link
My experience is the complete opposite of yours. Gigabyte P965, P35, P45, X48, P55, P67, Z68, Z77, and Z87 have all been rock solid. Their boards have been so good that I overclocked an E6400 to 3.4GHz on a $90 board and a Q6600 G0 to 3.4GHz on a $120 board, and kept both systems running 24/7 doing distributed computing at 99% CPU load for three years each.
Each generation, the difference between Gigabyte, MSI, Asus, and ASRock keeps shrinking. For overclocking, the Gigabyte SOC Force has been near the top:
http://www.techpowerup.com/202047/gigabyte-z97x-so...
Meanwhile, the Gigabyte UD3H and UD5H series are rock-solid boards, and they now even have the Black Series with 168-hour server testing. Really, there are no Chrysler makers among the top four; we are talking Mercedes, BMW, Audi, and Lexus. They are all very close now.
dgingeri - Friday, August 8, 2014 - link
Yeah, well, that hasn't been my experience over the years. They beta tested and then released a BIOS on my X79 board that kept my screen saver from kicking in, kept the processor cores from going idle or downclocking, kept the monitor powered on, and turned off the PCIe x1 slots so my sound card wouldn't work, then set up the update program for that BIOS so it wouldn't allow a downgrade to restore the old version. It took me two weeks to get a response and a program that would downgrade the BIOS. (I shared it out on my Dropbox account so everyone can access it now. BTW, if anyone reading this has an X79-UP4 board and updated to the F4, F5, or F7 BIOS, I have a program that can revert it to the more functional F3. It won't work with IB-E, but it will actually work.)
Senor.Jalapeno - Wednesday, August 13, 2014 - link
Have you been secretly using my computers? With the exception of P67, I've used every single motherboard you've listed, and my experience is the same as yours apart from the overclocking. I've never overclocked, but I do push my machines to the limit, and Gigabyte motherboards have never been anything but reliable.
frag85 - Wednesday, August 13, 2014 - link
I'm finding Gigabyte boards aren't that bad, and the one time I used their service department for a warranty claim it was smooth, but they seem to burn out quickly. I get two, maybe three years out of them (several boards in a row now). On the other hand, I've had Asus boards last at least twice as long and never quit, even when being pushed hard.
casteve - Friday, August 8, 2014 - link
Isn't X99 Gaming an oxymoron? Why in the world would you spend money on a Haswell-E based gaming system when most games peak with an i5?
Assimilator87 - Friday, August 8, 2014 - link
The main reason would be to run four video cards with minimal bottlenecking from the PCIe lanes. In conjunction with 20nm GPUs, I believe that would be sufficient to game on a triple UHD display configuration.
dgingeri - Friday, August 8, 2014 - link
Most games, for now. You might not realize it, but the newest consoles both run 8-core AMD processors. In order to get the most out of those consoles, software developers are learning, right now, how to program their games for 8 threads. It won't be much longer before that becomes a mainstay within gaming, and our games will all begin using more than the 4 threads we are used to right now.
It's funny. Games always move forward. They begin to use what people have. It's the "build it and they will come" strategy. You honestly don't think all our future games will be limited to 4-core processors, do you?
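To make the "more than 4 threads" idea concrete, here is a toy sketch of fanning per-frame work out to a pool of workers. The task function, counts, and worker limit are made up for illustration; real engines do this with native job systems rather than Python.

```python
# Toy illustration of splitting per-frame work across available cores.
# simulate_task is a stand-in for AI, physics, audio mixing, etc.
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_task(task_id):
    return sum(i * i for i in range(50_000)) + task_id

def run_frame(pool, tasks_per_frame=16):
    # Each "frame" fans its tasks out to however many workers exist.
    return list(pool.map(simulate_task, range(tasks_per_frame)))

if __name__ == "__main__":
    workers = min(8, os.cpu_count() or 1)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for frame in range(3):
            results = run_frame(pool)
            print(f"frame {frame}: {len(results)} tasks done on {workers} workers")
```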
StevoLincolnite - Sunday, August 10, 2014 - link
That would be true if...
1) Consoles actually used all 8 cores for gaming, which they don't; they only use 6.
2) Jaguar's single-threaded performance is dramatically slower than a modern Core i3/i5/i7 core; 6 Jaguar cores are probably equivalent to a Core i3 in total compute performance (see the rough arithmetic after this list).
3) The PC is shifting toward more efficiency with console-like low-level optimizations at the API level; things like Mantle, NVAPI, and DirectX 12, as well as some improvements in OpenGL, all push toward that goal.
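On point 2, a crude cores-times-clock-times-IPC estimate shows why six Jaguar cores land in roughly Core i3 territory; the clock speeds and the IPC ratio below are assumptions for illustration, not benchmark data.

```python
# Crude throughput estimate: cores x clock (GHz) x assumed relative IPC.
# The ~0.5 IPC ratio and the clocks are assumptions for illustration only.
def relative_throughput(cores, ghz, ipc):
    return cores * ghz * ipc

jaguar_six = relative_throughput(cores=6, ghz=1.6, ipc=0.5)  # console game budget
core_i3 = relative_throughput(cores=2, ghz=3.5, ipc=1.0)     # Hyper-Threading ignored

print(f"6 Jaguar cores: {jaguar_six:.1f} (arbitrary units)")
print(f"Core i3 (2C):   {core_i3:.1f} (arbitrary units)")
```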
Conversely, you just need to look at the Pentium Anniversary Edition to see that "more cores" aren't everything: purely through overclocking, it's able to play in the same ballpark as the Core i5 even in heavily threaded games, despite the i5 having twice the processing cores, more cache, more bandwidth, etc.
And lastly, over the decades the CPU has slowly been doing less and less when it comes to gaming. Before hardware T&L (Transform and Lighting), for instance, the CPU used to handle all those calculations; then it was shifted to the T&L unit, and now it's done in the GPU's shader hardware. Things like anti-aliasing, mostly framebuffer effects like morphological AA, are sometimes done on a console's CPU rather than the GPU; on the PC, the GPU handles that task.
Those who have a quad-core Core i5, Sandy Bridge or newer, are pretty much set for years to come, provided they don't mind overclocking. Heck, people still hang onto the Core 2 Quad, albeit overclocked to 3.6GHz+, and it still handles every game fine; that's a processor that's 6-7 years old.
Will we need 8 cores for gaming in the immediate future? Unlikely. The consoles are *still* holding back the PC with their anemic low-end and mid-range hardware.
dgingeri - Monday, August 11, 2014 - link
1. Where have you heard this? This is not what I've heard.
2. Doesn't matter; they're still learning to program to accommodate more cores and more threading.
3. So you say. It depends greatly on what people do with their machines. I'm pushing more and more for voice recognition/command and more overall capability. I run a VM on my system nearly 100% of the time, specifically for web surfing (Windows 8.1 as the main OS with Ubuntu as the VM OS). The 32GB of memory on my system is constantly at 50%+ usage, partly due to the VM, but also due to voice recognition, system monitoring, game clients, and other things. There will always be people pushing their systems to do more, and that "console-like low-level optimization" attitude is not exactly going to be the only path for personal computing.
I never said that "more cores" are everything right now. I'm saying that game programmers are learning to use more cores, mainly due to the new consoles. You can't deny that. Because of that, games will be able to use the extra cores in the higher end processors, so they aren't useless.
Graphics are the bottleneck for now, but that won't always be the case. A Core i5 4670K may be the most powerful processor needed for maximum performance in many games today, but that certainly won't be the case in the near future. The next couple of generations of graphics cards will probably need more powerful CPUs to drive them. Nvidia has already geared its drivers to take advantage of more cores.
On top of all that, there are some games that do use more cores right now. World of Warcraft and Star Trek Online both run a bit better on a Core i7 3930K than on a 4790K. I ought to know; I just switched back over this past weekend. My previous X79 board was going out, and I didn't want to wait several months for Haswell-E, so I bought an Asus Z97-WS with a Core i7 4790K. What a disaster that was. It turned out that Devil's Canyon wasn't all it was cracked up to be, and it suffered from tons of thermal throttling. Under load in Star Trek Online, it would thermally throttle down as low as 3.7GHz, and that was at just 50% usage. WoW performance was even worse. With the same Corsair H100i cooler, the 4790K would exceed 90C consistently at stock clocks, while the 3930K stays under 40C overclocked to 4.2GHz. The whole platform turned out to be a total waste of money. I finally got a new Asus X79 Sabertooth to replace my old Gigabyte X79-UP4 and switched everything back over. The performance difference was startling; I hadn't even realized how much performance I gave up switching to that Haswell POJ. I also saw much more stable frame rates. I didn't see the stutter with the 3930K that I saw with the 4790K, where the processor would thermally throttle downward, lag a few frame renders, and then catch up. The 3930K handled it with ease.
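For anyone who wants to confirm throttling like this rather than infer it from frame stutter, a minimal logging sketch along these lines works; it assumes Linux with the psutil package installed, and the "coretemp" sensor name is platform-dependent.

```python
# Minimal sketch: log CPU frequency and the hottest core temperature once a
# second so sustained drops below the rated clock show up as throttling.
# Assumes Linux and the psutil package; the "coretemp" sensor name varies.
import time
import psutil

def log_throttle(samples=30, interval=1.0):
    rated_mhz = psutil.cpu_freq().max  # advertised maximum frequency
    for _ in range(samples):
        freq = psutil.cpu_freq().current
        temps = psutil.sensors_temperatures().get("coretemp", [])
        hottest = max((t.current for t in temps), default=float("nan"))
        flag = "  <-- throttling?" if freq < 0.95 * rated_mhz else ""
        print(f"{freq:7.0f} MHz  {hottest:5.1f} C{flag}")
        time.sleep(interval)

if __name__ == "__main__":
    log_throttle()
```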
On top of that, there is also the matter of the massively more useful I/O configuration of the socket 2011 platform. The socket 1150 platform is stuck with only 16 lanes of PCIe, whereas socket 2011 has so many more lanes to use. I got my Z97-WS because it uses a PLX bridge chip to provide four PCIe x8 slots for my dual GTX 680s and my RAID controller; it was the only way a Z97 platform could handle such things. It might have worked if the CPU hadn't been so bad at handling heat. The slot layout of the X79 Sabertooth isn't to my liking, but I can still handle all the cards I currently have, without the PLX bridge.
Overall, I don't subscribe to your opinions, but I can prove that this upcoming Haswell-E platform is anything but useless. Plenty of people, including me, have a use for it. Unfortunately, I fell victim to opinions like yours and bought into the socket 1150 craze, and now I'm going to have to wait with my 3930K for another year before I can get the money together to replace it. I'll just make my 4790K into a nice router and server in the meantime.
mapesdhs - Tuesday, August 12, 2014 - link
Intel could have avoided a lot of this madness by just giving midrange chips more lanes, say 24 or 32, while bumping the IB-E refresh up to 80 or so. People seem to have forgotten that P55 had the same 16-lane limit, and at the time that was regarded as _low-end_, while 4-core X58 was 'mainstream' (the venerable i7 920) and 6-core X58 was high-end, the latter two giving a lot more lanes. P55 worked very well though because of its lower latency, and some boards like the ASUS P7P55 WS used switches to offer x16/x16 or x8/x8/x8/x8 (I have two of them), but it's just so weird now that the small number of lanes with Z97/Haswell is regarded as mainstream and normal; how did that happen??
I agree with you dgingeri, people have bought into this limited-functionality craze way too much. Btw, I'm intrigued you didn't consider the P9X79 WS or E-WS, as those have excellent multi-GPU features and the same oc'ing potential as the R4E, etc. Mine has four GTX 580 3GB cards, 64GB @ 2133, 3930K @ 4.7, etc.
Will you switch to X99?
Ian.
Durolith - Thursday, August 28, 2014 - link
Q9550 @ 4GHz since 2009. The 8-core is really tempting me, since my CPU bottlenecks a single HD 7950 and I run them in CrossFire... RSI is gonna need some juice.
mapesdhs - Tuesday, August 12, 2014 - link
4K gaming, and more PCIe lanes to cope with multiple GPUs. Using a mainstream chipset for such a task, with only 16 lanes from the chip, is crazy IMO; it severely limits the options for getting the most from multiple GPUs at that kind of resolution. Some board makers get round this by using PLX chips to offer x16/x16 or x8/x8/x8/x8 on mainstream boards (e.g. as ASRock did with its Z68 Extreme7), but that merely proves the point that mainstream chips are starved for PCIe lanes, especially with makers now using some of those lanes for newer storage tech, which involves all sorts of messing about (use X, or Y, but not both, etc.).
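A rough per-card bandwidth budget illustrates the lane starvation; the ~985 MB/s-per-lane figure is the usual PCIe 3.0 approximation, and the slot widths assume the common x16/x8/x4 allocations.

```python
# Approximate per-GPU PCIe 3.0 bandwidth when CPU lanes are split across
# multiple cards; boards allocate slots in x16/x8/x4 widths.
PCIE3_GBPS_PER_LANE = 0.985  # ~985 MB/s usable per lane after 128b/130b encoding

def slot_width(total_lanes, gpus):
    raw = total_lanes // gpus
    for width in (16, 8, 4, 1):
        if raw >= width:
            return width
    return 0

platforms = {"Mainstream CPU (16 lanes)": 16, "Haswell-E 5930K (40 lanes)": 40}
for name, lanes in platforms.items():
    for gpus in (2, 4):
        width = slot_width(lanes, gpus)
        print(f"{name}: {gpus} GPUs -> x{width} each "
              f"(~{width * PCIE3_GBPS_PER_LANE:.1f} GB/s per card)")
```

Splitting 16 CPU lanes four ways leaves each card with roughly a quarter of the bandwidth a single x16 card gets, which is exactly what the PLX workaround is papering over.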
I want to build a 4K gaming system soon, probably with two Maxwell cards. It'll be X99, likely with the 5930K. The only thing that could scupper my plans is if NVIDIA is dumb enough to release the 800 series cards with less than 4GB of RAM; ideally they'd have 6GB. They've been so scared of harming Quadro sales recently, forcing users to go with Titan if they want 6GB+ (wrongly so IMO, as many pro users will not buy Titan anyway, for various reasons), or of course they buy AMD instead.
We shall see...
Meanwhile, for solo professionals, X99 is ideal for supporting multiple GPUs with a good CPU/RAM balance at the same time, plus hefty storage I/O. There's a fair bit of overlap between these two potential markets.
Ian.
TEAMSWITCHER - Friday, August 8, 2014 - link
When is "Launch Day?" Does anyone know? I'm not interested in this particular board, but waiting for the X99 chip set and three new CPU's from Intel...has been excruciating. DDR4 support is the only thing that we haven't seen before and it's largely derivative. More SATA III ports and native USB 3.0 have been in the "non-enthusiast" chip sets for one and two years respectively. Everything else is pretty much the same...These X99 motherboards don't look or function much different than the X79 motherboards they will replace. I just can't help to think that this should all have happened last year... What's the big f'ing deal Intel?dgingeri - Friday, August 8, 2014 - link
"Launch day" is likely some time in September, according to rumor.I couldn't wait for the X99 because my Gigabyte motherboard was going out, so I decided to "upgrade" to a Z97 Haswell system. What a disastrous mistake that was. Now my money is all tied up in a Z97 motherboard, 4790k process, and memory, and yet the whole thing doesn't even overclock to match my 3930k. I overclock it, but the damn chip thermally throttles back down to base clock rate when it get busy even though I have a H100i cooler. The only advantage is that I have more 6Gb SATA ports. The mainstream Haswell architecture sucks, mostly due to the chip packaging, and that's where Haswell-E does so much better. It probably won't be thermally throttling back down to base clocks like the mainstream does.
I went with X79 back in the day because I had two video cards and a RAID controller, so I needed more PCIe lanes than mainstream Sandy Bridge allowed. I am at the same point now, just with different video cards and a different RAID controller. (I had dual GTX 470s and a 3ware 9650 three years ago; now I have dual GTX 680s and a Dell H710. Believe me, the RAID controller makes a HUGE difference in storage performance. I hate waiting for things to load. Well worth the money.) The X99 definitely has a place with gamers.
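For anyone curious how much a controller swap helps, a quick sequential-read check is easy to run; the file path below is a placeholder, OS caching will flatter repeat runs, and a proper benchmark tool such as fio is still the better option.

```python
# Rough sequential-read throughput check; point it at a large file on the
# array you care about. Not a substitute for a real benchmark.
import time

def sequential_read_mbps(path, block=8 * 1024 * 1024):
    total = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        chunk = f.read(block)
        while chunk:
            total += len(chunk)
            chunk = f.read(block)
    return total / (1024 * 1024) / (time.perf_counter() - start)

if __name__ == "__main__":
    print(f"{sequential_read_mbps('/path/to/large_test_file'):.0f} MB/s")
```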
mapesdhs - Tuesday, August 12, 2014 - link
I'm just disappointed the 5930K is only going to be a 6-core. That makes the 8-core largely unaffordable to so many who've been waiting for HW-E. I thought what Intel was going to do was simply bump up the entire baseline from X79, i.e. the entry chip a 6-core (akin to the 4820K), the other two above it both 8-core, with the latter separated by the same kind of cache, base/turbo clock & sampling differences we've seen with SB-E and IB-E. Instead, we have 4-core entry, 6-core middle and 8-core at the top. That's dumb. It means, to a large extent, the speedup from a 3930K to a 5930K isn't really going to be that much at all, and likely not worth the platform update given the cost of the new mbd & RAM.
So, X99/5930K fine for a new build, but may well suck value-wise as an upgrade (even more so for those with IB-E setups).
Ian.
Meaker10 - Friday, August 8, 2014 - link
Looks like an M.2 PCIe wireless card rather than mini-PCIe to me. Look at the screw position.
dgingeri - Friday, August 8, 2014 - link
If you look just to the right of the bottom of the wireless card, you'll see the M.2 card connector. The M.2 SSD would actually sit over the wireless card.