
  • meyerkev248 - Monday, December 10, 2012 - link

    If you're going to stick anything bigger than that GeForce 210 into the 650D, I'd strongly recommend getting the mesh side panel for it and mounting some extra fans. The default fan setup is fairly terrible, and I was basically losing a part every month before I got the mesh panel. Great case otherwise, though.

    /I also was stupid enough to stick two 6970s into a case with a bad fan setup, and then wondered why every single card I stuck between the graphics cards was dying in a week. I lost a sound card, a TV tuner, a wireless card, my motherboard, and one of the graphics cards inside of 3 months. Four 120mm fans later, I haven't lost a thing since last Christmas.
  • UnderscoreHero - Friday, September 20, 2013 - link

    I've got two Windforce 670s in SLI and I've had no cooling issues at all. What is your fan configuration like? I have the stock 200mm fan in the front for intake and the other on top for exhaust, a rear 120mm, and a push/pull Hyper 121 Evo with Corsair SP120s. Those SPs push the air out the back pretty quickly. I also don't have drives in the bottom cage, only the top. Is your fan filter all dusty? Room temp too hot?
  • UnderscoreHero - Friday, September 20, 2013 - link

    * Hyper 212 Evo
  • Dustin Sklavos - Monday, December 10, 2012 - link

    AMD's workstation cards can be a little hinky on the driver side, but if you're just working with Maya they can be a killer bargain.

    If you're editing video masters, the i7-3930K is going to be a good choice. HOWEVER... if you're editing video that's going to go up on YouTube, you may actually be better served by an i7-3770K and QuickSync. There's a clear performance hit in initial render time when you do the master, but I know that for my burlesque performance videography, QuickSync has been absolutely invaluable. (There's a rough sketch of the two encode paths at the end of this comment.)

    Finally, either way, video editors are going to want at least a pair of mechanical drives (or a very large SSD) in some form of striped RAID to use as a scratch drive.
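
    On the QuickSync point above, here's roughly how I script the two encode paths. A hedged sketch, not a recipe: it assumes an ffmpeg build with Quick Sync (h264_qsv) support enabled, and the file names are made up.

        import subprocess

        SRC = "master.mov"  # hypothetical input file

        # Software x264 pass: slow but high quality, for the master.
        subprocess.run([
            "ffmpeg", "-i", SRC,
            "-c:v", "libx264", "-preset", "slow", "-crf", "18",
            "-c:a", "aac", "-b:a", "256k",
            "master_delivery.mp4",
        ], check=True)

        # QuickSync hardware pass: much faster, fine for YouTube previews.
        subprocess.run([
            "ffmpeg", "-i", SRC,
            "-c:v", "h264_qsv", "-global_quality", "25",
            "-c:a", "aac", "-b:a", "192k",
            "youtube_preview.mp4",
        ], check=True)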
  • Next9 - Monday, December 10, 2012 - link

    The i7-3770K is absolutely inappropriate for a workstation, since it lacks VT-d support. Besides, what is the point of buying an i7 if a real Xeon E3 v2 costs the same?
  • Dustin Sklavos - Monday, December 10, 2012 - link

    Not if you're editing video. A lot of prosumers overclock their workstations because video editing is so CPU-intensive (check the Adobe Premiere Pro forums). VT-d isn't a major loss for these users.
  • Next9 - Monday, December 10, 2012 - link

    Only kids "overclock"....

    Workstation means a rock-stable machine that's reliable 24/365. Where is the ECC RAM with the i7-3770K? Professionals edit video on Xeon workstations running RedHat using Autodesk Smoke.

    An overclocked i7 K-series with Adobe is the hobbyist market :-)
  • GrizzledYoungMan - Monday, December 10, 2012 - link

    Wrong. Premiere Pro and FCP (less so, since the FCP X debacle) make up the majority of the professional video editing market. Autodesk software is only used in very high end applications.

    I do technology consulting for lots of low and middle tier video editing houses in NYC, the sorts of guys who pump out the content that fills up cable and broadcast TV and professionally produced internet video. They all use Premiere Pro and FCP.

    Even advertising is quickly adopting lower cost commodity editing systems like the one described here. Which leaves only high end cinema and very high end television for Autodesk - a small fraction of the market.

    As far as "only kids overclock," that's also wrong. Yes, professionals place a much higher emphasis on stability, especially in large corporate environments where procurement procedures take forever and gear has to last just as long.

    But for a lot of high performance SMB applications, I see overclocking being done all the time. Lots of independent and smaller media/design operations overclock to gain a performance advantage or save costs, and I've even seen a few software vendors buy overclocked servers (which are pretty easy to find from grey box resellers) for applications that are very sensitive to single threaded performance.
  • AstroGuardian - Monday, December 10, 2012 - link

    Totally agreed.
  • twtech - Monday, December 10, 2012 - link

    My official work machine is a dual-processor Xeon workstation. It's very stable. I think I've only gotten a BSOD once in 3.5 years, and that was the driver for a failing consumer-class GPU.

    However, I also do some work from home, and in that case I'll work on my 3930K, to which I applied an 800MHz OC to 4.0GHz. It's also very stable with a closed-loop watercooler, having gone a year so far running 24x7.

    Is the Xeon workstation more stable than my home machine? Objectively, probably yes. On the other hand, is my hand-built home workstation stable enough to depend on? I'd say also yes.

    The bigger concern, really, is that I should get a beefy UPS. While my overclocked processor has never failed me, the power has gone out a couple of times while I was using the machine.
  • Next9 - Tuesday, December 11, 2012 - link

    There is another important argument: NBD (next business day) on-site warranty.

    If there is any problem with your real workstation, you call the vendor and the next day you have a functional machine.

    If there is a problem with your do-it-yourself, consumer-grade, so-called workstation, you are left on your own.
  • PCMerlin - Monday, December 10, 2012 - link

    I have to agree with you, Next9. The stability of ECC and the raw power of a Xeon, along with a Quadro or FirePro series video card, should be the only combination for a serious CAD or other graphics workstation.

    I would NOT want to be the tech that has to answer the call when a designer wants answers to why the drawing he/she just spent the better part of the day working on just got "zapped" when his/her system blue-screened.

    In the regular workplace, the helpdesk guy can be the "hero" by restoring a crashed system back to life. CAD designers and engineers, on the other hand, would be perfectly happy if they never saw anyone from the IT world during their day-to-day work.
  • zebrax2 - Monday, December 10, 2012 - link

    What happened to the workstation GPU review?
  • A5 - Monday, December 10, 2012 - link

    Is that what the kids are calling it these days? ;)
  • GrizzledYoungMan - Monday, December 10, 2012 - link

    While I agree with you on some points (see below), I'm still deeply skeptical of the usefulness of QuickSync for professional video encoding. The image quality of those Intel commodity hardware encoders is really poor relative to any halfway decent software encoder, and pros tend to value quality over a few minutes (or even hours) of encoding time, as it so heavily affects the perceived overall quality of their product.

    But maybe that's changed? Perhaps a comparison article is in order?
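
    (For anyone who wants to run that comparison at home, ffmpeg's ssim filter is a crude start. A sketch with placeholder file names, assuming the encode and the reference share the same resolution and frame rate:)

        import subprocess

        # Run ffmpeg's SSIM filter: first input is the encode under test,
        # second is the reference master. File names are placeholders.
        result = subprocess.run(
            ["ffmpeg", "-i", "quicksync_encode.mp4", "-i", "source_master.mov",
             "-lavfi", "ssim", "-f", "null", "-"],
            capture_output=True, text=True,
        )

        # ffmpeg reports the score on stderr, e.g. "SSIM Y:0.98 ... All:0.97"
        for line in result.stderr.splitlines():
            if "SSIM" in line:
                print(line)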
  • JarredWalton - Monday, December 10, 2012 - link

    I believe the point is that if you're uploading something to YouTube (which will futz with the quality, so there's no point in ultra-high rendering quality in the first place), Quick Sync is awesome. E.g. "Here's a preview clip -- it encoded in one minute on my little Ultrabook with Quick Sync, and since it's on YouTube the loss in quality doesn't matter."
  • GrizzledYoungMan - Monday, December 10, 2012 - link

    I don't want to nitpick, but the fact that YouTube generally recompresses any video delivered to the site isn't a justification for skimping on the quality of the video you deliver; it's a rationale for being even MORE careful about what you deliver to YouTube.

    Speaking from experience, it's definitely possible to get video up on YouTube that looks great, you just have to deliver at the highest quality possible. If memory serves, YouTube accepts files of up to 20GB in size with no practical limit on bitrate, so I usually max out bitrate (via high quality settings, larger frame size, frame rate, etc.) as much as possible relative to the length of the clip and the file size limit. (Quick arithmetic at the end of this comment.)

    In general, the rule when encoding is that good video put through bad compression gives you mediocre video. Mediocre video put through bad compression gives you bad video.

    To put it another way: the more information (by way of better quality compression) delivered to the YouTube encoding pipeline, the better the overall result.
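
    (The arithmetic I use is simple enough to script; a sketch, with the 20GB cap as I remember it rather than anything official:)

        # Back-of-the-envelope: the highest average bitrate a clip can
        # carry under an upload size cap. The 20GB cap is from memory --
        # check YouTube's current limits before leaning on it.
        SIZE_CAP_BYTES = 20 * 1024**3   # 20GB
        OVERHEAD = 0.98                 # leave ~2% for audio and container

        def max_video_bitrate_mbps(clip_seconds):
            usable_bits = SIZE_CAP_BYTES * 8 * OVERHEAD
            return usable_bits / clip_seconds / 1e6

        # A 10-minute clip can average roughly 280 Mbps of video, far more
        # than any sane encode needs, so the cap rarely binds in practice.
        print("%.0f Mbps" % max_video_bitrate_mbps(600))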
  • Next9 - Monday, December 10, 2012 - link

    What is the point of using garbage consumer-grade boards like ASUS or ASRock?

    ASUS boards usually lack proper VT-d, ECC, AMT, and other professional feature support. The BIOS/UEFI interface is a complete piece of shit, with a GUI targeted at 10-year-old kids, full of stupid tawdry "features" with no real value to usability.
  • Rick83 - Monday, December 10, 2012 - link

    I was about to say the same: this review lacks consideration of S1155 Xeons, C216 chipsets, ECC... basically everything that makes the distinction between a desktop and a workstation.
    And even the C216 ASUS board does not support AMT.

    With the current price of these components, you would only add around 200 dollars to the mid-range machine to bring it up to workstation spec.
    ECC UDIMMs are only mildly more expensive than non-ECC UDIMMs; S1155 Xeons are only marginally more expensive than the i7 and come with all features unlocked; and the Supermicro X9SAE(-V) boards (the only boards for the S1155 workstation market that can be found in retail) go below 200 dollars if you shop around. That's twice the price of the bargain-bin B75 board, but you get so much more for your money.

    There's little use in going higher end, as anything that requires more performance should probably not be at your workplace, but rather in a server room.
    The AMD route is an interesting way of getting ECC at a slightly cheaper price. But only if you can stomach losing remote management.
  • Ktracho - Monday, December 10, 2012 - link

    What motherboard(s) would you recommend for ECC and full VT-d support? I built a system with 3 Tesla cards with the idea that one or two of them could be dedicated to a virtual machine, but I didn't realize the motherboard also needed to support VT-d. I have no idea how to find out which motherboards have this feature.
  • Rick83 - Monday, December 10, 2012 - link

    With 3 Teslas, you definitely want a C606-based chipset, otherwise you'll run out of PCIe lanes fast. I haven't looked closely at that market yet, because it's outside my needs/budget.
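
    As a sanity check once a board is in hand: on Linux, VT-d being enabled by the firmware shows up as an ACPI DMAR table and IOMMU messages in the kernel log. A rough sketch, assuming a typical modern distro:

        import os
        import subprocess

        # If the board/BIOS has VT-d enabled, the firmware publishes an
        # ACPI DMAR table that Linux exposes under /sys.
        if os.path.exists("/sys/firmware/acpi/tables/DMAR"):
            print("ACPI DMAR table present: VT-d enabled in firmware")
        else:
            print("No DMAR table: VT-d is off, or the board/BIOS lacks it")

        # The kernel log should also show IOMMU initialization (you may
        # need intel_iommu=on on the kernel command line).
        dmesg = subprocess.run(["dmesg"], capture_output=True, text=True).stdout
        for line in dmesg.splitlines():
            if "DMAR" in line or "IOMMU" in line:
                print(line)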
  • Next9 - Tuesday, December 11, 2012 - link

    What about Supermicro? They have plenty of single-socket and dual-socket LGA2011 motherboards, even with onboard audio and USB 3.0. Or there are C32/G34 alternatives if you prefer Opterons.
  • Ananke - Monday, December 10, 2012 - link

    I absolutely agree. The workstation path is a Supermicro/Tyan/Intel motherboard with a Xeon, ECC RAM, etc., and an NVIDIA GPGPU card. If you mostly use Maya, then you can go cheap with a consumer Radeon. Possibly a redundant PSU as well. Such a system can go as high as $50k, though.
  • Next9 - Tuesday, December 11, 2012 - link

    Cheap Radeons are also great in virtualized environments (VT-d/IOMMU). Consumer-grade GeForce cards often have problems with direct HW passthrough.
  • Penti - Monday, December 10, 2012 - link

    Workstation features and text-mode firmware are what you want, not to be reminded of the GUI BIOSes of 486 computers of the past. It wasn't a good idea then and isn't now. Working implementations are all that matter.

    I guess high-end should be something like an LGA2011 Xeon machine. Of course, something like a Supermicro board will have its own (third party) IPMI and KVM-over-IP embedded BMC/stack. Or a 4P Opteron Piledriver machine, for high-end enterprise-type stuff. (Boxx sells systems with dual-processor Opterons or Xeons; I guess dual-processor Opteron will give a boost for some, at least with G34 quad-channel memory and 2/4P on top of that.) At least those systems where you have the choice to go 2P or 4P are what you would call actual workstations, and they perform better or fit other uses than, say, an overclocked 3770K anyhow. Between a six-core SB-E and an Ivy Bridge quad-core there just won't be a lot of difference to justify the shift.

    If you want VT-d, you can find Z77 boards with support for it if you're looking. Just not ASUS boards. Provided you choose a CPU that has support for it too, like the 3770 (non-K). Probably Q77 vPro/AMT-supported boards as well, if you look for them. And if you do any creative multimedia-type work, you probably want a much more powerful graphics card than the GF210, in terms of supporting things like Adobe's Mercury Playback Engine, CUDA acceleration, or professional CAD and modeling software.

    It all depends on need. If you need it, a good machine will probably be worth it.
  • Olaf van der Spek - Monday, December 10, 2012 - link

    What's up with the Radeon 5450 suggestion? With only a 64-bit DDR3 memory bus it's seriously low on memory bandwidth, which might even cause trouble on the desktop.
  • DanNeely - Monday, December 10, 2012 - link

    I've run a 2560x1600 and a 1200x1600 monitor simultaneously on significantly slower cards without trouble. You're not going to be able to do anything GPU-heavy on them, but the desktop's requirements are so low that virtually anything can handle them. (Quick numbers below.)
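
    Quick numbers, with the card specs as assumptions rather than verified figures:

        # Rough numbers: desktop scanout bandwidth vs. a slow card's
        # memory bandwidth. Card specs below are assumptions; the 5450's
        # exact memory clock varies by SKU.
        def scanout_gb_s(width, height, hz=60, bytes_per_px=4):
            return width * height * bytes_per_px * hz / 1e9

        desktop = scanout_gb_s(2560, 1600) + scanout_gb_s(1200, 1600)
        print("Two-monitor scanout: ~%.1f GB/s" % desktop)    # ~1.4 GB/s

        # 64-bit DDR3 at an assumed 1600 MT/s: 8 bytes * 1.6e9 transfers.
        card_bw = 8 * 1.6e9 / 1e9
        print("Card memory bandwidth: ~%.1f GB/s" % card_bw)  # 12.8 GB/s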
  • slatanek - Monday, December 10, 2012 - link

    Just wanted to add: all Adobe CS6 users (Premiere Pro especially) should go with NVIDIA for graphics. AMD's GPUs get limited support, and it's OS X only (Mercury Engine, mostly).
    As for the lack of ECC, VT-d, etc.: I understand that Zack stated/advised that if your work is critical (that's where you'd use ECC, after all), you'd be better off with pre-built systems that come with full service. That's why, I assume, there's no mention of the pro features.
  • ggathagan - Tuesday, December 11, 2012 - link

    Are you sure about that?

    In CS5, the Mercury Engine only supported CUDA.
    As I understand it, however, Mercury has been modified in CS6, dropping CUDA support in favor of OpenCL and OpenGL:

    http://helpx.adobe.com/photoshop/kb/photoshop-cs6-...

    "MGE is new to Photoshop CS6 and uses both the OpenGL and OpenCL frameworks. It does not use the proprietary CUDA framework from nVidia."
  • TeXWiller - Monday, December 10, 2012 - link

    That is an inappropriate selection for a workstation. Try the W-series instead. And since we are talking about entry-level workstations, the E3 Xeons and AMD-based ECC-supporting consumer boards should have been included, as other commenters have pointed out. The A300s don't support ECC but would have been an interesting point of comparison.
  • Kristian Vättö - Monday, December 10, 2012 - link

    I don't personally find the Intel SSD 520 to be a good value anymore. Sure, it comes with a 5-year warranty, but so do almost all high-end SSDs nowadays. Its performance isn't worth paying extra for either; to be honest, it's slow compared to other high-end SSDs. SandForce really isn't the best choice if you're dealing with incompressible data, and I think it's important for workstation users to have consistent performance, which SandForce cannot provide.

    If the Samsung SSD 840 Pro is out of reach, I would recommend either the Corsair Neutron GTX or the Plextor M5 Pro. At 120/128GB they cost around as much as the Intel SSD 520, and if you go for the 240/256GB model you'll be able to save a few bucks. Both also come with a 5-year warranty, if that's a concern.
  • mrdude - Monday, December 10, 2012 - link

    I came in here to post this, but I'm glad I'm not the only one.

    The Intel drives really don't offer anything special anymore, particularly since Corsair's LAMD acquisition. Plextor also offers a 5-year warranty, and they've got the best Marvell in-house firmware on the market, with rock-solid stability and fantastic performance. Since the M3 they've been my SSD of choice and the drives I recommend to everyone, but now it's a toss-up between Corsair's Neutron and Plextor's drives. Of course, if we're talking power consumption in a laptop, then it's pretty one-sided.

    As far as QuickSync and video editing go, it highly depends on the software involved. Some software responds well to CUDA/OpenCL and blazes through with GPU assist while showing no signs of even slight distortion or muddiness, while other software maintains great image quality via fixed-function units like QuickSync. The most consistent as far as image quality goes will always be a straight CPU approach, but that doesn't necessarily mean it's the only viable solution.
  • Doctor Z - Monday, December 10, 2012 - link

    You do realize that for most users who need power, dual- and quad-CPU server motherboards make better workstations? Why didn't you include those, Zach? Because they're in the $10,000-$30,000 range fully loaded?
  • A5 - Monday, December 10, 2012 - link

    If that's what you need, you aren't building your own. You're either part of a company that will buy it for you (and therefore your IT department will want something serviceable with a warranty) or you're running your own business (and can deduct the expense) and need something that is rock-solid reliable.
  • JDG1980 - Monday, December 10, 2012 - link

    If it doesn't have ECC RAM, it's not a workstation. Period. Not one of the builds showcased in your articles includes this basic feature - an inexcusable omission.
  • Pityme22 - Monday, December 10, 2012 - link

    First, you have to identify the programs the workstation will run to even begin commenting; e.g., you don't mention CAD programs, which require a Quadro or FirePro graphics card to use all of their features. Anand, I am very surprised that you let this "article" be posted, as it is very much below normal AnandTech standards. Shame, shame.
  • jamesgor13579 - Monday, December 10, 2012 - link

    I completely agree with many of the other posters here. If it does not have ECC, it isn't a workstation. I work in R&D for a large tech company, and there is a reason ALL of our desktops have ECC: RAM just isn't that reliable.

    Here is an excellent example of why:
    I am an ASIC designer. We have to run a lot of simulations of the logic and timing to make sure everything works. Once our design is laid out, modeling all of the timing takes a lot of memory: dozens of GB just to simulate part of the design. Someone was cheap and built three "workstations" out of desktop motherboards and 64GB (8x8GB on socket 2011) of non-ECC memory. Well under a year later, when it was crunch time on the project and the machines were running week-long simulations, two of them started randomly crashing. Guess what: it was the RAM. Replaced one DIMM and the machine worked again.

    If you need your system to work, non-ECC RAM is not OK.
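
    (When we suspected the RAM, we booted the usual memtest discs. For illustration only, here's a toy user-space version of the same idea; it is nowhere near a substitute for a real memory tester:)

        import os

        # Toy user-space pattern test: write pseudo-random data in big
        # chunks, then verify. Nowhere near as thorough as memtest86+,
        # and it only touches memory the OS hands this one process.
        CHUNK_MB = 256
        PASSES = 4

        for p in range(PASSES):
            pattern = os.urandom(1024)                    # 1KB pattern
            buf = bytearray(pattern * (CHUNK_MB * 1024))  # ~256MB buffer
            if buf != pattern * (CHUNK_MB * 1024):
                print("pass %d: MISMATCH -- suspect the RAM" % p)
                break
            print("pass %d: OK" % p)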
  • Makaveli - Monday, December 10, 2012 - link

    Doesn't that just mean you had a bad stick of memory?

    And you're telling me that can't happen with ECC memory?
  • smpltn - Monday, December 10, 2012 - link

    The 650D's paint chips easily.
  • Kevin G - Monday, December 10, 2012 - link

    Depending on the task at hand, ECC is a requirement. A graphic artist or video editor would likely only encounter a pixel being off-color from a memory error, and those fields can generally tolerate such errors. The CAD, research, or financial markets, for example, absolutely need ECC due to the need for continual data integrity.

    As such, more consideration should have been put into AMD motherboards (the FX line supports ECC, but not all motherboards do) as well as socket 1155/C200-series chipsets for the low-end and midrange builds. Even if the use case doesn't need ECC, I'd still have opted to include such a motherboard with the AMD FX build. For the high-end socket 2011 build, I'd recommend the Gigabyte X79S-UP5, which supports ECC memory as it is really based upon the C606 chipset. That motherboard would cover a wide range of workstation uses; only those wanting dual sockets would have to look elsewhere.

    The graphics card choice is also 'interesting'. For simple 2D work, a low-end consumer card is more than enough for some use cases. For things like image editing, getting a low-end FirePro or Quadro would make sense for the superior drivers and 30-bit color support. Other scenarios are starting to use GPUs for heavy processing; video editing applications, for example, use them to accelerate some effects. High-end consumer cards are often equal to midrange workstation cards due to GPGPU performance being artificially crippled on the consumer side. Selecting between cards often boils down to specifically what applications the workstation will be running.

    Power supply selection is a bit weak. A workstation tends to be expandable, and I'd provision some room for future expansion. Upgrading the low-end builds here with a midrange or better consumer GPU would entail a PSU upgrade as well. The article does mention getting bigger PSUs with bigger video cards, but I see it as wiser to provision a PSU with these possible upgrades in mind before purchasing them. Only in the niche GPGPU workstation area can I see multiple video cards being worth considering, so that does put a reasonable upper bound on PSU requirements.

    For a workstation I always recommend that a pair of hard drives be set up in a RAID1 array to protect your data in the event of a disk failure. A five-year warranty won't help when your drive is dead and you have to pay for downtime and for recreating work from your most recent backup. Speaking of which, including good backup software/external storage and a solid UPS would be wise for a workstation regardless of use case. When you're using a system for work, you want it to work continuously. (On Linux, the mirror is nearly a one-liner; see the sketch below.)
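
    A sketch of that mirror on Linux, with hypothetical device names (mdadm will happily destroy the wrong disks if you let it):

        import subprocess

        # Create a two-disk RAID1 mirror with mdadm. /dev/sdb and /dev/sdc
        # are hypothetical; --create DESTROYS whatever is on those disks,
        # so triple-check device names before running anything like this.
        subprocess.run(
            ["mdadm", "--create", "/dev/md0",
             "--level=1", "--raid-devices=2",
             "/dev/sdb", "/dev/sdc"],
            check=True,
        )

        # Watch the initial sync; the array is usable while it resyncs.
        print(open("/proc/mdstat").read())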
  • Blibbax - Monday, December 10, 2012 - link

    What made you pick the i7-3770 over the cheaper Xeon E3-1230V2?
  • stickmansam - Monday, December 10, 2012 - link

    Yeah I was wondering about that too
  • slatanek - Monday, December 10, 2012 - link

    From what I read on the 1st page of this guide, it's focused on consumer/enthusiast-grade workstations, where your work doesn't involve serious money or mission-critical applications. I mean, c'mon guys, is that so hard to get? It's written right there in the introduction. So stop trying to outsmart each other with statements about what is and isn't a workstation. Nowadays "workstation" doesn't even mean that much; what is so different about a modern workstation vs. an enthusiast PC? Frankly, not much. It's basically the same architecture, layout, etc. As I've said before, go and read it again; it's written right there:

    "If your computer is more than important (i.e. mission critical), DIY is rarely a good idea."

    Am I the only one who read the whole article before posting a comment? I get the impression that some of you just looked at the component choice lists and went on trolling.

    As a reply I say:

    "anything that runs ECC is not worth writing about, cause guys using those "things" are too focused on their job to even bother reading about it".

    Nah, just kidding ;-)
    Cheers
  • JonnyDough - Monday, December 10, 2012 - link

    "Though Piledriver chips don't match Intel's highest-end performance processors, at certain price points, Piledriver CPUs are worth consideration because they can outperform equivalently priced Intel products (with a few qualifications)."

    However, the difference in power usage may still make the Intel system a better deal.
  • beaker7 - Monday, December 10, 2012 - link

    Cupcake article. The parts in the high-end build are neither high-end nor workstation-class. A 3930K could be used in a budget situation, I suppose.

    The current high end is:

    E5-2687W Xeons
    SuperMicro dual LGA 2011 board
    1600MHz ECC RAM

    etc.
  • lunadesign - Friday, December 14, 2012 - link

    +1000 (I totally agree)
  • Uncognoscenti - Wednesday, December 12, 2012 - link

    I would appreciate some elaboration on the criteria for selecting Windows 7 over Windows 8 in this application.
  • kadajawi - Wednesday, December 19, 2012 - link

    I find it hard to believe that you do not specify a CPU cooler in that build. The Intel stock heatsink is loud and sometimes even inadequate; I have had i7s overheat and slow down because the CPU just gets too hot. I switched to a Xigmatek Gaia, and the machine was not only silent but also ran so cool that it could be overclocked from 2.66 to 3.8GHz (I also swapped the lousy case for a Xigmatek case).

    The 3570K can also be an interesting CPU for a workstation. All the 3D workstations I have built use the 3570K on an ASUS Z77 board (P8Z77-M), with cooling provided by a Cooler Master Hyper 212 EVO, encased in a Xigmatek Asgard XP; the power supply is a Seasonic S12-II Bronze 620 (one of which died within a few months). The systems run pretty well and are rock solid at 4.2GHz. For the OS and software, an Intel 320 SSD was used. They're impossible to hear unless under full load, and the performance easily rivals the i7 PCs that were already there.
  • harth234 - Friday, August 9, 2013 - link

    More guides please!
    It's been half a year!
