67 Comments
etamin - Thursday, May 31, 2012 - link
I just glossed through the charts (will read article tomorrow), but I noticed there are no Nehalems in the comparisons. It would be nice if both a Bloomfield and a Gulftown were thrown in. If Phenom IIs are still there, Nehalem shouldn't be THAT outdated right? Anyways, I'm sure the article is great. Thanks for your hard work and I look forward to reading this at work tomorrow :)
SK8TRBOI - Thursday, May 31, 2012 - link
I agree with etamin - if Phenom is in there, a great Intel benchmark cpu would be the Nehalem i7-920 D0 OC'd to 3.6GHz - I'd wager a significant percentage of Anand's readers (myself included!) still have this technological wonder in our everyday rigs. The i7-920 would be a good 'reference' for us all when evaluating/comparing performance.
Thanks, and awesome article, as always!
CeriseCogburn - Thursday, May 31, 2012 - link
It's always "best" here to forget about other Intel and nVidia - as if they suddenly don't exist anymore - it makes amd appear to shine.Happens all the time. Every time.
I suppose it's amd's evil control - or the yearly two week island vacation (research for reviewers of course)
LancerVI - Thursday, May 31, 2012 - link
Throw me into the list that agrees.
Still running an i7 920 C0. The Nehalems being in the chart would've been nice.
jordanclock - Thursday, May 31, 2012 - link
That's what Bench is for.
HanzNFranzen - Thursday, May 31, 2012 - link
I have to agree as well. I have an i7 920 C0 and often wonder how it stacks up today against Ivy Bridge. I'm thinking that holding off for Haswell is a safe bet even though I have the upgrade itch! It's been 3 years, which is great to have gotten this much mileage out of my current system, but I wanna BUILD SOMETHING!!
CeriseCogburn - Monday, June 11, 2012 - link
It's a whole video card tier of frame rate difference in games, plus you have sata 6 and usb 3 to think about, not to mention pci-e 3.0 to be ready for when needed.
Buy your stuff and sell your old parts to keep it worth it.
jwcalla - Thursday, May 31, 2012 - link
I'm guessing the stock idle power consumption number is with EIST disabled?
I've been waiting for some of these lower-powered IVB chips to come out to build a NAS. Was thinking a Core i3 (if they ever get around to releasing it), or maybe the lowest Xeon. Though at this point I might just bite the bullet and wait for 64-bit ARM... or go with a Cortex-A15 maybe.
ShieTar - Thursday, May 31, 2012 - link
If a Cortex-A15 would give you enough computing power, you should also be happy with a Pentium or even a Celeron. The i3 is already rather overkill for a simple NAS.
I have a fileserver with Windows Home Server running on a Pentium G620, and it has absolutely no problem pushing 120 MB/s over a GBit Ethernet switch from a RAID-0 pack of HDDs while running µtorrent, Thunderbird, Miranda and Firefox on the side. Power consumption of the complete system is around 40-50W in idle, and I haven't even shopped for specifically low-power components but used a lot of leftovers.
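A rough sanity check on that 120 MB/s figure: the network, not the CPU, is the ceiling there.

\[
1\,\mathrm{Gbit/s} = 125\,\mathrm{MB/s}\ \text{raw} \;\Rightarrow\; \approx 110\text{--}118\,\mathrm{MB/s}\ \text{of file payload after Ethernet/IP/TCP framing overhead}
\]

So a GBit link is already essentially saturated at that rate, and a faster processor wouldn't move the number.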
BSMonitor - Thursday, May 31, 2012 - link
Yeah, the CPU is just 1 part of the power consumption puzzle.. And since in "file sharing" mode, it will almost always be in a low power/idle state... An ARM CPU would show little improvement..
But if you ever offloaded any kind of work to that box, you'd have wasted your money with an ARM box, as no ARM processor will ever match real task performance of any of the x86 processors.
MonkeyPaw - Thursday, May 31, 2012 - link
You can always go with the 2120T. It's only 35W (4 threads) and would beat the pants off the other 2 options you are considering.
http://www.newegg.com/Product/Product.aspx?Item=N8...
nubie - Thursday, May 31, 2012 - link
I have to second the G620, they are damn cheap, about half the price of the 2120T, and you aren't losing any level 3 cache.
SleepIT - Friday, June 15, 2012 - link
I use the mini-ITX Atom-based boards running Ubuntu/Webmin for NAS's (OS on thumbdrive, 4 x 2Tb drives on the latest). Performance is stellar!
majfop - Thursday, May 31, 2012 - link
Upping the Turbo Boost multipliers for the 400 MHz overclock is only on Z75 and Z77, right? That makes it very much less "free" I would say.
There seem to be some reasonable budget option B75 and H77 motherboards, not to mention the previous socket 1155 offerings with an updated BIOS to accept the IVB processor.
iwod - Thursday, May 31, 2012 - link
I sometimes wish Intel could just present one lineup for OEMs and another for retail consumers, to greatly simplify things. Just looking at the lineup hurts my brain and eyes; offering all CPUs with HT, in 2/4 core variants with speed differentiation, would be MORE than enough for me.
Then AMD is plain stupid for not capturing more market share with their APU. Their new CEO has it right: he said he has watched AMD systematically shoot itself in the foot, over and over again.
BSMonitor - Thursday, May 31, 2012 - link
It's ALL for OEMs. Retail CPU consumers are such a tiny fraction of the pie. Consumers just jump in where the CPU fits them best.
The only consumer-aimed retail products are the high end i7 hex-cores.
silverblue - Thursday, May 31, 2012 - link
Rory Read wasn't responsible for Fusion in any way; the only thing he can realistically do here is to push as many resources at getting APUs out of the door as possible.
thunderising - Thursday, May 31, 2012 - link
One could have at least hoped for HD3000 in a $200 chip, not HD2500 crap
ShieTar - Thursday, May 31, 2012 - link
HD3000 is not 500 HDs better than HD2500, it is in fact an older, less capable version of HD graphics. The numbering scheme is admittedly silly, but that's to be expected from Intel by now.
In the end, this CPU is not meant for gamers, not even if they want Ivy Bridge for a low cost. For $60 less than the 3470 you will soon get the 2 core / 4 threads i3-3220. For a low budget gamer, this will still give you more than enough CPU power to team with any GPU you can afford. And those $60 you saved can buy you an AMD 6670, which should be at least twice as fast as HD4000.
The 3470 makes much more sense for people that can accept minimal GPU power, but appreciate the increased CPU power of the (real) quadcore. Think office PC handling massive excel files with loads of calculations: not enough to warrant a Xeon based system, but definitely enough to make the $60 premium over a dual core worthwhile.
CeriseCogburn - Monday, June 11, 2012 - link
The idea that any poor sap has to game on any 2000, 2500, 3000, or 4000, or llano, or trinity, is too much to bear.
What a PATHETIC place to be. How the heck is someone going to spend on HD3000 or HD4000 and the board and accompanying items and not have $75-$100 for say a GTX460?
I mean how badly do these people plan on torturing themselves and what point are they really trying to prove?
They could get a $100 amd Phenom 2 and a $100 GTX 460 if they want to game, and be so far ahead...
This whole scene is one big bad freaking JOKE.
duploxxx - Thursday, May 31, 2012 - link
you could ask why even bother to add a GPU here, it is utter crap.
now let's have a look at the other review, what they left in the U 17W parts ......
PrinceGaz - Thursday, May 31, 2012 - link
Because most people don't buy a PC to play the latest games or do 3D rendering work with it.
CeriseCogburn - Monday, June 11, 2012 - link
Correct PrinceGaz, but then we have the amd fan boy contingent, that for some inexplicably insane fantasy reason, now want to pretend llano, trinity and hd graphics are gaming items...
The whole place has gone bonkers. Buy a freaking video card, every *********** motherboard has one 16x slot in it.
mother of god !
ananduser - Thursday, May 31, 2012 - link
I noticed you ran a BF3 bench and assumed that the game is playable on an HD4000. Have you actually played the game with the HD4000 on a 32 player map? I dare you to try and update this review and say that BF3 is "playable" on a HD4000.
"The HD 4000 delivered a nearly acceptable experience in Battlefield 3..." I wouldn't call that saying the game is "playable". Obviously, the more people there are on a map the worse it gets, but if you're playing BF3 on 32 player maps (or you plan to), I'd hope you understand that you'll need as much GPU (and quite a bit of CPU) as you can throw at the game.That said, I'll edit the paragraph to note that we're discussing single-player performance, and multiplayer is a different beast.
ananduser - Thursday, May 31, 2012 - link
Well, when one sees 37 fps BF3, one might wrongly assume that: "OMG 37 FPS on a thin and low powered [insert ultrabook brand]"; I'm going tomorrow and buying it.
I think that playing the game for 5-10 minutes with FRAPS enabled and providing highest/lowest FPS count for each hardware setup is more revealing than running built-in engine demos.
CeriseCogburn - Friday, June 1, 2012 - link
Don't forget this applies to AMD Trinity whose crappy cpu will cave in on a 32 player server.
ananduser - Thursday, May 31, 2012 - link
Oh and I doubt single player performance will ever touch that 37 data point as well.
JarredWalton - Thursday, May 31, 2012 - link
On the quad-core desktop IVB chips it surely will -- that's why the result is in the charts -- but for laptops? Nope. I think the best result I've gotten (at minimum details and 1366x768) is around 25-26FPS in BF3.ltcommanderdata - Thursday, May 31, 2012 - link
"Intel has backed OpenCL development for some time and currently offers an OpenCL 1.1 runtime for their CPUs, however an OpenCL runtime for Ivy Bridge will not be available at launch. As a result Ivy Bridge is limited to DirectCompute for the time being, which limits just what kind of compute performance testing we can do with Ivy Bridge."http://software.intel.com/en-us/articles/vcsource-...
The Intel® SDK for OpenCL Applications 2012 has been available for several weeks now and is supposed to include GPU OpenCL support for Ivy Bridge. Isn't that sufficient to enable you to run your OpenCL benchmarks?
http://downloadcenter.intel.com/Detail_Desc.aspx?a...
If not, the beta drivers for Windows 8 also support Windows 7 and add both GPU OpenCL 1.1 support and full OpenGL 4.0 support including tessellation, so they would allow you to run your OpenCL benchmarks and a Unigine Heaven OpenGL tessellation comparison.
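For anyone who wants to check what a given driver actually exposes, a minimal probe along these lines (standard OpenCL 1.1 API only; the file name and build line are just illustrative) lists each platform and any GPU devices it reports, together with each device's OpenCL version string:

```c
/* probe.c - sketch: enumerate OpenCL platforms and any GPU devices they expose.
 * Build example (Linux, assuming an OpenCL SDK is installed): gcc probe.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char pname[256] = {0};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof pname, pname, NULL);
        printf("Platform: %s\n", pname);

        cl_device_id gpus[4];
        cl_uint num_gpus = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 4, gpus, &num_gpus) != CL_SUCCESS)
            continue;  /* this platform exposes no GPU device (CPU-only runtime) */

        for (cl_uint d = 0; d < num_gpus; ++d) {
            char dname[256] = {0}, dver[64] = {0};
            clGetDeviceInfo(gpus[d], CL_DEVICE_NAME, sizeof dname, dname, NULL);
            clGetDeviceInfo(gpus[d], CL_DEVICE_VERSION, sizeof dver, dver, NULL);
            printf("  GPU device: %s (%s)\n", dname, dver);
        }
    }
    return 0;
}
```

If the Intel platform shows up but enumerates no GPU device, then GPU OpenCL isn't active on that driver/OS combination and only the CPU runtime is available.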
Ryan Smith - Thursday, May 31, 2012 - link
You're completely right. We were in a rush and copied that passage from our original IVB review, which is no longer applicable.
SteelCity1981 - Thursday, May 31, 2012 - link
Intel could have at least called it a 3500 and slapped 2 more EU's onto it.
fic2 - Thursday, May 31, 2012 - link
Agreed. I don't understand why Intel basically stood pat on the low end HD.
But then again, like everyone else I never understood why the HD3000 was only in the K series, when maybe 5% of K series users don't have a discrete gpu, so the HD3000 mostly isn't being used.
CeriseCogburn - Monday, June 11, 2012 - link
Good point. Lucid logic tried to fix that some, and did a decent job, and don't forget quick sync, plus now with zero core amd cards, and even low idle power 670's and 680's, leaving on SB K chip hd3000 cores looks even better - who isn't trained in low power if they have a video card, after all it's almost all people rail about for the last 4 years.
So if any of that constant clamor for a few watts power savings has any teeth whatsoever, every person with an amd card before this last gen will be using the SB HD3000 and then switching on the fly to gaming with lucid logic.
n9ntje - Thursday, May 31, 2012 - link
So this must be a midrange desktop chip? Horrendous performance on the graphics side from Intel again.
Very curious how AMD's trinity desktop will perform; at the same price range it will obviously obliterate Intel's offerings on the graphics side. What's more impressive, AMD is still on 32nm..
7Enigma - Thursday, May 31, 2012 - link
For me this IS the perfect chip. No use for the GPU so cheaper = better. I would need a K model though for OC'ing potential, but I'm glad to see that if I can't have my CPU-only (no GPU) chip, at least I can have a hacked down version that is more in line with a traditional CPU.
silverblue - Thursday, May 31, 2012 - link
What Intel should really be doing here is offering the 4000 on all i3s and some i5s to offset the reduced CPU performance. If you want to give AMD something to think about, HD 4000 on an Ivy Bridge dual core is very much the right way of going about it.
CeriseCogburn - Friday, June 1, 2012 - link
Then Intel has a lame trinity level cpu next to a losing gpu.
I think Intel will stick with its own implementations; don't expect to be hired as future product manager.
ShieTar - Thursday, May 31, 2012 - link
Interestingly enough, Intel will also happily sell you what is basically the same chip, without any GPU, 100 MHz slower but with 2MB extra L3-Cache for the same price. They call that offer Xeon E3-1220V2. And it is 69W TDP, not 77W as the i5-3470.
Who knows, the bigger Cache might even make it the better CPU for a not-overclocking gamer. If normal boards support it.
Pazz - Thursday, May 31, 2012 - link
Anand,
Following on from your closing statement with regards to the HD 4000 being the minimum, will you be doing a review of the 3570K? Surely with this model being the lowest budget Ivy HD4000 chip, it'll be a fairly popular option for many system builders and OEMs.
BSMonitor - Thursday, May 31, 2012 - link
If they truly were interested in building the best APU. And by that, a knockout iGPU experience.
Where are the dual-core Core i7's with 30-40 EU's??
Or the AMD <whatevers> (not sure anymore what they call their APU) Phenom X2 CPU with 1200 Shaders??
When we are talking about a truly GPU intense application, a LOT of times single/dual core CPU is enough. Heck, if you were to take a dual-core Core 2, and stick it with a GeForce 670 or Radeon 7950.. You would see very similar numbers in terms of gaming performance to what's in the BENCH charts. ESP at the 1920x1080 and below.
Surely Intel can afford another die that aims a ton of transistors at just the GPU side of things. AMD, maybe. Why do we get from BOTH, their top end iGPU stuck with the most transistors dedicated to the CPU??
I find it hard to believe anyone shopping for an APU is hoping for amazing CPU performance to go with their average iGPU performance. That market would be the opposite. Sacrifice a few threads on the CPU side for amazing iGPU.
Am I missing something technically limiting?? Is that many GPU units overkill in terms of power/heat dissipation of the CPU socket??
tipoo - Thursday, May 31, 2012 - link
Well, their chips have to work in a certain set of thermal limits. Maybe at this point 1200 shader cores would not be possible on the same die as a quad core CPU for power consumption and heat reasons. I think Haswell will have 64 EUs though if the rumours are true.
Roland00Address - Thursday, May 31, 2012 - link
There is no point of a 1200 shader apu due to memory bandwidth. You couldn't feed a beast of an apu with only dual channel 1600 MHz memory, when that same memory limits the performance of llano and trinity compared to their gpu cousins which have the same calculation units and core clocks but perform significantly better.
silverblue - Thursday, May 31, 2012 - link
Possibly, but at the moment, bandwidth is a surefire performance killer.
BSMonitor - Thursday, May 31, 2012 - link
Good Points. But currently, Intel has Quad Channel DDR3-1600 up on the Socket 2011. I am sure AMD could get more bandwidth there too, if they step up the memory controller.
My overall point is that neither is even trying for a low-medium transistor CPU and a high transistor GPU.
It's either Low-Medium CPU with Low-Medium GPU (disabled cores and what have you), or High End CPU with "High End" GPU.
There is no attempt at giving up CPU die space for more GPU transistors from either.. None. If someone spends $$ on the High End of the CPU (Quad Core i7), the implementation of iGPU is not even close to worth using for that much CPU.
Roland00Address - Thursday, May 31, 2012 - link
Quad Channel is not a "free upgrade" it requires much more traces on the motherboard as well as more pins on the cpu socket. This dramtically increases costs for the motherboard and the cpu. Both of those are going against what AMD is trying to do with their APUs which will be both laptop as well as desktop chips. They are trying to increase their margins on their chips not decrease them.You have a large number OEMs only putting a single 4gb ddr3 stick in laptops and desktops (thus not achieving dual channel) in the current apus. You want think those same vendors are suddenly going to put 16gbs of memory on an apu (and it is going to be 16gbs since 2gb ddr3 sticks are being phased out via the memory manufactures.)
tipoo - Thursday, May 31, 2012 - link
I'm curious why the HD4000 outperforms something like the 5450 by nearly double in Skyrim, yet falls behind in something like Portal or Civ, or even Minecraft? Is it immature drivers or something in the architecture itself?
ShieTar - Thursday, May 31, 2012 - link
For Minecraft, read the article and what it has to say about OpenGL.
For Portal or Civ, it might very well be related to Memory Bandwidth. The HD2500 can have 25.6 GB/s (with DDR3-1600), or even more. The 5450 generally comes with half as much (12.8 GB/s), or even a quarter of it since there are also 5450s with DDR2.
As a matter of fact, I remember reading several reports on how much the Llano-Graphics would improve with faster Memory, even beyond DDR3-1600. I haven't seen any tests on the impact of memory speed from Ivy Bridge or Trinity yet, but that would be interesting given their increased computing powers.
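Those bandwidth figures follow straight from the memory configuration, as a rough back-of-the-envelope (assuming the usual 64-bit channels):

\[
2 \times 8\,\mathrm{B} \times 1600\,\mathrm{MT/s} = 25.6\,\mathrm{GB/s}\ \text{(dual-channel DDR3-1600)}
\qquad\text{vs.}\qquad
1 \times 8\,\mathrm{B} \times 1600\,\mathrm{MT/s} = 12.8\,\mathrm{GB/s}\ \text{(64-bit DDR3 card)}
\]

And the integrated GPU has to share its 25.6 GB/s with the CPU cores, whereas a discrete card keeps its bandwidth to itself.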
silverblue - Thursday, May 31, 2012 - link
I'm sure it'll matter for both, more so for Trinity. I'm not sure we'll see much in the way of a comparison until the desktop Trinity appears, but for IB, I'm certainly waiting.
tipoo - Thursday, May 31, 2012 - link
Having half the memory bandwidth would lead to the reverse expectation: the 5450 is close to or even surpasses the HD4000, which has twice the bandwidth, in those games, yet the 4000 beats it by almost double in games like Skyrim, and even the 2500 beats it there.
JarredWalton - Thursday, May 31, 2012 - link
Intel actually has a beta driver (tested on the Ultrabook) that improves Portal 2 performance. I expect it will make its way to the public driver release in the next month. There are definitely still driver performance issues to address, but even so I don't think HD 4000 has the raw performance potential to match Trinity unless a game happens to be CPU intensive.
n9ntje - Thursday, May 31, 2012 - link
Don't forget memory bandwidth. Both the CPU and GPU use the same memory on the motherboard.
tacosRcool - Thursday, May 31, 2012 - link
kinda a waste in terms of graphics
paraffin - Thursday, May 31, 2012 - link
With 1920×1080 being the standard these days I find it annoying that all AT tests continue to ignore it. Are you trying to goad monitor makers back into 16:10 or something?
Sogekihei - Monday, June 4, 2012 - link
The 1080p resolution may have become standard for televisions, but it certainly isn't so for computer monitors. These days the "standard" computer monitor (meaning, what an OEM rig will ship with in most cases whether it's a desktop or notebook) is some variant of 1366x768 resolution, so that gets tested for low-end graphics options that are likely to be seen in cheap OEM desktops and most OEM laptops (such as the integrated graphics seen here.)
The 1680x1050 resolution was the highest end-user resolution available cheaply for a while and is kind of like a standard among tech enthusiasts - sure, you had other offerings available like some (expensive) 1920x1200 CRTs, but most people's budget left them sticking to 1280x1024 CRTs or cheap LCDs, and if they wanted to go with a slightly higher quality LCD, practically the only available resolution at the time was 1680x1050. A lot of people don't care enough about the quality of their display to upgrade it as frequently as performance-oriented parts, so many of us still have at least one 1680x1050 lying around, probably in use as a secondary or for some even a primary display, despite 1080p monitors being the same cost or lower price when purchased new.
Beenthere - Thursday, May 31, 2012 - link
I imagine with the heat/OC'ing issues with the trigate chips, Intel is working to resolve Fab as well as operational issues with IB and thus isn't ramping as fast as normal.
Fritsert - Thursday, May 31, 2012 - link
Would the HQV score of the HD2500 be the same as the HD4000 in the Anandtech review? Basically would video playback performance be the same (HQV, 24fps image enhancement features etc.)?
A lot of processors in the low power ivy bridge lineup have the HD2500. If playback quality is the same this would make those very good candidates for my next HTPC. The Core i5 3470T specifically.
cjs150 - Friday, June 8, 2012 - link
Also, does the HD2500 lock at the correct FPS rate, which is not exactly 24 FPS? AMD has had this for ages but Intel only caught up with the HD4000. For me it is the difference between an i7-3770T and an i5-3470T.
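For anyone wondering why the exact rate matters, a quick worked number (film is mastered at 24/1.001 fps):

\[
\frac{24000}{1001} \approx 23.976\,\mathrm{fps},\qquad
24.000 - 23.976 = 0.024\,\mathrm{fps} \;\Rightarrow\; \text{one repeated frame roughly every } \tfrac{1}{0.024} \approx 42\,\mathrm{s}
\]

That periodic judder is exactly what a proper 23.976 Hz output mode avoids.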
Affectionate-Bed-980 - Thursday, May 31, 2012 - link
This is a replacement of the i5-2400. Actually the 3450 was, but this is 100MHz faster. You should be comparing HD2000 vs HD2500 as well, as these aren't top tier models with the HD3000/4000.
bkiserx7 - Thursday, May 31, 2012 - link
In the GPU Power Consumption comparison section, did you disable HT and lock the 3770k to the same frequency as the 3470 to get a more accurate comparison between just the HD 4000 and HD 2500?
shin0bi272 - Friday, June 1, 2012 - link
any gamer with a good quad core doesn't need to upgrade their cpu. Who's going to spend hundreds of dollars to upgrade from another quad core (like let's say my i7 920) to this one for a whopping 7 fps in one game and 1 fps in another? That sounds like something an apple fanboy would do... oh look the new isuch-and-such is out and it's marginally better than the one I spent $x00 on last month, I have to buy it now! no thanks.
Sogekihei - Monday, June 4, 2012 - link
This really depends a lot on what you have (or want) to do with your computer. Architectural differences are obviously a big deal, or else instead of an i7-920 you'd probably be rocking a Phenom (1) x4 or Core2 Quad by your logic that having a passable quad core means you don't need to upgrade your processor until the majority of gaming technology catches up.
Let's take the bsnes emulator as an example here, it focuses on low-level emulation of the SNES hardware to reproduce games as accurately as possible. With most new version releases, the hardware requirements gradually increase as more intricate backend code needs to execute within the same space of time to avoid dropping framerates; being that these games determined their running speed by their framerate and being sub-60 or sub-50 (region-dependent) means running at less than full speed, this could eventually be a problem for somebody wanting to use such an accurate emulator. From what I've heard, most Phenom and Phenom II systems are very bogged down and can barely get any games running at full speed on it these days and from my own experience, Nehalem-based Intel chips either require ludicrous clock speeds or simply aren't capable of running certain games at full speed (such as Super Mario RPG.) Obviously in cases such as this, the performance increases from a new architecture could benefit a user greatly.
Another example I'll give is based on the probability through my own experiences dealing with other people that the vast majority of gamers DO use their rigs for other tasks too. Any intensive work with maths, spreadsheets, video or image editing and rendering, software development, blueprinting, or anything else you could name that people do on a computer nowadays instead of by hand in order to speed the process will see massive gains when moving to a faster processor architecture. For anybody that has such work to do, be it for a very invested-in hobby, as part of a post-secondary education course, or as part of their career, the few hundred dollars/euros/currency of choice it costs to update their system is easily worth the potentially hundreds or thousands of hours per upgrade cycle they may save through the more powerful hardware.
I will concede that in today's market, the majority of gaming-exclusive cases don't yield much better results from increasing a processor's power (usually being GPU-limited instead) however that's a very broad statement and doesn't account for things that are heavily multithreaded (like newer Assassin's Creed games) or that are very processor-intensive (which I believe Civilization V can qualify as in mid- to late-game scenarios.)
There will always be case-specific conditions which will make buying something make sense or not, but do try to keep in mind that a lot of people do have disposable income and will very likely end up putting it into their hobbies before anything else. If their hobbies deal with computers they're likely going to want to always have, to the best extent they can afford, the latest and greatest technology available. Does it mean your system is trash? Of course not. Does it mean they're stupid? No moreso than the man that puts $10 a week into his local lottery and never wins anything. It just comes down to you having different priorities from them.
The only other thing I want to address is your stance on Apple products. Yes the hipsters are annoying, but you would likely lose the war if you wanted to argue on the upgrade cycle users take with Mac OSX-based computers. New product generations only come about once a year or so and most users wait 2-3 generations before upgrading and quite a few wait much longer than the average Linux/Windows PC user will before upgrading. The ones that don't wait are usually professionals in some sort of graphic arts industry (such as photography) where they need the most processing power, memory, graphics capabilities, and battery life possible and it's a justified business expense.
CeriseCogburn - Monday, June 11, 2012 - link
People usually skip a generation - so from i7 920 we can call it one gen, with SB being nearly the same as IB, so you're correct.
But anyone on core 2 or phenom 2 or athlon 2x or 4x, yeah they could do it and be happy - and don't forget the sata 6 and usb 3 they get with the new board - so it's not just the cpu with IB and SB - you get hard drive and usb speed too.
So with those extras it could drive a few people in your position - sata 6 card and usb 3 card is half the upgrade cost anyway, so add in pci-e 3 as well. I see some people moving from where you are.
ClagMaster - Saturday, June 2, 2012 - link
The onboard graphics of the Ivy Bridge processors was never seriously intended for playing games. It is intended to replace chipset graphics to support office applications with large LCD monitors. And it adds transcoding capabilities.
@Anand : If you want to do a more meaningful comparison of graphics performance for those that might be doing gaming, why not test and compare some DX9 games (still being written) or titles available 5 years ago? Real people play these games because they are cheap or free and provide as much entertainment as DX10 or DX11 games. Frame rates will be 60fps or slightly better. Or will your sponsors at nVidia, AMD or Intel not permit this sort of comparison?
It's ridiculous to compare onboard graphics to discrete graphics performance. A dedicated GPU, optimized for graphics, will always beat an onboard graphics GPU for a given gate size.
The Ivy Bridge graphics (performance/power consumption), if I interpret the comparisons presented here correctly, is also inefficient compared to the processing capabilities of a discrete graphics card.
vegemeister - Wednesday, June 6, 2012 - link
As you mentioned, I'd like to see some mention of the 2D performance. I use Awesome WM on a 3520x1200 X screen, and smooth scrolling can sometimes get choppy with my Graphics My Ass GPU.
I'd like to upgrade my Core2 duo, but I'm not sure whether the HD2500 graphics in this chip will suffice, or if I need to be looking at higher end CPUs. I don't really care about the difference between shitty 3D and ho-hum 3D.
P39Airacobra - Tuesday, July 1, 2014 - link
That's a shame that they still sell the GT 520 and GT 610 and the ATi 5450. When an integrated GPU like the HD 2500 outperforms a dedicated GPU, it's time to retire them from the market. I bought a 3470 and I am running a R9 270 with 8GB of 1600 Ripjaws. I tried out the HD 2500 on the chip just to see how it would do. It honestly sucked, but for videos and gaming on very low settings it works; it actually surprised me. But I don't think I could ever stand to have an integrated GPU. What's the point in buying an i5 if you are only going to use the integrated gpu? It does not make sense; you may as well keep your old P4 if you are not going to add a real GPU to it. This is why I don't understand the point of an integrated GPU inside a high end processor.