Intel has something impressive in the works with Broadwell (at least on paper). I can't wait to get a Broadwell based Surface Pro. Assuming that Microsoft improves an already impressive hardware design from the sp3, the Broadwell iteration will likely be my next computer purchase.
I have a feeling SP4 will be fundamentally the same design as SP3 save for minor tweaks and improvements. SP3 was clearly designed for a processor with the kind of power profile Broadwell is set to deliver rather than the current Haswell profile. It will be interesting to see which set of SKUs Microsoft will put in the SP4, Core M or Broadwell ULT. Core M has a number of obvious benefits for power and area efficiency, but will it be powerful enough for their market, with some features reduced from Haswell and Broadwell ULT?
Pure speculation, but I think Intel might already be giving MS premium bins of Haswell for SP3, because SP3 is the only device to date to actually show off the ability to run premium Intel CPUs in a tablet format. Sure, MBA looks great, but SP3 took it to the next level.
That said, I doubt that MS will use Core M in SP4, for the same reason we don't have Haswell-Y in SP3 (at least at the high end). It will probably be a step back in processing power to use one.
Do we know what wattage the Broadwell ULT and Core M chips will be targeting? A 15W TDP is clearly too high for the SP3 to handle, so moving all SP4 chips to 11.5W like the current Haswell-Y looks quite plausible at the moment; it just seems to be a matter of which version of Broadwell will have the 11.5W TDP.
15W is only a problem in the SP3 for people who use it like a high performance computer (24x7 full-load applications), but for general purpose use it barely warms up. We have people running Lightroom 8 hours a day on these things, and like the Surface 2s (which I still have) they never got "hot" or "loud".
That said, someone in the office infected their SP3 with some malware a few weeks ago (they literally owned the tablet not even 24 hours) and when they handed it to me, it was VERY hot with the fans whirring. Some 800KB task was using 100% of their CPU doing who knows what... at first I thought it was Cryptolocker but it turned out to be retrying a network connection. This was an i5 model, however, and it didn't seem to be throttling. The i3 will presumably run cooler, even at the same TDP.
What people need to keep in mind is these are mobile devices.
agreed, Broadwell and Skylake will be vast improvements to PCs in general. Intel's Broadwell-Y announcement is all about "small, cool, efficient" while the recent FX-9590 seems more about "big, hot, gluttony." Similar to the David vs Goliath story, the interesting part was the small one besting the big one; ironically, Intel is the bigger company. Hopefully AMD's new A1100 pans out, as I don't want another Comcast, Microsoft, De Beers or Luxottica.
well, if amd was as aggressive as intel in shrinking dies or what have you, then an AMD FX chip would probably be toe-to-toe with an intel i7-4930k or whatever the 6-core enthusiast intel chip is labeled. and not even die shrinks, but also aggressive in producing a $500 cpu. imagine that. and you'd probably see a10-7850k performance in a laptop by now. but AMD seems content to sit back and let the other company do all the work, creating a path. as long as AMD doesn't completely die out, it's fine. we just need an alternative and AMD is the only one. so, go AMD. don't worry about broadwell. build it and we will come. be a niche. convert future x99 users to a future AMD product. and start from there.
For one, die shrinks cost money... for fab contracts, man-hours, research, and possibly buying technology from other companies such as IBM.
AMD can't aggressively shrink dies anyway; they are at the mercy of fabrication companies like TSMC and Global Foundries, so what they can produce is limited to what those fabs provide. Intel has always been ahead of the industry in fabrication, so the only way AMD can beat Intel is through something groundbreaking (like moving away from silicon?) or if Intel drops the ball, like they did with Netburst. Or AMD buys a fab company that is far ahead of Intel, which simply isn't going to happen.
Otherwise they can only compete on price and using an older more mature fabrication process allows them to do just that as the chips are much cheaper to produce, they just need to provide "Good enough" performance to mostly stay relevant, which the FX doesn't really do.
well, an fx-8350 is toe-to-toe with an i7-2600k, which is no slouch even today. and comparing the fx-8350 with today's i7-4770k would be a little unfair since the 4770k is 22nm while the 8350 is at 32nm. and we're not even considering software optimizations from the OS and/or programs that are probably bent towards intel chips due to their ubiquity.
so, i think, you're wrong that the fx-8350 doesn't provide good enough. i have both i7-3770k oc'd to 4.1 ghz and an fx-8320 at stock and the amd is fine. it's more than good enough. i've ripped movies using handbrake on both systems and to me, both systems are fast. am i counting milliseconds? no. does it matter to me if the fx-8320 w/ lets say amd r9-290 has 85 fps for so and so game and an i7-4770k w/ the same gpu has a higher fps of 95, let's just say? i don't think so. that extra 10 fps cost that intel dude $100 more. and 10 extra frames with avg frames of 85-95 is undecipherable. it's only when the frames drop down below 60 does one notice it since most monitors are at 60 hz.
so what makes the fx not good enough for you again? are you like a brag queen? a rich man?
Not fair to compare against a 22nm from Intel? Bogus, I can go to the store and buy a 22nm Intel so it should be compared against AMD's greatest. An i5-4670K matches or exceeds the performance of even the FX-9590 in all but the most embarrassingly threaded tasks while costing $50 more. Cost to operate the machine through the power bill makes up for that price difference at a fairly standard 12¢ per kWh when used heavily 2 hours per day for 4 years, or idling 8 hours per day for the same 4 years.
Your argument for gaming with the 8350 being good enough is weak too when the $10 cheaper i3-4430 keeps up. Or spend $125 less to get a Pentium G3258 AE, then mildly overclock it to again have the same good-enough gaming performance if >60FPS is all that matters. The i3 and Pentiums are ~$70 cheaper yet when power use is counted again.
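To put rough numbers on the power-bill point above, here is a quick sketch; the ~136W load delta, usage pattern, and 12¢/kWh rate are assumptions for illustration, not measured figures:

```python
# Back-of-envelope electricity cost difference (all inputs are assumptions).
def extra_energy_cost(delta_watts, hours_per_day, years, dollars_per_kwh=0.12):
    """Extra running cost, in dollars, of a part that draws delta_watts more."""
    kwh = delta_watts / 1000 * hours_per_day * 365 * years
    return kwh * dollars_per_kwh

# Example: ~136W higher draw under load (a 220W-class FX vs. an 84W-class i5),
# 2 hours of heavy use per day for 4 years at $0.12/kWh.
print(round(extra_energy_cost(136, 2, 4), 2))  # ~47.65 -- roughly the ~$50 price gap
```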
well, if a pentium g3258 is good enuff for gaming, then so is an fx-8350. whaaaaaat? omg we know intel is king. i acknowledge and understand that. intel rules. but, amd is not bad. not bad at all is the point i'm trying to make.
first off you are assuming a lot and not bothering to check any published benchmarks out there so,
1. The 8350 isn't even equal to a 2500 i5, let alone a 2600 i7.
2. 32nm vs. 22nm means nothing at all when comparing raw performance in a desktop. It will limit the thermal ceiling, so in a laptop the higher-nm chip will run hotter and therefore be unable to hit higher clocks, but in a desktop it means nil.
3. Handbrake ripping relies on the speed of the DVD/Blu-ray drive; Handbrake transcoding relies on CPU performance, and the 8350 gets spanked there by a dual-core i3, not by milliseconds but by tens of seconds. An i5 gets to the level of minutes, an i7 more so.
4. Let's say you're pulling framerates for an R9 290 out of somewhere other than the ether... the reality is an i5 is faster than the 8350 in almost any benchmark I've ever seen, by roughly 15% overall. In certain games with lots of AI you get crazy framerate advantages with an i5 over the 8350, things like Rome Total War and Starcraft 2 and Diablo 3 etc...
i'll just say fx8350 isn't good enough for me and i'm certainly not a rich man. system build cost for what i have vs. what the 8350 system would have run was a whopping $65 difference
#3 is B.S. a dual-core i3 can't rip faster than an fx-8350 in handbrake.
#4 the r9-290 was an example to pair a fairly high end gpu with an fx-8350. a fairly high end gpu helps in games. thus, pairing it with an fx-8350 will give you a good combo that is more than good enough for gaming.
#2 22nm vs. 32nm does matter in desktops. the fx-8350 is 32nm. if it goes to 22nm, the die shrink would enable the chip to either go higher in clockspeed or lower its tdp.
u sound like a benchmark queen or a publicity fatso.
oh and #1--i am not saying the fx-8350 is better than the i7-2600k. i said "toe-to-toe." the i5-2500k can also beat the fx-8350 b/c of intel's IPC advantage. but, i think the reasons for that are programs not made to be multithreaded and not making use of the fx-8350's 8 cores to their potential. since amd trails intel in IPC performance by a lot--this means that a 4-core i5-2500k can match it or sometimes even beat it in games. in a multithreaded environment, the 8-core fx-8350 will always beat the i5-2500k. although it might still trail the 4-core + 4 fake cores i7-2600k. just kidding. lol.
i said toe-to-toe with the 2600k, which means it's "competitive" with an i7-2600k even though the AMD is handicapped by slower IPC and most programs/OSes not being optimized for multithreading. so, to be 10-20% behind in most benchmarks against an i7-2600k is not bad considering how programs take advantage of intel's higher IPC performance.
i'm sorry, is your argument here that the FX-8350 is better because it's inferior? because that's all i'm getting out of this. Of course a benchmark is going to take advantage of higher IPC performance. That's the point of a benchmark: to distinguish higher performance. The way you talk about benchmarks it's as if you think benchmarks only give higher numbers because they're biased. That's not how it works. The benchmarks give the i7-2600k higher scores because it is a higher performance part in real life, which is what anyone buying a CPU actually cares about. Not to mention the significantly higher efficiency, which is just an added benefit. Also, it's really hard to take you seriously when your posts make me think they're written by a teenage girl.
also, if the fps disparity is so huge btwn fx-8350 and say i5-2500k in games u mention like starcraft 2, then something is wrong with that game. and not the fx-8350. i actually have sc2 and i have access to a pc w/ an fx-8320. so i am going to do a test later tonight. my own pc is an i7-3770k. so i could directly compare 2 different systems. the only thing is that the amd pc has an hd5850 gpu, which should be good enuff for sc2 and my pc has a gtx680 so it's not going to be a direct comparison. but, it should still give a good idea, right?
i just played starcraft 2 on a pc with fx-8320 (stock clockspeed), 8GB 1600Mhz RAM, 7200rpm HDD and an old AMD HD5850 w/ 1GB VRAM. the experience was smooth. the settings were 1080P, all things at ultra or high and antialiasing set to ON. i wasn't looking at FPS since i don't know how to do it with starcraft 2, but, the gameplay was smooth. it didn't deter my experience.
i also play this game on my own pc, which is an i7-3770k OC'd to 4.1, 16GB 1600 MHz RAM, 7200rpm HDD and an Nvidia GTX680 FTW w/ 2GB VRAM, and i couldn't tell the difference as far as the smoothness of the gameplay is concerned. there are some graphical differences between the AMD GPU and the Nvidia GPU but that is another story. my point is that my experience was seamless going from the pc with the FX chip to my own pc with the 3770k.
to make another point, i also have this game on my macbook pro and that is where the experience of playing this game goes down. even in low settings. the MBP just can't handle it. at least the one i have with the older gt330m dGpu and dual-core w/ hyperthreading i7 mobile cpu.
so.... there.... no numbers or stats. just the experience, to me, which is what counts did not change with the pc that had the amd fx cpu.
So you must be feeling pretty darn stupid now, realizing that you never had to buy the more expensive 3770K (plus the gtx680), since from your point of view it "feels" the same as an 8350 or an 8320 with an HD5850... eh? Here's an idea, sell your 3770K/GTX680 system, and buy an FX8320/HD5850... you would still get some of your money back - if you can't do that, then at least just shut the hell up and stop deliberately spreading misinformation, you unethical hypocrite.
Everyone can have their own opinions. Leave wurizen alone. Heaven forbid someone say something you don't agree with. Put on your big boy pants intel fanboys.
Global Foundries was AMD's spun-off fab, and AMD still holds a share in Global Foundries. Global Foundries is working closely with Samsung's fab right now for a better manufacturing process; when they reach their goal in the nm race they can compete with Intel in die shrinks.
Actually StevoLincolnite and others, you are quite confused. Using larger node sizes is not "cheaper to produce"; larger node sizes are more expensive per chip. The reason AMD (Global Foundries) does not just jump down to the next node is that it requires a great deal of capital up front (they are relatively broke) and R&D time, which they are already behind on. Intel has proven more adept at shrinking process nodes and invests more in R&D ahead of time. This allows Intel to use the new node budget to partially increase performance, partially decrease power and partially decrease cost per chip. Cost per chip is the main driver for increasing density, and depending on the generation Intel has re-balanced the performance/power/cost relationship.
I'd hesitate to say that this will enable any kind of "real" gaming in the traditional sense. The iGPU isn't strong enough in this form factor, and AMD/NV draw too much power.
Not true, assuming Intel will integrate Iris Pro into its mobile CPUs. With the HD 5200 (iGPU) you can run just about any game at 768p with at least medium settings. Here are some benchmarks for that iGPU: http://www.anandtech.com/show/6993/intel-iris-pro-... I know the TDP there is 47W, but it includes a quad-core i7 clocked at 3.6GHz which itself draws some power. Intel could improve on that design, put it into its newest CPUs, and if they do, they sure will brag about it.
Half the problem with Intel Decelerators is with the graphics drivers, AMD and nVidia's drivers are gold plated in comparison, sure they are better these days than they used to be, but still hardly ideal.
In a tablet form factor you are going to encounter severe thermal throttling while gaming; the Anandtech article you linked doesn't have those cooling constraints. Just look at Anandtech's article on the Surface Pro 3's thermal throttling. Despite having a slightly better i5, it does worse in most heavy use scenarios.
I think this discussion is completely irrelevant. Very few are going for a graphical gaming tablet (unless it's poker or fruit slices, etc.). That field is dominated by Playstation or XBox on an HDTV. Tablets and convertibles are for Internet Surfing or business documents, the low end for Internet surfing only.
No, we use them for gaming, too. If you get a high-end convertible tablet, it's because a regular tablet doesn't do it and you don't need a similarly-sized laptop. This means that you'll either have a proper big gaming rig or you'll be relying on your convertible to game. I'm doing all my gaming on my Surface Pro 1, and I can play most games with my CPU at 700 MHz. (I mean, game needs span orders of magnitude. I'll sometimes switch it to full throttle when a game needs more power.)
Sure, but there's an obvious difference between gaming on a mobile device and sitting down in front of a TV. So talking about Xboxes and PlayStations is missing the point by a pretty large margin.
M stands for Mediocre. The main reason they are hitting those TDP levels is because they are reducing performance for those specific chips that they claim to use little power.
I also bet Broadwell will jump even higher in price than Haswell, which also jumped 40 percent over IVB, considering they're having significant yield issues.
Original prices. Haswell is now the "mass market" chip for Intel, which means high volume, and can afford lower prices than initially, while IVB is a low-volume chip now, so it won't be much cheaper. Watch for when Broadwell comes out. It won't be just 10 percent more expensive than Haswell.
I believe Krysto was referring to the jump in prices from IB to HW, and not SB to IB. But in either case, I'll have to disagree with Krysto. Prices seem to have decreased over the years if anything. Don't forget, because of IPC gains you are paying the same for more performance. 3570 = $210, 4590 = $200.
It all depends on form factor. If you want a tiny tablet, this is the only way to do it. Or do you like the Tegra K1, which draws 12W under full GPU load??? Come on, put it in a slim convertible if you can.
"and as we’ve seen with ARM based tablets so far they form a market that has continued to grow and continued to erode the x86 laptop market that Intel has dominated for so long."
Is this your assumption or is it a fact? If the latter is true, can you provide some references?
Gartner regularly tracks PC unit sales, and tablets are regularly cited as a factor. On a more micro level, I know several people who have reduced their PC usage over the last couple of years (and delayed PC replacements) due to their tablet use.
Those weren't actual numbers, they were industry guesses. The lesson is not to trust pundits and "Industry analysts". If you can wait for proper news to be released, it's better than rumours.
Actually, the tablet market as a cause for dropping PC sales is only half or less than half of the story. Logically speaking, most consumers find their 2-4 year old machines sufficient for their "productive" needs. Unlike previous years, PCs are living far beyond their intended years of services, and consumers are in no dire need for newer, faster components. Mind you, hardware isn't getting that much faster with each iteration (relatively speaking), and a simple SSD upgrade and/or more RAM would improve performance significantly that a whole hardware upgrade becomes less appealing (same for other components). It's mostly about convenience, connectivity and mobility for consumers these days (cheap mobile media consumption) that's why the tablet market "appear" to have affected PC sales. Those who find a tablet replacing their fully fledged PC didn't need a PC in the first place.
How on earth is TurboBoost cheating? Is it cheating to include a feature that actually does result in a CPU that allow itself to do short sprints in moments where it is needed the most in order to make a device "feel" fast (i.e. respond quickly to events generated by the user)?
Can't speak for BDW, but both HSW and Silvermont usually stay at or very close to max turbo for _very_ long; a Z3770 sitting inside a 10" tablet can run at 2.3-2.4GHz on all 4 cores for several minutes.
TurboBoost is not cheating but optimal demand-based power management.
All modern ARM CPUs are also doing the same thing (scaling down frequency when idle, up when busy). I am quite sure it is the case with the AMD CPUs, too - it is just that I do not have any to check.
The only three things that matter are a) price b) performance c) power draw (real that is measured, not marketing terms such as TDP, SDP, etc.).
TDP isn't a marketing term, it's the wattage of heat dissipated by the processor. Wattage is a bit of a weird word for it, but Watts are joules per second, and a joule is the amount of energy needed to heat water a certain amount.
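A quick worked example of that watts/joules relationship (the numbers are purely illustrative):

```python
# 1 W = 1 J/s; water's specific heat is ~4.186 J per gram per degree C.
tdp_watts = 15                    # a 15W-TDP part running flat out (illustrative)
energy_joules = tdp_watts * 3600  # heat dissipated over one hour: 54,000 J
grams_of_water = 1000             # one litre of water
temp_rise_c = energy_joules / (4.186 * grams_of_water)
print(energy_joules, round(temp_rise_c, 1))  # 54000 J, ~12.9 degree C rise
```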
TDP is "Total Dissipated Power" and as used in describing processors refers to the maximum instantaneous power that the processor will dissipate. Watt, being a unit unit of power, is the only SI unit suitable for specifying TDP. Since nearly all of the energy expended by a processor is due to heat transfer, a Watt is in this case essentially a measure of the rate of heat transfer, and TDP is the rate of heat transfer required when operating the processor at or near its maximum operating temperature.
@Jaybus Nope. Please don't correct people with false info. TDP is Thermal Design Power, and it is NOT "peak" power. It is used for marketing as mkozakewich said above as the definition of TDP measurement varies from company to company.
Oops! I read mkozakewich comment backwards. It is a marketing term in a sense... It is used to design cooling systems but it gets thrown around by marketing groups all the time since TDP limits how small a device can be.
With all these reductions in power use on a per core basis and a stagnation of clock speeds we very well could see a quad core i7-5770k with a 65W TDP. I hope Intel plans on bumping up their mainstream high end SKUs to six core, the desktop market doesn't need maximum power use that low.
Yes this worries me too. I think we will see yet another pointless desktop release, with hardly any improvements (once you consider the bad overclocking).
I still don't see a desktop Broadwell replacing my 2500k, which runs low(ish) heat at 4.3GHz. What I would love to see is a 5.5GHz Broadwell monster; otherwise I will be skipping Broadwell, just like Haswell and IB before it.
Surely there has to be some light on the horizon for us gamers?!
And someone in your position also of course has the double irony of still having more performance headroom if they're willing to explore better cooling, etc. An H80i would let your 2500K run at 5GHz no problem, assuming your particular sample isn't naturally limited in some way (actually, even an old TRUE and two typical fans would be fine, something I can confirm as I've done it with several different 2700Ks, never mind a 2500K, so I know the heat isn't an issue). Thus, except for threaded workloads, an upgrade would have to be a lot better than what you already have in order to be worth bothering with, except if you wanted something new due to newer tech such as PCI Express, M.2, etc.
Anyway, this is why IB was so bad with its default metal cap setup, ie. Intel made SB too good which resulted in lots of 2500K owners just not seeing the point in upgrading. Thing is, they've still not boosted performance enough to make upgrading worthwhile for most SB users (the exception of course being non-K owners).
My 2700K runs at 5.0. Takes me 3 mins to set this up on an ASUS M4E, ludicrously easy to do. HW at 4.5 is about the same or a bit quicker, but much more complex to do, given how many samples won't go that high, and for gaming it makes no difference at all. I suppose the 4790K can help on the oc front a bit, but the cost has put me off so far, used 2700Ks are so much cheaper. I want to build a 4K gaming system next, but that'll be X99 instead I think (don't see the point of mainstream chipsets for 4K gaming, surely with multiple GPUs the limited PCIe lanes is going to hold back performance eventually).
I wish I had your luck with SB. My 2500k will only do 4.5 GHz stable with 1.365 vcore (in reality, monitoring software reports 1.391~ vcore with my motherboard's way of handling vdroop). If I could go any higher, I'd have to dump a load of voltage into it that I'm just afraid of doing. Though this may be a motherboard problem as it's given me problems before, but that still wouldn't account for a 500 MHz variance. I guess I just got a bad SB. :/
They want you to go up to LGA2011 for the solder-based TIM. And only because of die-cracking issues (I found the papers covering that) on smaller dies that force them to use boring old polymer TIM.
People are cool with high TSkins on their devices. I know my phone passes 35°C easily if I load it up, and I'm fine with that. Then again, I'm completely fine with a 60°C idle, because that's where ICs like to live...
I'm surprised they didn't move the PCH to 22nm. Relatively low power consumption or not, they pushed everything else to the wall to get Core M's TDP as low as possible and between doing custom designs for it anyway and soft sales meaning they've got the spare 22nm capacity available I don't see any obvious reason why they couldn't've done so.
The process nodes are very expensive to produce, so they need to get as much life out of them as possible. Also, a new(er) process isn't going to have a high enough yield. 22 Might have worked, but I bet the older process gave them a better bang for their buck.
I was thinking of buying one of these, but it sounds like the focus is still on TDP over all else, so it looks like waiting for Skylake is the plan for anyone with Sandy Bridge or newer.
i think AMD is your hope. if you don't have a sandy bridge cpu or are in phenom land, an FX series cpu is a great cpu that will hold one over until AMD updates their desktop FX series line of cpus. i mean a 990FX mobo has all you need. i think pci 2.0 is still adequate for today's video cards so that doesn't really matter. even though on paper, intel has pci 3.0 and usb 5.0 (just kidding) and thunderbolt 9.9, they're superfluous and don't make you game better. i mean, an fx-8350 with a decent gpu should give one smooth framerates. i mean who cares if in the benchmark, an fx-8350 with so and so gpu will run so and so game at, say, 120fps, while an intel chip will run it at 190fps? 120 fps is very good, and that person who has an intel chip running 190 fps probably paid hundreds of dollars more for their system and i bet they wouldn't even be able to decipher those 70 or so extra frames. i mean, it's not total fps that matters but the average frame rate and i think an fx-8350 will deliver that. it's a beast of a cpu. and broadwell? who cares. 90% of the words in the article above are buzz tech fancy words to get gadget heads salivating and stock wigs greasing their palms.
Yeah, sure, and that's exactly what everyone was saying back when we were waiting for the follow-on to Phenom II; just wait, their next chip will be great! Intel killer! Hmph. I recall even many diehard AMD fans were pretty angry when BD finally came out.
Benchmarks show again and again that AMD's CPUs hold back performance in numerous scenarios. I'd rather get a used 2700K than an 8350; it leaves the latter in the dust for all CPU tasks and is far better for gaming.
Btw, you've answered your own point: if an 8350 is overkill for a game, giving 120fps, then surely one would be better off with an i3, G3258 or somesuch, more than enough for most gaming if the game is such that one's GPU setup & screen res, etc. is giving that sort of frame rate, in which case power consumption is less, etc.
I really hope AMD can get back in the game, but I don't see it happening any time soon. They don't have the design talent or the resources to come up with something genuinely new and better.
an fx-8350 isn't holding anything back. come on, man. are you like a stat paper queen obsessor or something? oh, please. an fx-8350 and an amd r9 290 gpu will give you "happy" frame rates. i say happy because i know the frame rates will be high enough. more than good enough, even. will it be lower than an i7-4770k with an r9 290? maybe. maybe the fx-8350 will avg 85 fps on so and so game while the i7-4770k will avg 90 fps. boohoo. who cares about 5 more frames.
also, while you mention an i3 as a sufficient, viable alternative to an fx-8350, remember that the cost will probably be about the same. an fx-8350 is like $190. maybe the i3 is 20 dollars less. but, here's the big but: an i3 is not as good as an fx-8350 at video editing stuff and photo editing stuff if one would like to use their pc for more than just games. an fx-8350, while not as power efficient as an i3 (but who cares since we are talking about a desktop), literally has more bang for the buck. it has more cores and is faster.
amd will get back in the game. it is just a question of when. an fx-8350 is already toe-to-toe with an i7-2600k, which is no slouch by today's standards. so, amd just needs to refine their cpus.
as for talent? amd came up with x64, or amd64 before intel. intel developed their own x86-64 later.
the resource that intel has over amd is just die shrinking. that's it. architecturally, an fx chip or the phenom chip before it seems like a more elegant design to me than intel chips. but that's subjective. and i don't really know that much about cpu's. but, i have been around since the days of 286 so maybe i just see intel as those guys who made 286 which were ubiquitous and plain. i also remember cyrix. and i remember g4 chips. and to me, the fx chip is like a great chip. it's full of compromises and promises at the same time.
i5-2500k beats it just as badly and actually sells for less than the 8350 used on ebay. Games love single threaded power and the 8350 just doesn't have it.
the games they have in that comparison are starcraft 2 and dragon age. 47 fps at 768 p for 8350 looks suspect on starcraft 2. what gpu did they use?
it's not way worse as i say. omg.
i have an i7-3770k oc'd to 4.1Ghz and an FX-8320 at stock. both can run cod: ghosts and bf3. haven't tested my other games. might do the starcraft 2 test tomorrow. i don't have the numbers nor care. what ppl need to realize is the actual game experience while playing games and not the number. is the game smooth? a cpu that can't handle a game will be very evident. this means it's time to upgrade. and there are no fx cpus from amd that can't handle modern games. again, they will trail intel, but that is like one car going at 220mph, so that car wins, while the other car going at 190mph will lose, but realistically the experience of going at 190mph will still be fast. the good thing is that cpus don't race each other unless you care about benchmarks. but, if you look past the benchmarks and just focus on the experience itself, an fx series cpu by amd is plenty fast enuff.
so your response to people who are disappointed that broadwell is focused more on TDP instead of performance is to buy an AMD cpu with even lower performance?
well, they don't have to get the fx-9590, which has a 2008-server-like (or gpu-like) tdp of 220 watts. there is a more modest tdp of 125w with the fx-8350. all overclockable. seems like a good cpu for tinkerers, pc enthusiasts, gamers and video editors. i don't even think it's a budget cpu. there are 6-core and 4-core variants which are cheaper. i am also not saying that an fx-8350 is like the best cpu, since it's not and falls way down in the benchmark charts. but, it's not a bad cpu at all. it gets the work done (video editing) and lets you play games (it's a modern cpu after all) even though it's sort of 2 yrs old already. the 990FX chipset is an even older chipset. there's something to be said about that and i think i'm trying to say it. in light of all the news about intel, which we are guaranteed to get every year with each tick and tock... there is that little AMD sitting in the corner with a chipset that hasn't been updated for yrs and an 8-core cpu that's remarkably affordable. the performance is not that low at all. i mean, video editing with it or playing games with it doesn't hamper one's experience. so, maybe one will have to wait a couple more minutes for a video to render in a video editing program versus say an i7-4790k. but, one can simply get up from one's chair and return, instead of staring at how fast their cpu renders a video on the screen.
know what i'm saying?
so, yeah. an fx-8350 with an old 990fx mobo versus intel's upcoming broadwell cpus with z97 chipsets and all the bells and whistles: productivity on either one will probably be similar. also, most video editing programs now will also leverage the gpu, so an old fx-8350 w/ a compatible gpu will get help from the gpu with rendering....
i guess it's like new doesn't mean anything now. or something. like m.2 sata and pci 3.0, which intel chipsets have over amd, are kinda superfluous and don't really help or do much.
Oh yes, Skylake. Intel has given 5% IPC improvements for every generation since Nehalem, but now Skylake is going to change everything? If you're one of the ten people on the planet who can actually get value out of AVX-512 then, sure, great leap forward. For everyone else, if you were pissed off at IB, HSW, BDW, you're going to be just as pissed off with Skylake.
No, the interest in Skylake is for all the non-CPU speed things promised with it. PCIe 4.0 and a bump from 16 to 20 CPU lanes (for PCIe storage) are at the top of the list. Other expected, but AFAIK not confirmed, benefits include USB3.1 and more USB3.x on the chipset than the current generation. We should have consumer DDR4 with Skylake too; but that's not expected to be a big bump in the real world.
Actually, apart from power-users I fail to see any tangible improvements in performance of modern CPUs that matter to desktop/notebook usage, Intel or otherwise.
In the mobile space, it is improvements in GPU which mattered, but even that will eventually flatten once some peak is reached since graphics improvements on 4" / 5" screen can only matter to wide audiences up to some point.
However, there are surely enough customers that do look forward to more power - this is workstation and server market. Skylake and its AVX512 will matter to scientists and its enormous core count in EP (Xeon) version will matter to companies (virtualization, etc.).
Standard desktop, not so much. But, then again, ever since Core 2 Quad 6600 this was the case. If anything, large-scale adoption of SSDs is probably the single most important jump in desktop performance since the days of Conroe.
I find the reduction in die thickness to be a big deal. Maybe this will prevent temperatures from getting out of control when the cpu core area gets cut in half for 14nm. High power 22nm cpus already easily hit a 30°C temperature difference between the cpu and heatsink.
PC sales are down mostly because people can keep their systems longer due to the lack of innovation coming from Intel on desktop chips and the lack of utilizing the current CPU technology by software developers. They could be so much more, if only developers would actually make use of the desktop CPU capabilities for things such as a voice command OS that doesn't need to be trained. Intel would then have a reason to produce more powerful chips that would trigger more PC sales.
As it is, the current processor generation is less than 10% faster clock for clock compared to three generations ago. A great many things aren't any faster at all. Know what? It doesn't even matter because nothing uses that much power these days.
Tablets and smartphones can't take the place of full PCs for most people. Their screens are just too small. Perhaps the younger generations prefer the small form factors right now, but give them a little time, and their eyes won't let them use such things. I can see the move to laptops, especially with 14-15" screens, but trying to show the same content on a 10" screen is just near unusable, and a 5" smartphone screen is just downright impossible. However, desktop PCs still have their place, and that's never going to change.
This push by "investors" for the tablet and smartphone market is just asinine. Broadwell isn't going to help sales all that much. Perhaps, they might sell some more Intel based tablets, but it won't be all that much of an improvement. Tablets have a niche, but it really isn't that much of one.
Tablets are a niche and not much of one? lol yea ok... well while you were asleep in a cave, over 195 million tablets were sold in 2013 between Android/Apple/Microsoft which is just shy of 80 million more than the previous year. World wide PC sales totaled 316M units, so we are talking nearly 2 tablets for every 3 PC's sold. Eh...small niche...
yeah, lots of people have them, but how much do they really use them? I have two, one Android and one Windows RT, and I only use them for reading books or for reading the web news while away from home. The Windows unit showed promise, since I could use it to run Office and terminal programs, but I ended up not using it at work anymore because it couldn't use a USB to serial adapter for talking to switches and raid arrays. It ended up being only half useful. They're nice to have for certain things, but they aren't as versatile as a PC. My parents own two, and two PCs, and they use the PCs far more. My older sister has one, and she barely uses it. Her 7 year old uses it to play games most of the time. My nephew has one, and he's only ever used it to read Facebook. It's a telling tale that everyone I've known who has one only has limited use for it.
Point taken, but if people are *buying* them, irrespective of whether they use them, then it doesn't really matter.
Besides, this whole field of mobile computing, smart phones, tablets, now phablets, etc., it's too soon to be sure where we're heading long-term.
Many people say the copout porting of console games to PCs with little enhancement is one thing that's harmed PC gaming sales. This may well be true. Now that the newer consoles use PC tech more directly, perhaps this will be less of an issue, but it's always down to the developer whether they choose to make a PC release capable of exploiting what a PC can do re high res, better detail, etc. Wouldn't surprise me if this issue causes internal pressures, eg. make the PC version too much better and it might harm console version sales - with devs no doubt eager to maximise returns, that's something they'd likely want to avoid.
BDW-Y is 82 mm^2. The PCH looks like it's about a third of that, so total is maybe 115 mm^2 or so. In comparison, Apple A7 is about 100 mm^2. A7 includes some stuff BDW-Y doesn't, and vice versa, so let's call it a wash in terms of non-CPU functionality. BDW-Y obviously can perform a LOT better (if it's given enough power, probably performs about the same at the same power budget). On the other hand it probably costs about 10x what an A7 costs.
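The area estimate above works out roughly as follows; the one-third PCH figure is an eyeball estimate from the comment, not a published number:

```python
bdw_y_cpu_mm2 = 82.0                  # Broadwell-Y CPU die, from the comment above
pch_mm2 = bdw_y_cpu_mm2 / 3           # assuming the PCH is roughly a third of the CPU die
total_mm2 = bdw_y_cpu_mm2 + pch_mm2   # ~109 mm^2, vs. ~100 mm^2 for the Apple A7
print(round(pch_mm2, 1), round(total_mm2, 1))
```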
Sure, let's also conveniently forget that Broadwell-Y benefits not only from 3D transistors but also from a two-generation node shrink compared to the A7. Now put the A7 on 14nm with 3D transistors... and let's see which does better.
This is the issue nobody seems to understand, not even Anand, or just conveniently ignored it when he declared that the "x86 myth is busted". At the time we were talking about a 22nm Trigate Atom vs 28nm planar ARM chip, with Atom barely competing on performance (while costing 2x more, and having half the GPU performance). Yet Anand said the x86 bloat myth is busted...How exactly?! Put them on the same process technology...and then we'll see if x86 myth is indeed busted, or it's still bloated as a pig.
Broadwell Y's CPU portion is on 14nm but the PCH is on 32nm.
Not to mention that Intel's 22nm process isn't that small compared to Foundries 28nm. 28nm Foundry is ~30% larger than Intel 22nm not 100%.
20nm Foundry is really a half node jump from 28nm considering density improves but performance much less. 16nm is another half node since density barely changes but perf improves.
The 'myth' was that x86 was a horrible power-sucking pig. It was shown that it was possible to at least get close to ARM processors.
Meanwhile, Intel's chips are NOT as low-performance as Apple's. The chip in Surface Pro 3 is about 2x-3x faster, and these should be about the same. With this year's Apple chip, imagine it to be 2x faster or so. Meanwhile, they'll probably both take the same amount of power. SoCs these days are running multiples of Watts. Even Intel Atoms used to take over 10, but now a full Intel Core computer can run under 6 Watts. It would be really interesting to run all these numbers and give a complete report on the state of mobile vs. notebook processing. They seem matched on power usage, but ARM chips are far cheaper and run at far less performance.
Finally, a long awaited fanless design. I don't care about thickness so much as about energy wasted on heat and on the fan to dissipate that heat. But... I have a 2008 MacBook Pro with a 2.4GHz Pentium M. If they achieved the fanless design by bringing frequency down to, say, 1.8GHz, I am not interested (given that IPCs are not that different for real world applications). For me to upgrade, I want it to reach 3GHz, even if for a second and in a single thread, when starting applications for example. Anything below that is not a noticeable upgrade, and below 2.2GHz or so will be a downgrade in practice. And the biggest problem for Intel is not how thick their processors are, it is Microsoft, with Windows 8 (and 8.1) being so unbelievably awful (yes, I have owned it for a while). Krzanich should call Nadella immediately and tell him to fire Larson-Green, or they both are going down.
You do realize this is for tablets and low power laptops (Chromebook/netbook style) only, right?
It's not coming to the MBP anytime soon, unless you're talking about something else entirely, like the MBA, which isn't going to get Core M either because it'd be too big of a regression in performance.
You forget about IPC. Last I checked, compared to a Core 2 CPU at equal clock speeds, a Sandy Bridge CPU is 50+% faster on average, and Haswell is a further 15+% faster on top of that, all the while using less power.
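Compounding those rough per-generation figures shows the cumulative gap (the 50% and 15% numbers are the approximate averages quoted above, not exact measurements):

```python
sandy_vs_core2 = 1.50     # ~50% faster per clock than Core 2 (rough figure from above)
haswell_vs_sandy = 1.15   # a further ~15% on top of that
print(round(sandy_vs_core2 * haswell_vs_sandy, 2))  # ~1.73x Core 2 per clock
```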
Since this is the FIRST core to be "fanless", they're probably squeezing a lot of stuff to make that work, and it probably still overheats. I wouldn't be too excited about it until we see how it does in actual devices, both from a power consumption point of view, but also a performance one (because if performance didn't matter, then we'd all just use ARM chips, no?).
It would be laughable if Denver, which already beats mainstream Haswell Celeron, would be in range of Broadwell Y in performance, but still more energy efficient and with much better GPU performance.
OK, I'm confused, so any help appreciated. I want to replace my old desktop replacement laptop with a Broadwell equivalent. For example, the Broadwell version of an HP Envy 17t. What flavor of Broadwell am I waiting for? The Y flavor? U? H? Something else? thanks.
I would ask the opposite: why do you need a replacement at all? What is it your current device cannot do, or does poorly? Once the newer products are out, find out which one solves those issues at the lowest cost. This focus on the jargon side of computing tech is the absolute worst aspect of how consumer computing has evolved since the 1980s.
I appreciate the thoughts, but I'm actually an ASIC designer, so I have no problem with lingo - I just am not informed regarding my specific question. And my current laptop is a Penryn 2008 laptop, so forgive me if we ignore the need question and stay focused on what version(s) of Broadwell are intended for mainstream (non-gaming) desktop replacement laptops. thanks!
You want the full HQ/MQ series for your next laptop. Those are the full quad-core-enabled machines, which is something you probably want for a high-performance machine. Since this is a gaming laptop, though, you may want to look into building a small mini-ITX desktop; they're comparable if your primary definition of "mobile gaming" is "drag computer to a LAN party/friend's place" rather than actually gaming on a train or similar.
If you can afford it, why are you dicking around with trying to save a few hundred dollars? If you want a serious machine, buy an rMBP with quadcore i7. If you want a light machine, buy an MBA. Then either stick Parallels on it (if you use OSX) or just run Boot Camp and never touch OSX, except maybe once every two months to update the firmware.
If you can afford it, life's too short to waste time trying to figure out which of fifty slightly different laptop PCs will suck less in their different hardware, different pre-installed crap-ware, different drivers. Just pay a little more and get something that most people consider works well.
As an ASIC designer, OSX is just a non-option. Hell, even Linux is rather hairy compared to Windows for ASIC stuff.
I personally like the Dell Precision line, but Lenovo Thinkpad W and HP Elitebook would also do. In a pinch, a ridiculously-specced Compal (in its various Sager rebrands) would also do, but IMO they're just not built anywhere close.
Thanks, Kjella. Upon further googling after reading your post, I learned that there apparently will be a mobile and deskop version of the H flavor, and it does look like the mobile H is the most likely for me. But that's not rumored to come out until May or June 2015, which is disappointing.
Even weirder, since the first desktop version of Broadwell will be available in PCs in May/June 2015, and since the first version of Skylake (rumored to be a desktop version) is rumored to be "available" 2H 2015, it seems Broadwell H desktop is slated for a very, very short product life. Similarly, is Broadwell H mobile also slated for an extremely short product life?
Perhaps I missed it, but it would be great if there were an Anandtech chart comparing Intel's definitions of "announced", "launched", "samples", "volume shipments", "available", and similar Intelese to figure out about how many months from each it is till Joe Consumer can buy the chip in question. I suspect these definitions and even the lingo can vary with each tick and tock, but some kind of cheat sheet guestimates would be great (and revised as better info arrives, of course).
To further clarify the need for a cheat sheet, I'm familiar with the timing of tick and tock for the last few years, but it seems that 2014/2015 at a minimum will diverge so much from the last few years that previous expectations add confusion rather than clarity.
Judging by the delay for BW, Skylake will probably be pushed back at least 6 months, if only to make up the R&D costs of BW. Then again, Intel wants that tablet market, so they might not either.
"Similarly, is Broadwell H mobile also slated for an extremely short product life?"
No it's not. Companies like Intel care about 5% profit differences, so having a short product life would make absolutely no sense. Broadwell isn't coming to "mainstream" desktops, only high-end enthusiast ones like the K series.
So they will all happily be a family like this:
- Skylake mainstream desktop
- Broadwell H/U/Y/K
Semiaccurate says that quadcores (which I take to mean H-series) will not be out until 11 months from now. (Makes you wonder WTF has happened to the Skylake timeline. They haven't yet admitted that that has even slipped, let alone by how much.)
I also fail to see how Intel will be able to keep their cadence with Skylake without either skipping the entire generation (obsoleting it in 6 months does not sound reasonable from a financial point of view) or delaying the Skylake introduction.
Also the fact that Intel decided to enable all 18 cores in the Haswell EP is telling IMHO. Initially, this was to happen only with BDW-EP, so it might not be impossible that Intel might just skip Broadwell for some segments and go with Skylake.
Those who know ain't talking, but I can observe the following:
- Delaying Skylake for financial reasons related to Broadwell is braindead. Broadwell development costs are sunk costs and Intel is, or should be, trying to maximize overall profits, not just a particular program's profits. Intel should release Skylake as quickly as possible when its yields hit target values, regardless of Broadwell delays, with two caveats:
- If the Broadwell yield difficulties also slowed down Skylake, then Skylake will likely be inherently delayed to some degree.
- If Intel screws up product planning such that they flood the market with Broadwell, then their customers might be very angry if they are stuck with that inventory upon a Skylake release.
My bet at this point? Broadwell H mobile will be a very short-lived product (about 6 months).
For the gpu, it is noteworthy that unlike nvidia and amd, the subslice block (at least before gen8) doesn't really have an inherent minimal size which cannot be changed without significantly altering the architecture. E.g. Gen7 (which is just about the same as Gen7.5) had subslice sizes of 6 (IvyBridge GT1), 8 (IvyBridge GT2) and even 4 (BayTrail). It is also quite interesting that everybody (nvidia since gk2xx and Maxwell, amd since even before GCN, notably their Northern Islands VLIW4 designs, intel since Gen8) has now ended up with the exact same ALU:TEX ratio (one quad tmu per 64 ALU lanes), though of course the capabilities of these tmus vary a bit (e.g. nvidia can do full-speed fp16 filtering, amd only half speed etc.)
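A rough sanity check of that common ALU:TEX ratio, using commonly quoted per-block figures (the exact lane and TMU counts here are assumptions and organizations differ in detail):

```python
# ALU lanes per quad TMU for one shader block in each architecture
# (approximate, commonly quoted per-block figures -- treat as assumptions).
blocks = {
    "NVIDIA Maxwell SMM":  (128, 8),      # 128 ALU lanes, 8 TMUs
    "AMD GCN CU":          (64, 4),       # 64 ALU lanes, 4 TMUs
    "Intel Gen8 subslice": (8 * 8, 4),    # 8 EUs x 8 lanes each, one 4-wide sampler
}
for name, (alu_lanes, tmus) in blocks.items():
    print(name, alu_lanes / (tmus / 4), "ALU lanes per quad TMU")  # 64 in every case
```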
In the fourth to last paragraph an intel driver dev says that broadwell graphics "dwarf any other silicon iteration during my tenure, and certainly can compete with the likes of the gen3->gen4 changes." I'm going to go with the guy who's actually developing the drivers on this.
Simple question: will all the new Intel desktop CPUs have integrated graphics? If the answer's yes, why would they waste the silicon area for those using discrete?
Because the people who obsess about discrete graphics are a RIDICULOUSLY small fraction of the purchasing public, a fraction which is utterly unwilling to accept this fact and the fact that CPU design is not targeted at the needs of high-end gamers.
Compared to the total Intel CPU market and compared to the cost of creating an IGPless CPU die for the mainstream socket it's entirely on the mark. If you want an IGPless design your only choice is to wait a year for the Xeon derived LGA2011 model; and the fact that LGA1366 launched as an enthusiast design well before the Xeon's did, but that Intel hasn't done the same for any other models shows that it didn't sell well enough to justify rushing one for us.
Small fraction is right. Projected worldwide PC hardware sales for 2015 is ~ $385B (Source: eTForcasts). Projected PC gaming sales (both hardware and software) is ~$20B (Source: Statista), less than 10% of total PC hardware sales alone. A 10% market niche is very, very small in the overall scheme of the PC market.
Nvidia, which has around 60% of the discrete GPU market, has a yearly revenue of around $4 billion. So you're looking at a total market of around $7 billion.
"Maybe not obsess, but to characterise the PC gaming market as ridiculously small, is pretty far off the mark...."
I think the original comment was fairly accurate; even in the PC gaming market there's a large proportion of people using Intel graphics. Looking at the current Steam survey results, 75% are using Intel processors and 20% overall are using Intel graphics, which means around 1 in 3 people with Intel processors on Steam are using the onboard graphics. That means even among the gaming market there are a lot of integrated GPUs in use, and that's just one small portion, as I'd expect most other areas to mainly be using integrated graphics.
There are workstation graphics cards but professionals using those are unlikely to be using consumer processors and the enthusiast/workstation processors do not have an integrated graphics card.
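The back-of-envelope math behind that Steam-survey share estimate, using the 75% and 20% figures quoted above:

```python
intel_cpu_share = 0.75    # share of surveyed Steam users on Intel CPUs (figure from above)
intel_igpu_share = 0.20   # share of all surveyed users on Intel integrated graphics
print(round(intel_igpu_share / intel_cpu_share, 2))  # ~0.27 of Intel-CPU users on the iGPU
```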
I have had Steam on my company laptop with just the internal GPU, just to take part in the sales campaigns etc. This makes my contribution 50:50 in terms of dGPU / iGPU, even though 100% of my gaming happens on the dGPU.
So....how do NVIDIA and ATI stay in business? Obviously many people use discrete cards. The fact you say "obsess" tells me you probably don't realize the massive performance difference, and it's not limited to gaming. CAD uses 3D.
Doesn't Intel make X-version CPUs that can be overclocked? The OC market is gonna be much smaller than dGPU, and they're already making a dedicated product for that.
Because they are using that anti-competitive tactic to drive out the discrete competition. They force OEMs to buy them bundled, so more and more people say "why should I pay twice for the GPU...I'll just get the Intel one".
It's a nasty tactic, Intel has been employing for years, and unfortunately it's working. But it's terribly uncompetitive.
It's akin to Microsoft bundling IE with Windows: "Why would I need to get another browser... I'll just use IE". That tactic WORKED for Microsoft. It only stopped working when they became lazy. But they could've held IE's 90 percent market share for a lot longer if they hadn't gotten lazy.
"we’ll still have to wait to see just how good the resulting retail products are, but there shouldn’t be any technical reason for why it can’t be put into a mobile device comparable to today’s 10”+ tablets. "
There may not be TECHNICAL reasons, but there are very definite economic reasons. People think of tablets as cheap devices, iPad at the high end, but the mass market at $350 or so. This CPU alone will probably cost around $300. MS is willing to pay that for Surface Pro 4; no one else is, not for a product where x86 compatibility is not essential. We'll see it in ultrabooks (and various ultrabook perversions that bend or slide or pop into some sort of tablet) but we're not going to see a wave of sub-$1000 products using this.
Nothing was said about cheap tablets in that quote, so I'm not sure why you're bringing up the price.
Not that I disagree with your point. Of course, by continuing to focus on premium priced parts, Intel is never going to gain a profitable foothold in the mobile market. Core M needs to be cheaper, not just lower power, to be interesting. Otherwise there's no reason to care. If you're paying for a $1000 device, why do you want something that's obviously going to be so performance gimped compared to Y-series Broadwells?
i would like to see ordinary 13" ultrabooks with broadwell-y. don't make it too slim and see what performance and battery life is like with a 4.5w cpu. if performance is high enough for everyday tasks, it would really be nice to have slim notebooks approach 20 hours of battery life in light usage scenarios.
but i guess companies will just use the low power cpus as an excuse to implement smaller batteries and 4k displays and we still won't get much more than 10h in best case scenarios...
I'd argue that it may well be too late for Intel to enter this market, unless they can deliver a major step change in performance compared to ARM. Right now the ARM-Android & ARM-iOS ecosystems are well established and humming along nicely. On top of which tablet sales in the developed world are slowing down. The developing world is still growing but in those regions, cost will be a key factor.
That leaves Intel targeting a market with an entrenched competitor, a set software ecosystems with no benefits from migrating to a new architecture (what do Apple, Google, Samsung, HTC, LG etc gain out of this?) and slowing hardware sales.
If Core M can deliver double the performance for the same power draw AND price, then sure, I can see a rush to migrate to it, otherwise what's the point?
"Core" chips will never EVER compete with ARM in its main market. The best Intel can do is try to compete in $700+ devices. Core is just not competitive on price. Not even close. Period.
Intel's only competition against ARM in the MOBILE market is Atom, and Nvidia's Denver is already TWICE as fast as that. Also Atom is twice as expensive as Denver, but Intel keeps subsidizing half the cost...for as long as they can, which probably won't be much longer.
This article feels like marketing drivel just listing point after point without any further explanation. I'd expect a little more in-depth analysis. Seriously a tick/tock chart?
This sounds like a distraction meant to divert attention from the fact that Intel is about a year late with 14nm. It seems that Moore's law will stop working quite soon. I would still bet on Intel being able to implement 10nm but that will likely be the end of the road for the current technology. And there is no clearly visible path beyond that. Not that progress will stop, but it may slow down to the pace we observe e.g. in car design and production rather than what we are used to expecting in chip design and production.
"Intel’s solution to this problem is both a bit brute force and a bit genius, and is definitely unlike anything else we’ve seen on PC GPUs thus far. Since Intel can’t reduce their idle voltage they are going to start outright turning off the GPU instead;"
That qualifies as genius these days? The most obvious method to reduce a GPU's power consumption is to turn it off.
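For what it's worth, a minimal sketch of what duty-cycle power gating amounts to; this is illustrative logic only, not Intel's actual driver or hardware behavior, and the power and duty-cycle values are made up:

```python
def avg_gpu_power(load, active_w=2.0, gated_w=0.05, min_duty=0.1):
    """Average power when the GPU runs for only a fraction (duty) of each
    refresh interval and is power-gated for the rest. Illustrative model only."""
    duty = max(min_duty, min(1.0, load))   # fraction of the interval the GPU stays on
    return duty * active_w + (1 - duty) * gated_w

print(round(avg_gpu_power(0.05), 3))  # ~0.245 W at near-idle instead of idling fully powered
print(round(avg_gpu_power(1.0), 3))   # 2.0 W when fully busy
```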
I liked the comments in this thread but was disturbed by the fan boys. Just because there was an amd marketer here doesn't mean that we should resort to intel worship. At least for me, those discussions are totally moot in this thread.
Broadwell Core M is supposed to support fanless designs. Will Broadwell-U support fanless as well? Meaning will high end laptops with Broadwell-U be fanless? If not, will Skylake be able to have high end machines be fanless?
IntelUser2000 - Wednesday, August 13, 2014 - link
Broadwell ULT: 15W
Core M (previously Broadwell-Y): 4.5W
vlad0 - Friday, August 15, 2014 - link
Isn't the core i3 version of the sp3 based on a Y series chip?
StevoLincolnite - Monday, August 11, 2014 - link
Except AMD can't be aggressive at shrinking dies. For one, die shrinks cost money: fab contracts, man-hours, research, and possibly buying technology from other companies such as IBM.
AMD can't aggressively shrink dies anyway, they are at the mercy of fabrication companies like TSMC and Global Foundries, so what they can produce is limited to what they provide.
Intel has always been ahead of the industry in fabrication, the only way AMD can beat Intel is through something ground breaking (Like moving away from silicon?) or if Intel drops the ball, like they did with Netburst.
Or, AMD buys a fab company who is far ahead of Intel, which simply isn't going to happen.
Otherwise they can only compete on price, and using an older, more mature fabrication process lets them do just that, since the chips are much cheaper to produce. They just need to provide "good enough" performance to mostly stay relevant, which the FX doesn't really do.
wurizen - Monday, August 11, 2014 - link
well, an fx-8350 is toe-to-toe with an i7-2600k, which is no slouch until today. and comparing fx-8350 with today's i7-4770k would be a little unfair since the 4770k is 22nm while the 8350 is at 32nm. and we're not even considering software optimizations from OS and/or programs that are probably bent towards intel chips due to its ubiquity.so, i think, you're wrong that the fx-8350 doesn't provide good enough. i have both i7-3770k oc'd to 4.1 ghz and an fx-8320 at stock and the amd is fine. it's more than good enough. i've ripped movies using handbrake on both systems and to me, both systems are fast. am i counting milliseconds? no. does it matter to me if the fx-8320 w/ lets say amd r9-290 has 85 fps for so and so game and an i7-4770k w/ the same gpu has a higher fps of 95, let's just say? i don't think so. that extra 10 fps cost that intel dude $100 more. and 10 extra frames with avg frames of 85-95 is undecipherable. it's only when the frames drop down below 60 does one notice it since most monitors are at 60 hz.
so what makes the fx not good enough for you again? are you like a brag queen? a rich man?
frostyfiredude - Monday, August 11, 2014 - link
Not fair to compare against a 22nm part from Intel? Bogus; I can go to the store and buy a 22nm Intel chip, so it should be compared against AMD's greatest. An i5-4670K matches or exceeds the performance of even the FX-9590 in all but the most embarrassingly parallel tasks while costing $50 more. Cost to operate the machine through the power bill makes up for that price difference at a fairly standard 12c per kWh when used heavily 2 hours per day for 4 years, or idling 8 hours per day for the same 4 years.
Your argument for gaming with the 8350 being good enough is weak too when the $10 cheaper i3-4430 keeps up. Or spend $125 less to get a Pentium G3258 AE, then mildly overclock it to again have the same good-enough gaming performance if >60FPS is all that matters. The i3 and Pentiums are ~$70 cheaper yet when power use is counted again.
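For anyone who wants to sanity-check that power-bill claim, here is a rough sketch; the load-power gap is an assumption chosen to illustrate the argument (roughly an FX-9590-class chip versus a mainstream i5 under load), not a measured figure:

```python
# Electricity cost of a higher-power CPU over four years, at the 12c/kWh rate
# mentioned above. The ~140W load-power gap is assumed, not measured.
RATE_PER_KWH = 0.12
LOAD_GAP_W = 140
HOURS_PER_DAY = 2
YEARS = 4

extra_kwh = LOAD_GAP_W / 1000 * HOURS_PER_DAY * 365 * YEARS
print(f"Extra energy: {extra_kwh:.0f} kWh, extra cost: ${extra_kwh * RATE_PER_KWH:.0f}")
# ~409 kWh and ~$49 with these numbers, i.e. roughly the $50 price difference
# being discussed; a smaller wattage gap or lighter use shrinks it accordingly.
```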
wurizen - Tuesday, August 12, 2014 - link
well, if a pentium g3258 is good enuff for gaming, then so is an fx-8350. whaaaaaat? omg we know intel is king. i acknowledge and understand that. intel rules. but, amd is not bad. not bad at all is all im trying to make./omg
wetwareinterface - Monday, August 11, 2014 - link
wow...first off you are assuming a lot and not bothering to check any published benchmarks out there so,
1. 8350 isn't even equal to 2500 i5 let alone 2600 i7.
2. 32nm vs. 22nm means nothing at all when comparing raw performance in a desktop. it will limit the thermal ceiling so in a laptop the higher nm chip will run hotter therefore be unable to hit higher clocks but in a desktop it means nil.
3. handbrake ripping relies on the speed of the dvd/blu-ray drive; handbrake transcoding relies on cpu performance, and the 8350 gets spanked there by a dual core i3 not by milliseconds but by tens of seconds. against an i5 the gap grows to minutes, and more so against an i7.
4. let's say you're pulling framerates for an r9-290 out of somewhere other than the ether... reality is an i5 is faster than the 8350 in almost any benchmark i've ever seen by roughly 15% overall. in certain games with lots of ai you get crazy framerate advantages with i5 over 8350, things like rome total war and starcraft 2 and diablo 3 etc...
i'll just say fx8350 isn't good enough for me and i'm certainly not a rich man. system build cost for what i have vs. what the 8350 system would have run was a whopping $65 difference
wurizen - Tuesday, August 12, 2014 - link
#3 is B.S. a dual-core i3 can't rip faster than an fx-8350 in handbrake.
#4 the r-290 was an example to pair a fairly high end gpu with an fx-8350. a fairly high end gpu helps in games. thus, pairing it with an fx-8350 will give you a good combo that is more than good enough for gaming.
#2 22nm vs. 32nm does matter in desktops. the fx-8350 is 32nm. if it goes to 22nm, the die shrink would enable the chip to either go higher in clockspeed or lower its tdp.
u sound like a benchmark queen or a publicity fatso.
wurizen - Tuesday, August 12, 2014 - link
oh and #1--i am not saying the fx 8350 is better than the i7-2600k. i said "toe-to-toe." the i5-2500k can also beat the fx-835o b/c of intel's IPC speed advantage. but, i think the reasons for that are programs not made to be multithreaded and make use of fx-8350 8-cores to it's potential. since amd trails intel in IPC performance by a lot--this means that a 4-core i5-2500k can match it or sometimes even beat it in games. in a multithreaded environment, the 8-core fx-8350 will always beat the i5-2500k. although it might still trailer the 4-core + 4 fake cores i7-2600k. just kidding. lol.i said toe to toe with 2600k which means its "competitive" to an i7-2600k even though the AMD is handicapped with slower IPC speed and most programs/OS not optimize for multithreading. so, to be 10-20% behind in most benchmarks against an i7-2600k is not bad considering how programs take advantage of intel's higher IPC performance.
do u understand what im trying to say?
Andrew Lin - Tuesday, August 26, 2014 - link
i'm sorry, is your argument here that the FX-8350 is better because it's inferior? because that's all i'm getting out of this. Of course a benchmark is going to take advantage of higher IPC performance. That's the point of a benchmark: to distinguish higher performance. The way you talk about benchmarks it's as if you think benchmarks only give higher numbers because they're biased. That's not how it works. The benchmarks give the i7-2600k higher scores because it is a higher performance part in real life, which is what anyone buying a CPU actually care about. Not to mention the significantly higher efficiency, which is just an added benefit.Also, it's really hard to take you seriously when your posts make me think they're written by a teenage girl.
wurizen - Tuesday, August 12, 2014 - link
also, if the fps disparity is so huge btwn fx-8350 and say i5-2500k in games u mention like starcraft 2, then something is wrong with that game. and not the fx-8350. i actually have sc2 and i have access to a pc w/ an fx-8320. so i am going to do a test later tonight. my own pc is an i7-3770k. so i could directly compare 2 different systems. the only thing is that the amd pc has an hd5850 gpu, which should be good enuff for sc2 and my pc has a gtx680 so it's not going to be a direct comparison. but, it should still give a good idea, right?wurizen - Tuesday, August 12, 2014 - link
i just played starcraft 2 on a pc with fx-8320 (stock clockspeed), 8GB 1600Mhz RAM, 7200rpm HDD and an old AMD HD5850 w/ 1GB VRAM. the experience was smooth. the settings were 1080P, all things at ultra or high and antialiasing set to ON. i wasn't looking at FPS since i don't know how to do it with starcraft 2, but, the gameplay was smooth. it didn't deter my experience.i also play this game on my own pc which is an i7-3770k OC'd to 4.1, 16GB 1600 Mhz RAM, 7200rpmHDD and an Nvidia GTX680 FTW w/ 2GB VRAM and i couldn't tell the difference as far as the smoothness of the gameplay is concerned. there is some graphical differences between the AMD GPU and the Nvidia GPU but that is another story. my point is that my experience were seamless playing on an FX chip pc to my own pc with 3700k.
to make another point, i also have this game on my macbook pro and that is where the experience of playing this game goes down. even in low settings. the MBP just can't handle it. at least the one i have with the older gt330m dGpu and dual-core w/ hyperthreading i7 mobile cpu.
so.... there.... no numbers or stats. just the experience, to me, which is what counts did not change with the pc that had the amd fx cpu.
wurizen - Tuesday, August 12, 2014 - link
well, i should point out that my macbook pro (mid-2010 model) can handle starcraft 2. but, it's not a "fun" experience. or as smooth.
D. Lister - Saturday, August 30, 2014 - link
So you must be feeling pretty darn stupid now, realizing that you never had to buy the more expensive 3770K (plus the gtx680), since from your point of view it "feels" the same as an 8350 or an 8320 with an HD5850... eh? Here's an idea, sell your 3770K/GTX680 system, and buy an FX8320/HD5850... you would still get some of your money back - if you can't do that, then at least just shut the hell up and stop deliberately spreading misinformation, you unethical hypocrite.wintermute000 - Tuesday, August 12, 2014 - link
"well, an fx-8350 is toe-to-toe with an i7-2600k"You lost all credibility right there
wurizen - Tuesday, August 12, 2014 - link
no i didn't. u wish.
wurizen - Tuesday, August 12, 2014 - link
u mean the credibility of anonymous internet opinions?
rkrb79 - Tuesday, November 18, 2014 - link
Everyone can have their own opinions. Leave wurizen alone. Heaven forbid someone say something you don't agree with. Put on your big boy pants intel fanboys.
tomsworkshop - Thursday, August 14, 2014 - link
Global Foundries was AMD's spun-off fab, and AMD still holds a share in Global Foundries. Global Foundries is working tightly with Samsung's fabs right now for a better manufacturing process; when they reach their goal in the nm race they can compete with Intel on die shrinks.
FstEddie - Thursday, August 14, 2014 - link
Actually StevoLincolnite and others, you are quite confused. Using larger node sizes is not "cheaper to produce". Larger node sizes are more expensive per chip. The reason AMD (Global Foundries) does not just jump down to the next node is that it requires a great deal of capital up front (they are relatively broke) and R&D time, which they are already behind on. Intel has proven more adept at shrinking process nodes and invests more in R&D ahead of time. This allows Intel to use the new node budget to partially increase performance, partially decrease power and partially decrease cost per chip. Cost per chip is the main driver for increasing density, and depending on the generation Intel has re-balanced the performance/power/cost relationship.
Samus - Wednesday, August 13, 2014 - link
I love what they did with the Z-height by embedding the chip "through" the motherboard PCB. That's really smart and will definitely help reduce thickness.
[email protected] - Monday, August 11, 2014 - link
this looks to be very interesting... a true gaming windows 8.1/9 tablet laptop convertible in Surface Pro 3 form factor.
A5 - Monday, August 11, 2014 - link
I'd hesitate to say that this will enable any kind of "real" gaming in the traditional sense. The iGPU isn't strong enough in this form factor, and AMD/NV draw too much power.
Yesumanu - Monday, August 11, 2014 - link
Not true, assuming Intel will integrate Iris Pro into its mobile CPUs. With the HD 5200 (iGPU) you can run just about any game at 768p with at least medium settings. Here are some benchmarks for that iGPU: http://www.anandtech.com/show/6993/intel-iris-pro-... I know the TDP there is 47W, but it includes a quad core i7 clocked at 3.6GHz which itself draws some power. Intel could improve on that design, put it into its newest CPUs, and if they do, they sure will brag about it.
Yesumanu - Monday, August 11, 2014 - link
Sorry, I messed up the link. You can erase the dot at the end or use this: http://www.anandtech.com/show/6993/intel-iris-pro-...
winterspan - Monday, August 11, 2014 - link
Core M is NOT going to have Iris Pro.
StevoLincolnite - Monday, August 11, 2014 - link
Half the problem with Intel Decelerators is the graphics drivers; AMD's and nVidia's drivers are gold plated in comparison. Sure, Intel's are better these days than they used to be, but still hardly ideal.
RyuDeshi - Monday, August 11, 2014 - link
In a tablet form factor you are going to encounter severe thermal throttling while gaming, and the Anandtech article you linked doesn't have those cooling constraints. Just look at Anandtech's article on the Surface Pro 3's thermal throttling. Despite having a slightly better i5, it does worse in most heavy use scenarios.
RyuDeshi - Monday, August 11, 2014 - link
Better i5 than the surface pro 2's version I meant to say.
mb5625083591 - Monday, August 11, 2014 - link
I think this discussion is completely irrelevant. Very few are going for a graphical gaming tablet (unless it's poker or fruit slices, etc.). That field is dominated by Playstation or XBox on an HDTV.
Tablets and convertibles are for Internet Surfing or business documents, the low end for Internet surfing only.
mkozakewich - Thursday, August 14, 2014 - link
No, we use them for gaming, too. If you get a high-end convertible tablet, it's because a regular tablet doesn't do it and you don't need a similarly-sized laptop. This means that you'll either have a proper big gaming rig or you'll be relying on your convertible to game. I'm doing all my gaming on my Surface Pro 1, and I can play most games with my CPU at 700 MHz. (I mean, there are orders of magnitudes of game needs. I'll sometimes switch it to full-throttle when playing a game, when it needs more power.)mb5625083591 - Monday, August 11, 2014 - link
P.S. For the price of an I7 tablet you can buy BOTH an XBox and Playstation, and maybe HDTV also, and play excellent graphical games.
kyuu - Tuesday, August 12, 2014 - link
Sure, but there's an obvious difference between gaming on a mobile device and sitting down in front of a TV. So talking about Xboxes and PlayStations is missing the point by a pretty large margin.
Krysto - Monday, August 11, 2014 - link
M stands for Mediocre. The main reason they are hitting those TDP levels is because they are reducing performance for those specific chips that they claim to use little power.
I also bet Broadwell will jump even higher in price than Haswell, which also jumped 40 percent over IVB, considering they're having significant yield issues.
Grizzlebee - Monday, August 11, 2014 - link
I'm looking at prices right now on Newegg and there is a $5-10 difference between Ivy Bridge and Sandy Bridge in price. Not even close to 40%.
Krysto - Monday, August 11, 2014 - link
Original prices. Haswell is now the "mass market" chip for Intel, which means high volume, and can afford lower prices than initially, while IVB is a low-volume chip now, so it won't be much cheaper. Watch for when Broadwell comes out. It won't be just 10 percent more expensive than Haswell.
bebimbap - Monday, August 11, 2014 - link
I believe Krysto was referring to the jump in prices from IB to HW, and not SB to IB.
But in either case, I'll have to disagree with Krysto. Prices seem to have decreased over the years if anything. Don't forget, because of IPC gains you are paying the same for more performance.
3570 = $210
4590 = $200
2320 = $190
3330 = $190
4430 = $180 w/ $10 off promo applied.
Gondalf - Monday, August 11, 2014 - link
All depends on form factor. If you want a tiny tablet, this is the only manner to do it.
Or do you want something like Tegra K1, which scores 12W under full GPU operations??? Come on, put it in a slim convertible if you can.
HardwareDufus - Monday, August 11, 2014 - link
Pretty sure M stands for Mobile.
Will be interesting to see what this means for Desktop variants. Want to see what improvements come the way of the GT3+ gpu.
gostan - Monday, August 11, 2014 - link
"and as we’ve seen with ARM based tablets so far they form a market that has continued to grow and continued to erode the x86 laptop market that Intel has dominated for so long."Is this your assumption or is it a fact? If the latter is true, can you provide some references?
Ryan Smith - Monday, August 11, 2014 - link
"If the latter is true, can you provide some references?"Sure.
http://www.gartner.com/newsroom/id/2647517
http://www.gartner.com/newsroom/id/2420816
Gartner regularly tracks PC unit sales, and tablets are regularly cited as a factor. On a more micro level, I know several people who have reduced their PC usage over the last couple of years (and delayed PC replacements) due to their tablet use.
Flunk - Monday, August 11, 2014 - link
http://www.forbes.com/sites/haydnshaughnessy/2014/...
Tablet sales are falling too. Perhaps there is no relation at all and just an overall tightening due to the market maturing.
HanzNFranzen - Monday, August 11, 2014 - link
that doesn't seem to be the case.
http://techcrunch.com/2014/07/06/gartner-device-sh...
Morawka - Monday, August 11, 2014 - link
wasn't gartner the one who said apple mac sales were down 40% and it turned out to be up 60%?
mkozakewich - Thursday, August 14, 2014 - link
Those weren't actual numbers, they were industry guesses. The lesson is not to trust pundits and "Industry analysts". If you can wait for proper news to be released, it's better than rumours.
lilmoe - Monday, August 11, 2014 - link
Actually, the tablet market as a cause for dropping PC sales is only half or less than half of the story. Logically speaking, most consumers find their 2-4 year old machines sufficient for their "productive" needs. Unlike previous years, PCs are living far beyond their intended years of services, and consumers are in no dire need for newer, faster components. Mind you, hardware isn't getting that much faster with each iteration (relatively speaking), and a simple SSD upgrade and/or more RAM would improve performance significantly that a whole hardware upgrade becomes less appealing (same for other components). It's mostly about convenience, connectivity and mobility for consumers these days (cheap mobile media consumption) that's why the tablet market "appear" to have affected PC sales. Those who find a tablet replacing their fully fledged PC didn't need a PC in the first place.mayankleoboy1 - Monday, August 11, 2014 - link
i remember being very excited to read the Haswell architecture preview.
Felt completely disappointed when the actual CPUs were reviewed.
Hope BDW-Y is not the same
edlee - Monday, August 11, 2014 - link
they should make a chromebook with this processor.
Krysto - Monday, August 11, 2014 - link
My guess is this chip won't be that much faster than Nvidia's Denver - unless Intel cheats in benchmarks somehow (like with TurboBoost).
68k - Monday, August 11, 2014 - link
How on earth is TurboBoost cheating? Is it cheating to include a feature that actually does result in a CPU that allows itself to do short sprints in moments where it is needed the most, in order to make a device "feel" fast (i.e. respond quickly to events generated by the user)?
Cannot talk for BDW, but both HSW and SMT usually stay at or very close to max turbo for _very_ long; a Z3770 sitting inside a 10" tablet can run at 2.3-2.4GHz on all 4 cores for several minutes.
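To illustrate why burst clocks aren't inherently dishonest, here is a toy model of burst-then-sustain behaviour; the limits and the energy-credit figure are invented for illustration and are not the real power limits of the Z3770 or any other chip:

```python
# Toy model: the chip may draw above its sustained limit while an energy credit
# lasts, then settles at the sustained level. All numbers are made up.
SUSTAINED_W = 4.5    # long-term, TDP-like limit
BURST_W = 9.0        # short-term turbo limit
credit_j = 120.0     # assumed energy credit available for bursting

for second in range(40):
    if credit_j >= (BURST_W - SUSTAINED_W):
        draw = BURST_W
        credit_j -= BURST_W - SUSTAINED_W    # spend credit while bursting
    else:
        draw = SUSTAINED_W                   # credit exhausted: settle down
    if second % 10 == 0:
        print(f"t={second:2d}s  draw={draw:.1f} W  credit left={credit_j:6.1f} J")
```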
nonoverclock - Monday, August 11, 2014 - link
AMD has a turbo mode. Do they cheat?
psyq321 - Tuesday, August 12, 2014 - link
TurboBoost is not cheating but optimal demand-based power management.
All modern ARM CPUs are also doing the same thing (scaling down frequency when idle, up when busy). I am quite sure it is the case with the AMD CPUs, too - it is just that I do not have any to check.
The only three things that matter are a) price b) performance c) power draw (real that is measured, not marketing terms such as TDP, SDP, etc.).
mkozakewich - Thursday, August 14, 2014 - link
TDP isn't a marketing term, it's the wattage of heat dissipated by the processor. Wattage is a bit of a weird word for it, but Watts are joules-per-second, and a joule is the amount of energy needed to heat water a certain amount.
Jaybus - Thursday, August 14, 2014 - link
TDP is "Total Dissipated Power" and as used in describing processors refers to the maximum instantaneous power that the processor will dissipate. Watt, being a unit unit of power, is the only SI unit suitable for specifying TDP. Since nearly all of the energy expended by a processor is due to heat transfer, a Watt is in this case essentially a measure of the rate of heat transfer, and TDP is the rate of heat transfer required when operating the processor at or near its maximum operating temperature.473NG3R - Friday, August 15, 2014 - link
@Jaybus Nope. Please don't correct people with false info. TDP is Thermal Design Power, and it is NOT "peak" power. It is used for marketing as mkozakewich said above as the definition of TDP measurement varies from company to company.
473NG3R - Friday, August 15, 2014 - link
Oops! I read mkozakewich comment backwards. It is a marketing term in a sense... It is used to design cooling systems but it gets thrown around by marketing groups all the time since TDP limits how small a device can be.
frostyfiredude - Monday, August 11, 2014 - link
With all these reductions in power use on a per core basis and a stagnation of clock speeds we very well could see a quad core i7-5770k with a 65W TDP. I hope Intel plans on bumping up their mainstream high end SKUs to six core, the desktop market doesn't need maximum power use that low.
klmccaughey - Monday, August 11, 2014 - link
Yes this worries me too. I think we will see yet another pointless desktop release, with hardly any improvements (once you consider the bad overclocking).
I still don't see a desktop Broadwell replacing my 2500k, which runs low(ish) heat at 4.3GHz. What I would love to see is a 5.5GHz Broadwell monster, otherwise I will be skipping Broadwell, just like Haswell and IB before it.
Surely there has to be some light on the horizon for us gamers?!
mapesdhs - Monday, August 11, 2014 - link
And someone in your position also of course has the double irony of still having more performance headroom if they're willing to explore better cooling, etc. An H80i would let your 2500K run at 5GHz no problem, assuming your particular sample isn't naturally limited in some way (actually, even an old TRUE and two typical fans would be fine, something I can confirm as I've done it with several different 2700Ks, never mind a 2500K, so I know the heat isn't an issue). Thus, except for threaded workloads, an upgrade would have to be a lot better than what you already have in order to be worth bothering with, except if you wanted something new due to newer tech such as PCI Express, M.2, etc.
Anyway, this is why IB was so bad with its default metal cap setup, ie. Intel made SB too good which resulted in lots of 2500K owners just not seeing the point in upgrading. Thing is, they've still not boosted performance enough to make upgrading worthwhile for most SB users (the exception of course being non-K owners).
My 2700K runs at 5.0. Takes me 3 mins to set this up on an ASUS M4E, ludicrously easy to do. HW at 4.5 is about the same or a bit quicker, but much more complex to do, given how many samples won't go that high, and for gaming it makes no difference at all. I suppose the 4790K can help on the oc front a bit, but the cost has put me off so far, used 2700Ks are so much cheaper. I want to build a 4K gaming system next, but that'll be X99 instead I think (don't see the point of mainstream chipsets for 4K gaming, surely with multiple GPUs the limited PCIe lanes is going to hold back performance eventually).
Ian.
garadante - Tuesday, August 12, 2014 - link
I wish I had your luck with SB. My 2500k will only do 4.5 GHz stable with 1.365 vcore (in reality, monitoring software reports 1.391~ vcore with my motherboard's way of handling vdroop). If I could go any higher, I'd have to dump a load of voltage into it that I'm just afraid of doing. Though this may be a motherboard problem as it's given me problems before, but that still wouldn't account for a 500 MHz variance. I guess I just got a bad SB. :/pt2501 - Monday, August 11, 2014 - link
You said it. My 2500k at 4.6 Ghz isn't going anywhere unless something revolutionary is released. Best processor I ever bought.
ZeDestructor - Monday, August 11, 2014 - link
They want you to go up to LGA2011 for the solder-based TIM. And only because of die-cracking issues (I found the papers covering that) on smaller dies that force them to use boring old polymer TIM.
steve wilson - Tuesday, August 12, 2014 - link
My thoughts exactly. My 2500k runs everything fine for me at 4.4ghz. I will be skipping Broadwell and praying Skylake will deliver.
sherlockwing - Monday, August 11, 2014 - link
Tskin=41C?!? Is Intel out of their mind? Did they not read how much trouble the iPad 3 got into for a Tskin of 33.6C? http://www.theguardian.com/technology/2012/mar/20/...
I hope they/OEMs do keep Tskin under control with better throttling when actually shipping these chips in products.
ZeDestructor - Monday, August 11, 2014 - link
People are cool with high TSkins on their devices. I know my phone passes 35°C easily if I load it up, and I'm fine with that. Then again, I'm completely fine with a 60°C idle, because that's where ICs like to live...
Gondalf - Saturday, August 23, 2014 - link
Ummmm present iPad Air scores a 42.1°C skin temperature with an A7 running inside.
So not a concern, all recent tablets are pretty hot.
magnusmundus - Monday, August 11, 2014 - link
Looking forward to seeing benchmarks and desktop 14nm parts.
Also, I found a typo on the closing thoughts page "Though laptops at a category" should be "Though laptops as a category"
DanNeely - Monday, August 11, 2014 - link
I'm surprised they didn't move the PCH to 22nm. Relatively low power consumption or not, they pushed everything else to the wall to get Core M's TDP as low as possible and between doing custom designs for it anyway and soft sales meaning they've got the spare 22nm capacity available I don't see any obvious reason why they couldn't've done so.klmccaughey - Monday, August 11, 2014 - link
Vastly diminishing returns for the expense seem the most likely answer to that.mkozakewich - Thursday, August 14, 2014 - link
The process nodes are very expensive to produce, so they need to get as much life out of them as possible. Also, a new(er) process isn't going to have a high enough yield. 22nm might have worked, but I bet the older process gave them a better bang for their buck.
Flunk - Monday, August 11, 2014 - link
I was thinking of buying one of these, but it sounds like the focus is still on TDP over all else, so it looks like waiting for Skylake is the plan for anyone with Sandy Bridge or newer.
Yup :( Is there no hope for us gamers?
wurizen - Monday, August 11, 2014 - link
i think AMD is your hope. if you don't have a sandy bridge cpu or in phenom land, an FX series cpu is a great cpu that will hold one over until AMD updates their desktop FX series line of cpu's. i mean a 990FX mobo has all you need. i think pci 2.0 is still adequate for today's video cards so that doesn't really matter. even though on paper, intel has pci 3.0 and usb 5.0 (just kidding) and thunderbolt 9.9, they're superflous and doesn't make you game better. i mean, an fx-8350 with a decent gpu should give one smooth framerates. i mean who cares if in the benchmark, an fx-8350 with so and so gpu will run so and so game at, say 120fps, while an intel chip will run it at 190fps? 120 fps is like very good and that person who has an intel chip running 190 fps probably paid hundreds of dollars more for their system and i bet they wouldn't be even to decipher those 70 or so of more frames. i mean, it's not total fps that matters but the average frame rate and i think an fx-8350 will deliver that. it's a beast of a cpu. and broadwell? who cares. 90% of the words in the article above are buzz tech fancy words to get gadget heads salivating and stock wigs greasing their palms.mapesdhs - Monday, August 11, 2014 - link
Yeah, sure, and that's exactly what everyone was saying back when we were waiting for the followon to Phenom II; just wait, their next chip will be great! Intel killer! Hmph. I recall even many diehard AMD fans were pretty angry when BD finally came out.
Benchmarks show again and again that AMD's CPUs hold back performance in numerous scenarios. I'd rather get a used 2700K than an 8350; leaves the latter in the dust for all CPU tasks and far better for gaming.
Btw, you've answered your own point: if an 8350 is overkill for a game, giving 120fps, then surely one would be better off with an i3, G3258 or somesuch, more than enough for most gaming if the game is such that one's GPU setup & screen res, etc. is giving that sort of frame rate, in which case power consumption is less, etc.
I really hope AMD can get back in the game, but I don't see it happening any time soon. They don't have the design talent or the resources to come up with something genuinely new and better.
Ian.
wurizen - Monday, August 11, 2014 - link
an fx-8350 isn't holding anything back. come on, man. are you like a stat paper queen obsessor or something? oh, please. an fx-8350 and an amd r9290 gpu will give you "happy" frame rates. i say happy because it i know the frames rates will be high enough. more than good enough even. will it be lower than an i7-4770k and an r90? maybe. maybe the fx-8350 will avg 85 fps on so and so game while the i7-4770k will avg 90 fps. boohoo. who cares about 5 more frames.also, while you mention i3 as a sufficient viable alternative to an fx-8350. remember that the cost will probably be about the same. and fx-8350 is like $190. maybe the i3 is 20 dollars less. but, here's the big but, an i3 is not as good as an fx-8350 in video editing stuff and photo editing stuff if one would like to use their pc for more than just games. an fx-8350, while not as power efficient as an i3 (but who cares since we are talking about a desktop) literally has more bang for the back. it has more cores and is faster.
amd will get back in the game. it is just a question of when. an fx-8350 is already toe-to-toe with an i7-2600k, which is no slouch in todays standard. so, amd just needs to refine their cpu's.
as for talent? amd came up with x64, or amd64 before intel. intel developed their own x86-64 later.
the resource that intel has over amd is just die shrinking. that's it. architecturally, an fx chip or the phenom chip before it seems like a more elegant design to me than intel chips. but that's subjective. and i don't really know that much about cpu's. but, i have been around since the days of 286 so maybe i just see intel as those guys who made 286 which were ubiquitous and plain. i also remember cyrix. and i remember g4 chips. and to me, the fx chip is like a great chip. it's full of compromises and promises at the same time.
Drumsticks - Monday, August 11, 2014 - link
I think AMD might have a way back into the game, but the difference right now is way worse than you say.
http://www.anandtech.com/bench/product/697?vs=287
FX-8350 trails the 2600k frequently by 10-20% or more (in gaming).
http://www.anandtech.com/bench/product/697?vs=288
i5-2500k beats it just as badly and actually sells for less than the 8350 used on ebay. Games love single threaded power and the 8350 just doesn't have it.
wurizen - Monday, August 11, 2014 - link
wurizen - Monday, August 11, 2014 - link
the games they have in that comparison are starcraft 2 and dragon age. 47 fps at 768 p for 8350 looks suspect on starcraft 2. what gpu did they use?it's not way worse as i say. omg.
i have an i7-3770k oc'd to 4.1Ghz and a stock FX-8320 at stock. both can run cod: ghost and bf3. haven't tested my other games. might do the starcraft 2 test tomorrow. i don't have the numbers nor care. what ppl need to realize is the actual game experience while playing games and not the number. is the game smooth? a cpu that can't handle a game will be very evident. this means it's time to upgrade. and there are no fx cpu's from amd that can't handle modern games. again, they will trail intel, but that is like a car going at 220mph so that car wins but the other car is going at 190mph and it will lose but realistically and the experience of going at 190mph will still be fast. the good thing is that amd or cpu don't race each other unless you care about benchmarks. but, if you look past the benchmarks and just focus on the experience itself, an fx series cpu by amd is plenty fast enuff.
omg.
silverblue - Tuesday, August 12, 2014 - link
We're well within the realms of diminishing returns as regards standard CPU IPC. AMD has the most to gain here, though with HSA, will they bother?
kaix2 - Tuesday, August 12, 2014 - link
so your response to people who are disappointed that broadwell is focused more on TDP instead of performance is to buy an AMD cpu with even lower performance?
wurizen - Tuesday, August 12, 2014 - link
well, they don't have to get the fx-9590, which has serverlike cpu of 2008 like tdp or a gpu like tdp of 220 watts. there is a more modest tdp of 125w with the fx8350. all overclockable. seems like a good cpu for tinkerers, pc/enthusiast, gamers and video editors. i don't even think it's a budget cpu. there is the 6-core and 4-core variants which are cheaper. i am also not saying that an fx-8350 is like the best cpu since it's not and falls way down in the benchmark charts. but, it's not a bad cpu at all. it gets the work done (video editing) and let's you play games (wit's a modern cpu after) even though it's sort of 2 yrs old already. the 990FX chipset is even an older chipset. there's something to be said about that and i think im trying to say it. in light of all the news about intel, which we are guaranteed to get every year with each tick and tock... there is that little AMD sitting in the corner with a chipset that hasn't been updated for yrs and an 8-core cpu that's remarkably affordable. the performance is not that low at all. i mean, video editing with it or playing games with it doesn't hamper one's experience. so, maybe one will have to wait a couple more minutes for a video to render in a video editing program versus say an i7-4790k. but, one can simply get up from one's chair and return. instead of staring at how fast their cpu renders a video on the screen.know what i'm saying?
so, yeah. an fx-8350 with an old 990fx mobo and now intel's upcoming broadwell cpu's with z97 chipsets and all the bells and whistles and productivity for either one will probably be similar. also, most video editing programs now will also leverage the gpu so an old fx-8350 w/ a compatible gpu will have help rendering those gpu's....
i guess it's like new doesn't mean anything now. or something. like m2 sata and pci 3.0, which intel chipsets have over amd is kinda superflous and doesn't really help or do much.
know what im saying?
rkrb79 - Tuesday, November 18, 2014 - link
Agreed!!
name99 - Tuesday, August 12, 2014 - link
Oh yes, Skylake.
Intel has given 5% IPC improvements for every generation since Nehalem, but now Skylake is going to change everything?
If you're one of the ten people on the planet who can actually get value out of AVX-512 then, sure, great leap forward. For everyone else, if you were pissed off at IB, HSW, BDW, you're going to be just as pissed off with Skylake.
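Taking that 5%-per-generation figure at face value (it's the commenter's round number, not a measured one), the compounding looks like this:

```python
# Cumulative effect of ~5% IPC gains per generation.
# The 5% step is the commenter's rough estimate, not an official figure.
gens = ["Sandy Bridge", "Ivy Bridge", "Haswell", "Broadwell", "Skylake"]
ipc = 1.0
for name in gens:
    ipc *= 1.05
    print(f"{name}: ~{(ipc - 1) * 100:.0f}% cumulative IPC gain")
# Five such steps compound to only ~28%, which is why same-clock desktop
# upgrades have felt so incremental.
```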
DanNeely - Tuesday, August 12, 2014 - link
No, the interest in Skylake is for all the non-CPU speed things promised with it. PCIe 4.0 and a bump from 16 to 20 CPU lanes (for PCIe storage) are at the top of the list. Other expected, but AFAIK not confirmed, benefits include USB3.1 and more USB3.x on the chipset than the current generation. We should have consumer DDR4 with Skylake too; but that's not expected to be a big bump in the real world.psyq321 - Tuesday, August 12, 2014 - link
Actually, apart from power-users I fail to see any tangible improvements in performance of modern CPUs that matter to desktop/notebook usage, Intel or otherwise.In the mobile space, it is improvements in GPU which mattered, but even that will eventually flatten once some peak is reached since graphics improvements on 4" / 5" screen can only matter to wide audiences up to some point.
However, there are surely enough customers that do look forward to more power - this is workstation and server market. Skylake and its AVX512 will matter to scientists and its enormous core count in EP (Xeon) version will matter to companies (virtualization, etc.).
Standard desktop, not so much. But, then again, ever since Core 2 Quad 6600 this was the case. If anything, large-scale adoption of SSDs is probably the single most important jump in desktop performance since the days of Conroe.
Khenglish - Monday, August 11, 2014 - link
I find the reduction in die thickness to be a big deal. Maybe this will prevent temperatures from getting out of control when the cpu core area gets cut in half for 14nm. High power 22nm cpus already easily hit 30c temperature difference between the cpu and heatsink.
AnnonymousCoward - Tuesday, August 12, 2014 - link
Probably not. I'd guess thermal dissipation is the same.
dgingeri - Monday, August 11, 2014 - link
PC sales are down mostly because people can keep their systems longer due to the lack of innovation coming from Intel on desktop chips and the lack of utilizing the current CPU technology by software developers. They could be so much more, if only developers would actually make use of the desktop CPU capabilities for things such as a voice command OS that doesn't need to be trained. Intel would then have a reason to produce more powerful chips that would trigger more PC sales.As it is, the current processor generation is less than 10% faster clock for clock compared to three generations ago. A great many thing aren't any faster at all. Know what? It doesn't even matter because nothing uses that much power these days.
Tablets and smartphones can't take the place of full PCs for most people. Their screens are just too small. Perhaps the younger generations prefer the small form factors right now, but give them a little time, and their eyes won't let them use such things. I can see the move to laptops, especially with 14-15" screens, but trying to show the same content on a 10" screen is just near unusable, and a 5" smartphone screen is just downright impossible. However, desktop PCs still have their place, and that's never going to change.
This push by "investors" for the tablet and smartphone market is just asinine. Broadwell isn't going to help sales all that much. Perhaps, they might sell some more Intel based tablets, but it won't be all that much of an improvement. Tablets have a niche, but it really isn't that much of one.
HanzNFranzen - Monday, August 11, 2014 - link
Tablets are a niche and not much of one? lol yea ok... well while you were asleep in a cave, over 195 million tablets were sold in 2013 between Android/Apple/Microsoft which is just shy of 80 million more than the previous year. World wide PC sales totaled 316M units, so we are talking nearly 2 tablets for every 3 PC's sold. Eh...small niche...dgingeri - Monday, August 11, 2014 - link
yeah, lots of people have them, but how much do they really use them? I have two, one Android and one Windows RT, and I only use them for reading books or for reading the web news while away from home. The Windows unit showed promise, since I could use it to run Office and terminal programs, but I ended up not using it at work anymore because it couldn't use a USB to serial adapter for talking to switches and raid arrays. It ended up being only half useful. They're nice to have for certain things, but they aren't as versatile as a PC. My parents own two, and two PCs, and they use the PCs far more. My older sister has one, and she barely uses it. Her 7 year old uses it to play games most of the time. My nephew has one, and he's only ever used it to read Facebook. It's a telling tale that everyone I've known who has one only has limited used for it.mapesdhs - Monday, August 11, 2014 - link
Point taken, but if people are *buying* them, irrespective of whether they use them, then it doesn't really matter.
Besides, this whole field of mobile computing, smart phones, tablets, now phablets, etc., it's too soon to be sure where we're heading long-term.
Many people say the copout porting of console games to PCs with little enhancement is one thing that's harmed PC gaming sales. This may well be true. Now that the newer consoles use PC tech more directly, perhaps this will be less of an issue, but it's always down to the developer whether they choose to make a PC release capable of exploiting what a PC can do re high res, better detail, etc. Wouldn't surprise me if this issue causes internal pressures, eg. make the PC version too much better and it might harm console version sales - with devs no doubt eager to maximise returns, that's something they'd likely want to avoid.
Ian.
az_ - Monday, August 11, 2014 - link
Ryan, could you add a size comparison to an ARM SOC that would be used in a tablet? I wonder how close are Intel in size. Thanks.name99 - Tuesday, August 12, 2014 - link
BDW-Y is 82 mm^2. The PCH looks like it's about a third of that, so total is maybe 115 mm^2 or so.
In comparison, Apple A7 is about 100 mm^2.
A7 includes some stuff BDW-Y doesn't, and vice versa, so let's call it a wash in terms of non-CPU functionality.
BDW-Y obviously can perform a LOT better (if it's given enough power, probably performs about the same at the same power budget). On the other hand it probably costs about 10x what an A7 costs.
Krysto - Tuesday, August 12, 2014 - link
Krysto - Tuesday, August 12, 2014 - link
Sure, also let's conveniently forget that Broadwell Y benefits not only of 3D transistors, but a 2 generation node shrink, too, compared to A7. Now put A7 on 14nm and 3d transistors...and let's see which does better.This is the issue nobody seems to understand, not even Anand, or just conveniently ignored it when he declared that the "x86 myth is busted". At the time we were talking about a 22nm Trigate Atom vs 28nm planar ARM chip, with Atom barely competing on performance (while costing 2x more, and having half the GPU performance). Yet Anand said the x86 bloat myth is busted...How exactly?! Put them on the same process technology...and then we'll see if x86 myth is indeed busted, or it's still bloated as a pig.
IntelUser2000 - Wednesday, August 13, 2014 - link
Broadwell Y's CPU portion is on 14nm but the PCH is on 32nm.
Not to mention that Intel's 22nm process isn't that small compared to Foundries 28nm. 28nm Foundry is ~30% larger than Intel 22nm, not 100%.
20nm Foundry is really a half node jump from 28nm considering density improves but performance much less. 16nm is another half node since density barely changes but perf improves.
Really, it can only be judged as a product.
mkozakewich - Thursday, August 14, 2014 - link
The 'myth' was that x86 was a horrible power-sucking pig. It was shown that it was possible to at least get close to ARM processors.Meanwhile, Intel's chips are NOT as low-performance as Apple's. The chip in Surface Pro 3 is about 2x-3x faster, and these should be about the same. With this year's Apple chip, imagine it to be 2x faster or so. Meanwhile, they'll probably both take the same amount of power. SoCs these days are running multiples of Watts. Even Intel Atoms used to take over 10, but now a full Intel Core computer can run under 6 Watts. It would be really interesting to run all these numbers and give a complete report on the state of mobile vs. notebook processing. They seem matched on power usage, but ARM chips are far cheaper and run at far less performance.
peevee - Monday, August 11, 2014 - link
Finally, a long awaited fanless design. I don't care about thickness that much as about energy waste on the heat and the fan to dissipate the heat.But... I have 2008 MacBook Pro with 2.4GHz Pentium M. If they achieved fanless design by bringing frequency down to, say, 1.8GHz, I am not interested (given that IPCs are not that different for real world applications). For me to upgrade, I want it to reach 3GHz, even if for a second and in a single thread, when starting applications for example. Anything below that is not a noticeable upgrade, and below 2.2GHz or so will be downgrade in practice.
And the biggest problem of Intel is not how thick their processors are, it is Microsoft - with Windows 8 (and 8.1) being so unbelievably awful (yes, I do own it for a while). Krzanich should call Nadella immediately and tell him to fire Larson-Green, or they both are going down.
MikhailT - Monday, August 11, 2014 - link
MikhailT - Monday, August 11, 2014 - link
You do realize this is for tablets and low power laptops (Chromebook/netbook style) only, right?
It's not coming to MBP for any time soon unless you're talking about something else entirely, like the MBA, which also is not going to get M either because it'll be too big of a regression on the performance.
I don't think we'll see any Macs with Core M.
ZeDestructor - Monday, August 11, 2014 - link
You forget about IPC. Last I checked, compared to a Core 2 CPU at equal clock speeds, a Sandy Bridge CPU is 50+% faster on average, and Haswell is a further 15+% faster on top of that, all the while using less power.
Krysto - Tuesday, August 12, 2014 - link
Since this is the FIRST core to be "fanless", they're probably squeezing a lot of stuff to make that work, and it probably still overheats. I wouldn't be too excited about it until we see how it does in actual devices, both from a power consumption point of view, but also a performance one (because if performance didn't matter, then we'd all just use ARM chips, no?).It would be laughable if Denver, which already beats mainstream Haswell Celeron, would be in range of Broadwell Y in performance, but still more energy efficient and with much better GPU performance.
UNCjigga - Monday, August 11, 2014 - link
Time to spin up the 12" iPad Pro rumor mill again...but would Apple really need to build a device that runs 64bit iOS *and* OS X?
ilt24 - Monday, August 11, 2014 - link
Apple doesn't seem all that interested in adding touch screen capabilities to OSX.
isa - Monday, August 11, 2014 - link
OK, I'm confused, so any help appreciated. I want to replace my old desktop replacement laptop with a Broadwell equivalent. For example, the Broadwell version of an HP Envy 17t. What flavor of Broadwell am I waiting for? The Y flavor? U? H? Something else? thanks.
mapesdhs - Monday, August 11, 2014 - link
I would ask the opposite: why do you need a replacement at all? What is it your current device cannot do, or does poorly? Once the newer products are out, find out which one solves those issues at the lowest cost. This focus on the jargon side of computing tech is the absolute worst aspect of how consumer computing has evolved since the 1980s.
Ian.
isa - Monday, August 11, 2014 - link
I appreciate the thoughts, but I'm actually an ASIC designer, so I have no problem with lingo - I just am not informed regarding my specific question. And my current laptop is a Penryn 2008 laptop, so forgive me if we ignore the need question and stay focused on what version(s) of Broadwell are intended for mainstream (non-gaming) desktop replacement laptops. thanks!
ZeDestructor - Monday, August 11, 2014 - link
You want the full HQ/MQ series for your next laptop. Those are the full quad-core-enabled machines, which is something you probably want for a hi-performance machine. Since this is a gaming laptop though, you may want to look into building a small mini-ITX desktop instead; they're comparable if your primary definition of "mobile gaming" is "drag computer to a LANparty/friend's place" rather than actually gaming on a train or similar.
name99 - Tuesday, August 12, 2014 - link
If you can afford it, why are you dicking around with trying to save a few hundred dollars?
If you want a serious machine, buy an rMBP with quadcore i7.
If you want a light machine, buy an MBA.
Then either stick Parallels on it (if you use OSX) or just run Boot Camp and never touch OSX, except maybe once every two months to update the firmware.
If you can afford it, life's too short to waste time trying to figure out which of fifty slightly different laptop PCs will suck less in their different hardware, different pre-installed crap-ware, different drivers. Just pay a little more and get something that most people consider works well.
ZeDestructor - Tuesday, August 12, 2014 - link
ZeDestructor - Tuesday, August 12, 2014 - link
As an ASIC designer, OSX is just a non-option. Hell, even Linux is rather hairy compared to Windows for ASIC stuff.
I personally like the Dell Precision line, but a Lenovo Thinkpad W or HP Elitebook would also do. In a pinch, a ridiculously-specced Compal (in its various Sager rebrands) would also do, but IMO they are just not built anywhere close.
Kjella - Monday, August 11, 2014 - link
Kjella - Monday, August 11, 2014 - link
Y = tablets
U = ultraportables
H = laptop/desktops
EP/EX = workstations/servers
So H. Out in Q2 2015, by current best guesses.
isa - Monday, August 11, 2014 - link
Thanks, Kjella. Upon further googling after reading your post, I learned that there apparently will be a mobile and deskop version of the H flavor, and it does look like the mobile H is the most likely for me. But that's not rumored to come out until May or June 2015, which is disappointing.Even weirder, since the first desktop version of Broadwell will be available in PCs in May/June 2015, and since the first version of Skylake (rumored to be a desktop version) is rumored to be "available" 2H 2015, it seems Broadwell H desktop is slated for a very, very short product life. Similarly, is Broadwell H mobile also slated for an extremely short product life?
Perhaps I missed it, but it would be great if there were an Anandtech chart comparing Intel's definitions of "announced", "launched", "samples", "volume shipments", "available", and similar Intelese to figure out about how many months from each it is till Joe Consumer can buy the chip in question. I suspect these definitions and even the lingo can vary with each tick and tock, but some kind of cheat sheet guestimates would be great (and revised as better info arrives, of course).
isa - Monday, August 11, 2014 - link
To further clarify the need for a cheat sheet, I'm familiar with the timing of tick and tock for the last few years, but it seems that 2014/2015 at a minimum will diverge so much from the last few years that previous expectations add confusion rather than clarity.
ZeDestructor - Monday, August 11, 2014 - link
Judging by the delay for BW, Skylake will probably be pushed forward at least 6 months, if only to make up the R&D costs of BW. Then again, Intel wants that tablet market, so they might not either.
IntelUser2000 - Wednesday, August 13, 2014 - link
"Similarly, is Broadwell H mobile also slated for an extremely short product life?"No its not. Companies like Intel cares about 5% profit differences, so having a short product life would make absolutely no sense. Broadwell isn't coming to "mainstream" desktops, only high-end enthusiast ones like the K series.
So they will all happily be a family like this:
-Skylake mainstream desktop
-Broadwell H/U/Y/K
name99 - Tuesday, August 12, 2014 - link
Semiaccurate says that quadcores (which I take to mean H-series) will not be out until 11 months from now.
(Makes you wonder WTF has happened to the Skylake timeline. They haven't yet admitted that that has even slipped, let alone by how much.)
psyq321 - Tuesday, August 12, 2014 - link
psyq321 - Tuesday, August 12, 2014 - link
I also fail to see how Intel will be able to keep their cadence with Skylake without either skipping the entire generation (obsoleting it in 6 months does not sound reasonable from the financial point of view) or delaying the Skylake introduction.
Also, the fact that Intel decided to enable all 18 cores in the Haswell EP is telling IMHO. Initially, this was to happen only with BDW-EP, so it might not be impossible that Intel might just skip Broadwell for some segments and go with Skylake.
isa - Tuesday, August 12, 2014 - link
Those who know ain't talking, but I can observe the following;- Delaying Skylake for financial reasons related to Broadwell is braindead. Broadwell development costs are sunk costs and Intel is or should be trying to maximize overall profits, not just a particular program's profits. Intel should release Skylake as quickly as possible when its yields hit target values, regardless of Broadwell delays, with two caveats:
- If the Broadwell yield difficulties also slowed down Skylake, then Skylake will likely be inherently delayed to some degree
- If Intel screws up product planning such that they flood the market with Broadwell, then their customers might be very angry if they are stuck with that inventory upon a Skylake release.
My bet at this point? Broadwell H mobile will be a very short-lived product (about 6 months).
Krysto - Tuesday, August 12, 2014 - link
Definitely not Y or U, and wouldn't get M either. Whatever is above that.
mczak - Monday, August 11, 2014 - link
For the GPU, it is noteworthy that unlike NVIDIA and AMD, the subslice block (at least before Gen8) doesn't really have an inherent minimum size that can't be changed without significantly altering the architecture. E.g. Gen7 (which is just about the same as Gen7.5) had subslice sizes of 6 (IvyBridge GT1), 8 (IvyBridge GT2) and even 4 (BayTrail).
It is also quite interesting that everybody (NVIDIA since gk2xx and Maxwell, AMD since even before GCN, notably their Northern Islands VLIW4 designs, Intel since Gen8) has now ended up with the exact same ALU:TEX ratio (one quad TMU per 64 ALU lanes), though of course the capabilities of these TMUs vary a bit (e.g. NVIDIA can do full-speed FP16 filtering, AMD only half speed, etc.).
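To make that ratio concrete, here is a rough sketch of the arithmetic. The per-block figures are recollections of public specs rather than anything from the article, so treat them as assumptions.

```python
# Rough arithmetic behind the "one quad TMU per 64 ALU lanes" observation.
# The per-block figures are assumptions based on public specs.

designs = {
    # name: (FP32 ALU lanes per block, quad TMUs per block)
    "Intel Gen8 subslice": (8 * 8, 1),  # 8 EUs x 8 FP32 lanes, 1 sampler quad
    "NVIDIA Maxwell SMM":  (128, 2),    # 128 CUDA cores, 8 TMUs = 2 quads
    "AMD GCN CU":          (64, 1),     # 64 ALU lanes, 4 TMUs = 1 quad
}

for name, (lanes, quads) in designs.items():
    print(f"{name}: {lanes // quads} ALU lanes per quad TMU")
# Every design works out to 64 lanes per quad TMU.
```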
tuxRoller - Monday, August 11, 2014 - link
http://www.phoronix.com/scan.php?page=article&...
In the fourth-to-last paragraph an Intel driver dev says that Broadwell graphics "dwarf any other silicon iteration during my tenure, and certainly can compete with the likes of the gen3->gen4 changes."
I'm going to go with the guy who's actually developing the drivers on this.
AnnonymousCoward - Tuesday, August 12, 2014 - link
Simple question: will all the new Intel desktop CPUs have integrated graphics? If the answer's yes, why would they waste the silicon area for those using discrete?
name99 - Tuesday, August 12, 2014 - link
Because the people who obsess about discrete graphics are a RIDICULOUSLY small fraction of the purchasing public, a fraction which is utterly unwilling to accept this fact and the fact that CPU design is not targeted at the needs of high-end gamers.
wintermute000 - Tuesday, August 12, 2014 - link
"Because the people who obsess about discrete graphics are a RIDICULOUSLY small fraction of the purchasing public"Maybe not obsess, but to characterise the PC gaming market as ridiculously small, is pretty far off the mark....
DanNeely - Tuesday, August 12, 2014 - link
Compared to the total Intel CPU market, and compared to the cost of creating an IGP-less CPU die for the mainstream socket, it's entirely on the mark. If you want an IGP-less design, your only choice is to wait a year for the Xeon-derived LGA2011 model; and the fact that LGA1366 launched as an enthusiast design well before the Xeons did, but Intel hasn't done the same for any other models, shows that it didn't sell well enough to justify rushing one for us.
C'DaleRider - Tuesday, August 12, 2014 - link
Small fraction is right. Projected worldwide PC hardware sales for 2015 are ~$385B (Source: eTForcasts). Projected PC gaming sales (both hardware and software) are ~$20B (Source: Statista), less than 10% of total PC hardware sales alone. A 10% market niche is very, very small in the overall scheme of the PC market.
AnnonymousCoward - Tuesday, August 12, 2014 - link
You should look at discrete graphics HW sales.
tuxRoller - Tuesday, August 12, 2014 - link
Nvidia, which has around 60% of the discrete GPU market, has a yearly revenue of around $4 billion. So you're looking at a total market of around $7 billion.
Johnmcl7 - Tuesday, August 12, 2014 - link
"Maybe not obsess, but to characterise the PC gaming market as ridiculously small, is pretty far off the mark...."I think the original comment was fairly accurate, even in the PC gaming market there's a large proportion of people using Intel graphics cards. Looking at the current Steam survey results, 75% are using Intel processors and 20% overall are using Intel graphics which means around 1 in 3 people with Intel processors on Steam are using the onboard graphics card. The means even among the gaming market there's a lot of integrated cards in use and that's just one small portion as I'd expect most other areas to mainly be using integrated cards.
There are workstation graphics cards, but professionals using those are unlikely to be using consumer processors, and the enthusiast/workstation processors do not have integrated graphics.
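For what it's worth, the arithmetic behind the figures in the two comments above works out roughly as follows; all inputs are the commenters' own rounded numbers, not independent data.

```python
# Back-of-the-envelope checks using the commenters' rounded figures.

# Discrete GPU market size implied by NVIDIA's revenue and market share.
nvidia_revenue = 4.0e9   # ~$4B per year
nvidia_share = 0.60      # ~60% of the discrete GPU market
print(f"Implied discrete GPU market: ~${nvidia_revenue / nvidia_share / 1e9:.1f}B")

# Share of Intel-CPU Steam systems that report Intel (integrated) graphics.
intel_cpu_share = 0.75   # 75% of surveyed systems have Intel CPUs
intel_gpu_share = 0.20   # 20% of surveyed systems report Intel graphics
print(f"Intel-CPU systems on integrated graphics: ~{intel_gpu_share / intel_cpu_share:.0%}")
```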
zepi - Tuesday, August 12, 2014 - link
I have had Steam on my company laptop with just the internal GPU, just to take part in the sales campaigns etc. That makes my contribution 50:50 in terms of dGPU / iGPU, even though 100% of my gaming happens on the dGPU.
AnnonymousCoward - Tuesday, August 12, 2014 - link
So....how do NVIDIA and ATI stay in business? Obviously many people use discrete cards. The fact you say "obsess" tells me you probably don't realize the massive performance difference, and it's not limited to gaming. CAD uses 3D.
AnnonymousCoward - Wednesday, August 13, 2014 - link
Doesn't Intel make X-version CPUs that can be overclocked? The OC market is gonna be much smaller than dGPU, and they're already making a dedicated product for that.
Krysto - Tuesday, August 12, 2014 - link
Because they are using that anti-competitive tactic to drive out the discrete competition. They force OEMs to buy them bundled, so more and more people say "why should I pay twice for the GPU...I'll just get the Intel one".
It's a nasty tactic Intel has been employing for years, and unfortunately it's working. But it's terribly uncompetitive.
Krysto - Tuesday, August 12, 2014 - link
It's akin to Microsoft bundling IE with Windows: "Why would I need to get another browser...I'll just use IE". That tactic WORKED for Microsoft. It only stopped working when they became lazy. But they could've held IE's 90 percent market share for a lot longer if they hadn't gotten lazy.
AnnonymousCoward - Tuesday, August 12, 2014 - link
I dunno--anyone who plans to get a discrete card is going to get one, regardless of Intel forcing it onto the CPU.
I wonder what percent of the desktop die will be GPU. Maybe with the GPU disabled, the CPU turbo will work better since there will be less heat.
name99 - Tuesday, August 12, 2014 - link
"we’ll still have to wait to see just how good the resulting retail products are, but there shouldn’t be any technical reason for why it can’t be put into a mobile device comparable to today’s 10”+ tablets. "There may not be TECHNICAL reasons, but there are very definite economic reasons.
People think of tablets as cheap devices --- iPad at the high end, but the mass market at $350 or so. This CPU alone will probably cost around $300. MS is willing to pay that for Surface Pro 4; no-one else is, not for a product where x86 compatibility is not essential.
We'll see it in ultrabooks (and various ultrabook perversions that bend or slide or pop into some sort of tablet), but we're not going to see a wave of sub-$1000 products using this.
kyuu - Tuesday, August 12, 2014 - link
Nothing was said about cheap tablets in that quote, so I'm not sure why you're bringing up the price.
Not that I disagree with your point. Of course, by continuing to focus on premium-priced parts, Intel is never going to gain a profitable foothold in the mobile market. Core M needs to be cheaper, not just lower power, to be interesting. Otherwise there's no reason to care. If you're paying for a $1000 device, why do you want something that's obviously going to be so performance-gimped compared to Y-series Broadwells?
Drazick - Tuesday, August 12, 2014 - link
Does "Shared Virtual Memory" means the same as AMD's shared memory configuration?No more need to replicate data for the GPU?
Laststop311 - Tuesday, August 12, 2014 - link
A Surface Pro with Core M might be pretty good.
Krysto - Tuesday, August 12, 2014 - link
For battery life, maybe. For performance, no.fokka - Tuesday, August 12, 2014 - link
I would like to see ordinary 13" ultrabooks with Broadwell-Y. Don't make it too slim, and see what performance and battery life are like with a 4.5W CPU. If performance is high enough for everyday tasks, it would really be nice to have slim notebooks approach 20 hours of battery life in light usage scenarios.
But I guess companies will just use the low-power CPUs as an excuse to implement smaller batteries and 4K displays, and we still won't get much more than 10h in best-case scenarios...
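To put a rough number on that 20-hour hope, here is some back-of-the-envelope arithmetic; the battery capacity and platform power figures are assumptions for illustration, not measurements.

```python
# Illustrative arithmetic only; battery size and power draws are assumed.
battery_wh = 50.0            # plausible 13" ultrabook battery (assumed)
rest_of_platform_w = 2.0     # display, SSD, WiFi, etc. at light load (assumed)

cpu_full_w = 4.5             # the 4.5W CPU figure from the comment above
print(f"Sustained full CPU load: ~{battery_wh / (cpu_full_w + rest_of_platform_w):.0f} h")

cpu_light_w = 0.5            # assumed average package power in light usage
print(f"Light usage: ~{battery_wh / (cpu_light_w + rest_of_platform_w):.0f} h")
```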
dcaxax - Tuesday, August 12, 2014 - link
I'd argue that it may well be too late for Intel to enter this market, unless they can deliver a major step change in performance compared to ARM. Right now the ARM-Android and ARM-iOS ecosystems are well established and humming along nicely. On top of that, tablet sales in the developed world are slowing down. The developing world is still growing, but in those regions cost will be a key factor.
That leaves Intel targeting a market with an entrenched competitor, a set of software ecosystems with no benefit from migrating to a new architecture (what do Apple, Google, Samsung, HTC, LG etc. gain out of this?), and slowing hardware sales.
If Core M can deliver double the performance for the same power draw AND price, then sure, I can see a rush to migrate to it; otherwise, what's the point?
Krysto - Tuesday, August 12, 2014 - link
"Core" chips will never EVER compete with ARM in its main market. The best Intel can do is try to compete in $700+ devices. Core is just not competitive on price. Not even close. Period.Intel's only competition against ARM in the MOBILE market is Atom, and Nvidia's Denver is already TWICE as fast as that. Also Atom is twice as expensive as Denver, but Intel keeps subsidizing half the cost...for as long as they can, which probably won't be much longer.
tuxRoller - Wednesday, August 13, 2014 - link
Link to the Denver benchmarks?
Natfly - Tuesday, August 12, 2014 - link
This article feels like marketing drivel, just listing point after point without any further explanation. I'd expect a little more in-depth analysis. Seriously, a tick/tock chart?
KhalidShaikh - Tuesday, August 12, 2014 - link
Great write-up.
Hrel - Tuesday, August 12, 2014 - link
". Intel has made it clear that they don’t regress on clockspeeds"lol, they often regress on clockspeeds.
nikolayo - Wednesday, August 13, 2014 - link
This sounds like a distraction meant to divert attention from the fact that Intel is about a year late with 14nm. It seems that Moore's law will stop working quite soon. I would still bet on Intel being able to implement 10nm, but that will likely be the end of the road for the current technology. And there is no clearly visible path beyond that. Not that progress will stop, but it may slow down to the pace we observe e.g. in car design and production, rather than what we are used to expecting in chip design and production.
asoltesz - Wednesday, August 20, 2014 - link
Car design and production are being turned upside down by electric cars like Tesla's.
Apart from the comparison, you may be right.
lagittaja - Wednesday, August 13, 2014 - link
Honestly. After reading the article. Which was really a good read, and huge thanks to Ryan. All I can say is: amazing. That is all.
Oxford Guy - Thursday, August 14, 2014 - link
"Intel’s solution to this problem is both a bit brute force and a bit genius, and is definitely unlike anything else we’ve seen on PC GPUs thus far. Since Intel can’t reduce their idle voltage they are going to start outright turning off the GPU instead;"That qualifies as genius these days? The most obvious method to reduce a GPU's power consumption is to turn it off.
c plus plus - Saturday, August 16, 2014 - link
Intel is really amazing, and I hope Broadwell doubles both GPU and CPU performance with the help of magic.
Mr Pras - Tuesday, August 19, 2014 - link
I liked the comments in this thread but was disturbed by the fanboys. Just because there was an AMD marketer here doesn't mean that we should resort to Intel worship. At least for me, those discussions are totally moot in this thread.
Blaiser - Wednesday, September 17, 2014 - link
Can somebody clarify this for me?
Broadwell Core M is supposed to support fanless designs. Will Broadwell-U support fanless designs as well? Meaning, will high-end laptops with Broadwell-U be fanless? If not, will Skylake enable high-end machines to be fanless?