"but you do get some significant performance benefits from moving the GMCH on chip"
Why? If it's just the same Pentium-M communicating over the same FSB to the same GMCH, why does putting it on the same package help performance more than a little bit? I can see maybe tightening some bus timings a little, but nothing significant.
That said, I do certainly agree this idea is a good one for small form-factor devices that still need substantial CPU power.
Isn't Intel the company that changes a slight bit of a CPU, calls it a new revision and requires a new chipset, which means new motherboard? Why would they take away from their own MB sales by putting the NB on chip?
I have no idea why you would come to that conclusion. With many of the NB chips needing active cooling and graphics cores putting out some heat (even the Intel ones do that, to a lesser extent than Nvidia or ATIs), and voltage regulators also putting out some heat, I'd think we would need to use the same heatsinks, because Intel's CPUs are supposed to produce less heat, but with all this extra crap on the PCB, it should equal out, give or take. Perhaps if Intel is able to ramp up the speed of their uber next-gen uarch then the heat will be somewhat similar to today's CPUs (but a lot more performance), and the NB, Gfx, VR will make it more heat productive than today, so even bigger HSFs.
As I look at the graph of both Coarse Grain Power Management and Fine GPM it seems pretty clear this is an exercise in reducing the "red" or saving power. If a cpu is just doing email it can run like a single core whatever. If not it can fire itself up for more power.
The only way I can make any sense out of the graphs, is if it's actually the cpu frequency that's scaling, hence the need for changing the voltage. Some kind of speedstep with faster transisions?
On the other hand, if the frequency is constant, the voltage is also constant and power consumption is determined by current draw. This is what a voltage regulator do by the way, it keeps the voltage constant regardless of current draw.
Perhaps the point they are trying to make is that voltage regulation is more effecient now that it is on chip. But in that case the graphs are completely bogus.
I don't see why you would. AMD doesn't need to have it integrated, so neither does Intel, I would assume. Also I'm thinking this would bring down motherboard prices, but I'm thinking CPU prices are going to be high as a result.
IIRC, the original design for what became the Pentium M was known as the "Timna". It was designed to integrate the northbridge onto the CPU (including a Rambus memory controller). The project was canned, and the project got rolled over into Banias. Now here we are in 2005 and we are almost back to where we started. The northbridge in the same packaging as the CPU. The next logical step is to, of course, put the entire NB into the same die as the CPU - which might be quite possible by 45nm.
First abandoning low-end chipsets and now simplifying motherboard design. What do these two events tell you if anything?
Will this reap power consumption benefits for laptops? Cheaper, smaller, cooler desktop designs for the office? Also, see "Latest News" on "Vista's answer to PC power woes."
wonder if lower motherboard's complexity.
lower motherboard prices just like AMD boards
http://www.dslreports.com/shownews/66802">http://www.dslreports.com/shownews/66802 anyone see the article intel is going after sony and M$ for control of living room
hahaha funny how by moding and XBOX you can have a $150 media center PC and by Novenber $99
just intel chips alone cost that much not including motherboard HD case next gen PS3 and xbox 360 can play back HD no problem for $300 + mod chip if you want extra functions
wassup with intel, suddently they are coming with all thins great stuff
look at the buzz theyr generating... AMD better have something under theyr sleeves or theyll go back to losing money business again...they had theyr chance, 3 years being the top performer and only managed to take the a bit of the server market, but still in desktop market people dont know how much theyr superior to P4... its amazing how manny people still are in the mhz thing and people who think ExtremeEdition is the shit... im sad lol, we are so few!
I'm sure most normal people would be more than happy with an Intel Extreme Edition processor in their computer, assuming they could afford it. And if it was paired up with Intel Extreme Graphics, then they could be happy believing in the fantasy that they have the most powerful PC available. Until they tried to play a game, that is :)
There's a difference between having a decent camera and getting a decent photo in a dark conference hall at a distance of 100 feet or more. Some slides are only up for a few seconds, so if you take a shot and it turns out blurry, you don't get a second chance. Photoshop can only clean up a poor quality shot so much, unfortunately.
"Some slides are only up for a few seconds, so if you take a shot and it turns out blurry, you don't get a second chance"
That's why I always use continuous shooting mode. Whenever I take a photo, I really take about five over a couple of seconds, then decide which is best later. Still, better to have these blurry photos than none at all.
Anyone reminded of the PCs on a chip movement of the early ninetes? Do we finally have enough technology to do this.
Intel's thingy (for lack of a better word) is interesting, but its just a bunch of dies together on one pice of PCB. Going to be one hell of a dense PGA too with the extra data needed by the GPU (crappy as it is). For notebook and other small PCs, its cool, but for desktops, eh...
I agree, the overall complexity of the PCB (which will have a lot more layers than the current 6 or 4 norm) will definately be cost prohibitive. Like you said notebooks, tablets, PDA/Blackberry's next iteration, cell phones, etc. will definately improve.
The on die graphics and northbridge would also be welcome. I have a feeling ATI is working on such a project. Imageon is just the start, eventhough it is geared for cell phones.
That's all we need ATI doing this too, they gonna make cpu's now too? They can't even put out a decent driver for their hardware how are they going to SUCESSFULLY combine all these things onto one pcb? What could be worse than Intel trying this you say, well our friend here has pointed out what that could be.
Nomatter how close Intel moves the graphics chipset to the CPU, it will still perform like crap. Not like they care. They're making a killing off of them anyhow.
Typically motherboards allow you to up the voltage given to the processor. By adding the Voltage regulator to the processor you no longer have the option to up the voltage. Which will keep you from being able to overlock the processor. This sounds like a great idea for Business machines, but I'm not sure if the the on die voltage regulator is going to help enough on heat to take away the ability to overclock the processor.
Do these Anand folks understand that recent theory on focus? What's up with the pictures you can't see a thing, what did they do take a trip with their webcams? It's about as impressive as this crock that intel is putting forth, what if I don't want a freakin graphics core on the die or anything close to it. Why make me pay for what has historically been CRAP video when it will never be used, just like this viva junk, MCE isn't a good thing in my opinion and I'm not going to buy some processor that FORCES me to use a crippled OS.
Bah, it's not even on die, they're just packaging it together. I wonder if there are any significant improvements on performance, though I do understand it will lower motherboard's complexity.
quote: Bah, it's not even on die, they're just packaging it together. I wonder if there are any significant improvements on performance, though I do understand it will lower motherboard's complexity.
Yeah. Are they making all the components with the same process? or is it a 110nm voltage regulator, 90 nm northbridge and a 65nm cpu?
It's really not "on-chip", more like "on chip module".
the voltage regulator looks very nice. I like that. I guess the question is will they get the same performance as if it were really "on-chip"?
On package generally means it will run at package speed (rather than bus speed). There would be a noticeable performance boost by going from a 400 MHz memory controller to a 2.0+ GHz memory controller. Hell, even Intel's GMA9x0 graphics could be pretty impressive if they were run at 2.0 GHz. Finally, putting everything on package can be seen as the first step towards moving on-chip. L2 cache with Pentium Pro, anyone?
I think the main reason for their doing this is for even more power savings on a laptop. Battery life can big a big concern for a lot of laptop users. I would never buy a laptop with intel graphics, but there are plenty of people who don't need more than that.
regardless if there's any performance improvements,
having the northbridge, graphics, and cpu on one package
will be great for embedded systems, laptops, etc.
Yeah pretty graphs, but far from the truth IMO. Look at the current technology power management, how can delivered power ramp up before user demand? Speculative power management? I think not.
36 Comments
bradley - Saturday, August 27, 2005 - link
It's about time.
ksherman - Friday, August 26, 2005 - link
I do have to say that the idea of the on-chip voltage control is VERY exciting, especially for mobile processors... maybe AMD will put it on-die...
johnsonx - Friday, August 26, 2005 - link
"but you do get some significant performance benefits from moving the GMCH on chip"Why? If it's just the same Pentium-M communicating over the same FSB to the same GMCH, why does putting it on the same package help performance more than a little bit? I can see maybe tightening some bus timings a little, but nothing significant.
That said, I do certainly agree this idea is a good one for small form-factor devices that still need substantial CPU power.
joex444 - Friday, August 26, 2005 - link
Isn't Intel the company that makes a slight change to a CPU, calls it a new revision, and requires a new chipset, which means a new motherboard? Why would they take away from their own motherboard sales by putting the NB on chip?
UltraWide - Friday, August 26, 2005 - link
Does this mean we will no longer need monster 5 lb heatpipe heatsinks and 12,000 RPM Delta fans???
joex444 - Friday, August 26, 2005 - link
I have no idea why you would come to that conclusion. With many of the NB chips needing active cooling, graphics cores putting out some heat (even the Intel ones do, to a lesser extent than Nvidia's or ATI's), and voltage regulators also putting out some heat, I'd think we would need the same heatsinks. Intel's CPUs are supposed to produce less heat, but with all this extra crap on the package it should about equal out, give or take. Perhaps if Intel is able to ramp up the speed of their uber next-gen uarch, the CPU heat alone will be similar to today's (but with a lot more performance), and the NB, graphics, and VR will make the whole thing run hotter than today, so even bigger HSFs.
bupkus - Friday, August 26, 2005 - link
As I look at the graphs of both Coarse Grain Power Management and Fine Grain Power Management, it seems pretty clear this is an exercise in reducing the "red", i.e. saving power. If a CPU is just doing email, it can run like a plain single core; if not, it can fire itself up for more power.
Larso - Friday, August 26, 2005 - link
The only way I can make any sense out of the graphs is if it's actually the CPU frequency that's scaling, hence the need for changing the voltage. Some kind of SpeedStep with faster transitions?
On the other hand, if the frequency is constant, the voltage is also constant and power consumption is determined by current draw. That is what a voltage regulator does, by the way: it keeps the voltage constant regardless of current draw.
Perhaps the point they are trying to make is that voltage regulation is more efficient now that it is on chip. But in that case the graphs are completely bogus.
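A minimal sketch of the first scenario described above, assuming the classic CMOS switching-power approximation (P ~ C * V^2 * f) and made-up operating points. It only illustrates why a regulator that can change voltage and frequency quickly between load levels would save power; it is not a description of how Intel's on-package regulator actually works.

    # Illustration only: why fast voltage/frequency scaling saves power.
    # Every number below is an assumption, not an Intel specification.

    def dynamic_power(c_eff, volts, hertz):
        """Classic CMOS switching-power approximation: P ~ C * V^2 * f."""
        return c_eff * volts ** 2 * hertz

    C_EFF = 10e-9          # assumed effective switched capacitance, in farads
    HIGH = (1.3, 2.0e9)    # assumed (voltage, frequency) under full load
    LOW = (0.9, 0.8e9)     # assumed (voltage, frequency) when nearly idle

    p_high = dynamic_power(C_EFF, *HIGH)   # ~34 W at the high operating point
    p_low = dynamic_power(C_EFF, *LOW)     # ~6.5 W at the low operating point

    busy, light = 0.2, 0.8   # assumed bursty workload: 20% busy, 80% light

    # A regulator too slow to track the bursts is stuck at the high setting;
    # one that can switch operating points quickly follows the load instead.
    avg_slow = p_high
    avg_fast = busy * p_high + light * p_low

    print(f"slow regulator average: {avg_slow:.1f} W")
    print(f"fast regulator average: {avg_fast:.1f} W")

Leakage and the regulator's own losses are ignored here; the point is simply that the energy saved depends on how quickly the operating point can follow the load, which is where faster transitions would matter.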
sprockkets - Thursday, August 25, 2005 - link
With that much integration you can pretty much forget putting your own processor in the motherboard; it looks more like it has to come preinstalled.
Didn't VIA show something like that to begin with? Their processor and NB on one package for a mini-ITX board?
Rock Hydra - Friday, August 26, 2005 - link
I don't see why you would. AMD doesn't need to have it integrated, so neither does Intel, I would assume. Also, I'm thinking this would bring down motherboard prices, but CPU prices are going to be higher as a result.
Doormat - Thursday, August 25, 2005 - link
IIRC, the original design for what became the Pentium M was known as "Timna". It was designed to integrate the northbridge onto the CPU (including a Rambus memory controller). The project was canned and its work got rolled over into Banias. Now here we are in 2005, and we are almost back to where we started: the northbridge in the same packaging as the CPU. The next logical step is, of course, to put the entire NB onto the same die as the CPU - which might be quite possible by 45nm.
bupkus - Thursday, August 25, 2005 - link
First abandoning low-end chipsets, and now simplifying motherboard design. What do these two events tell you, if anything?
Will this reap power consumption benefits for laptops? Cheaper, smaller, cooler desktop designs for the office? Also, see "Latest News" on "Vista's answer to PC power woes."
gimpsoft - Thursday, August 25, 2005 - link
I wonder if it will lower motherboards' complexity.
Lower motherboard prices, just like AMD boards.
http://www.dslreports.com/shownews/66802
Anyone see the article? Intel is going after Sony and M$ for control of the living room.
Hahaha, funny how by modding an Xbox you can have a $150 media center PC, and by November, $99.
Just Intel's chips alone cost that much, not including the motherboard, HD, or case. The next-gen PS3 and Xbox 360 can play back HD no problem for $300, plus a mod chip if you want extra functions.
Let's not forget DRM.
8NP4iN - Thursday, August 25, 2005 - link
What's up with Intel? Suddenly they are coming out with all this great stuff.
Look at the buzz they're generating... AMD had better have something up their sleeve or they'll go back to being a money-losing business again... They had their chance: 3 years being the top performer, and they only managed to take a bit of the server market, while in the desktop market people still don't know how superior they are to the P4... It's amazing how many people are still stuck on the MHz thing and think the Extreme Edition is the shit... I'm sad, lol, we are so few!
PrinceGaz - Tuesday, August 30, 2005 - link
I'm sure most normal people would be more than happy with an Intel Extreme Edition processor in their computer, assuming they could afford it. And if it was paired up with Intel Extreme Graphics, then they could be happy believing in the fantasy that they have the most powerful PC available. Until they tried to play a game, that is :)
KristopherKubicki - Thursday, August 25, 2005 - link
They consolidated a lot of their loser projects like a year ago - I am not surprised they have interesting stuff now.
Kristopher
Brian23 - Thursday, August 25, 2005 - link
What's up with all the fuzzy logic? (Pun intended.) Couldn't you get a decent camera?
TrogdorJW - Thursday, August 25, 2005 - link
There's a difference between having a decent camera and getting a decent photo in a dark conference hall at a distance of 100 feet or more. Some slides are only up for a few seconds, so if you take a shot and it turns out blurry, you don't get a second chance. Photoshop can only clean up a poor quality shot so much, unfortunately.
PrinceGaz - Tuesday, August 30, 2005 - link
"Some slides are only up for a few seconds, so if you take a shot and it turns out blurry, you don't get a second chance"That's why I always use continuous shooting mode. Whenever I take a photo, I really take about five over a couple of seconds, then decide which is best later. Still, better to have these blurry photos than none at all.
Leper Messiah - Thursday, August 25, 2005 - link
Anyone reminded of the PC-on-a-chip movement of the early nineties? Do we finally have enough technology to do this?
Intel's thingy (for lack of a better word) is interesting, but it's just a bunch of dies together on one piece of PCB. It's going to be one hell of a dense PGA too, with the extra data lines needed by the GPU (crappy as it is). For notebooks and other small PCs it's cool, but for desktops, eh...
erinlegault - Thursday, August 25, 2005 - link
I agree, the overall complexity of the PCB (which will have a lot more layers than the current 4- or 6-layer norm) will definitely be cost prohibitive. Like you said, notebooks, tablets, the PDA/BlackBerry's next iteration, cell phones, etc. will definitely improve.
The on-die graphics and northbridge would also be welcome. I have a feeling ATI is working on such a project. Imageon is just the start, even though it is geared for cell phones.
4AcesIII - Thursday, August 25, 2005 - link
That's all we need, ATI doing this too. Are they gonna make CPUs now too? They can't even put out a decent driver for their hardware; how are they going to SUCCESSFULLY combine all these things onto one PCB? What could be worse than Intel trying this, you say? Well, our friend here has pointed out what that could be.
Cybercat - Thursday, August 25, 2005 - link
No matter how close Intel moves the graphics chipset to the CPU, it will still perform like crap. Not like they care. They're making a killing off of them anyhow.
brownba - Thursday, August 25, 2005 - link
They don't care, because graphics performance doesn't matter for business Office users.
othercents - Thursday, August 25, 2005 - link
Typically motherboards allow you to up the voltage given to the processor. By adding the voltage regulator to the processor, you no longer have the option to up the voltage, which will keep you from being able to overclock the processor. This sounds like a great idea for business machines, but I'm not sure the on-die voltage regulator is going to help enough with heat to justify taking away the ability to overclock the processor.
Other
4AcesIII - Thursday, August 25, 2005 - link
Do these Anand folks understand that recent theory on focus? What's up with the pictures? You can't see a thing; what did they do, take a trip with their webcams? It's about as impressive as this crock that Intel is putting forth. What if I don't want a freakin' graphics core on the die or anything close to it? Why make me pay for what has historically been CRAP video when it will never be used? Just like this Viiv junk: MCE isn't a good thing in my opinion, and I'm not going to buy some processor that FORCES me to use a crippled OS.
kmmatney - Friday, August 26, 2005 - link
Whine, whine, whine...
joex444 - Friday, August 26, 2005 - link
It would be whining if there wasn't an alternative CPU to purchase...
Furen - Thursday, August 25, 2005 - link
Bah, it's not even on die, they're just packaging it together. I wonder if there are any significant improvements in performance, though I do understand it will lower motherboard complexity.
Ged - Saturday, August 27, 2005 - link
Yeah. Are they making all the components with the same process, or is it a 110nm voltage regulator, 90nm northbridge, and a 65nm CPU?
It's really not "on-chip", more like "on chip module".
The voltage regulator looks very nice. I like that. I guess the question is: will they get the same performance as if it were really "on-chip"?
TrogdorJW - Thursday, August 25, 2005 - link
On package generally means it will run at package speed (rather than bus speed). There would be a noticeable performance boost by going from a 400 MHz memory controller to a 2.0+ GHz memory controller. Hell, even Intel's GMA9x0 graphics could be pretty impressive if it were run at 2.0 GHz. Finally, putting everything on package can be seen as the first step towards moving on-chip. L2 cache with the Pentium Pro, anyone?
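If the memory controller really could run at something closer to core speed once it shares a package with the CPU, as suggested above, a back-of-the-envelope latency comparison shows why that would matter. All clocks and cycle counts below are illustrative assumptions, not measurements of any shipping part.

    # Rough latency sketch: crossing a 400 MT/s FSB to an external northbridge
    # vs. a hypothetical controller running near core speed on the package.
    # Every number here is an assumption for illustration only.

    CPU_CLK = 2.0e9    # assumed 2.0 GHz core
    FSB_CLK = 100e6    # a 400 MT/s FSB is quad-pumped off a 100 MHz base clock

    def to_ns(cycles, clk_hz):
        """Convert a cycle count at a given clock into nanoseconds."""
        return cycles / clk_hz * 1e9

    # Assume ~6 FSB clocks of arbitration/synchronization overhead for the
    # round trip to an external northbridge, vs. ~12 core clocks if the
    # controller sat on the package and ran near core speed.
    external = to_ns(6, FSB_CLK)      # 60 ns of bus-crossing overhead
    on_package = to_ns(12, CPU_CLK)   # 6 ns at core speed

    for name, t in (("external northbridge", external), ("on-package", on_package)):
        print(f"{name}: {t:.0f} ns overhead, {t * CPU_CLK / 1e9:.0f} CPU cycles")

Of course, as johnsonx points out further up, the package shown here still talks to the GMCH over the same FSB, so whether any of that overhead actually disappears is exactly the open question.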
stevty2889 - Thursday, August 25, 2005 - link
I think the main reason for their doing this is for even more power savings on a laptop. Battery life can be a big concern for a lot of laptop users. I would never buy a laptop with Intel graphics, but there are plenty of people who don't need more than that.
brownba - Thursday, August 25, 2005 - link
Regardless of whether there are any performance improvements, having the northbridge, graphics, and CPU on one package will be great for embedded systems, laptops, etc.
Speedo - Thursday, August 25, 2005 - link
Well, closer is always better, from an electrical point of view. Also, you saw the graphs of that voltage regulator in action.
// Frank
Larso - Friday, August 26, 2005 - link
Yeah, pretty graphs, but far from the truth IMO. Look at the current-technology power management graph: how can delivered power ramp up before user demand? Speculative power management? I think not.
Speedo - Thursday, August 25, 2005 - link
There could be a way for the mobo makers to control the voltage regulator.
Anyway, these improvements from Intel are interesting. I hope AMD has something similar going on...
// Frank