61 Comments
mindless1 - Wednesday, June 7, 2006 - link
Nice coverage. These new toys leave me drooling. Now off I go to find a smallish nuclear reactor to power everything. LOL.
sri2000 - Friday, June 9, 2006 - link
You just need to get yourself a "Mr. Fusion" and you'll be all set. http://en.wikipedia.org/wiki/Mr._Fusion
bespoke - Tuesday, June 6, 2006 - link
Too bad the new DFI boards still have that hideous fan on the NF chipset - that little bugger runs at 4,000 to 5,000 RPM and is terribly loud.
I can't wait to upgrade to Conroe, ditch NF4 and get back to a quiet (yet nicely performing) PC.
Griswold - Wednesday, June 7, 2006 - link
Duh.. newsflash, there are also NF4 boards without fans - just not from DFI. What really sucks about the fan on the DFI board is that it breaks after 3 months and you end up replacing it with a better fan.
Stele - Wednesday, June 7, 2006 - link
quote: there are also NF4 boards without fans - just not from DFI
Although that's not much use if you're aiming to get a DFI board - which I think is where he was coming from. :)
For one reason or another DFI does not seem to be interested, or at least eager, to implement more/more effective passive cooling solutions on their products. Besides the lack of noise, passive cooling's greatest advantage is the fact that it doesn't have moving parts that are prone to failure like fans... as you found out.
At least they did take a unique step in implementing a digital integrated VR design on their board... its remarkable compactness and 'clean' layout without large electrolytic capacitors makes it really worth looking at for motherboard power circuits. Can't wait till more details of their implementation and tests thereof surface.
R3MF - Tuesday, June 6, 2006 - link
was the ECS miniITX A64 motherboard with an nForce chipset.
I would love to see an AM2 MCP61-S variant with two DIMM slots and a PCI-E 16x slot!
bldckstark - Tuesday, June 6, 2006 - link
Dual redundant power supplies in the Asus 1U server would seem to indicate that there are three or four power supplies housed within, but I believe the actuality is that there are only two, right? Redundant means secondary as I understand it. Dual redundant means two secondaries. Therefore dual redundant PSUs would include a backup power supply and then a backup of the backup power supply. Which is it? Are there 2 or 3 power supplies in that thing?
hoppa - Tuesday, June 6, 2006 - link
God I am so sick of hearing about x new card that is "even better than" the already $500 dual x1950.9 XFIRE XLI+ v2.0 Z
I miss the days when those cards, the best cards, maxed at $300, the awesome stuff was at $200, and you could do quite well for $150. Now $150 is a joke.
One43637 - Tuesday, June 6, 2006 - link
Is it just me, or do the GB motherboard offerings remind you of the Asus motherboards (A8N32 & P5N32) that were released last year...
Griswold - Tuesday, June 6, 2006 - link
High time the dorks at Nvidia and ATI start working on the power saving front. At least they seem to have that in mind for the follow-up generations... This only means that R600 and G80 won't make it into my computer until the following cards reduce the power envelope by quite a bit.
*shakes fist*
ceefka - Tuesday, June 6, 2006 - link
Just when Intel has attractive numbers on power consumption and AMD is also aiming for lower numbers, these GPUs negate all their effort and have you on the lookout for an even bigger PSU or an additional PSU. How can ATI or NVIDIA justify this? I'd like a good reason.
phusg - Tuesday, June 6, 2006 - link
I think you've already given the only reason (CPUs using less). I won't call it a 'good' reason, but all the same.
sri2000 - Tuesday, June 6, 2006 - link
When I read that line about 1000-1200 watt PSUs, all I could think of was Doc Brown in Back to the Future yelling "1.21 Gigawatts!?!"... (yeah I know - kilowatt != gigawatt... but still).
Griswold - Tuesday, June 6, 2006 - link
Who knows, maybe ATI/Nvidia will bundle plutonium batteries (as seen in deep space probes) with their cards in the future? Not only to feed the cards but mainly to power your cooling equipment...
segagenesis - Tuesday, June 6, 2006 - link
Exactly my first thought. The concept of a separate external supply for a graphics card is already ludicrous to me; if this becomes standard then I would hate to see future progression from there. Does anyone remember the Taco Town SNL skit? I desire better video as much as anyone else, but I draw the line when I need a nuclear power station to run it. ATI/nVidia must have forgotten that electricity DOES cost money.
Needless to say... impressive Conroe boards, for a premium.
Stele - Tuesday, June 6, 2006 - link
All in all an excellent article, good coverage and of course lovely photos :P
The Asus Pluto board is interesting indeed, especially the audio riser card. By the way, has anyone noticed that the riser card is nicknamed Charon? For the uninitiated, Pluto was the Greek god of the Underworld, separated from the living world by the river Styx. Charon was the boatman who ferried the dead across that river to Pluto's domain. The two have hence been generally closely associated with one another. In astronomy, Pluto's moon is also named Charon, for the same reason. Nice bit of humour on Asus' side :P
punko - Tuesday, June 6, 2006 - link
Actually, Pluto is the Roman name for the Greek God Hades. Charon was indeed the ferryman of the Underworld, but the river wasn't Styx but Acheron. I believe that there were four famous rivers in the Underworld; Lethe was another, but I can't remember the fourth.
Stele - Tuesday, June 6, 2006 - link
quote: Actually, Pluto is the Roman name for the Greek God Hades.
It is another Greek name for Hades (from Greek Πλούτων, Plouton), but it was adopted by and hence much more commonly associated with Roman mythology.
quote: Charon was indeed the ferryman of the Underworld, but the river wasn't Styx, but Acheron
Oh heh! Styx was the popular misconception... forgot about it. Thanks for correcting me! There were five rivers of Hades: Acheron (the river of sorrow), Cocytus (lamentation), Phlegethon (fire), Styx (hate) and... Lethe (forgetfulness). I guess that explains why you remember Lethe but forgot the others! ;)
DigitalFreak - Tuesday, June 6, 2006 - link
Didn't really see anything new here.
Audio on a riser card - DFI has been doing it for a while now.
Lighting around the I/O shield - Been done before.
Debug code readout - Been on numerous boards for quite some time. Asus just made it viewable from outside the case.
Stele - Tuesday, June 6, 2006 - link
No, I wasn't saying that anything was new in the industry, but new at least to Asus. IMHO the concept motherboard was an interesting exhibit, even if it isn't the only one of its kind.
More to the point, what's noteworthy is that Asus - along with other manufacturers as well, hopefully - is considering following DFI's Karajan module concept. That can only be a good thing, as long as the manufacturers sincerely mean to improve noise immunity and not just throw it in as a gimmick to charge a premium for.
As for the other ideas... interesting but not terribly unique, as you've already pointed out.
phusg - Tuesday, June 6, 2006 - link
No, transparent side windows already did that ;-)
OCedHrt - Tuesday, June 6, 2006 - link
So what happened to the ASRock boards (supposedly page 18)?
bongobear - Tuesday, June 6, 2006 - link
I'd like to see what ASRock have lined up as well; however, the mini-ITX Albatron boards that were shown in their place were a nice surprise. If they can tweak those a little and get them out at a nice price, very tempting....
JarredWalton - Tuesday, June 6, 2006 - link
Anand has a ton of pictures and information from his trip, so he'll probably provide a second update later. However, I've updated the page links to correctly reflect the Albatron content on page 17.
Stele - Tuesday, June 6, 2006 - link
quote: ...in that they are not standard electrolytic capacitors filled with liquid. All of the capacitors [...] use a solid chemical compound designed to better withstand higher temperatures, thus preventing leaking or other capacitor related failures.
There are actually two major types, one (the more common) being filled with solid compound as noted, and are simply called solid electrolytic, or solid aluminium electrolytic capacitors. The other type uses non-solid electrolyte. Both types are better than the usual electrolytic capacitors of course in terms of resistance to high temperatures - generally around 1,500-3,000 hours at 105°C-125°C. Another useful characteristic is that they perform well in high-frequency, high-capacitance applications, which is one reason they've often been used on graphics cards, for example.
They are however not immune from leaking and other capacitor related failures... it's just that they can survive prolonged exposure to higher temperatures for longer periods of time than the regular electrolytic capacitors before they too fail.
quote: Gigabyte still only offers a 3-year warranty on this board, despite the technically more reliable capacitors.
This, unfortunately, is a non sequitur, i.e. it does not follow. Component choice is but one of many factors in determining the stability, performance and reliability of a product. A cheapo backyard manufacturer could boast 16-phase power, all-Nichicon/Rubycon solid electrolytic caps, Philips MOSFETs and AMP connectors all around, but would still fail if the circuits were designed by a half-baked engineer fresh out of college and poorly manufactured. Of course, it could be used by said cheapo manufacturer as a gimmick to fleece gullible consumers (of whom unfortunately there are very many) who rocket to seventh heaven on seeing such big names. Or a not-so-cheapo manufacturer could make their product more attractive by using such flash while cutting corners in areas where consumers probably won't notice within perhaps the first 6 months of use.
Speaking of gimmicks, as mentioned above these capacitors are more useful in high-frequency, high-temperature applications. Naturally not every single circuit or square inch of a motherboard would call for them, and hence the extra cost of 100% solid electrolytic capacitors is unnecessary - they cost on the order of 3-5 times as much as regular electrolytic caps, and when added up the savings could certainly be better used elsewhere on the board. For now at least, other than the circuits that really could benefit from solid electrolytic capacitors, I would agree with the other manufacturers that indiscriminately plastering the board with them is rather more marketing gimmick than engineering.
Viditor - Tuesday, June 6, 2006 - link
Anand/Jarred... I have a few questions if you get time.
1. How is the RAS functionality for Woodcrest with the Asus server board?
2. Do the FBDIMMs require active cooling, and if so how much?
3. When Asus released the A8N32-SLI, they hinted at releasing more boards with 8-phase power... have they/will they?
4. Any chance of finding out what the functionality is for the 1207 pins on Socket F?
Cheers! And thanks for the article...!
bob4432 - Tuesday, June 6, 2006 - link
After all this time and the floppy still lives on...... ;)
Calin - Tuesday, June 6, 2006 - link
A floppy is still the only way to load certain drivers for hard drive controllers in Windows (during installation). And as long as that is the case, the floppy will live on.
DigitalFreak - Tuesday, June 6, 2006 - link
Fortunately, this problem goes away with Vista. I'm hoping that in a couple of motherboard revs, those damn things will go away.JarredWalton - Tuesday, June 6, 2006 - link
I linked in higher res images for some of the more interesting photos Anand took. If you would like to see any others in more detail, let me know (via email) and I will see about adding those. Note that some images are left in "lo-def" because it reduces blurriness. :)
Take care,
Jarred Walton
Editor
AnandTech.com
lopri - Tuesday, June 6, 2006 - link
quote: Unfortunately due to changes in the VRM requirements for Conroe, no current LGA-775 motherboards will work with the new processor.
Riiiiiiiiiiiiiiiiiight. I thought the next gen Intel chips are supposed to consume less power? I admit that I'm ignorant when it comes to such things, but common sense tells me that a VRM which supports a 130W TDP should have no problem handling an 80W TDP. Could anyone elaborate?
DigitalFreak - Tuesday, June 6, 2006 - link
It happens every time Intel releases a new CPU. The then-current chipset will support the CPU, but then the "VRM issue" pops up. It's a scam by Intel to force you to buy a new motherboard (preferably with their chipset, of course). AMD doesn't seem to have this problem.
ShapeGSX - Tuesday, June 6, 2006 - link
Yeah, slam that AM2 CPU into your old Socket 939 board. It will work! AMD doesn't require you to buy a new motherboard to buy their latest and greatest!
Conroe is a huge departure from the P4. The fact that it works at all in LGA775 and with old chipsets is impressive.
phusg - Tuesday, June 6, 2006 - link
Bol(*#*&s, digitalfreak is right: of course this is a scam. Just because AMD is now using the same scam doesn't mean this is the way it has to be. I'm sure it wouldn't cost too many transistors to build in a legacy mode on the CPU die.
Spoelie - Tuesday, June 6, 2006 - link
You guys are missing the point: the X2 dual core Socket 939 CPU with an 89W TDP works on the Socket 939 motherboards that were available at launch 2 years ago, when 0.13µ Newcastle CPUs were the only game in town - barring any BIOS updates, of course.
Same thing for Socket A, although not at the right FSB speed depending on the chipset.
AMD only forces you to upgrade if there is a significant feature difference that can't be worked out in the current socket. It seems that Intel engineers its CPUs without 'socket environment' consideration and then engineers a specific chipset for them, while AMD engineers its CPUs to fit in a specific socket environment for which it defined a standard long ago.
ShapeGSX - Tuesday, June 6, 2006 - link
So you don't think that lower power consumption due to the new VRM is a significant feature difference? It is all the technical press is talking about these days.
The LGA775 socket was launched in June of 2004. So, here we are 2 years later with the same socket for Intel, but a change to the voltage regulator. And you can run dual core processors on what was originally a socket meant just for one core, just like AMD.
Socket 939 was also launched 2 years ago, and it is already obsolete.
Sounds like both companies have similar track records, as of late. Perhaps the march of technology these days simply will not allow for using the same motherboard for more than two years.
JarredWalton - Tuesday, June 6, 2006 - link
It could be that the VRM requirements are for lower voltages or cleaner power or something along those lines. Also, just because Conroe runs cooler than Presler doesn't mean it can't have more stringent voltage requirements. I wouldn't be surprised if this is less a case of *can't* run Conroe and more one of can't run it 100% *stably*. The newer 975X chipsets/motherboards will probably have a few slight tweaks to fix some errata encountered with current 975X designs.
giantpandaman2 - Tuesday, June 6, 2006 - link
Conroe has new power states that aren't supported by older motherboards. For example, a Conroe would try to go to "sleep" and the motherboard wouldn't supply the right level of power to it.
soydios - Monday, June 5, 2006 - link
1 kilowatt power supply? Bloody hell.
I like all the pictures. But I'm still waiting for the Asus AM2 RD580.
highlandsun - Monday, June 5, 2006 - link
Any news on iRAM2 or anything similar?
Missing Ghost - Monday, June 5, 2006 - link
Wow, all I can say is wow. I am quite impressed with the Gigabyte desktop motherboards. From the pictures it looks like a better design than even what DFI would do. Also, the ASUS Socket F board looks excellent. Quite impressive, since I used to think that ASUS' server boards were inferior to the likes of Supermicro/Iwill/Tyan.
krwilsonn - Monday, June 5, 2006 - link
Page 18 of the article seems to be mixed up, since the Albatron boards are showing up instead of the ASRock ones.
Regs - Monday, June 5, 2006 - link
Actually consider what AMD is doing at all. Boy, times have changed! ;)
I'm a lifelong AMD fan too. Short life, but lifelong.
bob661 - Monday, June 5, 2006 - link
Where have you been? It's been like that for quite a few years now. Remember when DDR2 was actually on the market? Who wasn't using DDR2 then?
bob661 - Monday, June 5, 2006 - link
Figures.
bob661 - Monday, June 5, 2006 - link
quote: There is a lot of concern about the availability of Conroe, as Intel has only committed to around 25% of its mainstream and high end desktop processor shipments being Conroe by the end of this year. After Dell and HP buy up all the Conroes they will want for their systems, there simply may not be any left for the end user to buy in the channel market. Alternatively, there may end up being some supply in the channel market but at significant markups due to a shortage.
Interesting. Looks like Conroes may come at a premium until Intel can increase production.
shabby - Monday, June 5, 2006 - link
quote: Remember that the HDMI connector can carry both audio and video data, and by outfitting cards with a header for internal audio passthrough (from your soundcard/motherboard to the graphics card) you take advantage of that feature of the HDMI specification
I don't get it - what is the point of sending audio to the monitor?
Furen - Monday, June 5, 2006 - link
It's meant to be sent to an HDTV. Monitors can just use DVI for digital signaling.
shabby - Tuesday, June 6, 2006 - link
And what's the point of that, then? It's supposed to go to the receiver, not the TV.
OrSin - Tuesday, June 6, 2006 - link
Do you even know why HDMI exists? Most HDTVs that have HDMI connectors also have audio out.
You connect everything to your TV and send out only the signals you need. My guess is you don't have an HDTV.
shabby - Tuesday, June 6, 2006 - link
Yes I have one, I just don't see the point of sending audio to the TV and then back out to the receiver, when you can just send it directly from the HD DVD/Blu-ray player to the receiver instead of from the player to the TV and then back to the receiver.
Sending audio to one more spot might degrade quality - I said might - so why not send it directly to the receiver?
ShapeGSX - Tuesday, June 6, 2006 - link
I have an HDTV, and I never use the TV's speakers. Why have HDTV but crap 2-channel audio? Instead, I connect the digital audio to my receiver for 5.1 surround.
epsilonparadox - Tuesday, June 6, 2006 - link
If you have an HDTV with more than one HDMI port and an SPDIF out, you can connect multiple HDMI sources to the TV and take an optical cable from the TV to your receiver, since HDMI carries 5.1 channel audio.
TauRusIL - Tuesday, June 6, 2006 - link
Guys, the HDMI cable will go to your receiver carrying both video and audio, and then the receiver will send the video out to your HDTV. That's the setup that makes sense to me. No point in sending audio to the HDTV directly. Most newer generation receivers include HDMI switching already.
namechamps - Tuesday, June 6, 2006 - link
Lots of people will want to run the HDMI to their HDTV. Just because you don't understand it doesn't mean there isn't a very good reason.
Here is my setup. I have a cable connection connected directly to the HDTV (CableCARD slot), and my Xbox 360 hooked up via HDMI. My next project is to add an HD-DVD player, and when HDMI video cards become common I will have an HTPC hooked up by HDMI also. So I've got 4 inputs (internal cable tuner, Xbox 360, HD-DVD player, and HTPC on HDMI) hooked to the HDTV. With me so far? Now my TV can play the audio directly (yeah, it's 2.1), but there are times when I don't want/need the loudness of my receiver. Late at night or when listening to the news, Dolby Digital 5.1 is just overkill.
NOW here is the part you don't understand (and are therefore quick to bash others over). Most HDTVs (mine included) have an SPDIF OUTPUT. If I hit monitor mute on my remote then the TV speakers shut off and any digital audio goes directly to the SPDIF. So regardless of whether I am watching terrestrial HDTV, HDTV cable, regular cable, HD-DVD, the Xbox 360 or eventually anything from my HTPC, the TV passes it straight through to the SPDIF output.
So with 1 HDMI cable per source plus only 1 TOSLINK optical cable to my receiver, I have hooked up ALL my audio & video gear.
If I listened to "experts" like yourself then I would have three limitations:
1) many more cables
2) no way to use the TV speakers when I just want quiet, simple 2.1
3) no high quality way to handle digital cable and OTA HDTV without 2 more set-top boxes.
Now the largest advantage of a setup like this is simplicity. Remember, you may be an audio/video expert but 90% of consumers are not. With HDMI they can connect EVERYTHING to their TV. If they don't have a home theater system, no problem; if they do, then they connect 1 cable from the TV to the receiver and they are done. Compare that to the rat's nest of cables behind most entertainment centers and you can see why the industry is pushing HDMI.
CKDragon - Tuesday, June 6, 2006 - link
OK, legitimate questions here; please don't feel the urge to own me. :P
1) Let me make sure I understand you, first: With the setup that you describe, it seems that you would have one less HDMI cable and only 1 total SPDIF cable, correct?
2) Are HDTVs with 4+ HDMI ports common/reasonably priced? I haven't made the HD plunge, but it seems as though most of the ones I browse at have 2.
3) The method you describe sounds very efficient and I believe I understand the benefits for HDMI components. How do older components that only have analog cables fit into the equation? I'm certain that you could route those through your receiver, but I'd imagine that takes away from the fluidity of your setup. Will the HDTV pass even analog audio signals out to the receiver? In your post, you mentioned digital audio specifically being passed, so I was hoping you could clarify.
4) The digital audio being passed through the HDTV, does it degrade sound quality at all? I remember years ago when I bought my receiver I had to look for a certain quality specification regarding component video cable switching to make sure that the receiver wouldn't degrade the video signal upon pass through. I was wondering if this was a similar situation.
Sorry for the length, but if you or anyone else could answer this I'd be appreciative.
Thanks,
CK
namechamps - Thursday, June 8, 2006 - link
Will try not to own anyone...
1) Not sure if I understand the question, but the total # of HDMI cables = # of HDMI sources. They all connect to the HDTV. There is only 1 SPDIF (optical/TOSLINK) cable and it runs from the HDTV to the receiver.
2) No, very expensive. However, there are some with 2-3 HDMI ports that are more reasonably priced. Expect this to change. All future models of HDTV seem to be including more & more HDMI while eliminating DVI and other ports. I would expect that soon most HDTVs made will have 3-4 HDMI ports plus 1 or 2 of each "legacy" port (composite, S-Video, component).
3) My TV will digitize analog audio and route it over the SPDIF out, however I haven't ever used that feature. The digital cable from the CableCARD slot does need to be converted and passed to SPDIF, I assume, however I haven't experienced any audio issues. The best way to find this out is to stay away from Best Buy and go to a real home theater store. Those AV experts can help you sort through all the options.
One side note: HDMI 1.2 (the current version) only supports "single link" and up to 5.1 audio. The newer HDMI 1.3 (being developed) will support "dual link" and more advanced audio like Dolby TrueHD, and a couple of others. I don't find this to be a limitation, but some users may.
4) There is no degradation, because the signal is digital and the HDTV simply allows it to pass through unchanged. Now if you have analog audio sources there may be more of an issue, but I don't know about that.
ChronoReverse - Monday, June 5, 2006 - link
If it turns out the low-end Conroes will overclock very well (I suspect they might), an Intel purchase might be on the horizon for me (my last Intel chip was a Tualatin).
I've just sold my Athlon64 mobo and CPU while I can still get a reasonable price for them. If I can't get Conroe for a good price, then I'll pick up the used X2s that should be flooding the For Sale forums =)