31 Comments
firewrath9 - Tuesday, November 5, 2019 - link
GLUUEEE
III-V - Wednesday, November 6, 2019 - link
This is pretty fucking far from glue.
katsetus - Wednesday, November 6, 2019 - link
Is not. This is the definitive reference design of glues. The YouTube tutorial of gluing things together, complete with Darude Sandstorm and hilarious accents.
notashill - Wednesday, November 6, 2019 - link
It's exactly the same kind of thing that Intel criticized AMD for by calling EPYC "4 glued-together desktop die".
patrickjp93 - Wednesday, November 6, 2019 - link
@notashill: no, it's nowhere close to the same thing. AMD used what is called glue logic architecture for the onboard networking and coherence protocols. This is actually much more open and flexible: you can implement any kind of communication protocol over the EMIB connection itself, as long as you know how to program an FPGA (Verilog and OpenCL).
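For anyone wondering what "programming the protocol yourself" looks like in practice, below is a minimal sketch using the channels extension from the Intel FPGA SDK for OpenCL (cl_intel_channels). It only illustrates the idiom: the io() interface names are hypothetical placeholders (real names come from the board support package), and whether a given EMIB-attached link is exposed as an I/O channel at all depends entirely on the board.

```c
// Minimal sketch, not production code: a single work-item OpenCL kernel
// that relays words between two board-level I/O channels.
// "link_in"/"link_out" are hypothetical BSP interface names.
#pragma OPENCL EXTENSION cl_intel_channels : enable

channel uint link_in  __attribute__((io("link_in")));   // stream in from the hard interface
channel uint link_out __attribute__((io("link_out")));  // stream back out

__kernel void relay(uint n_words)
{
    for (uint i = 0; i < n_words; ++i) {
        uint w = read_channel_intel(link_in);   // blocking read from the link
        // A custom protocol's framing or coherence logic would live here;
        // this sketch just sets a tag bit before forwarding.
        write_channel_intel(link_out, w | 0x1u);
    }
}
```

The same behavior could of course be written in Verilog as a state machine on whatever streaming interface the board exposes; the OpenCL path just trades fine-grained control for convenience.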
dullard - Wednesday, November 6, 2019 - link
Which is a retort from people calling Intel's dual-core chips "glued together" 10 years before that. Anand himself called Intel's chips glued together in 2005: https://www.anandtech.com/show/1656/2
And others:
https://forums.tomshardware.com/threads/intels-glu...
https://www.ifixit.com/Wiki/Computer_Processor_Cha...
https://www.guru3d.com/articles-pages/core-i5-750-...
FreckledTrout - Wednesday, November 6, 2019 - link
So was AMD's approach, but Intel called it glue. So I see no issue poking fun at them.
dullard - Wednesday, November 6, 2019 - link
Remember, AMD called Intel's chips glued together first: https://pcper.com/2006/11/intel-core-2-extreme-qx6...
"So, as many have said, including AMD, Intel’s Kentsfield processor is two dual core processors “glued together” and seems somewhat un-elegent."
HStewart - Monday, November 11, 2019 - link
Isn't this essentially the same thing the whole Zen architecture is based on: putting multiple 8-core processors together in the same CPU package?
HStewart - Monday, November 11, 2019 - link
Calling this product "Gluueee" is why I pretty much ignore the comments on articles here lately. I believe this design is where the original EMIB in the XPS 15 2-in-1 came from; it is a sign of the future and almost guaranteed to be reproduced by competitors.
Threska - Tuesday, November 5, 2019 - link
A market AMD shouldn't leave up to just Intel.
III-V - Wednesday, November 6, 2019 - link
It's not left to just Intel. Xilinx is the other big player.
YB1064 - Wednesday, November 6, 2019 - link
How does Intel's dev environment compare to Vivado? There is a reason big players like National Instruments are exclusively Xilinx based.
patrickjp93 - Wednesday, November 6, 2019 - link
Intel's development tools are vastly superior. You're not limited to just Verilog, and transpiling is easy. NI is Xilinx purely because it's an ISO company.
mpjohns3 - Wednesday, November 6, 2019 - link
Vivado is completely bug-ridden, but it's still the best tool on the market. Saying it's limited to just Verilog is completely inaccurate.
patrickjp93 - Wednesday, November 6, 2019 - link
It is definitely NOT the best tool on the market, no way. The OpenCL programming libraries for Altera FPGAs are so much more intuitive and efficient.
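For context on that claim, here is roughly what the host side of the Altera/Intel OpenCL flow looks like; this is a hedged sketch with error handling omitted, and the kernel name ("relay") and binary file are made up for illustration. The notable difference from GPU OpenCL is that FPGA kernels are compiled offline (with aoc), so the host loads a prebuilt .aocx binary via clCreateProgramWithBinary instead of building source at runtime.

```c
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    cl_int err;

    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_ACCELERATOR, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    /* Load the offline-compiled bitstream container (hypothetical file). */
    FILE *f = fopen("relay.aocx", "rb");
    fseek(f, 0, SEEK_END);
    size_t len = (size_t)ftell(f);
    rewind(f);
    unsigned char *bin = malloc(len);
    fread(bin, 1, len, f);
    fclose(f);

    cl_program prog = clCreateProgramWithBinary(ctx, 1, &dev, &len,
                          (const unsigned char **)&bin, NULL, &err);
    clBuildProgram(prog, 1, &dev, "", NULL, NULL); /* no-op for binaries */

    cl_kernel k = clCreateKernel(prog, "relay", &err); /* hypothetical kernel */
    cl_uint n_words = 1024;
    clSetKernelArg(k, 0, sizeof(n_words), &n_words);
    clEnqueueTask(q, k, 0, NULL, NULL); /* single work-item launch */
    clFinish(q);

    clReleaseKernel(k);
    clReleaseProgram(prog);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    free(bin);
    return 0;
}
```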
mpjohns3 - Wednesday, November 6, 2019 - link
Ah, well there's our disconnect. I'm speaking from a digital design perspective, writing custom RTL.
Kevin G - Wednesday, November 6, 2019 - link
Until this announcement, Xilinx had the advantage, as Altera had plans to leverage Intel's fabs even before they merged. Xilinx arguably surpassed Altera/Intel due to the 10 nm manufacturing woes. This is a catch-up move, but it is a very big move.
deil - Wednesday, November 6, 2019 - link
Maybe they will, but for now they need to focus on monetizing what they already have. THEN think about other parts.
yeeeeman - Wednesday, November 6, 2019 - link
It is funny how all of a sudden people start believing that AMD will rule the world. Hold on: AMD executed well for two years, and they are where they are because Intel has the worst problems a manufacturer could have. So that is a hollow victory when you overtake your competitor when he is unable to compete.
Intel had fab issues, not a design/talent/architecture/ideas issue. They just couldn't fab anything. Get that well into your mind.
Teckk - Wednesday, November 6, 2019 - link
So a competition ceases to be one when a player involved stops for whatever reason, and so no one wins. Got it.
Is it AMD's problem that Intel has fab issues? Should TSMC have halted their progress until Intel fixed their issues?
AMD is nowhere near ruling the world anytime soon, and I don't think OP meant that either, but they have good products. If you say TSMC fabbing + AMD design is inferior compared to Intel's, then it's a surprisingly good competition for such a product.
prime2515103 - Wednesday, November 6, 2019 - link
"So that is a hollow victory when you overtake your competitor when he is unable to compete."So, any dominance Intel has had over AMD has been hollow because AMD couldn't compete due to a lack of the kind of R&D resources Intel has. Got it.
patrickjp93 - Wednesday, November 6, 2019 - link
AMD had plenty of R&D resources and squandered them. They stuck to 28nm for ages when 20nm was being produced (which could have made Excavator mobile-competitive and improved their Opterons too). AMD cut corners with Bulldozer to ridiculous extremes and paid the price.
Spunjji - Wednesday, November 6, 2019 - link
This is an ill-informed comment. Firstly, AMD's R&D had nothing to do with process technology by the 28nm era, because they'd already spun it off to GlobalFoundries.
Secondly, GloFo (to whom AMD were contractually bound at that point) cancelled their 20nm node, which AMD could do nothing about.
Thirdly, TSMC and Samsung did have a 20nm node, but in both cases it was fairly awful because they hadn't integrated FinFET technology, so the leakage (already not great at 28nm) was atrocious. This was just for low-power mobile chips - for high-power desktop chips it would have been a major regression. It's noteworthy that AMD's graphics division and Nvidia's desktop GPU division both stayed the hell away from that node.
So - even if AMD had somehow been able to prevent GloFo canning 20nm and/or found a way to jump ship to TSMC without breach of contract, it wouldn't have been of any use whatsoever for their products.
patrickjp93 - Wednesday, November 6, 2019 - link
It's a perfectly informed comment. AMD CHOSE not to move from GloFo 28nm to TSMC 20nm when it became available, to at least give them some hope in the mobile space. Yes, AMD would have had to pay some cash to get out, but they could have negotiated from a position of strength. GloFo is now next to dead without AMD's business.
20nm's leakage was very low and tight, which made it terrible for high-clock chips, but the mobile game in CPUs and SoCs was maxing out in the low-to-mid 3GHz range, so the low leakage was actually good for temperatures and efficiency, especially with HDL in play.
Nvidia stayed away because they already had their custom 16nm (12) plans with TSMC in negotiation at that point, and AMD was failing spectacularly on both CPU and GPU fronts (again, mostly from cutting corners on the CPU side, which destroyed competitiveness, sales, and thus cashflow for further R&D).
And you skipped the core of my argument. AMD cut every corner they could on Bulldozer and failed massively for it. Many of the seniors who left said Excavator was well within design capability back at the start of the Construction Core family, and in fact the architecture never came close to the original planned capability because of poor management. The problem was their penny-pinching CEO. If Sandy Bridge had been up against something roughly Excavator class from the start, AMD would be in a much better position today.
prime2515103 - Wednesday, November 6, 2019 - link
So, if the problem is with engineering incompetence, the victory is hollow, but when it's from incompetent decision making, it's not. I'll write that down.
yeeeeman - Wednesday, November 6, 2019 - link
One more thing: Intel bought Altera for ~$17B. That is triple what AMD is worth. How could AMD not leave that up to just Intel?
flgt - Wednesday, November 6, 2019 - link
AMD has no expertise in FPGA design or design of the required supporting toolchains, so that's pretty much a non-starter.
Eliadbu - Wednesday, November 6, 2019 - link
There are two primary players in this market: Altera (Intel) and Xilinx. Getting into the FPGA market would take a lot of resources and would have AMD facing two well-established competitors with a lot of technology and IP, and the TAM is not even close to justifying the R&D and the risk of entering this market. I would even say AMD would get better value from entering the ARM and RISC-V markets than the FPGA market.
FreckledTrout - Wednesday, November 6, 2019 - link
I disagree. AMD is so small they need focus. They focus on server, desktop, and mobile, along with graphics; that should be enough for now.
Guspaz - Thursday, November 7, 2019 - link
Just a reminder that FPGAs are also becoming increasingly prevalent (in visible roles) in the consumer market. Many popular scaler and HDMI mods for classic game consoles are FPGA-based, and while those may be niche, products like those from Analogue are built in far larger quantities. Unfortunately, cost is still a big barrier: Analogue’s consoles are stuck at 140k LE for cost reasons, and while the MiSTer features several times as many, it’s only viable for consumer use because the project is based around a subsidized development board.
All that is to say it would be interesting to see coverage of FPGAs on the consumer electronics end of the spectrum, rather than simply the enterprise end.