Wake me up inside when I can get CDNA 5 in an M.2 module. https://www.cnx-software.com/2022/03/22/quadric-de...

A good move would be to get these Matrix Cores in their RDNA-enabled APUs.
As for M.2 as a compute-accelerator form factor, I don't really see the point. Sure, it fits an SBC better than a big PCIe card, but most desktop PCs have enough PCIe slots and good enough cooling that PCIe cards make more sense.
I think Quadric is late to the game in offering a dedicated chip for edge AI processing at this stage. Most customers would probably opt for an SoC with an integrated AI engine. Maybe that's another option they're pursuing, but most SoC vendors already have their own AI engines.
Anyway, it'd be interesting to see some benchmarks of their stuff. I think it probably won't stack up too well against next-gen APUs equipped with tensor/matrix cores.
Boring. Wake me up when AMD can do transparent multi-GPU like M1 Ultra.

For HPC applications, there's less need for that. It's mainly something you have to tackle for graphics, because graphics is much harder to partition neatly and involves more global data movement.
So, my prediction is that the first AMD GPU to do that will probably be in their RDNA line, not CDNA.
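To make that concrete, here's a minimal HIP sketch (illustrative only; the sizes and kernel are my own assumptions) of the explicit partitioning HPC codes typically do instead, splitting one array evenly across every visible GPU:

    #include <hip/hip_runtime.h>
    #include <vector>

    __global__ void scale(float* x, size_t n, float a) {
        size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
        if (i < n) x[i] *= a;
    }

    int main() {
        int ndev = 0;
        hipGetDeviceCount(&ndev);
        if (ndev == 0) return 1;
        const size_t n = 1 << 24;
        const size_t chunk = n / ndev;   // even split; remainder handling omitted
        std::vector<float*> buf(ndev);
        for (int d = 0; d < ndev; ++d) {
            hipSetDevice(d);             // each GPU owns one slice of the domain
            hipMalloc(&buf[d], chunk * sizeof(float));
            hipLaunchKernelGGL(scale, dim3((chunk + 255) / 256), dim3(256), 0, 0,
                               buf[d], chunk, 2.0f);
        }
        for (int d = 0; d < ndev; ++d) { // wait for every device (cleanup omitted)
            hipSetDevice(d);
            hipDeviceSynchronize();
        }
        return 0;
    }

Since each slice is independent, nothing here needs the driver to pretend two GPUs are one; that's the luxury HPC has that a real-time frame renderer doesn't.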
Is that really true? There are big servers in HPC that have multiple GPUs. For instance, the DGX-2 (https://www.nvidia.com/content/dam/en-zz/Solutions...) has 16 GPUs controlled by one CPU motherboard that has 2 Xeon sockets. I worked for a company that used a DGX-1, which had 8 GPU sockets and 2 CPU sockets, and the system balance was just fine (with some thought about how to encode the training data) for training CNNs.
Wake me up when September ends... It's March? Oh, uhh.
I'd be curious to see if servers with PCIe accelerators could ever use tech like water blocks with quick-detach fittings and the new 12-pin power connector to close that huge power-budget gap with mezzanine cards.
Ryan, the MI100's GPU is codenamed Arcturus. It fits in with their stellar-based naming scheme and is plastered all over the Linux driver code.
> packed FP32 isn’t free. ...packed operands need to be adjacent and aligned to even registers.
This is like how previous generations dealt with FP64. Your FP64 operands would be split across register pairs, or so I've heard.
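As an illustration (a HIP sketch of my own, not AMD's documented recipe), operating on float2 gives the compiler a chance to emit packed-FP32 instructions, provided the two operands end up adjacent in even-aligned register pairs:

    #include <hip/hip_runtime.h>

    // Working on float2 lanes invites packed FP32 (e.g. v_pk_fma_f32 on CDNA2),
    // which needs its operands in adjacent, even-aligned registers. Whether the
    // compiler actually emits it depends on register allocation; check the ISA dump.
    __global__ void axpy_packed(const float2* x, float2* y, float a, size_t n) {
        size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
        if (i < n) {
            float2 xi = x[i], yi = y[i];
            yi.x = a * xi.x + yi.x;  // these two FMAs are the candidate pair
            yi.y = a * xi.y + yi.y;  // for a single packed instruction
            y[i] = yi;
        }
    }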
The decision to stick with PCIe 4.0 seems a bit weird, since Genoa will presumably have PCIe 5.0 and there should be some considerable overlap in their respective product cycles. I guess AMD is figuring most of their big customers will use OAM form factor, with Infinity Link for CPU communication.
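For scale (rounded, per direction): PCIe 4.0 x16 runs 16 GT/s across 16 lanes, roughly 32 GB/s, while PCIe 5.0 x16 doubles that to roughly 64 GB/s. Staying on 4.0 leaves half the potential host bandwidth on the table.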
Anyway, AMD really needs to get some of these Matrix Cores into their RDNA product line, if they want the open source community to embrace them and broaden software support for them. Until then, I'm really not much interested. Regardless, my next dGPU will probably be Intel, due to AMD's poor track record of RDNA support for their compute stack.
“Anyway, AMD really needs to get some of these Matrix Cores into their RDNA product line, if they want the open source community to embrace them and broaden software support for them.”
This isn't relevant for server GPUs, nor are Matrix Cores at this point relevant for gaming GPUs. Simply buy an Instinct GPU, then, if you want them.
“Regardless, my next dGPU will probably be Intel, due to AMD's poor track record of RDNA support for their compute stack.”
That's pretty funny, given the fact that Intel so far hasn't delivered any dGPUs and has proven nothing. Probably a fanboy comment.
> nor are Matrix Cores at this point relevant for gaming GPUs
Don't agree. Maybe not fp64, but fp32 and lower precisions have applicability to games. At the lowest precisions, they can be used for AI-driven upscaling, like Nvidia's DLSS.
> Simply buy an Instinct GPU, then, if you want them.
How many developers have that kind of money? This is why Nvidia is winning the GPU-compute race: they support development on their *entire* stack of GPUs, even down to the near-bottom tier of gaming GPUs and the Jetson development boards.
> That’s pretty funny given the fact that intel so far hasn’t delivered any dGPUs
They have a long track record with iGPUs. At my job, we use their iGPUs in a shipping product.

> probably a fanboy comment.

AMD can ignore these sentiments at its peril.
“Don't agree. Maybe not fp64, but fp32 and lower precisions have applicability to games. At the lowest precisions, they can be used for AI-driven upscaling, like Nvidia's DLSS.”
“They have a long track record with iGPUs. At my job, we use their iGPUs in a shipping product.”
So you're talking about track records and iGPUs and comparing dGPUs to iGPUs of a vendor that never delivered any good dGPUs. I think it's safe to say that you're a) not a dev b) trolling and c) never seriously used an AMD Pro card. Last time I checked, drivers were pretty good; nobody cares about ancient track records from yesteryear. And hearsay isn't relevant either. You used the words “track record” instead of “my experience with”, so maybe stop talking out of your ass for a second. Intel is a clusterfuck when it comes to GPUs, a meme at best and a disaster otherwise. Nvidia is good but also very expensive, and locks you into their stuff with CUDA and software limitations. AMD has great open source drivers for Linux; Nvidia isn't even comparable. You'd know that if you were a serious or real dev.
Maybe they could've been used for something else, but not for FSR 2.0, as AMD aims for compatibility and it won't use any special cores. Probably the same as with FSR 1.0, and if you ask me, that's the way to go, not Nvidia's.
“How many developers have that kind of money? This is why Nvidia is winning the GPU-compute race: they support development on their *entire* stack of GPUs, even down to the near-bottom tier of gaming GPUs and the Jetson development boards.”
A coincidence which happened because Nvidia needs tensor cores to do DLSS and RT denoising. A coincidence, plus intentional proprietary lock-in that ties you artificially into their stuff.
I don’t think you need tensor cores to do development for anything, not anything a Radeon Pro can’t do as well. And then we have to wait and see if RDNA3 isn’t coming with some sort of AI cores as well, but at this point I guess it’s unlikely.
> So you're talking about track records and iGPUs and comparing dGPUs to iGPUs of a vendor that never delivered any good dGPUs.
Yes, because they're far more similar than they are different. 95% of the software stack needed for one is also needed for the other.
If you paid attention to my concerns, I'm principally interested in software issues. I don't need Intel's dGPUs to have the best performance, as long as the software support is there for what I/we need and the perf/$ and perf/W are reasonably competitive.
> I think it’s safe to say that you’re a) not a dev b) trolling
This is precisely a troll comment, which is why I'm not going to address it or any similar attempts to impeach my credentials. You're free to disregard my statements and opinions; however, I owe you nothing.
> c) never seriously used an AMD Pro card.
This misses my point exactly. Nvidia and Intel fully support development on virtually their entire hardware stacks. Why should I have to pay $$$ for an AMD Pro card? If AMD wants developer mindshare, they need to reach developers where they *are*, not blame developers for not beating a path to their door.
> Last time I checked, drivers were pretty good,
Where's ROCm's RDNA support?
> nobody cares about ancient track records from yesteryear.
Yes, they do. AMD loses hearts and minds when people buy RDNA cards and wait *years* for ROCm support that never materializes. Or when it *breaks* what had been working on earlier generations, like Vega and Polaris.
You've clearly never read the endless threads of people trying to get/keep their AMD GPUs working in DaVinci Resolve, for instance. Many are continuing to run heavily-outdated images, out of (very real) fear of breaking their setups.
I know that's slightly off-topic, but not really, since I'm talking about the instability of hardware support in AMD's GPU-compute stack.
> And hearsay isn't relevant either. You used the words “track record” instead of “my experience with”
If you don't value my perspective, that's not *my* problem.
> Intel is a clusterfuck when it comes to GPUs
It's funny how you say this, right after attacking *me* for using hearsay and ancient track records.
The fact of the matter is that Intel has been among the first to support each OpenCL release since about version 2.0. AMD seemed to stall out around 2.1 and was nearly 3 years late on OpenCL 2.2 support. After several minutes of searching, I haven't even found any clear evidence that AMD supports OpenCL 3.0, yet both Intel and (even!) Nvidia do.
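If anyone wants to check rather than take my word for it, here's a minimal sketch using the plain OpenCL host API (nothing vendor-specific) that prints the version string each installed driver reports:

    #define CL_TARGET_OPENCL_VERSION 300
    #include <CL/cl.h>
    #include <cstdio>

    // Ask every installed OpenCL platform which version it actually claims.
    int main() {
        cl_platform_id platforms[16];
        cl_uint count = 0;
        if (clGetPlatformIDs(16, platforms, &count) != CL_SUCCESS || count == 0)
            return 1;
        for (cl_uint i = 0; i < count; ++i) {
            char name[256] = {0}, version[256] = {0};
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                              sizeof name, name, nullptr);
            clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION,
                              sizeof version, version, nullptr);
            std::printf("%s -> %s\n", name, version);  // e.g. "OpenCL 3.0 ..."
        }
        return 0;
    }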
> AMD has great open source drivers for Linux
So does Intel.
> You’d know that if you were a serious or real dev.
How do you know I don't? You didn't ask.
Open source drivers are definitely nice. They're not a deal-breaker for me, though. What I care about most is:
1. Hardware support - the compute stack should work on everything from whatever GCN-based APUs people have to the latest gaming GPUs, both for developer access and so that developers have some assurance they'll be able to support customers with this hardware.
2. API support - I have no more interest in using Hip than I do in using CUDA. I only trust Hip to be any good or stable on AMD hardware, which means it effectively locks me into AMD, even if it's open source. I'm willing to use open source libraries, like deep learning frameworks, that have a Hip or CUDA backend, however. But I will not personally or professionally invest in developing for a vendor-specific API.
3. Platform support - right now, all I care about is Linux/x86. I think AMD hasn't required kernel patches to use new ROCm releases in a couple of years, which is definitely progress.
> I don’t think you need tensor cores to do development for anything
That's not my point. My point is that if AMD wants broader support for hardware features like their Matrix Cores and other new CDNA features, they should worry about getting these features into the hands of more developers via gaming cards.
With Nvidia GPUs recently being so ridiculously expensive and hard to find, AMD has been squandering a huge opportunity through their poor/absent ROCm support and missing hardware features on RDNA GPUs.
> not anything a Radeon Pro can’t do as well.
There's nothing magic about Radeon Pro. You know that, right? They're just the same as AMD's gaming GPUs, with maybe a few extra features enabled, maybe some more RAM, and costing lots more $$$.
> we have to wait and see if RDNA3 isn’t coming with some sort of AI cores as well
Yup. I'm willing to keep an open mind. AMD has finally started coming around with ROCm support of RDNA2, so it's still possible they'll turn over a new leaf.
Here's just an example of you trolling: https://en.m.wikipedia.org/wiki/ROCm

It is clearly stated that all RDNA and GCN 4 and 5 are supported, pro card or not. And of course support is better for Pro cards; that's why you buy Pro cards, to get more support. No different with Nvidia. Is CUDA better supported than ROCm? Yes, but ROCm has made a lot of progress and gets better every day. Is Intel even comparable to AMD or Nvidia? No. They simply don't have any noteworthy GPUs and thus cannot even be compared to those. Intel's new dGPUs have been delayed for almost 4 months now, and do you know why? Because their drivers suck. It's well known the products got delayed because of that. They will release their shit so late that it will compete with current gen AND RDNA 3 and ADL; nobody will buy it. Then later this year their CPUs will get destroyed by Zen 4. Hard days for Intel fanboys.
> It is clearly stated that all RDNA and GCN 4 and 5 are supported, pro card or not.
This proves you don't actually know anything about ROCm, because that's an utter fiction.
Their documentation is about as clear as mud about which devices are actually known to work, and the recent history of ROCm has been a litany of breakage and dropping support for even some recently-sold GPUs.
Even today, this is still a live issue: https://github.com/RadeonOpenCompute/ROCm/issues/1...

As of now, AMD cannot even answer a simple question about this. In the docs, the issue's author found only a list of 7 Instinct and Pro models.
> ROCm has made a lot of progress and gets better every day.
According to whom? And how do you expect me to believe this statement, given that you obviously know so little about ROCm that you cite Wikipedia over anything on amd.com or the ROCm GitHub?
> Is Intel even comparable to AMD or Nvidia? No. They simply don't have any noteworthy GPUs and thus cannot even be compared to those.

Spoken like a true gamer.
And I wanna add: you made a few fair points, but your shilling for Intel and too many negative comments about AMD, which are simply outdated blabber, leave a bad taste.
Matrix Cores won't come to consumer cards. AMD isn't into locking consumers into just one gen of product; thus they aren't needed for FSR 2.0, and 2.0 will run on any hardware that also supports 1.0 (most probably).
How is anything I said "shilling for Intel"? The only good thing I said about them was that their compute stack support is better, which I supported with specific points about OpenCL support compared with AMD. Oh, and I pointed to *actual benchmark results* to correct Spunjji, who usually knows their stuff.
If I were trying to shill for Intel, don't you think I'd be a bit more effusive? No, apparently you don't think past seeing something you don't like said about AMD.
At first, I thought maybe you were an AMD employee, but I actually know AMD employees and I'm getting the clear sense that you wouldn't meet their standards.
Yep, maybe I'm wrong about ROCm, maybe not. But at least I can admit it, while 3 people call you biased and you're still deflecting like a kid. And then, I didn't compare real GPUs that have been released for years to some iGPU trash and unreleased stuff from Intel, which really just shows that you're a huge fanboy of Intel. I really don't care about you being critical of AMD, or let's say I barely care. But what I can't accept is you praising Intel for unreleased stuff or trashy iGPUs at the same time. That's utterly dumb and unacceptable. We will see how good their GPUs will be; for now they've postponed the release for over 2 years because of a mix of terrible performance and terrible drivers. Not much to see so far and nothing that would confirm your points. Some people are just hard fanboys and can't admit it.
> maybe I’m wrong about ROCm, maybe not. But at least I can admit it
"maybe not" doesn't sound like an admission. Better yet, don't take a strong position on something you don't know anything about, and then you won't be in a position where you need to climb down.
> while 3 people call you biased
Which 3? All I see are two gamers and @Spunjji, who made a generally correct, if potentially anachronistic, statement. Spunjji is more than capable of taking me on, if they want. The mere fact that all they did was quip about Intel iGPU performance says nothing about my core points.
What you're missing is that a mountain of bad counterpoints doesn't add up to a few good ones. Your argument is only as strong as your strongest point, and I haven't seen you make a strong refutation of any of my core claims.
> you’re still deflecting like a kid.
Don't blame me for your own trolling fails.
> And then, I didn't compare real GPUs that have been released for years to some iGPU trash
That shows a lack of understanding on your part: the software stack for these devices is mostly the same, irrespective of whether they're iGPUs or not. You're so caught up in hardware that you can't even see that my whole point centers around software.
> you’re a huge fanboy of Intel.
I like whatever actually works for me, and so far Intel has the better track record. Did I mention that I use them in a shipping product earning $millions of annual revenue, with thousands of customers who have multiple systems and support contracts, and who pick up the phone or email whenever anything goes wrong? And if we can't solve the problem remotely, we have to send a very highly-qualified and highly-paid support tech to fix it on site. That's dependability.
Whatever state their gaming drivers might be in, their compute stack is working well for us. And that builds the kind of confidence I'm talking about.
So, call me what you want, but the zero other people reading this thread will see that I never made any effusive claims about Intel. I made a few factual points and that's it. You'd think a real fan would be rather more emphatic.
> But what I can’t accept is you praising Intel for unreleased stuff
I didn't. I just said I expected it would probably be my next dGPU. And I further clarified that by saying I intended to wait at least for initial impressions. But I actually tend not to be on the bleeding edge, so it might take months before I finally decide to buy anything. It just depends on many things.
As you probably know, Intel is set to launch the first DG2 products later this week. I suggest you keep your powder dry, because then you'll have some much more substantial threads to tear into.
> nothing that would confirm your points.
Which points did I even make? I linked to some early benchmarks of Tiger Lake G7. That's it.
I even said that all I needed from Intel's GPUs was for them to be merely competitive on perf/$ and perf/W. If they can get in the ballpark, then the GPU-compute software stack is what's much more important to me.
> Some people are just hard fanboys and can't admit it.

Agreed. Maybe even some people in this thread!
Where exactly do you “compute” anything with an iGPU? Please. That's just ridiculous. We will see how their dGPUs fare; iGPUs don't prove much, and I don't agree with your point that “it's the same software”, hahahaha, not based on my vast experience.
And then again you keep adapting your opinions and pretending you had this opinion from the get-go; you're a smart little weasel. Too bad it won't work with me.
> Where exactly do you “compute” anything with an iGPU? Please. That's just ridiculous.
They have more raw compute performance than the CPU cores, and leave the CPU cores free to do other things. The performance isn't great, but it met our needs.
Interesting fact: Intel's older Iris Pro iGPUs actually had more fp64 performance than Nvidia's or AMD's gaming cards of that era. That's because Intel only cut fp64 back to 1/2 of the fp32 rate, vs. Nvidia and AMD cutting it to 1/32 or 1/16.
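Back-of-envelope, with illustrative numbers rather than exact specs: an iGPU at 800 Gflops fp32 × 1/2 = 400 Gflops fp64, while a far bigger gaming card at 5,000 Gflops fp32 × 1/32 ≈ 156 Gflops fp64. The ratio matters more than the raw size of the chip.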
> then again you keep adapting your opinions
When did I change any opinion I voiced?
> Too bad it won’t work with me.
Of course. Perhaps you're merely posing as a pro-AMD troll, but actually you're from Nvidia and just trying to give the AMD camp a bad image. If you gave up after the exchange had reached a reasonable outcome, that'd look far too decent.
PS. Unreleased hardware from Intel, and you simply assume it will work flawlessly with the same software that is out now. You're either a hard fanboy or an inexperienced idiot who simply assumes things out of the blue, which countless times has been proven not to be the case. Until stuff is released and tested there is no point in assuming anything; only an idiot or a fanboy would do otherwise. Given the fact that Intel never released a dGPU based on a GPU architecture, not x86, it's laughable that you simply assume it will work flawlessly, which just proves what people were throwing at your head. And like I said, that's it for me, won't waste any more time with you. There are men in this world and then there are weasels.

> Unreleased hardware from Intel, and you simply assume it will work flawlessly with the same software that is out now

I don't assume there won't be problems, but we'll have to see how many there are and how quickly they're worked through.

> You're either a hard fanboy or an inexperienced idiot

If all you've got are baseless insults, I think we're done.

> Until stuff is released and tested there is no point in assuming anything

Okay, so why are you assuming the worst?

> like I said, that's it for me, won't waste any more time with you.

Well, you're clearly not accomplishing much in this thread.

> There are men in this world and then there are weasels.

You know that name-calling just proves you're out of moves, right? Yours was a foolish errand from the outset. You can't blame that on me.
Really confused how mode_13h is an Intel fanboy? This is a wild thread as someone who's discussed with them from the Intel side where they took the AMD side.
Thanks, mate. It did get somewhat out of hand. I'm left wondering what @Khanan was even doing in this thread, given their lack of background in GPU Compute or any contribution other than attacking me.
"They have a long track record with iGPUs"

And lousy performance!

And garbage drivers which crash in every second game, if not more often. Only an Intel shill would call Intel GPUs good. Aside from Quick Sync and delivering a simple picture, they're completely useless.

> And garbage drivers which crash in every second game

Do you see no irony in complaining about games, when the article is for an AMD card that cannot play *any*?

I'm not interested in games. I don't know, but maybe that's why I take Intel GPUs seriously. I really can't comment on your complaints, either way.

As a frequent reader of this site, you ought to know that Tiger Lake G7's iGPUs were fully competitive with their Vega-8 contemporaries: https://www.anandtech.com/show/16084/intel-tiger-l...

From the article you linked... "We didn't have too much time to go into the performance of the new Xe-LP graphics"

Anything there should be taken with a grain of salt.

You clearly are an Intel shill. Your past posts show this to be true and accurate. You come out and disclaim how glorious Intel is in everything and shit on anything AMD every chance you get. Perhaps you should step back a moment and remove some bias from your decision making?

Seriously. You are here claiming that you would choose unreleased hardware and drivers as your preference because... Well who knows. No logical or rational person would make the decision. A logical and rational person would wait, keep their mouth shut and see what is actually made available.

I agree with this. Anyone who praises unreleased products can only be a fan, trolling, or for whatever reason heavily biased.
He made a lot of claims about driver problems with AMD cards. I can't verify them; however, I'm pretty sure they are exaggerated. If you want to do something, there is a way, full stop. That has been true for PC tech forever.
And about Matrix Cores: they are only a thing with the new CDNA GPUs. Don't hold your breath for them being released for RDNA3; they won't be. FSR 2.0 won't need them, so they aren't needed; AMD isn't into locking support to just 1 gen of cards, unlike Nvidia, who did it with the 20 series, despite DLSS being a disaster at the beginning and until the release of 2.0. And yes, it was absolutely possible to release DLSS for all cards, 1.9 proves this, just no interest from Nvidia, who want to copy Apple as much as possible. Great that they couldn't buy ARM; nobody needed that gridlock.
What I praised was their compute-stack support for their iGPUs. Those *are* released products, you know?
Also, Tiger Lake has been a released product for almost 1.5 years.
> a fan, trolling, or for whatever reason heavily biased.
I'm definitely sensing some of that in this thread.
> driver problems with AMD cards
Their GPU Compute stack. Don't mischaracterize my position. I have nothing to say about their graphics drivers, because that's not my primary focus.
Again, this comment thread is on an article about AMD compute accelerators. So it's relevant in ways that talking about graphics & gaming is not.
> And about Matrix Cores: they are only a thing with the new CDNA GPUs. Don't hold your breath for them being released for RDNA3
Thanks, I won't. I mentioned that as a suggestion, and nothing more.
I'd like to see AMD be more competitive on the GPU compute front. I was really supporting them through the early days of ROCm and when they were pushing standards like OpenCL and HSA.
Sometimes, criticism can come from a place of concern, you know? It doesn't always have to be rooted in wanting to tarnish, undermine, and destroy.
Wow, the posse grows!

> You clearly are an Intel shill.

All I did was correct Spunjji, in pointing out that the Tiger Lake G7 iGPUs were actually competitive, which I supported with some evidence. You guys haven't provided anything to support your allegations, except a link to a sketchy Wikipedia article.
> Anything there should be taken with a grain of salt.
Benchmarks are benchmarks, though. That's real data, albeit with day-1-quality drivers. If anything, it should've gotten better since then.
If you have any evidence which can invalidate what I claimed, you're free to provide it.
> Your past posts show this to be true and accurate.
In this thread? In general? Got links? If you're such a studied expert on me, why don't I recognize your username?
> You come out and disclaim how glorious Intel is in everything
"disclaim"? Wow, if AMD is paying you guys, they should ask for a refund.
> shit on anything AMD every chance you get.
No, that doesn't sound like me. There are posters who do that, however. You must have me confused with one of them.
> Perhaps you should step back a moment and remove some bias from your decision making?
At my job, we make decisions based on data, our customers, and the market for our products. In my personal purchasing decisions, I have goals for projects I want to work on, and I know which APIs I want to use. So, my decisions amount to looking at the hardware offerings and how well those have been working for others who are doing similar sorts of work.
> Seriously. You are here claiming that you would choose unreleased hardware
I'm not going to buy it on day 1. I'm going to at least wait and see what initial impressions of it are. All I said is that my current expectation is that I'll probably go with Intel, next time.
> Well who knows. No logical or rational person would make the decision.
I explained my rationale in quite some detail. You're free to disagree and make your own decisions for your own reasons.
> A logical and rational person would wait, keep their mouth shut and ...
Ah, that's what this is about. Well, if you guys are intent on trying to shut down any negative comments about AMD, this appears to be backfiring.
Why don't you just keep at it, and see where it goes next? I can promise it's not going to make AMD look any better. I have no personal agenda against AMD, but if somebody is calling me a liar, it'll force me to drag out some unflattering evidence to prove I'm not.
Just don't talk about unreleased stuff and praise Intel for things they didn't do. Their gaming drivers are terrible, you can quickly google this in 5 seconds, but maybe you're just trolling. Don't praise unreleased products, full stop. AMD made a lot of progress recently with ROCm, this is what I recently read, so don't expect me to be interested in your gossip; there's a clear bias you have toward Intel, whether you want to admit it or not. iGPUs aren't relevant to this conversation, it's a simple fact. They do not compete with full GPU cards. So don't talk about them being better than competing products that are proven and released. Intel recently postponed the release of Arc because of terrible drivers, which doesn't really confirm anything you say about Intel's driver performance, to the contrary. And I would stay far away from any GPUs of theirs for at least a year, until they have mature drivers that are proven.
If two people come and say you're an Intel shill or biased, I would start thinking about myself and not endlessly deflect everything. Start being a grown-up, maybe.
> Their gaming drivers are terrible, you can quickly google this in 5 seconds
I never said anything about that. My only real experience with their graphics drivers is just running regular desktop apps.
> Don't praise unreleased products, full stop.
I didn't. However, wouldn't it be hypocritical of you to say that while simultaneously trashing unreleased products?
> AMD made a lot of progress recently with ROCm, this is what I recently read
It's great that you're reading up on ROCm. That's at least something you can take away from this.
I still have hopes ROCm will mature well. Having followed it from the early days, it's been a much longer journey than I imagined.
The core problem with ROCm is that it's funded by AMD's professional and cloud-compute sales. That means all the priorities are driven by those sales and contracts, which tends to leave independent developers out in the cold. And it's students and independent developers who are often at the forefront of innovation.
I know they have some good people working on it. AMD just needs to decide to make the same level of investment in building a developer community that Nvidia did. The formula is pretty simple, but it takes investment and time.
> doesn't really confirm anything you say about Intel's driver performance, to the contrary.
I'm not talking about gaming. This article isn't even about a gaming card. Maybe you just saw AMD and failed to notice that?
> If two people come and say you're an Intel shill or biased, I would start thinking about myself
If you don't know when to trust your own competence on a subject matter, then I feel sorry for you. I suggest you get good enough at something that you can build some confidence and know when not to listen to others.
> not endlessly deflect everything.
I'm not the slightest bit sorry for defending against a bunch of malicious and poorly-informed critiques and allegations.
> Start being a grown-up, maybe.
That would be good advice for some in this thread.
“If you don't know when to trust your own competence on a subject matter, then I feel sorry for you. I suggest you get good enough at something that you can build some confidence and know when not to listen to others.”
I have more confidence than you will ever have; what a cheap and weak allegation. I don't need weaseling wannabes like you pretending to be something they are not.
You're constantly adapting your own opinion and then pretending you had this opinion from the get-go. Too bad 3 different people called you biased or an AMD hater, so you're just trying to weasel yourself out now, which won't work with me, and the others don't even care anymore.
“I'm not the slightest bit sorry for defending against a bunch of malicious and poorly-informed critiques and allegations.”
Nice try, but to this point you didn't prove anything about your alleged “shortcomings” of ROCm. So you essentially provided nothing and pretended to have something, which I won't fall for. For every shit you have googled up I can easily google up positive sources to contradict yours. You're essentially an argumentative idiot who never used the hardware he criticizes and, when called out, quotes some weak sources that don't hold up to inspection. That's it for me, won't waste any more time with you.
Suffice it to say, ROCm is working, and anyone who wants to use it can use it. Devs aren't exactly noobs when it comes to software; they will know how to do it. You never had a point.
I'll grant you, that's sure a confident statement, but sometimes a confident front is only that. True confidence is knowing when to stand your ground because the ground is indeed yours to hold. Foolish confidence is merely defiance in the face of facts that are obvious for everyone else to see.
See also: bluster.
> You’re constantly adapting your own opinion
My original post was so short, it couldn't possibly capture a rich and nuanced point of view. So, I don't know what this nonsense is about "adapting" my opinion. You couldn't hope to get a real view of my opinion and experience from only that.
> Too bad 3 different people called you biased or an AMD hater
If you're going to keep posting, at least come up with new points. I already debunked this one.
> to this point you didn’t prove anything about your alleged “shortcomings” of ROCm.
Ah, so you want to go there, eh? I figured maybe you wouldn't want all of its dirty laundry aired, but I guess that proves you're just an agitator sent to make AMD look bad.
> For every shit you have googled up I can easily google up positive sources to contradict yours.
Really? How's that going to work? For every buggy and broken release, you're going to prove that it's not a bug or didn't break some hardware?
> never used the hardware he criticizes
On this point, you're actually correct. I wish I could have, but they never supported 1st gen RDNA GPUs!
Even if they did, AMD turned their back on OpenCL, while Intel did not. Given my experience with Intel's compute stack on iGPUs, I'm willing to give their dGPUs a chance.
> when called out, quotes some weak sources that don't hold up to inspection.
Which ones? If you had anything substantive to say, why not say it, instead of wasting so much typing on childish name-calling?
> ROCm is working, and anyone who wants to use it can use it.
And good luck to anyone who tries. To this very day, it still doesn't support 1st gen RDNA GPUs. Doesn't matter whether Pro or not.
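For anyone who wants to see what ROCm actually recognizes on a given box, here's a minimal HIP sketch (assuming a working ROCm install; the program itself is the test):

    #include <hip/hip_runtime.h>
    #include <cstdio>

    // List the devices the ROCm/HIP runtime actually recognizes.
    // On an unsupported GPU this typically reports no devices or errors out.
    int main() {
        int count = 0;
        hipError_t err = hipGetDeviceCount(&count);
        if (err != hipSuccess || count == 0) {
            std::printf("No usable HIP devices: %s\n", hipGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            hipDeviceProp_t prop;
            hipGetDeviceProperties(&prop, i);
            // gcnArchName is the ISA target, e.g. "gfx906" (Vega 20), "gfx1030" (RDNA2)
            std::printf("Device %d: %s (%s)\n", i, prop.name, prop.gcnArchName);
        }
        return 0;
    }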
I'm sorry to interject, but sadly mode_13h is right, about ROCm at least. I can't comment on Intel GPUs.
AMD's Linux OpenCL support is utter garbage, both AMDGPU-Pro and ROCm: both have nearly no documentation, are hard to set up, and are extremely prone to breaking. Even trying to use Mesa's OpenCL support is broken somehow.
In my opinion, AMD should just give up on OpenCL; at this rate they will simply never be a competitor to Nvidia when it comes to compute. They could instead focus on Vulkan compute, which works beautifully and painlessly on their open source drivers. My absolute best track record getting any kind of acceleration out of my Polaris and Vega 10 GPUs has been with Vulkan compute.
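To give a sense of how little it takes to get started, a minimal sketch using the stock Vulkan API (no vendor extensions) that enumerates devices and finds a compute-capable queue family; this much already works on RADV out of the box:

    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <vector>

    // Enumerate Vulkan devices and report a compute-capable queue family on each.
    int main() {
        VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
        app.apiVersion = VK_API_VERSION_1_1;
        VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
        ici.pApplicationInfo = &app;
        VkInstance instance;
        if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

        uint32_t ndev = 0;
        vkEnumeratePhysicalDevices(instance, &ndev, nullptr);
        std::vector<VkPhysicalDevice> devs(ndev);
        vkEnumeratePhysicalDevices(instance, &ndev, devs.data());

        for (VkPhysicalDevice dev : devs) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(dev, &props);
            uint32_t nq = 0;
            vkGetPhysicalDeviceQueueFamilyProperties(dev, &nq, nullptr);
            std::vector<VkQueueFamilyProperties> fams(nq);
            vkGetPhysicalDeviceQueueFamilyProperties(dev, &nq, fams.data());
            for (uint32_t i = 0; i < nq; ++i) {
                if (fams[i].queueFlags & VK_QUEUE_COMPUTE_BIT) {
                    std::printf("%s: compute queue family %u\n", props.deviceName, i);
                    break;  // actually dispatching work needs a VkDevice, pipeline, etc.
                }
            }
        }
        vkDestroyInstance(instance, nullptr);
        return 0;
    }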
While I didn't end up changing the world with it, I did play around with CUDA on my Shield Tab while I was taking a CUDA class at university. I was stunned that it worked at all. It's broken these days, but so are Nvidia's Android aspirations. There's still a clear path to GPU compute using my aging GTX 660 Ti purchased around the same era.
Two open source drivers, no support. AMDGPU-Pro, the third Linux driver, in turn never added any iGPUs either. Compute literally works better on my Intel m3 tablet. I got better compute support from fglrx and a Radeon HD 5770.
And here's the real killer: Intel has consistently put effort toward GPU virtualization (not just pass-through). If that lands in FOSS for the restructured Xe GPUs (it already existed as GVT-g for the prior generation), there won't be any question as to which GPU is right for compute.
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
39 Comments
Back to Article
nandnandnand - Tuesday, March 22, 2022 - link
Wake me up inside when I can get CDNA 5 in an M.2 module.https://www.cnx-software.com/2022/03/22/quadric-de...
mode_13h - Wednesday, March 23, 2022 - link
A good move would be to get these Matrix Cores in their RDNA-enabled APUs.As for M.2 as a compute accelerator form factor, I don't really see the point. Sure, it fits a SBC better than a big PCIe card, but most desktop PCs have enough PCIe slots and better cooling for PCIe cards to make more sense.
I think Quadric is late to the game, by offering a dedicated chip for edge AI processing at this stage. Most customers would probably opt for a SoC with an integrated AI engine. Maybe that's another option they're pursuing, but most SoC vendors already have their own AI engines.
Anyway, it'd be interesting to see some benchmarks of their stuff. I think it probably won't stack up too well against next-gen APUs equipped with tensor/matrix cores.
lemurbutton - Tuesday, March 22, 2022 - link
Boring. Wake me up when AMD can do transparent multi GPUs like M1 Ultra.mode_13h - Wednesday, March 23, 2022 - link
For HPC applications, there's less need to. That's mainly something you have to tackle for graphics, because graphics is much harder to neatly partition and involve more global data movement.So, my prediction is that the first AMD GPU to do that will probably be in their RDNA line, not CDNA.
PaulHoule - Friday, April 29, 2022 - link
Is that really true? There are big servers in HPC that have multiple GPUs. For instance the DGX-2https://www.nvidia.com/content/dam/en-zz/Solutions...
has 16 GPUs controlled by one CPU motherboard that has 2 Xeon sockets. I worked for a company that used a DGX-1 which had 8 GPU sockets and 2 CPU sockets and the system balance was just fine (with some thought about how to encode the training data) for training CNN neural networks
Spunjji - Thursday, March 24, 2022 - link
You'll be seeing that later this year - and unlike M1 Ultra, it will have good performance in the vast majority of apps people actually use.Of course none of this has any relevance to datacentre GPUs. You might be lost?
Unashamed_unoriginal_username_x86 - Wednesday, March 23, 2022 - link
Wake me up when september ends... It's march? Oh, uhhI'd be curious to see if servers with PCI-e accelerators could ever use tech like WBs with quick detach fittings and the new 12-pin power connector to reduce that huge power budget gap to mezzanine cards.
mode_13h - Wednesday, March 23, 2022 - link
Ryan, the MI100's GPU is codenamed Arcturus. It fits in with their stellar-based naming scheme and is plastered all over the Linux driver code.> packed FP32 isn’t free. ...packed operands need to be adjacent and aligned to even registers.
This is like how previous generations dealt with FP64. Your FP64 operands would be split across register pairs, or so I've heard.
The decision to stick with PCIe 4.0 seems a bit weird, since Genoa will presumably have PCIe 5.0 and there should be some considerable overlap in their respective product cycles. I guess AMD is figuring most of their big customers will use OAM form factor, with Infinity Link for CPU communication.
Anyway, AMD really needs to get some of these Matrix Cores into their RDNA product line, if they want the open source community to embrace them and broaden software support for them. Until then, I'm really not much interested. Regardless, my next dGPU will probably be Intel, due to AMD's poor track record of RDNA support for their compute stack.
Khanan - Wednesday, March 23, 2022 - link
“Anyway, AMD really needs to get some of these Matrix Cores into their RDNA product line, if they want the open source community to embrace them and broaden software support for them.”This isn’t relevant for server GPUs, nor are Matrix cores at this point relevant for gaming GPUs, simply buy a Instinct GPU then, if you want it.
“Regardless, my next dGPU will probably be Intel, due to AMD's poor track record of RDNA support for their compute stack.”
That’s pretty funny given the fact that intel so far hasn’t delivered any dGPUs and has proven nothing, probably a fan boy comment.
mode_13h - Thursday, March 24, 2022 - link
> nor are Matrix cores at this point relevant for gaming GPUsDon't agree. Maybe not fp64, but fp32 and lower-precision have applicability to games. At the lowest-precision, they can use it for AI-driven upscaling, like Nvidia's DLSS.
> simply buy a Instinct GPU then, if you want it.
How many developers have that kind of money? This is why Nvidia is winning the the GPU-compute race. Because they support development on their *entire* stack of GPUs, even down to the near-bottom tier of gaming GPUs and the Jetson development boards.
> That’s pretty funny given the fact that intel so far hasn’t delivered any dGPUs
They have a long track record with iGPUs. At my job, we use their iGPUs in a shipping product.
> probably a fan boy comment.
AMD can ignore these sentiments at its peril.
Khanan - Thursday, March 24, 2022 - link
“Don't agree. Maybe not fp64, but fp32 and lower-precision have applicability to games. At the lowest-precision, they can use it for AI-driven upscaling, like Nvidia's DLSS.”“They have a long track record with iGPUs. At my job, we use their iGPUs in a shipping product.”
So you’re talking about track records and iGPUs and comparing dGPUs to iGPUs of a vendor that never delivered any good dGPUs. I think it’s safe to say that you’re a) not a dev b) trolling and c) never seriously used a AMD Pro card. Last time I checked drivers were pretty good, nobody cares about ancient track records from yesteryear. And hearsay isn’t relevant either. You used the words “track record” instead of “my experience with”, so maybe stop talking out of your ass for a second. Intel is a clusterfuck when it comes to GPUs, a meme at best and a disaster otherwise. Nvidia is good but also very expensive and locks you down into their stuff with CUDA and software limitations. AMD has great open source drivers for Linux, Nvidia isn’t even comparable. You’d know that if you were a serious or real dev.
Maybe could’ve been used for something else, but not for FSR 2.0 as AMD aims for compatibility and it won’t use any special cores, probably the same as with FSR 1.0 and if you ask me that’s the way to go, not Nvidias.
“How many developers have that kind of money? This is why Nvidia is winning the the GPU-compute race. Because they support development on their *entire* stack of GPUs, even down to the near-bottom tier of gaming GPUs and the Jetson development boards.”
A coincidence which happened because Nvidia needs tensor cores to do DLSS and RT denoising. A coincidence and intentional proprietarity locking you artificially into stuff.
I don’t think you need tensor cores to do development for anything, not anything a Radeon Pro can’t do as well. And then we have to wait and see if RDNA3 isn’t coming with some sort of AI cores as well, but at this point I guess it’s unlikely.
mode_13h - Thursday, March 24, 2022 - link
> So you’re talking about track records and iGPUs and comparing dGPUs> to iGPUs of a vendor that never delivered any good dGPUs.
Yes, because they're far more similar than they are different. 95% of the software stack needed to do one is also needed for the other.
If you paid attention to my concerns, I'm principally interested in software issues. I don't need Intel's dGPUs to be the best performance, as long as the software support is there for what I/we need and the perf/$ and perf/W is reasonably competitive.
> I think it’s safe to say that you’re a) not a dev b) trolling
This is precisely a troll comment, which is why I'm not going to address it or any similar attempts to impeach my credentials. You're free to disregard my statements and opinions, however I owe you nothing.
> c) never seriously used a AMD Pro card.
This exactly misses my point. Nvidia and Intel fully support development on virtually their entire hardware stack. Why should I have to pay $$$ for an AMD Pro card? If AMD wants developer mindshare, they need to reach developers where they *are*, not blame developers for not beating a path to their door.
> Last time I checked drivers were pretty good,
Where's ROCm's RDNA support?
> nobody cares about ancient track records from yesteryear.
Yes they do. AMD loses hearts and minds when people buy RDNA cards and wait *years* for ROCm support never to materialize. Or when it *breaks* what had been working on earlier generations, like Vega and Polaris.
You've clearly never read the endless threads of people trying to get/keep their AMD GPUs working on DaVinci Resolve, for instance. Many are continuing to run heavily-outdated images, for (very real) fears of breaking their setup.
I know that's slightly off-topic, but not really, since I'm talking about lack of stability in AMD's hardware support of their GPU-compute stack.
> And hearsay isn’t relevant either. You used the words “track record”
> instead of “my experience with”
If you don't value my perspective, that's not *my* problem.
> Intel is a clusterfuck when it comes to GPUs
It's funny how you say this, right after attacking *me* for using hearsay and ancient track records.
The fact of the matter is that Intel has been among the first to support each OpenCL release since about version 2.0. AMD seemed to stall out around 2.1. It was a nearly 3 years late on OpenCL 2.2 support. After several minutes of searching, I haven't even found any clear evidence that AMD supports OpenCL 3.0, yet both Intel and (even!) Nvidia do.
> AMD has great open source drivers for Linux
So does Intel.
> You’d know that if you were a serious or real dev.
How do you know I don't? You didn't ask.
Open source drivers are definitely nice. They're not deal-breaker, for me. What I care about most is:
1. Hardware support - the compute stack should work on everything from whatever GCN-based APUs people have to the latest gaming GPUs. Both for developer access, and also so that developers have some assurance they'll be able to support customers with this hardware.
2. API support - I have no more interest in using Hip than I do in using CUDA. I only trust Hip to be any good or stable on AMD hardware, which means it effectively locks me into AMD, even if it's open source. I'm willing to use open source libraries, like deep learning frameworks, that have a Hip or CUDA backend, however. But I will not personally or professionally invest in developing for a vendor-specific API.
3. Platform support. Right now, all I care about is Linux/x86. I think AMD hasn't required kernel patches to use new ROCm releases in a couple years, which is definitely progress.
> I don’t think you need tensor cores to do development for anything
That's not my point. My point is that if AMD wants broader support for hardware features like their Matrix Cores and other new CDNA features, they should worry about getting these features into the hands of more developers via gaming cards.
With Nvidia GPUs recently being so ridiculously expensive and hard to find, AMD has been squandering a huge opportunity by their poor/absent ROCm support & missing hardware features on RDNA GPUs.
> not anything a Radeon Pro can’t do as well.
There's nothing magic about Radeon Pro. You know that, right? They're just the same as AMD's gaming GPUs, with maybe a few extra features enabled, maybe some more RAM, and costing lots more $$$.
> we have to wait and see if RDNA3 isn’t coming with some sort of AI cores as well
Yup. I'm willing to keep an open mind. AMD has finally started coming around with ROCm support of RDNA2, so it's still possible they'll turn over a new leaf.
Khanan - Friday, March 25, 2022 - link
Here just a example of you trolling:https://en.m.wikipedia.org/wiki/ROCm
It is clearly stated that all RDNA and GCN 4 and 5 are supported, pro card or not. And of course support is better for Pro cards, that’s why you buy Pro cards, to get more support. Not any different with Nvidia. Is CUDA better supported than ROCm, yes, but ROCm did a lot of progression and gets better every day. Is Intel even comparable to AMD or Nvidia? No. They simply don’t have any noteworthy GPUs and thus cannot even be compared to those. Intel’s new dGPUs are delayed since 4 months almost, do you know why? Because they drivers suck. It’s well known the products got delayed because of that. They will release their shit so late, that it will compete with current gen AND RDNA 3 and ADL, nobody will buy it. Then later this year their CPUs will get destroyed by Zen 4. Hard days for Intel fanboys.
mode_13h - Friday, March 25, 2022 - link
> It is clearly stated that all RDNA and GCN 4 and 5 are supported, pro card or not.This proves you don't actually know anything about ROCm, because that's an utter fiction.
Their documentation is about as clear as mud about which devices are actually known to work, and the recent history of ROCm has been a litany of breakage and dropping support for even some recently-sold GPUs.
Even today, this is still a live issue:
https://github.com/RadeonOpenCompute/ROCm/issues/1...
As of now, AMD cannot even answer a simple question about this. In the docs, he found only a list of 7 Instinct and Pro models.
> ROCm did a lot of progression and gets better every day.
According to whom? And how do expect me to believe this statement, given you obviously know so little about ROCm that you cite Wikipedia over anything on amd.com or the ROC github.
> Is Intel even comparable to AMD or Nvidia? No. They simply don’t
> have any noteworthy GPUs and thus cannot even be compared to those.
Spoken like a true gamer.
Khanan - Friday, March 25, 2022 - link
And I wanna add, you made a few fair points, but your shilling for Intel and too many negative comments about AMD which are simply outdated blabber, leave a bad taste.Matrix cores won’t come to consumer, AMD isn’t into locking consumers into just one gen of product, thus they aren’t needed for FSR 2.0 and 2.0 will run on any hardware that also supports 1.0 (most probably).
mode_13h - Friday, March 25, 2022 - link
> your shilling for IntelHow is anything I said "shilling for Intel"? The only good thing I said about them was that their compute stack support is better, which I supported with specific points about OpenCL support compared with AMD. Oh, and I pointed to *actual benchmark results* to correct Spunjji, who usually knows their stuff.
If I were trying to shill for Intel, don't you think I'd be a bit more effusive? No, you apparently don't seem to think beyond seeing anything you don't like about AMD.
At first, I thought maybe you were an AMD employee, but I actually know AMD employees and I'm getting the clear sense that you wouldn't meet their standards.
Khanan - Sunday, March 27, 2022 - link
Yep maybe I’m wrong about ROCm, maybe not. But at least I can admit it, while 3 people call you biased and you’re still deflecting like a kid. And then I didn’t compare real GPUs that are released for years to some iGPU trash and unreleased stuff from Intel, which really just shows that you’re a huge fanboy of Intel. I really don’t care about you being critical about AMD, or let’s say I barely care. But what I can’t accept is you praising Intel for unreleased stuff or trashy iGPUs at the same time. That’s utterly dumb and unacceptable. We will see how good their GPUs will be, for now they postponed the release for over 2 years because of a mix of terrible performance and terrible drivers. Not much to see so far and nothing that would confirm your points. Some people are just hard fanboys and can’t admit it.mode_13h - Monday, March 28, 2022 - link
> maybe I’m wrong about ROCm, maybe not. But at least I can admit it"maybe not" doesn't sound like an admission. Better yet, don't take a strong position on something you don't know anything about, and then you won't be in a position where you need to climb down.
> while 3 people call you biased
Which 3? All I see are two gamers and @Spunjji who made a generally correct, if potentially anachronistic statement. Spunjji is more than capable of taking me on, if they want. The mere fact that all they did was to quip about Intel iGPU performance suggests nothing about my core points.
What you're missing is that a mountain of bad counterpoints doesn't add up to a few good ones. Your argument is only as strong as your strongest point, and I haven't seen you make a strong refutation of any of my core claims.
> you’re still deflecting like a kid.
Don't blame me for your own trolling fails.
> And then I didn’t compare real GPUs that are released for years to some iGPU trash
That shows a lack of understanding on your part, that the software stack for these devices is the same mostly irrespective of whether they're iGPUs or not. You're so caught up in hardware that you can't even see my whole point centers around software.
> you’re a huge fanboy of Intel.
I like whatever actually works for me, and so far Intel has a better track record. Did I mention that I use them in shipping product, earning $Millions of annual revenue with thousands of customers having multiple systems and support contracts which pick up the phone or email whenever anything goes wrong? And if we can't solve the problem remotely, we have to send a very highly-qualified and highly-paid support tech to fix it on site. That's dependability.
Whatever state their gaming drivers might be in, their compute stack is working well for us. And that builds the kind of confidence I'm talking about.
So, call me what you want, but the zero other people reading this thread will see that I never made any effusive claims about Intel. I made a few factual points and that's it. You'd think a real fan would be rather more emphatic.
> But what I can’t accept is you praising Intel for unreleased stuff
I didn't. I just said I expected it would probably be my next dGPU. And I further clarified that by saying I intended to wait at least for initial impressions. But, I actually tend not to be on the bleeding edge. So, it might take months before I finally decide to buy anything. It just depends on many things.
As you'll probably know, Intel is set to launch the first DG2 products by later, this week. I suggest you keep your powder dry, because then you'll have some much more substantial threads to tear into.
> nothing that would confirm your points.
Which points did I even make? I linked to some early benchmarks of Tiger Lake G7. That's it.
I even said that all I needed from Intel's GPUs was to be merely competitive on perf/$ and perf/W. If they can get in the ballpark, then the GPU compute software stack is what's much more important to me.
> Some people are just hard fanboys and can’t admit it.
Agreed. Maybe even some people in this thread!
Khanan - Monday, March 28, 2022 - link
Where exactly do you “compute” anything with a iGPU? Please. That’s just ridiculous. We will see how their dGPUs fare, iGPUs don’t prove much and I don’t agree with your points “it’s the same software”, hahahaha, not based on my vast experience.And then again you keep adapting your opinions and pretending you had this opinion from the get go, you’re a smart little weasel. Too bad it won’t work with me.
mode_13h - Monday, March 28, 2022 - link
> Where exactly do you “compute” anything with a iGPU? Please. That’s just ridiculous.They have more raw compute performance than the CPU cores, and leave the CPU cores free to do other things. The performance isn't great, but it met our needs.
Interesting fact: Intel's older Iris Pro iGPUs actually had more fp64 performance than Nvidia or AMD's gaming cards of that era. That's because they cut it back by only 1/2 of fp32 vs. Nvidia and AMD cutting it to 1/32 or 1/16.
> then again you keep adapting your opinions
When did I change any opinion I voiced?
> Too bad it won’t work with me.
Of course. Perhaps you're merely posing as a pro-AMD troll, but actually you're from Nvidia and just trying to give the AMD camp a bad image. If you gave up after the exchange had reached a reasonable outcome, that'd look far too decent.
Khanan - Monday, March 28, 2022 - link
PS. Unreleased hardware from Intel and you simply assume it will work flawlessly with the same software that is out now, you’re either a hard fanboy or a inexperienced idiot that simply assumes things out of the blue which countless times was proven to not be the case. Until stuff is released and tested there is no point in assuming anything, only a idiot or fanboy would do otherwise. Given the fact Intel never released a dGPU based on GPU architecture not x86, it’s laughable that you simply assume it will work flawlessly, which just proves what people were throwing at your head. And like I said that’s it for me, won’t waste any more time with you. There are men in this world and then there are weasels.mode_13h - Monday, March 28, 2022 - link
> Unreleased hardware from Intel and you simply assume it will work flawlessly> with the same software that is out now
I don't assume there won't be problems, but we'll have to see how many there are and how quickly they're worked through.
> you’re either a hard fanboy or a inexperienced idiot
If all you've got are baseless insults, I think we're done.
> Until stuff is released and tested there is no point in assuming anything
Okay, so why are you assuming the worst?
> like I said that’s it for me, won’t waste any more time with you.
Well, you're clearly not accomplishing much in this thread.
> There are men in this world and then there are weasels.
You know that name-calling just proves you're out of moves, right? Yours was a foolish errand from the outset. You can't blame that on me.
lmcd - Tuesday, April 5, 2022 - link
Really confused how mode_13h is an Intel fanboy? This is a wild thread as someone who's discussed with them from the Intel side where they took the AMD side.mode_13h - Wednesday, April 6, 2022 - link
Thanks, mate.It did get somewhat out of hand. I'm left wondering what @Khanan was even doing in this thread, given their lack of background in GPU Compute or any contribution other than attacking me.
Spunjji - Thursday, March 24, 2022 - link
"They have a long track record with iGPUs"And lousy performance!
Khanan - Thursday, March 24, 2022 - link
And garbage drivers which crash in every second game if not more. Only a Intel shill would call intel GPUs good. Aside from Quick Sync and delivering a simple picture they’re completely useless.mode_13h - Thursday, March 24, 2022 - link
> And garbage drivers which crash in every second game if not more.Do you see no irony in complaining about games, when the article is for an AMD card that cannot play *any*?
I'm not interested in games. I don't know, but maybe that's why I take Intel GPUs seriously. I really can't comment on your complaints, either way.
tamalero - Tuesday, May 3, 2022 - link
People are consistently throwing FP32 and FP64 for an equivalent to Nvidia DLSS for GAMES.So its not ironic.
mode_13h - Thursday, March 24, 2022 - link
As a frequent reader of this site, you ought to know that Tiger Lake G7's iGPUs were fully competitive with their Vega-8 contemporaries.https://www.anandtech.com/show/16084/intel-tiger-l...
supdawgwtfd - Friday, March 25, 2022 - link
From the article you linked..."We didn’t have too much time to go into the performance of the new Xe-LP graphics"
Anything there should be taken with a grain of salt.
You clearly are an Intel shill.
Your past posts show this to be true and accurate.
You come out and disclaim how glorious Intel is in everything and shit on anything AMD every chance you get.
Perhaps you should step back a moment and remove some bias from your decision making?
Seriously. You are here claiming that you would choose unreleased hardware and drivers are your preference because....
Well who knows.
No logical or rational person would make the decision.
A logical and rational person would wait, keep their mouth shut and see what is actually made available.
Khanan - Friday, March 25, 2022 - link
I agree with this. Anyone who praises unreleased products can only be a fan/trolling or whatever reason be heavily biased.He made a lot of claims about driver problems with AMD cards, I can’t verify them, however I’m pretty sure they are exaggerated. If you want to do it there is a way, full stop. This is true for PC tech since infinity.
And about Matrix cores: they are only a thing with new CDNA GPUs don’t hold your breath with them being released for RDNA3, they won’t. FSR 2.0 won’t need them so they aren’t needed, AMD isn’t into locking support for their support for just 1 gen of cards, unlike Nvidia who did it with 20 series, despite DLSS being a disaster at the beginning and until release of 2.0. And yes it was absolutely possible to release DLSS for all cards, 1.9 proves this, just no interest by Nvidia who want to copy Apple as much as possible. Great they couldn’t buy ARM, nobody needed that gridlock.
mode_13h - Friday, March 25, 2022 - link
> Anyone who praises unreleased products
What I praised was their compute stack support for their iGPUs. Those *are* released products, you know?
Also, Tiger Lake has been a released product for almost 1.5 years.
> a fan, trolling, or for whatever reason heavily biased.
I'm definitely sensing some of that, in this thread.
> driver problems with AMD cards
Their GPU Compute stack. Don't mischaracterize my position. I have nothing to say about their graphics drivers, because that's not my primary focus.
Again, this comment thread is on an article for AMD Compute accelerators. So, it's relevant in ways that talking about graphics & gaming are not.
> And about Matrix cores: they are only a thing with the new CDNA GPUs
> Don’t hold your breath for them coming to RDNA 3
Thanks, I won't. I mentioned that as a suggestion, and nothing more.
I'd like to see AMD be more competitive on the GPU compute front. I was really supporting them through the early days of ROCm and when they were pushing standards like OpenCL and HSA.
Sometimes, criticism can come from a place of concern, you know? It doesn't always have to be rooted in wanting to tarnish, undermine, and destroy.
mode_13h - Friday, March 25, 2022 - link
Wow, the posse grows!
> You clearly are an Intel shill.
All I did was correct Spunjji, in pointing out that the Tiger Lake G7 iGPUs were actually competitive, which I supported with some evidence. You guys haven't provided anything to support your allegations, except a link to a sketchy Wikipedia article.
> Anything there should be taken with a grain of salt.
Benchmarks are benchmarks, though. That's real data, and with day-1 drivers, no less. If anything, performance should've only gotten better since then.
If you have any evidence which can invalidate what I claimed, you're free to provide it.
> Your past posts show this to be true and accurate.
In this thread? In general? Got links? If you're such a studied expert on me, why don't I recognize your username?
> You come out and disclaim how glorious Intel is in everything
"disclaim"? Wow, if AMD is paying you guys, they should ask for a refund.
> shit on anything AMD every chance you get.
No, that doesn't sound like me. There are posters who do that, however. You must have me confused with one of them.
> Perhaps you should step back a moment and remove some bias from your decision making?
At my job, we make decisions based on data, our customers, and the market for our products. In my personal purchasing decisions, I have goals for projects I want to work on, and I know which APIs I want to use. So, my decisions amount to looking at the hardware offerings and how well those have been working for others who are doing similar sorts of work.
> Seriously. You are here claiming that you would choose unreleased hardware
I'm not going to buy it on day 1. I'm going to at least wait and see what initial impressions of it are. All I said is that my current expectation is that I'll probably go with Intel, next time.
> Well who knows. No logical or rational person would make that decision.
I explained my rationale in quite some detail. You're free to disagree and make your own decisions for your own reasons.
> A logical and rational person would wait, keep their mouth shut and ...
Ah, that's what this is about. Well, if you guys are intent on trying to shut down any negative comments about AMD, this appears to be backfiring.
Why don't you just keep at it, and see where it goes next? I can promise it's not going to make AMD look any better. I have no personal agenda against AMD, but if somebody is calling me a liar, it'll force me to drag out some unflattering evidence to prove I'm not.
Khanan - Sunday, March 27, 2022 - link
Just don’t talk about unreleased stuff and praise Intel for things they didn’t do. Their gaming drivers are terrible, you can quickly google this in 5 seconds, but maybe you’re just trolling. Don’t praise unreleased products full stop. AMD has made a lot of progress with ROCm recently, from what I've read, so don’t expect me to be interested in your gossip; there’s a clear bias you have toward Intel, whether you want to admit it or not. iGPUs aren’t relevant to this conversation, it’s a simple fact. They do not compete with full GPU cards. So don’t talk about them being better than competing products that are proven and released. Intel recently postponed the release of Arc because of terrible drivers, which doesn’t really confirm anything you say about Intel’s driver performance, to the contrary. And I would stay far away from any of their GPUs for at least a year, until they have mature drivers that are proven. If two people come and say you’re an Intel shill or biased, I would start thinking about myself and not endlessly deflect everything. Start being a grown-up maybe.
mode_13h - Monday, March 28, 2022 - link
> Their gaming drivers are terrible, you can quickly google this in 5 seconds
I never said anything about that. My only real experience with their graphics drivers is just running regular desktop apps.
> Don’t praise unreleased products full stop.
I didn't. However, wouldn't it be hypocritical of you to say that while simultaneously trashing unreleased products?
> AMD has made a lot of progress with ROCm recently, from what I've read
It's great that you're reading up on ROCm. That's at least something you can take away from this.
I still have hopes ROCm will mature well. Having followed it from the early days, it's been a much longer journey than I imagined.
The core problem with ROCm is that it's funded by their professional and cloud compute sales. That means all their priorities are driven by those sales and contracts, which tends to leave independent developers out in the cold. And it's students and independent developers who are often at the forefront of innovation.
I know they have some good people working on it. AMD just needs to decide they're going to invest in building the kind of developer community Nvidia did. The formula is pretty simple, but it takes investment and time.
> doesn’t really confirm anything you say about Intel’s driver performance, to the contrary.
I'm not talking about gaming. This article isn't even about a gaming card. Maybe you just saw AMD and failed to notice that?
> If two people come and say you’re an Intel shill or biased, I would start thinking about myself
If you don't know when to trust your own competence on a subject matter, then I feel sorry for you. I suggest you get good enough at something that you can build some confidence and know when not to listen to others.
> not endlessly deflect everything.
I'm not the slightest bit sorry for defending against a bunch of malicious and poorly-informed critiques and allegations.
> Start being a grown-up maybe.
That would be good advice for some in this thread.
Khanan - Monday, March 28, 2022 - link
“If you don't know when to trust your own competence on a subject matter, then I feel sorry for you. I suggest you get good enough at something that you can build some confidence and know when not to listen to others.”
I have more confidence than you will ever have; what a cheap and weak allegation. I don’t need weaseling wannabes like yourself pretending to be something they are not.
You’re constantly adapting your own opinion and then pretending you had this opinion from the get-go. Too bad 3 different people called you biased or an AMD hater, so you’re just trying to weasel yourself out now, which won’t work with me, and the others don’t even care anymore.
“I'm not the slightest bit sorry for defending against a bunch of malicious and poorly-informed critiques and allegations.”
Nice try, but to this point you didn’t prove anything about your alleged “shortcomings” of ROCm. So you essentially provided nothing while pretending to have something, which I won’t fall for. For every shit you have googled up I can easily google up positive sources to contradict yours. You’re essentially an argumentative idiot who never used the hardware he criticizes and, when called out, quotes some weak sources that don’t hold up to inspection. That’s it for me, won’t waste any more time with you.
Suffice it to say, ROCm is working and anyone who wants to use it, can use it. Devs aren’t exactly noobs when it comes to software; they will know how to do it. You never had a point.
mode_13h - Monday, March 28, 2022 - link
> I have more confidence than you will ever have
I'll grant you that's certainly a confident statement, but sometimes a confident front is only that. True confidence is knowing when to stand your ground because the ground is indeed yours to hold. Foolish confidence is merely defiance in the face of facts that are obvious for everyone else to see.
See also: bluster.
> You’re constantly adapting your own opinion
My original post was so short, it couldn't possibly capture a rich and nuanced point of view. So, I don't know what this nonsense is about "adapting" my opinion. You couldn't hope to get a real view of my opinion and experience from only that.
> Too bad 3 different people called you biased or an AMD hater
If you're going to keep posting, at least come up with new points. I already debunked this one.
> to this point you didn’t prove anything about your alleged “shortcomings” of ROCm.
Ah, so you want to go there, eh? I figured maybe you wouldn't want all of its dirty laundry aired, but I guess that proves you're just an agitator sent to make AMD look bad.
> For every shit you have googled up I can easily google up positive sources to contradict yours.
Really? How's that going to work? For every buggy and broken release, you're going to prove that it's not a bug or didn't break some hardware?
> never used the hardware he criticizes
On this point, you're actually correct. I wish I could, but they never supported 1st gen RDNA GPUs!
Even if they did, AMD turned their back on OpenCL, while Intel did not. Given my experience with Intel's compute stack on iGPUs, I'm willing to give their dGPUs a chance.
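For anyone curious what "giving a compute stack a chance" actually means to me: the very first smoke test is just seeing whether the vendor's OpenCL ICD exposes your GPU at all (clinfo does the same, more thoroughly). A minimal sketch, assuming the OpenCL headers and an ICD loader are installed; build with something like g++ listcl.cpp -lOpenCL (file name is mine):

// Enumerate OpenCL platforms and their GPU devices.
// If your GPU isn't listed under the vendor's platform, no compute for you.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id plats[16];
    cl_uint nplat = 0;
    clGetPlatformIDs(16, plats, &nplat);                  // up to 16 platforms

    for (cl_uint p = 0; p < nplat && p < 16; ++p) {
        char name[256];
        clGetPlatformInfo(plats[p], CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        printf("Platform: %s\n", name);

        cl_device_id devs[16];
        cl_uint ndev = 0;
        if (clGetDeviceIDs(plats[p], CL_DEVICE_TYPE_GPU, 16, devs, &ndev) != CL_SUCCESS
            || ndev == 0) {
            printf("  (no GPU devices exposed)\n");       // the failure mode in question
            continue;
        }
        for (cl_uint d = 0; d < ndev && d < 16; ++d) {
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            printf("  GPU: %s\n", name);
        }
    }
    return 0;
}

In my experience, Intel's compute runtime shows up here on their iGPUs without drama. That's the baseline I'm comparing against.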
> when called out, quotes some weak sources that don’t hold up to inspection.
Which ones? If you had anything substantive to say, why not say it, instead of wasting so much typing on childish name-calling?
> ROCm is working and anyone who wants to use it, can use it.
And good luck to anyone who tries. To this very day, it still doesn't support 1st gen RDNA GPUs. Doesn't matter whether Pro or not.
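And "support" isn't some fuzzy claim, here: ROCm's runtime either enumerates your GPU or it doesn't. A minimal HIP sketch of that check, assuming a working ROCm install (compile with hipcc; the messages are mine):

// Ask the HIP runtime which devices it can actually see.
// On an unsupported card (e.g. 1st gen RDNA), you simply get nothing.
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int n = 0;
    if (hipGetDeviceCount(&n) != hipSuccess || n == 0) {
        printf("No HIP-capable devices found.\n");
        return 1;
    }
    for (int i = 0; i < n; ++i) {
        hipDeviceProp_t prop;
        if (hipGetDeviceProperties(&prop, i) != hipSuccess)
            continue;
        // gcnArchName is the "gfxNNN" target the runtime matched against.
        printf("Device %d: %s (%s)\n", i, prop.name, prop.gcnArchName);
    }
    return 0;
}

rocminfo will tell you the same thing from the command line. Either way, 1st gen RDNA just isn't on the list.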
Espinosidro - Wednesday, April 27, 2022 - link
I'm sorry to interject, but sadly mode_13h is right about ROCm at least; I can't comment on Intel GPUs.
AMD's Linux OpenCL support is utter garbage, both for AMDGPU-Pro and ROCm. Both have nearly no documentation, are hard to set up, and are extremely prone to breaking. Even trying to use Mesa's OpenCL support is broken somehow.
In my opinion, AMD should just give up on OpenCL; at this rate, they will simply never be a competitor to Nvidia when it comes to compute. They could instead focus on Vulkan compute, which works beautifully and painlessly on their open source drivers. My absolute best track record getting any kind of acceleration out of my Polaris and Vega 10 GPUs has been with Vulkan compute.
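If anyone wants to sanity-check the Vulkan route on their own box, the first question is just whether the driver exposes a compute-capable queue family. A minimal sketch, assuming the Vulkan headers and loader are installed; build with something like g++ vkcheck.cpp -lvulkan (file name is mine):

// List each Vulkan physical device and whether it offers a compute queue.
// RADV has exposed this on my Polaris and Vega parts without any setup.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance inst;
    if (vkCreateInstance(&ici, nullptr, &inst) != VK_SUCCESS) {
        printf("No working Vulkan loader/driver.\n");
        return 1;
    }

    uint32_t n = 0;
    vkEnumeratePhysicalDevices(inst, &n, nullptr);
    std::vector<VkPhysicalDevice> gpus(n);
    vkEnumeratePhysicalDevices(inst, &n, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        uint32_t qn = 0;
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &qn, nullptr);
        std::vector<VkQueueFamilyProperties> qf(qn);
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &qn, qf.data());

        bool hasCompute = false;
        for (const auto& q : qf)
            hasCompute |= (q.queueFlags & VK_QUEUE_COMPUTE_BIT) != 0;
        printf("%s: compute queue %s\n", props.deviceName,
               hasCompute ? "yes" : "no");
    }
    vkDestroyInstance(inst, nullptr);
    return 0;
}

From there it's descriptor sets, a SPIR-V compute shader, and vkCmdDispatch. Verbose, but at least predictable, unlike my OpenCL experience.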
lmcd - Tuesday, April 5, 2022 - link
While I didn't end up changing the world with it, I did play around with CUDA on my Shield Tab while I was taking a CUDA class at university. I was stunned that it worked at all. It's broken these days, but so are Nvidia's Android aspirations. There's still a clear path to GPU compute using my aging GTX 660 Ti, purchased around the same era.
Meanwhile, I quite literally never got OpenCL support for my Vega 11 IGP. Look at this beauty! https://bbs.archlinux.org/viewtopic.php?id=254491
Two open source drivers, no support. AMDGPU-Pro, the third Linux driver, in turn never added any IGPs either. Compute literally works better on my Intel m3 tablet. I got better compute support from fglrx and a Radeon HD 5770.
And here's the real killer -- Intel has consistently put effort toward GPU virtualization (not just pass-through). If that lands in FOSS for the restructured Xe GPUs (it already existed as GVT-g for the prior generation), there won't be any question as to which GPU is right for compute.