39 Comments
Desierz - Thursday, August 13, 2020 - link
Nice logo at least!
vladx - Thursday, August 13, 2020 - link
Looks like Internet Explorer.
Great_Scott - Thursday, August 13, 2020 - link
This looks really good! After all, when you can't come up with good news on the CPU side, you emphasise good news on the GPU side. AMD does this every... single... time.
Wait, nevermind, this is Intel news. Sorry!
nico_mach - Friday, August 14, 2020 - link
I think you mean the other way around for AMD. And, yes, after that bombshell Intel needs a good, healthy distraction from their fab division. And PC GPUs definitely need shaking up.
One does wonder how far Apple will go with Apple Silicon. Perhaps Intel isn't the only one that will be hit by that.
Gigaplex - Thursday, April 15, 2021 - link
No, AMD did emphasize GPU news when their CPUs sucked (Bulldozer era).
MrVibrato - Friday, August 14, 2020 - link
Yeah. I particularly like the teeny weeny suffixes there... LP... no, wait, was that HP? And is this HPC or HPG? Need to squint my eyes... man, it's really hard to make out those tiny suffixes. But then again, they are so adorable. Like little kittens trying to lodge themselves onto your ankle...
TristanSDX - Thursday, August 13, 2020 - link
High Performance Gaming - hahaha, stupid name. What is it? 1000 FPS? Gaming is for fun, not for high-speed work.
xenol - Thursday, August 13, 2020 - link
Tell that to the myriad of PCMRs who think you need 120+ FPS or the game is unoptimized garbage.
shabby - Thursday, August 13, 2020 - link
Please, 240fps at the minimum. I blink in a few ms...
close - Thursday, August 13, 2020 - link
Amateurs, real gamers don't blink.
rangerdavid - Tuesday, August 18, 2020 - link
I wish there was an up-vote button in these forums. Well "played," peeps.
inighthawki - Thursday, August 13, 2020 - link
The whole PCMR thing is cringey, but I will absolutely say anything less than 120Hz is indeed unoptimized garbage.
althaz - Thursday, August 13, 2020 - link
It depends on the game, but I can barely play, for example, Rocket League on a 60Hz monitor. It's a stuttery, horrible mess. This goes for any really fast-paced game, but something like Assassin's Creed or The Witcher is just fine running at 60fps (more is still better though).

For fast-paced competitive games though, I'll alter the settings to get the game maxing out my monitor's 144Hz refresh rate. It's sooooooooooo much smoother and nicer to play.
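[Editor's aside: the "0.1% low" and "1% low" figures this thread leans on are just the average FPS over the slowest fraction of frames. A minimal sketch of how they are typically computed; the function name and frame times below are invented for illustration:]

```python
def low_fps(frame_times_ms, percentile):
    """Average FPS over the slowest `percentile` fraction of frames.

    E.g. percentile=0.01 gives the "1% low", 0.001 the "0.1% low".
    """
    # Sort frame times from slowest to fastest and keep the worst slice.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * percentile))
    slowest = worst[:n]
    avg_ms = sum(slowest) / len(slowest)
    return 1000.0 / avg_ms  # convert average ms/frame to frames/second

# Made-up capture: mostly 60fps frames with a handful of stutters.
frame_times = [16.7] * 990 + [33.3] * 10
print(round(low_fps(frame_times, 0.01), 1))  # → 30.0
```

This is why a game can report "60fps average" yet still feel stuttery: a few long frames drag the percentile lows far below the mean, which is the variance the comments above are complaining about.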
surt - Thursday, August 13, 2020 - link
When people discover something fun, they tend to find ways to compete. That has happened in gaming, where a significant percentage of all play is competitive play. In competitive play, better equipment is very relevant (e.g. Olympic swimmers breaking world records thanks to better computer analysis of their strokes and better nanotech suits). In the world of computer gaming, higher-performance CPUs/GPUs, etc. == winning.
ballast - Thursday, August 13, 2020 - link
Don't be obtuse. That term is obviously meant to contrast with Intel's integrated graphics solutions, which are acceptable for low-end gaming at low resolutions and minimal graphics settings.
ambhaiji - Friday, August 14, 2020 - link
There is a very simple philosophy to PCMR: if I am playing a story mode or single-player game, the 0.1% fps should be above 60 with as low variance as possible; if I am competing against other people, even 2000 fps and 2000Hz are too low.
MrVibrato - Friday, August 14, 2020 - link
It's the same philosophy as dick-measuring contests. Except without any dicks...
privater - Thursday, August 13, 2020 - link
It might take Intel 5 years to fix game bugs in a 5-year-old game... Even a veteran like AMD suffered with their new Navi core. I have very low hopes for Intel, based on their track record over the last 5 years reacting to Meltdown, fab issues...
rahvin - Saturday, August 15, 2020 - link
Don't get me wrong, I'm skeptical of their commitment, but Intel does pretty well on the software side. They've got one of the best drivers on Linux, and they do Windows drivers with the best of them. I wouldn't count them out; if they want to commit the cash, they can put 500 people on driver optimizations. They could easily outspend both NVIDIA and AMD combined.
tfhw - Thursday, August 13, 2020 - link
Why does your article leave out the most astonishing fact about the HPG variant: that Raja states he has them in his lab now? That means Intel has a GPU fabbed outside of Intel, already in silicon and back from the fab, only weeks after stating in their Q2 report that going forward they are prepared to use outside fabs. They had to have started this process more than a year ago to have silicon now.
vladx - Thursday, August 13, 2020 - link
Did you really think Intel changed course towards using 3rd-party fabs just last month?
fogifds - Thursday, August 13, 2020 - link
Intel does have R&D fabs, so they can test and run processes before going live. It's not like one day you hit save on gpu.asm and dropbox it to TSMC to run with.
rahvin - Saturday, August 15, 2020 - link
You do realize you can't move a mask from one fab to another, right? The design from the ground up is based on the process tech of the fab it's going to be made in. The entire design has to be customized to the fab.

It wouldn't do you any good at all to tape out at one of their R&D fabs and then go somewhere else. Either they've used the masks and taped out at an outside fab, or they did it internally and are planning to do it themselves. They wouldn't spend the $2 million+ on a tape-out they'd have to throw out the door when they switch manufacturers.
Strega315 - Thursday, August 13, 2020 - link
We were manufacturing Intel chips at WaferTech LLC (Camas, WA - a TSMC subsidiary) for 20 years or more. I've been employed by both WaferTech and Intel, and both NVIDIA GPU chips and Intel chips have been manufactured on the same equipment for decades now; I was in Ion Implantation with Varian (now AMAT) implanters all through the 2000s and 2010s. Foundry operations save companies like Intel millions if not billions of dollars, and they simply make good financial sense.
Toe - Thursday, August 13, 2020 - link
Anybody seen any pigs flying lately? That's when I'll believe Intel has legit gaming GPUs.
sorten - Friday, August 14, 2020 - link
Yeah, I'm definitely not buying Gen 1 Xe. I'll wait for reviews and for driver support issues to be sorted out, at least. But I'm glad we're going to have a 3rd player in the discrete GPU market.
rahvin - Saturday, August 15, 2020 - link
The big test will be whether Intel is actually committed to GPU design and moves forward. Remember, they've done a dGPU before and abandoned the effort a year later.
probablynotme - Thursday, August 13, 2020 - link
But can it play Crysis?
MrVibrato - Friday, August 14, 2020 - link
If you pair it with a Threadripper, I guess so...
Strega315 - Thursday, August 13, 2020 - link
I'm guessing this will be built in Camas, WA at TSMC subsidiary WaferTech LLC. - former Intel, IBM, AMAT, & WaferTech semiconductor manufacturing tech.
anonomouse - Thursday, August 13, 2020 - link
Doubt it - the only way this is competitive in the market is if it's on 7nm, so chances are it's being fabbed in the GigaFabs in Taiwan.
mrvco - Thursday, August 13, 2020 - link
Intel needs to be careful or they're going to wake up NVIDIAZILLA.
Pyrostemplar - Friday, August 14, 2020 - link
Well, sure, HBM is quite a bit more expensive than GDDR6 (twice as much?), but to call it stratospheric is a bit over the top ;)
Santoval - Monday, August 17, 2020 - link
If HBM2 is indeed twice as expensive per GB as GDDR6, then in market terms it *has* a stratospheric price. I don't think it is that expensive, though.
Spunjji - Friday, August 14, 2020 - link
Interesting news... I was wondering why their stack was missing anything heftier than LP for gaming, when they'd previously stated having a gaming GPU as a goal. Perhaps they were waiting to announce it until after they'd explicitly declared their intentions to use 3rd-party fabs?

Either way, I'm fascinated by this and it'll be interesting to see how it plays out. I'm not exactly optimistic, but I'd love for there to be more competition amongst gaming GPUs again. The current duopoly has led to some really grim pricing practices of late.
RedOnlyFan - Saturday, August 15, 2020 - link
Hope Intel's GPUs don't become another AMD. With AMD-like crappy drivers... crazy power draw... no proper ODM launches... limited availability... overhyped marketing... obvious Red Devil fans ragging for a high-end card... We gamers don't want another overpromising Fury... Vega... GCN... RDNA1... Just hope at least Intel will give serious competition to Nvidia.
Santoval - Monday, August 17, 2020 - link
Outsourcing the production of the Xe-HPG die to another fab (almost certainly TSMC, that is *if* they can spare the wafers) must have been, and still must be, a hard and bitter pill to swallow for the upper Intel execs. Intel has perhaps way too much "corporate pride", which makes little business sense and must have also hurt their business significantly. They could have outsourced part of their production to TSMC or Samsung from 2018, or even 2017, when their 10nm node problems started, but instead they kept lying and spinning, repeatedly, insisting that everything was fine, that "any day now" their 10nm node issues would all be fixed, and that the yields would reach 14nm levels in the next couple of quarters.

Apart from pride, outsourcing to third-party fabs in 2017/2018 would have been an admission of "defeat", i.e. that they no longer had the best fab technology and expertise, since they couldn't deliver, and that TSMC and Samsung had surpassed them. They might have taken a market hit from that as well, if some investors lost confidence and trust* in them - still, I am certain that they would have had a net gain, by far. Due to this reason (which is not entirely distinct from pride) and the aforementioned "corporate pride", they kept on lying and misleading the press and their investors quarter by quarter... I doubt there was some solid plan behind that; more likely they made it up as they went along. But it lasted for so long it got frankly tiresome...
*Well, the result of the repeated lies and misleading statements has been loss of trust as well (mainly for their fab business, though it has infected Intel more broadly)! Nobody really trusts Intel's process node projections and roadmaps anymore. Unless they release a product in the market first, in a reasonably high volume, very few people will trust that they will actually be able to buy it, no matter what Intel "promises". That is a hideous company image and it is 100% Intel's own fault.
nicolaim - Tuesday, August 18, 2020 - link
Typo: GDRR6
neojack - Tuesday, August 18, 2020 - link
Mmm, an all-purpose card that can do gaming and computing... they mean, like GCN?
been there, done that
GCN was fine for gaming and mining during winter nights, but those days are past