blanarahul - Thursday, March 13, 2014 - link
Didn't AMD say that Hawaii was a gaming-oriented chip? I mean, that's why it has 1/8 FP64 performance.
Ian Cutress - Thursday, March 13, 2014 - link
Some of my compute algorithms are all FP32 - it's not as if all compute demands FP64, though some algorithms obviously do (solving PDEs in small scale).
Flunk - Thursday, March 13, 2014 - link
Most password hashing algorithms, scrypt, stuff like that.
Darkstone - Thursday, March 13, 2014 - link
Most hashing algorithms use integers. Which is apparently the main reason AMD GPUs are preferred over NVIDIA's for such tasks: integer shift/rotate performance.
The Von Matrices - Thursday, March 13, 2014 - link
NVidia's Maxwell architecture now matches AMD's GCN at these operations.
Lorthreth - Thursday, March 13, 2014 - link
Looks like Vapor has a single DisplayPort and Toxic has 2 mini-DP.
PixyMisa - Friday, March 14, 2014 - link
Looks like you're right. That's the first 290 card I've seen with two mini-DP. I'm looking at getting a couple of Dell's 4K 24" monitors, so I need two DP outputs. My existing 7950 has that, and my Linux box has a 7770 with two mini-DP, but almost no current-gen cards from either AMD or Nvidia offer this. (Except the workstation cards.)
Alchemy69 - Thursday, March 13, 2014 - link
8GB of video memory is the Emperor's new clothes of video memory, only bought by kids with their parents' money because they've been told that they need it if they want to be hardcore gamers.
geekman1024 - Thursday, March 13, 2014 - link
Yeah, and Bill Gates said 640K is enough.
Brooklands - Thursday, March 13, 2014 - link
I heard that statement a few years ago, when 128MB VRAM was considered "too much". It wasn't. And for some applications now, and even more in 1-2 years, 8GB won't be "too much". Games like Hitman Absolution, Thief 4 and some mods can easily max out the VRAM of a Titan, while achieving playable FPS in 4K resolution.
http://translate.google.de/translate?sl=auto&t...
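For what it's worth, a back-of-envelope sketch shows why high-res assets can max out even a 6 GB Titan. The numbers below are illustrative assumptions (uncompressed RGBA8 textures with full mipmap chains; real games use compressed formats that shrink these figures considerably):

```python
# Hypothetical sketch: GPU memory for uncompressed RGBA8 textures.
# A full mipmap chain adds roughly 1/3 on top of the base level.

def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    """Approximate GPU memory for one uncompressed 2D texture."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmapped else base

one_tex = texture_bytes(4096, 4096)  # one 4096x4096 RGBA8 texture
titan_vram = 6 * 2**30               # original Titan: 6 GB

print(f"One 4096x4096 texture: {one_tex / 2**20:.0f} MiB")
print(f"Such textures needed to fill a Titan: {titan_vram // one_tex}")
```

So a scene streaming a few dozen uncompressed high-res textures already saturates a 6 GB card, before render targets and geometry are even counted, which lines up with the "maxed-out Titan" reports above.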
Mr Perfect - Thursday, March 13, 2014 - link
Considering the PS4 and XBox 1 both have 8GB of RAM each, I was a little surprised that the new video cards were coming out with only 4GB. Granted, it's 8GB of shared RAM in the consoles, but next-gen games are going to start taking large pools of RAM for granted now.
piroroadkill - Thursday, March 13, 2014 - link
Nah, Xbox 1 has 64MiB RAM. Xbox One has 8GiB unified.
ImSpartacus - Thursday, March 13, 2014 - link
The consoles use that RAM for the entire system, not just the GPU.
rish95 - Thursday, March 13, 2014 - link
Do the PS4 and Xbox One even have the horsepower to make good use of 8GB of RAM?
nathanddrews - Friday, March 14, 2014 - link
It's probably better to have too much and not need it than to not have enough and need it. They did it due to pressure from developers. If developers find a way to use it all, the PS4 will have a distinct advantage.
EzioAs - Thursday, March 13, 2014 - link
So these kids, I'm guessing they used their parents' money as well to buy 3 2560x1600 monitors to actually utilize the 8GB memory that comes with these cards?
nathanddrews - Friday, March 14, 2014 - link
That's not quite fair. You assume it's worthless and bought for spoiled rich kids.
First, it's been confirmed through testing that you need ~4GB if you plan on doing 4xAA at 4K, depending on the game. If you start playing with higher settings, higher-res texture packs, and newer games designed to take advantage of more VRAM (thanks Xbone/PS4), then 8GB won't seem too extreme.
Second, you assume that only gamers want this, but there are many non-gaming applications that will eat up every last bit of VRAM you can throw at them and many consumers unwilling to spend the incredible premium on workstation cards.
Third:
http://www.overclock.net/t/1472145/got-4k
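The raw render-target cost of 4xAA at 4K can be sketched quickly. This is an illustrative estimate, not engine data: it assumes an RGBA8 color buffer and a 32-bit depth/stencil buffer, and real engines add many more targets:

```python
# Hypothetical sketch: memory for multisampled 4K render targets.

def render_target_mib(width, height, bytes_per_pixel, samples=1):
    """Size of one (possibly multisampled) render target, in MiB."""
    return width * height * bytes_per_pixel * samples / 2**20

w, h = 3840, 2160
color = render_target_mib(w, h, 4, samples=4)  # RGBA8 color, 4x MSAA
depth = render_target_mib(w, h, 4, samples=4)  # D24S8 depth/stencil

print(f"Color + depth at 4K with 4x MSAA: {color + depth:.0f} MiB")
```

The multisampled targets alone come to only a few hundred MiB; the bulk of the ~4GB figure comes from textures, shadow maps, and streaming pools, which is why texture packs move the needle much more than AA does.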
The Von Matrices - Thursday, March 13, 2014 - link
Do these cards use 16 4Gb modules or 32 2Gb modules?
Frenetic Pony - Thursday, March 13, 2014 - link
Exactly what I've been waiting for; dedicated next-gen games should start eating up video RAM once they drop last gen as their minimum spec. Future-proof gaming, here we are!
MrSpadge - Friday, March 14, 2014 - link
Sure... with higher-end Maxwells and 20 nm almost around the next corner.
JlHADJOE - Thursday, March 13, 2014 - link
AMD: Future-proofing their GPUs against mining hardware compensation.
jasonelmore - Friday, March 14, 2014 - link
Titanfall for PC is treating video RAM like system memory. It's one of the only PC games I have that will fully saturate my GTX 780's 3GB of GDDR5 at the main menu.
Johnmcl7 - Friday, March 14, 2014 - link
No he didn't: http://www.wired.com/politics/law/news/1997/01/148...
John