Comments Locked

98 Comments

  • nathanddrews - Wednesday, June 15, 2016 - link

    Looks like the 460 could end up being a great little HTPC card. H.265/HDMI 2.0/DP1.4 pretty much guarantees compatibility with 8K displays. Then when I want to watch the Dota International game stream, it would have enough grunt to do that, too.
  • Flunk - Wednesday, June 15, 2016 - link

    Since Intel's GPUs now support HDMI 2.0 and all the common video codecs, why would you need a video card in an HTPC anymore?
  • ddriver - Wednesday, June 15, 2016 - link

    For occasional gaming. And I mean gaming, not solitaire or minesweeper.
  • Midwayman - Wednesday, June 15, 2016 - link

    Even my old Sandy Bridge iGPU could handle stuff like LoL.
  • Shadow7037932 - Wednesday, June 15, 2016 - link

    At what resolution and settings?
  • Byte - Wednesday, June 15, 2016 - link

    Solitaire and Minesweeper? Bust out my Intel 386 for that!
  • HideOut - Wednesday, June 15, 2016 - link

    I don't think it can do x265 natively. It's a heat/noise thing in an HTPC. Yes, it could via software, but that makes the CPU work hard, and therefore more fan noise and heat.
  • maroon1 - Friday, June 24, 2016 - link

    Skylake has full support for 8-bit H.265 and partial support for 10-bit H.265 and VP9.

    Kaby Lake's iGPU will get full 10-bit H.265 and VP9 support.
  • HighTech4US - Wednesday, June 15, 2016 - link

    Exactly
  • shelbystripes - Wednesday, June 15, 2016 - link

    Intel's GPUs support "hybrid" (i.e. partially in software) HEVC decoding. AnandTech's prior testing showed that Intel GPUs choke on 4K 60p material, and failed entirely at playing back 10-bit material.

    Partial software decode means your system is partly limited by CPU speed. I'm sure a high-end Core i7 can do a better job... but it probably costs a lot more than a low-power Core i3. And of course it's dependent on driver support. An i3 + RX460 solution sounds like it might be better for a Kodi/OpenELEC-based HTPC.
  • Flunk - Wednesday, June 15, 2016 - link

    No TVs support 10-bit content, and TV shows are filmed at 30fps and movies at 24fps. HTPC is HTPC.
  • hojnikb - Wednesday, June 15, 2016 - link

    It's more of an encoding issue. Lots of content is encoded with 10-bit HEVC (better efficiency).
  • nathanddrews - Wednesday, June 15, 2016 - link

    ... except all HDR TVs, where 10-bit input is mandatory. You know that UHD Blu-ray, 4K Netflix, 4K Amazon, etc. are all 10-bit, right? No 10-bit, no HDR either. As for frame rates, Netflix and Amazon are testing 60fps content. YouTube already has 4K60 content. NHL and NFL streams at 60fps... That's before any pr0n and home content (not implying the two are connected, LOL).
  • Eden-K121D - Wednesday, June 15, 2016 - link

    Vizio P series does
  • name99 - Wednesday, June 15, 2016 - link

    If you have content encoded at 10b AND a decent decoder, you can get most of the value of the 10b by dithering the final output down to 8b. (Cheapest is probably just to add blue noise to the signal and clamp.) This would look better (and definitely result in vastly diminished banding) than encoding at 8b.

    Now DO the existing decoders do a decent dither at the final stage of creating the output frame, as opposed to just clipping the lowest two bits? I have no idea.
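A minimal sketch of the dithering idea described above, assuming NumPy and using plain white noise as a stand-in for the blue noise a real player would likely prefer:

```python
import numpy as np

def dither_10bit_to_8bit(frame10, rng=None):
    """Requantize a 10-bit frame (values 0-1023) to 8-bit with dithering.

    Instead of just dropping the low two bits, add sub-LSB noise before
    rounding down, which trades banding for fine grain. White noise is
    used here for simplicity; blue (high-frequency) noise or error
    diffusion would look cleaner.
    """
    rng = rng or np.random.default_rng()
    scaled = frame10.astype(np.float64) / 4.0   # map 0..1023 to 0..255.75
    noise = rng.random(frame10.shape)           # uniform [0, 1)
    return np.clip(np.floor(scaled + noise), 0, 255).astype(np.uint8)

# A smooth 10-bit ramp: plain truncation would collapse it into 256 hard
# bands, while the dithered version preserves the gradient on average.
ramp = np.tile(np.arange(1024, dtype=np.uint16), (64, 1))
print(dither_10bit_to_8bit(ramp)[0, :8])
```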
  • Alexvrb - Thursday, June 16, 2016 - link

    With any halfway decent player, yes. 10b looks great on an 8b display. Heck VLC does an admirable job and anyone can use that. It's even available on mobile devices, Apple TV, etc - and they're beta testing a UWP version.

    Some of the older/default players for some devices do clip/ignore though which results in funny colors at times. But like I said there's options for novices and advanced users alike.
  • Fujikoma - Thursday, June 16, 2016 - link

    I have 4K 60fps files that are usually short films dealing with science. Monitors do support 10-bit content... As I don't have cable, this works fine in my home. I stream TV and movies, from a couple of NAS setups, over a wireless AC MU-MIMO setup through some WD boxes and computers.
  • plonk420 - Friday, June 17, 2016 - link

    The point of 10-bit is that even if your source material is 8-bit, things that would challenge an 8-bit (re)encoder are a piece of cake in a 10-bit colorspace, so you avoid worse gradient artifacting without having to use obscenely higher bitrates to compensate.
  • hojnikb - Wednesday, June 15, 2016 - link

    >AnandTech's prior testing showed that Intel GPUs choke on 4K 60p material, and failed entirely at playing back 10-bit material

    Actually, it works just fine, but you need dual-channel RAM for it to work.

    Also, 10-bit support is offloaded to the GPU, not the CPU.
  • vladx - Wednesday, June 15, 2016 - link

    You're in luck then because Kabt Lake will bring full hardware 4k encode/decode. No need to pay more for an AMD card.
  • vladx - Wednesday, June 15, 2016 - link

    *Kaby Lake
  • mdriftmeyer - Thursday, June 16, 2016 - link

    No, why pay up the ass for the Intel CPU/iGPU when I can buy Zen plus a dGPU and get more bang for my money?
  • Michael Bay - Thursday, June 16, 2016 - link

    If Zen is even remotely close to the current Intel crop, you'll be paying just as much. AMD has no room to price dump.
  • maroon1 - Friday, June 24, 2016 - link

    Skylake's iGPU has full support for HEVC.

    The "hybrid" support is for 10-bit HEVC and VP9 (Kaby Lake will get full hardware support for these, though).
  • emn13 - Wednesday, June 15, 2016 - link

    I'm currently also GPU-less, and by and large that's fine, but I've noticed that several programs now offer GPGPU acceleration. And even though it's still quite rare, it's very noticeable how much slower things run without a GPU when GPGPU is supported.
  • CSMR - Wednesday, June 15, 2016 - link

    Assuming you're talking about iGPUs rather than headless servers? These should support OpenCL, shouldn't they? Usually GPUs are so much faster at what they can do that any GPU is good enough, even low-end iGPUs.
  • looncraz - Wednesday, June 15, 2016 - link

    Because not everyone wants to replace their motherboard and CPU to gain a few graphics features.

    My HTPC currently runs a 7870XT, which is a power hog and is definitely overkill for its use cases (some racing games (Dirt 2) and a few other games that are majestic on a 65" TV with theater-quality surround sound). The RX 460 is the card I've been waiting for to upgrade that machine (save power, reduce noise, not lose much - or any - performance, and gain updated off-loaded decoding).
  • xrror - Wednesday, June 15, 2016 - link

    The 7870XT, being a salvaged Tahiti part (it's basically what a 7930 would have been), isn't really a power hog for what it is, but I don't think the current drivers really optimize for it anymore, since it only has 2GB instead of 3GB and so is unlike all other Tahiti parts.

    Not disagreeing that it might be a bit warm for a compact HTPC, but it was never really advertised as a power sipper either.
  • Thorburn - Wednesday, June 15, 2016 - link

    No they don't; even Kaby Lake is only HDMI 1.4a natively.
    You can do it using DP 1.2 to HDMI 2.0 adapters, though.
  • Srikzquest - Monday, June 20, 2016 - link

    This is a rumor but still something to consider... http://www.notebookcheck.net/Intel-Kaby-Lake-will-...
  • Ken_Masters - Thursday, June 16, 2016 - link

    Literally, if you have any task other than watching videos, an Intel GPU is insufficient, unless it's the new one on Intel's $700 chips, which performs worse than AMD's cheapest APU offerings.
  • Michael Bay - Thursday, June 16, 2016 - link

    You have to be a truly insane AMD drone to believe Iris Pro is not ripping current APUs a new one.
  • Alexvrb - Thursday, June 16, 2016 - link

    GT3e+ parts don't come cheap. Once you get into those leagues, a dGPU (mobile or otherwise) is quite a good option too. :P
  • quilciri - Thursday, June 16, 2016 - link

    I think you're under the mistaken impression that Home Theater PCs are only for video. Many people, myself included, use them for gaming. I have lots of local co-op games from Steam installed on mine and I play with Xbox 360/Steam controllers.

    Given that your living room probably has more space, this will be a likely location for Oculus Rift/HTC Vive owners, in which case the HTPC will need to be a beefy gamer.
  • monstercameron - Thursday, June 16, 2016 - link

    Intel Skylake, AFAIR, doesn't have HW HEVC or VP9 support.
  • tamalero - Tuesday, June 28, 2016 - link

    Do they fully support it?
    I still have nightmares of the horrible Intel GPU drivers and their useless "acceleration" that caused BSODs or graphic artifacts all the time.
  • damianrobertjones - Wednesday, June 15, 2016 - link

    "guarantees compatibility with 8K display"

    Nope. By the time they arrive a new cable standard will be adopted ensuring older stuff no longer works without x or y. You gotta' make the $$$$ if you're a company.
  • MobiusPizza - Wednesday, June 15, 2016 - link

    Erm no, that's what standards are for. This card supports the DP 1.4 standard, and by definition it will support 8K displays. Yes, it may require a new cable, but that shouldn't set you back more than a few tens of bucks.
  • nathanddrews - Wednesday, June 15, 2016 - link

    DP1.4 supports 8K60/10-bit/HDR and 4K120/10-bit/HDR.
  • nathanddrews - Wednesday, June 15, 2016 - link

    Although it's always possible the GPU itself can't output 8K. There could always be some frame buffer limitation that prevents it from exceeding 5K or something... I guess we'll find out in a couple years.
  • 3DoubleD - Wednesday, June 15, 2016 - link

    I'd suggest content delivery/storage would be another huge problem. File sizes will be massive, even with H.265 compression.

    Also, for video consumption purposes, I don't see much use for 8K. 4K is already only a minor difference and only with large screens and short viewing distances. 8K will require even more impractical arrangements.

    By and large, the most significant difference between 1080p and 4K/UHD broadcasts, streams, and discs has been an improvement in bitrate. The quality of some 1080p streams is appalling, both on the internet and over cable. A bitrate-starved 1080p TV broadcast looks worse than a bitrate-starved 4K broadcast, since they are sending about 4x as much information in the 4K broadcast. If you compare 1080p and 4K content at proper bitrates, the difference is small, even with a proper setup.

    For this reason (and the storage issue), I'm happy to stick with a high-quality 1080p panel. With good quality source material, my current 60" TV, and a 10 ft viewing distance, I won't see a substantial quality improvement by upgrading to 4K.
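A back-of-envelope check of the "about 4x as much information" point, using assumed, purely illustrative bitrates (not figures from the comment):

```python
# Average coded bits available per pixel per frame for two hypothetical feeds.
def bits_per_pixel(mbps, width, height, fps):
    return mbps * 1e6 / (width * height * fps)

print(f"1080p30 @  6 Mbps: {bits_per_pixel(6, 1920, 1080, 30):.3f} bpp")
print(f"2160p30 @ 25 Mbps: {bits_per_pixel(25, 3840, 2160, 30):.3f} bpp")
# With roughly 4x the bitrate, the 4K feed ends up no more starved per
# pixel than the 1080p one despite having 4x the pixels.
```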
  • nathanddrews - Wednesday, June 15, 2016 - link

    By and large, every UHD Blu-ray I've played (about 10 that I own so far) has had a noticeable increase in detail, but much more so, the detail HDR brings to dark scenes, bright scenes, and colors is the most pleasing aspect. Not one Blu-ray has been as good as its UHD Blu-ray counterpart.

    The main movie files off the UHD Blu-ray discs I have on my server (backed up, but not yet decrypted) are between 45GB and 65GB, which is only about double the size of the average 1080p Blu-ray movie file already stored there. 4x the resolution, higher bit depth, HDR metadata, and still only 2x the size at most. It will be interesting to see if they ever truly utilize discs over 100GB.

    As for 4K streaming, the non-HDR streams aren't much of an improvement over Blu-ray, but the HDR content makes up for its compression with - as they say - "better pixels". The lack of lossless audio is also a reason to avoid it if you already have the disc.
  • Murloc - Thursday, June 16, 2016 - link

    Well, there's the home cinema market, which has 100'' screens and TV-like viewing distances.
  • blahsaysblah - Wednesday, June 15, 2016 - link

    This is incorrect. DP 1.3 already has the raw bandwidth for 4K@96Hz at 30-bit color, and DP 1.4 does not add any more raw bandwidth - nowhere near 4K@240Hz, never mind any jump in color depth... The 8K support listed is all via supposedly "visually lossless" but mathematically lossy compression. Same with the 4K@144Hz support: it's partly via 8-bit displays (4K@120Hz at 8 bits/color is the raw maximum), and the rest comes from your perfect source pixels being transformed.
  • nathanddrews - Wednesday, June 15, 2016 - link

    I think you meant "That is correct." Doesn't matter if it's lossy or lossless, DP1.4 still supports 8K60/10-bit/HDR and 4K120/10-bit/HDR.
  • blahsaysblah - Wednesday, June 15, 2016 - link

    No, DP 1.4 has raw bandwidth for 4K@96Hz at 10 bits/color, or 4K@120Hz at 8 bits/color. Anything beyond that is via the "visually lossless" but mathematically lossy compression features they've been introducing for a while. All that 8K support is bogus.

    DP 1.4 can't even support 4K@144Hz with raw, perfect source pixels. So expect a strong push from VR and/or gamers.
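A quick sanity check on those numbers, assuming DP 1.3/1.4 HBR3's roughly 25.92 Gbps of usable payload (32.4 Gbps raw minus 8b/10b encoding) and ignoring blanking overhead, which only makes things tighter:

```python
DP_HBR3_PAYLOAD_GBPS = 25.92  # usable payload of a 4-lane HBR3 link

def needed_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

for hz, bpp in [(96, 30), (120, 24), (120, 30), (144, 24)]:
    need = needed_gbps(3840, 2160, hz, bpp)
    verdict = "fits uncompressed" if need <= DP_HBR3_PAYLOAD_GBPS else "needs compression"
    print(f"4K @ {hz}Hz, {bpp} bpp: {need:5.1f} Gbps -> {verdict}")
```

Under those assumptions, 4K@96Hz/30bpp and 4K@120Hz/24bpp come in around 23.9 Gbps and fit, while 4K@120Hz/30bpp and 4K@144Hz/24bpp do not, matching the claims above.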
  • D. Lister - Saturday, June 18, 2016 - link

    Wow, you would be getting a 460, for an 8K display? That... is just... fantastic.
  • Spunjji - Wednesday, June 15, 2016 - link

    Low Z-Height sounds like the sort of thing Apple would be interested in - a nice thin Polaris 11 would be an ideal match for their Macbook Pro range. I wonder if they had a hand in that requirement?
  • Flunk - Wednesday, June 15, 2016 - link

    It was designed for notebooks, and several Polaris 11 notebook GPUs have already been announced. It wouldn't be too surprising to see one in a MacBook Pro; the TDP is about right.
  • shelbystripes - Wednesday, June 15, 2016 - link

    I would love to see an MBP with RX460 inside. It would almost justify the long delay in updating the MBP's hardware...
  • tipoo - Wednesday, June 15, 2016 - link

    Pretty please yes. It's crazy that the most money I can pay for a Macbook today will get me GCN 1.0. The M370X doesn't even offer enough performance lead over my Iris Pro 5200 model for me to be tempted.
  • trane - Wednesday, June 15, 2016 - link

    This is a sure shot. The MacBook Pro is going to ship with an RX 480M. As will the XPS 15, and a whole bunch of notebooks. I bet an RX 470M or a lower-power variant of Polaris 11 even makes it to ultrabooks like the Surface Book. That's what the low z-height is about.
  • lllll - Wednesday, June 15, 2016 - link

    Both the RX 460 and RX 470 look promising to me. However, the RX 480 should have the best theoretical value (TFLOPS/$).
  • Wreckage - Wednesday, June 15, 2016 - link

    Going by their own benchmark of Steam VR, the 480 is about as fast as a 290. That's very underwhelming. I guess Hardocp was right.
  • doggface - Wednesday, June 15, 2016 - link

    390 not 290. And almost as good as a 980. For less than half the price.

    More importantly. Right smack bang in the mass market area where most people buy their GPUs.

    Also very interested in Polaris 11 if the price is right and the TDP is low...
  • geniekid - Wednesday, June 15, 2016 - link

    Agreed. The RX 480's theoretical position on the price-performance curve will dominate NVidia in the market overall with current prices. NVidia won't let that happen of course. Rumors of 1060 specs are already making their rounds.
  • Trixanity - Wednesday, June 15, 2016 - link

    Steam VR is an unreliable benchmark. It's not even a benchmark, really. It says cards like the GTX 580 and GTX 560 Ti are VR-ready (they are not), and other graphics cards score differently on each run - very differently. I've heard of a GTX 970 scoring between 6.1 and 7.9 (and everything in between) on the same machine. So don't put any stock in that.
  • hechacker1 - Wednesday, June 15, 2016 - link

    I agree. My old i7 920 @ 4GHz and an overclocked 970 (about 1500MHz boost) get me a solid 7.7 and nearly "very high" settings. No frame drops or frames under 90fps. It's a weak test.
  • T1beriu - Wednesday, June 15, 2016 - link

    Seems so weird to act as if you still don't know about P10/P11 when you've been to many AMD meetings in the last month and probably have several RX 480/470 cards in the office under NDA. :)
  • pats1111 - Wednesday, June 15, 2016 - link

    I have followed AnandTech for many years, and have watched it slide into an Advanced Micro Devices basher and a very pro-Intel/Nvidia website, not to mention (gag) Apple. I no longer take anything this site offers in the way of reviews seriously. Even mentioning HardOCP is blasphemous; they've been trashing AMD for years, and wonder why they weren't invited to the AMD announcement. If you want realistic reviews, go elsewhere, specifically somewhere you aren't seeing a multitude of ads on the pages for Intel/Nvidia/iPhone, et al.
  • johnpombrio - Wednesday, June 15, 2016 - link

    AMD as a company is doing terribly. It is on deathwatch over at Ars and on Forbes. It is hemorrhaging money, just sold off 11% of the company to the Chinese, and its R&D budget is down 30% in less than 3 years. It is hard to be excited for a company in such dire straits.
  • atlantico - Sunday, June 26, 2016 - link

    I remember when Apple was on the deathwatch, every week, every month, every year for almost a decade, people were creating clicks by writing negative articles about Apple. Major financial players were asking for liquidation of the company (hi Mike Dell!!) and pundits and internet "experts" alike were just counting down to the demise of Apple.

    Those who watched Apple closely at that time knew that things were turning around, because unlike a dying company, there was constant development going on, steady releases and interesting things happening.

    Dumb proclamations of "the Chinese owning 11%" of AMD mean nothing. Being on deathwatch means nothing. Budgets mean nothing. Capability, drive and product delivery do.

    AMD is still innovating, still making something new and interesting, still delivering those products. AMD is very much alive.
  • Ryan Smith - Thursday, June 16, 2016 - link

    One of our core rules is that we always adhere to embargoes, and that we never discuss whether we're under embargo or not. We don't do the "I know something you don't know" game because it's disrespectful to our readers, our competition, and the companies we cover. For embargoed information, we'll talk about it if and when the company is ready for us to share it with you.

    As a result all articles are written from the perspective of the reader, which means this is newly revealed information.
  • funkforce - Friday, June 17, 2016 - link

    Speaking of reviews:
    Where are the reviews for the GTX 960, 950, 1080 or 1070? Every time, you've promised "a later day", "next week", "barring further complications" or "personal matters", among a plethora of promises in comment sections or on Twitter. Don't you ever feel ashamed that you keep stringing people along for months and months and then just let it come to nothing again and again? Shouldn't an Editor in Chief lead by example? I just feel helpless; you are such a good writer and I've always valued AT's opinion the most. What is going on, Ryan? Seriously?
  • ManuelDiego - Wednesday, June 15, 2016 - link

    All other tech sites published their GTX 1080 reviews on May the 17th. It's June the 15th, 4 full weeks later, and AnandTech hasn't yet published theirs (not to mention the 1070).
    <ironyON> Will we see a combined Pascal/Polaris/Vega review some time this fall? <ironyOFF>
  • tarqsharq - Wednesday, June 15, 2016 - link

    The longer we wait the more and more curious I get as to the contents of the review... or what difficulties have delayed it.
  • stardude82 - Wednesday, June 15, 2016 - link

    I'm still waiting for that GTX 950 review from last summer.
  • TallestJon96 - Wednesday, June 15, 2016 - link

    Stole my joke.

    It does kind of become ridiculous when websites like Guru3D have reviews for not 1 but 4 different 1080s, plus reviews for the 1070, but AnandTech can't even get one out. Or Gamers Nexus has 1080 and 1070 reviews, plus a watercooled review for the 1080 and an SLI review for the 1070. Both of these sites use more intensive metrics than average and minimum fps; they use things like frame time graphs and 1% and 0.1% lows.

    I was sort of waiting for the 1070 to release so they could review both at the same time, then I was waiting to see if they reviewed them when the partner cards were available for review, and now I don't know what we are waiting for.

    If the Anandtech review blows me away I'll take my complaints back, but I have a feeling the conclusion will be the same: the 1080 is the best graphics card on the market, and the 1070 beats every other graphics cards out there. Buy whichever one you can afford.
  • Glkjouivnqwe - Wednesday, June 15, 2016 - link

    If you're looking for some copy-and-pasted bits from Nvidia PR along with game benchmarks, then Anandtech will never be your go-to place.

    Does SMP - the VR multi-projection - actually do anything? How well? Who knows? Certainly Guru3D hasn't tested any part of it, but they've taken up a good half-page regurgitating Nvidia's marketing materials on it.

    What about the other buzz-wordy features? The only way to find out is to actually test them. And that takes time. And also drivers with the functionality actually enabled, which may or may not exist yet.
  • Meteor2 - Wednesday, June 15, 2016 - link

    I come to AnandTech mainly for the comments. Well, it couldn't be due to the reviews, could it? Because there aren't any.
  • Ranger1065 - Thursday, June 16, 2016 - link

    A standard Anandtech apologist remark. While there may yet be a grain of truth to the old "no one goes into the details like Anandtech does" defense, truth be told, this is an inexcusable ball drop of immense proportions.

    As I have said previously, by the time they finally publish the EPIC ARTICLE, (and after this long one wonders if they ever will or even care about doing so) no amount of that special Anandtech sauce will be able to redeem it.

    News that old interests no one.

    RIP GPU SECTION.
  • trinibwoy - Thursday, June 16, 2016 - link

    Not to mention the MIA 960 review. It's the end of an era.
  • ikjadoon - Wednesday, June 15, 2016 - link

    Interestingly, TechReport (another relatively high-quality publication) also has not released their GTX 1080 review... very, very interesting!
  • HighTech4US - Wednesday, June 15, 2016 - link

    Both sites have lost high end personnel.

    I wonder if that is the primary reason for the delays.
  • CaedenV - Wednesday, June 15, 2016 - link

    I love how eSports have become synonymous with 'cheap budget cards' over this last year.
  • ikjadoon - Wednesday, June 15, 2016 - link

    Well, it makes sense. Aren't the big eSports games DOTA2 and LoL, which are hardly GPU-intensive?
  • stardude82 - Wednesday, June 15, 2016 - link

    If an RX 460 is in line with a GTX 950/R7 370, it should handle pretty much any eSports game at 1080p @ 60fps maxed out. The only exception would be Overwatch, which is very new, and even then only with settings turned down a notch.
  • nightbringer57 - Wednesday, June 15, 2016 - link

    It does actually make sense.

    We've seen actual budget cards become increasingly weaker compared to the high end. Meanwhile, somewhat undemanding games have become the bread and butter of esports, since to gain momentum, developers of esports oriented games target markets that do not traditionally have big budgets to invest into their computers. In turn, it's only logical that GPU manufacturers would release cheaper, sufficient cards for said markets. They do actually correspond to a reality and I don't think it's purely a marketing ploy.
  • Qwertilot - Wednesday, June 15, 2016 - link

    It isn't even outright about being cheap. There are definite characteristics for this sort of market. Raw speed isn't super crucial, but I guess you still want enough to run the odd console port at half sane settings.

    The crucial things are stuff like very low TDP, small size and very quiet coolers.
  • SoccerVolt - Wednesday, June 15, 2016 - link

    Does this confirm some additional specs for Project Scorpio in any way? Or can additional specs be derived from this presentation?
  • Eden-K121D - Wednesday, June 15, 2016 - link

    Nah, that will be Vega-based; otherwise Polaris-architecture semi-custom SoCs won't be able to do 4K 60FPS with 6 TFLOPS of performance, as Phil Spencer mentioned. Another thing, though: Xbox Scorpio might come with 12GB of GDDR5X RAM.
  • boozed - Wednesday, June 15, 2016 - link

    What's all this "z-height" nonsense? What happened to "thickness"?
  • mateau - Thursday, June 16, 2016 - link

    ANANDTECH is completely missing the big story about AMD's RX 480.

    RX 470 and 460 are for folks who want better performance but can't afford RX 480.

    And that is an addressable market of hundreds of millions of folks.

    But that is NOT the big story.

    What is huge is AMD has released a SCALABLE graphics add in board with RX 480.

    For $199.99 you get better than Fury X performance.

    Buy two and spend $399.98 and you get better than GTX 1080 performance.

    In fact 2x RX 480 in Crossfire mode CRUSHES GTX 1080 in AOTS DX12 benchmarks.

    Who cares about DX11? Or DX10 or DX9 for that matter. DX12 is the future of gaming.

    Need more performance just bung in another $199.99 RX 480.

    NVidia has again screwed over its customers by cancelling 3-4 way SLI also. NVidia is not supporting scalability.

    Buy Radeon and you BUY AMERICAN.
  • TheinsanegamerN - Thursday, June 16, 2016 - link

    A scalable architecture? You mean Crossfire? That's been around since the mid-2000s. 2FPS faster is not "crushing" the 1080. 3-4 way GPU setups have always scaled poorly, so it is no surprise that Nvidia dropped it (and 4-way SLI was never officially supported in the first place).

    Whatever you are smoking, you might want to drop it for a while and come back to reality.
  • Psycho Turkey - Tuesday, June 28, 2016 - link

    You have some neck on you, coming out with that garbage and expecting anyone to take you seriously.
    2 fps extra, you say? Nearly three years ago I broke an 11-year stint of running solely Nvidia GPUs. I went and bought two 290s, flashed them to 290Xs, and have been running them water-cooled and crossfired for nearly 3 years since. The average gain from running a 2nd GPU is well above 90% more fps than with the single GPU in.
    I keep seeing muppets like yourself making ridiculous claims about things they have zero experience of and then expecting people to take them seriously. On top of that you then have the gall to ask the guy "what he was smoking"?
    You're a f-ing joke, mate!
  • Ammaross - Thursday, June 16, 2016 - link

    An RX 480 is more in line with a 390X, not a Fury X.
    2x RX 480s only get a couple FPS more in AOTS DX12 (62ish vs 59).
    DX11 is still a major codebase for most gaming engines, and it will be a couple of years before DX12 is more the norm.
    NVidia cancelling 3-4 way SLI (which they're not) cuts out the <0.1% of the market that did 3+ SLI. DX12 is basically replacing XFire/SLI with explicit multi-adapter. Just like you said, "DX12 is the future of gaming", and thus antiquated AFR is being phased out for DX12 EMA.
    But yes, AMD hitting the mid-to-low-end market with the RX 480 and down is a smart move. Market share and mind share are important. They'll have a high-end answer with Vega, but they likely need all the fab wafers they can get right now to keep up with mainstream demand for the new cards. NVidia went the other way, and they can't keep their cards in stock - and not likely due to demand.
  • Jay77 - Thursday, June 16, 2016 - link

    DirectX 12 might be the future of gaming, but AotS isn't. Until there's a compliant game I'd actually buy, with real money, DX12 is just so much MS fart gas.

    Let's all mourn the loss of 3+ SLI; I'm sure half a dozen people are absolutely inconsolable.
  • Morawka - Thursday, June 16, 2016 - link

    Can't wait to see these in notebooks at super cheap prices.
  • Valantar - Thursday, June 16, 2016 - link

    That second to last slide ("Endnotes") has some interesting tidbits tucked away.

    -Radeon RX 470 is listed as a "110W" card

    -Radeon RX 480M is apparently a fully enabled Polaris 11, with 16 CUs. Roughly the same as RX 460, but clocked lower? Also, >4000 in Fire Strike at 35W system power? Holy sh*t. Even with a dual core 15W i7 (4600), that's amazing. Close to a GTX 960M, but at ... half the power? Less?
  • DonMiguel85 - Thursday, June 16, 2016 - link

    I consider it a travesty that Anandtech never posted reviews for the GTX 960 and 950, even promising to post one for the former.
  • milkod2001 - Thursday, June 16, 2016 - link

    Better get used to it. Since Anand and some of his team left, and also after the site got acquired by Purch, things have gotten a little slow here. Purch doesn't give a shiit about the quality or quantity of posts/reviews, etc. All it cares about is clicks on ads.

    There have been literally zero improvements to the website since Anand left. Still no edit option for comments. How pathetic is that for the no. 1 tech site in 2016? Sigh.
  • PeckingOrder - Thursday, June 16, 2016 - link

    Is there any decent alternative to what AnandTech used to be a couple of years ago?
  • Michael Bay - Thursday, June 16, 2016 - link

    >It'S THE CURRENT YEEEEAAAAAAAAR

    You'll post comments the AT way, and you'll like it, goddamn it!
  • Mondozai - Sunday, June 19, 2016 - link

    AT is not the #1 tech site anymore. It used to be during the golden days of 2009-2013. That is when most of us discovered the site, especially during the mobile review days. Brian was just the best reviewer out there, period.

    Today there isn't a clear successor. For GPUs, I'd say that SweClockers and computerbase.de are great. For English speakers, I'd say GamersNexus.
