93 Comments
zmeul - Friday, April 10, 2015 - link
why are you calling 3840x2160 4K? it's UHD, not 4K
Sixonetwo - Friday, April 10, 2015 - link
The answer is in the first paragraph. Read the article if you're gonna be snarky:
"Despite the differences, '4K' has become entrenched in the minds of the consumers as a reference to UHD. Hence, we will be using them interchangeably in the rest of this piece."
zmeul - Friday, April 10, 2015 - link
not being snarky at all, 3840x2160 has been defined by Digital Cinema Initiatives (DCI) as UHD - it's a standard
why not follow it? because calling two resolutions the same creates a lot more confusion instead of clarifying it
frodbonzi - Friday, April 10, 2015 - link
The article clearly states the different standards and since the VAST majority of people refer to them all as 4k, simply stated it would do so as well... If you got confused by this, I suggest you might need to pay attention during your Grade 10 math class...
Aikouka - Friday, April 10, 2015 - link
The same reason why tossing a 'p' after some random number immediately defines some "standard" resolution. I don't think most people realize that 'p' just means progressive scan, and worst of all, it pushes a *really* bad assumption that all resolutions are presented in 16:9.
phoenix_rizzen - Saturday, April 11, 2015 - link
This is why every resolution should be given with the aspect ratio and the number of vertical pixels. You know, how it's been done for the past 20-odd years.
4:3 480p
16:9 720p
16:10 800p
16:9 1080p
21:9 1080p
16:9 1440p
16:9 2160p
Etc
Gives you all the information you need, without any extraneous marketing fluff.
Switching to "number of horizontal pixels" without any other information is just dumb. 2k, 2.5k, 4k, 8k are all just marketing fluff that don't give you enough information to know what the display actually looks like/supports.
It's too bad a technical site like this can't go beyond the marketing to actually present proper technical specs, and correct "the common knowledge" floating around out there.
JonnyDough - Tuesday, April 14, 2015 - link
It's too bad commenters can't properly call this what it is, a marketing site. It hasn't been a tech site since Tom sold it to Best of Media.
JonnyDough - Tuesday, April 14, 2015 - link
Oops, I was thinking I was on Tom's Hardware. :D Not that it matters, Anandtech also just sold out.
StevoLincolnite - Sunday, April 19, 2015 - link
The same owners of Toms Hardware now own Anandtech... Dailytech then split off.
I went from visiting the site on a daily basis to probably once a month... The content is either poor or there is nothing new; normally that wasn't a problem, I would just navigate over to Dailytech and read people's comments in a Barry White voice.
nathanddrews - Friday, April 10, 2015 - link
That ship has sailed. There's no point squabbling over the insignificantly small difference (6%) between full "DCI 4K" vs "UHD". The people that NEED to know the difference already DO.
Just as 1080p was a byproduct of the 2K era, UHD is a byproduct of the 4K era (and soon, 8K era). Both serve as compromises to the multiple resolutions and aspect ratios of their respective eras, to ensure that everyday consumers have the ability to display content from a wide variety of sources.
10042015 - Friday, April 10, 2015 - link
This is exactly right, to further quote Wikipedia:
>> >>
UHD
UHD is a resolution of 3840 pixels × 2160 lines (8.3 megapixels, aspect ratio 16:9) and is one of the two resolutions of ultra high definition television targeted towards consumer television, the other being FUHD which is 7680 pixels × 4320 lines (33.2 megapixels). UHD has twice the horizontal and vertical resolution of the 1080p HDTV format, with four times as many pixels overall.[1][31]
Digital cinema
The Digital Cinema Initiatives consortium established a standard resolution of 4096 pixels × 2160 lines (8.8 megapixels, aspect ratio 256:135) for 4K film projection. This is the native resolution for DCI-compliant 4K digital projectors and monitors; pixels are cropped from the top or sides depending on the aspect ratio of the content being projected. The DCI 4K standard has twice the horizontal and vertical resolution of DCI 2K, with four times as many pixels overall. DCI 4K does not conform to the standard 1080p Full HD aspect ratio (16:9), so it is not a multiple of the 1080p display.
4K digital films may be produced, scanned, or stored in a number of other resolutions depending on what storage aspect ratio is used.[33][34] In the digital film production chain, a resolution of 4096 × 3112 is often used for acquiring "open gate" or anamorphic input material, a resolution based on the historical resolution of scanned Super 35mm film.[35]
Format Resolution Display aspect ratio Pixels
UHDTV 3840 × 2160 1.78:1 (16:9) 8,294,400
Ultra wide television 5120 × 2160 2.33:1 (21:9) 11,059,200
WHXGA 5120 × 3200 1.60:1 (16:10) 16,384,000
DCI 4K (native resolution) 4096 × 2160 1.90:1 (256:135) 8,847,360
DCI 4K (CinemaScope cropped) 4096 × 1716 2.39:1 7,028,736
DCI 4K (flat cropped) 3996 × 2160 1.85:1 8,631,360
<< <<
Source:
https://en.wikipedia.org/wiki/4K_resolution
nathanddrews - Friday, April 10, 2015 - link
The Consumer Electronics Association would disagree with you. They defined 4K and UHD as being the same thing. UHD/4K qualifies for any television capable of displaying a MINIMUM of 3840x2160.
"The group also defined the core characteristics of Ultra High-Definition TVs, monitors and projectors for the home. Minimum performance attributes include display resolution of at least eight million active pixels, with at least 3,840 horizontally and at least 2,160 vertically. Displays will have an aspect ratio with width to height of at least 16 X 9. To use the Ultra HD label, display products will require at least one digital input capable of carrying and presenting native 4K format video from this input at full 3,840 X 2,160 resolution without relying solely on up-converting."
https://www.ce.org/News/News-Releases/Press-Releas...
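Read literally, the quoted criteria reduce to a simple check; a minimal sketch (the helper name and the test values are mine, not from the CEA release):

```python
# Minimal reading of the quoted CEA "Ultra HD" minimums; helper name is hypothetical.
def qualifies_as_ultra_hd(width_px, height_px):
    return (width_px >= 3840 and height_px >= 2160
            and width_px * height_px >= 8_000_000   # "at least eight million active pixels"
            and width_px / height_px >= 16 / 9)     # aspect ratio of "at least 16 X 9"

print(qualifies_as_ultra_hd(3840, 2160))  # True  -- UHD
print(qualifies_as_ultra_hd(4096, 2160))  # True  -- DCI 4K clears the bar too
print(qualifies_as_ultra_hd(2560, 1440))  # False -- QHD falls short
```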
geekfool - Friday, April 10, 2015 - link
anandtech, generalizing helps no one if its base assumption is wrong as per the NHK R&D/BBC R&D multi year collaboration....
you mean officially called
"UHD-1" at 3840x2160 and
"UHD-2" at 7680x4320 pixels and is ITU BT.2020 real colour 10bit/12bit per pixel 120 fps etc.... compliant
+ let's also not forget 22.2 sound capable as per the NHK R&D/BBC R&D multi year collaboration to define all this UHD-1/UHD-2 standard.
http://www.bbc.co.uk/rd/blog/2013/06/defining-the-...
and official EBU test sequences are available up to 12_EBU_ZurichAthletics2014_UHD_300p_HDR 3840x2160 360° "300 fps" 3026 52636-55635 and beyond.
https://tech.ebu.ch/docs/testmaterial/ebu_test_seq...
Samus - Friday, April 10, 2015 - link
dude, people call things out against their "standard" names all the time. what's next, are you going to harp on people calling a USB flash drive a "thumb" drive?
SirMaster - Friday, April 10, 2015 - link
4K is a generic term. You are referring to 4K DCI. UHD is simply another resolution in the 4K family.
4K resolution, also called 4K, refers to a display device or content having horizontal resolution on the order of 4,000 pixels. Several 4K resolutions exist in the fields of digital television and digital cinematography.
Scabies - Friday, April 10, 2015 - link
Remember how "HD" came to refer to everything higher than 480p? Same idea. If it's 4K-ish, they're gonna call it 4K since that has the greatest market buzzword value right now.geekfool - Friday, April 10, 2015 - link
in the UK at least, "HD ready" was deemed acceptable for "720P" by the sales teams, not by the consumers, and "full HD" for "1080P"....
as of today, i see another PR fake "UHD-1" aka 4k/UHD happening, as in they will label anything that is NOT ITU-R Rec. BT.2020 compliant as "UHD [4K] ready" and "UHD [8K] ready" while the real UHD-1/UHD-2 become Full UHD [4k/8k]...
Joint Task Force on Networked Media (JT-NM)
Minimum Viable System Requirements
Of a Sample System Architecture for Live Multi-Camera Studio Production
if these UHD (12bit!) screens are not ITU-R "Rec. BT.2020" 10bit/12bit real colour compliant then why should we htpc users care....
https://tech.ebu.ch/docs/groups/jtnm/JT-NM%20MVS%2...
Video Format Resolution – The system shall be capable of carrying video payload of any resolution up to the size of UHDTV2 (7680 x 4320).
Video Image Rate – The system shall be capable of carrying video payload of any frame or field rate up to 300 Hz, and shall be capable of carrying NTSC style fractional frame rates at multiples of (1000/1001).
Video Sample Depth – The system shall be capable of carrying video payload sample depths of 10 or 12 bits.
Video Chroma Sampling – The system shall be capable of carrying a video payload of 4:2:2 or 4:4:4 chroma sampling.
Alpha Channel – The system shall be capable of carrying a video payload that contains an Alpha Channel (a component that represents transparency).
Color Spaces – The system shall be capable of carrying video payload in the color space of ITU-R Rec. BT.601, ITU-R Rec. BT.709, and ITU-R Rec. BT.2020.
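For a sense of scale, the raw payload at the quoted maxima runs to hundreds of gigabits per second; a rough calculation (my arithmetic, not a figure from the JT-NM document):

```python
# Uncompressed payload at the JT-NM maxima quoted above:
# UHDTV2 (7680 x 4320), 300 Hz, 12-bit samples, 4:4:4 (3 samples per pixel).
width, height, fps = 7680, 4320, 300
bits_per_sample, samples_per_pixel = 12, 3

gbits_per_sec = width * height * fps * bits_per_sample * samples_per_pixel / 1e9
print(f"~{gbits_per_sec:.0f} Gbit/s uncompressed")  # roughly 358 Gbit/s
```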
jay401 - Friday, April 10, 2015 - link
Btw, the first image in the article has a typo "efficieny" making it seem fake.
aaronwt - Saturday, April 18, 2015 - link
Why are they calling 1080P "full HD"? There is no such thing. It's just a marketing term. 1080P is HD.
SydneyBlue120d - Friday, April 10, 2015 - link
What about AMD and HEVC? I do not see any AMD part listed in the HEVC decoding table :)
frodbonzi - Friday, April 10, 2015 - link
They did mention the Radeon earlier - looks like they don't support it yet? ATI cards used to be vastly superior for HTPCs once upon a time... wonder what will come out when they release their new graphics lineup.
Samus - Friday, April 10, 2015 - link
AMD's current lineup all consists of a pretty old architecture. GCN has been revised 3 times, but it's still the same architecture. Big Maxwell kind of beat them to market. AMD claims they've been waiting to market a new chip until AFTER all these standards are set so they can have native support across all of them (DX12, HDMI 1.4, etc.)
As long as AMD has a stockpile of next-gen chips available for the XMAS season they'll be ok.
Aikouka - Friday, April 10, 2015 - link
Last I knew, UVD didn't support h.265 at all, and unless my Google Fu is failing me, that's still correct. However, I did find this nugget:
http://us.hardware.info/reviews/5156/6/amd-a10-785...
"Hardware-based support for the brand new H.265 / HEVC codec that will be used for 4K content is not yet a part of UVD, but there's good news in this department. Together with Telestream, AMD has developed HEVC codec that uses HSA that's able to play 4K HEVC content on Kaveri with a very low load on the CPU. It's unclear how and when that codec will become available to consumers, but the fact that the chip is specifically suitable for 4K HEVC is great news if you want to build an HTPC. AMD also wants HSA to be used for Open Source projects, so it wouldn't surprise us if they release an HSA-compatible OpenCL open source H.265 codec."
Ryan Smith - Friday, April 10, 2015 - link
They don't have any HEVC decode support in current products.
http://images.anandtech.com/doci/8460/DXVA_285.png
Carrizo will be the first product to support it.
DanNeely - Friday, April 10, 2015 - link
Because AMD has just been rebadging its existing GPUs while TSMC has been failing to get new processes out the door, while Intel and nVidia have updated their product lines.
NikosD - Friday, April 10, 2015 - link
AMD supports HEVC decoding through OpenCL.
Cyberlink's HEVC decoder and free Strongene Lentoid HEVC decoder can accelerate HEVC decoding using GPU shaders for both AMD and Intel GPUs.
Take a look here:
http://forum.doom9.org/showthread.php?p=1705352#po...
dryloch - Friday, April 10, 2015 - link
When they tried to disable 1080P over Component too early, consumers complained and they gave in. We need to do the same thing with this HDMI 2.0 nonsense. If they ratified HDMI 2.0 for receivers and TVs they should have required the new 2.2 copy protection... otherwise what is the point. We need to push them to allow 4K Blu-ray over HDMI 2.0, period.
barleyguy - Friday, April 10, 2015 - link
Though you are correct that consumers complained, that's not why they gave in. They gave in because the FCC said "you will not disable component outputs on existing devices". They (being Comcast in this specific case) even applied for an exemption for Pay Per View, and were turned down.
So the only way they could disable component ports in the US was to not put them on devices at all. That's why many BluRay players nowadays don't include component outputs.
Note though that internet distribution was immune to the ruling, which is how Vudu gets away with downrezzing component outputs to 480i.
HDCP 2.2 is mostly irrelevant to me, because I don't think there's a significant difference between 1080p and 4K at living room viewing distances. Even though my living room screen is 100". (LCOS projector)
luffy889 - Friday, April 10, 2015 - link
DXVA Checker reports that my GTX 970 supports HEVC_VLD_Main for resolutions up to QFHD. The GT 650M on my rMBP in Boot Camp reports HEVC_VLD_Main too, but for FHD only.
zepi - Friday, April 10, 2015 - link
10 and 12bit colors are becoming a necessity sooner or later. IPS screens are approaching contrast ratios of 2000:1 and with only 8-bit processing it is not possible to display more than 256 luminance steps.
What good does high contrast bring if your digital signal path reduces actual presentable dynamic range down to 8 bits?
Digital cameras can capture dynamic ranges up to 13-14 stops (bits) in good conditions, and the best PVA screens already achieve over 4000:1 contrast ratios, which would benefit from linear contrast steps all the way up to 12-13bit signaling.
And then we have OLED screens, which should in theory offer unlimited contrast ratios, though in practice problems with the lowest driver currents limit the bottom-end usable brightness to some non-zero level. (See the Oculus DK2 black smearing issues and the "hack" to limit low-end brightness to RGB 1,1,1 instead of true zero.)
Higher bit-depths are needed if we want to get real and usable increases to dynamic ranges of the monitors.
Currently we often calibrate monitors to 200 nits brightness for a 255,255,255 signal with 8bit color channels and that is it, there is nothing above that. It is like saying that 48dB music loudness ought to be enough for everyone, and anyone who wants to listen louder ought to go home and stop destroying people's hearing...
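A rough back-of-the-envelope sketch of the banding argument above (the gamma curve, the 25% test point and the ~1% visibility threshold are my own assumptions, not measured figures):

```python
# Relative luminance jump between adjacent code values for a gamma-2.2
# encoded signal, compared against an assumed ~1% just-noticeable difference.
GAMMA = 2.2
JND = 0.01  # assumed Weber-fraction threshold for visible banding

def lum(code, bits):
    return (code / (2 ** bits - 1)) ** GAMMA

for bits in (8, 10, 12):
    code = round(0.25 * (2 ** bits - 1))  # a dark grey, 25% of full code range
    step = lum(code + 1, bits) / lum(code, bits) - 1
    verdict = "banding likely visible" if step > JND else "below the assumed JND"
    print(f"{bits}-bit: {2 ** bits} codes, step of {step:.2%} ({verdict})")
```

The jumps only get larger closer to black, which is exactly where a 2000:1 or 4000:1 panel has headroom that an 8-bit signal path cannot address.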
Oxford Guy - Friday, April 10, 2015 - link
PVA?
A-MVA panels from Sharp, as used in two Eizos, exceed 5500:1. There is also a 1440 res panel from TP Vision with similar numbers.
ScepticMatt - Friday, April 10, 2015 - link
You have a NUC5i7RYH for review, or is that a typo? (If you are allowed to answer that)
ganeshts - Friday, April 10, 2015 - link
Yes, I have the review ready, but there are some strange aspects in the storage subsystem testing - waiting for Intel to shed more light on our findings. May opt to put out the review next week even if they don't respond.
CaedenV - Friday, April 10, 2015 - link
1) Windows 10 already has HEVC and MKV support built-in. This has rather limited implications for desktop users, but it means that we could see HEVC support on Xbox One and Windows Phone in the near future which would be interesting. It may not mean anything for 4K content directly, but the ability to have 1080p HEVC content on such devices is going to be a big deal, and we will see 4K on such devices sooner rather than later.
2) Will anyone care about 8K from a content perspective? I mean look at the crap quality of DVD and how well it has managed to be up scaled to 1080p displays. Sure, it does not look like FHD content, but it looks a heck of a lot better than 1990's DVD, and with how little data there is to work with it is truly nothing short of a miracle. Likewise, moving from 1080p content to a 4K display looks pretty nice. Looking at still-frames there is certainly a difference in resolution quality... but in video motion the difference is really not that noticeable. The real difference that will stand out is the expanded color range and contrast ratio of 4K video compared to what is available for 1080p Bluray content rather than the resolution gains.
Moving from 4K to 8K will be even less of a difference. 4K content will have so much resolution information, combined with color and contrast information, that it will be essentially indistinguishable from any kind of 8K content available when up scaled. That is not to say that 8K will not be better... just not practically better. Or better put; 8K will only be noticeably better in so few situations that it will not be practical to invest in anything better than 4K content which could be easily and un-noticeably up scaled to newer 8K displays. 4K just hits so close to so many physical limitations that it becomes a heck of a lot of work for little to no benefit to move beyond it. I think we will see 4K hang around for a very long time; at least as a content standard. At least until we switch to natively 3D mediums which will require 4K+ resolution per eye, or we start seeing bionic implants that improve our vision.
3) Is copy protection really going to be a big deal going forward? Last year I got so frustrated with my streaming experiences for disc-less devices, and annoyed at trying to find discs in my media library that I finally bit the bullet and ripped my whole library to a home server. With 4K media it is going to be the exact same workflow where I purchase a disc, rip it to the server, and play it back on whatever device I want. Copy protection is simply never going to be good enough to stop pirates, so why not adopt the format of the pirates and have reasonable pricing on content like the music industry did? Would it really be that difficult to make MKV HEVC the MP3 of the video world and just sell them directly on Amazon where you could store them locally if you want or re-download per viewing if you really don't have the storage space? DRM just seems like such a silly show of back-and-forth that it is less than useless.
dullard - Friday, April 10, 2015 - link
CaedenV, I think you are underestimating the number of people who look at still pictures. While what you are saying is mostly true for movies, movies are just one of many uses of a display. We've had 10 megapixel cameras for about 10 years now. And still today, virtually no display can actually show 10 megapixels. 4K displays certainly can't without cutting out lines or otherwise compromising / compacting the picture.
valnar - Friday, April 10, 2015 - link
I hate to be the guy who says "640K ought to be enough for anybody", but at the moment I can barely tell the difference between 720p and 1080p on a 46" screen. Now, I understand the push these days is bigger is better, but at some point even your average household isn't going to want a TV bigger than 65" or so. Who is 4K and 8K for? I can see 8K for the theaters, but short of that...? 1080p will not only hold me for a long time, it might hold me indefinitely. People already can't tell the difference between DTS and DTS-HD in blind tests.
edzieba - Friday, April 10, 2015 - link
"but at the moment I can barely tell the difference between 720p and 1080p on a 46" screen"Insufficient parameters. Even assuming 20:20 vision, the angle subtended by the display (or the viewing distance from the dsplay combined by the display size) are very important in determining optimum resolution and refresh rate. The distance recommended for SDTV is far too far away for viewing HDTV optimally, so if you swapped an SDTV for an HDTV without moving your chair/sofa, you're getting a sub-optimal picture.
CaedenV - Friday, April 10, 2015 - link
personally I can usually tell a pretty big difference between 720p and 1080p (though sometimes I have been fooled), especially if there is any finely detailed iconography on screen (like text and titles)... but I have pretty good eyes and can't tell the difference between 1080p and 4K in most situations. I plan to move to 4K on my PC because I need (who am I kidding, I WANT) a 35-45" display and 1080p simply does not hold up at those sizes at that distance. For a living room situation 4K does not start making sense most of the time until you start getting larger than 55", and even then you need to be sitting relatively close. It would make the most sense with projector systems that take a whole wall... but those are not going to drop in price any time soon.
CaedenV - Friday, April 10, 2015 - link
Oh sure, for a computer monitor in a production environment there is certainly a case to be made for 8K, and even 16K which is in the works. Probably not a case to be made for what I do, but I can certainly see the utility for Photoshop power users and content creators.
But in an HTPC situation (which is what the article is about) there is not a huge experiential difference between 1080p and 4K (at least from a resolution standpoint, I understand 4K brings other things to the table as well). I mean it is better... but not mind-blowingly better like the move from SD to FHD was. The move from UHD to 8K+ will be even less noticeable.
Also, the MP count on your typical sensor is extremely misleading as the effective image size is considerably lower. Most 8MP cameras (especially on cell phones and point-and-shoot devices) can barely make a decent 1080p image. A lot of that sensor data is combined or averaged to get your result, so even then 4K is going to be more than enough for still images (unless shooting RAW).
foxtrot1_1 - Friday, April 10, 2015 - link
Why are tech companies so bad at agreeing on standards for things? The PC world is a nightmare right now with competing next-gen standards, and now an HDMI 2.0 cable and an HDMI 2.0 source might not work together properly. I feel like universal standards are more important than cutting costs to the bone and inconveniencing consumers, but then I'm not the CEO of Toshiba.
Oxford Guy - Friday, April 10, 2015 - link
"Why are tech companies so bad at agreeing on standards for things?" Collusion for speeding up planned obsolescence. What else?Oxford Guy - Friday, April 10, 2015 - link
Also, divide and conquer is an old tactic, indeed.
bznotins - Friday, April 10, 2015 - link
Will Skylake come with better HEVC support, and thus is it worth waiting for if one wants to build a new HTPC/server this year?
CaedenV - Friday, April 10, 2015 - link
yes and no. Yes, Skylake should (note should) have hardware HEVC support. However, even older desktop i3/5/7 CPUs are powerful enough to do software decoding for 4K HEVC, so it is sort of a moot point. Hardware support is really only important in the mobile space where you are running on a battery, or low power devices (like Atom chips) which cannot decode fast enough on the CPU cores.
CaedenV - Friday, April 10, 2015 - link
in addition to that, what really matters on the desktop chips is HEVC encode (or VP9 for that matter). Encoding time for HEVC takes forever, and getting hardware encoding would dramatically speed things up, which could really help with things like streaming services.
DesktopMan - Saturday, April 11, 2015 - link
I doubt any 35W/45W parts (commonly used in HTPCs) can do 4K60 in software only. Not all lower power systems are running on batteries.
ToTTenTranz - Friday, April 10, 2015 - link
So Core M doesn't have even an ounce of HEVC decoding hardware?
How about Cherry Trail?
FwFred - Friday, April 10, 2015 - link
Most likely the hardware is disabled to keep TDP in check.
CaedenV - Friday, April 10, 2015 - link
probably a next gen feature. It is still pretty new tech.
FwFred - Friday, April 10, 2015 - link
I've learned a long time ago not to worry about 'future proofing' my AV equipment. I buy everything at once to ensure it's all compatible and actually takes advantage of the new features being hyped. It also ends up being cheaper since by the time the full AV chain is ready, prices have dropped, and quality/compatibility has improved.
Laststop311 - Friday, April 10, 2015 - link
so if you are building an HTPC and care about 4k the only real option is using a discrete GTX 960. I don't like having discrete cards in my HTPC though. Will AMD's Carrizo APU have hardware HEVC decode? Will Skylake have hardware HEVC decode? Has that info been announced at all? I know Carrizo is supposed to use Hawaii GCN architecture and that doesn't have HEVC hardware decode, so that's not really looking good. Have no clue what Intel is doing for Skylake's iGPU. I hope 2015 solves this issue.
dorion - Friday, April 10, 2015 - link
"Unfortunately, the version of HDCP used to protect HD content was compromised quite some time back.""Consumers need to remember that not all HDMI 2.0 implementations are equal."
Or we just crack the shitty DRM encryption again. It's not even useful. HDCP was cracked but it didn't flood the market with piracy; no, that happened 3 years earlier when the encryption used on Blu-ray discs was cracked. Physical access to a device is the ultimate security breach.
Laststop311 - Friday, April 10, 2015 - link
cracking the HDCP wasn't really that useful; who wants to capture an uncompressed video signal to the monitor? That's like HUGEEEE file sizes, and it's even more impractical for 4k. What is the bitrate on an uncompressed 4k video signal? Astronomical, I know that. Would spinning platter HDDs even be fast enough to capture it? I'm pretty sure SSDs would be required, and the whole idea is incredibly impractical. All the money put into the new HDCP was basically a complete waste as no one is going to bother going through that hassle. The disc encryption will just be cracked again and 4k MKVs encoded in HEVC will be popping up all over once 4k discs become standardized.
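For what it's worth, the arithmetic behind that point (my example figures: 8-bit RGB at UHD 60 Hz; actual capture formats vary):

```python
# Raw data rate of an uncompressed 3840x2160 signal at 60 fps, 24 bits per pixel.
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24

bits_per_sec = width * height * fps * bits_per_pixel
print(f"~{bits_per_sec / 1e9:.1f} Gbit/s "                    # ~11.9 Gbit/s
      f"(~{bits_per_sec / 8 / 1e9:.1f} GB/s to disk, "        # ~1.5 GB/s
      f"~{bits_per_sec / 8 * 3600 / 1e12:.1f} TB per hour)")  # ~5.4 TB per hour
```

A single spinning disk sustains somewhere around 0.15-0.2 GB/s, so an SSD array really would be needed to keep up with ~1.5 GB/s, which is the commenter's point.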
Oxford Guy - Friday, April 10, 2015 - link
Why do you think the industry is so intent on pushing unnecessary bloat in the form of 4K? There are so many positives for industry players and really none for consumers.
1440 is enough resolution for HDTV.
CaedenV - Friday, April 10, 2015 - link
totally agree. HDCP is just an annoyance that breaks things at best, and slows down piracy by a few months. They really need to do what the music industry did with MP3 format. Embrace the format that everyone wants and loves, and offer affordable content. If they remove the barriers to ownership, then more people will purchase.
willis936 - Friday, April 10, 2015 - link
Who on earth would want to intercept a real time uncompressed video stream when you have the disc? It's beyond insane. Just snag the stream and avoid transcoding. The only practical use case is video game consoles, computers, and cable boxes. Even then it should only ever be used on cable boxes.
mattlach - Friday, April 10, 2015 - link
Don't really care about 4k on my HTPC.
At normal viewing distances you can't tell the difference anyway.
Now, on my desktop on the other hand... But I will be holding off until there is a decent 16:10 4k monitor. I just can't bring myself to go 16:9
Laststop311 - Friday, April 10, 2015 - link
4k starts to matter at 65 inches imo and definitely matters at 70"+. 4k also matters if you like to sit close and feel immersed. 60 inch and below is pointless for 4k though, unless you sit 2 feet away; I agree with that, and the majority of people only have 60 inches or below. I currently use a 65 inch Panasonic ST60 TV, but the LG 4k OLEDs are making their appearance, albeit at an absurd $10k for 65 inch. But just like the 55 inch 1080p that started at $11k and is now $3k 2 years later, surely this will eventually creep into the $4k range, and then I can buy it.
cptcolo - Friday, April 10, 2015 - link
In about a year or so from now I'll prob buy a ~70in 4K TV. But it will have to meet the minimum requirements described above (HDMI 2.0a & HDCP 2.2).
Oxford Guy - Friday, April 10, 2015 - link
"4k starts to matter at 65 inches imo and definitely matters at 70"+"You're forgetting something. 1080 was given to us precisely because it wasn't enough resolution. Instead of just doubling 720 to 1440, which would have been plenty of resolution, even for large sets, the industry instead has duped people into thinking they need 4K.
cptcolo - Friday, April 10, 2015 - link
It looks like we are about a year away from truly useful 4K / UHD.
I'll keep an eye out for hardware meeting the minimum requirements: HDMI 2.0 (or better yet HDMI 2.0a), HDCP 2.2 and ARC. I have my fingers crossed on Skylake Integrated Graphics 3840x2160p60 support.
Willardjuice - Friday, April 10, 2015 - link
Was the million dollar question ever answered about Maxwell 2+ (does it support both 4:4:4 and hdcp 2.2)? I know of no TV/receiver that does atm but never could find the answer about Maxwell 2.
oshogg - Friday, April 10, 2015 - link
The real "Key Takeaway" is that most popular 4K broadcast content (Netflix, Amazon Instant Video etc.) is not watchable on PC right now - and it isn't clear if it will be possible to do so even in near future.Oxford Guy - Friday, April 10, 2015 - link
Congratulations, consumer sheep. You have been scammed again. Instead of just creating a logical HDTV standard, one that maximizes human visual acuity with minimal data size overkill, you have been led around with a carrot on a stick in order to convince you to keep replacing your equipment. First there was 720. Then 1080 (which definitely should have never existed). Then, we go to 4K with manufacturers already plotting 8K sets sooner than you think.
HDTV should have gone right to 1440 and stopped there. 1080 should have never existed and 4K is ridiculous overkill for TV viewing unless you sit one or two feet away from the set. But, manufacturers will continue to convince you that you need highly compressed (artifacted), color-drained, excessively bulky and slow to encode 4K video because moar pixelz = moar entertainment!
The same silliness affects the monitor market, where consumers have been duped into thinking pixels they can't see (because they're so so small!) are more important than things like wide gamut GB-LED backlighting — even though sRGB is an ancient standard that doesn't even cover the even more ancient ISO Coated printing space let alone modern inkjet printers' spaces, let alone coming even slightly close to the limits of human vision.
What should have happened after 720 is 1440 with at least AdobeRGB gamut (if not more). Smaller bandwidth requirements would have lowered the compression demands. That would have, in turn, increased the quality of the playback — most likely above highly-compressed 4K. Now, I notice that wide gamut is now apparently part of the hodgepodge of "standards" being tossed about for 4K, but I'll believe it when I see it. Regardless, the compression is going to have to be heavy to deal with the unnecessarily large files.
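For reference, the relative pixel counts behind that bandwidth argument (my arithmetic; real bitrates also depend on codec, chroma subsampling and frame rate, not just pixel count):

```python
# Pixel counts relative to 1080p for the resolutions being debated.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "UHD/2160p": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP ({w * h / base:.2f}x the pixels of 1080p)")
```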
Oxford Guy - Friday, April 10, 2015 - link
If you don't believe me: http://www.tftcentral.co.uk/articles/visual_acuity...
Oxford Guy - Friday, April 10, 2015 - link
I should modify the sentence "the same silliness" to note that 4K at least makes some sense for monitors — not for TVs, due to viewing distance. See the TFTCentral article.
serendip - Monday, April 13, 2015 - link
Just like how more CPU cores in phones meant more performance or how a QHD screen somehow got squeezed into a phablet - because more is better. It has nothing to do with good engineering, it has everything to do with stupid marketing departments.
We still have 1080p content on MPEG2 through satellite, in all its blocky compressed "HD" glory. 4k content would probably suffer the same fate.
zodiacfml - Saturday, April 11, 2015 - link
Terrible and there are other physical standards.
geok1ng - Saturday, April 11, 2015 - link
I do not understand why the GTX 960 meets the HTPC requirements for the 4K era.
ganeshts - Saturday, April 11, 2015 - link
The number of people who would consider a dual-slot GPU to be a HTPC GPU is very small. It is preferable that HTPC GPUs be single slot and/or passively cooled. Integrated GPUs are the best bet unless one is interested in high-quality madVR or other such taxing renderer configurations.
Also, software support is not that great right now even if the drivers claim 4K HEVC decode capabilities. We should also wait for that aspect to mature.
sonicmerlin - Saturday, April 11, 2015 - link
Wait a sec, does this mean every single 4k TV sold thus far won't be able to play 4k Blu-rays because they don't have HDCP 2.2 support? How is that legally allowed?
ganeshts - Saturday, April 11, 2015 - link
You are right in your inference. However, note that some of the currently sold 4K TVs do have HDCP 2.2 ports (as you can see in our panel photo linked above).
Unfortunately, this is one of the disadvantages of being an early adopter... and that is exactly the reason why it is difficult to make any purchases in the entertainment PC setup that involves 4K - be it decoding or display.
geekfool - Sunday, April 12, 2015 - link
you should add some official UHD-1 and UHD-1 tags for future reference
geekfool - Sunday, April 12, 2015 - link
derp, UHD-2 of course. And perhaps cover that in more detail by emailing the BBC R&D team and asking for info and their take on the real UHDTV1/2 time to markets etc...
wintermute000 - Sunday, April 12, 2015 - link
so basically not worth it unless you are loaded (i.e. willing to upgrade your HTPC + TV) and want to watch the 1 movie in 10 that comes out on 4k (as well as going through the pain of having to deal with discs again).
OrionTSEP - Monday, April 13, 2015 - link
>10b encodes, despite being supported in the drivers, played back
>with a black screen (indicating either the driver being at fault, or
>LAV Filters needing some updates for Intel GPUs).
LAV Filters do not decode HEVC by themselves. They use ffmpeg for decoding (either in software or hardware decode). In the LAV Filters source code, in the file dxva2dec.cpp in the function CDecDXVA2::InitDecoder, you can find some checks of profile and decoder compatibility. In the case of Main10, hardware decoding falls back to software decoding. In other words, LAV Filters currently do not support hardware Main10 decoding (it is disabled in the code, as far as I can understand, because ffmpeg does not support this yet, but I could be wrong).
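In other words, the decision logic amounts to something like the following sketch (illustrative only; this is not the actual LAV Filters or dxva2dec.cpp code):

```python
# Hypothetical illustration of the Main10 fallback described above.
SUPPORTED_HW_PROFILES = {"HEVC Main"}  # Main10 deliberately not listed

def init_decoder(stream_profile):
    if stream_profile in SUPPORTED_HW_PROFILES:
        return "DXVA2 hardware decode"
    # 10-bit (Main10) streams fail the compatibility check and drop to software
    return "ffmpeg software decode"

print(init_decoder("HEVC Main"))    # hardware path
print(init_decoder("HEVC Main10"))  # software fallback, matching the observed behaviour
```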
Sivar - Monday, April 13, 2015 - link
I was very much hoping for an article like this. Thank you, Ganesh!
We are at an annoying twilight between the end of h.264/1080P and the beginning of h.265/4k.
It looks like there are two choices for a future-resistant HTPC: Patience or a relatively large budget.
burjoes - Friday, May 15, 2015 - link
I wonder why the chart doesn't include the new HDMI 2.0a, since it was discussed in the article.