MTEK - Monday, July 31, 2023 - link
HDMI 2.0 and not 2.1. It's 2023. Why is this still happening?
nandnandnand - Monday, July 31, 2023 - link
It's a previous-gen part, for one. Does it even need the features or bandwidth of HDMI 2.1?
shabby - Monday, July 31, 2023 - link
First, the Beelink GTR6 with the exact same CPU supports HDMI 2.1. Second, it was released just last year.
Third, yes I do need all that bandwidth.
Fourth... Just because.
heffeque - Monday, July 31, 2023 - link
They reviewed a fairly "old" mini-PC. All 7040HS mini-PCs have HDMI 2.1.
Drkrieger01 - Monday, July 31, 2023 - link
The more 'features' a product has, the more it is likely to cost. Just because a chipset supports a feature doesn't mean the product will have it: it may require more electronic components to make the 'feature' functional, which translates to higher product cost overall (or at least the company may claim it does).
Also, with an integrated chipset, what is it that you plan to run at the bandwidth of HDMI 2.1? This unit will likely only play video, not games, at that resolution/refresh rate.
Drkrieger01 - Monday, July 31, 2023 - link
Also, DisplayPort 1.4a is capable of 4K@96Hz and 8K@30Hz, while HDMI 2.0 can only do 4K@60Hz. DisplayPort 1.4a is far superior to HDMI 2.0.
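For context, a back-of-the-envelope sketch in Python of why those modes map to those links. It counts active pixels only at 8-bit 4:4:4 with no DSC; blanking overhead (roughly 8-10%) is ignored, which is why real-world ceilings sit a bit lower (e.g. DP 1.4a topping out near 4K@98Hz at this depth, consistent with the ~96Hz figure above). The link rates are the usual effective figures after line coding.

```python
# Rough uncompressed video data rates vs. effective link bandwidth.
# Active pixels only (blanking ignored), 8-bit 4:4:4 = 24 bpp, no DSC.

LINKS_GBPS = {
    "HDMI 2.0 (18.0 raw, 8b/10b)": 14.4,
    "DP 1.4a HBR3 (32.4 raw, 8b/10b)": 25.92,
    "HDMI 2.1 FRL (48.0 raw, 16b/18b)": 42.67,
}

MODES = [
    ("4K@60", 3840, 2160, 60),
    ("4K@120", 3840, 2160, 120),
    ("8K@30", 7680, 4320, 30),
]

BPP = 24  # bits per pixel at 8-bit RGB / 4:4:4

for name, w, h, hz in MODES:
    rate = w * h * hz * BPP / 1e9  # Gbps, active pixels only
    fits = [link for link, cap in LINKS_GBPS.items() if rate <= cap]
    print(f"{name}: ~{rate:.1f} Gbps -> fits: {', '.join(fits)}")
```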
meacupla - Monday, July 31, 2023 - link
That is an oddly complex daughterboard. That is the fattest and longest ribbon cable I have ever seen in a mini-PC or laptop.
And it seemingly wouldn't have to be that long if the daughterboard weren't attached to the base that opens up like a clamshell.
rUmX - Monday, July 31, 2023 - link
No AV1 video encoding benchmarks? I think SVT-AV1 should be included. H.264/H.265 are now ancient codecs.
AdrianBc - Tuesday, August 1, 2023 - link
None of these older integrated GPUs support AV1 encoding in hardware. AV1 encoding is supported now in the AMD Phoenix Ryzen 7040 series, and it will be supported in the Intel Meteor Lake Core Ultra series, which will be launched by the end of the year.
There already are many models of small computers with the Ryzen 7 7840U, Ryzen 7 7840HS, or Ryzen 9 7940HS, which support real-time AV1 encoding of several video streams in parallel.
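As an illustration, a hedged sketch of what such an AV1 encode test could look like, driven from Python via ffmpeg. The input file name and device path are placeholders; the software path assumes an ffmpeg build with libsvtav1, and the hardware path assumes Linux with ffmpeg 6.0+ and VAAPI AV1 encode exposed by the Phoenix APU's driver.

```python
import subprocess
import time

SRC = "input.mp4"  # placeholder test clip

# Software path: SVT-AV1 (needs ffmpeg built with libsvtav1).
# preset 8 favors speed; crf 35 is a common starting point.
sw_cmd = [
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "libsvtav1", "-preset", "8", "-crf", "35",
    "-an", "svt_av1.mkv",
]

# Hardware path (assumption: Ryzen 7040 "Phoenix" on Linux/Mesa,
# ffmpeg >= 6.0 with the av1_vaapi encoder; device path may differ).
hw_cmd = [
    "ffmpeg", "-y",
    "-vaapi_device", "/dev/dri/renderD128",
    "-i", SRC,
    "-vf", "format=nv12,hwupload",
    "-c:v", "av1_vaapi",
    "-an", "vaapi_av1.mkv",
]

for label, cmd in (("SVT-AV1 (software)", sw_cmd),
                   ("VAAPI AV1 (hardware)", hw_cmd)):
    t0 = time.monotonic()
    subprocess.run(cmd, check=True)
    print(f"{label}: {time.monotonic() - t0:.1f} s")
```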
meacupla - Tuesday, August 1, 2023 - link
Mobile Ryzen 6000 only has AV1 decode.
haplo602 - Wednesday, August 2, 2023 - link
So you review an Asus rebrand but ignore all the upstart mini-PC manufacturers that were on the market months before this one? Interesting choice...
nicolaim - Wednesday, August 2, 2023 - link
Me again. 2023, yet only two USB-C ports. WTF!?
eriri-el - Monday, August 7, 2023 - link
"The activation of the GPU shaders for rendering in the GPU-HQ case results in relatively higher energy numbers for all the systems."Any citation for this? AFAIK GPU-HQ is just a profile to activate higher quality scaling.
Here is my citation:
https://github.com/mpv-player/mpv/blob/35a6b26b780...
If you tell me that the scalers are using shaders (which I'm not really sure whether they are or not), then the non-GPU-HQ preset scalers are shaders too, and therefore GPU shaders are activated regardless of whether you use GPU-HQ or not. Is your benchmark invalid, since it's based on a false premise?
ganeshts - Sunday, August 13, 2023 - link
There is no 'false premise' to talk of here - in fact, the attempt is to provide analysis based on the obtained results.
The software and settings used are detailed in the section. Anybody is free to reproduce the results.
As for the statement 'activation of GPU shaders for the GPU-HQ case' - the default-case playback uses a bilinear algorithm for chroma scaling (as an example), and that should be handled by fixed-function hardware in most GPUs [ https://github.com/mpv-player/mpv/issues/10306#iss... ]. Of course, I have not looked in detail into the mpv code base, but based on the energy numbers for GPU-HQ vs. default (9.69 Wh vs. 8.12 Wh for the GEEKOM AS6) and the GPU D3D usage for the two cases across different files, it looks like some additional work is being done on the GPU, and my educated guess is that the GPU shader work is far greater for GPU-HQ compared to the default case.
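For anyone wanting to check this themselves, a minimal reproduction sketch (the file name is a placeholder; the Wh figures require an external power meter over the full playback, and GPU utilization has to be watched in a separate tool - this only runs and times the two cases, using mpv's built-in gpu-hq profile as it existed in 2023):

```python
import subprocess
import time

VIDEO = "test_clip.mkv"  # placeholder test file

# Play the same clip with mpv defaults and with the gpu-hq profile.
for label, extra in (("default", []), ("gpu-hq", ["--profile=gpu-hq"])):
    t0 = time.monotonic()
    subprocess.run(["mpv", *extra, VIDEO], check=True)
    print(f"{label}: finished in {time.monotonic() - t0:.1f} s")

# For scale: the quoted GEEKOM AS6 numbers, 9.69 Wh (gpu-hq) vs.
# 8.12 Wh (default), amount to roughly a 19% increase in energy.
print(f"energy delta: {(9.69 - 8.12) / 8.12:.1%}")
```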