faiakes - Monday, July 2, 2018 - link
DisplayPort 1.2? Can it handle 4K at 80Hz?
Why not DisplayPort 1.4?
DigitalFreak - Monday, July 2, 2018 - link
Maximum refresh rate for 3840 × 2160 @ 8-bit is 69Hz with DP 1.2.
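Rough back-of-the-envelope math, if anyone wants to sanity-check where a ceiling like that comes from. The exact figure depends on which blanking timings you assume, which is why quoted numbers vary; this sketch only shows the no-blanking upper bound:

```python
# DP 1.2 HBR2: 4 lanes x 5.4 Gbps, 8b/10b encoding leaves 80% as payload.
payload_bps = 4 * 5.4e9 * 0.8                    # ~17.28 Gbps usable

width, height, bits_per_pixel = 3840, 2160, 24   # 4K RGB @ 8 bpc, no chroma subsampling
active_bits_per_frame = width * height * bits_per_pixel

# Upper bound if the link carried only active pixels (no blanking at all).
print(payload_bps / active_bits_per_frame)       # ~86.8 Hz

# A real link also transmits horizontal/vertical blanking, which eats a
# further slice of that budget and drags the practical ceiling down into
# the high-60s/low-70s range that DP 1.2 4K monitors actually advertise.
```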
nathanddrews - Tuesday, July 3, 2018 - link
Probably 80Hz at 4:2:2 or 4:2:0, which is needed for HDR.
kaesden - Wednesday, July 11, 2018 - link
Because this way they can sell you a new monitor next year with DP 1.4.
Papaspud - Monday, July 2, 2018 - link
I don't think I really care much about 4k GAMING monitors until there are video cards that can actually take advantage of all those pixels. Maybe in 3-4 years the price/performance of video cards will be there, but it sure isn't right now. Just my 2 cents.
JoeyJoJo123 - Monday, July 2, 2018 - link
Modern video cards can already play select games at 4k 60hz. If you had a brain between your ears you'd understand there's a tradeoff for every single graphical option you enable: each one carries a framerate penalty (min, avg, max), and some of these options are computationally far more intensive than the change they make to the game's visuals. Oftentimes (and I mean like ~80% of the time) these options don't make any sense to enable. Stuff like volumetric lighting, lens flares, adaptive exposure, etc.
I literally play Warframe on a 24" 4k 60hz IPS monitor (Acer K242HQKbmjdp) with an avg framerate of ~82 fps, and the only options that matter are model quality, texture quality, and anisotropic filtering. You don't even need any kind of AA at 24" 4k. My specs are an i5-4690k @ 4.5GHz, a GTX 970 @ 1.55 GHz, and 16GB of DDR3-2400 memory, so it's nothing extravagant by any means; I'm running hardware that's basically 3-4 years old right now.
I get really tired of people who don't know trash about optimizing game settings according to what their PC is reliably capable of and assume that if a game can't be played at 100% MAXIMUM settings then it's the hardware manufacturer's fault. Nope. It's not AMD or Nvidia's fault here, they can't fix stupid users who don't understand how to fine-tune graphical settings to get to an ideal playable framerate and video quality.
Alistair - Monday, July 2, 2018 - link
Yep, it's pretty simple. Set everything to Ultra, then use Afterburner to watch your fps in real time and lower every setting by two levels, one at a time, to Medium to see which setting has the greatest effect. Boom, you have 4k at up to 90hz in most games with a 1080 Ti.
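In pseudo-Python the loop looks something like this; set_all(), set_option() and measure_avg_fps() are made-up stand-ins for flipping options in the game menu and reading the framerate off Afterburner/RTSS by hand:

```python
OPTIONS = ["shadows", "volumetric_lighting", "ambient_occlusion",
           "depth_of_field", "anti_aliasing", "texture_quality"]

def rank_settings_by_cost(set_all, set_option, measure_avg_fps):
    set_all("ultra")                      # start with everything maxed out
    baseline = measure_avg_fps()
    gains = {}
    for opt in OPTIONS:
        set_option(opt, "medium")         # drop just this one option two notches
        gains[opt] = measure_avg_fps() - baseline
        set_option(opt, "ultra")          # restore it before testing the next one
    # Largest fps gain first: these are the options worth leaving turned down.
    return sorted(gains.items(), key=lambda kv: kv[1], reverse=True)
```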
LordSojar - Monday, July 2, 2018 - link
Ultra settings or bust. Accept nothing less. nVidia, release the new cards! BOOHISSSS
close - Tuesday, July 3, 2018 - link
So I buy a 1000E GPU, a 1000E monitor, and then I have to.... LOWER SETTINGS??! Unacceptable :).
close - Tuesday, July 3, 2018 - link
To be clear, I'm only half kidding. :) I mean, when the hardware is this expensive and is touted to run this and that game at 4K, I fully expect it to do so under decent conditions. Otherwise I'm pretty sure you can do 24FPS at ultra low settings on a lot of hardware, but that's not quite the point.
Diji1 - Friday, July 6, 2018 - link
The thing about Ultra settings is that a lot of them do almost nothing for image quality compared to High (or lower, in some cases), yet they take significantly more processing power than High.
crimsonson - Monday, July 2, 2018 - link
Why so angry? It's a preference thing. Some people want volumetric lighting, lens flare, DoF, adaptive exposure, etc.
If developers offer those options, it's your prerogative to enable or disable them. And if they demo'd them at E3, then it's reasonable for the buying public to expect them in the final product.
I'm not sure why you quickly attacked the previous poster. That is an immature way to discuss a simple topic.
Alistair - Monday, July 2, 2018 - link
I agree he could have worded it more politely. But he's not wrong: if you use the wrong settings, there will never be a card good enough for you. You can enable all the things you mentioned; still, there's a big difference in performance hit between DoF at Medium and at Ultra.
Crysis at its absolute max settings is still hard today, with video cards ten times as fast as what was available then. "Max settings" really doesn't mean much; it depends on what those settings are.
xype - Tuesday, July 3, 2018 - link
Even so, if someone’s acceptable graphics settings are more taxing than yours, calling them an idiot is just 1) a sign of really low self-esteem, and 2) itself a bigly idiotic thing to say.
Alistair - Friday, July 6, 2018 - link
Using SSAA at 4x is idiotic :)
xype - Tuesday, July 10, 2018 - link
You obviously never experienced _proper_ SSAA at 4x thingamajig on a GFX-enabled GPU. :P
Ranger1065 - Tuesday, July 3, 2018 - link
Child rage temper tantrums. Back to Wccftech with you.
JoeyJoJo123 - Tuesday, July 3, 2018 - link
Let me guess: it's more mature to persistently meme that 4K 60hz gaming is inherently an unattainable goal that can't happen now or any time in the near future, but it's not mature to dispel that myth by pointing out that 4k 60hz is possible and HAS been possible for a while.
And even if you want to be moronic and enable computationally intensive effects that don't provide a large benefit to the overall picture quality, even I can play games like TF2 completely maxed out at ~66 FPS avg on my system specs on a 4K 60hz monitor, despite how wasteful enabling 8x AA on a 24" 4k monitor is to the framerate.
xype - Tuesday, July 3, 2018 - link
> If you had a brain between your ears
"Current graphics cards can’t play games well at 4K."
"Idiot, they do when you turn the settings way low!"
Uhm. That’s like him saying a car isn’t fast enough to drive on a highway at reasonable speeds and you going "Idiot, you just have to drive it at 30mph!"
> I get really tired of people who don't know trash about optimizing game settings according to what their PC is reliably capable of and assume that if a game can't be played at 100% MAXIMUM settings then it's the hardware manufacturer's fault. Nope. It's not AMD or Nvidia's fault here, they can't fix stupid users who don't understand how to fine-tune graphical settings to get to an ideal playable framerate and video quality.
Seriously? You don’t see how you’re shifting goal posts here just to call someone "stupid"? If a game can’t be played at "100% MAXIMUM™ settings" it’s usually the GPU that’s not fast enough. Just because you turn off all effects to be able to play that doesn’t mean that other people find that an acceptable compromise.
You call others stupid, but you talk like a 14 year old gamer who got hit on his head by his mom a couple of times more than he should have been.
JoeyJoJo123 - Tuesday, July 3, 2018 - link
Like clockwork:
>That’s like him saying a car
Making a really bad car to computer analogy. Check.
>You don’t see how you’re shifting goal posts here just to call someone "stupid"
Making a stupid blanket statement that games can't be played at 4k 60hz (read: this is blatantly false), then projecting their stupidity onto others. Check.
>You call others stupid, but you talk like a 14 year old gamer
Wants to stand on their high horse for not belittling others in online conversations, then immediately does exactly that in their comment anyway.
See, I'm not above or below these comments. You're stupid and a hypocrite. PC software has always pushed the envelope for existing PC hardware. This isn't new; this has been the situation for decades. If Nvidia and AMD released a GPU 4x as powerful today, then a year from now we'd be seeing game publishers making software that's 4x or 5x as computationally intensive as what we have already. You can't ever blame the hardware manufacturers; they're literally at the bleeding edge of existing process nodes and of the amount of memory that can feasibly and affordably be integrated into GPUs.
xype - Tuesday, July 3, 2018 - link
He can’t play the games he wants at the settings he wants on 4K, and you’re telling him he’s "wrong" and stupid?
You’re a retard.
JoeyJoJo123 - Wednesday, July 4, 2018 - link
>He can’t play the games he wants at the settings he wants on 4K
Here's the thing, idiot: there are three parties in this situation.
1) The graphics card manufacturer.
2) The software developer.
3) You, the one playing games.
For the graphics card manufacturers: they work closely with fab companies like TSMC on products for future process nodes before those nodes are even public. They're inherently limited by what TSMC and current technology can manufacture. They're inherently limited by the pricing and availability of the memory they can interface with the GPU. They're inherently limited by thermodynamics and by how much power their chip can consume at the process node they're stuck on, and pushing the envelope too far can burn them. AMD's Fury GPUs weren't particularly well received, since their forward-thinking HBM got too expensive over time and power consumption was high compared to competing products, with the silicon pushed outside its ideal operating efficiency. You can “push the silicon” harder, but it doesn’t always make a better product.
Yes, let’s blame the damn GPU manufacturers for not getting their shit together. Don’t they know GPUs operate on magic? They’re the reason why “4k 60hz gaming™” isn’t possible.
For software developers: they’re under time and budget constraints to make a product and ship it out the door to meet investor and consumer demands. Whether it’s the 27th Call of Duty or a completely original game, a publishing company sank money into developing it and is looking to make it back through game sales or support down the line from expansions/DLC/microtransactions, or whatever cost model(s) they employ. Early design decisions inherently limit what future patches can rectify in terms of optimization. It’s not always sensible to go back and optimize a game so well that it runs smoothly at 4k 60hz if all that work won’t result in sales that justify the extra development time; the game was already produced and sold, and in most cases the bulk of its lifetime sales have already passed. 4k monitor users are in the utter minority anyway.
Ultimately, some blame falls on the software developer; they should’ve made better design decisions up front to make more optimal use of computer resources so the game is playable at 4k 60hz from the beginning. But development costs don’t always make this feasible, so the best they can really do is carry those lessons learned into future titles.
And then there’s you, the “entitled gamer™” that likes to point the finger and blame other people, when it’s a matter of PEBCAK. You aren’t a hardware manufacturer, no amount of whining makes process node shrinks happen faster. You aren’t a software developer, no amount of whining makes games look prettier and operate in a more optimized manner. You only have control of two things here:
1) Buy better/faster hardware. This costs $$$.
2) Tune your game settings according to your system’s specs to get the best balance between visuals and performance.
You don’t get to whine on Anandtech article comments without being called out for being stupid for not realizing these facts. If you’re too poor to afford a better GPU, hit up the spam bot in the comments section with the totally legitimate pyramid scheme they have going on to afford a 1080Ti. If you already have a 1080Ti + 8700k + exotic cooling and overclocks, then you’ll have to make compromises on settings, because the only thing you have in YOUR CONTROL are the amount of money you spend on entertainment and the performance you can eke out of your system when enjoying your games.
And again, I don’t give a shit if I come across as rude. Facts are facts. Deal with it.
xype - Tuesday, July 10, 2018 - link
Your "attacking the poor GPU manufacturers" argument is a straw man. So you’re still a retard, because you’re completely missing the point. Go eat some boogers or play with your poop a bit to calm down.close - Wednesday, July 4, 2018 - link
@JoeyJoJo123, JoeyBoy, your comments don't make sense. People don't care how YOU like to play games and what you're willing to sacrifice. They want to play them like THEY like it.
Pointing out that they can drop some settings is helpful. Insulting them for not wanting what you want is... not smart.
edzieba - Wednesday, July 4, 2018 - link
"Uhm. That’s like him saying a car isn’t fast enough to drive on a highway at reasonable speeds and you going "Idiot, you just have to drive it at 30mph!""To use the car analogy, it would be like loading every car up to it's maximum rated operating load, then complaining it doesn't hit the maximum possible acceleration.
If you want to run at 4k, maybe just ease back a bit from the visually-indistinguishable-but-I-need-everything-at-11 "Ultra" settings to the more reasonable 'High' (or even 'Medium' if it doesn't hurt your 1337 reputation too badly) and performance will be perfectly fine.
Game developers will always scale target performance to what GPUs can provide. As cards get faster, newly released games will get more graphically complex in lockstep.
Bizwacky - Monday, July 9, 2018 - link
>Game developers will always scale target performance to what GPUs can provide. As cards get faster, newly released games will get more graphically complex in lockstep.
Is that really true? Modern high-end graphics cards can easily push basically any game at 1080p ultra at 150+ FPS, and 95%+ of gamers are at that resolution or below. I don't think developers really optimize for the highest-end hardware, outside of certain edge cases.
Obviously new games get more and more graphically complex, but I'd bet that has more to do with the average gamer's hardware getting faster. I.e., what are laptops from the last couple of years capable of? Or, for many games, what are consoles capable of? As TVs and consoles move to 4k, I think we'll see more and more games optimized to run at 4k, with not too much thought put into what an 1180Ti can run.
xype - Tuesday, July 10, 2018 - link
"If you want to run at 4k, maybe just easy back a bit from the visually-indistinguishable-but-I-need-everything-at-11"It’s not "run at 4K", it’s "run at 4K with the settings I like". Both perfectly fine as a personal preference, the former being achievable today, the latter less so.
"I’d like ham and eggs." — "We have no ham, you should get the eggs only." — "Well I won’t have the eggs then." — "You complete imbecile, it’s both with eggs, is it not?!"
cocochanel - Tuesday, July 3, 2018 - link
You're absolutely right. But equally guilty are the game developers who throw around so many graphics options that one needs a CS degree to understand them.
Performance can be affected by so many things. For example, is it the CPU doing the physics or the GPU?
Throwing the physics or AI at the GPU may overload it.
Again, game developers keep this info to themselves.
So, how is the average user supposed to "optimize" their game?
GreenReaper - Tuesday, July 3, 2018 - link
Honestly, that's the developer's job. A good development team will usually spend time figuring out the ideal settings, but many don't and punt to the end-user.
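A toy sketch of the kind of auto-detect I mean; the thresholds and the VRAM-only check are made up for illustration (a real engine would also look at the GPU model, CPU, and target resolution):

```python
def pick_default_preset(vram_gb: float, target_height: int) -> str:
    """Toy auto-detect: bucket the machine into a preset from VRAM alone."""
    # Scale the budget down as the render resolution grows past 1080p.
    budget = vram_gb / (target_height / 1080)
    if budget >= 8:
        return "ultra"
    if budget >= 6:
        return "high"
    if budget >= 4:
        return "medium"
    return "low"

# e.g. an 8 GB card targeting 1440p lands on "high"
print(pick_default_preset(vram_gb=8, target_height=1440))
```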
Holliday75 - Tuesday, July 3, 2018 - link
I click on the option that says "Low", "Medium" or "High". Usually works.
Tams80 - Tuesday, July 3, 2018 - link
Most of those are very subjective. Some people really do like volumetric lighting, etc.
The only real exception is AA (which does tax a system a lot) for small 4k displays. It's not really needed.
Check your attitude, mind. It's rather rude.
JoeyJoJo123 - Tuesday, July 3, 2018 - link
Don't really care if I come across as rude. If you make stupid comments I'll call you out on stupid comments. What do you think is going to happen when you make a blanket statement as patently false as "you can't play games at 4k 60hz because it's the fault of video card manufacturers"?
xype - Tuesday, July 3, 2018 - link
So you’re a rude retard. Congrats!
JoeyJoJo123 - Thursday, July 5, 2018 - link
Nice projection! Enjoy being wrong and not addressing any of the points I've made before. You like to cop out of the argument by making bad comparisons to cars and attacking the tone (rather than the argument presented), all the while not providing any facts about how 4K 60hz is impossible (according to you).
xype - Tuesday, July 10, 2018 - link
Apparently it’s impossible for the OP at the settings he wants. You’ve written a shit ton of arguments to prove how smart you are, but you’re completely missing the point. Poop-head.
Dug - Monday, July 2, 2018 - link
Would it kill ya to make a 32"? That's a much more reasonable size for desktops.
Alistair - Monday, July 2, 2018 - link
I really want this monitor too, but the 43" size kills it. I already feel my 32" 4k is gigantic.
xype - Tuesday, July 3, 2018 - link
"The remote is going to be quite handy because the 43-incher can be used like a TV when connected to various media streamers or players (or even a TV tuner!). Now, a disadvantage of this size is that the 436M6VBPAB does not come with a stand that can regulate its height or tilt. To partly solve this, it does have VESA mounts."That’s from the previous AnandTech article.
So it’s 43", has a non-adjustable stand (like TVs do), a 4K HDR panel, and a remote control. I’m not exactly sure what makes this a "gaming monitor" instead of a "gaming TV", to be honest.
¯\_(ツ)_/¯
milkod2001 - Tuesday, July 3, 2018 - link
This
norazi - Monday, July 2, 2018 - link
In 2018, I would expect 100Hz at minimum for any monitor with the word "gaming" in it.
Alistair - Monday, July 2, 2018 - link
I just really wanted 90hz, for YouTube and TV watching compatibility. Anything less than 90hz means YouTube will stutter if you don't set it to 60hz.
Stuka87 - Monday, July 2, 2018 - link
Not much point in a 100Hz 4k display at this time. Even the best Titan won't run many games at a sustained 100fps without knocking settings down.
Midwayman - Tuesday, July 3, 2018 - link
Plenty of games can run at high refresh rates. Considering how long monitors last, it's pretty shortsighted not to consider high refresh when the next round of cards is probably going to drive 4k ultra over 60hz for most games.
Samus - Monday, July 2, 2018 - link
Ditto. I am waiting for the day I can finally retire my 144Hz TN Philips LCD for something that refreshes as fast while actually being usable outside of gaming. The TN panel is just awful for video playback.
Lolimaster - Monday, July 2, 2018 - link
Why can't they release this thing in 27-28" 1440p 120Hz, FULL GLOSSY, qdot, with good color accuracy?
JUST DO IT.
milkod2001 - Tuesday, July 3, 2018 - link
2012 called and asked for your specs. 32'' 4k 100Hz native should not be a problem in 2018, but apparently it is.
Xajel - Tuesday, July 3, 2018 - link
This could be great for the Xbox One X, or for a home office TV (connected to the PC as well).
edzieba - Wednesday, July 4, 2018 - link
No FALD, no sale. Edge-lit 'zones' are barely any better than the old 'dynamic contrast' scam.
zodiacfml - Thursday, July 5, 2018 - link
High-end specs except for the refresh rate. I don't consider this much better than my $400 LG TV.
imaheadcase - Thursday, July 5, 2018 - link
That stand looks shady as hell. I can't tell if it's plastic or what, but for the size it looks like it would easily tip over.
Joffer - Thursday, July 5, 2018 - link
Only an 8-bit panel? Not 10? DisplayHDR 1000 requires 10-bit image processing: https://displayhdr.org/
tommo1982 - Monday, July 9, 2018 - link
One of the websites that sells it lists 23Hz as the minimum refresh rate.
Heavenly71 - Monday, July 9, 2018 - link
Has anyone tried to charge a notebook via USB-C with this monitor? I plan to connect a Dell XPS 13 and would expect, via one cable, to charge (slow charging is fine), transmit DisplayPort (4K) and audio, and get USB connectivity for some USB devices connected to the monitor.
Technically this should all be possible via one USB-C cable.
Elleequip - Tuesday, November 16, 2021 - link
That's great, thank you for sharing: https://yanrefitness.de/quality/