
  • markifi - Wednesday, January 7, 2015 - link

    This has to reach market along with Google finally leaving triple frame buffering behind. 60Hz is fine, but 120Hz with no buffering would be nice as an option.
  • tuxfool - Wednesday, January 7, 2015 - link

    It also appears to be a nice way to burn through power....
  • Samus - Thursday, January 8, 2015 - link

    If they'd stop trying to cram 400+ PPI into these things and prioritize genuinely important features like framerate and display quality, the whole battery power thing would be a wash. I'm sure a 200-250 PPI 1080p display at 120Hz saps the same amount of power as a 400+ PPI 1080p display at 60Hz.
  • rkhighlight - Thursday, January 8, 2015 - link

    I don't know if you know what you're talking about. Two 1080p panels with different pixel densities just means that the one with lower PPI is bigger. A GPU doesn't care about PPI. 1080p means 1080p, and it's always the same computational complexity.

    250 PPI + 1080p translates to an 8.8-inch panel.
    400 PPI + 1080p translates to a 5.5-inch panel.

    There's no other way (16:9 ratio presumed). Comparing the power draw of such different screen sizes doesn't make sense, and the 8.8" display certainly uses more power with the same technology, no matter the refresh rate.
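
    To spell out that arithmetic, here's a quick Java sketch (assuming an exactly 16:9 1080p panel): the diagonal in pixels divided by the PPI gives the diagonal in inches.

        // Sketch: panel diagonal from resolution and pixel density.
        public class PanelSize {
            static double diagonalInches(int widthPx, int heightPx, double ppi) {
                // Diagonal in pixels divided by pixels-per-inch gives inches.
                return Math.hypot(widthPx, heightPx) / ppi;
            }

            public static void main(String[] args) {
                System.out.printf("250 PPI: %.1f in%n", diagonalInches(1920, 1080, 250)); // ~8.8 in
                System.out.printf("400 PPI: %.1f in%n", diagonalInches(1920, 1080, 400)); // ~5.5 in
            }
        }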
  • nikaldro - Thursday, January 8, 2015 - link

    He's not totally wrong.
    A higher PPI count means that a stronger backlight is needed to achieve the same brightness.
  • blanarahul - Friday, January 16, 2015 - link

    I think I would like a variable refresh rate much better than a higher refresh rate, especially considering OLED panels have no lower limit on refresh rate. It would eliminate the need for Android to try to maintain 60 fps at all times and give the system some breathing room, which, in theory, should improve both smoothness and overall power consumption.
  • thetuna - Thursday, January 8, 2015 - link

    I think what Samus was getting at is that as PPI increases, the percentage of space on the display taken up by light-blocking transistors grows, necessitating a more powerful backlight to get the same brightness.
    That said, I agree with you that the larger screen would most likely draw more power.
  • soccerballtux - Thursday, January 8, 2015 - link

    If Google would commit to requiring a solid 60Hz for your app to be approved, we wouldn't need this.
  • Brandon Chester - Thursday, January 8, 2015 - link

    They need to commit to having all of their own apps run at 60fps before they can expect developers to.
  • Alexvrb - Thursday, January 8, 2015 - link

    Even if that does happen, it will only be on shiny new hardware. Then, when manufacturers feel that users need to be pushed towards newer hardware again, Google will update things and all of the new apps won't run smoothly on your "old" hardware.
  • Ortanon - Saturday, January 10, 2015 - link

    I can't tell you how often I've tried to tell people this, about not only software but televisions too.
  • jjj - Wednesday, January 7, 2015 - link

    This is a feature their soon-to-arrive MT6795 supports - that chip being 8x A53 at 2.2GHz, dual-channel memory, and a PowerVR G6200 @ 700MHz, although official specs are hard to find. Anyway, on the CPU side, given the clocks and how their 1.7GHz A53 does, it would perform pretty well, even more so with dual-channel. The GPU, not so sure it can push 120FPS in actual games... nobody tests actual games.
  • blanarahul - Friday, January 16, 2015 - link

    I think you meant MT6792.
  • ddriver - Wednesday, January 7, 2015 - link

    It will be a blast for people living in another temporal plane and experiencing the world in slow motion.
  • ishould - Wednesday, January 7, 2015 - link

    I'd love to see this come to future smartphones. This, and 1ms response touchscreens, so we can really draw on the screen as if it were a physical object. Microsoft had a demo of this for their Surface table.
  • ishould - Wednesday, January 7, 2015 - link

    Here's a YouTube video as an example:
    https://www.youtube.com/watch?v=vOvQCPLkPt4
  • ddriver - Wednesday, January 7, 2015 - link

    1ms is complete overkill; 10ms is enough not to impede a human in any way. The problem is that touch interaction is very low in the priority queue for device manufacturers. Sure, if you set yourself to it, you can make a device dedicated to that and have it nearly latency-free, but an actual user product has to do a ton of other things, so you can't afford to waste precious CPU cycles on user interaction.

    Another problem is the way vendors implement their device drivers, putting stuff like touch and sound last, at the very top of the stack and very far from the actual hardware, in order to make room for other features they deem more worthy of CPU power. And it is not a configuration you can switch - say, fast touch for this app, fast audio for that app - it is all rather fixed, and it takes years of development cycles to move anything. But hey, that's what you get from people who put money-making first and versatility and productivity last.
  • saratoga4 - Wednesday, January 7, 2015 - link

    With most devices having many mostly-idle cores, I doubt CPU load matters very much to latency. If it did, latency would have dropped continuously as more cores and higher clock speeds became the norm. Instead, it's barely improved even as huge increases in CPU power rolled out.

    The bigger issue is probably readout of the screen itself, plus the combined display+GPU latency, neither of which is easy to improve or helped very much by faster SoCs. These are actually pretty hard to improve, as capacitive sensing is very noisy and GPUs are by design high-latency devices.
  • ddriver - Wednesday, January 7, 2015 - link

    As I said, it is the current implementations, both in software and in the actual hardware; both are a "last priority" for hardware vendors. Typically, there is nothing preventing touch from having the same latency as a mouse, which is acceptably low, even though that includes the video output latency as well.

    Noise might be an issue only if filtering is handled in software, which would explain the terrible latency; really, it is something that needs to be handled in hardware. It would not add more than a few cents to the BOM, and the panel could be plugged in as a responsive generic user input device, like a mouse or a keyboard. It doesn't happen because hardware vendors can still get away with their devices being as bad as they are when it comes to user interaction. It is entirely possible to fix this, but hardware vendors are lazy and greedy, and since it is something that would only benefit the consumer, nobody is in a rush to do it.
  • ABR - Thursday, January 8, 2015 - link

    Is it the hardware or the software? Every time I try out an Android device in a store, to this day, the lag drives me crazy. But iOS and Windows Phone (despite supposedly always running on "worse" hardware) don't have this problem. My intuition is that the problem is that Android is built on Linux, which is developed "server first", whereas both WP and iOS (despite also building on Unix) are built UI first. (And on the other side, OS X performs lousily as a server.)
  • ddriver - Thursday, January 8, 2015 - link

    I haven't tested Windows devices, but iOS is a little better than the most recent Android releases, though still far from ideal. The problem is not with Linux but with the software stack; Linux can actually be more responsive than Windows on the desktop, especially if you build a real-time kernel.
  • name99 - Thursday, January 8, 2015 - link

    The issue has nothing to do with how great the Linux (or Windows, or Darwin) kernels are; it happens far higher up, in the frameworks.

    MS, in particular, has designed from scratch all the relevant Metro APIs to be concurrent --- they all take and return futures. This does a lot to prevent UI interactions and animations from being blocked by anything else that occurs on the primary app thread.

    Apple are not as slick. They engineered Core Animation to run on a separate thread from the start, and they've retrofitted a bunch of the most important frameworks to take blocks as parameters, submitted to a GCD queue. But they don't have the futures technology of MS, so they can't as easily chain these async requests together or aggregate them. This will probably come soon with the next release of Swift (and maybe be retrofitted to Objective-C). The technology for futures is in LLVM; but even so, Apple still needs a new set of frameworks designed from scratch with this sort of Actor model in mind.

    Google seem to be stuck in the 90s, with a 90s model of concurrency and a 90s model of animation. They (unlike Apple and MS) don't fully control their language. Java has had futures for a long time, but I don't know how powerful they are, how nice their syntax is, or how well they are integrated into the libraries. Certainly the Android frameworks (even today) as far as I know are not designed based on any sort of Actor model.

    Basically interactive UI remains the same as always --- the more crud happens on your UI thread, the less responsive your device will be, so the key is ensuring that as much as possible doesn't happen on the UI thread. Step one is to make it POSSIBLE to move stuff off the UI thread. But that's only the start. Step 2 (about where Apple is) is to make it EASY to move stuff off the UI thread. Step 3 (where MS mostly is) is to make it EASIER to do stuff off the UI thread than to do it synchronously.
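
    For what it's worth, here is a minimal Java sketch of that off-the-UI-thread pattern, using CompletableFuture (Java's futures, available since Java 8). The UI_THREAD executor is a hypothetical stand-in for a real toolkit's main-thread dispatcher:

        import java.util.concurrent.CompletableFuture;
        import java.util.concurrent.Executor;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        public class AsyncUiSketch {
            // Hypothetical stand-in: a real toolkit would post this to its UI thread.
            static final Executor UI_THREAD = Runnable::run;

            static String loadData() { // slow work that must never block the UI thread
                return "result";
            }

            public static void main(String[] args) {
                ExecutorService background = Executors.newSingleThreadExecutor();
                CompletableFuture
                    .supplyAsync(AsyncUiSketch::loadData, background)  // step 1: run off the UI thread
                    .thenApply(String::toUpperCase)                    // step 2: chain further async work
                    .thenAcceptAsync(System.out::println, UI_THREAD)   // step 3: hop back for the UI update
                    .join();       // demo only; a real UI app would never block like this
                background.shutdown();
            }
        }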
  • mkozakewich - Thursday, January 8, 2015 - link

    I've never understood the whole 1ms thing. A 200Hz panel would take 5 ms to display anything, so you've got a pretty big window where you're a single frame behind. Maybe this would be less of a problem on a faster screen where you wouldn't be waiting up to 16.6 ms to see where you are.
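
    Those numbers are just the frame period, 1000 divided by the refresh rate; a throwaway Java snippet to confirm:

        // Frame period in milliseconds for a few refresh rates.
        public class FramePeriod {
            public static void main(String[] args) {
                for (int hz : new int[] {60, 120, 200}) {
                    System.out.printf("%3d Hz -> %4.1f ms per frame%n", hz, 1000.0 / hz);
                }
                // 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 200 Hz -> 5.0 ms
            }
        }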
  • watzupken - Thursday, January 8, 2015 - link

    Not sure if this is required on a cell phone, to be honest. I'd rather have a decently specced phone where the battery can actually last, rather than needing to plug into the mains or a battery pack almost daily. The race for higher resolutions and bigger displays is starting to become irrational from my point of view. So I wonder if this increase in refresh rate will actually bring about a useful improvement on a cell phone.
  • Xinn3r - Thursday, January 8, 2015 - link

    "although it’s possible that displays with poor refresh rates wouldn’t see nearly as much benefit."

    I don't understand this sentence, wasn't it comparing it to 120Hz? Isn't it obvious that poorer refresher rates (under 120Hz) would be worse?

    Or is the display-capable 120Hz and refresh rate a different thing? If that's it, than it's really confusing explaining it using a sentence like that.
  • mkozakewich - Thursday, January 8, 2015 - link

    I wondered, too. I came to the conclusion that they were saying the new chips enable the use of 120Hz screens.
    Even then, it would be weird to wonder whether screens under 120Hz would perform worse. Yes, they would.
  • mkozakewich - Thursday, January 8, 2015 - link

    Oh, unless screens need a response time of 8.3ms or less to sustain 120Hz (which would be a hardware thing), and lower-quality displays at 10ms or more would show fewer and fewer frames per second?
  • ianmills - Monday, January 12, 2015 - link

    I believe most display panels are capable of 120Hz these days. The issue is with the chip/board that drives the display, which is usually not capable of 120Hz.

    For example, it is possible to buy a board for desktop LCDs that you can install inside the LCD to make it 120Hz.
  • stickmansam - Thursday, January 8, 2015 - link

    Personally, I would prefer a 1600x900 display at about 5 inches, which still gives a very good DPI while not being wasteful, imo.
  • zodiacfml - Saturday, January 10, 2015 - link

    They are now producing displays?
  • Vegator - Sunday, January 11, 2015 - link

    MediaTek doesn't produce displays, but it does design the chips that drive them, such as smartphone SoCs and touchscreen controllers. It merged not too long ago with MStar, which was a large player in the TV SoC and display controller space; that helps explain the display expertise MediaTek is now demonstrating.
