29 Comments

  • GC2:CS - Monday, December 2, 2019 - link

    That is just enough for an Apple Watch...
  • michael2k - Monday, December 2, 2019 - link

    That's exactly what I thought! Or possibly a more expensive iPhone SE, if they can get it scaled up to a 4" screen
  • p1esk - Monday, December 2, 2019 - link

    But not enough PPI for any VR headset... Also, isn't 3000 nits insanely bright? Why is this thing so bright?
  • DanNeely - Monday, December 2, 2019 - link

    High-end HDR support. IIRC the current specs define up to 10k nits (e.g. for the sun or other very small bright reflections) as a future tech target. It's probably also a demonstration of how far they can push the tech's performance. This should have OLED blacks, but also be able to get brighter than the best consumer LCD displays.
  • tuxRoller - Monday, December 2, 2019 - link

    The sun is closer to 150k.
  • Valantar - Tuesday, December 3, 2019 - link

    Nobody is saying it should try to emulate the sun's actual brightness, just that extreme brightness levels like 3000 nits are useful for displaying very bright things _like_ the sun.
  • futrtrubl - Tuesday, December 3, 2019 - link

    Plus nobody would actually want it that bright. Think of the liability for retinal damage.
  • tuxRoller - Tuesday, December 3, 2019 - link

    I was directly replying to this:
    "IIRC the current specs define up to 10k (ex for the sun or other very small bright reflections)"

    While there may be some use for a spec that covers the entire human visual luminance range (scotopic + mesopic + photopic), that's not what I was advocating (https://hdguru.com/calibration-expert-is-10000-nit...).
    To be clear: 10k isn't equal to peak solar luminance, nor the maximum that a human eye can (typically) tolerate without damage.
  • ZolaIII - Monday, December 2, 2019 - link

    It's an early prototype to show how far the tech can go. Commercially it won't be that bright or that dense, and it won't remain solely gallium nitride based; we will probably see a switch to indium gallium zinc oxide (IGZO) LEDs, along with further miniaturisation.
  • CharonPDX - Monday, December 2, 2019 - link

    Probably for use in a projection setting.
  • wilsonkf - Monday, December 2, 2019 - link

    Think about outdoor displays.
  • Kamus - Monday, December 2, 2019 - link

    3000 nits is actually very low for its intended application.

    Let's say you had 10,000 nits. That would barely be enough for the full range of brightness HDR10 can offer (see the PQ sketch at the end of this comment).

    But it gets much worse... These will be used with waveguides for AR and VR, which means the brightness drops off dramatically, and you would be lucky to even get proper SDR brightness after the image is magnified to fill your FoV.

    I've seen prototypes of those at trade shows that get much brighter (they claim millions of nits).

    TL;DR: nothing remarkable about 3,000 nits at these form factors.
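
    For reference, here is a minimal Python sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10 is built on. The curve is absolute and tops out at 10,000 nits, which is the "full range" referred to above; the constants come from the ST 2084 spec, and the code is purely illustrative rather than any particular display pipeline:

        # SMPTE ST 2084 (PQ) EOTF constants
        M1 = 2610 / 16384
        M2 = 2523 / 4096 * 128
        C1 = 3424 / 4096
        C2 = 2413 / 4096 * 32
        C3 = 2392 / 4096 * 32

        def pq_eotf(code: float) -> float:
            """Absolute luminance in nits for a normalized PQ code value in [0, 1]."""
            e = code ** (1 / M2)
            return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

        print(pq_eotf(1.0))   # 10000.0 -> peak of the HDR10/PQ range
        print(pq_eotf(0.0))   # 0.0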
  • SimpleDisplay - Monday, December 2, 2019 - link

    This display is still an order of magnitude too big for AR (two orders in area). At 45 PPD, this would have an FoV of ~6-7 degrees. With a square aspect ratio, the display would need to be about 10 inches square to cover a 50x50 degree FoV, which is far too big for head-mounted displays.
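
    A rough back-of-envelope check of that sizing argument, as a Python sketch. The 265 ppi figure comes from this thread; the ~300 pixels per side is an assumption inferred from the "~6-7 degrees at 45 PPD" claim rather than a confirmed spec:

        # Hypothetical numbers: 265 ppi from the thread, ~300 px per side assumed.
        PPI = 265                 # pixels per inch of the panel
        PPD = 45                  # pixels per degree, a common "retinal" AR target
        PIXELS_PER_SIDE = 300     # assumed pixel count per side

        fov_deg = PIXELS_PER_SIDE / PPD        # ~6.7 degrees of FoV at 45 PPD
        target_fov_deg = 50                    # desired FoV per axis
        pixels_needed = target_fov_deg * PPD   # 2250 px per side
        panel_side_in = pixels_needed / PPI    # ~8.5 inches per side at 265 ppi

        print(f"FoV at 45 PPD: {fov_deg:.1f} deg")
        print(f"Panel side needed for 50x50 deg: {panel_side_in:.1f} in")

    At roughly 8-10 inches per side, the "too big for head-mounted displays" conclusion holds.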
  • Kamus - Tuesday, December 3, 2019 - link

    You're right, I didn't even notice the size, and assumed this was a micro-display. It's not, this is big enough for smart watches.
  • p1esk - Tuesday, December 3, 2019 - link

    Wow, you're right! I just watched https://www.youtube.com/watch?v=52ogQS6QKxc - 1M nits and 10k PPI! This is crazy. For some reason I thought that MicroLED was hindered by the inability to cram the individual LEDs densely enough for common size/resolution ratios. So why the hell do we still not have 8K MicroLED TVs and monitors???
  • p1esk - Tuesday, December 3, 2019 - link

    Also, why did this Japan Display chip with such inferior specs make the news?
  • edzieba - Tuesday, December 3, 2019 - link

    Because:

    1) Those really tiny microdisplays are fabbed on silicon in the same way other ICs are, rather than on a large substrate like regular display panels, so they hit the reticle size limit like every other IC.

    2) They cost an arm, a leg, and both kidneys. Fabbing them is far more complex than an IC of the same die area, and so they cost a heck of a lot more per unit area.
  • p1esk - Tuesday, December 3, 2019 - link

    Ok, 1 is a good point, but on 2, why would it cost more than a regular IC? These dies have very regular structures, probably even simpler than flash memory. I don't see why you couldn't fab a
  • p1esk - Tuesday, December 3, 2019 - link

    *I don't see why you couldn't fab a million of them at a fraction of a cent per die, and stitch them together to have a 4k display.
  • p1esk - Tuesday, December 3, 2019 - link

    Keep in mind that fabbing an IC is only expensive for advanced process nodes, like 14 nm. For TV or monitor PPI we are talking about fabbing at huge feature sizes, microns instead of nanometers. That has got to be much, much cheaper to make, especially in mass quantities.
  • ksec - Tuesday, December 3, 2019 - link

    Well, even the Apple Watch has 326 PPI, so 265 isn't good enough (yet).
  • Kamus - Monday, December 2, 2019 - link

    "Micro LED technology is a promising candidate for higher-end displays and television that will be available three to four years down the road."

    3 to 4 years? Think 10, if we're lucky. We might see it earlier at these sizes though.

    In 3-5 years, these might end up being on AR/VR headsets.

    But as far as televisions go, just stop living in the future and get an OLED. It's as good as it's going to get for a decade or so.
  • Retycint - Monday, December 2, 2019 - link

    I've started to see noticeable burn-in on my OLED Samsung S8+ after 2.5 years of use. So I'm not very convinced of the longevity of OLED TVs, especially since OLEDs have only achieved mass-market recognition in the past 2-3 years and there hasn't been enough time to say whether burn-in will be an issue or not.
  • nathanddrews - Tuesday, December 3, 2019 - link

    Categorically, burn-in is inevitable on OLED, just as the LED backlight of an LCD can burn out. The only question that matters is: will it happen before you throw it away and replace it? It can be mitigated and slowed, but not eliminated. Not trying to hate, just being realistic. I refuse to buy a phone WITHOUT an OLED screen; I love how it looks. To date, my Galaxy S5 is the only OLED device in my life that suffered burn-in before I was ready to upgrade to a newer phone (it manifested after 18 months). My S3 went almost 4 years without any signs of permanent burn-in.

    When talking about MicroLED timing, it's important to remember that there's a big difference in manufacturing between MicroLED in small applications like this (iWatch 5/6) and MicroLED in large applications (Sony Crystal LED/Samsung Wall). We're going to see MicroLED in watches and stadiums before we see them in our living rooms.
  • Kamus - Tuesday, December 3, 2019 - link

    That's very interesting to me. I used a Galaxy Note 4 for 5 years and never had any burn-in whatsoever.

    I just switched to a Xiaomi Mi 9 after deciding my Note 4 was due for an update (I still loved the screen, though), and I'm even less afraid of burn-in now that everything is dark mode.

    Back when I used the Note 4, dark modes were just a pipe dream for most content.
  • ianmills - Tuesday, December 3, 2019 - link

    The PPI is a bit low; I can see the dots on the 282 PPI Mi Band 4 from 30 cm away.
  • bansheexyz - Tuesday, December 3, 2019 - link

    If the costs of OLED come down dramatically to match LCD prices, then the uneven pixel wear issue would be tolerable, as you could simply buy replacements more frequently. It's only a problem due to current prices, so they need to start printing these panels like paper the way they say they can.
  • Vitor - Tuesday, December 3, 2019 - link

    The pixel density is actually suitable for any TV: a 4K 22-inch panel has a pixel density of about 200 PPI. So yeah, this tech could easily be better than LCD everywhere, without the burn-in issue of OLED.
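
    A quick sanity check of that ~200 PPI figure, as a Python sketch (standard 4K UHD resolution assumed):

        import math

        h_px, v_px = 3840, 2160           # 4K UHD resolution
        diag_in = 22                      # screen diagonal in inches

        diag_px = math.hypot(h_px, v_px)  # diagonal length in pixels
        ppi = diag_px / diag_in           # ~200 ppi

        print(f"{ppi:.0f} ppi")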
  • Lolimaster - Tuesday, December 3, 2019 - link

    Seems MicroLED is way too immature; remember the promise of MicroLED/QLED TVs from Samsung in 2021...
