19 Comments
LordSojar - Thursday, April 11, 2019 - link
In 10 years, our walls will be smart home and media center displays. I welcome our robot overlords in 30-40 years; I'll be getting advanced in years by then, so if they kill me it won't be nearly as big of a deal.
Notmyusualid - Sunday, April 14, 2019 - link
Actually Sojar - you'll be the first dead!
The AI will recognise your aura of 'dissent', based on where your Google Glasses pointed, your browsing history (now mandated by law as undeletable), Kindle books read, songs downloaded, and needless to say your YouTube history. Amazon also reported to the cops that you bought a signal-jammer and a money counter... and the AI will naturally go after the intelligentsia who might 'awaken' the young.
I pray for you pal. I suppose I'll be there right alongside you.
Not many of us left.
:)
shabby - Thursday, April 11, 2019 - link
20 ppi... i'm blown away!
quiksilvr - Thursday, April 11, 2019 - link
Considering that most movie theaters are about 5-10 ppi, this is actually pretty good.
Kevin G - Thursday, April 11, 2019 - link
This isn't really that impressive given that it is LED tile. Such aggregate resolutions have been available for years. The way LED tiles work is you essentially build them up like LEGO bricks, and scaling up the backend would permit any arbitrary resolution.
What would be impressive is if that were seen by devices as one logical display. Currently, due to the seamless modular nature of LED, this display is likely a 4x3 configuration of 4K logical displays.
Duncan Macdonald - Thursday, April 11, 2019 - link
Because of bandwidth limitations, a screen greater than 4K has problems with refresh rates above 60 Hz. DisplayPort has a maximum uncompressed HDR resolution of 4K at 60 Hz. If the screen is to have a reasonable refresh rate, then multiple cables will be needed from the driving PC (and the PC will probably need multiple graphics cards). (I assume that it will be driven from a PC, as I am not aware of any non-PC solution that could drive this large a screen.)
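For a rough sense of scale, here is a back-of-the-envelope sketch of that bandwidth point (the assumptions are mine, not the commenter's: 10 bits per channel HDR, roughly 25% blanking overhead, and about 25.92 Gbit/s of usable DisplayPort 1.4 HBR3 payload):

```python
def uncompressed_gbps(width, height, hz, bits_per_pixel=30, blanking=1.25):
    """Approximate link rate for an uncompressed video signal, in Gbit/s."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

DP14_HBR3_PAYLOAD = 25.92  # Gbit/s usable on a DisplayPort 1.4 link

modes = {
    "UHD 4K @ 60 Hz, 10-bit HDR": (3840, 2160, 60),
    "UHD 4K @ 120 Hz, 10-bit HDR": (3840, 2160, 120),
    "~16000 x 4320 wall @ 60 Hz, 10-bit HDR": (16000, 4320, 60),
}
for name, (w, h, hz) in modes.items():
    need = uncompressed_gbps(w, h, hz)
    print(f"{name}: ~{need:.1f} Gbit/s (~{need / DP14_HBR3_PAYLOAD:.1f} DP 1.4 links)")
```

On those numbers a single DP 1.4 link carries 4K60 HDR (~19 Gbit/s), but the full wall needs on the order of six links, which is exactly the multi-cable, multi-GPU scenario described above.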
Dragonstongue - Thursday, April 11, 2019 - link
So you basically mean that, contrary to Nvidia's usual chest pounding, multi-GPU still has a use. Consumers (gamers) got used to calling it SLI or Crossfire.
Almost like Nvidia knows full well that top-end folks very likely still need dual GPUs (even if only in edge cases), yet sets a harsh limit instead of just being happy it is making massive $$$$ as a result of "allowing" multi-GPU, and/or letting different brands pair up with each other, to properly drive these mega resolutions at reasonable levels and settings.
I am surprised that at least some top-end TVs do not have their own dedicated controller, so driving that 4K+ "TV" becomes no harder than running a light 3D task instead of massive wasted "work" (look at the built-in benchmark in the newest Final Fantasy, running crud where it does not need to just to ensure everything is "loaded" properly, or some such excuse). If a "dedicated" processor like G-Sync or whatever could actually handle some of the "heavy lifting", that would be great.
Instead we have games wasting massive potential performance, or being hamstrung by crud-slinging companies (Intel, Nvidia, Apple, to name some) saying you need something 4-5x more powerful than you actually need, just so they can turn around not long after and trash its performance via "drivers", cut deals with game publishers, etc., and "rinse and repeat".
At least if the TV or whatever handled a chunk of the load, it could mean that you do not in fact need the "branded" kool-aid at $$$$$$$$ to enjoy the newest stuff; just update the much, much cheaper "booster" on the TV itself. Almost like a Switch on its docking station that "opens it way up" performance-wise, or like the PS4 and its VR "helmet" that combines a few extra chips to "boost" what it normally was not capable of.
I do not see why they do not do such things, ESPECIALLY for the workstation/photo/gaming crowd. I know TVs have had chips in them for a long time, and routers/modems as well, though it really has not been all that long since they started using a cruddy dedicated processor for memory, clock speeds and the like. This would take that an extra step and make it more than a fancy calculator CPU, an actual functional "booster", if that is making sense ^.^
take care o7
Kevin G - Thursday, April 11, 2019 - link
That is the trick. It appears to end users as one display. To a source, in this Sony example, it is likely eight UHD signals. Oddly, I would expect that there would be 32 LED drivers, as the common platforms are still based around a 1080p design (newer 4K units are out there but far rarer). A multiview processor or some video switchers can take those eight UHD source signals and splice them into the 32 lower resolution 1080p signals.
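As a minimal sketch of that splicing step (assuming a plain quadrant split and NumPy-style frames; real multiview processors let you map regions arbitrarily):

```python
import numpy as np

UHD_W, UHD_H = 3840, 2160
HD_W, HD_H = 1920, 1080

def quadrants(frame):
    """Split one UHD frame (H x W x 3) into four 1080p tiles, row-major."""
    return [frame[y:y + HD_H, x:x + HD_W]
            for y in range(0, UHD_H, HD_H)
            for x in range(0, UHD_W, HD_W)]

uhd_frame = np.zeros((UHD_H, UHD_W, 3), dtype=np.uint8)  # stand-in frame
tiles = quadrants(uhd_frame)
print(len(tiles), tiles[0].shape)  # 4 tiles of (1080, 1920, 3)
# Eight UHD sources -> 8 x 4 = 32 such 1080p feeds, one per LED driver.
```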
As for refresh rates, LEDs are actually kind of insane. 480 Hz and 960 Hz are rather common for the individual tiles, with some going far, far higher. And these rates can actually be reached, but again, you are generally limited by the normally 1080p-capable LED driver. You'd have to limit the resolution coming out of the LED driver to 640 x 360 (you'd get 540 Hz max), which is basically one cabinet of tiles, and use an insanely large multiview processor/switcher to handle the splicing.
As for a means of driving it from a single cable, I can actually think of one non-standard means: 100 Gbit Ethernet using a video over IP protocol like SDVoE, a switch with a 100 Gbit uplink, and a video over IP capable LED driver (which do exist). The SDVoE side of things would require the 100 Gbit Ethernet card to in reality be an FPGA, as it would need to handle some of the SDVoE protocol. Source video cards would need to be capable of passing multiple frame buffers directly to the FPGA over the PCIe bus in the host system. From there the switch takes the multiple frame buffer streams and sends them to the necessary number of IP based LED controllers. This is actually how the future of big LED walls is going to progress, as it simplifies so much on the cabling end, and a switch with a 100 Gbit uplink that can link to multiple IP based LED drivers is actually less expensive than the multiview processors/video switchers. Beyond this, the next logical step is to connect directly via IP to the cabinets for the tiles and remove the more traditional driver boxes entirely. Engineering is already happening to this end on the manufacturer side.
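A quick, heavily rounded sanity check of that single-cable idea (the pixel count and 8-bit 4:4:4 depth are my assumptions; video over IP carries only active pixels, so no blanking overhead is included):

```python
def raw_gbps(w, h, hz, bpp=24):
    """Uncompressed active-pixel payload in Gbit/s."""
    return w * h * hz * bpp / 1e9

wall = raw_gbps(16000, 4320, 60)          # ~99.5 Gbit/s for the whole wall
per_uhd_feed = raw_gbps(3840, 2160, 60)   # ~11.9 Gbit/s per UHD source

print(f"whole wall: ~{wall:.1f} Gbit/s, per UHD feed: ~{per_uhd_feed:.1f} Gbit/s")
# Eight UHD feeds (~96 Gbit/s) just about saturate one 100 Gbit uplink,
# which is why SDVoE-class light compression or extra uplinks give headroom.
```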
nathanddrews - Thursday, April 11, 2019 - link
On the contrary - these MicroLED displays (Sony CLEDIS, Samsung Wall) are very impressive.
Like any given display, it's up to the controller to set the parameters for display resolution/refresh. One controller can drive an entire group of panels as one display. It doesn't matter if it's made up of 20 panels or just one; the result is seamless if done properly. The last presentation I saw stated that production of smaller sub-panels (finer pitch) is on track for mass adoption by 2022, meaning we'll probably forget all about LCD and OLED by 2024. Higher resolutions, larger displays, way higher nits, perfect black, no burn-in (allegedly), lower power, better viewing angles. Exciting stuff!
Kevin G - Thursday, April 11, 2019 - link
I've actually built a 4320 x 2160 wall (two 4K side by side). For end users, it is just two 4K displays being presented via a switcher system. The backend switcher system took each 4K input and split it into four 1080p signals that then fed into a proprietary interface box. From there, each of the eight interface boxes drove four cabinets of panels, each holding eight actual LED tiles. It really is like building with Lego. Pixel pitch was 1.6 mm on these panels, so this Sony demo actually wasn't that much bigger than the unit I installed even though it is of lower resolution.
The wall I built was based off of older NovaStar equipment, which is why the LED interface boxes were limited to 1080p. Newer NovaStar and alternative LED drivers now accept 4K inputs. Barco and Christie also make LED driver interfaces that accept a 4K video signal directly from an IP connection (SDVoE).
The Sony demo referenced here, which uses either a 1.2 mm or 1.25 mm pixel pitch, actually isn't that impressive either. I've personally seen 0.9 mm units, and 0.7 mm based LED panels are in production now. In about the same area, that'd be more like a 22,400 x 6,300 resolution using the finer pitched products available today. It just comes down to cost and how much effort is put into the backend to drive that many pixels.
Andyju2008 - Wednesday, July 17, 2019 - link
Very nice LED display screens!!!! We produce and export LED display screens in Shenzhen, China. There are indoor and outdoor LED video walls for your choosing. Welcome to consult me. My WhatsApp: +86 13316857758, Andy Zhu
s.yu - Friday, April 12, 2019 - link
Samsung's MicroLED cinema screen has already been installed in Shanghai. It's rumored that a typical movie ticket costs about 4 times the usual price in major Chinese cities.
Valantar - Friday, April 12, 2019 - link
Calling it "16k" when it's not 16:9 or a DCI aspect ratio (as all other xk resolutions are) is a bit misleading. With this aspect ratio, it's definitely not as many pixels as a 16:9, Flat or Scope 16k display would be.Kevin G - Friday, April 12, 2019 - link
Doing the math, it is a 32:9 aspect ratio, or two 16:9 displays side by side, based on the dimensions.
The precise resolution wasn't disclosed, but using the physical dimensions and the 16,000 horizontal pixel estimate, the pixel pitch would be 1.2 to 1.25 mm. Vertical resolution should be around 4320.
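Roughly reproducing that estimate (the ~63 ft x ~17 ft footprint is an assumption based on the reported physical size; Sony did not publish exact pixel counts):

```python
FT_TO_MM = 304.8
width_mm, height_mm = 63 * FT_TO_MM, 17 * FT_TO_MM   # ~19200 x ~5180 mm

horizontal_pixels = 16000                      # the "16K" estimate
pitch_mm = width_mm / horizontal_pixels        # ~1.2 mm per pixel
vertical_pixels = round(height_mm / pitch_mm)  # ~4300, i.e. roughly 4320

print(f"pitch ~{pitch_mm:.2f} mm, vertical ~{vertical_pixels} px")
# ~16000 x ~4320 is in the ballpark of two UHD-class panels side by side.
```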
s.yu - Sunday, April 14, 2019 - link
Are we talking RGB panels or something like Pentile?
Kevin G - Monday, April 15, 2019 - link
Generally RGB stripe or triple dot depending on the pixel pitch and manufacturer.
D. Lister - Saturday, April 13, 2019 - link
Considering the ridiculous resolution, this could at best be used for still image slideshows, or tiled 4K/8K video streams from multiple sources.
I would be interested if they could bring those bezel-less panels to the consumer market as 21" 1080p@120Hz modules that you can connect to a central hub with DP/HDMI inputs to hook up to your GPU. That way, you could upgrade your screen by just buying a couple of modules and adding them to your setup.
AndrewHaris - Tuesday, May 21, 2019 - link
Very helpful article. If you are looking for an LED screen manufacturer and supplier (http://umh.ae/led-screen/) in Dubai then contact us.
Red_Ninja4752 - Sunday, December 27, 2020 - link
783 inch display? Dang.