ZeDestructor - Friday, March 30, 2018
Some notably interesting hardware companies are missing there - Samsung, Qualcomm/Snapdragon, MediaTek, and HiSilicon/Huawei. That's a LOT of missing mobile vendors. Hopefully, between Google, YouTube, Twitch, and Netflix, AV1 can be rammed down hard enough that everyone else just falls in line with big daddy ARM.

Trixanity - Friday, March 30, 2018
I'm not completely sure, but I think Samsung and Qualcomm have ties to HEVC, so I think they'd prefer the money made from that instead of backing AV1. That doesn't mean they won't support it on their chips eventually, but I assume they'll try to cling to the standard that makes them money and that they've invested in.

Samus - Sunday, April 1, 2018
Knowing Samsung/Qualcomm, they will probably NERF the implementation of AV1 in their devices to make HEVC appear superior. There are lots of ways they can do this, from artificially crippling streaming efficiency to only implementing seq_profile_0. Who knows - it wouldn't be the first time a big corp played money games with codec competition.
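For reference, here's a rough sketch of what "only implementing seq_profile_0" would cover - my own summary of the public AV1 spec's profile table, not code from any shipping decoder:

def main_profile_can_decode(bit_depth, subsampling):
    # seq_profile 0 "Main" covers 8/10-bit 4:2:0 (plus monochrome);
    # profile 1 "High" adds 4:4:4; profile 2 "Professional" adds 4:2:2 and 12-bit.
    return bit_depth in (8, 10) and subsampling in ("4:0:0", "4:2:0")

print(main_profile_can_decode(10, "4:2:0"))  # True: typical streaming content
print(main_profile_can_decode(10, "4:4:4"))  # False: needs High/Professional

So a Main-only decoder would still handle typical 10-bit 4:2:0 streams; the bigger damage would come from the efficiency-crippling side.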
SirPerro - Thursday, April 5, 2018

That'll only damage them in the long term.

leo_sk - Sunday, April 8, 2018
Qualcomm owns some patents but hasn't yet declared its royalty rates (as part of the Velos Media group).

iter - Friday, March 30, 2018
It is a non-issue; the codec will be available on every contemporary platform the moment someone rolls out a hardware-accelerated implementation. It is not something that has to be laid down in the silicon.

CSMR - Friday, March 30, 2018
The codec won't get going until there is full hardware support, including full fixed-function decode, in most systems out there. CPU-based decoding is just too inefficient for modern video codecs, and GPU-assisted decoding leaves the CPU with too much work, if HEVC is anything to go by. HEVC takes a significant amount of a modern desktop CPU to software-decode - let alone smartphones and TV hardware. AV1 is more complex by most accounts.

iter - Friday, March 30, 2018
There is no fixed-function decode on most platforms; it is just hardware accelerated. Video codecs lend themselves very well to general-purpose compute, and it would be a complete waste of transistors to make dedicated decoder hardware. Don't let the practice of burying that stuff in the firmware confuse you.

brucethemoose - Friday, March 30, 2018
Wait, what? AFAIK most modern hardware has a dedicated decoding block. Some things (like 4K Netflix/Amazon) won't even play without hardware HEVC decoding, even if you have more than enough CPU grunt.

StevoLincolnite - Saturday, March 31, 2018
Yeah, nah. Most GPUs feature fixed-function decode ASICs built in at the hardware level. AMD's Unified Video Decoder exists for a reason and gets updated constantly with new hardware decode options.
https://en.wikipedia.org/wiki/Unified_Video_Decode...
Now you can feel a little more educated on the topic.
ZolaIII - Saturday, March 31, 2018
Real modern SoCs use DSPs for this, and modern DSPs are also reprogrammable (at least the well-designed ones from CEVA and Tensilica), which means new commands and algorithms can be added. Although DSPs aren't as efficient as fixed single-purpose ASICs, they really aren't that far behind, and when you take into consideration their multi-purpose use and (still limited and not so easy to exploit) programmable nature, they are much better fitted for the job. AMD's Radeon UVD also contains a Tensilica DSP, but a small one, and for audio only. On smartphones, all current-generation SoCs use a DSP for that purpose (Qualcomm's QDSP; HiSilicon, Samsung, and now MTK use the Tensilica P6); on the desktop, Intel is jumping on the same bandwagon with newly announced generations.

I hope now you feel a bit more educated on the topic.
sonicmerlin - Sunday, April 1, 2018
You should probably define DSP at least once before using the abbreviation multiple times throughout your paragraph.

ZolaIII - Sunday, April 1, 2018
To define Digital Signal Processor? I think you got lost coming here.

peevee - Tuesday, April 3, 2018
Your Google is broken?

entity279 - Monday, April 2, 2018
Though we feel far less educated on the syntax of plurals.

ZeDestructor - Monday, April 2, 2018
*ahem* "GM206’s decoder was a major upgrade from GM204’s. Importantly, it added full HEVC fixed function decoding. And not just for standard 8bit Main Profile HEVC, but 10bit Main10 Profile support as well"pls2educate yourself
Source: https://www.anandtech.com/show/10325/the-nvidia-ge... around middle of page
ZeDestructor - Monday, April 2, 2018
wait.. wrong parent

ZolaIII - Tuesday, April 3, 2018
Too late. Btx process terminated.

ZolaIII - Tuesday, April 3, 2018
Lol, how about 10-bit HEVC real-time recording on the S845's QDSP @30 FPS, or the Exynos 9810's Tensilica P6 MP @60 FPS - and double those rates on the 8-bit Main profile. All this while staying within a 1W limit for the DSP and 2~2.5W for the whole SoC. Those also handle a great variety of other video/audio formats (including VP9 8~10 bit) while staying open to any future implementations, whether it's a codec improvement or a new codec. In the meantime they also do audio processing, VC, CNNs... So just try to stack that against a bad, still-partial ASIC fixed-function implementation that leans on CUDA cores in the NV encoder and idles doing absolutely nothing at more than 30W.

And here's a simple explanation of how and why adding fixed-function ASICs isn't exactly the best solution, other than being stiff:
https://www.semiwiki.com/forum/content/6797-cpu-gp...
It's about CNNs, but it reflects pretty much everything else too. ASICs will stay the best solution for fast fixed IO switches and nothing else.
eddman - Monday, April 2, 2018
You really need to read more about this stuff before posting.

ZeDestructor - Monday, April 2, 2018
*ahem*"GM206’s decoder was a major upgrade from GM204’s. Importantly, it added full HEVC fixed function decoding. And not just for standard 8bit Main Profile HEVC, but 10bit Main10 Profile support as well"
Source: https://www.anandtech.com/show/10325/the-nvidia-ge... around middle of page
ZolaIII - Tuesday, April 3, 2018
Ah found a parent ask.

ZeDestructor - Tuesday, April 3, 2018
Yeah, I found the tobacco store :P

ZolaIII - Tuesday, April 3, 2018
A coffee shop with green bites would have been even better. :')

SydneyBlue120d - Saturday, March 31, 2018
You're technically correct; however, I'd like to remind you there is at least one mobile maker supporting the new codec, and its name is Apple.

levizx - Tuesday, February 11, 2020
2 years later: both Apple and Samsung joined AOMedia, and MTK actually launched the very first AV1-enabled mobile SoC.

Elstar - Friday, March 30, 2018
I wonder if "this time will be different". People always complain about MPEG royalties during standards transitions, but once the next standard is embraced, the complaining subsides.

The "problem" is that hardware manufacturers don't want to implement multiple encoders/decoders, so MPEG tends to win due to the investment costs.
iter - Friday, March 30, 2018
The reference implementation is most likely very basic and slow. Even if it is an open standard and free to license, good hardware-accelerated implementations will not necessarily be free or open source.

The one thing that nobody on this planet should doubt is the ability of the industry to ruin even the best of things.
If they had their way, they'd be happy to sell us breathing air. And pretty soon they might do just that.
tuxRoller - Friday, March 30, 2018
This time is different BECAUSE the developers of the codec are working hand-in-hand with the hardware companies. There's been a standing meeting (IIRC, every quarter) where key software codec folks go to Taiwan for full syncs with all stakeholders. As a result, they've even had to drop some of the experiments because the hardware folks said they were too costly to implement. Hopefully we'll see those used in AV2, since they are quite innovative (a good bit is from Daala, which had a radically different design).

Elstar - Friday, March 30, 2018
If you look at the list of AV1 companies, the list of *missing* hardware players is very telling. AV1 clearly doesn't have the support of the professional or semi-professional video hardware community. AV1 might still succeed in spite of this. After all, the video recording device business is dwarfed by smartphones these days. I suspect in the end we'll have two standards on all of our devices instead of one. Hurray for "progress".

tuxRoller - Saturday, March 31, 2018
Other than what others have mentioned, I'm not sure what you're saying. Is your main concern that the org isn't interested in ITU status, and thus OTT adoption is uncertain?
GreenReaper - Sunday, April 1, 2018
They managed to get VP9 into PC hardware - I'm sure they can manage AV1 more generally.

iwod - Friday, March 30, 2018
Maybe I will have to go through Twitter to have this corrected, but AV1.0 has not been released yet. Even if you click on the spec PDF link, it still says it is a draft document. Apparently the PR and marketing team went ahead without even asking the engineering team; they wanted to report this about 10 days before the NAB Show.

tuxRoller - Saturday, March 31, 2018
Hey! I recognize you from d9 :)

Lolimaster - Saturday, March 31, 2018
We are at a time where future video cameras (from smartphones and drones to pro cameras) should have a certain minimal quality certification for day/dark + HDR/10-12 bit.

Consumers win by having actual quality video delivered to their new OLED TVs.
ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Saturday, March 31, 2018
What I really want is NVIDIA support to compress and decompress at whatever quality the video card can handle for user-generated content.

Blu-ray DRM can be recognized for blocking commercial content and yet allow me to record whatever is on my screen.
Recording 4-8K with ZERO compression should be allowed if the hardware can handle it
Max compression for 4K content should be set to whatever the user wants
For example, 100% usable compression in software might actually be 64% actual compression in hardware, and be displayed as 100/64 compression
Setting the max usable compression would prevent me from going too far in a video where it's not initially noticeable and then ruining the remainder of the video where it is noticeable
I'd like to generate my own content without the DRM Nazis ruining my day
I'd like to EASILY record whatever I want at any usable resolution and compression level the hardware can handle
4-8K camcorders should output directly to a video card input (yes, I said INPUT) without any compression whatsoever and let the video card do all of the heavy lifting
Whether recording desktop output or camera input, WE NEED MORE OPTIONS!
Santoval - Sunday, April 1, 2018
"Recording 4-8K with ZERO compression should be allowed if the hardware can handle it"Among your many wild dreams/demands that got my eye the most. Although much or most consumer (and *all* prosumer) video recording hardware is capable of zero compression that will never happen due to a thing called "market differentiation". In plain words that means that the camera manufacturers love the very fat profit margins they get from professional video cameras so they would never risk them by moving zero compression down the food chain.
It looks like an unwritten rule that is somehow obeyed even by non-professional camera companies. Video/movie professionals require uncompressed video because they can manipulate it, color grade it, edit it, insert special effects into it, etc. much more gracefully. In recent years uncompressed video started being introduced to mid-range prosumer DSLRs (which record video but are not video focused), but that is the current lowest limit I am aware of.
ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Sunday, April 1, 2018
Using the calc at:
https://www.extron.com/product/videotools.aspx
4K @ 60fps X 16bit 4:4:4 = 4.45GB/sec
8K @ 60fps X 16bit 4:4:4 = 17.82 GB/sec
Can you show me any consumer video camera outputting raw video at more than 4 gigabytes per second?
ZeDestructor - Tuesday, April 3, 2018
That calculator's math is not correct for video camera raw footage.

1. That calculator assumes 3 subpixels per pixel. Almost every camera on the market, be it a basic potato you'd have to pay people to take all the way up to the $70k RED Monstro, has a Bayer-pattern sensor with 1 subpixel per pixel, unlike monitors. This significantly reduces the amount of data you actually need to record when shooting raw.
2. It uses math that produces the bandwidth needed to drive a monitor at 4K using worse-than-CVT timings, and that adds a ton (around 20-24%) of overhead that you don't have in a camera raw. In a camera raw, you have the raw frame, some metadata about exposure, lenses, shutter speed, location, etc., and timecodes. Besides, anything past 2560x1600 uses CVT-R2 timings, which have much lower overhead (less than 5% for 4K and up).
Here are the actual numbers for raw 4K/8K capture, frames only, so add some overhead for metadata (think <10mbit/s, since you don't need blanking intervals here):
3840 * 2160 * 60fps * 16bit * 1 subpixel @ RGB/4:4:4 chroma: 7.96Gbit/s = ~1GB/s
7680 * 4320 * 60fps * 16bit * 1 subpixel @ RGB/4:4:4 chroma: 31.85Gbit/s = ~4GB/s
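(If you want to check those figures yourself, here's a quick sketch under the same assumptions - one 16-bit sample per photosite, no blanking intervals:)

def raw_bayer_gbps(width, height, fps, bit_depth):
    # Bayer sensor: one sample per photosite, so no x3 for subpixels,
    # and raw capture carries no blanking overhead
    return width * height * fps * bit_depth / 1e9

for label, w, h in (("4K", 3840, 2160), ("8K", 7680, 4320)):
    gbps = raw_bayer_gbps(w, h, 60, 16)
    print(f"{label}: {gbps:.2f} Gbit/s = {gbps / 8:.2f} GB/s")
# 4K: 7.96 Gbit/s = 1.00 GB/s
# 8K: 31.85 Gbit/s = 3.98 GB/s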
This all fits just fine using a few SDI cables (remember, those only go up to 12Gbit/s and that's what they use in cinema, TV and live broadcast) to get raw footage off the camera and into some external capture device.
You can also do that over HDMI or DP just fine (after de-bayering). In fact, this is done right now: the higher-end DSLRs and video cameras all have HDMI output, so you can just feed the live signal into a regular ordinary HDMI capture card and cap it raw that way, but you'll only see that on higher-end kit, because it's expensive to have fast IO in a space as small and oddly-shaped as a camera.
bill44 - Saturday, March 31, 2018
Dynamic Metadata support? HFR supported?

SeleniumGlow - Sunday, April 1, 2018
The excessive use of the "Speaking of..." phrase to start paragraphs nearly made me think that this was written by Linus G. Sebastian. However, I was wondering if there was anything new with audio compression schemes, or is AV1 just about video compression?

ZeDestructor - Monday, April 2, 2018
Opus happened not that long ago for audio, and it will be a long while before we see anything better (if ever). The reality is that despite ISPs' best efforts, overall internet speed is going up every year, making audio a very small item to optimize for. I mean, what's ~128kbit/s stereo (that's where I've seen audio typically max out) compared to a 4+mbit/s video stream (4mbit/s is around where 1080p streams sit)?
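To put a number on that (a back-of-the-envelope sketch; 128 and 4000 kbit/s are just the typical figures cited above):

audio_kbps = 128    # typical stereo ceiling for streamed audio
video_kbps = 4000   # roughly where 1080p streams sit
share = audio_kbps / (audio_kbps + video_kbps)
print(f"audio is ~{share:.1%} of the total stream")  # ~3.1%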
errorr - Monday, April 2, 2018

The major thing about AV1 is that it seems to be really focused on the needs of OTT 4K video. Google and others have stated that it is less efficient below HD resolutions, probably not much better at HD, and somewhere around 30% better at higher resolutions. The real problems left to solve are much more on the encode end than the decode end. Netflix is asking for encoding that is only somewhere between 5x-10x more computationally intensive before they will commit to adopting it; right now, they state current encoders are around 200x the computational complexity. The next year, as the encoders get optimized, will determine whether AV1 succeeds.

errorr - Monday, April 2, 2018
I also think it is an open question whether consumer devices will ever get powerful enough to have even HW-accelerated encoding for consumer-produced video. The AV1 codec is focused on large distributors like Google, Netflix, and recently Facebook, and their needs. The massive computational needs of AV1 are perfect for these big distribution platforms, as CPU cycles are cheap for them and they are focused on reducing their bandwidth costs.

mode_13h - Tuesday, April 3, 2018
What I don't understand is how they can be so confident some well-resourced patent troll won't come out of the woodwork and submarine the standard just when it's really gaining momentum.

It seems to me what we really need is some legal framework around standards that says if you don't state your claim before a standard is ratified by some official body like ISO, then you forfeit your rights to said IP within that context.
tuxRoller - Wednesday, April 4, 2018
There's no guarantee for any of these pools (as MPEG LA discovered), but AOM has a massive number of patents under its belt, it was very carefully designed to avoid known patents, and they've set up a legal defense fund.

http://www.streamingmedia.com/Articles/News/Online...