51 Comments
nathanddrews - Tuesday, January 3, 2017
Nice to see that AMD cares about PC gaming even if Microsoft doesn't.
xype - Tuesday, January 3, 2017
Wait, what? Microsoft doesn’t, either? And I thought only Apple didn’t…
MajGenRelativity - Tuesday, January 3, 2017
I personally use Windows pretty much all the time. But if they don't properly support wide color gamuts, that's a pretty big issue.
ddriver - Tuesday, January 3, 2017
Well, at least windoze 10 adoption, oops, I mean enforcement, is doing well!
Murloc - Tuesday, January 3, 2017
AFAIK Apple has proper color support. This is not a gaming-only issue.
nathanddrews - Tuesday, January 3, 2017
Haven't you heard of UWP? It is proof positive that Microsoft doesn't take PC gaming seriously.
BrokenCrayons - Tuesday, January 3, 2017
The XBox is proof that Microsoft doesn't take PC gaming seriously. On the other hand, playing a video game is something you should be doing for amusement. Taking your fun seriously is a good way to raise the old blood pressure for all the wrong reasons. The only people that ought to take it seriously are the ones that have a financial stake in it (EA, Valve, etc.) and the rest of us should just learn to relax and play something. :)
nathanddrews - Tuesday, January 3, 2017
I'd love to relax and play something, but first I need to log into my Hotmail/Bing/Live/Microsoft account to recover my password so that I can verify my credentials on my computer so that I can log into my Origin account to get access to my DLC then download and restart my computer before logging into my Nvidia account to download game ready drivers to enable ShadowPlay even though it doesn't work well with this UWP fullscreen mode.
BrokenCrayons - Tuesday, January 3, 2017
Over the holidays, I finally got a clue. If I want amusement I just poke the power button on my Nintendo DSi XL and play a game.
nathanddrews - Tuesday, January 3, 2017
No doubt, my Wii U got a lot of playtime.
Anotheraccount - Tuesday, January 3, 2017
I laughed so freaking hard I couldn't breathe. I had to do you a solid and create an account here to tell you: Well said, sir.
Alexvrb - Tuesday, January 3, 2017
Would you like a tutor on computer usage?
bcronce - Tuesday, January 3, 2017
I don't just play "something", I play what I find enjoyable. It's difficult to enjoy most games if they look like crap.
You may say "but current games don't look like crap". Look at old games from 15 years ago. Do you think they look "good"? No, of course not. Yet at the time they looked great. That's because the bra keeps getting raised, that's the only reason we get better.
BrokenCrayons - Tuesday, January 3, 2017
Dearie, I wish the bra kept getting raised as time passed, but that's the opposite of reality. :P
For me, graphics are one of the least important aspects of a game. I prefer a good story, and the medium through which it's told isn't critical to me. I also like freedom to do what I want and a flexible game world. There's one major publisher that lives up to those things (sorta) and that'd be Bethesda with their open world RPGs. However, I've stopped trying to keep up with them after Skyrim. I'm not saying Fallout 4 is a bad game at all, but I haven't played it because I'm tired of upgrading constantly to keep up. The PC, for me, started out as more of a tool for writing and a way to communicate with the rest of the world than as a gaming platform. Gaming is fun, but in recent years, I've gotten tired of the frustration that comes along with it, so I'm slowly creeping away from using a PC as a gaming system. I found I'm just as entertained on the DS I hijacked from my kid as I am on a computer, but I don't have to bother with the underlying hardware. It's a refreshing change.
Anyway, off on a tangent... Microsoft cares about gaming in a roundabout way, but PC gaming doesn't drive sales in a direct manner for them. The XBox does. Its sales are much more easily measured because the numbers can be counted directly. Unlike a multi-function device like the PC, the XBO and prior generations can be counted and accounted for in harder terms.
LordanSS - Tuesday, January 3, 2017
To be honest, a Sandy Bridge i5 has no problems whatsoever playing games today. The "boost" people used to get from upgrading processors to play games on PC has decreased tremendously.
All you'd need to do is, yes, upgrade your video card every few years (no need to upgrade every generation; you can skip one or maybe even two depending on what you got at first).
One would say that'd be more expensive, but that's debatable. Multiplatform games are, most of the time, cheaper on PC. As long as you don't pre-order and instead wait a few months, you can save up. I have a list of games I'd like to play still, and I get them over time when they go on sale (I got several during the end-of-year sales on GOG and Steam, and Humble Bundle often has good deals too).
Don't give up! The DS/3DS is a very nice piece of entertainment kit, and has many cool games, not denying that. But you won't be able to play Fallout 4 there, heh.
BrokenCrayons - Wednesday, January 4, 2017
Agreed that most gaming systems are perfectly adequate using a Sandy Bridge i5 and a modern GPU. Upgrades are simple enough and relatively inexpensive, but there are other annoyances as well that make PC gaming a pretty painful experience. There's driver updates, there's day one DLC, and there's data mining from Steam, the OS, via the browser, and through pretty much any service I'd use on the web that all discourage me from wanting to depend much on a computer running Windows. Linux, running on one of my laptops as its only OS and on the other alongside a dual boot with Win 7, is okay for everything I want to do on a computer with the exception of having a lot of flexibility with games... it's not terrible, but there are a ton of native Windows games that I lack the compute power to run inside of WINE or that just won't work well without an annoying amount of tinkering.
The larger problem in my case is that I really dislike using a desktop computer, where it's easier and cheaper to keep up with hardware updates. My most recent desktop suffered a SATA controller failure just about a month ago and that was the last straw for me. Having to RMA a motherboard was all the excuse I needed to drag that big ugly dinosaur out to my car and drive it over to the electronics recycler. I fell back to my two older laptops that have been working fine for years but are by no means gaming rigs. Moving gaming to handheld consoles (the DSi in particular) is really just one step in a long process of shifting entertainment to less annoying platforms. I'm not entirely quitting PC gaming, as both of my laptops are capable of running Fallout 3 well enough, but there's only a small handful of modern titles I'm interested in (Fallout 4, GTA 5, From the Depths) and a tiny number of games just isn't worth the trouble of owning and maintaining a gaming PC. Instead I can grab a mountain of used DS carts for very little cash and carry my fun pretty much anywhere I go without worrying about all those gotchas that I'd have to deal with on a PC. And, when I get around to it, I'll snag a 2nd-hand 2DS or 3DS that will still run the old DS junk and get me relatively caught up. It just seems like it's a no-brainer to go this route and relegate my computers to word processing and surfing the Internet.
lobz - Wednesday, January 4, 2017
the bra? Boy, I sure hope it won't keep getting raised
renz496 - Wednesday, January 4, 2017
To me it's not really about being serious or not, but about profit. They know there's money to be made on the PC platform (just look at last year's PC gaming revenue vs console). And the main OS for gaming PCs is Windows. They already lock that down, so now the question is how to take advantage of that and keep all the game revenue mostly for themselves. So the answer is UWP. It's not about how to make PC gamers happy; it's about how they can lock PC gaming into their Xbox-like ecosystem and directly benefit from it.
SydneyBlue120d - Tuesday, January 3, 2017
I'd like to know if there will ever be any TV with FreeSync 1 or 2 support :)
Shadowmaster625 - Tuesday, January 3, 2017
I still can't find a TV that offers true 120Hz input from a PC.
nathanddrews - Tuesday, January 3, 2017
Most current Sony 4K TVs will accept 4K60 4:4:4 and 1080p120 4:4:4 native input over HDMI. Several Vizio TVs will do 1080p120 as well. Then a handful of HiSense and other Chinese brands sometimes will, but it can be a bit of a crapshoot with firmware revisions and so forth.
Flunk - Tuesday, January 3, 2017
You won't until there is a new higher-bandwidth version of HDMI, because all new high-end TVs are 4K and HDMI 2.0 only has the bandwidth for 4K at 60Hz.
Alexvrb - Tuesday, January 3, 2017
He didn't state 4K resolution as a requirement. A 4K TV can accept 1080p input. I'm pretty sure there are some TVs that can accept 1080p @ 120Hz via HDMI 2.0. In fact I find it annoying that it's not industry-standard.
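For anyone who wants to check the numbers, here's a rough back-of-the-envelope sketch in Python (using nominal CEA-style total timings with blanking included, so treat the exact figures as approximate):

    # Rough HDMI 2.0 bandwidth check. HDMI 2.0 is 18.0 Gbit/s raw TMDS,
    # which after 8b/10b encoding leaves ~14.4 Gbit/s for video payload.
    HDMI20_PAYLOAD_GBPS = 14.4

    def video_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
        """Bandwidth a mode needs, counting blanking intervals too."""
        return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

    # 4K60 8-bit 4:4:4 (total timing 4400x2250) just barely fits:
    print(video_gbps(4400, 2250, 60))   # ~14.3 Gbit/s -> fits
    # 4K120 would need roughly double, hence no 4K120 over HDMI 2.0:
    print(video_gbps(4400, 2250, 120))  # ~28.5 Gbit/s -> does not fit
    # 1080p120 8-bit 4:4:4 (total timing 2200x1125) has loads of headroom:
    print(video_gbps(2200, 1125, 120))  # ~7.1 Gbit/s -> fits easily

So 1080p120 is nowhere near the HDMI 2.0 ceiling; whether a given TV accepts it is purely a firmware/EDID decision.
bill44 - Tuesday, January 3, 2017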
It would make more sense to hammer MS to fix Windows and get proper WCG, HDR, HFR and HiDPI scaling sorted. In the long run, this is the only universal solution.
It's not just games that need it: UHD BD playback on PC, HDR Netflix and so on. Also, why make it vendor-specific?
doggface - Tuesday, January 3, 2017
I vaguely remember hearing rumors that the Creators Update would address some of the color space and HDR issues. I think it was after they showed off their Surface desktop, which had built-in software switching from sRGB to DCI. When some journos asked about it, MS mentioned better support may be incoming.
voodoobunny - Tuesday, January 3, 2017
This might be how AMD is going to accomplish that, similar to the way that they created Mantle to prompt Microsoft into building a proper low-overhead graphics library for Windows (DX12).
Alexvrb - Tuesday, January 3, 2017
I had this same thought. Just waiting around won't get anything accomplished. If you really want to badger them into it, do it yourself and start pushing it to developers. If they're successful maybe it ends up being supported natively in a major Win10 update down the road.
jardows2 - Tuesday, January 3, 2017
I recall that in the "old days" of the early 2000s, ATI cards were considered superior in image quality, while NVIDIA cards would give faster FPS. Looks like RTG/AMD is trying to push back into that situation.
SirDerpsalot - Monday, February 6, 2017
Radeons have always offered better IQ (there have been articles on this subject in the last year even).
They also get better with age (unlike cards from Nvidia, which get worse, nerfed to make newer cards look better), and just recently the drivers have improved to the point where they are much better than Nvidia's (drivers have been great for a long time, but now they're rock solid; Nvidia's have gotten worse and worse).
Nozuka - Tuesday, January 3, 2017
A monitor that only reaches its full potential with either an Nvidia or an AMD card, and not both (or others), is still a terrible situation for consumers.
Spunjji - Tuesday, January 3, 2017
Agreed. Unfortunately Nvidia are unlikely to alter their stance here.
lobz - Wednesday, January 4, 2017
Still certainly not AMD's mistake.
TristanSDX - Tuesday, January 3, 2017
Probably a useless idea. MS can quickly add support for HDR in some update. There are no HDR monitors, and there won't be for another half year.
MajGenRelativity - Tuesday, January 3, 2017
I've only done coding on some small-scale projects, and nothing that comes anywhere near the complexity of an OS, but I'm going to make the reasonable assumption that quality HDR support cannot be done "quickly". I can only hope Microsoft is already working on it.
Zan Lynx - Tuesday, January 3, 2017
There are already many 10-bit color monitors. While they may not precisely match HDR requirements, they are very close and are what professionals use to create HDR content.
lobz - Wednesday, January 4, 2017
the useless idea was to crap your comment here without giving it at least some thought...
bill44 - Wednesday, January 4, 2017
http://www.tftcentral.co.uk/news.htm
DanNeely - Tuesday, January 3, 2017
Has MS said anything about whether they're going to include color space management enhancements in the upcoming Creators Update? It'd be a logical thing to include, and a lot more generally useful than gimmicks like the Surface Dial contraption. OTOH, it'd probably be a lot more work by a large margin...
Friendly0Fire - Tuesday, January 3, 2017
The Surface Studio reveal had a toggle for switching between sRGB, AdobeRGB and DCI-P3, and it seemed pretty much instant. I can only assume that this will not be restricted to the Studio.
jnolen - Tuesday, January 3, 2017
Everything I have heard has suggested HDR support is coming to Windows 10 in the Creators Update. I did a quick Google search and a bunch of results from mid-December said the same.
wumpus - Tuesday, January 3, 2017
For all their marketing talk, it looks like AMD is completely ignoring the VR world. Which might make sense, because it is unlikely that Oculus or Valve will make an AMD-specific headset, but it seems a bit of a problem.
What I was hoping for was to allow the GPU, instead of submitting entire frames at once, to submit just horizontal stripes (presumably the spec would be line by line, but only so you could divide things up to match the headset's zones). This would make "chasing the beam" a real possibility, and likely cut the latency by at least half (and most of that would come from the compute-to-eyeball latency).
AMD needs to come up with a better solution to VR than the current "throw pixels at the headset" if they want to compete with Nvidia, and a "VRsynch" might have worked well.
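To put rough numbers on the stripe idea, here's a minimal back-of-the-envelope sketch in Python (all values and names are made up for illustration; this is not any real VR API):

    # Latency comparison: whole-frame submission vs. racing the scanout
    # in stripes. Purely the arithmetic behind the "chase the beam" idea.
    REFRESH_HZ = 90                   # typical VR headset refresh rate
    FRAME_MS = 1000 / REFRESH_HZ      # ~11.1 ms per refresh
    N_STRIPES = 8                     # split the panel into 8 bands

    # Whole frame: render it all, then scan it out. A pixel's content is
    # about one frame old when scanout starts, plus half a frame of
    # scanout on average before it reaches the eye.
    whole_frame_ms = FRAME_MS + FRAME_MS / 2      # ~16.7 ms

    # Beam chasing: render each band just ahead of the beam, so a pixel
    # is only about one stripe of render time plus half a stripe of
    # scanout old.
    stripe_ms = FRAME_MS / N_STRIPES
    beam_chase_ms = stripe_ms + stripe_ms / 2     # ~2.1 ms

    print(f"whole frame: ~{whole_frame_ms:.1f} ms compute-to-eyeball")
    print(f"{N_STRIPES} stripes: ~{beam_chase_ms:.1f} ms compute-to-eyeball")

The "at least half" figure is conservative under these assumptions; the win grows with stripe count, at the price of much harsher per-stripe render deadlines (miss one and you get tearing within the frame).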
BrokenCrayons - Tuesday, January 3, 2017
VR doesn't represent a large enough market for a company lean on profits to invest in. It's still a new category (discounting failures to take off in the 1990s) and the risks may not be worth it. It'll be sunk costs if AMD or any other company invests heavily in current-gen VR only for it to suffer another failure. That's not something AMD can afford, so seeing the company's leadership exercise caution is actually somewhat reassuring. Let NV burn money on it first and then tag along later if it catches on.
schizoide - Tuesday, January 3, 2017
The technology itself sounds really cool, but it's incredibly confusing to brand it FreeSync, for two reasons.
First, HDR stuff has nothing to do with adaptive refresh. And second, it sounds like it won't actually be free!
FMinus - Tuesday, January 3, 2017
It's probably going to be branded as FreeSync 2, which includes this HDR stuff and FreeSync (1).
schizoide - Tuesday, January 3, 2017
Yes, which is confusing for the two reasons I stated. Adding a 2 after the name doesn't help.
edzieba - Wednesday, January 4, 2017
Branding aside, this all seems to boil down to "the status quo, but with an automatic gamut switch".
Currently, if you feed a wide-gamut desktop monitor a signal, it will be 'mapped to the native gamut' automatically (i.e. there is no hardware in the monitor to scale the gamut, just the real-time LUT for calibration) because there is nothing else for you to do: all the gamut mapping MUST be done on the PC end by default. With FreeSync 2, the gamut mapping... remains on the PC. The only change appears to be that you have to use AMD's API for the privilege, and (one would hope) not need to check a separate database of monitor gamuts.
Indeed, that database of monitor gamuts is the real object of value. Once you have that, a game engine can ignore AMD's API entirely and just perform the mapping itself in a GPU-agnostic way (and likely do it in the exact same tone mapping pass almost every engine is already using anyway).
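To illustrate what that engine-side mapping amounts to, a minimal sketch in Python (standard colorimetry, nothing AMD-specific; the 3x3 matrices are the published sRGB and Display P3 ones, rounded, so treat the values as illustrative):

    import numpy as np

    # Minimal gamut remap: linear sRGB -> XYZ -> linear Display P3.
    # Given the target panel's primaries (the thing a database of
    # monitor gamuts would supply), this is a single 3x3 matrix multiply
    # an engine can fold into its existing tone mapping pass.
    SRGB_TO_XYZ = np.array([
        [0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505],
    ])
    XYZ_TO_P3 = np.array([  # Display P3, D65 white point
        [ 2.4935, -0.9314, -0.4027],
        [-0.8295,  1.7627,  0.0236],
        [ 0.0358, -0.0762,  0.9569],
    ])

    def srgb_to_panel(rgb_linear):
        """Remap one linear sRGB pixel into the panel's native gamut."""
        return XYZ_TO_P3 @ (SRGB_TO_XYZ @ rgb_linear)

    # Pure sRGB red lands well inside P3's wider red primary:
    print(srgb_to_panel(np.array([1.0, 0.0, 0.0])))  # ~[0.82, 0.03, 0.02]

Swap in any other panel's primaries and the structure is identical, which is the point: the per-monitor data, not the API, is the hard part.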
Ryan Smith - Wednesday, January 4, 2017
It's about more than just wide gamut. Currently, if you feed an HDR monitor a signal, it's almost always HDR10, which means it has to be tone mapped from the range of HDR10 to the actual native range of the monitor. This is what AMD wants to get around.
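Concretely: HDR10 uses the SMPTE ST 2084 (PQ) curve, which nominally encodes up to 10,000 nits, while a real panel might peak at a few hundred. A rough Python sketch of the per-pixel work involved (the PQ constants are the published ST 2084 ones; the simple clamp is a stand-in for whatever proprietary roll-off an actual monitor applies):

    # Decode the SMPTE ST 2084 (PQ) transfer function, then squeeze the
    # 0-10,000 nit HDR10 range down to a panel's actual peak.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_to_nits(code):
        """Normalized HDR10 code value (0..1) -> luminance in nits."""
        p = code ** (1 / M2)
        return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

    def tone_map(nits, panel_peak=600.0):
        """Crude stand-in tone map: clamp to what the panel can emit."""
        return min(nits, panel_peak)

    print(tone_map(pq_to_nits(0.9)))  # ~3900 nits encoded -> clipped to 600
    print(tone_map(pq_to_nits(0.5)))  # ~92 nits -> passes through unchanged

Knowing the panel's true peak and gamut up front lets the engine do this once, in its own tone mapping pass, instead of the monitor redoing it blind.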
xTRICKYxx - Wednesday, January 4, 2017
Minor typo: "Admittedly I don’t see too many Radeon HD 7790 or R9 290X owners..."
Assuming you mean 7970, but the 7790 is just as real :P
Ryan Smith - Wednesday, January 4, 2017
7790 is GCN 1.1. The first GCN 1.1 card, in fact. (7970, by comparison, was GCN 1.0.)
iwod - Wednesday, January 4, 2017
Another moment that shows how poor AMD is with their marketing and naming.
FreeSync 2 essentially has little to do with what the original FreeSync idea was.
Vodietsche - Wednesday, January 11, 2017
Woot! looks sweeet.