46 Comments

  • Wreckage - Monday, August 6, 2012 - link

    Hopefully OpenGL can return to prominence. There needs to be competition for the closed, proprietary DirectX, which has stagnated lately.

    Valve's move to Linux and the rise of Android as a gaming platform should provide additional support for more games to use alternatives to DirectX.
  • Flunk - Monday, August 6, 2012 - link

    You forgot Macs; those things are pretty hot right now too. Blizzard has made a load of cash off the Mac versions of their popular games.
  • KoolAidMan1 - Tuesday, August 7, 2012 - link

    And iOS, arguably the fastest growing segment in the gaming market right now.

    There are TONS of games using OpenGL right now; hopefully it keeps improving thanks to this resurgence.
  • bobvodka - Tuesday, August 7, 2012 - link

    OpenGL|ES is not OpenGL; they are two separate specs. iOS and Android can grow all they like, but they still aren't using OpenGL.
  • djgandy - Monday, August 6, 2012 - link

    Please explain how DX has stagnated?
  • nevertell - Monday, August 6, 2012 - link

    Well, tessellation was supported by OpenGL years before DX11 was released.
  • B3an - Monday, August 6, 2012 - link

    Even John Carmack admitted recently, at QuakeCon 2012, that DirectX is now overall better than OpenGL, and Carmack has always used OpenGL. So I don't think Microsoft has anything to worry about.

    With DX11.1 Microsoft has also rewritten and optimised much of it for better performance, efficiency, and mobile GPU support. So MS now basically has a single fully featured API for both desktop and mobile GPUs that can do everything. DX11.1 also introduced Target Independent Rasterization, which I don't think OpenGL has.
  • PrinceGaz - Monday, August 6, 2012 - link

    The problem with DirectX is that it is neither royalty-free nor supported across all platforms. The days of game development being led by Windows PCs are over; it is now all about convergence and supporting as many target markets as possible with one common language: mobile phones (mainly Android and iOS at the moment), game consoles, PCs, and Macs.

    It will mean compromises at the high end, but graphics are so good these days it hardly matters. We want better games, not better graphics, so the less developers need to worry about graphics and the more time they can spend making a good game that runs on a wide range of devices, the better, in my opinion.
  • inighthawki - Monday, August 6, 2012 - link

    Since when has DirectX not been royalty free?
  • Ryan Smith - Monday, August 6, 2012 - link

    I'd have to double-check, but since at least DX6, when Microsoft included S3TC as part of the standard. You can't be DX compliant without S3TC, and you have to pay to license S3TC.
  • BenchPress - Monday, August 6, 2012 - link

    S3TC support is not mandated. You can query which formats are supported, and trying to create unsupported ones will fail (see the sketch below).

    Of course texture compression is important so the major hardware manufacturers implemented it and pay for it. But this is equally true for OpenGL. They all implement the extension.

    Core Direct3D functionality is royalty free, just like OpenGL. After all, they are just APIs.
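
    A minimal sketch of the query being described, using the Direct3D 9-era entry point (the helper name and format choices here are illustrative, not from the discussion):

        #include <d3d9.h>

        // Ask whether the default adapter can create DXT5 (an S3TC format)
        // textures under a common display format; a failing HRESULT simply
        // means "unsupported", not "non-compliant".
        bool SupportsDXT5(IDirect3D9* d3d)
        {
            HRESULT hr = d3d->CheckDeviceFormat(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                D3DFMT_X8R8G8B8,            // adapter (display) format
                0,                          // no special usage
                D3DRTYPE_TEXTURE, D3DFMT_DXT5);
            return SUCCEEDED(hr);
        }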
  • mczak - Monday, August 6, 2012 - link

    That isn't quite true for DX10 and up. There are very few optional formats; almost everything is mandatory (including the S3TC formats).
    That said, IIRC hardware manufacturers did NOT have to pay S3TC license fees for D3D, thanks to a licensing deal on Microsoft's side (they still had to pay for OpenGL).
  • StevoLincolnite - Monday, August 6, 2012 - link

    DirectX is also used on the Xbox 360, which is a games console; the PS3 even has DirectX 9-class hardware, albeit without the API. (It uses OpenGL.)
    Android, iOS, and other mobile OSes can't play AAA games like a PC or console can; in fact, most games are incredibly simple, with decade-old graphics.

    Steam has what, 30-40 million users? Origin is picking up steam (pun intended), and Ubisoft's uPlay probably has a few users.

    So let's go with 40 million PC gamers (which is probably lower than the true number), who in turn use DirectX.
    Then add on the Xbox console numbers, which makes it 110 million gamers using DirectX in one form or another.

    Suddenly DirectX doesn't look insignificant.

    If you go back almost a couple of decades, to the era of 3dfx, a lot of games actually used multiple APIs that you could choose between depending on your hardware.
    I remember Unreal Tournament, for instance, shipping with 3dfx Glide, OpenGL, software rendering, and Direct3D, so it's not like you can't support multiple APIs in a game engine using some wrappers, which may be the path developers take again in the future.
  • BiggieShady - Monday, August 6, 2012 - link

    Saying things like "PS3 has DX9 hardware but it uses OpenGL" is wrong on so many levels. DirectX and OpenGL are APIs designed to work with different types of hardware, with one thing in common: they expose the same hardware functionality. So the PS3 and Xbox have different GPUs that both expose the same functionality, whether through OpenGL or DirectX.
  • wicketr - Monday, August 6, 2012 - link

    There are over 500 million Mac/Linux/smartphone/tablet users that DirectX isn't on, meaning a developer using DirectX would miss out on those users.

    ...Or they could use OpenGL and cover 100% of the user base. Currently, it'd be asinine to develop a mobile-only game in DirectX, because they'd only reach about 5% of the market.

    The question on the desktop is whether DirectX's feature set is THAT much better than OpenGL's to warrant its use. As OpenGL catches up in parity and more users abandon Windows, the reasoning for using DirectX gets smaller and smaller, especially as developers get more and more comfortable with OpenGL (due to mobile programming).

    The only way DirectX can really survive over the long term (10+ years out) is if Microsoft develops engines compatible with iPhone and Android. Those ecosystems are just WAY too large for Microsoft to use its proprietary engine to push users (and developers) to MS phones.
  • bobvodka - Monday, August 6, 2012 - link

    DX will survive as long as there is a Windows or Xbox market for it to survive in.

    Beyond that developers, like myself, will do what we've always done and use the best API for the target platform.

    Amusingly the only people who really seem to care about which API 'wins' are OS Zealots who want their OS to dominate the world... which, you know, works so well...

    The rest of us doing the real work day in-day out will just use the best tools for the job and won't care who they come from.

    Unfortunately, right now, the ARB doesn't have a track record of 'the best tools', having dropped the ball with the Longs Peak/OGL 3.0 debacle. That was, at the time, OpenGL's best chance of making a comeback on Windows (this was around the time of Vista/DX10 and the wailing of XP gamers who wanted DX10 features in their games). When it comes to OpenGL, I wouldn't trust the ARB to find their own arses if handed a map.

    Now, OpenGL|ES is another matter; they have handled that well, and it's a good thing. BUT OpenGL|ES is NOT OpenGL (yay marketing!); you do not program these APIs the same way, and doing so is going to hurt you, which brings us back around to 'the right tools for the job'.

    OpenGL is not, for the majority of the home computer market, the right tool yet.
  • ananduser - Tuesday, August 7, 2012 - link

    The user you replied to presented some vague numbers representing dedicated gamers, not the install base. You counter that with the totality of non-Windows devices?

    You seem bent on people "abandoning" Windows like it's the plague and that's the right thing to do. You also seem to go to sleep at night wishing DirectX were dead so that EA would develop BF4 for your Mac (I presume).

    DX will still be the dominant gaming industry API even 10 years out. Even more so than today.
  • SleepyFE - Monday, August 6, 2012 - link

    DirectX is being used because people think like you (and so far it was better). Most people use Windows, so why not use DirectX? NOW you have 110 million gamers using DirectX because developers used it. If they had decided to use OpenGL, which also works on Windows and Xbox, you would have 110 million gamers using OpenGL and a few million using DirectX, because Microsoft was the developers' partner and forced them to use it.

    So far, using OpenGL was a problem, since it had fewer features and was harder to learn (deep end of the pool). Now is the time to make the switch.

    Also, I do not use Windows because I like it; I like the games that are made for it. If I want to play a game, I am FORCED to have Windows, because DirectX only runs on Windows. Using OpenGL does not yet mean a game will run everywhere, but it does mean a lot less work to get it working elsewhere. Right now so many people are on Windows that developing a game in both DirectX and OpenGL just doesn't pay; using only one is the way to go. An OpenGL title right now has a slightly wider audience (a very very very very small increase). But if developers keep using DirectX, we have to keep using Windows (even if we don't like it).

    Games are the reason most of us use Windows, but with good OpenGL that might change. Ouya is a gaming console running Android, and as phones become more and more powerful we might only need a phone some day: a phone with a super-fast USB 5.0 port, plugged into a monitor with more USB ports for a keyboard, mouse, controller... And that phone will probably have Android on it. And to play games you will need OpenGL. At that point it would be good to have a huge selection of games (even if they are old and outdated; nostalgia never goes out of fashion).
  • bobvodka - Monday, August 6, 2012 - link

    The PS3 does not use OpenGL.

    There is a PSGL implementation, but no one who wants any speed out of the thing uses it; GCM is the PS3 API of choice.

    Linux and OSX are the only OSes which use OpenGL exclusively.

    Mobile devices use OpenGL|ES, which is a completely different API and, more importantly, cannot be used in the same manner as OpenGL. They are moving closer in feature parity; however, even then I would call anyone who treated a mobile device like a desktop insane.

    Developers also regularly support multiple rendering APIs when they produce 360, PS3, and PC titles; right now we (where I work) are producing a game with X360, PS3, and D3D11 support, plus early WiiU support.

    In the future, for a short while at least until we can drop the older platforms, I suspect we'll have X360, PS3, D3D11, Xbox 720, PS4, WiiU, and iOS support in the renderer (maybe Android too, although I personally know of no plans); as always, we will pick the best API for the platform.
  • wicketr - Monday, August 6, 2012 - link

    The question is what API you are using for Mac users. They are 10% of desktop users, and significantly more than that if you discount business machines. Surely you're not ignoring a quarter (and growing) of the consumer desktop/laptop market by not even writing your game for Macs.

    If it's OpenGL, then wouldn't it be a greater benefit to "write once, deploy twice" on both OSes? Or are the graphical benefits of DirectX really so much better that it's worth writing it twice?
  • bobvodka - Monday, August 6, 2012 - link

    Firstly, using the Steam hardware survey (which is the correct metric, as we are a AAA games studio), I'll grant you at most 5% of the market, the majority of which have Intel GPUs, for which the OpenGL implementation has generally been... sub-par, to put it mildly.

    Secondly, all console development tools are on the PC and based around Visual Studio, so we work in Windows anyway.

    Thirdly, the Windows version generally comes about because we need artist/developer tools. Right now it is also useful for learning about and testing 'next gen' ideas with an API which will be close to the Xbox API.

    Fourthly: we have a Windows version working which uses D3D11, and OpenGL offers no compelling reason to scrap all that work. Remember, D3D has had working compute shaders with a sane integration for some years now; OpenGL has only just got these, and before that, doing the work with OpenCL meant opening a horrible can of worms due to the lack of standardised, required interop extensions (I looked into this at the back end of last year for my own work at home and quickly despaired at the state of OpenGL and its interop).

    Finally, OSX lags OpenGL development. Currently OSX 10.7.3 (as per https://developer.apple.com/graphicsimaging/opengl... ) supports GL 3.2, and I see no mention of the version being supported in 10.8. Given that OpenGL 3.2 was released in 2009 and OSX 10.7 was released last year, I wouldn't pin my hopes on seeing 4.2 any time 'soon'.

    Now, supporting 'down market' hardware is of course a good thing to do; however, in D3D11 this is easy (feature levels; see the sketch below), whereas in OpenGL different hardware + drivers = different features, which again increases the engineering workload and the need for fallbacks.
    You could mandate 'required features', but at that point you start cutting out market share, and that 5% looks smaller and smaller.
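
    A minimal sketch of the feature-level mechanism referred to above (device-creation boilerplate only; error handling omitted):

        #include <d3d11.h>

        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
        };

        ID3D11Device*        device  = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL    got;

        // One call, one code path: the runtime returns the highest level the
        // hardware supports, rather than a per-vendor extension scavenger hunt.
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
            &device, &got, &context);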

    Now, we ARE putting engineering effort into OpenGL|ES, as mobile devices are an important cornerstone from a business standpoint, so the cost can be justified.

    In short: there is no compelling business or technical reason at this juncture to drop D3D11 in favor of OpenGL to capture a fragment of the 5% AAA 'home computer' market, when there are no side benefits and only cost.
  • powerarmour - Monday, August 6, 2012 - link

    Yes, because Carmack is always 100% right about everything, and the id Tech 5 engine is the greatest and most advanced around.
  • SleepyFE - Monday, August 6, 2012 - link

    id Tech 5 is awesome!! I don't like shooters (except for Prey), but I played Rage just to see how much "worse" OpenGL is. The game looks GREAT. I can't tell it apart from any other AAA game on graphics alone. And that means OpenGL is good enough and should be used more. Screw what someone says; try it yourself, then tell me OpenGL can't compete.
  • bobvodka - Monday, August 6, 2012 - link

    False logic: games are only as good as their artwork.

    OpenGL has shaders, so yes, with good artwork it can do the same as D3D. However, the API itself, the thing programmers have to work with, isn't as good, AND up until now it lacked feature parity with D3D11.

    Feature-wise, OpenGL is there.
    API/usability-wise, it isn't.

    FYI: I used OpenGL for around 8 years, from around 1.3 until 3.0 came out, and, like a few others, was so fed up with the ARB at that point that I gave up on GL and moved to a modern API. I'm speaking from an interface-design point of view.
  • Penti - Friday, August 10, 2012 - link

    Game engines are perfectly fine supporting different graphics APIs. Obviously non-Windows platforms won't run D3D; Microsoft does not license it. While they do license stuff like ActiveSync/Exchange, exFAT (which should have been included in the SDXC spec under FRAND terms but isn't), NTFS, remote desktop protocols, OpenXML, binary document formats, SharePoint protocols, some of the .NET environment, etc., most of the vital tech is behind paid licensing. They don't even specify the Direct3D APIs for implementation by non-hardware vendors; it's simply not referenced at all. OpenGL is thoroughly referenced in comparison.

    Even though the PS3 isn't OGL (PSGL is OGLES-based), you could still do Cg shaders, or convert HLSL or GLSL shaders and vice versa, so it's not like skills are lost. Tools should be written against the game engines and middleware anyway.

    Plus, desktop OGL is compatible with OGLES in the newer releases, such as 4.1 and 4.3, albeit with some tricks/configuration/compatibility modes. Some implementations suck, but that will also be true of some graphics chips' DX support.
  • inighthawki - Monday, August 6, 2012 - link

    The tessellation feature you're referring to is a vendor-specific hardware extension, not in the same class as DirectX's tessellation. The tessellation hardware introduced for DX11 is a completely programmable pipeline that offers more flexibility. DirectX does not add support for hardware-specific features, for good reason.
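
    A hedged sketch of the programmability in question: the per-patch constant function of a DX11 hull shader, held here in a C++ string for illustration (the struct and function names are made up). The tessellation factors are computed in ordinary shader code rather than fixed by the hardware:

        // HLSL source embedded as a C++ raw string, illustrative only.
        const char* patchConstantHLSL = R"(
            struct PatchTess
            {
                float edges[3] : SV_TessFactor;        // per-edge factors (tri domain)
                float inside   : SV_InsideTessFactor;  // interior factor
            };

            PatchTess PatchFunc(uint patchId : SV_PrimitiveID)
            {
                PatchTess t;
                // Fully programmable: distance-based LOD, screen-space metrics,
                // silhouette tests... anything expressible in shader code.
                t.edges[0] = t.edges[1] = t.edges[2] = 4.0f;
                t.inside = 4.0f;
                return t;
            }
        )";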
  • djgandy - Tuesday, August 7, 2012 - link

    Tessellation was only added to the GL pipeline in 4.0. It was another one of those 'innovations' where GL copied DX, just like pretty much every other feature GL adds.

    What GL needs to do is copy DX when they remove stuff from the API. Scratch this stupid core/compatibility model, which just adds even more run-time configurations, remove all the old rubbish and do not allow mixing of new features with the old fixed function pipeline.
  • bobvodka - Tuesday, August 7, 2012 - link

    There was, 4 years ago, a plan to do just what you described in your second paragraph. Longs Peak was the code name: a complete change and clean-up of the API with a modern design, universally praised by those of us following the ARB's newsletters and design plans.

    In July 2007 they were 'close' to a release; in October they had 'some issues' to work out. They then went radio silent, and 6 months later, without bothering to tell anyone what was going on, they rolled out 'OpenGL 3.0', aka 2.2, where all the grand API changes, worked on for 2 years, were thrown out the window, extensions were bolted on again, and no functionality was removed.

    At this point I, and quite a few others, made a loud noise and departed OpenGL development in favour of D3D10 and then D3D11.

    Four years on, the ARB are continuing down the same path, and I wouldn't bet my future on them seeing sense any time soon.
  • djgandy - Tuesday, August 7, 2012 - link

    The ARB think they are implementing features that developers want, and maybe they are, but AFAIK they have very few big-selling developers anyway.

    It seems the ARB is unable to see the reason behind this, maybe because they are so concerned about the politics of backwards compatibility, or at least certain members of it are. For me this is the hardest part to understand, since it is not even a real breaking of compatibility; it is simply ring-fencing new features from old features, thus saving a ton of driver-writing hell (i.e. what DX did). Instead you can still use begin/end with your GLSL and geometry shaders, with a bit of fixed-function fog over the top. How useful.

    I find it hard to even consider the GL API an abstraction layer, given the existing extension hell and the multiple profiles a driver can opt to support. The end result of this "compatibility" is that anyone actually wanting to sell software using OpenGL has to pick the lowest common denominator... whatever that actually is, because with the newer API you don't even know what you are getting until run time. So you just pick the ancient version of the API, because at least you have a 99% chance that a GL 3.0 driver will be installed, with all the old fixed-function crud that you don't actually need. But glVertex3f is nice, right?

    IMO GL's only hope is for a company like Apple to put it into a high-volume product and actually deliver a good contract to developers (core profile only, limited extensions, and say GL 4.0).
  • bobvodka - Tuesday, August 7, 2012 - link

    Unfortunately Apple isn't very on the ball when it comes to OpenGL support.

    OSX 10.7, released last year, only supports OpenGL 3.2, a spec released in 2009 that had Windows support within 2 months.

    Apple are focusing on mobile, it would seem, where OpenGL|ES is saner and rules the roost.
  • djgandy - Tuesday, August 7, 2012 - link

    I agree; I did say 'like Apple' though. But yes, no one is really interested in desktop GL anymore.

    At least Apple went 3.2 core, though, and they explicitly state in their documentation that deprecated features are removed:

    https://developer.apple.com/library/mac/#documenta...

    Tessellation (burning energy) aside, I don't think there are many headline features from 4.0 onwards that desperately need adding. Apple only supported 2.1 until they released 3.2 support!
  • Jumangi - Monday, August 6, 2012 - link

    Yeah, DX isn't the stagnant one. OpenGL as a spec has been behind DirectX for years now. The stagnation comes from a lot of developers who don't use the latest stuff, since they stay with the lowest common denominator, which we still have with the PS3 and 360. The features are there if developers want to use them.
  • sheh - Monday, August 6, 2012 - link

    And WinXP.
  • ltcommanderdata - Monday, August 6, 2012 - link

    It's mentioned in the article that nVidia desktop GPUs don't support ETC in hardware. Do we know whether ETC support in hardware is universal across OpenGL ES 2.0 GPUs from various vendors like PowerVR, ARM, Qualcomm, and nVidia? In the transition, games will no doubt include both OpenGL ES 2.0 and 3.0 render paths to support more devices, and it'd be preferable if both paths could share texture assets.

    Since ASTC is optional right now which GPU makers have announced they support it in hardware?

    It'll be interesting to see what Apple does with ETC on iOS. The existing PVRTC is more efficient on PowerVR GPUs than ETC. I can't see Apple happy to support a less efficient texture format whose benefit to developers but not Apple is to ease portability of games to Android. I suppose Apple will enable ETC support for compatibility, but won't be investing effort in optimizing its performance. IMG just released a new PVRTC2 texture format for Series5XT and Series6 GPUs that improves upon PVRTC, so I expect Apple will be pushing that format soon on iOS.
  • Ryan Smith - Monday, August 6, 2012 - link

    Most current OpenGL ES 2.0 GPUs do not support ETC. I do not have a good list, but I know that PowerVR, Tegra, and Adreno do not support it. So until OpenGL ES 3.0 is the baseline, developers will still have to deal with disjoint texture compression formats. But at least there's finally a way out.

    As for ASTC, none of the GPU vendors have announced that they'll be supporting it in hardware. This is part of the reason why ASTC is currently being floated as an extension: to gather feedback. Now that the standard is out, GPU vendors can look at integrating hardware support. Since this is a joint ARM/NV proposal, I'd expect both of them to integrate support at the earliest opportunity.
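
    Until ES 3.0 is the baseline, a developer has to probe support at runtime. A minimal sketch of such a check on ES 2.0, assuming the usual extension-string query (the function name is illustrative):

        #include <GLES2/gl2.h>
        #include <string.h>

        // ES 2.0 guarantees nothing beyond the baseline, so compressed-format
        // support is discovered via the extension string at runtime.
        bool HasETC1(void)
        {
            const char* ext = (const char*)glGetString(GL_EXTENSIONS);
            return ext && strstr(ext, "GL_OES_compressed_ETC1_RGB8_texture") != NULL;
        }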
  • pebrb - Monday, August 6, 2012 - link

    You are confusing ETC and ETC2 in the entire article.
    All current OpenGL ES 2.0 GPUs _do_ support ETC (not ETC2), as it is the standard in ES 2.0.
    The only vendor that doesn't support it is Apple (even though the PowerVR SGX has support for it).
    The reason developers support different texture formats is that ETC is very low quality and lacks alpha channel support.
    In OpenGL ES 3.0 they made ETC2 the standard; we'll see how that works out.
    But I'm waiting for ASTC; it has very good features. And from what I've heard it will be supported by Microsoft as well, which means everyone needs to implement it anyway.
  • name99 - Monday, August 6, 2012 - link

    "I can't see Apple happy to support a less efficient texture format whose benefit to developers but not Apple is to ease portability of games to Android."

    Is any of this conspiracy theory in the slightest grounded in reality?

    Apple has generally been happy to support standards, even when that's not directly relevant to its goals. The web world (HTML5 and H.264) provides obvious examples, but an example more relevant to what we're discussing here is Apple's aggressive support for C++11 (both language and library) in the latest version of Xcode. And this is not just passive checkbox support; it is genuine hard work to make C++11 interoperate well with Objective-C, and to perform well.
  • ltcommanderdata - Monday, August 6, 2012 - link

    My doubt isn't about Apple supporting standards in general, it's about them supporting a weaker technology as a standard. Apple promotes H.264 over WebM because it has superior image quality (yes, there is some debate) and superior hardware support. With the existing PVRTC texture format on iOS offering better performance and image quality than ETC on the GPUs Apple uses, I can't see them singing ETC's praises as a standard. They'll no doubt add iOS support for ETC, as it's required for OpenGL ES 3.0, but I don't doubt they'll be strongly encouraging iOS developers to keep using PVRTC and soon PVRTC2.
  • Alexvrb - Monday, August 6, 2012 - link

    PVRTC2 will likely wipe the floor with ETC2. Heck, it'll probably be better than ASTC too.
  • iwodo - Tuesday, August 7, 2012 - link

    Would love to see PVRTC2 vs. ASTC.
  • whooleo - Monday, August 6, 2012 - link

    Concerning the desktop, OpenGL is hardly used in games, due to DirectX being better and because many Linux and Mac OS X users aren't gamers. I mean, there are games on Mac OS X, but not many gamers on Macs. And it shows: Apple's graphics drivers aren't where they should be compared to Linux and Windows, and neither are the graphics cards they include with their PCs. Enough about Mac OS X; Linux also has too many issues with graphics cards and drivers to be friendly to the average gamer. This is just a summary of the reasons desktop OpenGL adoption is low. Now don't get me wrong, OpenGL can be a great open source alternative to DirectX, but some things need to be addressed first.
  • bobvodka - Tuesday, August 7, 2012 - link

    Just a minor correction: OpenGL isn't 'open source', it is an 'open standard'; you get no source code for it :)
  • whooleo - Tuesday, August 7, 2012 - link

    Whoops! Thanks for the correction!
  • beginner99 - Tuesday, August 7, 2012 - link

    Maybe I've got it all wrong, but aren't normal gaming GPUs rather lacking in OpenGL performance? I always thought that was one of the factors workstation cards differ in (due to drivers). Wouldn't that impact game performance as well?
  • Cogman - Tuesday, August 7, 2012 - link

    > As OpenCL was designed to be a relatively low level language (ANSI C)

    OpenCL was BASED on C99. It is not, however, C99. They are two different languages (in other words, you can't take OpenCL code and throw it into a C compiler and vice versa).

    Sorry for the nitpick; however, it is important to note that OpenCL is its own language (just like CUDA is its own language).
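
    A minimal sketch of the distinction (the kernel below is illustrative OpenCL C, held in a C++ string):

        // Feeding this source to a C99 compiler fails on __kernel/__global
        // alone, which is the point: OpenCL C extends and restricts C99
        // into its own language.
        const char* scaleKernel = R"(
            __kernel void scale(__global float* data, float factor)
            {
                size_t i = get_global_id(0); // built-in work-item index; no C99 equivalent
                data[i] *= factor;           // runs once per work-item, no explicit loop
            }
        )";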
  • UrQuan3 - Thursday, August 16, 2012 - link

    Maybe someone can set me straight on this. Years ago, I had a PowerVR card in my desktop (a Kyro II). While it was not a high-end card by any means, I seem to remember a checkbox, "Force S3TC Compression". The card would then load the uncompressed textures from the game and compress them using S3TC before putting them in video RAM. The FPS increase was very noticeable, although load times went up a little.

    Am I confused about that? If I'm not, why isn't it more common? It seems like it would solve the problem of supporting multiple compression schemes. Of course, if a compression scheme isn't general-purpose, that could cause problems.
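
    For what it's worth, OpenGL has long allowed something similar: upload raw pixels but request a compressed internal format, and the driver compresses at load time. A hedged sketch (the token value comes from the EXT_texture_compression_s3tc extension; the function name is illustrative):

        #include <GL/gl.h>

        #define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3  // from EXT_texture_compression_s3tc

        void UploadAndCompress(const void* rgbaPixels, int w, int h)
        {
            // Raw RGBA in, compressed texture out: the driver does the S3TC
            // compression at upload time, trading longer loads for lower
            // video-memory bandwidth per texel fetch, as described above.
            glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                         w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);
        }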
