Targeting OpenGL is not so easy: don’t get confused by the documentation

I want to write this post to clarify once and for all how the OpenGL extension mechanism works and the correct way to target OpenGL versions. I named the article this way because OpenGL is generally badly documented (or at least difficult to understand), and the OpenGL.org wiki makes things worse. For example, several people got confused by this page:

https://www.opengl.org/wiki/OpenGL_Extension#Core_Extensions

"Targeting OpenGL 2.1

These are useful extensions when targeting GL 2.1 hardware. Note that many of the above extensions are also available, if the hardware is still being supported. These represent non-hardware extensions introduced after 2.1, or hardware features not exposed by 2.1's API. Most 2.1 hardware that is still being supported by its maker will provide these, given recent drivers."

And this document:

https://www.opengl.org/registry/specs/ARB/map_buffer_range.txt

"New Procedures and Functions

void *MapBufferRange( enum target, intptr offset, sizeiptr length,
bitfield access );

void FlushMappedBufferRange( enum target, intptr offset, sizeiptr length );

Issues

(1) Why don't the new tokens and entry points in this extension have
"ARB" suffixes like other ARB extensions?

RESOLVED: Unlike a normal ARB extension, this is a strict subset of functionality already approved in OpenGL 3.0. This extension exists only to support that functionality on older hardware that cannot implement a full OpenGL 3.0 driver. Since there are no possible behavior changes between the ARB extension and core features, source code compatibility is improved by not using suffixes on the extension."

So the question is:

- Is GL_ARB_map_buffer_range a core extension or not?

Yes, it is. If you check gl.xml, you will find:

 <extension name="GL_ARB_map_buffer_range" supported="gl|glcore">
     <require>
         <enum name="GL_MAP_READ_BIT"/>
         <enum name="GL_MAP_WRITE_BIT"/>
         <enum name="GL_MAP_INVALIDATE_RANGE_BIT"/>
         <enum name="GL_MAP_INVALIDATE_BUFFER_BIT"/>
         <enum name="GL_MAP_FLUSH_EXPLICIT_BIT"/>
         <enum name="GL_MAP_UNSYNCHRONIZED_BIT"/>
         <command name="glMapBufferRange"/>
         <command name="glFlushMappedBufferRange"/>
     </require>
 </extension>

The 'supported' attribute says "gl|glcore", so GL_ARB_map_buffer_range is indeed a core extension. The conclusion the wiki pushes you toward is that you can target OpenGL 2.1 while expecting OpenGL 3.0 core extensions, which is tricky if you are not an expert. The functions have non-ARB names, so you may just read the list, initialize a loading library like GLEW, call the functions without ever checking the extension string, and be surprised when your application does not work on some OpenGL 2.1 drivers. You may then think: "But the official wiki page told me I could target OpenGL 2.1 with that, so the wiki is wrong." No, the wiki is correct, because it says: "Most 2.1 hardware that is still being supported by its maker will provide these, given recent drivers."
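To see the failure mode concretely, here is a minimal sketch, assuming GLEW as the loading library and an already-created OpenGL 2.1 context; the function name report_map_buffer_range is mine, invented for the example. GLEW exposes one flag per extension, and when the driver does not export the extension the corresponding entry points simply stay unresolved:

    #include <stdio.h>
    #include <GL/glew.h>

    /* Call this after the GL 2.1 context is current and glewInit() succeeded. */
    void report_map_buffer_range(void)
    {
        /* GLEW_ARB_map_buffer_range is GLEW's per-extension flag. On a driver
           that only advertises plain OpenGL 2.1 it can legitimately be false. */
        if (!GLEW_ARB_map_buffer_range) {
            /* In that case glMapBufferRange was never resolved, so calling it
               unconditionally would jump through a NULL function pointer. */
            printf("GL_ARB_map_buffer_range is NOT exposed by this driver\n");
            return;
        }
        printf("GL_ARB_map_buffer_range is available, glMapBufferRange can be used\n");
    }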

OpenGL 2.1 is the minimum requirement for the extension to work in a given environment, but this particular extension, like several similar ones, was introduced with OpenGL 3.0, not OpenGL 2.1. It may or may not be present on a recent OpenGL 2.1 driver, depending on whether the vendor implemented it, and it is not guaranteed, because on 2.1 these are extensions, not core features. So, when you target OpenGL 2.1, you have to be very careful about these subtleties, because the authors themselves documented them poorly. In fact, I disagree with the suggestion of the wiki page: do not target OpenGL 2.1 expecting 3.0 core extensions; check for the extensions and, when they are available, use them to improve the rendering, which is a totally different thing.
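As a concrete illustration of that suggestion, here is a minimal sketch, again assuming GLEW and a buffer already bound to GL_ARRAY_BUFFER; the helper name map_vertex_buffer is mine, not part of any API. It uses the extension when the driver exposes it and falls back to the plain OpenGL 1.5/2.1 mapping path otherwise:

    #include <stddef.h>
    #include <GL/glew.h>

    /* Map 'length' bytes, starting at 'offset', of the buffer currently
       bound to GL_ARRAY_BUFFER for writing. */
    void *map_vertex_buffer(GLintptr offset, GLsizeiptr length)
    {
        if (GLEW_ARB_map_buffer_range) {
            /* Improved path: map only the requested range and let the driver
               discard its previous contents. */
            return glMapBufferRange(GL_ARRAY_BUFFER, offset, length,
                                    GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_RANGE_BIT);
        }

        /* Fallback: plain OpenGL 2.1 can only map the whole buffer. */
        unsigned char *base = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
        return base ? base + offset : NULL;
    }

In both cases the caller later unmaps with glUnmapBuffer(GL_ARRAY_BUFFER): the application runs everywhere, it just runs a bit better where the extension exists.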

At this point, you may ask:

- What kind of display driver provides only OpenGL 2.1, and not OpenGL 3.0 or a later version?

Well, it is more common than you might think: the display drivers installed in virtual machines, for example, or the drivers of older graphics hardware that implements a subset of the OpenGL 3.0 core extensions but not every required feature (so it cannot be OpenGL 3.0 compliant according to the Khronos conformance tests). Remember that you always have to check for an extension before you use it. Even the most recent OpenGL 4.5 core extension can be present on a very old OpenGL version, if that version satisfies the extension's minimum requirement. I could write an OpenGL extension that becomes core in 4.5 but can also be implemented on OpenGL 1.2, because 1.2 is all the extension needs to work. But this does not mean that, if you target OpenGL 1.2 (which is pretty old), you can count on a 4.5 feature, except as an optional improvement to the graphics; relying on it would frankly be deplorable. My best suggestion is to scale the graphics along the correct timeline of the extensions: check whether they are supported, and use modern extensions to improve the rendering, not to decide whether the application runs at all.
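The check itself does not even require a loading library. On a 2.1 context the whole list is returned by glGetString(GL_EXTENSIONS) as one space-separated string (glGetStringi only exists from 3.0 onward), and the only subtlety is comparing whole tokens rather than substrings. A minimal sketch, assuming a context is current:

    #include <string.h>
    #include <GL/gl.h>

    /* Returns 1 if 'name' appears as a whole token in the space-separated
       extension string of the current pre-3.0 context, 0 otherwise. */
    int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        size_t len = strlen(name);

        while (ext && *ext) {
            const char *hit = strstr(ext, name);
            if (!hit)
                return 0;
            /* Accept only whole-token matches, so that an extension whose name
               is a prefix of another one does not produce a false positive. */
            if ((hit == ext || hit[-1] == ' ') &&
                (hit[len] == ' ' || hit[len] == '\0'))
                return 1;
            ext = hit + len;
        }
        return 0;
    }

With this in place, the decision becomes: if has_extension("GL_ARB_map_buffer_range") returns 1, take the range-mapping path, otherwise take the plain 2.1 path, and the application starts in both cases.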

For instance, you can target OpenGL 2.1 and use GL_ARB_map_buffer_range, but if the application refuses to start because that extension is not supported by the driver, you are giving poor OpenGL 2.1 support to all the machines that do provide it. I hope this article helps to clarify this important process.