What GLX_MESA_query_renderer seeks to address is covered in full detail within Intel To Improve "Out Of The Box" Linux Gaming. The basics come down to this: on Linux there is no standardized way for a software program / game to determine the graphics card in use, details of the OpenGL driver, and other system information. Games have had to resort to the PCI library and other measures to try to determine the graphics card in use, which isn't always right for multi-GPU systems, laptops sporting NVIDIA Optimus, and various other scenarios. Determining other Linux system information has been a similar mess.
Games and other software need this information to make smarter choices about default visual quality settings for game engines, to handle driver/hardware blacklists, and otherwise to deliver the best experience. Since the Linux situation is currently a mess, Romanick talked with many stakeholders to come up with a new solution, modelled largely after Apple's handling in OS X.
The GLX_MESA_query_renderer extension provides a method of determining the graphics card(s) in use (including the vendor / device IDs), video memory information, the preferred OpenGL profile, supported versions of the OpenGL Core and Compatibility profiles, and OpenGL ES information. Ian Romanick has drafted the specification for this GLX work and he's also done a prototype implementation within the query-renderer branch of Mesa.
Romanick published this initial work to the Mesa-dev list. Once the specification is settled, he's also planning a similar MESA_query_renderer extension for EGL.
GLX_MESA_query_renderer addresses a real-world problem on Linux and will hopefully be adopted by games and other software making advanced use of OpenGL. However, for it to be really useful and widespread, the proprietary NVIDIA and AMD graphics drivers will also need to support the extension.