I have a graphics application here that requires the speed and reliable storage of a server machine to run on, but also needs a decent graphics card for display. The developers have therefore chosen to use a display box containing a number of NVIDIA cards, with the application running on a separate server; the application then just connects to the display box via X.
This has been working fine; however, with newer Qt versions the app has started to use vertex arrays, which the OS-shipped OpenGL libraries don't support over GLX. The NVIDIA OpenGL libraries shipped with the drivers do support them, and tarring up the relevant libs from a machine with the NVIDIA driver/hardware and untarring them on the server works fine (this was suggested by an NVIDIA employee on another forum). However, this doesn't seem very maintainable or clean.
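For reference, the manual copy approach described above looks roughly like this. This is only a sketch: the library paths and the archive name are illustrative, and the exact set of files to copy depends on the driver version and distribution layout.

```shell
# On the machine that has the NVIDIA driver installed:
# bundle the NVIDIA OpenGL client libraries (paths are illustrative;
# adjust for your distro's 32-/64-bit library directories).
tar czf nvidia-gl-libs.tar.gz /usr/lib64/libGL.so* /usr/lib/libGL.so*

# On the server (no NVIDIA hardware):
# unpack into place and refresh the dynamic linker cache.
tar xzf nvidia-gl-libs.tar.gz -C /
ldconfig
```

The `ldconfig` step matters: without it the dynamic linker may keep resolving `libGL.so.1` to the old OS-shipped library. The obvious downside, as noted above, is that this has to be redone by hand every time the driver on the source machine is updated.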
Does anyone know if the NVIDIA installer can be made to install just the application libraries (32- and 64-bit) on a system with no NVIDIA hardware, i.e. not the X drivers or kernel components? I can't see a way to stop it bombing out when it sees no NVIDIA card.
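One avenue worth trying: the `.run` driver packages accept `--extract-only` (`-x`), which unpacks the package contents without actually running the installer, so no hardware check is performed. From the extracted tree the OpenGL libraries could then be copied manually. A sketch, with `<version>` standing in for the actual driver version:

```shell
# Unpack the driver package without installing it; this skips the
# installer's hardware detection entirely.
sh NVIDIA-Linux-x86_64-<version>.run --extract-only

# The libraries are now available under the extracted directory
# and can be copied into place by hand (then run ldconfig).
ls NVIDIA-Linux-x86_64-<version>/
```

This still leaves the copy step manual, but it at least draws the libraries straight from the installer package rather than from a second machine, which may be easier to script against driver updates.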
Thanks