VMware's Virtual GPU Driver Is Running Fast


  • #11
    Run games with XP under Ubuntu

    I hope this is not off topic.

    Is it advisable to run Windows XP as a VMware guest under an Ubuntu 12.04 host, once it is available? Will most 3D games work in a satisfactory manner?

    I have a license for XP and I use it via dual boot to play some games which don't work under Wine, e.g. due to GameGuard or other crap.

    But XP will soon be out of maintenance and I don't want to buy another Windows license.

    The system runs a contemporary Nvidia card with the Nvidia binary driver, currently on Ubuntu 11.10.
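
    If you do try it: 3D acceleration in a VMware guest usually has to be enabled explicitly. A minimal sketch, assuming the commonly documented .vmx option (verify against the docs for your Workstation/Player version):

    Code:
    # Hypothetical .vmx snippet - enables the guest 3D path; the same switch is
    # exposed as the "Accelerate 3D graphics" checkbox in the VM's display settings.
    mks.enable3d = "TRUE"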



    • #12
      For anyone with VMware Workstation 8, note that graphics acceleration does not work for shared VMs. I had to unshare my Ubuntu 12.04 VM in order to see the graphics acceleration in action.
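
      A quick sanity check inside a Linux guest for whether 3D acceleration is actually active (plain Mesa tooling, nothing VMware-specific assumed):

      Code:
      # With working 3D you should see "direct rendering: Yes" and a hardware
      # (e.g. Gallium/SVGA3D) renderer rather than a software llvmpipe/softpipe one.
      glxinfo | grep -E "direct rendering|OpenGL renderer"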



      • #13
        I have openSUSE 12.1 as host and openSUSE 12.1 as guest on a Dell XPS 14z laptop - Intel i5-2430M, Intel HD 3000 and Nvidia GT 520M (via Nvidia Optimus) - and I'm trying to get this working.

        Just like the tutorial said, I have VMware Workstation 8.0.3 build-703057, have upgraded my host system to kernel 3.3.4-22-desktop, and have Mesa 8.0.3 and the other dependencies (although I'm not sure which packages I need to install and which not to in openSUSE 12.1 - the amount of packages with intel in the name is confusing!).

        My problem is twofold:

        1) Although my host system says I have 3D acceleration, glxgears makes the CPU usage jump to 80% - it's not supposed to do that, is it? I'm not convinced my system is using 3D acceleration, as playing video causes tearing and uses more CPU than it should.

        2) When I install the xf86-video-intel driver in my openSUSE 12.1 guest, it just stops with a blank screen.

        If you could at least get problem 1 fixed, that would go a long way; trying different combinations of drivers and config is fine in a VM, but I can't mess about with my host system.
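
        For problem 1, a couple of quick host-side checks that separate "the driver loaded" from "the GPU is actually rendering" - a sketch using standard tools, nothing openSUSE-specific assumed:

        Code:
        # Did the i915 kernel driver come up with kernel mode setting?
        dmesg | grep -iE "drm|i915"
        # Which DDX driver did X actually load?
        grep -iE "LoadModule|intel" /var/log/Xorg.0.log | head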

        Part of me just wants to put Ubuntu on it and be done with it! But then I wouldn't learn anything!

        Help from you experts here would be most welcome!



        • #14
          Originally posted by rash.m2k View Post
          Although my host system says I have 3D acceleration, glxgears makes the CPU usage jump to 80% - it's not supposed to do that, is it? I'm not convinced my system is using 3D acceleration, as playing video causes tearing and uses more CPU than it should.
          Running glxgears will pin the CPU on most systems -- the graphics workload is too simple to keep the GPU busy so most of the time is spent in the graphics stack. Try running a shader intensive game instead, something that will make the GPU work hard.
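
          One way to see this effect directly: Mesa's DRI drivers let you force buffer swaps to wait for vblank via the vblank_mode driconf variable (a sketch, assuming a Mesa-based stack):

          Code:
          vblank_mode=0 glxgears   # never sync: CPU-bound, very high fps numbers
          vblank_mode=3 glxgears   # always sync: fps caps near the refresh rate, CPU drops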



          • #15
            Originally posted by bridgman View Post
            Running glxgears will pin the CPU on most systems -- the graphics workload is too simple to keep the GPU busy so most of the time is spent in the graphics stack. Try running a shader intensive game instead, something that will make the GPU work hard.
            How about glxspheres?

            Code:
            rm@internal:~> glxspheres
            Polygons in scene: 62464
            Visual ID of window: 0x93
            Context is Direct
            OpenGL Renderer: Mesa DRI Intel(R) Sandybridge Mobile 
            3.214284 frames/sec - 2.789998 Mpixels/sec
            2.309206 frames/sec - 1.641661 Mpixels/sec
            3.551191 frames/sec - 1.687881 Mpixels/sec
            3.532096 frames/sec - 1.678805 Mpixels/sec
            3.539815 frames/sec - 1.682474 Mpixels/sec
            3.542859 frames/sec - 1.683921 Mpixels/sec

            Code:
            rm@internal:~> optirun glxspheres
            Polygons in scene: 62464
            Visual ID of window: 0x21
            Context is Direct
            OpenGL Renderer: GeForce GT 520M/PCIe/SSE2
            68.489363 frames/sec - 59.448767 Mpixels/sec
            70.040586 frames/sec - 60.795229 Mpixels/sec
            65.860829 frames/sec - 57.167200 Mpixels/sec
            70.228166 frames/sec - 60.958048 Mpixels/sec
            71.775807 frames/sec - 62.301400 Mpixels/sec
            72.789950 frames/sec - 63.181677 Mpixels/sec
            70.925934 frames/sec - 61.563711 Mpixels/sec
            70.724441 frames/sec - 61.388815 Mpixels/sec
            ^C[WARN]Received Interrupt signal.
            Here's my glxinfo (I've removed the extension sections as they were just too much extra info - can post if you want):

            Code:
            rm@internal:~> glxinfo
            name of display: :0
            display: :0  screen: 0
            direct rendering: Yes
            server glx vendor string: SGI
            server glx version string: 1.4
            server glx extensions:
            
            ...
            
            client glx vendor string: Mesa Project and SGI
            client glx version string: 1.4
            client glx extensions:
            
            ...
            
            GLX extensions:                                                                                 
            
            ...
            
            OpenGL vendor string: Tungsten Graphics, Inc
            OpenGL renderer string: Mesa DRI Intel(R) Sandybridge Mobile 
            OpenGL version string: 3.0 Mesa 8.0.2
            
            ...
            
            glu version: 1.3
            glu extensions:
                GLU_EXT_nurbs_tessellator, GLU_EXT_object_space_tess
            ...
            I can clearly see a difference when running glxspheres with/without Optimus (Optimus being MUCH faster and smoother), but CPU usage still rockets to 80% for both (i.e. on both Intel HD and Nvidia gfx). I also noticed that Xorg CPU usage goes up to 50% alongside glxspheres when running with Optimus.

            Shouldn't this be offloaded onto the graphics card only?
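
            Since intel-gpu-tools is in the package list below, one rough way to check whether the Intel GPU is actually busy while the benchmark runs (this only covers the Intel side; under optirun the rendering happens on the Nvidia GPU):

            Code:
            # Terminal 1: run the workload
            glxspheres
            # Terminal 2 (as root): watch the render engine's busy percentage
            intel_gpu_top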

            Here is what I have installed on my system:

            Code:
            internal:/home/rm # rpm -qa | grep xf86
            xf86-input-joystick-1.6.1-3.1.x86_64
            xf86-video-nv-2.1.18-4.1.x86_64
            xf86-video-mga-1.5.0-3.1.x86_64
            xf86-video-vmware-12.0.2-3.1.x86_64
            xf86-video-glint-1.2.7-3.1.x86_64
            xf86-video-trident-1.3.5-3.1.x86_64
            xf86-video-cirrus-1.4.0-3.1.x86_64
            xf86-input-vmmouse-12.8.0-3.1.x86_64
            xf86-video-geode-2.11.13-3.1.i586
            xf86-video-sis-0.10.4-6.1.x86_64
            xf86-video-mach64-6.9.1-3.1.x86_64
            xf86-video-ast-0.95.0-3.1.x86_64
            xf86-video-vesa-2.3.1-3.1.x86_64
            xf86-video-r128-6.8.2-3.1.x86_64
            xf86-video-fbdev-0.4.2-4.1.x86_64
            xf86-input-synaptics-1.6.1-8.1.x86_64
            libxcb-xf86dri0-1.8.1-6.1.x86_64
            xf86-input-wacom-0.15.0-5.1.x86_64
            xf86-video-tga-1.2.1-4.1.x86_64
            xf86-video-neomagic-1.2.6-3.1.x86_64
            xf86-video-chips-1.2.4-4.1.x86_64
            xf86-input-mouse-1.7.2-3.1.x86_64
            xf86-video-voodoo-1.2.4-4.1.x86_64
            xf86-video-siliconmotion-1.7.6-3.1.x86_64
            xf86-video-i128-1.3.5-3.1.x86_64
            xf86-video-ark-0.7.4-3.1.x86_64
            xf86-video-v4l-0.2.0-4.1.x86_64
            xf86-video-dummy-0.3.5-3.1.x86_64
            xf86-input-void-1.4.0-4.1.x86_64
            xf86-video-tdfx-1.4.4-3.1.x86_64
            xf86-video-ati-6.14.4-3.1.x86_64
            xf86-video-intel-2.19.0-7.1.x86_64
            xf86-input-keyboard-1.6.1-4.1.x86_64
            xf86-video-savage-2.3.4-3.1.x86_64
            xf86-input-evdev-2.7.0-3.1.x86_64
            xf86-video-newport-0.2.3-4.1.x86_64
            Code:
            internal:/home/rm # rpm -qa | grep intel
            libdrm_intel1-2.4.33-70.1.x86_64
            vaapi-intel-driver-1.0.17-2.1.x86_64
            xf86-video-intel-2.19.0-7.1.x86_64
            intel-gpu-tools-1.2-1.1.x86_64
            Code:
            internal:/home/rm # rpm -qa | grep -i mesa
            Mesa-libGLESv1_CM-devel-8.0.2-132.1.x86_64
            Mesa-libGL1-8.0.2-132.1.x86_64
            Mesa-libtxc_dxtn1-32bit-1.0.1.git20110528.1938-5.1.x86_64
            Mesa-32bit-7.11-11.1.2.x86_64
            Mesa-libEGL1-8.0.2-132.1.x86_64
            Mesa-libGL-devel-8.0.2-132.1.x86_64
            DirectFB-Mesa-1.4.5-20.1.2.x86_64
            Mesa-libGLESv2-devel-8.0.2-132.1.x86_64
            libOSMesa8-8.0.2-132.1.x86_64
            Mesa-libGLESv1_CM1-8.0.2-132.1.x86_64
            Mesa-libIndirectGL1-8.0.2-132.1.x86_64
            Mesa-8.0.2-132.1.x86_64
            Mesa-libEGL-devel-8.0.2-132.1.x86_64
            Mesa-libtxc_dxtn1-1.0.1.git20110528.1938-5.1.x86_64
            Mesa-libGLU1-8.0.2-132.1.x86_64
            Mesa-libglapi0-8.0.2-132.1.x86_64
            Mesa-libGLU-devel-8.0.2-132.1.x86_64
            Mesa-libGLESv2-2-8.0.2-132.1.x86_64
            Mesa-devel-8.0.2-132.1.x86_64
            Note - I really appreciate any help you can give me. I've posted on quite a few forums but no-one seems to know the answer to this conundrum; the intel drivers website says you just need to install these packages and it should all just work.
            Last edited by rash.m2k; 27 May 2012, 12:57 PM.



            • #16
              That's still a benchmark, and you're CPU-limited. It's expected to use as much CPU as possible.

              Actual apps usually run vsynced to 60 fps.
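
              To illustrate: forcing vsync makes a benchmark behave more like a real app. The environment variable differs per driver - vblank_mode for Mesa, __GL_SYNC_TO_VBLANK for the Nvidia binary driver (both invocations are sketches):

              Code:
              vblank_mode=3 glxspheres                  # Mesa/Intel: caps near the refresh rate
              __GL_SYNC_TO_VBLANK=1 optirun glxspheres  # Nvidia binary driver via optirun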



              • #17
                What packages do I need to install to get the Intel HD 3000 working?

                Previously I had xorg-x11-driver-video-intel-legacy installed. That doesn't seem to be the correct driver (it's legacy).

                Or rather, what packages could possibly be interfering with the Intel HD 3000 working properly? I think before I upgraded to Tumbleweed I had it working with the above intel legacy package - I also had kernel mode setting disabled then.
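
                For anyone following along, the openSUSE-side cleanup implied here would look something like this (package names taken from this thread; review what zypper proposes before confirming):

                Code:
                # Drop the legacy DDX and make sure the current driver and Mesa are in place.
                sudo zypper rm xorg-x11-driver-video-intel-legacy
                sudo zypper in xf86-video-intel Mesa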
                Last edited by rash.m2k; 27 May 2012, 02:30 PM.



                • #18
                  Are you running on a VMware guest or just on regular hardware? If regular hardware, I think the latest "regular" driver release should work:



                  If you're running on a VMware guest then you probably want something else.



                  • #19
                    Originally posted by bridgman View Post
                    Are you running on a VMware guest or just on regular hardware? If regular hardware, I think the latest "regular" driver release should work:



                    If you're running on a VMware guest then you probably want something else.
                    I've already seen that page and have all the latest versions of the packages installed. There must be some config I'm missing, such as kernel mode setting, or a driver I have added to the 50-blacklist.conf file.

                    Something is not right: my CPU usage should be near zero. I've run glxgears, glxspheres and MKV video files on previous laptops and machines, and CPU usage generally stays close to zero.
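
                    Both config suspects named above can be checked directly; a minimal sketch with standard tools:

                    Code:
                    # Any output here means kernel mode setting is disabled on the kernel command line.
                    grep -o nomodeset /proc/cmdline
                    # Is anything intel-related blacklisted?
                    grep -i intel /etc/modprobe.d/50-blacklist.conf
                    # Is the i915 module actually loaded?
                    lsmod | grep i915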



                    • #20
                      Originally posted by bridgman View Post
                      Are you running on a VMware guest or just on regular hardware? If regular hardware, I think the latest "regular" driver release should work:



                      If you're running on a VMware guest then you probably want something else.
                      I've already installed the latest versions of all of these items.

