Gallium3D / LLVMpipe With LLVM 2.8

  • #31
    Can anyone explain what the difference between LLVMpipe and Gallivm is? Both seem to be projects to use LLVM, but why do they both exist? As I understand it, LLVMpipe is based on the same design as a normal softpipe CPU driver, does Gallivm take a different approach somehow?



    • #32
      Originally posted by pingufunkybeat View Post
      Crysis needs at least Windows Vista, not sure about the others.

      Which version of Windows are you trying to run them with?

You didn't understand my last sentence...

Even Crysis runs faster than 5 fps on a 4850...



      • #33
I buy a new computer every six years. By the time UT3 was released I still dual-booted Windows, and my then six-year-old Athlon XP 2800+ with 1GB RAM and a 9800 Pro could still run it at medium settings at a res of 1024x768 @ 24-30fps.

Now I have a Phenom 9950 X4, 8GB RAM and a 5770, and I don't want to upgrade whenever I get a 'sorry, unsupported' error simply because my hardware can't do some OpenGL version introduced in the next four years, while my rig could still somewhat cope with a semi-HW fallback.

        Remember that my old 9800 pro was OpenGL 1.5! :/



        • #34
          PS: Almost 3yo already. Damn time flies...



          • #35
            Originally posted by V!NCENT View Post
I buy a new computer every six years. By the time UT3 was released I still dual-booted Windows, and my then six-year-old Athlon XP 2800+ with 1GB RAM and a 9800 Pro could still run it at medium settings at a res of 1024x768 @ 24-30fps.

Now I have a Phenom 9950 X4, 8GB RAM and a 5770, and I don't want to upgrade whenever I get a 'sorry, unsupported' error simply because my hardware can't do some OpenGL version introduced in the next four years, while my rig could still somewhat cope with a semi-HW fallback.

            Remember that my old 9800 pro was OpenGL 1.5! :/
            Your rig should be able to cope with a full-HW fallback for a few years. Most games still offer D3D9/GL2.1 codepaths and you have D3D11/GL4.1 hardware! It's highly unlikely you will ever need to fall back to software rendering during the next 4-5 years (and probably more).

Modern hardware has evolved faster than typical software/games (a mid-range card can play everything at 1920x1200), so unless you are using specialized programs you probably won't face performance issues during your system's lifetime. Add an SSD next year or so and you are golden! (Not kidding, a fast SSD will make your run-of-the-mill netbook feel like a supercomputer.)



            • #36
              Originally posted by smitty3268 View Post
              Can anyone explain what the difference between LLVMpipe and Gallivm is? Both seem to be projects to use LLVM, but why do they both exist? As I understand it, LLVMpipe is based on the same design as a normal softpipe CPU driver, does Gallivm take a different approach somehow?
              LLVMpipe is a driver that uses LLVM to turn TGSI into CPU code so shaders (and possibly other operations) don't have to be interpreted at runtime. I imagine there is a TGSI to LLVM IR converter followed by LLVM itself.

              Gallivm is part of LLVMpipe -- think about LLVM (C++) wrapped up for use in a C driver -- but is also likely to be used "above the line" for *generating* TGSI (as part of the OpenCL stack and possibly as part of a future GLSL compiler).

              - The driver-independent parts of the LLVM / Gallium code are found in src/gallium/auxiliary/gallivm/.
              http://cgit.freedesktop.org/mesa/mes...lvmpipe/README

              Gallivm is in the src/gallium/auxiliary folder, along with other "helper" routines like draw.

              Disclaimer - I found this on the internet so while it was probably true at one time it may no longer be true today
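To make the interpret-vs-compile distinction above concrete, here is a toy sketch (in Python, purely as an analogy; the mini-IR, names, and all code are made up and have nothing to do with actual Mesa internals). Softpipe-style execution walks the shader instructions on every invocation, while the LLVMpipe/gallivm approach translates the IR to native code once and then just calls it:

```python
# A tiny TGSI-like instruction list: each op is (opcode, dst, src0, src1).
SHADER_IR = [
    ("MUL", "t0", "in0", "in1"),
    ("ADD", "out", "t0", "in2"),
]

def interpret(ir, regs):
    """Walk the IR on every invocation -- per-instruction dispatch, like softpipe."""
    for op, dst, a, b in ir:
        if op == "MUL":
            regs[dst] = regs[a] * regs[b]
        elif op == "ADD":
            regs[dst] = regs[a] + regs[b]
    return regs["out"]

def compile_shader(ir):
    """Translate the IR to Python source once and compile it -- analogous to
    gallivm translating TGSI to LLVM IR and letting LLVM JIT it."""
    ops = {"MUL": "*", "ADD": "+"}
    lines = ["def shader(regs):"]
    for op, dst, a, b in ir:
        lines.append(f"    regs[{dst!r}] = regs[{a!r}] {ops[op]} regs[{b!r}]")
    lines.append("    return regs['out']")
    namespace = {}
    exec("\n".join(lines), namespace)
    return namespace["shader"]

shader = compile_shader(SHADER_IR)                        # pay translation cost once
result = shader({"in0": 2.0, "in1": 3.0, "in2": 1.0})     # cheap per-fragment call
```

Both paths compute the same thing; the compiled one just skips the opcode dispatch on every fragment, which is the whole point of JIT-compiling shaders.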



              • #37
I don't trust an SSD with my data yet. Better to let this tech mature a little more, and even then I'd have to do some homework. Of course I could put /home on my HDD, but I currently don't feel that file operations are too slow for my liking.

And upgrading in the middle of my rig's lifetime is a little pointless; I paid for a computer that I configured to last six years.



                • #38
                  Originally posted by V!NCENT View Post
I don't trust an SSD with my data yet. Better to let this tech mature a little more, and even then I'd have to do some homework. Of course I could put /home on my HDD, but I currently don't feel that file operations are too slow for my liking.

And upgrading in the middle of my rig's lifetime is a little pointless; I paid for a computer that I configured to last six years.
                  Fair enough. SSDs are rather more reliable than spinning ceramic platters, however.



                  • #39
                    'til they reach their write limit, or if they have flaky firmware, or a flaky controller and don't save data in the first place, or...



                    • #40
                      Originally posted by curaga View Post
                      'til they reach their write limit, or if they have flaky firmware, or a flaky controller and don't save data in the first place, or...
                      Intel SSDs are rather more reliable than spinning ceramic platters, then. :P

                      The nice thing is that SMART data from SSDs is actually accurate and you can predict failures well in advance. Much nicer than spinning disks which can fail catastrophically without any prior indication. Google's grand hard disk test concluded that "models based on SMART parameters alone are unlikely to be useful for predicting individual drive failures."

After a year of hard use (1.65TB written, virtual machines, browser caches, etc.) my Intel SSD (X25-M G2) has a wearout indicator of 98/100 and a single reallocated sector. The drive is considered worn out when the indicator reaches 10/100, and at the current rate it will last >10 years. Not bad!
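For the curious, the ">10 years" figure follows from a simple linear extrapolation of the wear indicator (a rough back-of-the-envelope sketch; real flash wear isn't perfectly linear and depends on the workload):

```python
# SMART media-wearout indicator counts down from 100; the figures below
# are the ones quoted above: 100 -> 98 after one year, worn out at 10.
start, now, floor = 100, 98, 10

points_per_year = start - now                  # 2 wear points consumed per year
years_left = (now - floor) / points_per_year   # (98 - 10) / 2 = 44.0 years
print(years_left)
```

Even if the estimate is off by a factor of four, the drive still outlives the usual upgrade cycle, which is why the ">10 years" claim is conservative.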

                      (I'm gaining levels in thread derailment, btw.)

