How Old ATI GPUs Can Be Faster On Open Drivers


  • #31
    Originally posted by cookiecaper View Post
    They dumped docs in desperation during hard times hoping to buy loyalty from open-source purists without dedicating many resources to the thing or really supporting the OSS ecosystem, and I don't think that's the right behavior to encourage or reward. We want real support or real help, not lip service.
    They may or may not have been fueled by desperation when they did this, but you are forgetting your history.

    AMD did _exactly_ what the open source community asked. They released docs because they were told that, by doing so, open source devs would pick those docs up and run with them to write drivers.

    Well, as time has worn on, no magical army of devs who know how to write display drivers has rushed to the rescue.

    The ones that do exist (aren't some paid by AMD?) seem to be working hard to write solid drivers. They have put an emphasis on older hardware, since the closed drivers drop support for older cards as new models are released.

    I seem to recall reading that as they finish the older drivers, the newer ones will be developed faster because much of the code necessary will have already been written.



    • #32
      Originally posted by Nobu View Post
      Well, (sort-of off topic) nVidia may be stepping out of x86 (with their ARM venture, if you want to call it that), but we'll have to wait to see whether they support open driver development in that area. (and, of course, this will be a processor. But it may influence their graphics division, too)

      I'm excited to see what they produce (especially if they perform well and don't cost too much)...I just hope I can add an external video card, in case the on-board gpu is unsupported or slow.
      I suspect this nvidia "ARM move" will be more about the smartphone/tablet/netbook market than the general computer one.

      So far I don't think (haven't searched much, TBH) any ARM product has matched x86 performance-wise. Of course, someone might argue that the "average" user doesn't need a really powerful computer, but there is a portion of the market that does more than browsing, chatting, and doc editing on their computers, and no other arch has ever become direct competition for x86 in that market. Even in the server/high-performance market, x86 seems dominant.



      • #33
        Originally posted by Drago View Post
        LiquidAcid, dude, do the R600+ docs from AMD describe how to upload compressed textures to the GPU?
        The BCx formats (BC1 to BC3 were formerly called DXTn formats) are mentioned in the R700 ISA docs (page 342). Concerning texture upload, I don't know. Like I said, probably some alignment issue.



        • #34
          Originally posted by Qaridarium
          Germany/Europe does not have pure software patents, but we do have hardware-bound software patents, and S3TC is a hardware-bound feature.

          That means no US, German, or European dev can help you. We would need to support some devs in Cuba, the Toga islands, or China.
          So libtxc_dxtn is illegal?



          • #35
            Originally posted by Qaridarium
            Germany/Europe does not have pure software patents, but we do have hardware-bound software patents, and S3TC is a hardware-bound feature.
            However AMD/ATI is already paying license fees for S3TC since it's implemented in hardware. So there isn't actually a problem when feeding the GPU with pre-compressed textures.

            The problem is when textures have to be compressed or decompressed by the CPU in local memory, and that's where libtxc_dxtn comes into play.

            As long as you just handle pre-compressed data you don't even come near the compression/decompression algorithm. You just move bytes from A to B.



            • #36
              Originally posted by LiquidAcid View Post
              However AMD/ATI is already paying license fees for S3TC since it's implemented in hardware. So there isn't actually a problem when feeding the GPU with pre-compressed textures.

              The problem is when textures have to be compressed or decompressed by the CPU in local memory, and that's where libtxc_dxtn comes into play.

              As long as you just handle pre-compressed data you don't even come near the compression/decompression algorithm. You just move bytes from A to B.
              As I understand it, one does not need libtxc_dxtn, except maybe for software renderers. If a game uses compressed textures, feed them to VRAM and let the GPU conserve space. If the game doesn't use compressed textures, upload RGBA and don't conserve space. So what is the whole problem with that, and with the patents?



              • #37
                LiquidAcid, I can't find that document in http://www.x.org/docs/AMD/ . Any link would be appreciated.




                  • #39
                    Originally posted by Drago View Post
                    As I understand it, one does not need libtxc_dxtn, except maybe for software renderers. If a game uses compressed textures, feed them to VRAM and let the GPU conserve space. If the game doesn't use compressed textures, upload RGBA and don't conserve space. So what is the whole problem with that, and with the patents?
                    I believe that's essentially correct, and many games will work without the functionality provided by libtxc_dxtn. However, I don't think the drivers can actually expose the extension without it, and some apps check for it and fail if it isn't present. That's why some drivers let you override and report the extension as present even though it wasn't fully supported. There's a bug report to add that to r600g, but it hasn't been done yet.

                    Also, as far as I know, there should be no patent issues with just adding support for using libtxc_dxtn to mesa. It's the actual library, and the person using it, that would be legally problematic, not the support for it added to mesa.
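For anyone wanting to try the override described above: Mesa has historically honored a `force_s3tc_enable` environment variable that reports the S3TC extension as present even when libtxc_dxtn is not installed. A hedged sketch, assuming your Mesa build supports that variable (apps that upload pre-compressed textures then work; anything needing CPU-side (de)compression still fails):

```shell
# Advertise GL_EXT_texture_compression_s3tc even without libtxc_dxtn
# installed (only safe for apps that ship pre-compressed textures).
export force_s3tc_enable=true

# Then launch the game from this shell, or check the extension list, e.g.:
#   glxinfo | grep -i s3tc
```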



                    • #40
                      Originally posted by smitty3268 View Post
                      However, I don't think the drivers can actually expose the extension without it, and some apps check that and fail if it isn't present.
                      Exactly. You can't expose texture compression support in OpenGL without libtxc_dxtn, since you wouldn't cover the whole spec that way.

                      Basically, you need the driver to be able to both compress and decompress blocks. Keep in mind that this is already (fully?) implemented in both classic mesa and gallium:
                      src/mesa/main/texcompress_s3tc.c
                      src/gallium/auxiliary/util/u_format_s3tc.c

