AMD Releases Open-Source UVD Video Support


  • Originally posted by droidhacker View Post
    It will decode bluray JUST FINE. It just won't decode bluray ***3D*** (which is a retarded fad to begin with).
    On old UVD and UVD+ cards BD discs didn't run smoothly, so it's more of a hardware limitation.



    • Originally posted by jrch2k8 View Post
      1.) Good single-thread performance is always welcome, and the same is true for Windows, but I'm pretty sure that in many games (Resident Evil 5 / Crysis 1, for example) my old Phenom II X2 [no OC] reached at least 40 FPS pushing all the settings at 720p, though I'm sure some of that is pretty dependent on your setup.
      2.) For me Wine has reached close to 95% parity in many games like Lineage 2 Tauti, the last Half-Life title, or Devil May Cry 4, but I agree it can vary a lot depending on your system, and the same is true on Windows.
      3.) Nope, only r600g has an experimental LLVM backend, and no commercial driver has one either [LLVM, I mean]. Either way it's not that simple: Wine could implement a direct TGSI pass today and skip the conversion, but then only Gallium drivers would work; or it could use direct GPU ASM so any GPU would work, but probably at the cost of 5 million new LoC. Beyond all that you still need to parse and run basic security/stability checks [Windows does this too, at the driver level], and even then Windows drivers have a bazillion hacks to improve performance in selected games, due to ultra-crappy routines / spec violations / slow hand-written ASM that game studios never fix [especially Unreal]. So even if you literally grabbed the entire DX code and hooked it directly to the GPU, most games would still be slower than on Windows, simply because the driver wouldn't have the hand-made optimizations that Windows drivers have accumulated over the years.


      The rasterizer/synthesizer inside a GPU is unified for DirectX and OpenGL. What do you think is wrong with my proposal: extract the D3D compiler target library from a GPU's Windows driver and run DirectX via Wine without emulation. Just compile from D3D bytecode to D3D machine code and send that to the GPU. Then the hacks are up to the GPU driver, and I don't think they're missing there; they probably don't cut code from the rasterizer.

      As for Lineage 2: [email protected] and Nvidia Fermi [email protected] at medium settings: low-quality renderer = 60%, high-quality renderer = 30%, and not everything works.
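
      For anyone following this sub-thread, the difference between the two paths being argued about is roughly this: today Wine's wined3d turns D3D shader bytecode into GLSL source and hands that to the OpenGL driver's own compiler, while the proposal above is to skip that and emit GPU machine code directly. The toy C program below only sketches the first path; the opcode names, instruction struct and mappings are invented for illustration and are not Wine's actual code.

      #include <stdio.h>

      /* Toy illustration only -- not Wine's real opcodes or code. */
      enum d3d_op { D3D_OP_MOV, D3D_OP_MUL, D3D_OP_ADD };

      struct d3d_inst {
          enum d3d_op op;
          const char *dst, *src0, *src1;
      };

      /* Re-emit one "bytecode" instruction as a line of GLSL-like source,
       * which the GL driver's own compiler then turns into GPU code. */
      static void emit_glsl(const struct d3d_inst *ins)
      {
          switch (ins->op) {
          case D3D_OP_MOV: printf("%s = %s;\n", ins->dst, ins->src0); break;
          case D3D_OP_MUL: printf("%s = %s * %s;\n", ins->dst, ins->src0, ins->src1); break;
          case D3D_OP_ADD: printf("%s = %s + %s;\n", ins->dst, ins->src0, ins->src1); break;
          }
      }

      int main(void)
      {
          /* Roughly "mul r0, v0, c0" then "add oC0, r0, c1" in D3D shader asm. */
          const struct d3d_inst prog[] = {
              { D3D_OP_MUL, "r0",  "v0", "c0" },
              { D3D_OP_ADD, "oC0", "r0", "c1" },
          };
          for (size_t i = 0; i < sizeof(prog) / sizeof(prog[0]); i++)
              emit_glsl(&prog[i]);
          return 0;
      }

      The direct-to-GPU idea would replace emit_glsl() with something that emits the GPU's own instruction encoding, which is exactly the part that lives inside the vendor driver.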



      • Originally posted by curaga View Post
        Sure it does, some acceleration techniques prevent subtitles from being used at all (old-style video overlay - you can't read it back, you can't render over it).
        You don't need to copy back the data for an overlay. And it has nothing to do with the hardware or how the hardware decodes the video.
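
        On the subtitle point: the usual way to avoid any readback is to composite the subtitle bitmap into (or over) the frame on its way to the overlay, so nothing is ever read back from the hardware. A minimal sketch of that idea, assuming an ARGB32 subtitle bitmap and an XRGB32 frame already in system memory (the function and formats here are made up for the example, not any particular player's API):

        #include <stdint.h>

        /* Blend an ARGB32 subtitle bitmap onto an XRGB32 video frame at (x, y)
         * before the frame is handed to the overlay/display path; nothing is
         * read back from the GPU or the overlay. */
        static void blend_subtitle(uint32_t *frame, int frame_w, int frame_h,
                                   const uint32_t *sub, int sub_w, int sub_h,
                                   int x, int y)
        {
            for (int row = 0; row < sub_h && y + row < frame_h; row++) {
                for (int col = 0; col < sub_w && x + col < frame_w; col++) {
                    uint32_t s = sub[row * sub_w + col];
                    uint32_t a = s >> 24;                         /* subtitle alpha */
                    uint32_t *d = &frame[(y + row) * frame_w + (x + col)];
                    uint32_t out = 0;

                    for (int shift = 0; shift < 24; shift += 8) { /* B, G, R channels */
                        uint32_t sc = (s >> shift) & 0xff;
                        uint32_t dc = (*d >> shift) & 0xff;
                        out |= ((sc * a + dc * (255 - a)) / 255) << shift;
                    }
                    *d = out;
                }
            }
        }

        int main(void)
        {
            uint32_t frame[4 * 4] = { 0 };     /* black 4x4 "video frame" */
            uint32_t sub[2 * 2];               /* half-transparent white subtitle */

            for (int i = 0; i < 4; i++)
                sub[i] = 0x80ffffffu;

            blend_subtitle(frame, 4, 4, sub, 2, 2, 1, 1);
            return frame[1 * 4 + 1] ? 0 : 1;   /* 0 if the blended pixel changed */
        }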



        • Originally posted by artivision View Post
          The rasterizer/synthesizer inside a GPU is unified for DirectX and OpenGL. What do you think is wrong with my proposal: extract the D3D compiler target library from a GPU's Windows driver and run DirectX via Wine without emulation. Just compile from D3D bytecode to D3D machine code and send that to the GPU. Then the hacks are up to the GPU driver, and I don't think they're missing there; they probably don't cut code from the rasterizer.
          There's more to a graphics driver (DX, GL, whatever) than just the shader compiler. You would need to write the rest of the DX driver as well (or at least emulate it over OpenGL via Wine as I believe is done today).
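
          To make the point concrete: the shader compiler is only one entry point among many that a user-mode D3D (or GL) driver has to provide. The struct below is purely hypothetical, a sketch rather than the real Gallium or D3D driver interface, but it shows the kind of surface area that would still have to be written from scratch or mapped onto OpenGL the way Wine does today.

          #include <stddef.h>

          /* Hypothetical sketch -- not a real driver interface. */
          struct toy_3d_driver {
              /* The piece the "just reuse the Windows shader compiler" idea covers: */
              void *(*compile_shader)(const void *bytecode, size_t len);

              /* Everything else a D3D runtime still expects from the driver: */
              void (*bind_shaders)(void *vs, void *ps);
              void (*set_render_targets)(void *color, void *depth);
              void (*set_blend_depth_raster_state)(const void *state);
              void (*create_and_upload_resource)(const void *data, size_t len, void **out_buf);
              void (*set_vertex_buffers)(void *const *bufs, unsigned count);
              void (*draw)(unsigned first_vertex, unsigned vertex_count);
              void (*present)(void *swapchain);
          };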



          • Originally posted by Deathsimple View Post
            Thx for the reminder, changing the page right now. Edit: OK, not changing this, because I can't remember my password for it (and the register server seems to be down).

            RS780 and RS880 are both UVD2, but very early implementations (some features missing, a lot of hardware bugs that need workarounds, etc.). I have an RS880-based laptop and I'm still working on supporting those (I already have the firmware booting, but they have a creepy problem with the memory controller that I can't seem to fix).

            Christian.
            Thanks!
            I'd be willing to test RS780 support when you think it's closer.
            What kernel tree/version would one need? radeon_drm git?



            • Originally posted by artivision View Post
              The rasterizer/synthesizer inside a GPU is unified for DirectX and OpenGL. What do you think is wrong with my proposal: extract the D3D compiler target library from a GPU's Windows driver and run DirectX via Wine without emulation. Just compile from D3D bytecode to D3D machine code and send that to the GPU. Then the hacks are up to the GPU driver, and I don't think they're missing there; they probably don't cut code from the rasterizer.

              As for Lineage 2: [email protected] and Nvidia Fermi [email protected] at medium settings: low-quality renderer = 60%, high-quality renderer = 30%, and not everything works.
              I use high on all settings + new shaders + HDR C-class rendering + all effects including FOV and reflections + 2x MSAA, on an AMD FX 6100 (stock) with a Radeon 4850X2 2GB [no CrossFire with fglrx] [7770 OC coming Friday] and 16GB RAM, and I get 90% of Win7 SP1 performance, and in sieges 110% [for some reason I don't yet fully understand I have a lot less graphical lag in Wine than in Win7]. Note that with r600g it's around 50-60% [mesa/drm/kernel/xorg/wine git on Gentoo x86_64]. I have an OCZ Vertex 4 128GB SSD shared by both. Btw, absolutely every option works perfectly for me and they are all active, including system messages and on-screen damage info.

              The same applies for me with StarCraft II: Wings of Liberty in campaign mode (the map editor is actually slow). You have something wrong in your setup if Lineage 2 doesn't run flawlessly and fast on your PC.



              • Originally posted by bridgman View Post
                There's more to a graphics driver (DX, GL, whatever) than just the shader compiler. You would need to write the rest of the DX driver as well (or at least emulate it over OpenGL via Wine as I believe is done today).

                Wine doesn't emulate any GPU driver functionality. It just translates HLSL bytecode to GLSL. I can also call you a liar, because some months ago you posted, in reply to a comment of mine, that your driver is unified for D3D and OGL and of the same quality as your competitor's. You probably don't know either.



                • Originally posted by przemoli View Post
                  Serious Sam works on r600g same as on Catalyst at least for my humble 5730M. (Bad in both cases :P)
                  Well, at least on Linux my HD 5750 is not fast enough either... Catalyst 13.3.
                  But that is a driver problem for the most part.

                  I am having fun, but I had to turn down the resolution and use CPU frequency utilities to set the kernel governor to performance (a sketch of that is at the end of this post).

                  I don't like to admit it, but it runs better under DirectX on the same hardware (dual-boot PC, Win7 and openSUSE 12.3; I copied the save games from Linux to Windows to test this).

                  We can only hope they will improve the two Catalyst drivers for Linux, or that the open-source driver becomes really good for newer hardware too.

                  Still, I am happy AMD did release UVD video support. It's more than Nvidia is doing.

                  But sadly, for gaming on Linux I currently wouldn't get an AMD graphics card.
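
                  Since setting the governor came up: on most Linux systems the cpufreq governor is just a per-CPU sysfs file, and distributions wrap it in tools like cpupower. The snippet below is a minimal sketch assuming the usual /sys/devices/system/cpu/cpuN/cpufreq/scaling_governor layout (run as root); treat it as illustration rather than the recommended tool.

                  #include <stdio.h>

                  /* Minimal sketch: switch every CPU's cpufreq governor to "performance".
                   * Assumes the standard sysfs cpufreq layout and needs root to write. */
                  int main(void)
                  {
                      char path[128];
                      int cpu, changed = 0;

                      for (cpu = 0; cpu < 256; cpu++) {
                          snprintf(path, sizeof(path),
                                   "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_governor", cpu);
                          FILE *f = fopen(path, "w");
                          if (!f)
                              break;          /* no such CPU (or no permission): stop */
                          fputs("performance\n", f);
                          fclose(f);
                          changed++;
                      }

                      printf("governor set to performance on %d CPU(s)\n", changed);
                      return changed ? 0 : 1;
                  }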



                  • Originally posted by artivision View Post
                    The rasterizer/synthesizer inside a GPU is unified for DirectX and OpenGL. What do you think is wrong with my proposal: extract the D3D compiler target library from a GPU's Windows driver and run DirectX via Wine without emulation. Just compile from D3D bytecode to D3D machine code and send that to the GPU. Then the hacks are up to the GPU driver, and I don't think they're missing there; they probably don't cut code from the rasterizer.
                    1.) Btw, you most likely can't do that without a kernel module, since Linux doesn't allow direct hardware access the way Windows does; user space goes through the kernel's DRM interface instead (see the sketch after this list).
                    2.) I really doubt anyone will actually implement it, since it would require a massive amount of work for not much gain in the end, and the security risks of doing so are big enough.
                    3.) This will probably require additional driver support and a protocol to do it, which will probably take a long time, assuming Nvidia/AMD even care to do so [I bet Gallium just won't, at all].
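
                    On point 1: user space on Linux reaches the GPU through the kernel's DRM device nodes and ioctls rather than touching the hardware directly. The sketch below only queries the driver name through the stock DRM_IOCTL_VERSION ioctl, assuming a /dev/dri/card0 node and the kernel's drm uapi headers are present; a real command-submission path would sit on top of driver-specific ioctls like this.

                    #include <stdio.h>
                    #include <string.h>
                    #include <fcntl.h>
                    #include <unistd.h>
                    #include <sys/ioctl.h>
                    #include <drm/drm.h>   /* kernel uapi header; with libdrm it's <drm.h> via pkg-config */

                    /* Minimal sketch: open the first DRM node and ask the kernel which
                     * driver (e.g. "radeon") is behind it. Everything a user-space 3D
                     * driver does ultimately funnels through ioctls on a node like this. */
                    int main(void)
                    {
                        int fd = open("/dev/dri/card0", O_RDWR);
                        if (fd < 0) {
                            perror("open /dev/dri/card0");
                            return 1;
                        }

                        char name[64] = { 0 };
                        struct drm_version ver;
                        memset(&ver, 0, sizeof(ver));
                        ver.name = name;
                        ver.name_len = sizeof(name) - 1;

                        if (ioctl(fd, DRM_IOCTL_VERSION, &ver) == 0)
                            printf("DRM driver: %s %d.%d.%d\n", name,
                                   ver.version_major, ver.version_minor, ver.version_patchlevel);
                        else
                            perror("DRM_IOCTL_VERSION");

                        close(fd);
                        return 0;
                    }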



                    • Originally posted by phoronix View Post
                      Phoronix: AMD Releases Open-Source UVD Video Support
                      FUCK YEAH!

                      Now just get that powersaving code that is already written and working past technical review and we're rocking!!!

                      w00t w00t!
