Open ATI Driver More Popular Than Catalyst


  • #61
    Originally posted by Michael View Post
    Kano, every driver has its problems and bugs that go unresolved for long periods of time... Hell, I still have an NVIDIA driver bug that's been open now for about four years or so concerning CoolBits and Xinerama.
    I enable CoolBits by default, but I basically rarely use it. Xinerama I do not use at all. I don't know what the exact problem is in your case, but the GPUs definitely behave differently when more than one monitor is used: the GPU/VRAM is then clocked higher even in idle mode, which also leads to (much) higher energy consumption.
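    (For reference, and only as an illustration: enabling CoolBits on the proprietary nvidia driver is an xorg.conf option along the lines below; the exact bit values that are accepted depend on the driver version.)

```
# xorg.conf Device section - assumes the proprietary nvidia driver
Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    Option     "Coolbits" "1"    # exposes the clock controls in nvidia-settings
EndSection
```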

    Of course it is nice when overclocking works, but the gains from a GPU overclock are usually pretty small.

    RENDERING problems - especially regressions - should be much more important than overclocking in this particular case.

    As I think you are very interested in automated regression tests, maybe take a deeper look at Wine's test suite:



    When you run the same suite against different driver releases and see differences, you can easily identify problems too.
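    As a rough sketch of that comparison (nothing Wine-specific here; the result-file format and paths are made up), something like this would flag tests that passed on one driver release and fail on another:

```python
#!/usr/bin/env python3
# Toy regression diff: each input file is assumed to hold one
# "test_name PASS|FAIL" pair per line, one file per driver release.
import sys

def load_results(path):
    results = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 2:
                results[parts[0]] = parts[1]
    return results

old = load_results(sys.argv[1])  # e.g. results-driver-9.8.txt (hypothetical)
new = load_results(sys.argv[2])  # e.g. results-driver-9.9.txt (hypothetical)

for test, status in sorted(new.items()):
    if old.get(test) == "PASS" and status == "FAIL":
        print("REGRESSION:", test)
```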

    Comment


    • #62
      I assume ATI's plan is to go 100% over to the open source driver eventually?

      Comment


      • #63
        Originally posted by kersurk View Post
        I assume ATI's plan is to go 100% over to the open source driver eventually?
        Doubtful. Probably the aim is to provide an open source alternative, but there are proprietary bits & pieces that ATI will continue to support that can't go open source (third-party stuff that's included, such as S3TC).

        Comment


        • #64
          Originally posted by deanjo View Post
          No vsync? What are you smoking? Seriously? Vsync works PERFECTLY fine on NVIDIA cards. Turn it on. Not the case on ATI cards. 'Flicker free' booting: put the appropriate mode line in grub; flickering is a problem for old CRTs. In this day and age of LCDs there is no real 'flicker' to be appreciated. I spend more time watching tear-free accelerated video than I do rebooting and changing resolutions to non-native ones (perhaps it's needed more often with FOSS setups, with them giving shitty performance and having to change resolutions down to a lower value), and with portables I care more about a working sleep state than about the rebooting process (again, advantage NVIDIA).
          You're not getting it... this isn't about <60 Hz flickering, this is about switching resolutions four times between pressing the power button and reaching your desktop.
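          (For reference, the "appropriate mode line in grub" part usually means something like the snippet below on a GRUB 2 setup; the resolution is only an example, and older GRUB legacy installs use a vga= kernel parameter instead.)

```
# /etc/default/grub -- pick the panel's native resolution
GRUB_GFXMODE=1920x1080
# keep that framebuffer mode when handing over to the kernel
GRUB_GFXPAYLOAD_LINUX=keep
# then regenerate the config, e.g. with update-grub on Debian/Ubuntu
```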

          Comment


          • #65
            I bought an HD 3870 last year mainly because of ATI's open source effort. I thought that if I needed 3D performance I could go back to fglrx. Then ATI stated that they don't support the upstream X.Org or kernel, just Ubuntu, RHEL, etc. That's kind of a problem for me because I use Arch Linux.

            Now I use xf86-video-ati and I'm quite pleased. I don't use Compiz or the other compositing things; they are useless and cause all kinds of problems. I would like to play Neverwinter Nights, but that's not a huge problem. Even if I could get fglrx to work on my machine, there is a bug somewhere that prevents me from playing NWN (a huge memory leak). I bought a PS3 for gaming; almost every game these days is designed for consoles.

            P.S. My English probably sucks, but bear with me!

            Comment


            • #66
              I guess if you used a 25 € card you would be as pleased as you are now, or even onboard graphics would be suited for you. A PS3 for games is certainly the best way to get rid of driver issues without needing to pay an MS fee.

              Comment


              • #67
                Originally posted by Qaridarium
                id Tech 5 is a bullshit backward engine only for bullshit Xbox 360/PlayStation 3 hardware with only 256 MB/512 MB of RAM.

                id Tech 5 is the worst thing I have ever read about...
                Lol you are an idiot. Yes pun intended.

                Having an engine with awesome lighting, virtually unlimited texture detail and variety, and unlimited geometry, which still doesn't suck up major RAM, kicks ass.

                What else do you want from a game engine? It's an engine without limits for game designers. No matter how far you zoom in with your sniper rifle, every frame rendered has about the same detail.

                Nothing comes close to id Tech 5 by a long shot. This engine is to other engines what OpenGL Quake 1 was to the software rendering engines back in the day.

                Comment


                • #68
                  I use an NVIDIA GTX 260 with the nvidia driver and a Radeon 4870 with fglrx. Both run Debian Sid 64-bit.

                  My only issue now is that I cannot get compositing working on fglrx.

                  Otherwise, I'm happy with both. I occasionally dual boot the Radeon machine into another OS.



                  Comment


                  • #69
                    Originally posted by Kano View Post
                    I guess if you used a 25 € card you would be as pleased as you are now, or even onboard graphics would be suited for you. A PS3 for games is certainly the best way to get rid of driver issues without needing to pay an MS fee.
                    Well, I guess you are right. I should have bought a motherboard with Intel graphics, but I hope all those ATI open source devs can get that thought out of my mind.

                    Comment


                    • #70
                      Originally posted by smitty3268 View Post
                      Is VLIW support something Itanium needs? Just wondering if there's any chance of someone else helping out with it or if it will need to come from the radeon developers to make it happen.
                      AFAIK the Itanium stack outputs assembler source code for a single-issue version of the CPU, then custom code in the assembler takes care of packing multiple operations into a single instruction group.
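                      Just to illustrate the idea of that packing step (this is not the actual Itanium assembler logic, only a toy model of grouping independent single-issue ops into fixed-width bundles):

```python
# Toy VLIW packer: start a new bundle whenever an op reads a register
# written earlier in the current bundle, or the bundle is full.
# Each op is (dest_register, [source_registers]).
BUNDLE_WIDTH = 3  # hypothetical issue width

def pack_bundles(ops):
    bundles, current, written = [], [], set()
    for dest, srcs in ops:
        if any(s in written for s in srcs) or len(current) == BUNDLE_WIDTH:
            bundles.append(current)
            current, written = [], set()
        current.append((dest, srcs))
        written.add(dest)
    if current:
        bundles.append(current)
    return bundles

# r1 and r2 are independent, so they share a bundle; r3 needs both,
# so it starts a new one.
print(pack_bundles([("r1", ["r0"]), ("r2", ["r0"]), ("r3", ["r1", "r2"])]))
```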

                      Originally posted by smitty3268 View Post
                      It probably doesn't make a whole lot of sense to drop the current compiler while you are still able to share it with the non-gallium driver, but I do hope this gets worked on eventually when gallium gets more mature. I really don't think the custom code in the drivers is going to be able to compete with the optimizations created by a dedicated compiler team.
                      Remember that shader compilation happens in two stages -- the upper levels of Mesa parse the app's shader program, then optimize and convert to a standard IL (TGSI for Gallium3D). The compiler in the driver handles the second stage - conversion from IL/TGSI to native hardware instructions.

                      One school of thought is that the bulk of the generic optimizations can be done in the first stage (before IL is generated), and that the optimizations done in the second stage will be highly specific to the actual hardware anyways. Nothing is ever 100%, of course, but the current thinking is that LLVM can be more useful in the first stage than the second, hardware-specific stage. I believe the plan is still to use LLVM in the first stage of the OpenCL stack, for example.
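                      As a very hand-wavy picture of that split (nothing below is real Mesa code; it just marks where the boundary between the generic stage and the hardware-specific stage sits):

```python
# Stage 1 (generic, lives in Mesa): parse the shader, run generic
# optimizations, and emit a hardware-independent IL (standing in for TGSI).
def frontend_compile(expr):
    # toy "parser": only understands the pattern "a*b + c"
    mul, add = expr.split("+")
    a, b = mul.split("*")
    return [("MUL", "t0", a.strip(), b.strip()),
            ("ADD", "out", "t0", add.strip())]

# Stage 2 (driver-specific backend): translate IL into "native"
# instructions, making hardware-specific choices such as fusing
# a multiply-add into a single MAD instruction.
def backend_compile(il):
    if [op for op, *_ in il] == ["MUL", "ADD"]:
        mul, add = il
        return [("MAD", "out", mul[2], mul[3], add[3])]
    return il

print(backend_compile(frontend_compile("a*b + c")))
# -> [('MAD', 'out', 'a', 'b', 'c')]
```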


                      Comment
