Whoops, ATI's Evergreen Will Bring A New Driver


  • #71
    Originally posted by smitty3268 View Post
    You guys would never have even heard of this if bridgman hadn't announced it, it just would have been another branch of code that came and went silently.
    Bingo... and I wouldn't have said anything about it if I hadn't just finished telling the mesa dev list folks that we were going to do something different an hour before we had the latest meeting.

    I kinda felt it was important to tell them what was going on, since the implication was that the 7xx-to-Evergreen transition might end up being a good opportunity for jumping to Gallium3D despite having said the exact opposite an hour earlier.


    • #72
      Originally posted by bridgman View Post
      Bingo... and I wouldn't have said anything about it if I hadn't just finished telling the mesa dev list folks that we were going to do something different an hour before we had the latest meeting.

      I kinda felt it was important to tell them what was going on, since the implication was that the 7xx-to-Evergreen transition might end up being a good opportunity for jumping to Gallium3D despite having said the exact opposite an hour earlier
      What's with the Evergreen codename anyway? Is it the first one that AMD named instead of ATI? Or do you guys just suddenly hate numbers now? It seems like it would be a lot simpler to just keep having increasing numbers; now we have to try and remember which names come before others - yuck.



      • #73
        No, we just like trees... and islands...


        • #74
          Originally posted by salva84 View Post
          You are right, I have an ATI 5850 and the fglrx drivers are really slow, even scrolling web pages in Firefox... and the open source drivers are always "coming"... For me, the solution I've found is to run Linux in a VirtualBox machine inside Windows 7. No more problems with graphics or suspend/hibernate, I can make a backup of the whole system, and I don't have to reboot to play some games.
          See, I know exactly what you mean. I have been a loyal ATI user since the Radeon 9700 days. I started converting myself to a full-time Linux user in the Radeon X1900 era (3~4 years ago?), and the experience has always been like what you said: fglrx has always been crap - endless crashing, slowdowns, corruption, the watermark?! And the OSS driver at that time did not even support 2D acceleration on my X1900, so I waited and waited and waited until I sold my X1900, and it still didn't work in Linux.

          My next purchase, the most regretful one, was the HD 2900 PRO. I could have gone to nVidia and been happy once and for all, but the 2900 dragged me into the same loop. Fglrx after two years was still just like before - slow scrolling, the card fan firing up at full speed, lockups and shit - and then they got the X1000 series working in the OSS driver! But by then I had upgraded to the HD 2000 family, which the OSS driver still did not support. So another year of waiting. And this keeps on going with my HD 4850.

          You see, you can never get out of this waiting loop unless you jump ship to use something else. You can either use new nVidia cards, or you can use ATI cards that are 5 years old. But a new ATI card on Linux? Come on, that's never gonna happen.



          • #75
            Yeah, but if you jump ship you get a new card with a two-year-plus-old architecture. At least with ATI the drivers get better with age, like a fine wine; with nvidia, well, as Barack says, "You can put lipstick on a pig".



            • #76
              Originally posted by jamey0824 View Post
              as Barack says, "You can put lipstick on a pig".
              Have you ever tried that? It's not easy.



              • #77
                Originally posted by curaga View Post
                V!ncent, if you want to prove you have accel, post Xorg.0.log instead.
                I never said I had accel... I just said it was working with 2D.

                I was hoping that, by posting that info, someone would tell me whether it was accelerating 2D.

                Judging by the massive spikes in CPU usage, it is probably just framebuffer support.



                • #78
                  Yep, if you're using the radeon driver then you have modesetting (either UMS or KMS) and shadowfb acceleration.

                  Shadowfb is surprisingly fast though, since the software rendering happens in CPU memory, which from a CPU's point of view is maybe 100x as fast as video memory; the shadow frame buffer (in CPU memory) is then periodically copied to the real frame buffer.
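
                  Something like this, roughly (a minimal sketch of the idea, not the actual radeon/xf86 code; the buffer sizes and function names here are invented): drawing goes into a buffer in ordinary system RAM, and a periodic flush copies only the dirty rows into the memory-mapped real framebuffer.

                  #include <stdint.h>
                  #include <string.h>

                  #define WIDTH  1024
                  #define HEIGHT  768

                  static uint32_t shadow[HEIGHT][WIDTH];  /* shadow copy in cached system RAM */
                  static uint32_t *real_fb;               /* would come from mmap()ing video memory */

                  static int dirty_top = HEIGHT, dirty_bottom = -1;

                  /* All software rendering goes to the shadow copy, so reads and
                   * read-modify-writes stay in fast, cached CPU memory. */
                  static void put_pixel(int x, int y, uint32_t argb)
                  {
                      shadow[y][x] = argb;
                      if (y < dirty_top)    dirty_top = y;
                      if (y > dirty_bottom) dirty_bottom = y;
                  }

                  /* Called periodically: stream only the touched rows into video memory
                   * in one linear pass, then reset the dirty region. */
                  static void flush_shadow(void)
                  {
                      for (int y = dirty_top; y <= dirty_bottom; y++)
                          memcpy(&real_fb[y * WIDTH], shadow[y], WIDTH * sizeof(uint32_t));
                      dirty_top = HEIGHT;
                      dirty_bottom = -1;
                  }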


                  • #79
                    Originally posted by bridgman View Post
                    Bingo... and I wouldn't have said anything about it if I hadn't just finished telling the mesa dev list folks that we were going to do something different an hour before we had the latest meeting.

                    I kinda felt it was important to tell them what was going on, since the implication was that the 7xx-to-Evergreen transition might end up being a good opportunity for jumping to Gallium3D despite having said the exact opposite an hour earlier
                    I am with Bridgman here. Just try to jump to Gallium3D, since that will help evolve the whole stack, not just one single driver (Ever).



                    • #80
                      Either way you end up copying to the graphics card... Does this mean that a dedicated framebuffer is not necessarily faster than a CPU, but is only good for offloading calculation power? And that the CPU is only faster if you do nothing else but blit a bunch of pixmaps? In the sense that, if you do more than that, it would be faster to let the graphics card handle the blitting, because then you don't have to move stuff back and forth between the CPU and the GPU and you avoid massive latency?
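
                      To put the question in code (a hypothetical sketch; the command layout and names are invented, not any real hardware interface): in the software path the CPU touches and transfers every pixel, while in the accelerated path it only queues a tiny blit command and the GPU moves the data within its own memory.

                      #include <stddef.h>
                      #include <stdint.h>
                      #include <string.h>

                      /* Software path: the CPU itself reads/writes every pixel, so the
                       * whole rectangle has to travel across the bus into video memory. */
                      static void sw_copy_rect(uint32_t *dst_fb, int fb_pitch,
                                               const uint32_t *src,
                                               int x, int y, int w, int h)
                      {
                          for (int row = 0; row < h; row++)
                              memcpy(&dst_fb[(size_t)(y + row) * fb_pitch + x],
                                     &src[(size_t)row * w],
                                     (size_t)w * sizeof(uint32_t));
                      }

                      /* Accelerated path: the CPU only writes one small command into a
                       * ring buffer; the GPU then moves the pixels inside its own memory,
                       * so the CPU/bus cost stays constant however big the rectangle is. */
                      struct blit_cmd {
                          uint32_t op;                      /* 0x1 = COPY, made up here */
                          uint32_t src_offset, dst_offset;  /* offsets into video memory */
                          uint32_t width, height;
                      };

                      static void gpu_copy_rect(struct blit_cmd *ring, size_t *tail,
                                                uint32_t src_off, uint32_t dst_off,
                                                uint32_t w, uint32_t h)
                      {
                          ring[(*tail)++] = (struct blit_cmd){
                              .op = 0x1, .src_offset = src_off, .dst_offset = dst_off,
                              .width = w, .height = h
                          };
                          /* A real driver would now poke a doorbell register and later
                           * wait on a fence; the point is how little the CPU has to do. */
                      }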

