R600 Gallium3D Shader Compiler Milestone Hit

  • #31
    I didn't spend 300 bucks on a graphics card to play Mass Effect at 40 FPS. That would be a total waste of money. I play in Windows, VSynced to a constant 60 FPS. Also, Wine *always* has some glitches; it's never perfect. When people buy a high-performance gaming card, they don't do it so it can go to waste; they do it to get the best possible gaming performance for their money.

    It's just not a gaming platform, period. And the majority of people know it.

    And this is about the open source drivers anyway. Play Mass Effect and Dragon Age on those.

    • #32
      Originally posted by bugmenot View Post
      I play Mass Effect under Wine. Other than some issues with the mouse (which can be worked around with hacks to Wine), the game works fine and I get 40+ FPS.
      I also play Oblivion, Dragon Age: Origins, The Last Remnant, Velvet Assassin, and Fallout 3, none of which are using "last-decade graphics". I'm probably somewhat in the minority, but Linux users don't just play old games.
      Wow, I didn't know that all these games would work with the r600 driver. I just tried Oblivion and it indeed runs. I'll have to wait for Evergreen support though, because the game is a little too much for my HD 3200 IGP.
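
      By the way, for anyone unsure which driver their games are actually hitting, here's a quick terminal check. This assumes the standard `glxinfo` utility (from the mesa-utils package on most distros) is installed; the exact renderer string varies by card and Mesa version:

      ```shell
      # Check which OpenGL driver is actually in use; a Mesa r600 Gallium
      # setup typically reports "Gallium" in the renderer string, while a
      # software fallback shows "Software Rasterizer" or "llvmpipe".
      glxinfo | grep -E "OpenGL (vendor|renderer|version) string"

      # Also confirm direct rendering is on; indirect rendering is far slower.
      glxinfo | grep "direct rendering"
      ```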

      • #33
        Originally posted by monraaf View Post
        Wow, I didn't know that all these games would work with the r600 driver. I just tried Oblivion and it indeed runs. I'll have to wait for Evergreen support though, because the game is a little too much for my HD 3200 IGP.
        I don't own an ATI card, and so haven't tested the open source drivers; I'm using an NVIDIA card with the NVIDIA binary drivers. My point was that some people do care about 3D graphics performance. Whether or not it matters with Wine at this point, I'm not sure. If Wine is already bottlenecking the card, would the graphics driver's performance hit make things worse? I guess it would depend on which parts of the driver are causing slowdowns.

        I agree that completeness and stability are probably the most important concerns, but I do hope the drivers can eventually get to 80-90% of Windows performance. The fact is, I will stick with NVIDIA cards until this is true.

        As for Windows, I like to jump in and out of other tasks while I'm playing games. I would rather be in my preferred OS most of the time, rather than having to deal with Windows. I do keep a Windows installation around, but I only find myself using it for a short but high-quality game that can't be run in Wine. RPGs are out; otherwise I'd be stuck in Windows all the time.

        • #34
          Has anyone tried running the piglit tests?

          World of Warcraft almost runs on my HD4890 with a 2.6.33 kernel on F12, but crashes shortly after a call to glPixelStorei() (apparently). I'd like to gather more information about this bug on my PC's particular graphics stack, and I'm hoping that the "piglit" tests might have enough coverage to reveal something interesting. However, these tests seem to have been written by graphics developers for other graphics developers! Does anyone have any experience running these things, please? Is there anything I need to be aware of that might totally invalidate any results I might get?

          I have a nasty suspicion that the bugs I am seeing are likely to remain unfixed for a long, long time unless I can construct a meaningful report for them...
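
          In case it helps, this is roughly how I understand a run works. A sketch only: the script names and the `quick.tests` profile are what I'd expect from a piglit git checkout of this era, and may differ in your tree:

          ```shell
          # From the top of a built piglit checkout: run the "quick" test
          # profile and write per-test results into results/r600-quick.
          ./piglit-run.py tests/quick.tests results/r600-quick

          # Turn the raw results into a browsable HTML summary
          # (pass/fail/crash per test) under summary/r600-quick.
          ./piglit-summary-html.py summary/r600-quick results/r600-quick
          ```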

          • #35
            Originally posted by RealNC View Post
            The only thing that saves Linux in this regard is that people don't actually use it to play games. And they don't expect it to play games either. That's why poor graphics performance is OK with most users.

            (Note: with "games" I mean real games, like Assassin's Creed, Mass Effect and stuff that runs best on Windows and Consoles, not some amateur or old games with last-decade graphics.)
            What bull. There are also proprietary graphics drivers on Linux that give performance comparable to Windows.

            • #36
              Originally posted by RealNC View Post
              (Note: with "games" I mean real games, like Assassin's Creed, Mass Effect and stuff that runs best on Windows and Consoles, not some amateur or old games with last-decade graphics.)
              Rofl... you call these real games? Have a look at this: http://www.youtube.com/watch?v=n7VAhzPcZ-s

              • #37
                Well I for one think that this is perfectly OK.

                The main thing is that there is some *decent* out of the box "stuff just works" going on. If some particular user requires balls-to-the-wall performance, they still have access to the proprietary blob, which apparently isn't going away.

                Remember that effort tends to follow demand... so if most users don't demand perfect/optimal performance, the final performance optimizations might get neglected. If *everybody NEEDS* the best performance, then there will be more focus on getting it optimized.

                I figure that with *decent* 3D performance, along with the other things *typical* users need, i.e. decent 2D performance and some H.264 decoding, the VAST MAJORITY of users will be so pleased that they will ignore the blob.

                I would really like to know what proportion of users buy a top-end graphics card for a few hundred dollars vs. something more modest in the $50-$100 range... not just Linux users, but windoze users as well. I suspect that VERY FEW people actually buy the latest and greatest GPU. I know that I don't... I focus on features like having no fans and having dual DVI.

                • #38
                  Besides, how many people are willing to pay $20 extra to make sure that their hardware is supported under Linux? Quite a few.

                  I have no problem with paying $20 more for a slightly stronger video card, if that means that I have fully open drivers which are integrated into my distribution from the first day, which follow kernel and X and Mesa updates naturally, and won't leave me stranded with an incompatible blob, like the binary drivers do. Yes, nVidia drivers too.

                  And people who would rather run Windows or a binary blob, just to save $20, aren't the target audience for open source drivers in the first place.

                  With that in mind -- get power saving working and into the Linux kernel, then get the missing features in place (OGL3, video decoding), then optimise away. There are many users who appreciate the open source efforts and will support AMD because of it. Keep up the good work!

                  • #39
                    If ATI were to come out with a fully Gallium3D-supported card at 65% of the blob's performance, with the following finished:
                    -OpenGL 3.2 (don't care if mesa or not)
                    -Suspend to RAM support
                    -Power saving
                    -Crossfire equivalent (or OpenGL for one card and the rest of the State Tracker goodness for the other card would be even better!)

                    I would pay for a 200-300 euro/dollar ATI card in a heartbeat!

                    A massive plus if:
                    -Accelerated video playback
                    -Accelerated audio playback
                    -SVG

                    Hear me AMD: I'd buy it right now!

                    • #40
                      PS: And GLSL is a requirement too, but that's OpenGL to me...
