Radeon 3D Performance: Gallium3D vs. Classic Mesa


  • #41
    Now that test looks far better than the other one (NVIDIA blob vs. free ATI), and it brings some interesting results, like the low but stable framerate.
    Stop TCPA, stupid software patents and corrupt politicians!

  • #42
    Are there any plans to switch to r300g as the default driver in, let's say, Mesa 7.9?

  • #43
    How far along is the Gallium3D development for r6xx and newer cards?

    These tests look very promising, especially the upper limit on the fps for the Gallium3D driver. Is that due to vsync?

  • #44
    Originally posted by tball View Post
    How far along is the Gallium3D development for r6xx and newer cards?
    I think glisse's blog shows the current state - the driver and shader compiler framework are in place and at least one triangle has been drawn successfully:

    http://jglisse.livejournal.com/

  • #45
    Originally posted by bridgman View Post
    I think glisse's blog shows the current state - the driver and shader compiler framework are in place and at least one triangle has been drawn successfully:

    http://jglisse.livejournal.com/
    Looks great. Thx for the update. :-)

    I don't know if Glisse wants to comment on whether he has gotten any further?

  • #46
    R700 classic Mesa vs. Catalyst 10.4 testing finished up this morning... I think this article is in the queue for Monday.
    Michael Larabel
    https://www.michaellarabel.com/

  • #47
    You can watch the commits yourself, just keep hitting "refresh"

    Looks like a bunch of compiler changes since the last blog post.

  • #48
    Originally posted by garytr24 View Post
    Easy. Input lag. 1 second/60 frames = 16ms. 1/120 = 8ms.
    I am fairly sure that your body doesn't move that quickly. The time it takes to slightly reposition and click a mouse on a target is greater than 16ms by a decent margin. Not to mention the input system latency.

    When you add up all the other latencies in the entire equation -- from your ability to perceive events, to signaling them to the system, from hardware to software, to the software processing, to the response signals, to the hardware doing its job, to you being able to perceive them again -- the latencies are hitting anywhere from 66ms to 300ms on a modern system. A good PC with a high-quality CRT will be the best and a console on a plasma TV will be the worst, but even most PC systems are sloshing through over 100ms of response lag you apparently weren't even aware of.
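    To put rough numbers on that, here is a back-of-the-envelope sketch in Python; every per-stage figure is an illustrative assumption, not a measurement of any particular system:

    # Back-of-the-envelope end-to-end latency budget.  All per-stage
    # numbers are illustrative assumptions, not measurements.
    stages_ms = {
        "mouse polling (USB)":     2,    # e.g. a ~500 Hz polling rate
        "OS input stack":          2,    # assumed
        "simulation tick (60 Hz)": 17,   # input waits for the next update
        "render + GPU queue":      17,   # one more frame in flight
        "display scan-out/panel":  25,   # ~0 on a CRT, 50+ ms on many TVs
        "visual perception":       100,  # rough human floor
    }

    total = sum(stages_ms.values())
    for stage, ms in stages_ms.items():
        print(f"{stage:25} {ms:4} ms")
    print(f"{'total':25} {total:4} ms")  # lands inside the 66-300 ms band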

    The difference is very noticeable, ask any decent musician
    That's a _totally_ different topic. If you were talking about video playback latencies or professional audio mixing, it would be relevant, but you're not. That all has to do with being able to notice when things are out of sync. Your eye and ear are far more tolerant of things that are in sync but are just "late."

    or pro player.
    Hah.

    Why should I wait 16ms for my shot to fire when I can wait 8ms...
    Because it doesn't matter. The shot fires in timing with the simulation updates anyway. Firing it "sooner" doesn't really mean anything, especially in multiplayer.

    I'll hear it
    Not likely. The latency between the mouse click and when the game gets the sound to the speakers is a lot longer than 8ms, usually on the order of dozens of milliseconds. It's even worse on most TVs. A realtime system with professional-grade hardware and a software stack dedicated solely to sound playback and mixing gets things to around the 1-4ms mark. Those setups also cost at least 4x as much as the best gaming rig you're likely to ever own (and 10x as much as any gaming rig anybody with any common sense and/or someone to sleep with would ever blow money on; more than $1200 is just ****ing stupid).

    and it'll get sent to the server sooner, even if I don't see it that fast.
    The server is already dealing with the 20-150ms latencies the Internet is imposing anyway. And that very lag compensation algorithm is actually evening out the field in other ways across all the players. As soon as you start playing any properly implemented networked real-time game, you're already in the realm of approximated physics and gameplay.

    I have never heard anyone actually measure a problem with a mere 16ms latency. I've heard plenty of people on forums repeat such claims with no actual facts or recorded measurements, but nothing "real." I strongly feel it's just another of the many excuses that some (but far too many) gamers try to use instead of just admitting, "the other guy just did better than me and that's why he won and I lost."

    There are advantages to moving up to monitors with higher refresh rates (I'm still waiting for 120Hz to become the norm), but the effects of moving to those will be very subtle and mostly have nothing to do with gaming.

  • #49
    I am fairly sure that your body doesn't move that quickly. The time it takes to slightly reposition and click a mouse on a target is greater than 16ms by a decent margin. Not to mention the input system latency.
    No matter how big your latency is, if you react 16ms EARLIER, your result will also come 16ms earlier.

    Most people need considerably longer than 100ms to react with a mouse movement and a click. By the same logic, a 100ms lag wouldn't matter either. Anybody who has ever played an online game will know that this is not the case. The difference between a 100ms lag and a 50ms lag is enough to totally throw off your game if you're not used to it, even though you cannot react in 50ms.

    Modern games are not just see-and-react; your brain does all sorts of interpolation and syncing. Especially with very high-sensitivity mouse flicks, 16ms will mean being off by hundreds of pixels.
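    As a rough illustration of that flick claim (the flick speed below is an assumed figure, not a measurement):

    # Rough illustration: during a fast high-sensitivity flick the crosshair
    # can sweep thousands of pixels per second, so sampling the aim one
    # 60 Hz frame late puts the shot far from where the player is pointing.
    flick_speed_px_per_s = 12_000   # assumed flick speed, for illustration
    frame_ms = 1000 / 60            # one 60 Hz frame, ~16.7 ms

    error_px = flick_speed_px_per_s * frame_ms / 1000
    print(f"~{error_px:.0f} px of aim error per frame of lag")  # ~200 px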

    Seriously, only people who don't really play games competitively argue this point. If you play a fast shooter for a few years, you will know that 20ms makes a difference. Not a huge one, but it's there, and it's very evident.

    Having said that, I'm not too fussed about fps because I'm not a pro gamer or whatnot. 60 fps is enough for me and my casual play, as long as it is stable. So I don't even care about the topic much.

  • #50
    Edit bug.

    Because it doesn't matter. The shot fires in timing with the simulation updates anyway. Firing it "sooner" doesn't really mean anything, especially in multiplayer.
    Think about it again.

    Doing simulation updates at fixed intervals only means that the events are binned together.

    So if you have a 60Hz simulation, an update window is 16ms, which actually makes things worse: a 1ms difference will turn into a full 16ms difference 1/16 of the time, and a 2ms difference will turn into a 16ms difference 1/8 of the time.

    So it's even more important that your click gets there earlier. If your opponent is 2ms slower, his shot will land a full tick (16ms) later 1/8 of the time.
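    A quick Monte-Carlo sketch of that binning argument (a toy model, not any real engine's netcode): if clicks land uniformly at random inside a 60Hz tick and are only seen at the next update, clicking d ms earlier crosses into the previous tick with probability d/16.7, which roughly matches the 1/16 and 1/8 figures above.

    import random

    # Toy model: clicks arrive uniformly at random within a 60 Hz simulation
    # tick and are only "seen" at the next update.
    TICK_MS = 1000 / 60
    TRIALS = 200_000

    def earlier_tick_fraction(d_ms):
        """Fraction of clicks that land one tick earlier when moved d_ms earlier."""
        hits = 0
        for _ in range(TRIALS):
            t = random.uniform(0, TICK_MS)  # click offset within its tick
            if t < d_ms:                    # d_ms earlier -> previous tick
                hits += 1
        return hits / TRIALS

    for d in (1, 2):
        print(f"{d} ms earlier gains a full tick {earlier_tick_fraction(d):.3f} "
              f"of the time (expected {d / TICK_MS:.3f})")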
