Testing Raw Hardware Fill Rate, NOT Driver Efficiency
I love the work Phoronix is doing, and it's great to see them using more modern graphics engines. My only suggestion is that they include some lower-resolution benchmarks. If the goal is to show differences between drivers, they need to run at resolutions that aren't fill-rate limited. When older cards are only getting 7 FPS and a $500 top-of-the-line card can barely pull 100 FPS, you're testing raw hardware fill rate, not driver efficiency.
AMD Gallium3D Marks Huge Win: Beating Catalyst In Steam On Linux Game
-
Originally posted by DaVince: It's important that these benchmarks keep moving along with the times. When people were asking for Source game benchmarks, this was important as it had the potential to reveal performance issues in these engines. Now that Source games run well across the board, it's important to benchmark newer, bigger, more complex challenges.
Basically, this stuff is always changing, always developing. You shouldn't expect game benchmarks to stay the same all the time, because new games will come out with new feature sets that may visibly strain the drivers. That's also why Unigine comes with a new benchmarking program almost every year.
But what we as Phoronix forum members can do is more than just complain: point Michael in the right direction, like what has been done earlier in this thread.
Also, CS:GO wasn't a bad choice. Sure, it's Source engine, but it's also recently released and relevant.
I am not against adding in more modern titles when appropriate - what bothers me is the idea that the older tests should be disposed of simply because they are crimping everyone's style somehow, which really does seem to be the root of most people's arguments here.
-
Originally posted by DaVince: It's important that these benchmarks keep moving along with the times. When people were asking for Source game benchmarks, this was important as it had the potential to reveal performance issues in these engines. Now that Source games run well across the board, it's important to benchmark newer, bigger, more complex challenges.
Basically, this stuff is always changing, always developing. You shouldn't expect game benchmarks to stay the same all the time, because new games will come out with new feature sets that may visibly strain the drivers. That's also why Unigine comes with a new benchmarking program almost every year.
But what we as Phoronix forum members can do is more than just complain: point Michael in the right direction, like what has been done earlier in this thread.
Also, CS:GO wasn't a bad choice. Sure, it's Source engine, but it's also recently released and relevant.
We can write those scripts for more advanced games, as was discussed earlier in the thread.
We can create test profiles that do what we want.
The code is open.
If we write it Michael will run the tests.
This is the positive difference between this site and the other benchmarking sites, but we're not taking advantage of it, for the most part.
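To make that concrete: the part of a Phoronix Test Suite profile that usually needs writing is a wrapper script that runs the game's benchmark mode and boils its log down to a single result number. Here is a minimal sketch of just the parsing step in shell; the log format below is a hypothetical stand-in, since each game prints its own:

```shell
#!/bin/sh
# Sketch of the results-parsing half of a benchmark wrapper.
# The log format here is a made-up placeholder; in a real test
# profile this file would be produced by running the game's
# built-in benchmark mode.
cat > /tmp/game-bench.log <<'EOF'
frame 0001: 55.0 fps
frame 0002: 60.0 fps
frame 0003: 58.0 fps
frame 0004: 57.0 fps
EOF

# Average the per-frame FPS figures, since PTS expects a single
# result value per test run.
AVG=$(awk '/fps/ { sum += $3; n++ } END { printf "%.2f", sum / n }' /tmp/game-bench.log)
echo "$AVG"
```

The same shape works for any title that can log its frame rate to a file; the hard part, as noted above, is getting the game to run its benchmark unattended in the first place.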
-
Originally posted by log0: LOL. Following your argumentation, why exactly should anyone care about doing benchmarks of this game then?
Benchmarking of the game, if it were possible via Phoromatic, would show a lot of helpful detail, such as:
1) Which cards are better with the game, or which hardware and/or software combinations work well with it.
2) What a prospective buyer of the game can expect from their current or planned hardware/software setup.
3) What one can expect from a given hardware/software setup at a given resolution.
4) Since the tests would be independent, they could find issues that those too close to the development could easily overlook.
5) Which future Steam Machine configuration to purchase for games of similar or greater calibre/requirements, using The Witcher 2 as a control standard.
6) And many more varied reasons the average Joe out there looks at benchmarks of games. Most of those people are non-IT, non-technical people who just love games and computers.
Obviously, benchmarking isn't possible at the moment without some development work to create an autonomous benchmarking script for The Witcher 2.
-
Originally posted by Hamish Wilson: Seconded.
All of these people complained back when it was all Quake engine games being used, so Michael eventually managed to get the Valve games working in decent enough shape for benchmarking in order to silence these critics, and now they complain that he is not testing Civ 5 or Borderlands 2...
You people will simply never be happy. It is just like how, when Valve actually came to Linux, you all acted like they were suddenly no longer AAA so that you could keep complaining that there were still no mainstream game titles on Linux...
Basically, this stuff is always changing, always developing. You shouldn't expect game benchmarks to stay the same all the time, because new games will come out with new feature sets that may visibly strain the drivers. That's also why Unigine comes with a new benchmarking program almost every year.
But what we as Phoronix forum members can do is more than just complain: point Michael in the right direction, like what has been done earlier in this thread.
Also, CS:GO wasn't a bad choice. Sure, it's Source engine, but it's also recently released and relevant.
-
Originally posted by DanglingPointer: I will make the assumption you understand SDLC...
First, I would request you take a step back and imagine the use case for this piece of software. An actor wants to play the game. Whether it is via native OpenGL calls or not, the use case is what matters, which is to play the game.
If it works easily enough after functional testing for a non-technical user then it passes User Acceptance Testing (UAT).
Wine will never pass non-technical user acceptance testing.
However, downloading The Witcher 2 from Steam passes that testing if it just works from the get-go with Nvidia or Catalyst drivers. Whether FPS and frame timing are on par with Windows is irrelevant to the use case, as long as FPS, frame times, graphics fidelity, and bugs are within the minimum allowable/acceptable tolerances for release to production and marketing.
In the case of The Witcher 2, it passes all tests. People buy it, it makes money, business case/model proven! Then perhaps for the next and subsequent projects, GPU APIs friendlier to *nix would be considered, since the market had already been proven and money can be made (a pilot program by stealth).
As for The Witcher 2 (2011), APIs friendlier to *nix were ruled out back then due to there being zero business case.
You're too into the IT tech of everything, remember most of the world just turns on the light switch not knowing how a light bulb works.
-
Originally posted by Mikeyy00: Look, I don't mean to be rude, but I don't give a crap about these old-ass games. They've been running decently on Gallium for a while (as long as you're not rocking an ancient adapter). What about newer games like Civ 5? Borderlands (on which Catalyst runs extremely poorly)? How about Wasteland 2?
I have an AMD 6950, and I use Catalyst.
I can't say I've had any problems with Catalyst or the open-source driver.
I'm glad I have AMD, because with Nvidia I always had problems with their driver (a few years back).
-
Old games or not (for me they are not), this is impressive either way. Even on the most recent generation of hardware it already looks good, reaching about 3/4 of Catalyst's performance. That is good.
Now I just wish to get rid of some small Kabini rendering glitches in KDE and I'm happy.