Originally posted by dungeon
I would be on the verge of considering you a troll now - if I did not know you better from all your reasonable posts in the past...
So, one last try - then I will stop.
I did not call it a "clear" regression, just a regression. But you - yes, you yourself - called it a regression ("Hawaii regressed") in your last post.
How severe or acceptable the regression is in light of an improvement on one other card, I do not want to judge.
Looking at this one regression ALONE: an 80-to-60 fps drop is sad for 290 owners who paid more money for their card to get more performance for 4K or 120 fps gaming.
and then this BS:
Originally posted by dungeon
Now, please answer my question from my previous post - I really want to know:
Why should that machine act wrong only with Crimson while 15.9 is fine? And why would that happen across 6 runs of 2 independent games using the same engine? It is rather unlikely that "something" just happened. Even if some mysterious process had occupied 1 or 2 CPU cores during exactly those 6 runs, the performance penalty would not be that high, because the machine had 2 more cores to use - and the benchmark is not even CPU-limited to begin with.