Unigine Engine Looks To Wasteland 2


  • Kano
    replied
    First of all, I have nothing against Unigine, but some issues should be addressed. The situation is a bit similar to the Rage issues with AMD drivers: you can blame AMD for everything if you want. But if you want to see cross-platform games using OpenGL 4 / DX11 features, those features should not carry such a performance penalty that even Unigine's own game doesn't use them. It is hard to believe that they cannot reproduce the same results at their own office. Maybe Unigine could get some hints from AMD on how to optimize the code so those cards run faster. Most likely AMD does that kind of optimization on the fly for DX11; maybe just rename the binary and try to bench again. That could be a bit tricky, however, because the "real" binary is executed by a launcher with the selected settings.

    I really want to see Linux games that use the latest OpenGL 4 features, but I doubt that a bit, even if a game ends up using this engine. A first step could be working together with AMD to get an OpenGL profile for CF; Nvidia has the same thing for SLI. Sometimes it would be cheaper to add a second identical card instead of buying a new one, but that is pretty useless for OpenGL currently, especially on Linux. All you can do is buy a fast single card directly, even if your board would support more cards.

    Leave a comment:


  • Stebs
    replied
    @Kano
    I don't really understand what you are "moaning" about. Currently the OpenGL drivers from Nvidia and especially AMD are not as optimized for games as their DX11 drivers are; nothing new there.
    And it is not really astonishing when you consider that most games today use D3D (OpenGL is often used for "professional" applications instead), but it is still a shame.
    So where do you see the fault of the Unigine engine in all this?
    Until someone from Unigine tells us more, their DX11 path being more tweaked than their OpenGL path is pure speculation. It could just as well be entirely a driver issue, as mentioned before.

    About tessellation: sure, it is quite performance-taxing, especially when used excessively like the Heaven _benchmark_ does. So whether to use it (and to what extent) will always be a cost/benefit question. But that is true for every engine.

    Leave a comment:


  • Kano
    replied
    @ZedDB

    One option that is not mentioned in the old article is:

    aticonfig --cfl on

    to enable a CrossFire logo, but I have never seen it appear. Usually I don't like logos, but here I would like to see one just to verify CF is active. With the 12.3 driver, the only thing I could "verify" was that enabling CF produced wrong colors, but no speed difference.

    But did YOU really read the website you linked correctly? There are only SLI results for OpenGL (which are pointless for Linux, as SLI is not supported on Fermi there). There is NO OpenGL CF result, simply because there is no CF OpenGL profile. Everything is optimized for DX11 on AMD, and absolutely nothing for OpenGL. So let's compare the speed differences of the single-GPU cards at 1080p (I would prefer fps, but hopefully the points scale in a similar way; someone who knows the scoring should confirm that). What you still miss is the Windows -> Linux fps drop, which costs a few more fps on AMD; for Nvidia it is only a slight difference.
    Code:
    Card        DX11    OpenGL    OpenGL/DX11
    HD 7970     1467    1236      84.25%
    HD 7950     1300    1057      81.31%
    GTX 580     1105    1030      93.21%
    HD 7870     1059     906      85.55%
    HD 6970      973     688      70.71%
    GTX 570      943     881      93.43%
    HD 7850      940     790      84.04%
    GTX 480      918     886      96.51%
    HD 6950      877     610      69.56%
    GTX 560 Ti   811     742      91.49%
    HD 6870      785     564      71.85%
    HD 7770      774     627      81.01%
    HD 5870      773     577      74.64%
    GTX 560      764     645      84.42%
    HD 6850      669     477      71.30%
    HD 7750      508     429      84.45%
    GTX 550 Ti   489     421      86.09%
    If you had looked closely, you would have seen that you lose about 14 to 30%, depending on the AMD card, just by switching from DX11 to OpenGL (my HD 5670 is more the 30% type). You lose a few percent extra on AMD when you run your benchmarks on Linux. Now look at Nvidia: you lose about 4 to 16%, and you don't lose much when you switch from Windows to Linux. That means switching to Linux costs you about 1/3 of the speed on AMD, but only about 1/6 with a SINGLE Nvidia card. And even if CF were officially supported on Linux by AMD, you would not just lose 1/3 of the speed, you would lose MUCH more. If you are lucky, your speed drops only by 2/3 going from a DX11 CF setup to OpenGL on Linux...
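    The ratio column in the table can be recomputed with a quick script; a minimal sketch, using a few of the single-GPU 1080p scores quoted from the linked review (card selection here is just illustrative):

    ```python
    # Sanity-check the OpenGL/DX11 ratios from the table above.
    # Scores: (DX11 points, OpenGL points) per card.
    scores = {
        "HD 7970": (1467, 1236),
        "HD 6950": (877, 610),
        "GTX 580": (1105, 1030),
        "GTX 480": (918, 886),
    }

    for card, (dx11, ogl) in scores.items():
        ratio = ogl / dx11 * 100  # OpenGL speed as a percentage of DX11
        print(f"{card:>8}: {ratio:.2f}% of DX11 speed ({100 - ratio:.0f}% loss)")
    ```

    The output reproduces the pattern discussed above: the AMD cards lose far more going to OpenGL (roughly 16-30%) than the Nvidia ones (roughly 3-7%).
    
    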

    @binstream

    The one and only reason why Oil Rush is seen as "fast" by some people is that absolutely no tessellation is used. You should also release performance numbers, like the number of objects per scene or whatever else was reduced to improve speed. If the engine were used the same way as in Heaven, it would be slow as hell for most users without high-end cards.
    Last edited by Kano; 16 April 2012, 08:12 AM.

    Leave a comment:


  • NomadDemon
    replied
    I tested the Unigine benchmarks on my Ubuntu 11.04 64-bit with a Radeon 4850 + blob.

    My max resolution is 1280x1024, and my fps was the SAME on Windows and on Linux, about 50-60 (settings around high-ultra).

    The graphics were awesome; I don't need the tessellation stuff.

    Leave a comment:


  • ZedDB
    replied
    Originally posted by Kano View Post
    So your tests show no extreme performance difference in Heaven between DX11 and OpenGL, and on your systems you get better performance using CrossFire with OpenGL?
    Here you have a test with DX11 scores and opengl scores: http://www.sweclockers.com/recension...850/7#pagehead
    I think they are quite close in performance. However, I think Nvidia and AMD spend more time optimizing Direct3D than they do OpenGL, because that is what most games currently use on the PC platform.
    So I don't think we can say without a doubt that Unigine is to blame here. Sure, it would be great if the OpenGL renderer were better than its Direct3D counterpart, but if they would have to spend countless more hours on optimization than they did on the D3D renderer to make that happen, I don't really blame them for not doing so.

    Perhaps binstream can enlighten us on this matter, i.e. does the OpenGL renderer require more optimization than the D3D one to be on par?

    BTW, is CrossFire even supported on Linux, Kano? IIRC it's not, so I don't think it's relevant for us Linux users ATM.
    Last edited by ZedDB; 16 April 2012, 05:52 AM.

    Leave a comment:


  • kraftman
    replied
    Originally posted by yogi_berra View Post
    New Vegas was definitely better than F3.
    Yes, in every aspect.

    From the latest update:

    That sounds like a 2D game to me. http://www.kickstarter.com/projects/...2/posts/208363
    I think just the character portraits will be 2D, like in Neverwinter Nights. That's a good thing IMHO, because 2D portraits usually look better.

    Leave a comment:


  • benmoran
    replied
    Binstream,

    I just wanted to say thank you for supporting the open source drivers as well. With Oil Rush the performance is really impressive, all things considered. I finally had time to play past the second chapter, and it really is a lot of fun.

    I'm a Wasteland 2 backer, and really hope they go with Unigine. I think it would be a nice fit for the type of game they're making. Wasteland 2 would also give Unigine a bit of publicity, and hopefully get more people interested in using it.

    Leave a comment:


  • Kano
    replied
    So your tests show no extreme performance difference in Heaven between DX11 and OpenGL, and on your systems you get better performance using CrossFire with OpenGL?

    Leave a comment:


  • binstream
    replied
    Originally posted by Kano View Post
    I know for certain that there was a hidden value in the first release of Heaven to enable OpenGL tessellation on series 4 hardware; it was removed later. I am not saying they do not test those cards at all, but they seem to be OK with DX11-only optimizations. That does not help Linux users at all.
    About 30% of our test farm consists of different Linux systems; we spend a lot of time on Linux compatibility and performance.

    Leave a comment:


  • Kano
    replied
    I know for certain that there was a hidden value in the first release of Heaven to enable OpenGL tessellation on series 4 hardware; it was removed later. I am not saying they do not test those cards at all, but they seem to be OK with DX11-only optimizations. That does not help Linux users at all.

    Leave a comment:
