Page 3 of 4, Results 21 to 30 of 35

Thread: Port fglrx openGL stack to Gallium3D

  1. #21
    Join Date
    Apr 2010
    Posts
    1,946

    Default

    Quote Originally Posted by BlackStar View Post
    Repeat after me: glxgears is not a benchmark. Don't try to use it as one, because its results are FUCKING INVALID.

    There, better now?

    In fact, fglrx performs identically to the Windows driver in OpenGL (sometimes slightly faster, too). The rest of your points are being addressed as we speak (better 2d acceleration, video acceleration).

    Bah.
Yes, I had read that on an unofficial ATI site, back when I had the card in my hands.

fglrx: OpenArena 40-60 fps (full HD, maxed out).

nvidia: with a 9800gt, 300 fps (full HD, maxed out).
(For comparison, a 6800gt with GDDR3 (Asus V9999) on an Athlon XP 3200+ -> 120 fps, full HD, maxed out.)

Technically the 4650 is weaker than the 9800, and on top of that the 4650 has 128-bit DDR2 while the 9800 has 256-bit GDDR3. So I think this is because the GPU starves for lack of RAM bandwidth.

But then nvidia makes crap and closes its drivers, while ATI does A LOT for open-source drivers, so I bought an hd4770 (faster chip than the 9800gt; 128-bit GDDR5 on the 4770 == 256-bit GDDR3 on the 9800; lower idle power consumption). I have it in my hardware now, and I'm trying to recompile the latest kernel to use it (typing this in vesa). Not much trouble, it's Gentoo anyway.
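The "128-bit GDDR5 == 256-bit GDDR3" claim can be made concrete with back-of-the-envelope math: peak bandwidth is just bus width times effective data rate. A rough sketch (the data rates below are approximate, illustrative figures for these cards, not official specs):

```python
def mem_bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_bits / 8 * data_rate_gtps

# Approximate effective data rates (GT/s) for the cards discussed:
hd4650 = mem_bandwidth_gbs(128, 1.0)  # 128-bit DDR2  ->  16.0 GB/s
gf9800 = mem_bandwidth_gbs(256, 1.8)  # 256-bit GDDR3 ->  57.6 GB/s
hd4770 = mem_bandwidth_gbs(128, 3.2)  # 128-bit GDDR5 ->  51.2 GB/s
```

So the 4650 has roughly a quarter of the 9800's bandwidth (consistent with the GPU being starved), while GDDR5 lets the 4770's 128-bit bus land in the same ballpark as the 9800's 256-bit GDDR3.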



The question is whether all those options you talk about (video acceleration, improved OpenGL; 2D is already here, I know) will ever be handled in fglrx. Even then it would be like nvidia, only later and worse. I switched to the AMD 4770 only because of their open-source effort; yes, I appreciate it that much.

  2. #22
    Join Date
    Jun 2009
    Posts
    1,103

    Default

    Quote Originally Posted by BlackStar View Post
    Repeat after me: glxgears is not a benchmark. Don't try to use it as one, because its results are FUCKING INVALID.

    There, better now?

    In fact, fglrx performs identically to the Windows driver in OpenGL (sometimes slightly faster, too). The rest of your points are being addressed as we speak (better 2d acceleration, video acceleration).

    Bah.
mmm, I could be wrong, but as far as I have tested, the relation between fglrx and Catalyst is more like this:

DX9/10/10.1/11 >>>>>> OpenGL 4 on Windows, which in turn >>>>>>>>>>>>>>> fglrx

Download the Unigine demos (I think those are complex enough to show the difference) for both OSes and run them with both codepaths; at least on my hardware the difference is very noticeable:

Windows 7 x64
Kubuntu 10.04 RC
4850x2 x2, aka quadfire
quad-core CPU

Now on the nvidia side, OpenGL on Linux is like 10% slower than DirectX on Windows, which is completely acceptable for me.

  3. #23
    Join Date
    Jun 2009
    Posts
    1,103

    Default

    Quote Originally Posted by crazycheese View Post
yes I appreciate it that much.
Amen, brother. The only reason I didn't instantly RMA my quadfire is because I heard about AMD's FOSS work in time, so yes, I appreciate it that much too.

  4. #24
    Join Date
    Apr 2010
    Posts
    1,946

    Default

    Quote Originally Posted by jrch2k8 View Post
Amen, brother. The only reason I didn't instantly RMA my quadfire is because I heard about AMD's FOSS work in time, so yes, I appreciate it that much too.
intel E5300 --> AMD Athlon II X4 630 (instead of an Intel i3-540)
s775 ASRock P43ME --> AM3 Gigabyte GA-MA785GMT-UD2H (instead of an Intel s1156 board)
nvidia 9800gt --> PowerColor HD4770

    All because amd supports foss and nvidia regresses totally.

Intel does not provide performant 3D hardware, and overall it is pricey for the same performance, so sorry, Intel.

And I would really appreciate it if AMD created a Linux counter where you could register your hardware as running Linux as your main desktop and home OS, not just as a workstation one.

  5. #25
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,386

    Default

    jrch2k8, what kind of performance difference (linux vs windows, amd vs nvidia) do you see running with a single GPU rather than a 4-GPU crossfire rig ?

  6. #26
    Join Date
    Jun 2009
    Posts
    1,103

    Default

    Quote Originally Posted by bridgman View Post
    jrch2k8, what kind of performance difference (linux vs windows, amd vs nvidia) do you see running with a single GPU rather than a 4-GPU crossfire rig ?
mmm, it's true quadfire tends to be really crappy even on Windows (mostly the game engine's fault, maybe), but I didn't test with only 1 GPU at the time, and right now I'm on xorg 1.8 and kernel 2.6.34 since I chose to use only the OSS driver. I could test it again when my new disk comes in a couple of months, with a clean install (I really don't want to spend quality time downgrading my distro to get fglrx working again).

  7. #27
    Join Date
    Oct 2009
    Posts
    2,064

    Default

    @bridgman:

I think the key interesting idea from this thread is to make some kind of open-source shim that sits between fglrx and the kernel/xorg/whatever, in order to make it easier to follow the bleeding edge with fglrx.

    Right now, fglrx breaks whenever the kernel or xorg is changed.
    The kernel part is generally manageable since that interface is open source. The big problem is that fglrx breaks whenever the xserver changes.

    But right now we have an open source X driver that *works*, and follows the xserver. So what these guys want is to be able to take the existing SOLID open source parts and mix in certain chunks (the acceleration chunks) of fglrx in order to come up with an overall driver package that doesn't break every time someone sneezes.

Which is not a bad idea, even if it would probably be extremely expensive to implement.

    (note: I'm not interested in this -- I personally am happy with the progress and function of the open source drivers.)

  8. #28
    Join Date
    Aug 2009
    Posts
    83

    Default

    Quote Originally Posted by bridgman View Post
    jrch2k8, what kind of performance difference (linux vs windows, amd vs nvidia) do you see running with a single GPU rather than a 4-GPU crossfire rig ?
Why bother, bridgman? I know you are trying to be nice here, but the guy has no clue what he's talking about... he's comparing DX11 to OpenGL 4 on cards that are capable of neither (hint: HD4850x2).

Also, I do not think his quadfire setup has any benefit over a single HD4850x2, since the CPU will most likely not be able to feed the cards properly. Yet he's expecting the holy grail and more...

  9. #29
    Join Date
    Jun 2006
    Posts
    3,046

    Default

    Quote Originally Posted by BlackStar View Post
    Repeat after me: glxgears is not a benchmark. Don't try to use it as one, because its results are FUCKING INVALID.
Heh... You forgot to mention they should repeat this over and over, applying a blunt object to the back of their head while saying it, until the desire to use glxgears as a benchmark leaves their minds, hopefully for good...

    There, better now?
    I certainly feel better now...

    In fact, fglrx performs identically to the Windows driver in OpenGL (sometimes slightly faster, too). The rest of your points are being addressed as we speak (better 2d acceleration, video acceleration).

    Bah.
I would hesitate to say this is going to happen in a timely manner. They've been saying things along these lines for many years now, while putting only a few people on it compared to the Windows side of things. I'm not being ugly or accusatory when I state this; it's simply a statement of fact. They can't make a business case (yet... though, if what I've been told in confidence actually HAPPENS, that may change deeply for the better by the end of this year or the middle of next) to justify massive cleanups on the codebase where they've made incorrect assumptions in the WinSys layer, to use a Gallium3D term everyone here can relate to (which is actually where many of your bugs are coming from...). Because of this, there have been many, many years of promises, with most of them fulfilled only years later, if at all.

It's a good part of where the bitching about fglrx stems from, actually. If you didn't know what was going on and why, you'd be peeved that they couldn't get right the "simple" things fglrx does screw up on; and we won't even get into Crossfire, etc., which is their baby and, in the minds of the community at large, should already have been there in stable form.

  10. #30
    Join Date
    Jun 2006
    Posts
    3,046

    Default

    Quote Originally Posted by droidhacker View Post
    Right now, fglrx breaks whenever the kernel or xorg is changed.
    The kernel part is generally manageable since that interface is open source. The big problem is that fglrx breaks whenever the xserver changes.
The big problem for them is that it's more of a moving target than the way they're doing things right now. The main reason the FOSS driver works as well as it does is that it stays in lock-step with the Gallium3D API edge, because it's part and parcel of that project. For them it would be a fairly extensive rewrite of the parts that are breaking, as you state, only to end up chasing an edge that breaks on them with the same regularity as what they have at the moment.
