Open ATI Driver More Popular Than Catalyst


  • Zetbo
    replied
    I bought an HD 3870 last year mainly because of ATI's open-source effort. I thought that if I needed 3D performance I could always go back to fglrx. Then ATI stated that they don't support upstream X.Org or kernels, just Ubuntu, RHEL, etc. That's kind of a problem for me because I use Arch Linux.

    Now I use xf86-video-ati and I'm quite pleased. I don't use Compiz or the other compositing things; they are useless and cause all kinds of problems. I would like to play Neverwinter Nights, but that's not a huge problem. Even if I get fglrx to work on my machine, there is a bug somewhere that prevents me from playing NWN (a huge memory leak). I bought a PS3 for gaming; almost every game these days is designed for consoles.

    P.S. My English probably sucks, but bear with me!


  • MartjeB
    replied
    Originally posted by deanjo View Post
    No vsync? What are you smoking? Seriously? Vsync works PERFECTLY fine on NVIDIA cards; turn it on. Not the case on ATI cards. 'Flicker-free' booting: put the appropriate mode line in GRUB. Flickering is a problem for old CRTs; in this day and age of LCDs there is no real 'flicker' to be appreciated. I spend more time watching tear-free accelerated video than I do rebooting and changing resolutions to non-native ones (perhaps that's needed more often with FOSS setups, given their poor performance forcing the resolution down to a lower value), and with portables I care more about a working sleep state than about the rebooting process (again, advantage NVIDIA).
    You're not getting it. This isn't about <60 Hz flicker; this is about switching resolutions four times between pressing the power button and reaching your desktop.
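
    (On the vsync point above: under GLX, vsync is something the application requests per drawable through the GLX_EXT_swap_control extension, regardless of vendor; older drivers expose GLX_SGI_swap_control instead. A minimal sketch, assuming a GLX context is already current and the EXT extension is exposed by the driver:)

        /* Sketch: ask the driver to sync buffer swaps to vblank.
         * A fuller version would check glXQueryExtensionsString() first,
         * since glXGetProcAddress() may return a non-NULL stub even for
         * unsupported extensions. */
        #include <GL/glx.h>
        #include <stdio.h>

        static void enable_vsync(Display *dpy, GLXDrawable drawable)
        {
            PFNGLXSWAPINTERVALEXTPROC glXSwapIntervalEXT =
                (PFNGLXSWAPINTERVALEXTPROC)glXGetProcAddress(
                    (const GLubyte *)"glXSwapIntervalEXT");

            if (glXSwapIntervalEXT)
                glXSwapIntervalEXT(dpy, drawable, 1); /* 1 = one vblank per swap */
            else
                fprintf(stderr, "GLX_EXT_swap_control not available\n");
        }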


  • mirv
    replied
    Originally posted by kersurk View Post
    I assume ATI's plan is to eventually go 100% over to the open-source driver?
    Doubtful. More likely the aim is to provide an open-source alternative, but there are proprietary bits and pieces that ATI will continue to support which can't go open source (third-party stuff that's included, such as S3TC).
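
    (A quick way to see whether a given driver actually exposes S3TC at runtime is to look for GL_EXT_texture_compression_s3tc in the extension string. A minimal sketch, assuming a GL context is already current; nothing here is driver-specific:)

        /* Sketch: report whether the current GL context advertises S3TC.
         * Uses the classic GL_EXTENSIONS string, which is what contexts
         * of this era expose. */
        #include <GL/gl.h>
        #include <string.h>

        static int has_s3tc(void)
        {
            const char *exts = (const char *)glGetString(GL_EXTENSIONS);
            return exts != NULL &&
                   strstr(exts, "GL_EXT_texture_compression_s3tc") != NULL;
        }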


  • kersurk
    replied
    I assume ATI's plan is to eventually go 100% over to the open-source driver?


  • Kano
    replied
    Originally posted by Michael View Post
    Kano, every driver has its problems and bugs that go unresolved for long periods of time... Hell, I still have an NVIDIA driver bug that's been open now for about four years or so concerning CoolBits and Xinerama.
    I enable CoolBits by default but rarely actually use it (see the sketch at the end of this post); Xinerama I do not use at all. I don't know what the exact problem is in your case, but the GPUs definitely behave differently when more than one monitor is used: the GPU/VRAM is then clocked higher even in idle mode, which also leads to (much) higher power consumption.

    Of course it is nice when overclocking works, but the gains from a GPU overclock are usually pretty small.

    RENDERING problems - especially regressions - should matter much more than overclocking in this particular case.

    As I think you are very interested in automated regression tests, maybe take a deeper look at Wine:



    When you run the same test suite against different driver releases and see differences, you can easily identify problems too.
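
    (For reference, since CoolBits came up above: on the proprietary NVIDIA driver it is enabled via an xorg.conf option in the Device section. A minimal sketch; newer driver versions treat the value as a bitmask, so take the "1" as illustrative:)

        Section "Device"
            Identifier "nvidia-card"
            Driver     "nvidia"
            # Exposes the extra clock controls in nvidia-settings;
            # newer drivers interpret the value as a bitmask.
            Option     "Coolbits" "1"
        EndSection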


  • dgrafenhofer
    replied
    Originally posted by Qaridarium
    ...bullshit...bullshit...

    IDtech5 is the worst I ever read about....

    ... and no and no.. nothing????

    ... is bullshit!
    (rant)
    I am sorry to say this, but you are getting on everybody's nerves more and more with your inflammatory and trollish comments. I know that you are not a native speaker - neither am I - but somebody has to tell you that your use of swear words is completely out of place.
    (/rant)


  • mirv
    replied
    Originally posted by Qaridarium
    idtech5 is a bullshit backwards engine, only for bullshit Xbox 360/PlayStation 3 hardware with just 256 MB/512 MB of RAM.

    IDtech5 is the worst I ever read about....


    ArmA2, for example, is a killer engine for big workstations: 12 cores, 32 GB RAM, 2 GB-VRAM graphics cards

    and you need two 5870s to play it!


    for idtech5 you need a backwards graphics card with no power, and you need no RAM and no... nothing????

    idtech 5 is bullshit!

    idtech4 has more features than idtech5!

    Day and night! idtech5 can only handle DAY-time!
    I'm not sure why you think so poorly of id Tech 5 - I think it's quite impressive. Certainly it won't run on the open-source drivers, but from everything I've read and seen about it, it's quite an advanced engine, so there shouldn't be any surprise there.


  • smitty3268
    replied
    Originally posted by MostAwesomeDude View Post
    r300/r500 is *not* a good venue for LLVM. r600 is a bit more amenable.
    Can you elaborate why? Do they have too much fixed-function hardware and not enough shader power, or what?

    Originally posted by MostAwesomeDude View Post
    And for the record, I really don't think the custom code in LLVM is going to be able to compete with the optimizations created by fglrx's team.
    Sorry, I meant the OSS drivers. I don't think LLVM will be able to compete with fglrx either; I just think it will get closer.


  • MostAwesomeDude
    replied
    Originally posted by smitty3268 View Post
    It probably doesn't make a whole lot of sense to drop the current compiler while you are still able to share it with the non-gallium driver, but I do hope this gets worked on eventually when gallium gets more mature. I really don't think the custom code in the drivers is going to be able to compete with the optimizations created by a dedicated compiler team.
    Other way around. r300g borrows the compiler from classic r300.

    r300/r500 is *not* a good venue for LLVM. r600 is a bit more amenable. And for the record, I really don't think the custom code in LLVM is going to be able to compete with the optimizations created by fglrx's team.


  • smitty3268
    replied
    Originally posted by frantaylor View Post
    This is NOT my experience with regressions. I test software for a living and I find odd weird regressions all the time. Invariably they end up having much more impact than you might think.

    Here is a common scenario: A regression is found in a "rarely used app" and a bug report is filed. The developers say "this is a rarely used app" and they mark the bug as "we are not going to fix it". Others come across the same bug, find the bug report, see that the developers are indifferent, and they either code around it or they just drop the buggy piece of code and switch to something else. The developers have no idea that this has happened. Since it is free software, there are no sales to affect and no salesmen to beat up the developers to fix it.

    Really, the only sane approach is to make a serious effort to fix any regression, no matter how minor it seems.
    I'm certainly not saying regressions are unimportant - quite the opposite. But any sane development team is going to prioritize the bug reports it gets, and if you think regressions should always automatically go to the top no matter what the situation, then I would have to disagree.
