
Testing Out AMD's DRI2 Driver Stack


  • cliff
    replied
    I would have to disagree with the author.
    I have had a lot of problems with my ATI card on Linux.
    In fact it has caused me many system re-installs.
I have the 4870 X2 and no 3D support on Jaunty Jackalope. I tried installing Catalyst 9.4, and when I reboot, my machine locks up and the colors are all messed up. I wish I had bought an Nvidia card. I used to use Windows, and the Vista drivers were not much better.



  • sc3252
    replied
    Originally posted by Melcar View Post
    Even the binaries were "usable". I remember running my old 9800 and 9600 cards on the old drivers; performance was lacking, but they got the job done. I honestly could never understand what all the fuss was about with ATI drivers even back then.
Well, I too used a 9800 np and it was, to say the least, unstable. I would get driver crashes just by hitting Ctrl+Alt+F1 or any other F key. There were also quite a few times when video applications would give me a black screen. On top of all that, the community usually had to patch the drivers themselves, because a lot of the time they would not compile. They also had issues with VIA motherboards, so the drivers wouldn't even load without some secret patch (seriously, I had to look for 3-5 hours before I could find a patch for my KT333 motherboard).



  • bridgman
    replied
Most of the real discussion happened a few years ago, but the key point is that there were some seriously broken aspects of the existing architecture (possibly worse than the ones you were dealing with) which had to be addressed.

The main problems were:

- multiple graphics drivers, some in user space and some in the kernel, overwriting each other's settings during common use cases

    - inability to share memory between 2D and 3D drivers, which made any kind of desktop composition inefficient and slow

Both the 2D and 3D drivers need context switches in order to access the hardware anyway, since the direct rendering architecture uses the DRM to arbitrate between the DDX and Mesa drivers, and to manage the shared DMA buffers (aka the ring buffer) used to feed commands and data into the graphics processor. I don't think these changes introduce more context switches so much as change the dividing line between user and kernel responsibilities.
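    For readers unfamiliar with the ring buffer mentioned above: it is essentially a single-producer/single-consumer queue with a write pointer advanced by the CPU and a read pointer advanced as commands are consumed. This is a toy sketch only, with made-up names (`ring_emit`, `ring_consume`) and an in-process "GPU"; the real radeon ring lives in memory shared with the hardware, which advances the read pointer itself:

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Toy command ring: the CPU writes command words at wptr,
     * a (stubbed) consumer drains them at rptr. Size is a power
     * of two so wraparound is a simple mask. */
    #define RING_SIZE 16

    typedef struct {
        uint32_t buf[RING_SIZE];
        unsigned wptr;  /* next slot the producer writes */
        unsigned rptr;  /* next slot the consumer reads  */
    } ring_t;

    static unsigned ring_space(const ring_t *r)
    {
        /* One slot is kept empty so "full" and "empty" differ. */
        return (r->rptr - r->wptr - 1) & (RING_SIZE - 1);
    }

    static int ring_emit(ring_t *r, uint32_t cmd)
    {
        if (ring_space(r) == 0)
            return -1;  /* ring full: caller must wait for drain */
        r->buf[r->wptr] = cmd;
        r->wptr = (r->wptr + 1) & (RING_SIZE - 1);
        return 0;
    }

    static int ring_consume(ring_t *r, uint32_t *cmd)
    {
        if (r->rptr == r->wptr)
            return -1;  /* ring empty */
        *cmd = r->buf[r->rptr];
        r->rptr = (r->rptr + 1) & (RING_SIZE - 1);
        return 0;
    }

    int main(void)
    {
        ring_t r = { .wptr = 0, .rptr = 0 };
        uint32_t cmd;

        assert(ring_space(&r) == RING_SIZE - 1);
        for (int i = 0; i < RING_SIZE - 1; i++)
            assert(ring_emit(&r, 0x1000 + i) == 0);
        assert(ring_emit(&r, 0xdead) == -1);      /* full: would stall   */
        assert(ring_consume(&r, &cmd) == 0 && cmd == 0x1000);
        assert(ring_emit(&r, 0xbeef) == 0);       /* space freed by read */
        printf("ring ok\n");
        return 0;
    }
    ```

    The point of the shared-memory design is that submitting work is just a write and a pointer bump; the expensive kernel transition is only needed when the ring is full or a fence must be waited on.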

    I'm a bit rusty on the history (my hands-on X experience was before X11), but my understanding is that user modesetting is a relatively new addition to X, and that the KMS initiative is arguably going back to the way modesetting was handled in earlier versions of X which were presumably built on existing kernel drivers. I haven't had much luck finding online references to support this, but I have been told this by a number of people who have worked on X for a very long time.

    If you're saying that the DRI architecture is fundamentally flawed and that all graphics should go through a single userspace stack (presumably in the X server) then that's a different discussion of course.

Jesse Barnes wrote a good summary of the rationale for moving modesetting into the kernel, and there is a good discussion (plus the odd rant) in the subsequent comments: http://kerneltrap.org/node/8242

Thomas's original TTM proposal is a pretty good summary of the goals related to memory management: http://www.tungstengraphics.com/mm.pdf
    Last edited by bridgman; 13 May 2009, 09:02 PM.



  • highlandsun
    replied
    Can anyone point me to some articles/discussions for background on TTM/GEM/KMS? This whole development direction seems to me to be contrary to good design principles - there are more moving parts, and putting more into the kernel means more context switches required to get any useful work done. Back when I worked on X (I wrote the X11R1 port for Apollo DOMAIN/OS) we would have had our heads chewed off for trying to push any of this work into the kernel...



  • Yfrwlf
    replied
    Originally posted by drag View Post
    What this gets you is that the APIs are going to be much more unified between drivers and application compatibility should improve quite a bit, if not performance.
Which means you will get better performance, because as you unify things, testing gets easier and improvements affect more hardware. More unification/standardization means more focus on goodies like performance and less wasted time, which will yield a better Linux experience. Can't wait.



  • paul_one
    replied
    I think the "ATI/AMD" remark at the start of the article was directed at the post-buyout company - not the pure "ATi" drivers.

    I used to have a 9800 Pro, was VERY disappointed that the Linux drivers sucked so badly (and the installer back then was hell).

    With Nvidia I've had fine binary drivers - although I wish they'd grow up and help out the Noauvouvouioviuoi project with hardware docs at least (ermm, what part of the hardware command structure is DRM'd then Nvidia?).

    But, anyhoo, VERY nice article.
Correct performance stats (nice for a change), it wasn't condemning of the performance, and it explained the various aspects of KMS/DRI/Gallium/etc. very well.

    Phoronix, give this guy more stuff to write about!!



  • Pitabred
    replied
    Meandering a little way off-topic...

    Originally posted by drag View Post
    <snip>
    (a decent Pentium 4 can play a DVD-resolution-sized video with doing raw unaccelerated x11 and software scaling.. but even my dual cores choke on 1080p HD stuff.)
    <snip>
    I think I've stated it elsewhere, but on the mplayerhq.hu homepage they give you instructions on how to compile the mplayer-mt. It lets me play 1080p on an Athlon X2 4600+ CPU with Xv. While it's not GPU acceleration, it's a heck of a lot better than nothing. My MythTV system with an HD3200 is awesome with the open source drivers. The only thing that could make it better is if I could start doing some 3D gaming on it as well.



  • MostAwesomeDude
    replied
    Originally posted by drag View Post
    But I would not expect any massive improvements in benchmarks from Gallium. Any performance improvements will be a slow small build up of performance over time.
    Actually, it tends to go in spurts and bursts, especially with my coding style. :3



  • mendieta
    replied
Folks, a bit off-topic, but I still can't get 3D accel on my AMD/ATI Radeon HD 3200 IGP in Kubuntu 9.04... does anyone know what happened with the Catalyst 9.5 release? It was rumored to be out today (just google for Catalyst 9.5).

    Thanks!



  • Melcar
    replied
    I meant finalized in terms of "usable by end-users".
As for Gallium, I won't bitch and moan about performance. Stability and feature support are my main concerns; 30-60% of the 3D performance of fglrx is fine with me. Getting applications to run stably is more important to me than squeezing out extra frames. Optimization can come later.

