Radeon Driver To Support ATI R500/600
When you say 2d acceleration, do you mean XAA? It seems rather counter-productive to do work on XAA these days, considering that it provides essentially no performance benefit and only takes a slight load off the CPU. I think moving directly to work on EXA is more productive.
It is also more difficult, though, because EXA uses parts of the 3d engine, needs DRM support, etc. In the long run, however, it would be much more effective acceleration.
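The XAA-vs-EXA point can be made concrete. EXA is built around a small set of driver hooks: for each class of operation the server asks the driver to "prepare" it, and if the driver declines, the server falls back to a software path. Below is a toy Python model of that dispatch pattern; all names are illustrative stand-ins, not the real Xorg C API:

```python
# Toy model of EXA-style acceleration dispatch: the server asks the
# driver to prepare an operation; if the driver declines, the fill
# runs on the always-available CPU fallback instead.

class SoftwareEngine:
    """Always-available CPU fallback path."""
    def solid_fill(self, fb, x1, y1, x2, y2, color):
        for y in range(y1, y2):
            for x in range(x1, x2):
                fb[y][x] = color

class AccelDriver:
    """A driver that can accelerate solid fills (and nothing else)."""
    def prepare_solid(self, color):
        self.color = color
        return True   # a real driver returns False for unsupported ops
    def solid(self, fb, x1, y1, x2, y2):
        for y in range(y1, y2):
            fb[y][x1:x2] = [self.color] * (x2 - x1)
    def done_solid(self):
        pass          # a real driver would flush/sync the hardware here

def exa_fill(fb, drv, sw, x1, y1, x2, y2, color):
    """Server-side dispatch: try the driver hook, else fall back."""
    if drv is not None and drv.prepare_solid(color):
        drv.solid(fb, x1, y1, x2, y2)
        drv.done_solid()
        return "accel"
    sw.solid_fill(fb, x1, y1, x2, y2, color)
    return "software"
```

The real EXA interface follows the same shape in C (Prepare/execute/Done function-pointer triples in the driver record), which is exactly why it needs DRM access: the accelerated hooks are typically implemented on the card's 3d engine.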
As for overlays, I think the issue is rather similar. I would expect at some point a TexturedVideo-style implementation, using the 3d engine to do overlays.
Of course the big gotcha here is that we still need specs for the 3d engine, but hopefully we'll have those in a month or two.
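For a sense of what a textured-video path actually offloads: the decoder hands over YUV planes, and a fragment program converts each sample to RGB while the video is drawn as a textured quad. A sketch of the standard BT.601 video-range conversion, done here on the CPU purely for illustration (on the card this math runs per-pixel in the shader):

```python
def yuv_to_rgb(y, u, v):
    """ITU-R BT.601 video-range YUV -> RGB, the per-sample math a
    textured-video fragment program applies while drawing the quad."""
    c, d, e = y - 16, u - 128, v - 128
    clamp = lambda n: max(0, min(255, round(n)))
    r = clamp(1.164 * c + 1.596 * e)
    g = clamp(1.164 * c - 0.392 * d - 0.813 * e)
    b = clamp(1.164 * c + 2.017 * d)
    return r, g, b
```

Doing this on the 3d engine also gives scaling and filtering for free, which is why textured video tends to replace dedicated overlay hardware.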
In that vein, I'd say yes, the X11 driver work needed at this point is not huge, but a lot of work will need to be done on DRM and DRI. I'm hoping the 3d driver ends up being a Gallium driver rather than a Mesa driver, because that seems to be where the future of GFX drivers for Linux is.
Last edited by TechMage89; 23 November 2007, 06:59 PM.
Originally posted by TechMage89: When you say 2d acceleration, do you mean XAA? Because it seems to me to be rather counter-productive to do work on XAA these days, considering that it essentially provides no performance benefit, only takes a slight load off the CPU. I think moving directly to work on EXA is more productive. It is also more difficult, though, because EXA uses parts of the 3d engine, needs DRM support, etc. In the long run, however, it would be much more effective acceleration.
My non-technical impression is that "everyone agrees EXA is far superior, even though all the available EXA implementations seem to be dog-slow".
I think we're all on the same page here. We need to get the bottom-level infrastructure (DRM etc.) in place anyway, not spend much time on XAA, and do everything we can to help EXA evolve and improve to the point where it works for everyone.
Originally posted by TechMage89: As for overlays, I think the issue is rather similar. I would expect at some point a TexturedVideo-style implementation, using the 3d engine to do overlays. Of course the big gotcha here is that we still need specs for the 3d engine, but hopefully we'll have those in a month or two.
Either way, we're going to get the info out there, see what happens, and help where we can.
It also occurred to me that, with the r600+ everything-is-3d approach, this would be an ideal place to start work on Glucose: once 3d is working, Glucose fits the paradigm of the card and should offer superior performance.
Regarding EXA, it seems the problem is that the DRM for many drivers was never designed to handle it, and the implementations have been fairly poor. Lots of changes need to happen in a lot of drivers. The benefit here is that we have the opportunity to make a fairly fresh start.
This is quite exciting, because it means not only stable, fast ATI drivers in the not-so-distant future, but also the opportunity to try new technologies.
Yep. One of the things I have heard a lot since hanging out in the Linux world is "you can't rely on having 3d". The 3d folks say "you can't rely on having DRM", which then seems to lead to "you can't rely on having the kernel module installed and configured properly" and so on.
I'm hearing a lot more people starting to say "you have to be able to rely on 3d", which is going to enable a lot of good things further up the software stack.
Originally posted by bridgman: Yep. One of the things I have heard a lot since hanging out in the Linux world is "you can't rely on having 3d". The 3d folks say "you can't rely on having DRM", which then seems to lead to "you can't rely on having the kernel module installed and configured properly" and so on.
I'm hearing a lot more people starting to say "you have to be able to rely on 3d", which is going to enable a lot of good things further up the software stack.
Then: what direction is AMD moving in with respect to its fglrx driver and the open source effort? Is it actively trying to help the Gallium3D and TTM efforts, or is it on a different path? This seems quite relevant, since G3D and TTM seem to be gaining a lot of traction lately, and TTM even seems to be bound for kernel inclusion.
Also: I do not understand why AMD insists on keeping its own fglrx driver development alive. OK, maybe for the parts that require certain 'IP' thingies, but then again: why not come up with a 'plugin' architecture for the open source drivers (in conjunction with their developers)? That, I think, would be a much more efficient use of effort!
Any ETA on the 3D documentation?
We will be actively supporting open source developers working on Gallium and TTM, and would like to see open source drivers for AMD graphics adopting new technologies as quickly as possible.
The fglrx driver was originally aimed at workstation users, where performance and ISV certification are critical but the apps and distro versions are tightly constrained. I believe proprietary drivers will remain the solution of choice for workstation, as well as for gaming and possibly for high-end video. Open source is ideal when you need to deal with a broad range of distros and apps, particularly things which are either still in development or which just popped out a week or three ago, as long as the open source driver is kept fairly simple so that it *can* be fixed quickly.
The only problem I see with a plug-in (typically people talk about proprietary 3d/video on open source 2d/kernel) is that if we ever want to play protected video we're going to need a proprietary driver right down to the hardware in order to meet copy protection aka Digital Rights Management requirements. I know nobody cares about DRM today but I also hear everyone wanting Linux desktop usage to grow dramatically, and I don't think you can have broad desktop acceptance without the ability to play DVD and HD legally. You won't see OEMs embrace Linux on the desktop without legal DVD/BR/HD-DVD and you need OEM SKUs if you ever want to see serious market adoption. Even display drivers have to be proprietary if you are serious about DRM, since the content has to be protected all the way to the frame buffer.
I think open source drivers will be the norm for most out-of-box desktop users (except for workstation), while anyone serious about getting the most gaming performance or video capabilities will upgrade to the proprietary driver. OEMs selling desktop consumer systems with Linux pre-installed will generally go for the proprietary drivers in order to get all the bells and whistles, but there will be cases where the OEM works closely with a major distro for support and in that case open source drivers can work out fine.
Anyone wanting to do development or testing with the latest unreleased (or just released last week) kernels or distros will also want to stay with an open source driver, since there's a good chance something will need to be tweaked to line up with the latest tweaks in the OS.
We are trying to achieve two things with open source drivers:
- empower the distributions to provide a complete and high-quality "out of box" end user experience
- ensure that drivers for our graphics products can always keep pace with changes in the overall OS/desktop/app environment (i.e. "no user left behind")
"Official" schedule for initial 3d is 1Q08, but I want to see developers starting to work on 3d before the end of 2007 if at all possible. The 690 integrated part should be able to use existing code largely unchanged, but more work will be needed for 5xx and 6xx discrete parts.Last edited by bridgman; 24 November 2007, 04:38 PM.Test signature
thanks for the very quick answer!
Originally posted by bridgman: We will be actively supporting open source developers working on Gallium and TTM, and would like to see open source drivers for AMD graphics adopting new technologies as quickly as possible.
Originally posted by bridgman: The fglrx driver was originally aimed at workstation users, where performance and ISV certification are critical but the apps and distro versions are tightly constrained. I believe proprietary drivers will remain the solution of choice for workstation, as well as for gaming and possibly for high-end video. Open source is ideal when you need to deal with a broad range of distros and apps, particularly things which are either still in development or which just popped out a week or three ago, as long as the open source driver is kept fairly simple so that it *can* be fixed quickly.
I also do not see what is so different between a normal driver and a gaming driver or a high-end video driver. Also, performance: I bet you a truckload of money that once 3D works, the community will start tuning and tweaking until it has squeezed every last bit of performance out of the hardware and driver that it can...
The driver will almost automatically be much simpler when G3D is used.
Originally posted by bridgman: The only problem I see with a plug-in (typically people talk about proprietary 3d/video on open source 2d/kernel) is that if we ever want to play protected video we're going to need a proprietary driver right down to the hardware in order to meet copy protection aka Digital Rights Management requirements. I know nobody cares about DRM today but I also hear everyone wanting Linux desktop usage to grow dramatically, and I don't think you can have broad desktop acceptance without the ability to play DVD and HD legally. You won't see OEMs embrace Linux on the desktop without legal DVD/BR/HD-DVD and you need OEM SKUs if you ever want to see serious market adoption. Even display drivers have to be proprietary if you are serious about DRM, since the content has to be protected all the way to the frame buffer.
Once the community knows how the hardware works, and given that the protected content goes through the framebuffer, it would be fairly simple to just grab the buffer for every frame, thus defeating the protection.
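That argument can be illustrated with a deliberately simplified toy model: however strongly content is encrypted on the way in, the player must eventually put decoded pixels into the framebuffer, and an open driver stack lets anyone read that memory back. The XOR "cipher" below stands in for real content protection and is obviously not secure:

```python
# Toy illustration of the "grab the framebuffer" argument: content is
# protected in transit, but playback necessarily decodes it into plain
# pixels that an open driver stack can read back.

KEY = 0x5A

def encrypt(frame: bytes) -> bytes:
    """What the content pipeline ships (stand-in for real protection)."""
    return bytes(b ^ KEY for b in frame)

def play(ciphertext: bytes, framebuffer: bytearray) -> None:
    """The player decrypts and scans decoded pixels out to the framebuffer."""
    framebuffer[:] = bytes(b ^ KEY for b in ciphertext)

def capture(framebuffer: bytearray) -> bytes:
    """With open hardware specs, reading the framebuffer back is trivial."""
    return bytes(framebuffer)
```

However elaborate the transit encryption, the capture step sees only plaintext, which is the point being made above: protection ends where the visible pixels begin.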
We all know DRM is pointless (BD+ cracked before even being released, etc.), so in the long run it will go extinct. From an engineering standpoint, any encryption scheme that needs to be fast on decryption is weak and can easily be attacked; any good engineer knows that. Besides, it's not 'Rights' but 'Restrictions' ;-)
Originally posted by bridgman: I think open source drivers will be the norm for most out-of-box desktop users (except for workstation), while anyone serious about getting the most gaming performance or video capabilities will upgrade to the proprietary driver. OEMs selling desktop consumer systems with Linux pre-installed will generally go for the proprietary drivers in order to get all the bells and whistles, but there will be cases where the OEM works closely with a major distro for support and in that case open source drivers can work out fine.
Machines should make our lives easier, not turn everybody into a sysadmin. I see this attitude around me more and more...
Originally posted by bridgman: Anyone wanting to do development or testing with the latest unreleased (or just released last week) kernels or distros will also want to stay with an open source driver, since there's a good chance something will need to be tweaked to line up with the latest tweaks in the OS.
Originally posted by bridgman: We are trying to achieve two things with open source drivers:
- empower the distributions to provide a complete and high-quality "out of box" end user experience
- ensure that drivers for our graphics products can always keep pace with changes in the overall OS/desktop/app environment (i.e. "no user left behind")
Originally posted by bridgman: "Official" schedule for initial 3d is 1Q08, but I want to see developers starting to work on 3d before the end of 2007 if at all possible. The 690 integrated part should be able to use existing code largely unchanged, but more work will be needed for 5xx and 6xx discrete parts.
So does this mean that some 3d engine specs might be available before the end of the year? I hope so.
There seems to be quite a bit of enthusiasm behind this, and the RadeonHD devs seem to be doing a good job of making sure everything works right. If there could be a stable driver with EXA and AIGLX by the end of 2008, that would be an enormous achievement. I am willing to help in any way I can, but my understanding of how graphics hardware functions is extremely limited (I keep running into new acronyms I need to look up, like TTM, LVDS, GART, etc.). I can code, though, so if there is any way to learn about this and contribute to driver development, I'd like to do it.
Originally posted by fhuberts: I also do not see what is so different between a normal driver and a gaming driver or a high-end video driver. Also, performance: I bet you a truckload of money that once 3D works, the community will start tuning and tweaking until it has squeezed every last bit of performance out of the hardware and driver that it can...
What I think you will see from open source 3d is a clean, elegant implementation that runs pretty well with everything and has maybe 50-70% of the proprietary driver performance on the hottest apps. There's a big difference between what the community *can* do and what they *will* do.
Now... will the open source driver be good enough for most people? Absolutely. I expect it will also need less tweaking and bug fixing as new apps come out, but if you want to match the performance of the Windows driver I don't see any alternative to a proprietary driver. Maybe I'll be wrong... nobody knows for sure.
Originally posted by fhuberts: Once the community knows how the hardware works, and given that the protected content goes through the framebuffer, it would be fairly simple to just grab the buffer for every frame, thus defeating the protection.
It sucks, I know.
Last edited by bridgman; 24 November 2007, 05:52 PM.