Where is the fglrx documentation?

  • #21
    Originally posted by oblivious_maximus
    voltaic, I don't think you need to worry about coming off as antagonizing or bitter, if anyone in this thread has to worry about that, it's me. Your post was quite polite I thought.
    I rarely post on forums since it usually gets long-winded... I wear a lot of asbestos, but for the parts of a discussion that will just result in a flamewar, I tend to trim those out with an ellipsis (...). No offence intended, but it is a defensive measure to ensure that flamewars don't ensue. (I reserve the right, as anyone does, to not answer questions that are lost battles, competitively sensitive, or just plain wrong. I tend to ignore overuse of strong adjectives below.)

    So xorg.conf is for configuring X, but not for configuring ATI's driver for X?? How does that make any sense? Maybe this would be acceptable if the alternative tools ATI is providing weren't as woefully inadequate as fglrx itself. At one point, starting CCCLE with my svideo plugged in resulted in my system locking up. Attempting to use what I could glean from "aticonfig --help" was totally useless.
    First, our use of xorg.conf is for configuring the X Window System. We have decided to move the configuration of our driver outside of X.

    There are a number of fundamental issues with using xorg.conf for driver configuration.
    1. X treats xorg.conf as read-only; aticonfig, ccc-le, and other internal subcomponents need to make configuration changes at runtime. This is active now.
    2. Runtime handover between CCC-LE and OGL is not possible with xorg.conf. You cannot adjust AA with xorg.conf and get OGL to pick it up. This is active now.
    3. The xorg.conf is per PCI-Bus ID only. This does not allow for per-device-ID configuration. This is coming in the future.
    4. xorg.conf does not handle multiple ASIC configurations (like crossfire) cleanly. As per Michael's article, this is coming in the future.
    5. The number of options that the driver is capable of responding to makes using xorg.conf for them nonsensical.
    6. There is a propensity for users to pick up random configuration options and add them to a config file, even if they do absolutely nothing to the driver. This is primarily because xorg.conf is traditionally hand-hacked.
    7. The xorg community is increasingly moving to a "config-free" xorg.conf [which validates our long-standing "it should just work" model].


    There are many other reasons for moving away from xorg.conf for driver options, but some of those I am not willing to discuss publicly.

    Users are free to play with amdpcsdb through the PCS options of aticonfig as much as they are free to play with registry settings under Windows. Both of these "configuration" techniques are unsupported and may put the driver into a bad state. The state that the driver can be put into via the UIs is the only supported[1] mode of operation.

    [1] "Supported" in this context means that misbehaviours configurable through the UI are the only issues that can be considered bugs.
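
    For the curious, "playing with" amdpcsdb through aticonfig looks roughly like the sketch below. This is an unsupported illustration only: the exact PCS flags should be confirmed against aticonfig --help for your release, and the path/key names here are made-up placeholders, not real driver settings.
    Code:
    # see which PCS (Personal Configuration Store) options your build exposes
    aticonfig --help | grep -i pcs
    # read a key (path and key are hypothetical placeholders)
    sudo aticonfig --get-pcs-key=SomePath,SomeKey
    # write a string value -- unsupported, may leave the driver in a bad state
    sudo aticonfig --set-pcs-str=SomePath,SomeKey,somevalue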


    For example:
    Code:
    sudo aticonfig --initial=dual-head --screen-layout=right --tv-standard-type=VIDEO --tv-format-type=NTSC-M
    Should this have set up my TV as half my desktop? If it should have, it didn't. And because there's basically zero documentation on anything, I have no idea why, or what I need to add to that to make it actually work. If I run another aticonfig command with other options, will it add them to the ones I already set, or start over? I JUST DON'T KNOW because there's no docs.

    Code:
    sudo aticonfig --enable-monitor=tv
    That actually did something, and would probably be a nice option to use, only it didn't output a useful display on my TV, only garbage maladjusted-vertical-hold-type distortion. And when I enabled my CRT again (through the distortion, using bash history), it was running at 640x480 and not 1600x1200. Again, no documentation, so no idea why it didn't quite work or what I need to do about it. I also tried to use CCCLE (it only locked up my computer the one time, thankfully) to set this up, even though it doesn't even support the way I want to use my TV. I tried to set up a clone mode with the TV at 640x480 and my monitor at 1600x1200. Guess what? With CCCLE, they both have to use the same resolution for some bonkers reason. Guess what else? xorg.conf does this just fine without issue.
    I understand and appreciate your position. The cost of configuration tools is that they abstract and simplify the native capability that the driver may have. Feel free to hack the amdpcsdb as much as you want, in an unsupported manner.

    Only since XOrg picked up RANDR 1.2 does this capability exist. Prior to that you had differing implementations of mergedfb/bigdesktop-pairmodes/twinview. Even now RANDR 1.2 does not solve all the problems (persistence, multiple GPUs, etc).
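
    As an aside, the RANDR 1.2 way of doing mixed-resolution multi-monitor looks roughly like the sketch below (output names vary by driver and card; check what plain xrandr reports first). This is only an illustration of the XOrg capability being discussed, not something the proprietary driver currently supports.
    Code:
    # list outputs and the modes each one advertises
    xrandr
    # example: desktop on the CRT, TV to its right at a lower resolution
    # ("DVI-0" and "S-video" are placeholder names -- use the ones xrandr lists)
    xrandr --output DVI-0 --mode 1600x1200 --output S-video --mode 800x600 --right-of DVI-0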

    ...
    "If people want to start documenting amdpcsdb options" ?? Are you serious?? If "people" want to? Why doesn't ATI document the options??
    The configuration options are prone to change. AMD does not warrant that one configuration option will have the same semantic effect from one release to the next; we only warrant that the user interfaces (aticonfig/amdcccle) will remain consistent. That is the model that we have chosen and enact.

    Of course, when we go through major internal changes, we regress in some areas. That is one reason that we release on a regular tempo - it provides users with a large selection of drivers, so they can choose to hold on to one where their particular feature set of interest is most stable.

    Utter disappointment is an understatement at this point. ATI has sucked the will to tinker right out of me. I want to keep going, but ATI has also just about sucked the will to seek assistance and explanation out of me too. I must apologize if I have been overly harsh or antagonizing or insulting; I try really hard to remain detached when discussing this stuff, but I've got all sorts of frustration bombarding me from the back of my mind whenever I even think about the ATI cards that I desperately want to be using, but which are sitting on a shelf gathering dust. So yeah, if I'm harsh, it's only out of love ... and a lot of frustration.
    Acknowledged, but not responded to.

    OT:
    And PowerPlay doesn't work if you have more than one display connected?? WTF??? Seriously, what's up with that? Where's the doc explaining how that works and in what (ridiculous) circumstances it does not work? So if someone wants to actually use all the outputs on the back of their video card, they can expect the much-lauded power saving features to simply not function? How is that acceptable?
    Again, there are reasons that I don't always post...

    For this case here, the selection of MCLK results in a finite amount of memory bandwidth. All clients (3D, 2D, CRTC) fight for that bandwidth. In general the amount of bandwidth available when a system is clocked down is *just* sufficient for 3D, 2D and 1 CRTC to operate without corruption.

    Most users have minimal understanding of what I just wrote above, so consequently rather than trying to explain the caveats on each and every configuration option or behaviour, the user is told that what they have asked for is not possible.
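
    As a rough back-of-the-envelope illustration (the numbers below are illustrative only, not official figures): scanout bandwidth for one CRTC is roughly width x height x bytes-per-pixel x refresh rate, before 2D or 3D have done anything at all.
    Code:
    # 1600x1200 @ 32bpp @ 85 Hz -- bytes per second just to keep one CRTC refreshed
    echo $((1600 * 1200 * 4 * 85))   # ~653 MB/s
    # add a second CRTC driving a TV at 640x480 @ 32bpp @ 60 Hz
    echo $((640 * 480 * 4 * 60))     # ~74 MB/s more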

    The options are
    1. Have a feature with limitations hard coded and the user must trust that the driver has done the best that is possible,
    2. not enable the feature, or
    3. enable the feature without limitations and let the users run into all sorts of problems.


    We opted for 1).

    Regards,

    Matthew



    • #22
      Thanks for being so patient/thick-skinned, Matthew.

      Regarding powerplay, I basically understand what you're saying about the power management, but I'm sure I'm among "most users" in that it is a minimal understanding. Probably this will be an illustration of that, but the way I've been using tvout since I started using Linux, I only ever use one display at a time: there's no output on the TV when I'm not using it, and when I am using it, my monitor shuts off, until I finish using my TV and its empty Xserver is killed. The TV goes on VT8, my desktop session stays on the usual VT7, and neither is ever active at the same time. So I guess what I'm saying is, why should the mere presence of a connected svideo cable (with no TV on (the video card doesn't know if the TV is on or off though, does it?), and no active X displays being transmitted through the cable) preclude the use of the power management features? I have a really low opinion of Nvidia, but they can manage this just fine (though they can't seem to manage putting my monitor into standby mode at all). *shrugs* Going to want an 'after-market' cooling solution eventually anyway I suppose, but so much for the power savings of a 45W CPU. *shrugs again*

      Only since XOrg picked up RANDR 1.2 does this capability exist.
      Is that in reference to multiple monitors/screens with different resolutions? Or multiple monitors with separate Xservers running on them? If it's either, I'm afraid that just isn't true (maybe you just meant with ATI hardware/fglrx?); I was doing both of those things long before RANDR 1.2, just using X.org's builtin functionality (which Nvidia, as lacklustre as their driver is, supports just fine). I'm not sure that is what you meant, however.

      I don't have any problem with fglrx having its own set of config tools/files, and I certainly agree that being able to manage many/all X/fglrx settings on the fly that would otherwise require restarting X is a good thing (a great thing). I just think it's folly to entirely abandon basically all support for xorg.conf, and also to simply say that basic features/functionality (eg: multiple independent displays/Xservers) of X.org aren't supported by fglrx, a driver for X.org. I think at the least xorg.conf should be parsed at startup and whatever configuration is present (if any) should be used. fglrx/cccle/aticonfig can/should take the xorg.conf and use it as the basis for their runtime config changes. And if X.org supports it, so should fglrx. I know X.org can now do pretty much entirely without any content in xorg.conf and function just fine, but if you decide to use it, X.org still reads and uses the configuration you set up. Can one even run "X :1" to launch a new Xserver on VT8, using the same single screen, with fglrx? Can you do this? Unless the answer to either is yes, those are rhetorical questions that hopefully illustrate my point about supporting X features.
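
      For reference, what I mean by that is nothing exotic, just the stock Xserver invocation (the display and VT numbers are only the usual examples):
      Code:
      # a second bare X server on display :1, virtual terminal 8
      X :1 vt8 &
      # or with a session started on it
      startx -- :1 vt8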

      As for documentation, take a look at Nvidia's for an example of what ATI is sorely lacking. I realize a lot of their readme details xorg.conf options, and that's not what you guys/gals want to use, but if you had a document that describes for us simple-minded users what can be done and how, even with your anti-X.org methods, I would not be complaining about this at all, I would have just studied the document. "aticonfig --help" may be understandable to you (and it is somewhat to me), but it's really lacking overall, as far as telling the user what options are needed to accomplish what use-case scenario. And if you look here, you'll notice how they have a README for every driver release. ATI needs to do something similar, regardless of its intentions to make it all just work and to have the user never need to edit a config file. I can set up multiple separate and/or independent displays in xorg.conf pretty easily, but I couldn't figure out how to properly do the same with aticonfig or CCCLE. The difference being available, almost-entirely-understandable-by-mere-mortals documentation.

      Thanks again for your patience and thick skin, Matthew (I haven't been trying to get an ATI card working today, so you probably don't need either to read this message), not to mention just being here and working to help us try to understand ATI's peculiar (imnsho) choices.



      • #23
        Originally posted by oblivious_maximus
        Thanks for being so patient/thick-skinned, Matthew.

        Regarding powerplay, I basically understand what you're saying about the power management, but I'm sure I'm among "most users" in that it is a minimal understanding. Probably this will be an illustration of that, but the way I've been using tvout since I started using Linux, I only ever use one display at a time: there's no output on the TV when I'm not using it, and when I am using it, my monitor shuts off, until I finish using my TV and its empty Xserver is killed. The TV goes on VT8, my desktop session stays on the usual VT7, and neither is ever active at the same time. So I guess what I'm saying is, why should the mere presence of a connected svideo cable (with no TV on (the video card doesn't know if the TV is on or off though, does it?), and no active X displays being transmitted through the cable) preclude the use of the power management features? I have a really low opinion of Nvidia, but they can manage this just fine (though they can't seem to manage putting my monitor into standby mode at all). *shrugs* Going to want an 'after-market' cooling solution eventually anyway I suppose, but so much for the power savings of a 45W CPU. *shrugs again*
        If the CRTC is active (independent of the actual display), it will use the memory bandwidth I was talking about before.

        The simplest solution for you is to use aticonfig to switch between monitors. When you want to watch on the TV:

        aticonfig --enable-monitor=tv

        and when you want to use the normal display, you can switch back with:

        aticonfig --enable-monitor=crt

        You can see this and other related options in the aticonfig help page.
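
        (If your aticonfig build exposes it -- check aticonfig --help to confirm, as the exact flag here is an assumption -- a query like the one below shows which monitor names (crt1, lvds, tv, ...) are connected and enabled, which helps in picking the right argument.)
        Code:
        aticonfig --query-monitor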

        Is that in reference to multiple monitors/screens with different resolutions? Or multiple monitors with separate Xservers running on them? If it's either I'm afraid that just isn't true (maybe you just meant with ATI hardware /fglrx?), I was doing both of those things long before RANDR1.2 just using X.org's builtin functionality (that Nvidia, as lacklustre as their driver is, supports just fine). I'm not sure that is what you meant however.
        One X server, one X screen, two monitors. The options are the custom solutions (mergedfb/twinview/big-desktop), or XRANDR 1.2 (which the proprietary driver doesn't currently support).

        Two X screens should be working fine, but they are unaware of each other (and there are other tradeoffs). You can turn Xinerama on to have a unified desktop; currently only NVidia supports that with 3D.

        Two X servers is moving into a very niche configuration area. We do occasionally get this working, but we don't heavily test it, and it usually breaks when we are doing invasive changes (like adding Crossfire support - as per Michael's 4850 article).
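
        For reference, the "two X screens on one server" arrangement being described is the classic xorg.conf layout sketched below (the section identifiers are placeholders; each Screen still needs its own Screen/Device/Monitor sections elsewhere in the file):
        Code:
        Section "ServerLayout"
            Identifier "TwoScreens"
            Screen     0 "Screen-CRT" 0 0
            Screen     1 "Screen-TV"  RightOf "Screen-CRT"
            # Option "Xinerama" "on"   # unified desktop, with the tradeoffs noted above
        EndSection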


        I don't have any problem with fglrx having its own set of config tools/files, and I certainly agree that being able to manage many/all X/fglrx settings on the fly that would otherwise require restarting X is a good thing (a great thing). I just think it's folly to entirely abandon basically all support for xorg.conf, and also to simply say that basic features/functionality (eg: multiple independent displays/Xservers) of X.org aren't supported by fglrx, a driver for X.org.

        I think at the least xorg.conf should be parsed at startup and whatever configuration is present (if any) should be used.


        We don't ignore the X configuration file; we just ignore (or will in the future) the Driver section for fglrx. All other parts of the xorg.conf we usually honor, if they are within the capabilities of the driver.


        fglrx/cccle/aticonfig can/should take the xorg.conf and use it as the basis for their runtime config changes. And if X.org supports it, so should fglrx.

        I know X.org can now do pretty much entirely without any content in xorg.conf and function just fine, but if you decide to use it, X.org still reads and uses the configuration you set up.
        As stated before, our model is that xorg.conf is for configuring X, and not for configuring the driver.


        Can one even run "X :1" to launch a new Xserver on VT8, using the same single screen, with fglrx? Can you do this? Unless the answer to either is yes, those are rhetorical questions that hopefully illustrate my point about supporting X features.
        Does this work reliably with all drivers - or just NV? Does the Open Source radeon driver do this as well?

        Either way, the proprietary driver does not currently support this featureset.

        ...
        Regards,

        Matthew



        • #24
          Of course an xorg.conf with fewer options is easier to handle, but what do you do when you really NEED extra options, like Modelines for 1152x864@100Hz or 1280x960@85Hz, because a CRT never reports those special but very useful resolutions? Every other driver handles this, but when you try it with fglrx and Xorg 7.1.1 -> Xorg crash. Also, your amdcccle is crap: it allows setting 1152x864 at 100 Hz, but it is really 75 Hz when you check it. For basic settings like resolution and refresh rate no 3rd-party tool should be used - be standards-conformant and accept the options every other driver knows. For your special cases you can use whatever config tool you like, but don't forget that there are still users without TFT monitors, and CRT users like to use non-standard resolutions.
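
          For anyone hitting the same thing, the usual driver-independent workaround is to generate the modeline yourself and paste it into the Monitor section of xorg.conf, much as the next post does. A minimal sketch, assuming the standard cvt (or gtf) utility is installed; the numbers it prints are what go into the config:
          Code:
          # print a modeline for 1152x864 at 100 Hz
          cvt 1152 864 100
          # older tool, same idea
          gtf 1152 864 100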



          • #25
            Originally posted by voltaic
            ...
            Wow, thanks for stating what I've been trying to say for ages now. Those are the exact problems I have with a laptop and external monitor.

            Resolutions and refresh rates are never detected correctly. Only yesterday was I able to actually make my external display refresh at 75 Hz, by editing the xorg.conf manually, of course. The aticonfig options never work! AMD/ATi should be happy I won't go blind now.

            I had to add Modeline "1440x900_75.00" 136.49 1440 1536 1688 1936 900 901 904 940 -HSync +Vsync at the bottom of my Monitor section, while having the internal display off and forcing CRT1 to be always on.
            This way the resolution and refresh rate for the external display will always be correct, because, as you noticed, fglrx won't detect the settings correctly even if you simply turn off the internal display and force the external one, without manual tweaking.
            This will make the Option "Mode2" "1440x900" in the Device section unneeded, I believe.
            Thank god, if you disconnect the external display and only start the internal one, fglrx will kick in its normal resolution.
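
            Roughly, so the pieces fit together, the Monitor section ends up looking like the sketch below (the identifier is a placeholder; use whatever your xorg.conf already calls the external monitor):
            Code:
            Section "Monitor"
                Identifier "ExternalMonitor"
                # modeline for 1440x900 @ 75 Hz, as above (or generate your own with cvt/gtf)
                Modeline "1440x900_75.00" 136.49 1440 1536 1688 1936 900 901 904 940 -HSync +Vsync
            EndSection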

            But for instance, this sort of xorg.conf brings me other problems which are pointless to mention here, since I'll be told that my xorg.conf is not what fglrx wants it to be.

            And before somebody asks me, this is not an EDID problem from the side of my monitor.



            • #26
              Originally posted by mtippett
              If the CRTC is active (independent of the actual display), it will use the memory bandwidth I was talking about before.

              The most simple solution for you is to use aticonfig to switch between monitors.

              aticonfig --enable-monitor=tv
              ...
              I guess if I ever decide it's worth my sanity using fglrx again before radeonhd is viable for my purposes, I won't have much choice and will have to try to get that to actually work, and not just present me with vertical hold problems. Which brings me back to the original topic - as I can't seem to find any docs on what's required apart from --enable-monitor=tv to get it to work properly, I fear I will not decide it is worth attempting with fglrx. Something tells me that once radeonhd is viable for my purposes, I'll be able to continue doing things the same way I have been for yonks now. (edit: or at least use RANDR to do something similar)

              Originally posted by mtippett
              Two X servers is moving into a very niche configuration area. We do occasionally get this working, but don't heavily test and it usually breaks when we are doing invasive changes (like adding Crossfire support - as per Michael's 4850 article).
              Niche or not (I say NOT), the fact is, as I keep saying, it's a standard feature of the Xserver. Take a look at WINE's appdb for some games; it won't be long before you see someone recommending people run their game in a separate Xserver from their desktop session. In KDE, right on the KMenu, if you go to "Switch User" --> "Start New Session", that is no different from running "X :1" (apart from KDE being loaded too, that is), and I'd hardly call that a niche. More like a standard Xserver feature that KDE (and GNOME?) depends on for (in Windows-speak) 'fast user switching'.
              Originally posted by Xserver manpage
              :displaynumber
              The X server runs as the given displaynumber, which by default is 0. If multiple X servers are to run simultaneously on a host, each must have a unique display number. See the DISPLAY NAMES section of the X(7) manual page to learn how to specify which display number clients should try to use.
              Originally posted by mtippett
              Does this work reliably with all drivers - or just NV? Does the Open Source radeon driver do this as well?
              Next time I plug in the ATI card I will try it and see (with radeon or radeonhd), but I suspect that every functional X driver apart from fglrx supports that behaviour - because it's a standard feature of the Xserver, and I expect proper drivers for X to support it just fine. I think you'll find that that six-headed system setup uses nvidia cards because they actually support X's features. I have no doubt that if Intel was selling discrete graphics parts, that howto would apply to them as well.

              Originally posted by mtippett
              Either way, the proprietary driver does not currently support this featureset.
              And that's part of why I'm still using the card made by your arch rival, which I've wanted to get rid of pretty much since the day I bought it.

              I think I'm just going to have to agree to disagree here. If you keep responding, I suspect there's always going to be something about your (read: ATI's) response I'm going to take issue with, certainly as long as ATI is failing to support X's feature set with fglrx. That said, AMD rocks for their openness, and while I am totally annoyed with fglrx (not to mention the PowerPlay defect... oops, I just did), I don't yet regret having bought 2 ATI cards in the last few months, even though I haven't been able to actually use them yet.



              • #27
                Realistically, I think you're going to find that all drivers support multiple X servers just fine until DRI (and the associated DRM kernel module) comes into the picture; then it becomes a lot more iffy.

                As a trivial example, I believe the radeon (-ati) driver can fully accelerate both screens if you are running a single X server, but can only accelerate the primary screen when running two servers on the same card. Running multiple X servers on multiple cards is usually less of a problem than running multiple servers on the same card.
                Last edited by bridgman; 21 June 2008, 06:53 PM.



                • #28
                  I am certain that fglrx at least USED to support having more than one X server started. On my laptop with a Radeon 9700 Pro, before I changed to r300/radeon, I have done it to start an extra X session for XDMCP.

                  edit:
                  this is with the monitor being the SAME on both X servers, with only one X server "active" at a time.
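
                  For context, the sort of invocation being described looks like the sketch below ("somehost" is a placeholder for whatever machine runs the XDMCP display manager):
                  Code:
                  # second X server on :1 / VT8, pulling a graphical login over XDMCP
                  X :1 vt8 -query somehost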



                  • #29
                    Probably. I read Matthew's email as saying "it works a lot of the time but we don't treat multiple servers on one card as something we support and test".

                    My understanding was that in general the need for multiple X servers on the same card was a vestige of the "pre-3D days" and was going away, but if that is not the case then we'll need to deal with it somehow.



                    • #30
                      Well, it's used. For instance, having two graphical logins on one computer uses it. People connecting via XDMCP (like me) use it.

