Where is the fglrx documentation?


  • #16
    Originally posted by SheeEttin View Post
    I was just wondering the same thing, so I probed fglrx_drv.so (the driver binary) (Catalyst 8.3) for options. After a little filtering, I got this list. Unfortunately, that's only options, no explanation.

    (For the curious, here's the command I used:
    Code:
    strings /usr/lib/xorg/modules/drivers/fglrx_drv.so | grep -nE "[a-zA-Z]{3,}"
    The options, for me, started on line 11534.)
    ...line 76 "EnableMultiCard"... Could this be CrossFire? So would I just add this under the Device section in my xorg.conf as something like

    Option "EnableMultiCard"

    and then run the command to have it reread into the database? What was that command?
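    If it is a simple boolean driver option, the usual xorg.conf form would presumably be something like the following - a guess only, since the option name was just pulled out of the binary with strings and its accepted values and effect are unknown:
    Code:
    Section "Device"
        Identifier "ATI Card"
        Driver     "fglrx"
        # unverified option found via strings; the value is a guess
        Option     "EnableMultiCard" "true"
    EndSection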



    • #17
      I have to agree with oblivious_maximus here. Even though the drivers are getting better with each release, fglrx still has such basic features missing or broken that I can't help but feel aggravated.

      For instance, according to aticonfig, PowerPlay features will not work with a dual screen configuration. Great. It says it's a hardware limitation. So I can deal with that. When I go to turn off the laptop display (LVDS) and only use CRT1, fglrx cannot figure out what the proper resolution of the external display is. And of course, like a reasonable person, I want to do this dynamically. I don't want to change my xorg.conf every time I want to switch between two displays. Well, last night I spent two hours trying to figure out how to get fglrx to just, simply, output to CRT1 at the correct resolution, while keeping LVDS off. And for the life of me I could not get such a basic feature to work.

      The worst part is, after I started messing with aticonfig's dynamic options, the driver would no longer output anything above 1024x768. When I noticed this, I said, well OK I will revert to having both monitors enabled. Logical, right? This only made things worse, as now both monitors would run at incorrect resolutions! In fact, I noticed that at this point the driver would insist that the 22" external monitor was only capable of outputting 1024x768 (EDID from Xorg log). Restarting X or switching to a backed up xorg.conf would do absolutely nothing to change that. This is astounding to me. When I change a "dynamic" option, I expect the driver to make an on-the-fly change, not a permanent one. It turns out aticonfig --enable-monitor=CRT1, somehow, writes to a file that is not xorg.conf. I know this because I don't have write permission to xorg.conf as a user. This took me a long time to figure out. I had to go and remove everything in /etc/ati and reinstall the driver just to get my external monitor to display at the correct resolution. By the way, nowhere have I found this "workaround" documented.

      On another note, I disagree with the point that a proprietary driver does not need as much documentation as an open one. If the driver is incapable of figuring out the correct resolution of the display(s) it's driving then it better be documented well enough that I can configure it properly without pulling my hair out for two hours. Case in point, even searching for hours on Phoronix forums would not tell me that in order to get video playback on my external display, I would need the following three options:

      Code:
      Option      "OverlayOnCRTC2" "1"
      Option      "VideoOverlay" "on"
      Option      "Mode2" "1680x1050"
      I don't know what OverlayOnCRTC2 is supposed to mean. Do they mean CRT1? Because that's how the external display is referenced in aticonfig (0 is LVDS, 1 is CRT1). Why do I need to specify Mode2? Doesn't xorg.conf have a section, aptly labeled "Screen", for settings such as resolution? I can't answer those questions, but I know from experimentation that no matter how many modelines or Modes I specify under the "Screen" section of xorg.conf fglrx will not respect my configuration. And why do we need aticonfig to do the modesetting and dynamic changes when there is already a perfectly good alternative such as XRandr? I would be understanding if AMD did not want to interface their proprietary drivers with open source extensions for fear of divulging the inner workings of their drivers. But they should at least provide something that is equivalent to or better in basic functionality and ease of use (see XRandr).
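      For comparison, with a RandR-capable driver the dynamic switch described above would be a single command along these lines (just a sketch; output names such as LVDS and VGA-0 vary by driver and hardware):
      Code:
      # turn the panel off and drive only the external display at its native mode
      xrandr --output LVDS --off --output VGA-0 --mode 1680x1050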

      I'm sorry if my rant sounds antagonizing/bitter. I'm not trying to blame anyone in particular, and I appreciate AMD's efforts in improving fglrx, but I do also want to make it clear that I'm frustrated and disappointed with the driver support. With each release I read about all the great work that's going into fglrx, but on some level I really think the effort is misdirected. Maybe a lot of people do, but I really don't care if fglrx supports CrossFire, or can run Crysis at 100fps, while it simply cannot do the basics such as dynamic frequency scaling or automatic resolution detection and display switching. Compared to how easily and seamlessly the Intel drivers work on my other laptop, I kind of regret purchasing my ThinkPad T43 which came with a Mobility X300.

      I sincerely hope that someone from AMD gets to read this, because I think this is how a lot of GNU/Linux users are feeling about AMD/ATI. Just my two cents.

      EDIT: I just saw Matthew's comment, which sort of clarifies some of the questions I was asking. I just wish this sort of a design statement was accessible outside of the forums. If the driver team is going for a "just works" configuration, then as a user I should have a way of knowing that. I now realize that I've been wasting all my time configuring xorg.conf. In fact, I just configured everything properly using amdcccle; frequency scaling seems to work fine. Perhaps my frustration was not due to the deficiencies of the drivers, but rather a result of a lack of communication and official documentation. This I guess brings us back on topic.
      Last edited by voltaic; 06-20-2008, 01:19 AM.



      • #18
        Originally posted by voltaic View Post
        EDIT: I just saw Matthew's comment, which sort of clarifies some of the questions I was asking. I just wish this sort of a design statement was accessible outside of the forums. If the driver team is going for a "just works" configuration, then as a user I should have a way of knowing that. I now realize that I've been wasting all my time configuring xorg.conf. In fact, I just configured everything properly using amdcccle; frequency scaling seems to work fine. Perhaps my frustration was not due to the deficiencies of the drivers, but rather a result of a lack of communication and official documentation. This I guess brings us back on topic.
        After reading it myself, I took a look at /etc/ati/amdpcsdb, and it's even more cryptic than a bad xorg.conf.

        Any chance we could get some documentation for this? It might help those of us with partially- or non-working setups (and supporting more customers is better for everyone, right?).



        • #19
          voltaic, I don't think you need to worry about coming off as antagonizing or bitter, if anyone in this thread has to worry about that, it's me. Your post was quite polite I thought.

          Matthew: thank you for your informative reply. I'm afraid it just increased my utter disappointment however.

          So xorg.conf is for configuring X, but not for configuring ATI's driver for X?? How does that make any sense? Maybe this would be acceptable if the alternative tools ATI is providing weren't as woefully inadequate as fglrx itself. At one point, starting CCCLE with my svideo plugged in resulted in my system locking up. Attempting to use what I could glean from "aticonfig --help" was totally useless. For example:
          Code:
          sudo aticonfig --initial=dual-head --screen-layout=right --tv-standard-type=VIDEO --tv-format-type=NTSC-M
          Should this have set up my TV as half my desktop? If it should have, it didn't. And because there's basically zero documentation on anything, I have no idea why, or what I need to add to that to make it actually work. If I run another aticonfig command with other options, will it add them to the ones I already set, or start over? I JUST DON'T KNOW because there's no docs.
          Code:
          sudo aticonfig --enable-monitor=tv
          That actually did something, and would probably be a nice option to use, only it didn't output a useful display on my TV, only garbage maladjusted-vertical-hold-type distortion. And when I enabled my CRT again (through the distortion using bash history), it was running at 640x480 and not 1600x1200. Again, no documentation, so no idea why it didn't quite work or what I need to do about it. I also tried to use CCCLE (it only locked up my computer the one time thankfully) to set this up, even though it doesn't even support the way I want to use my TV. I tried to set up a clone mode with the TV at 640x480 and my monitor at 1600x1200. Guess what? With CCCLE, they both have to use the same resolution for some bonkers reason. Guess what else? xorg.conf does this just fine without issue.

          I'm all for things just working, but the fact of the matter is that fglrx (and CCCLE for that matter) don't "just work", and for ATI to take it upon themselves to totally eschew the established standard in favour of some other, totally undocumented (apparently...) ATI-only solution that can't even offer what X.org does (like for example, a second Xserver on a second display that isn't connected to the main display in any way), well, all I can say is I'm really really glad AMD is releasing specifications and fostering the free driver (it's the only reason I didn't just buy an older Nvidia card that hopefully would not be suffering from the same feature regressions and defects that mine is).

          "If people want to start documenting amdpcsdb options" ?? Are you serious?? If "people" want to? Why doesn't ATI document the options??

          Utter disappointment is an understatement at this point. ATI has sucked the will to tinker right out of me. I want to keep going, but ATI has also just about sucked the will to seek assistance and explanation out of me. I must apologize if I have been overly harsh or antagonizing or insulting; I try really hard to remain detached when discussing this stuff, but I've got all sorts of frustration bombarding me from the back of my mind whenever I even think about the ATI cards that I desperately want to be using, but which are sitting on a shelf gathering dust. So yeah, if I'm harsh, it's only out of love ... and a lot of frustration.


          OT:
          And PowerPlay doesn't work if you have more than one display connected?? WTF??? Seriously, what's up with that? Where's the doc explaining how that works and in what (ridiculous) circumstances it does not work? So if someone wants to actually use all the outputs on the back of their video card, they can expect the much lauded power saving features to simply not function? How is that acceptable?

          edit: hit post instead of preview with edits left to do
          Last edited by oblivious_maximus; 06-20-2008, 11:16 AM.



          • #20
            Originally posted by voltaic View Post
            ....
            On another note, I disagree with the point that a proprietary driver does not need as much documentation as an open one. If the driver is incapable of figuring out the correct resolution of the display(s) it's driving then it better be documented well enough that I can configure it properly without pulling my hair out for two hours.
            The configuration options will not be documented, the configuration tools will be documented - see below.

            What I am reading from your statements is that you are looking at some sort of use-case driven documentation. My understanding is that no driver has this sort of documentation model.

            You seem to be saying,
            1. Connect Laptop to TV
            2. Dynamically switch on the TV to a resolution
            3. Move the video to that screen

            Using scenarios means piecing different options together. The number of fully described individual scenarios is ridiculously large. The Wiki or HOWTOs are a great place for people to collate their scenarios and the resultant options.

            Case in point, even searching for hours on Phoronix forums would not tell me that in order to get video playback on my external display, I would need the following three options:
            Code:
            Option      "OverlayOnCRTC2" "1"
            Option      "VideoOverlay" "on"
            Option      "Mode2" "1680x1050"
            From the aticonfig help

            Code:
              --ovon, --overlay-on={0|1}
                    Choose which head the hardware overlay should be visible on.  The
                    hardware overlay can be used for either OpenGL, video, pseudo-color
                    or stereo.
              --ovt, --overlay-type=STRING
                    Change the overlay for the X server.  STRING can be one of:
                        opengl
                        Xv
                        disable
              --mode2=W1xH1,W2XH2,W3xH3,...
                    Change the modes for the second display.  You may specify several
                    resolutions separated by commas.  Only valid for clone and big desktop
                    settings.
            Pair mode options: 
              Following options are used for query add and remove pair modes. 
              These options will be effective immediately. Other options on   
              the same command line will be ignored.
              --list-pairmode 
                    list all the current existing pair modes the driver can use.
              --add-pairmode=width0xheight0+width1xheight1
                    Add one pair mode to the list. width0 and height0 are the 
                    size of primary display and width1 and height1 for the 
                    secondary  display.
              --remove-pairmode=index 
                    Remove one pair mode from the list. User can get index by 
                    list-pairmode.
            Dynamic Display Management Options:
              Following options will not change the config file. They are
              used for querying driver, controller and adaptor information.
              These options will be effective immediately. Other options on 
              the same command line will be ignored.
              --enable-monitor=STRING,STRING
                    Setting current monitor to be enabled. Only 2 displays
                    can be enabled at the same time. Any displays
                    that are not on the list will be disabled.
                    STRING can be one of the following set, separated 
                    by commas:
                        none
                        crt1
                        crt2
                        lvds
                        tv
                        tmds1
                        tmds2
                        auto   -- use default policy to enable the displays.
              --query-monitor
                    This will return connected and enabled monitor information
              --swap-monitor
                    This only works for big desktop setup. This will swap the
                    contents on the two monitors.
              --swap-screens={on|off}
                    Enable/disable swap heads in dual-head mode.
                    This option works only in dual-head mode.
            I don't know what OverlayOnCRTC2 is supposed to mean. Do they mean CRT1? Because that's how the external display is referenced in aticonfig (0 is LVDS, 1 is CRT1). Why do I need to specify Mode2?
            As described above, don't look to understand the config options, look to understand the configuration tools. All the options are there.
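            Pieced together from the help above, the video-on-second-display case would look roughly like this (an illustrative sketch, not a verified recipe):
            Code:
            aticonfig --overlay-on=1 --overlay-type=Xv --mode2=1680x1050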

            Doesn't xorg.conf have a section, aptly labeled "Screen", for settings such as resolution? I can't answer those questions, but I know from experimentation that no matter how many modelines or Modes I specify under the "Screen" section of xorg.conf fglrx will not respect my configuration.
            And why do we need aticonfig to do the modesetting and dynamic changes when there is already a perfectly good alternative such as XRandr? I would be understanding if AMD did not want to interface their proprietary drivers with open source extensions for fear of divulging the inner workings of their drivers. But they should at least provide something that is equivalent to or better in basic functionality and ease of use (see XRandr).
            Up until RANDR1.2 (released 9 months ago, and only now (last 3 months) appearing in distributions), there was no consistent way of supporting multiple screens beyond dual-head configurations. The options above provide general capability as required.

            The pair-mode support is compatible with RandR (using a pseudo-mode that provides pseudo-Xinerama).

            Regards,

            Matthew



            • #21
              Originally posted by oblivious_maximus View Post
              voltaic, I don't think you need to worry about coming off as antagonizing or bitter, if anyone in this thread has to worry about that, it's me. Your post was quite polite I thought.
              I rarely post on forums since it usually gets long-winded... I wear a lot of asbestos, but for the parts of a discussion that will just result in a flamewar, I tend to trim that out with an ellipsis (...). No offence intended, but it is a defensive measure to ensure that flamewars don't ensue. (I reserve the right, as anyone does, not to answer questions that are lost battles, competitively sensitive, or just plain wrong.) I tend to ignore overuse of strong adjectives below.

              So xorg.conf is for configuring X, but not for configuring ATI's driver for X?? How does that make any sense? Maybe this would be acceptable if the alternative tools ATI is providing weren't as woefully inadequate as fglrx itself. At one point, starting CCCLE with my svideo plugged in resulted in my system locking up. Attempting to use what I could glean from "aticonfig --help" was totally useless.
              First, xorg.conf is for configuring the X Window System. We have decided to move the configuration of our driver outside of X.

              There are a number of fundamental issues with using xorg.conf for driver configuration.
              1. xorg.conf is read-only once X is running; aticonfig, CCC-LE and other internal subcomponents need to make configuration changes at runtime. This is active now.
              2. Runtime handover between CCC-LE and OGL is not possible with xorg.conf. You cannot adjust AA with xorg.conf and get OGL to pick it up. This is active now.
              3. The xorg.conf is per PCI-Bus ID only. This does not allow for per-device-ID configuration. This is coming in the future.
              4. xorg.conf does not handle multiple ASIC configurations (like CrossFire) cleanly. As per Michael's article, this is coming in the future.
              5. The number of options that the driver is capable of responding to makes using xorg.conf nonsensical.
              6. There is a propensity for users to pick up random configuration options and add them to a config file, even if they do absolutely nothing to the driver. This is primarily because xorg.conf is traditionally hand-hacked.
              7. The xorg community is increasingly moving to a "config-free" xorg.conf [which validates our long-standing "it should just work" model].

              There are many other reasons for moving away from xorg.conf for driver options, but some of those I am not willing to discuss publicly.

              Users are free to play with amdpcsdb through the PCS options of aticonfig as much as they are free to play with registry settings under Windows. Both of these "configuration" techniques are unsupported and may put the driver into a bad state. The state that the driver can be put into via the UIs is the only supported[1] mode of operation.

              [1] "Supported" in this context means that misbehaviours configurable through the UI are the only issues that can be considered bugs.


              For example:
              Code:
              sudo aticonfig --initial=dual-head --screen-layout=right --tv-standard-type=VIDEO --tv-format-type=NTSC-M
              Should this have set up my TV as half my desktop? If it should have, it didn't. And because there's basically zero documentation on anything, I have no idea why, or what I need to add to that to make it actually work. If I run another aticonfig command with other options, will it add them to the ones I already set, or start over? I JUST DON'T KNOW because there's no docs.

              Code:
              sudo aticonfig --enable-monitor=tv
              That actually did something, and would probably be a nice option to use, only it didn't output a useful display on my TV, only garbage maladjusted-vertical-hold-type distortion. And when I enabled my CRT again (through the distortion using bash history), it was running at 640x480 and not 1600x1200. Again, no documentation, so no idea why it didn't quite work or what I need to do about it. I also tried to use CCCLE (it only locked up my computer the one time thankfully) to set this up, even though it doesn't even support the way I want to use my TV. I tried to set up a clone mode with the TV at 640x480 and my monitor at 1600x1200. Guess what? With CCCLE, they both have to use the same resolution for some bonkers reason. Guess what else? xorg.conf does this just fine without issue.
              I understand and appreciate your position. The cost of configuration tools is that they abstract and simplify the native capability that the driver may have. Feel free to hack the amdpcsdb as much as you want in an unsupported manner.
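              (The PCS options I am referring to are along the lines of --get-pcs-key / --set-pcs-str; check aticonfig --help for the exact spelling on your release. The section and key names below are placeholders, not documented settings.)
              Code:
              # placeholder names only - amdpcsdb sections/keys are undocumented and unsupported
              aticonfig --get-pcs-key=SOMESECTION,SomeKey
              aticonfig --set-pcs-str="SOMESECTION,SomeKey,SomeValue"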

              Only since XOrg picked up RANDR1.2 does this capability exist. Prior to that you had differing implementations of mergedfb/bigdesktop-pairmodes/twinview. Even now RANDR1.2 does not solve all the problems (persistence, multiple GPU, etc.).
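              As an illustrative sketch pieced together from the pair-mode options quoted earlier (not a tested recipe), the mismatched-resolution clone case you describe would look something like:
              Code:
              aticonfig --add-pairmode=1600x1200+640x480
              aticonfig --list-pairmode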

              ...
              "If people want to start documenting amdpcsdb options" ?? Are you serious?? If "people" want to? Why doesn't ATI document the options??
              The configuration options are prone to change. AMD does not warrant that one configuration option will have the same semantic effect from one release to the next; we only warrant that the user interfaces (aticonfig/amdcccle) will remain consistent. That is the model that we have chosen and enact.

              Of course when we go through major internal changes, we regress in some areas. That is one reason that we release on a regular tempo - it provides users with a large selection of drivers, so they can hold on to the one where their particular feature set of interest is most stable.

              Utter disappointment is an understatement at this point. ATI has sucked the will to tinker right out of me. I want to keep going, but ATI has also just about sucked the will to seek assistance and explanation out of me. I must apologize if I have been overly harsh or antagonizing or insulting; I try really hard to remain detached when discussing this stuff, but I've got all sorts of frustration bombarding me from the back of my mind whenever I even think about the ATI cards that I desperately want to be using, but which are sitting on a shelf gathering dust. So yeah, if I'm harsh, it's only out of love ... and a lot of frustration.
              Acknowledged, but not responded to.

              OT:
              And PowerPlay doesn't work if you have more than one display connected?? WTF??? Seriously, what's up with that? Where's the doc explaining how that works and in what (ridiculous) circumstances it does not work? So if someone wants to actually use all the outputs on the back of their video card, they can expect the much lauded power saving features to simply not function? How is that acceptable?
              Again, there are reasons that I don't always post...

              For this case here, the selection of MCLK results in a finite amount of memory bandwidth. All clients (3D, 2D, CRTC) fight for that bandwidth. In general, the amount of bandwidth available when a system is clocked down is *just* sufficient for 3D, 2D and 1 CRTC to operate without corruption.

              Most users have minimal understanding of what I just wrote above, so rather than trying to explain the caveats on each and every configuration option or behaviour, the user is told that what they have asked for is not possible.

              The options are
              1. Have the feature with limitations hard-coded, and the user must trust that the driver has done the best that is possible,
              2. not enable the feature, or
              3. enable the feature without limitations and let the users run into all sorts of problems.

              We opted for 1).

              Regards,

              Matthew



              • #22
                Thanks for being so patient/thick-skinned, Matthew.

                Regarding powerplay, I basically understand what you're saying about the power management, but I'm sure I'm among "most users" in that it is a minimal understanding. Probably this will be an illustration of that, but the way I've been using tvout since I started using Linux, I only ever use 1 display at a time, there's no output on the TV when I'm not using it, and when I am using it, my monitor shuts off, until I finish using my TV and its empty Xserver is killed. The TV goes on VT8, my desktop session stays on the usual VT7, and neither is ever active at the same time. So I guess what I'm saying is, why should the mere presence of a connected svideo cable (with no TV on (the video card doesn't know if the TV is on or off though, does it?), and no active X displays being transmitted through the cable) preclude the use of the power management features? I have a really low opinion of Nvidia, but they can manage this just fine (though they can't seem to manage putting my monitor into standby mode at all). *shrugs* Going to want an 'after-market' cooling solution eventually anyway I suppose, but so much for the power savings of a 45W CPU. *shrugs again*

                Only since XOrg picked up RANDR1.2 does this capability exist.
                Is that in reference to multiple monitors/screens with different resolutions? Or multiple monitors with separate Xservers running on them? If it's either I'm afraid that just isn't true (maybe you just meant with ATI hardware /fglrx?), I was doing both of those things long before RANDR1.2 just using X.org's builtin functionality (that Nvidia, as lacklustre as their driver is, supports just fine). I'm not sure that is what you meant however.

                I don't have any problem with fglrx having its own set of config tools/files, and I certainly agree that being able to manage many/all X/fglrx settings on the fly that otherwise would require restarting X is a good thing (a great thing). I just think it's folly to entirely abandon basically all support for xorg.conf and also to simply say that basic features/functionality (e.g. multiple independent displays/Xservers) of X.org aren't supported by fglrx, a driver for X.org. I think at the least xorg.conf should be parsed at startup and whatever configuration is present (if any) should be used. fglrx/cccle/aticonfig can/should take the xorg.conf and use it as the basis for their runtime config changes. And if X.org supports it, so should fglrx. I know X.org can now do pretty much entirely without any content in xorg.conf and function just fine, but if you decide to use it, X.org still reads and uses the configuration you set up. Can one even run "X :1" to launch a new Xserver on VT8, using the same single screen, with fglrx? Can you do this? Unless the answer to either is yes, those are rhetorical questions that hopefully illustrate my point about supporting X features.

                As for documentation, take a look at Nvidia's for an example of what ATI is sorely lacking. I realize a lot of their readme details xorg.conf options, and that's not what you guys/gals want to use, but if you had a document that describes for us simple-minded users what can be done and how, even with your anti-X.org methods, I would not be complaining about this at all, I would have just studied the document. "aticonfig --help" may be understandable to you (and it is somewhat to me), but it's really lacking overall, as far as telling the user what options are needed to accomplish what use-case scenario. And if you look here, you'll notice how they have a README for every driver release. ATI needs to do something similar, regardless of its intentions to make it all just work and to have the user never need to edit a config file. I can set up multiple separate and/or independent displays in xorg.conf pretty easily, but I couldn't figure out how to properly do the same with aticonfig or CCCLE. The difference being available, almost-entirely-understandable-by-mere-mortals documentation.

                Thanks again for your patience and thick skin, Matthew (I haven't been trying to get an ATI card working today, so you probably don't need either to read this message), not to mention just being here and working to help us try to understand ATI's peculiar (imnsho) choices.



                • #23
                  Originally posted by oblivious_maximus View Post
                  Thanks for being so patient/thick-skinned, Matthew.

                  Regarding powerplay, I basically understand what you're saying about the power management, but I'm sure I'm among "most users" in that it is a minimal understanding. Probably this will be an illustration of that, but the way I've been using tvout since I started using Linux, I only ever use 1 display at a time, there's no output on the TV when I'm not using it, and when I am using it, my monitor shuts off, until I finish using my TV and its empty Xserver is killed. The TV goes on VT8, my desktop session stays on the usual VT7, and neither is ever active at the same time. So I guess what I'm saying is, why should the mere presence of a connected svideo cable (with no TV on (the video card doesn't know if the TV is on or off though, does it?), and no active X displays being transmitted through the cable) preclude the use of the power management features? I have a really low opinion of Nvidia, but they can manage this just fine (though they can't seem to manage putting my monitor into standby mode at all). *shrugs* Going to want an 'after-market' cooling solution eventually anyway I suppose, but so much for the power savings of a 45W CPU. *shrugs again*
                  If the CRTC is active (independent of the actual display), it will use the memory bandwidth I was talking about before.

                  The most simple solution for you is to use aticonfig to switch between monitors.

                  aticonfig --enable-monitor=tv

                  when you want to watch on the TV, and when you want to use the normal display you can switch back with

                  aticonfig --enable-monitor=crt

                  You can see this and other related options in the aticonfig help page.

                  Is that in reference to multiple monitors/screens with different resolutions? Or multiple monitors with separate Xservers running on them? If it's either I'm afraid that just isn't true (maybe you just meant with ATI hardware /fglrx?), I was doing both of those things long before RANDR1.2 just using X.org's builtin functionality (that Nvidia, as lacklustre as their driver is, supports just fine). I'm not sure that is what you meant however.
                  One X Server, one X Screen, two monitors. The options are the custom solutions (mergedfb/twinview/big-desktop), or XRANDR1.2 (which the proprietary driver doesn't currently support).

                  Two X screens should be working fine, but they are unaware of each other (and there are other tradeoffs). You can turn Xinerama on to have a unified desktop, which currently only NVidia supports with 3D.

                  Two X servers is moving into a very niche configuration area. We do occasionally get this working, but we don't test it heavily, and it usually breaks when we are doing invasive changes (like adding Crossfire support - as per Michael's 4850 article).


                  I don't have any problem with fglrx having its own set of config tools/files, and I certainly agree that being able to manage many/all X/fglrx settings on the fly that otherwise would require restarting X is a good thing (a great thing). I just think it's folly to entirely abandon basically all support for xorg.conf and also to simply say that basic features/functionality (e.g. multiple independent displays/Xservers) of X.org aren't supported by fglrx, a driver for X.org.

                  I think at the least xorg.conf should be parsed at startup and whatever configuration is present (if any) should be used.
                  http://people.freedesktop.org/~ajax/

                  We don't ignore the X configuration file, we just ignore (or will in the future) the Driver section for fglrx. All other parts of the xorg.conf we usually honor if it is within the capabilities of the driver.


                  fglrx/cccle/aticonfig can/should take the xorg.conf and use it as the basis for their runtime config changes. And if X.org supports it, so should fglrx.

                  I know X.org can now do pretty much entirely without any content in xorg.conf and function just fine, but if you decide to use it, X.org still reads and uses the configuration you set up.
                  As stated before, our model is that xorg.conf is for configuring X, and not for configuring the driver.


                  Can one even run "X :1" to launch a new Xserver on VT8, using the same single screen, with fglrx? Can you do this? Unless the answer to either is yes, those are rhetorical questions that hopefully illustrate my point about supporting X features.
                  Does this work reliably with all drivers - or just NV? Does the Open Source radeon driver do this as well?

                  Either way, the proprietary driver does not currently support this featureset.

                  ...
                  Regards,

                  Matthew

                  Comment


                  • #24
                    Of course an xorg.conf with fewer options is easier to handle, but what do you do when you really NEED extra options like Modelines for 1152x864@100Hz or 1280x960@85Hz, because a CRT never reports those special but very useful resolutions? Every other driver handles this, but when you try it with fglrx and Xorg 7.1.1 -> Xorg crash. Also, your amdcccle is crap: it allows setting 1152x864 at 100 Hz, but it is really 75 Hz when you check it. For basic settings like resolution and refresh rate no 3rd-party tool should be needed - be standards-conformant and accept the options every other driver knows. For your special cases you can use whatever config tool, but don't forget that there are still users without TFT monitors, and CRT users like to use non-standard resolutions.
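                    For reference, such modelines can be generated with gtf (or cvt) and pasted into the Monitor section of xorg.conf; whether fglrx then honours them is exactly the problem described above:
                    Code:
                    # print a modeline for 1152x864 at 100 Hz, then copy the resulting
                    # "Modeline ..." line into the Monitor section of xorg.conf
                    gtf 1152 864 100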



                    • #25
                      Originally posted by voltaic View Post
                      ...
                      Wow, thanks for stating what I've been trying to say for ages now. Those are the exact problems I have with a laptop and external monitor.

                      Resolutions and refresh rates are never detected correctly. Only yesterday was I able to actually make my external display refresh at 75 Hz, by editing the xorg.conf manually, of course. The aticonfig options never work! AMD/ATi should be happy I won't go blind now.

                      I had to add Modeline "1440x900_75.00" 136.49 1440 1536 1688 1936 900 901 904 940 -HSync +Vsync at the bottom of my Monitor section, with the internal display off and CRT1 forced to be always on.
                      This way the resolution and refresh rate for the external display will always be correct, because, as you noticed, fglrx won't detect the settings correctly even if you simply turn off the internal display and force the external one, without manual tweaking.
                      This will make the Option "Mode2" "1440x900" in the Device section unneeded, I believe.
                      Thank god, if you disconnect the external display and only start the internal one, fglrx will kick in at its normal resolution.
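                      Spelled out, the Monitor section entry mentioned above ends up looking something like this (the identifier is whatever your Screen section already references):
                      Code:
                      Section "Monitor"
                          Identifier "ExternalMonitor"   # use the name your Screen section points at
                          Modeline   "1440x900_75.00" 136.49 1440 1536 1688 1936 900 901 904 940 -HSync +Vsync
                      EndSection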

                      But, for instance, this sort of xorg.conf brings me other problems which are pointless to mention here, since I'll be told that my xorg.conf is not what fglrx wants it to be.

                      And before somebody asks me, this is not an EDID problem from the side of my monitor.



                      • #26
                        Originally posted by mtippett View Post
                        If the CRTC is active (independent of the actual display), it will use the memory bandwidth I was talking about before.

                        The most simple solution for you is to use aticonfig to switch between monitors.

                        aticonfig --enable-monitor=tv
                        ...
                        I guess if I ever decide it's worth my sanity using fglrx again before radeonhd is viable for my purposes, I won't have much choice and will have to try to get that to actually work, and not just present me with vertical hold problems. Which brings me back to the original topic - as I can't seem to find any docs on what's required apart from --enable-monitor=tv to get it to work properly, I fear I will not decide it is worth attempting with fglrx. Something tells me that once radeonhd is viable for my purposes, I'll be able to continue doing things the same way I have been for yonks now. (edit: or at least use RANDR to do something similar)

                        Originally posted by mtippett View Post
                        Two X servers is moving into a very niche configuration area. We do occasionally get this working, but we don't test it heavily, and it usually breaks when we are doing invasive changes (like adding Crossfire support - as per Michael's 4850 article).
                        Niche or not (I say NOT), fact is, as I keep saying, it's a standard feature of the Xserver. Take a look at WINE's appdb for some games; it won't be long before you see someone recommending people run their game in a separate Xserver from their desktop session. In KDE, right on the KMenu, if you go to "Switch User" --> "Start New Session", that is no different from running "X :1" (apart from KDE being loaded too, that is), and I'd hardly call that a niche. More like a standard Xserver feature that KDE (and GNOME?) depends on for (in Windows-speak) "fast user switching".
                        Originally posted by Xserver manpage
                        :displaynumber
                        The X server runs as the given displaynumber, which by default is 0. If multiple X servers are to run simultaneously on a host, each must have a unique display number. See the DISPLAY NAMES section of the X(7) manual page to learn how to specify which display number clients should try to use.
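                        Concretely, the invocation being talked about is nothing more exotic than this (display number and VT are just examples):
                        Code:
                        # bare second X server on display :1, virtual terminal 8
                        X :1 vt8 &
                        # or, with a client session started on it:
                        startx -- :1 vt8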
                        Originally posted by mtippett View Post
                        Does this work reliably with all drivers - or just NV? Does the Open Source radeon driver do this as well?
                        Next time I plug in the ATI card I will try it and see (with radeon or radeonhd), but I suspect that every functional X driver apart from fglrx supports that behaviour - because it's a standard feature of the Xserver and I expect proper drivers for X support it just fine. I think you'll find that that six-headed system setup uses nvidia cards because they actually support X's features. I have no doubt that if Intel was selling discrete graphics parts that howto would apply to them as well.

                        Originally posted by mtippett View Post
                        Either way, the proprietary driver does not currently support this featureset.
                        And that's part of why I'm still using the card made by your arch rival, which I've wanted to get rid of pretty much since the day I bought it.

                        I think I'm just going to have to agree to disagree here. If you keep responding I suspect there's always going to be something about your (read: ATI's) response I'm going to take issue with, certainly as long as ATI is failing to support X's feature set with fglrx. That said, AMD rocks for their openness, and while I am totally annoyed with fglrx (not to mention the PowerPlay defect... oops I just did), I don't yet regret having bought 2 ATI cards in the last few months even though I haven't been able to actually use them yet.



                        • #27
                          Realistically, I think you're going to find that all drivers support multiple x servers just fine until dri (and the associated drm aka kernel module) comes into the picture, then it becomes a lot more iffy.

                          As a trivial example, I believe the radeon (-ati) driver can fully accelerate both screens if you are running a single X server, but can only accelerate the primary screen when running two servers on the same card. Running multiple X servers on multiple cards is usually less of a problem than running multiple servers on the same card.
                          Last edited by bridgman; 06-21-2008, 06:53 PM.



                          • #28
                            I am certain that fglrx at least USED to support having more than one X server started. On my laptop with a Radeon 9700 Pro, before I changed to r300/radeon, I have done it to start an extra X session for XDMCP.

                            edit:
                            this is with the monitor being the SAME on both X servers, with only one X server "active" at a time.



                            • #29
                              Probably. I read Matthew's email as saying "it works a lot of the time but we don't treat multiple servers on one card as something we support and test".

                              My understanding was that in general the need for multiple x servers on the same card was a vestige of the "pre-3d days" and was going away, but if that is not the case then we'll need to deal with it somehow.



                              • #30
                                Well, it is used. For instance, having two graphical logins on one computer uses it, and people connecting via XDMCP (like me) use it.

