AMD/ATI lost another Linux customer


  • #41
    Originally posted by bakou View Post
    What's wrong with xrandr? In my limited experience with fglrx (yes, I tried it a while ago on Ubuntu 8.10, but it didn't last long on my system) multiview didn't work very well at all, especially with compiz, which is what makes dual screens on Linux much more usable these days (windows actually maximize to the right size when using two screens of different sizes across a large virtual desktop, etc., which I often do at work).
    The RandR screen model doesn't map particularly well onto all of the options and modes supported by the proprietary drivers, so until RandR makes some more advances, mapping RandR operations onto fglrx will be a bit of a crapshoot. The open drivers have a much easier time because they generally don't implement anything which RandR doesn't support.

    Corbin (MostAwesomeDude) is working on a project over the summer called "shatter", which will allow 2D acceleration to operate across multiple GPUs. This is an essential prerequisite to adding multi-GPU support to RandR. Once that happens, the RandR vs. proprietary driver situation should improve quite a bit.

    Multiview was a FireGL-only feature until very recently, so unless you tried it in the last month or two it's likely you were just seeing the work-in-progress code.

    Originally posted by bakou View Post
    xrandr under xorg 1.6.2, intel 2.7, compiz 8.2, and the latest 2.6.30 kernel patch is very solid for me. Also, according to most benchmarks I have seen, crossfire performance is not great compared to Windows.
    My understanding was that Crossfire performance was pretty similar on Linux and Windows; what differed was the number of apps which had profiles included with the driver (since most of the apps are different). Will check.

    Originally posted by bakou View Post
    You don't plan to implement crossfire in the open source driver?
    Crossfire is not just "something you turn on", it involves some fairly big changes to all levels of the graphics stack. Most of the programming information required is available already, but so far I haven't run into any devs who think spending time on something like Crossfire in the open drivers is a particularly good idea.

    Originally posted by bakou View Post
    Sorry, but I have a hard time believing fglrx is so perfect that people actually want to use the most powerful ATi cards on Linux over other options for enterprise rendering workstations, even though you surely know better than me.
    If you have time, it might be worth reading some of the reviews. Most of the hate for fglrx involves using it in environments for which it had not originally been designed or tested. We have started ramping up consumer support but that is relatively recent (<2 years).
    Last edited by bridgman; 01 August 2009, 02:07 PM.



    • #42
      Originally posted by bridgman View Post
      Crossfire is not just "something you turn on", it involves some fairly big changes to all levels of the graphics stack. Most of the programming information required is available already, but so far I haven't run into any devs who think spending time on something like Crossfire in the open drivers is a particularly good idea.
      Wouldn't Gallium3D make things like multi-GPU (maybe even across vendors, since it's all behind a uniform API) "relatively" straightforward? Probably not, since then somebody would have thought of it already, but you know... it'd be really kewl.



      • #43
        Oh dear, bridgman, are you still forced to use a 56 kbps modem? That's not right at all! You should call Amnesty and through them demand at least some form of 3G network there or something. How about a satellite connection anyway?

        Mankind still forces someone to use 56 kbps modems. It's just plain wrong that someone has to read these forums and answer our (sometimes appropriate) whining over a modem. I think part of me just died...



        • #44
          I could live with 56K, but being >20 km from the central office I get 26.4 kbps on a good day and 13-21 kbps on a not-so-good day. On a bad day I don't get a dial tone.

          Satellite connections are easy to get around here, but not if you live in a pine forest.
          Last edited by bridgman; 01 August 2009, 09:53 PM.



          • #45
            Originally posted by bridgman View Post
            Satellite connections are easy to get around here, but not if you live in a pine forest.
            "Pine forest", or "forest of unused satellite dish towers"?



            • #46
              I checked. Red Pine (aka Norway Pine). 70' tall and about 9" wide, on a 6' x 8' grid. Lots and lots of trees. Roughly 170 of them between the satellite and the highest mounting point I can find (other than a 70' tower). Maybe 40-50 if I don't mind the treetops blowing in front of the dish when it's really windy.
              Last edited by bridgman; 02 August 2009, 12:01 AM.



              • #47
                Originally posted by bakou View Post
                ...

                What's wrong with xrandr? In my limited experience with fglrx (yes, I tried it a while ago on Ubuntu 8.10, but it didn't last long on my system) multiview didn't work very well at all, especially with compiz, which is what makes dual screens on Linux much more usable these days (windows actually maximize to the right size when using two screens of different sizes across a large virtual desktop, etc., which I often do at work). xrandr under xorg 1.6.2, intel 2.7, compiz 8.2, and the latest 2.6.30 kernel patch is very solid for me. Also, according to most benchmarks I have seen, crossfire performance is not great compared to Windows. You don't plan to implement crossfire in the open source driver? Sorry, but I have a hard time believing fglrx is so perfect that people actually want to use the most powerful ATi cards on Linux over other options for enterprise rendering workstations, even though you surely know better than me.

                ...

                Hi,

                You seem to have a few things mixed up in the above.

                Multiview was in very early development around 8.10; the complete Multiview (Xinerama) work was finished this year. Feel free to revisit it this year. Note that Multiview is only really useful for multi-GPU configurations. If you have a single GPU, RANDR is the better solution.

                The drivers support RANDR, but realistically that isn't the feature you are looking for. RANDR is primarily related to the dynamic enablement of monitors and the arbitrary placement over a virtual desktop.
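
                As a rough illustration of what RANDR exposes in practice, here is a small, untested C sketch (not something from the driver, just the generic libXrandr client API) that lists connected outputs and where each one is placed in the virtual desktop; build with -lX11 -lXrandr:

                    /* Sketch only: list connected RANDR outputs and their positions. */
                    #include <stdio.h>
                    #include <X11/Xlib.h>
                    #include <X11/extensions/Xrandr.h>

                    int main(void)
                    {
                        Display *dpy = XOpenDisplay(NULL);
                        if (!dpy)
                            return 1;

                        XRRScreenResources *res =
                            XRRGetScreenResources(dpy, DefaultRootWindow(dpy));

                        for (int i = 0; i < res->noutput; i++) {
                            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
                            if (out->connection == RR_Connected && out->crtc) {
                                /* The CRTC carries the mode plus the (x, y) offset that
                                 * RANDR lets you place anywhere in the virtual desktop. */
                                XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
                                printf("%s: %ux%u at +%d+%d\n", out->name,
                                       crtc->width, crtc->height, crtc->x, crtc->y);
                                XRRFreeCrtcInfo(crtc);
                            }
                            XRRFreeOutputInfo(out);
                        }

                        XRRFreeScreenResources(res);
                        XCloseDisplay(dpy);
                        return 0;
                    }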

                The maximization behaviour you are looking at actually comes from the XINERAMA extension, not RANDR directly. XINERAMA exposes the location of each output through the X server, and the window manager uses that information to make sure maximization lands on the correct monitor.
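
                Again just an untested sketch (not how any particular window manager actually implements it): this queries the Xinerama screen list and finds the head containing a window's top-left corner, which is the rectangle a maximize should fill. The (1930, 40) test point is made up; build with -lX11 -lXinerama:

                    /* Sketch only: find the Xinerama head containing a given point. */
                    #include <stdio.h>
                    #include <X11/Xlib.h>
                    #include <X11/extensions/Xinerama.h>

                    int main(void)
                    {
                        Display *dpy = XOpenDisplay(NULL);
                        if (!dpy || !XineramaIsActive(dpy))
                            return 1;

                        int nheads = 0;
                        XineramaScreenInfo *heads = XineramaQueryScreens(dpy, &nheads);

                        int wx = 1930, wy = 40;   /* pretend window top-left corner */

                        for (int i = 0; i < nheads; i++) {
                            XineramaScreenInfo *h = &heads[i];
                            if (wx >= h->x_org && wx < h->x_org + h->width &&
                                wy >= h->y_org && wy < h->y_org + h->height) {
                                /* "Maximize" should resize the window to this rectangle,
                                 * not to the whole virtual desktop. */
                                printf("head %d: %dx%d at +%d+%d\n",
                                       h->screen_number, h->width, h->height,
                                       h->x_org, h->y_org);
                            }
                        }

                        XFree(heads);
                        XCloseDisplay(dpy);
                        return 0;
                    }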

                RANDR is supported with the proprietary driver on all OSes with X Server 1.3 and later.

                Crossfire with the proprietary driver is automatically enabled only for a subset of games. Crossfire *only* gives good scaling where the GPU is the limiting factor.

                Regards,

                Matthew



                • #48
                  Originally posted by energyman View Post
                  oh? phenoms suck with linux?
                  No, Phenoms suck as a whole. Yes, they do have good price/performance, but at the end of the day it helps not to run a 2nd-rate CPU from a company that pumps jobs to India just to pay the bills.
                  I posted on lkml because of a USB bug in the SB700 - and AMD devs were very quick in responding.
                  That doesn't have anything to do with making decent hardware; irrelevant.
                  AMD's amd64 architecture was even developed together with Linux devs:
                  http://marc.info/?l=linux-kernel&m=107763851825114&w=2
                  Yeah, back in the day. I applaud AMD for that, and I myself owned two Athlon 64 systems. I'm saying AMD has taken a turn for the worse in recent times.
                  And what do the CPUs have to do with the graphics? Nothing, that is.
                  Generally speaking, one would want both a CPU and a GPU in a home setup, so that is also irrelevant.
                  But hey, buy Intel and tell us how much 3D sucks - or how Poulsbo graphics are working out for you.
                  AMD releases documentation, Intel pumps a lot of money into X; take your pick. And you won't see me buy Intel, because I am one who buys a system I can actually use. For the time being that is NVIDIA. My route is not for everyone, but unless your goal is open source, it's illogical to go another route. My argument was that people don't donate to what runs their system but will pump money into a video card simply out of faith, and judging by your lack of response in that specific regard I will assume I am correct.



                  • #49
                    Originally posted by L33F3R View Post
                    No, Phenoms suck as a whole. Yes, they do have good price/performance, but at the end of the day it helps not to run a 2nd-rate CPU from a company that pumps jobs to India just to pay the bills.
                    I disagree - for the strong majority of people and purposes, anything beyond a Phenom 955 is largely overkill. And as for the India crack - Intel has tons of overseas jobs as well.



                    • #50
                      Don't Intel chips have an "assembled in Malaysia" or something printed on them?

