The State Of Open-Source Radeon Driver Features


  • Originally posted by ChrisXY View Post
    Have you tried dumping the edid data from vga and from hdmi and comparing them, possibly with another machine where it works?

    Where is your bug report?
    AMD doesn't have that many linux graphics people and the open source team is even smaller. Sure, you can make fun of them, but you could also help them by telling them when something doesn't work.
    I thought they read this forum.

    Besides, I'm not sure this should even be a bug. It's standard hardware. Every friggin' modern video card has an HDMI port and most TVs and even LCD monitors are equipped with them nowadays. It's a standard connection.

    I'm not sure how to get the edid data. There's a read-edid package in Debian?

    I noticed xrandr reports the native resolution (marked current), but the physical size is interpreted as 1600mm x 900mm, which is obviously incorrect.
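    For what it's worth, the kernel exposes the raw EDID blob at /sys/class/drm/<connector>/edid, and Debian's read-edid package decodes the same data via `get-edid | parse-edid`. The physical size xrandr reports comes from two bytes of that blob (offsets 0x15 and 0x16, maximum image size in centimetres), so a bogus 1600mm x 900mm usually means the display handed those values over itself. A minimal sketch, with made-up sample bytes standing in for a real blob from sysfs:

```python
# Read the physical-size bytes out of an EDID base block.
# On a real system: blob = open("/sys/class/drm/card0-HDMI-A-1/edid", "rb").read()
def edid_physical_size_mm(blob: bytes):
    # Every EDID base block starts with this fixed 8-byte header.
    if blob[:8] != bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]):
        raise ValueError("not an EDID base block")
    # Bytes 0x15 and 0x16: max horizontal / vertical image size, in cm.
    return blob[0x15] * 10, blob[0x16] * 10

# Fake a 128-byte block claiming a 160 cm x 90 cm screen -- exactly the
# kind of data that shows up in xrandr as 1600mm x 900mm (a ~72" diagonal).
fake = bytearray(128)
fake[:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
fake[0x15], fake[0x16] = 160, 90
print(edid_physical_size_mm(bytes(fake)))  # -> (1600, 900)
```

    Comparing the blob dumped over VGA with the one dumped over HDMI (or with another machine, as suggested above) would show whether the TV or the driver is at fault.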

    The audio is another problem. But the fact remains: this shouldn't be rocket science for the Linux graphics people. I wish I had an Nvidia card with an HDMI port so I could compare. I bet it'd work!

    P.S. The onboard Nvidia chipset on the mobo detects the TV via VGA at what seems to be an acceptable resolution. I read that you can only get native 1080p over the HDMI connection. However, plug the VGA cable into the ATI video card's VGA port and you can only get up to 1024x768! ATI's/AMD's open drivers are a POS! I guess I should just install the blobs, but the point of going AMD was to go with open source!

    I posted that they don't support Linux and the die-hard AMD fanatics on here kept saying to try their cards. I'm proven right - they are sh*t! Also, I read here that the thermal/power-management features still don't work, so they don't even cool the card properly with FOSS drivers. If you search 'HDMI' in the search box, you discover that most posts pertain to AMD, especially the FOSS drivers. That says a lot, imho!
    Last edited by Panix; 23 November 2012, 11:23 AM.



    • It's not like VGA and HDMI don't work with the radeon driver with any combination of card/monitor. On my Radeon HD 3650 card I can use both in 1920x1080 without any problem, HDMI audio included. Tomorrow I should get the replacement Radeon HD 6670 card and I'll be able to tell you how it works with the newer generation of radeons as well.



      • Originally posted by Panix View Post
        Besides, I'm not sure this should even be a bug. It's standard hardware. Every friggin' modern video card has an HDMI port and most TVs and even LCD monitors are equipped with them nowadays. It's a standard connection.
        You wouldn't say that if you kept up with the articles about the broken hardware TV makers put out there and what OEMs do to graphics hardware and the graphics BIOS.

        Just an example: http://mjg59.dreamwidth.org/8705.html
        But... it's standard hardware!



        • My favorite post from there is:

          For added fun, the manufacturer of a 720p HDTV I used to have had apparently decided that this whole EDID thing was too complicated, so had copied the EDID verbatim from a Thinkpad monitor (the X server log actually printed the display's identity as "Thinkpad"). The Thinkpad monitor in question was apparently a 5:4, 1280x1024 model. So I got the middle 90% of my desktop, stretched horizontally by 40%.
          Things really are that bad.

          The next one is a recurring source of pain. Most GPUs have a display controller where the width is programmed as an integer number of bytes. For some bizarre reason the 1366x768 panel (something like 170 and 3/4 bytes) managed to become a standard anyways.

          The TV I currently have is detected by my graphics driver as 1360x768, for some reason. I don't understand why the vertical resolution is mysteriously correct. Thankfully it can be told not to scale/overscan at all, so I just have a 6-pixel black bar down one edge of the screen, which I can cope with.
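The arithmetic behind the "170 and 3/4 bytes" remark is easy to check. Assuming the 8-pixels-per-byte granularity that figure implies, 1366 doesn't divide into whole bytes, while the 1360 the driver falls back to does:

```python
# Which widths fit a whole number of 8-pixel bytes?
# 1366 / 8 = 170.75 ("170 and 3/4 bytes"); 1360 / 8 = 170 exactly.
for width in (1366, 1360, 1280, 1920):
    whole_bytes, leftover_px = divmod(width, 8)
    status = "fits" if leftover_px == 0 else "does not fit"
    print(f"{width}: {whole_bytes} bytes + {leftover_px} px ({status})")
```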



          • Originally posted by bridgman View Post
            For some bizarre reason the 1366x768 panel (something like 170 and 3/4 bytes) managed to become a standard anyways.
            It's the "HD" resolution (the big brother being "Full HD").



            • Originally posted by crazycheese View Post
              It's the "HD" resolution (the big brother being "Full HD").
              Yeah, although you actually want 1280x720 for that. I remember when the 1366x768 res showed up in HD projectors and everyone was scratching their head wondering where that resolution came from. The general answer from the mfgs was "I dunno, we used these LCDs because they were cheap and good".

              I suppose it was the smallest that would do both 720p for video and 1024x768 for PCs ?



              • Actually, there are many games that just need that resolution; for example, Boomzap's games natively want 1366x768.



                Yeah, those are Windows games, but here it's Wine and an RV280. Of course I have that resolution in xorg.conf, because the EDID (LG IPS236) does not include it and those games complain.
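For reference, forcing a mode the EDID doesn't advertise looks roughly like this in xorg.conf. The connector name and the modeline timings below are illustrative only; real timings should be generated with `cvt 1366 768` or `gtf` (cvt may round the width up to 1368), and the output name checked with xrandr:

```
Section "Monitor"
    Identifier "DVI-0"    # output name is an assumption; check xrandr
    # Illustrative timings -- generate real ones with: cvt 1366 768
    Modeline "1366x768_60.00"  85.50  1366 1436 1579 1792  768 771 774 798 -hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "DVI-0"
    SubSection "Display"
        Modes "1366x768_60.00"
    EndSubSection
EndSection
```

The same mode can be tried at runtime first with `xrandr --newmode` followed by `xrandr --addmode`, which avoids restarting X while experimenting.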

                And yes, I am very satisfied with the radeon r200 driver; if someone could just fix the long-standing bug with positional lights under TCL, that would be cool.



                • Have a Radeon X16?0 myself, so I'm pretty happy with the state of the r300g driver today.

                  I don't see the different video decoding solutions as very important.
                  They are all very format-specific compared to OpenCL, some more than others.
                  For video decoding, focusing on OpenCL drivers seems like a more productive way to do things:
                  OpenCL video decoders can be used to decode multiple formats on the graphics card.

                  A lot of people seem to have less-than-accurate views on where the decoding happens,
                  which leads to misconceptions about hardware-accelerated video decoding.
                  The big problem with format-specific hardware acceleration is that the video formats themselves are a moving target: after H.264 we are going to have H.265! (And so on.)


                  Mostly it isn't very difficult to figure out what to do first, next, and last, given the dependencies: basics first, then features, then optimizations.

                  I would say the priorities should be: first power management, then 3D features, then 3D and power optimizations, then OpenCL, and last the format-specific video decoding and other exotic or rarely used things.



                  • Originally posted by plonoma View Post
                    For video decoding focusing on OpenCL drivers seems like a more productive way to do things.
                    OpenCL video decoders can be used to decode multiple formats using the graphic card.
                    ...
                    Would say priorities should be first power management, then 3d features, then 3d and power optimizations then OpenCL and last implementing specific video decoding and exotic and, or rarely used things.



                    • Originally posted by Panix View Post
                      It interprets a 72" TV when it's 24"! LOL!
                      If the TV lies about its size and claims to be a 24" screen, what do you expect to happen? Bad EDID info from display devices is a far more common explanation for such problems than bad drivers...

