RadeonHD Driver Gets HDMI Support

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

  • RadeonHD Driver Gets HDMI Support

    Phoronix: RadeonHD Driver Gets HDMI Support

    The xf86-video-radeonhd driver has today received support for handling HDMI (High-Definition Multimedia Interface) connectors. If you have used a DVI-to-HDMI dongle with the RadeonHD driver, it would already have worked (as we shared in our recent ATI HDMI Linux article); this new support is for cards with integrated HDMI connectors.


  • #2
    But presumably no support for higher resolutions?

    Originally posted by phoronix View Post
    Phoronix: RadeonHD Driver Gets HDMI Support

    The xf86-video-radeonhd driver has today received support for handling HDMI (High-Definition Multimedia Interface) connectors.
    If RadeonHD doesn't allow you to use the card's full resolution over an HDMI connector then what has been gained?

    Comment


    • #3
      when i saw the commit i thought to myself "gee, phoronix is going to make a big deal out of it" :]

      Comment


      • #4
        Originally posted by chrisr View Post
        If RadeonHD doesn't allow you to use the card's full resolution over an HDMI connector then what has been gained?
        It's probably not a question of high-resolution support, just EDID funnies and finding the right way to override EDID. This is just a guess (I haven't spoken with Michael or the SuSE folks about it), but many consumer devices only expose a 1280x720 mode (aka 720p) even if their native resolution is higher. Not sure why, but it seems to happen. In those cases you need to force a higher resolution, as Michael was trying, but during the addition of RandR support to RadeonHD the mechanisms for overriding EDID changed a bit, AFAIK.

        There is also a debate going on about the best way to override in the first place -- one view is that garbage in xorg.conf causes so many problems that reliable information from EDID should override the conf entries; another is that EDID isn't always right, and if the user mucks up the conf it's their own fault. I had to unsubscribe from the lists to keep my sanity, so I'm not sure what the current status is.
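        For readers hitting the EDID problem described above, forcing a mode past EDID in that era of X servers looked roughly like this in xorg.conf. This is a sketch, not something tested against RadeonHD specifically; the identifier is made up, and the Modeline values are the standard CEA 1080p60 timings rather than anything tuned to a particular display:

        ```
        Section "Monitor"
            Identifier  "HDMI-Display"
            # Illustrative 1920x1080 modeline (standard CEA 1080p60 timing);
            # real timings should come from the display's specs or a modeline
            # calculator rather than copied blindly.
            Modeline    "1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
            Option      "PreferredMode" "1920x1080"
        EndSection
        ```

        Whether the driver honours this over EDID depends on exactly the override mechanics being debated on the lists.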

        Comment


        • #5
          What about HDCP support?

          Originally posted by bridgman View Post
          It's probably not a question of high-resolution support, just EDID funnies and finding the right way to override EDID.
          I was under the impression that hardware with HDMI ports was designed not to provide its highest and best resolutions unless some kind of "handshake" was made via HDCP. Are you saying that HDCP is optional for ATI cards?

          Comment


          • #6
            As far as I know this is a policy implemented by player applications as part of their agreement with high-def content providers. The player app looks at the content (e.g. an HD-DVD or Blu-ray disk) and the output resolution of the display, and decides whether or not output protection is required. If protection cannot be enabled, then the app may choose to downscale the resolution as a compromise.

            The drivers and hardware just provide a robust mechanism to inform the player app and to implement its decisions -- I don't think there are resolution-dependent rules built into the hardware. I will ask around and try to confirm this.
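            The decision flow described above can be sketched in a few lines. This is purely illustrative (not code from any real player); the function name and the fallback resolution are made-up assumptions standing in for whatever a licensed player actually does:

            ```python
            # Hedged sketch of a player app's output-protection decision:
            # the content's rules and the link's HDCP status decide the
            # resolution, not the graphics hardware itself.
            def choose_output_resolution(content_requires_hdcp, hdcp_available,
                                         native_res, fallback_res=(960, 540)):
                """Return the resolution the player renders at."""
                if not content_requires_hdcp or hdcp_available:
                    return native_res      # full quality, nothing to enforce
                return fallback_res        # downscale as a compromise

            # Protected content over an unprotected link gets downscaled:
            print(choose_output_resolution(True, False, (1920, 1080)))  # (960, 540)
            ```
            
            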

            As far as I know HDCP is optional for all hardware, not just ATI/AMD products -- it's just that without HDCP a player app may choose not to play protected content at high resolution. I vaguely remember hearing about a couple of cases where HDCP was not optional -- either a specific display required it or a specific driver turned it on by default -- but that's it.

            Comment


            • #7
              But if HDCP is optional, why bother with it at all?

              Originally posted by bridgman View Post
              As far as I know HDCP is optional for all hardware, not just ATI/AMD products -- it's just that without HDCP a player app may choose not to play protected content at high resolution.
              I'm just trying to imagine circumstances where a player might choose to use HDCP... and I am drawing a blank. E.g.

              "Shall we f**k our users over?" (Y/N)

              Tough one...

              Comment


              • #8


                Seriously, the decision is made by the content providers not the player developers. When the player dev signs a license agreement to play HD content, one of the conditions is that the player app will honour the protection rules which accompany the content.

                My understanding is that today none of the HD-DVD/Blu-ray disks has included a rule requiring HDCP above a certain resolution, but AFAIK this can be turned on at any time on a disk-by-disk basis. I believe the rules can either prohibit playback without protection or require "constriction", i.e. scaling to a lower resolution and then upscaling again if required to fit the window.
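                To make "constriction" concrete, here is a minimal sketch of the downscaling half of it. The 960x540 cap is an illustrative assumption, not a number from any actual content rule:

                ```python
                # Illustrative only: constriction caps the effective resolution
                # of protected content; the result would then be upscaled back
                # to the display/window size by the player.
                def constrict(frame_size, cap=(960, 540)):
                    w, h = frame_size
                    cw, ch = cap
                    if w <= cw and h <= ch:
                        return frame_size                # already under the cap
                    scale = min(cw / w, ch / h)          # preserve aspect ratio
                    return (round(w * scale), round(h * scale))

                print(constrict((1920, 1080)))  # (960, 540)
                ```
                
                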

                There are some good articles about this on the web -- I'll see if I can dig up a couple and post some links.

                Comment
