DisplayPort 2.0 Published For 3x Increase In Data Bandwidth Performance


  • #31
    I'm still waiting for a good, affordable 8-bit 120Hz 4K ~24" panel (ideally with built-in nearest-neighbour scaling from 1080p and 720p), which is achievable with DisplayPort 1.3/1.4, so I'm not holding my breath for DisplayPort 2.0 devices. It's nice to dream about a 10-bit 144Hz 4K screen, though.
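
    Since 1080p → 4K is an exact 2× factor and 720p → 4K an exact 3×, nearest-neighbour scaling is just a pixel repeat with no interpolation. A minimal numpy sketch of what such a scaler computes (illustrative only, not any panel's actual firmware):

      import numpy as np

      def nn_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
          # Integer nearest neighbour: repeat every pixel `factor` times
          # along both axes, so edges stay perfectly sharp.
          return frame.repeat(factor, axis=0).repeat(factor, axis=1)

      frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
      frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)
      assert nn_upscale(frame_1080p, 2).shape == (2160, 3840, 3)
      assert nn_upscale(frame_720p, 3).shape == (2160, 3840, 3)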

    • #32
      This means that DP 2.0 is the first standard to support 8K resolution (7680 x 4320) at 60 Hz refresh rate with full-color 4:4:4 resolution, including with 30 bits per pixel (bpp) for HDR-10 support.
      I think VESA is forgetting HDMI 2.1 is a thing, as 2.1 supports 8K/120Hz (hell, it supports 10K/120Hz). This update is too little, too late.
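
      For scale, a back-of-the-envelope sketch of the uncompressed payload that 8K/60 figure needs, against each link's usable rate (DP 2.0's UHBR20 is 80 Gbit/s raw with 128b/132b encoding, HDMI 2.1's FRL is 48 Gbit/s raw with 16b/18b; HDMI 2.1's 8K modes lean on DSC compression):

        # Uncompressed payload for 8K60 4:4:4 at 30 bits per pixel,
        # ignoring blanking overhead (so a lower bound).
        needed = 7680 * 4320 * 60 * 30 / 1e9     # ~59.7 Gbit/s
        dp20 = 80 * 128 / 132                    # UHBR20 payload, ~77.4 Gbit/s
        hdmi21 = 48 * 16 / 18                    # FRL payload,    ~42.7 Gbit/s
        print(f"needed {needed:.1f} / DP 2.0 {dp20:.1f} / HDMI 2.1 {hdmi21:.1f} Gbit/s")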

      • #33
        Originally posted by torsionbar28 View Post
        If that's true, then it isn't DVI. DVI is a published standard, and it doesn't include audio, period. What you're describing is some proprietary nVidia thing that just happens to utilize the DVI connector.

        To make an analogy, if I hack together a way to attach a printer to my PC, using USB protocol, over an HDMI cable, that doesn't mean "HDMI supports printers".
        Nope, later DVI spec updates added audio support. However, as DVI was rarely used outside the PC world, audio support was almost never implemented in products; I know of only a few monitors that supported audio over DVI.

        • #34
          Originally posted by SyXbiT View Post
          I'm not an expert, so I'm happy to be corrected. It used to be that DVI was for monitors, and HDMI (because it could carry sound) was for TVs. Then DVI got support for sound. Then DisplayPort replaced DVI.

          DisplayPort and HDMI both transfer sound/video and are both fairly small in size (both have mini ports as well), and both handle really high refresh rates and resolutions.
          Are there pros/cons that make one better suited to TV or Monitor, or do we have two similar standards just for historical reasons?
          HDMI was created by a consortium of consumer-electronics companies, primarily as a digital connector for displaying HD content on televisions, which at the time mostly used DVI-HDCP or DVI-HDTV. It was designed as a superset of these standards, with additional YCbCr features thrown on top.

          HDMI's electrical signals are identical to DVI's, due to being a superset of DVI-HDCP and DVI-HDTV. As a result, passive converters can convert between the two formats (including the rarely used audio-over-DVI) without issue.

          DisplayPort was designed primarily by VESA as a royalty-free display standard to replace existing PC display standards (primarily VGA and DVI). As the electrical signals are different, DisplayPort cannot passively convert DVI/HDMI signals and requires a powered (active) converter to do so.* As a result of being primarily a PC standard, DisplayPort is almost never used in consumer-electronics devices, and HDMI has consequently been the dominant connector in both fields until very recently; DisplayPort only started to gain traction in recent years due to having superior bandwidth to HDMI [HDMI 2.0 cannot handle 4K/60Hz HDR @ 4:4:4; DisplayPort can (see the numbers below)].

          *Dual-mode DisplayPort (DP++) can drive single-link DVI/HDMI through a passive adapter. All other conversions (including dual-link DVI) require active adapters.
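
          The bracketed claim checks out on paper; a quick sketch, assuming the standard CTA 594 MHz 4K60 pixel clock and the nominal payload rates (HDMI 2.0: 18 Gbit/s TMDS with 8b/10b; DP 1.4 HBR3: 32.4 Gbit/s with 8b/10b):

            # 4K60 4:4:4 at 10-bit colour (30 bpp) on the standard
            # CTA-861 594 MHz pixel clock, blanking included.
            needed = 594e6 * 30 / 1e9      # ~17.8 Gbit/s
            hdmi20 = 18.0 * 8 / 10         # -> 14.4 Gbit/s: too small
            dp14 = 32.4 * 8 / 10           # -> 25.92 Gbit/s: fits
            print(f"needed {needed:.2f} / HDMI 2.0 {hdmi20:.1f} / DP 1.4 {dp14:.2f} Gbit/s")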

          DP is a free standard though, so for example DP's Adaptive Sync can be implemented by every vendor; HDMI only allows these things as vendor-specific extensions (which means others can't do it unless the vendor specifically opens it up)
          DisplayPort Adaptive Sync utilizes a specific extension that is not technically mainlined in the DP spec yet, and as a result not all vendors support it.

          EDIT

          Per: https://www.extremetech.com/extreme/...-points-beyond

          Once-optional features like Forward Error Correction and Panel Replay are now mandatory (Panel Replay is an advanced form of Panel Self Refresh). Adaptive-Sync, however, will remain an optional feature. Manufacturers are not required to support it.
          It looks like DisplayPort Adaptive-Sync will remain an optional part of the standard with DisplayPort 2.0.

          /EDIT

          HDMI is mainlining a VRR implementation as part of the HDMI 2.1 specification, and as a result every product that is HDMI 2.1 certified will support HDMI Variable Refresh Rate (VRR).
          Last edited by gamerk2; 27 June 2019, 11:41 AM.

          • #35
            Originally posted by carewolf View Post
            Replaced SCART and S-Video, you mean?
            No, I mean HDMI replaced Component Y-Pb-Pr video.

            Remember the Sony PS2 and Nintendo GameCube? Component YPbPr video was the ONLY way to get HD video out of them to your HDTV, because HDMI was not invented yet.

            The timeline for consumer video interfaces is:

            1957: Composite (max res 240p)
            1988: S-Video (max res 480i)
            1997: Component Y Pb Pr (max res 1080p)
            2003: HDMI
            Last edited by torsionbar28; 27 June 2019, 03:21 PM.

            • #36
              Originally posted by gamerk2 View Post
              Nope, later DVI spec updates added audio support. However, as DVI was rarely used outside the PC world, audio support was almost never implemented in products; I know of only a few monitors that supported audio over DVI.
              Nope, this is not true. Do you have a link to some documentation that identifies audio as an official addition to the published DVI standard?

              "Some DVI-D sources use non-standard extensions to output HDMI signals including audio (e.g. ATI 3000-series and NVIDIA GTX 200-series).[9] Some multimedia displays use a DVI to HDMI adapter to input the HDMI signal with audio. Exact capabilities vary by video card specifications."



              Originally posted by gamerk2 View Post
              I think VESA is forgetting HDMI 2.1 is a thing, as 2.1 supports 8K/120Hz (hell, it supports 10K/120Hz). This update is too little, too late.
              Good thing HDMI and DP are not competing interfaces. Each has its own use cases.
              Last edited by torsionbar28; 27 June 2019, 03:20 PM.

              • #37
                Originally posted by torsionbar28 View Post
                No, I mean HDMI replaced Component Y-Pb-Pr video.

                Remember the Sony PS2 and Nintendo GameCube? Component YPbPr video was the ONLY way to get HD video out of them to your HDTV, because HDMI was not invented yet.

                The timeline for consumer video interfaces is:

                1957: Composite (max res 240p)
                1988: S-Video (max res 480i)
                1997: Component Y Pb Pr (max res 1080p)
                2003: HDMI
                I was partially joking... The joke is that what you say is true where you grew up, but not everywhere. We had different plugs in Europe, in particular SCART, which could do all the formats including component video and RGB since 1976 (but which was a HUGE plug).

                Btw, your example numbers above are rather wrong. Most of those plugs are analog; they don't have max resolutions... Composite for instance can easily carry NTSC or PAL, so up to 480i and 576i. I used composite on the Amiga for 320p and 640i. SCART could do 1080p without modifications to the plug (though it needed better-shielded cables for longer distances), even though it was first introduced 20 years before any content used that resolution.
                Last edited by carewolf; 28 June 2019, 03:53 AM.

                • #38
                  And some cards can push out 2560x1600 over VGA, but the result may not be as sharp as you hope. I run mine at 1280x1024@75Hz and it's fine if you use the fine-tuning options on the monitor. Heck, that's all the framebuffer can handle in graphics console mode (and I had to pull it down to 8-bit colour to avoid lag on scrolling).

                  • #39
                    Originally posted by GreenReaper View Post
                    And some cards can push out 2560x1600 over VGA, but the result may not be as sharp as you hope. I run mine at 1280x1024@75Hz and it's fine if you use the fine-tuning options on the monitor. Heck, that's all the framebuffer can handle in graphics console mode (and I had to pull it down to 8-bit colour to avoid lag on scrolling).
                    Yeah, I forced my server to provide 2560x1440 over the VGA port so that I didn't need to get a DisplayPort KVM. Xorg is not happy with it, and Windows refused outright, but I made Xorg do it by giving it a custom modeline. You can tell it is borderline though: if the VGA cable touches any other cables or gets close to a speaker or power brick, the image is distorted. But I knew it had to be possible; 16 years ago I had a very late-gen CRT that could do 1920x1440 at 60Hz, and I knew even higher resolutions existed and that analog signals measure resolution in lines. So, assuming the cable is shielded enough, it should be entirely possible if only the software lets you.
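
                    For anyone wanting to reproduce this, a sketch of the arithmetic using the standard CVT reduced-blanking timings for 2560x1440 @ 60 Hz (the VGA-1 output name is an assumption; check yours with xrandr -q):

                      # CVT reduced-blanking timings for 2560x1440 @ 60 Hz -- the
                      # custom modeline Xorg has to be fed for this to work.
                      hact, hss, hse, htot = 2560, 2608, 2640, 2720
                      vact, vss, vse, vtot = 1440, 1443, 1448, 1481
                      clock_mhz = 241.50  # ~htot * vtot * 60 Hz, on the CVT 0.25 MHz grid

                      print(f"actual refresh: {clock_mhz * 1e6 / (htot * vtot):.2f} Hz")  # ~59.95
                      print(f'xrandr --newmode "2560x1440R" {clock_mhz} '
                            f"{hact} {hss} {hse} {htot} {vact} {vss} {vse} {vtot} +hsync -vsync")
                      print("xrandr --addmode VGA-1 2560x1440R")
                      print("xrandr --output VGA-1 --mode 2560x1440R")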

                    • #40
                      You will also be limited by the hardware at each end. In particular, the video card's RAMDAC only goes up so high. Reducing the blanking period and refresh rate can help, again if supported at both ends.
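
                      To put rough numbers on that, a sketch using the CVT timings for the 2560x1600 mode mentioned above (the 300 MHz RAMDAC ceiling is an assumed figure, purely for illustration):

                        # Pixel clock = htotal * vtotal * refresh; blanking is pure
                        # overhead, which is why reduced blanking lowers the clock.
                        modes = {
                            "CVT standard blanking": (3504, 1658),  # htotal, vtotal @ 60 Hz
                            "CVT reduced blanking":  (2720, 1646),
                        }
                        ramdac_mhz = 300.0  # assumed older-RAMDAC ceiling, illustrative only

                        for name, (htot, vtot) in modes.items():
                            clock = htot * vtot * 60 / 1e6
                            verdict = "fits" if clock <= ramdac_mhz else "exceeds the RAMDAC"
                            print(f"{name}: {clock:6.1f} MHz -> {verdict}")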
