
DisplayPort 2.0 Published For 3x Increase In Data Bandwidth Performance


  • GreenReaper
    replied
    And some cards can push out 2560x1600 over VGA, but the result may not be as sharp as you hope. I run mine at 1280x1024@75Hz and it's fine if you use the fine-tuning options on the monitor. Heck, that's all the framebuffer can handle on graphics console mode (and I had to pull it down to 8-bit colour to avoid lag on scrolling).
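The 8-bit trick above comes down to scan-out bandwidth: halving or quartering the bytes per pixel cuts how fast the framebuffer must be read every refresh. A rough sketch (active pixels only, blanking intervals ignored, so the real numbers are somewhat higher):

```python
# Rough scan-out bandwidth for the framebuffer console case above.
# Counts active pixels only (no blanking), purely illustrative.

def scanout_mb_per_s(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """MB/s the framebuffer must be read at to feed the display."""
    return width * height * refresh_hz * (bits_per_pixel / 8) / 1e6

# 1280x1024 @ 75 Hz:
print(scanout_mb_per_s(1280, 1024, 75, 8))   # 8-bit colour: ~98 MB/s
print(scanout_mb_per_s(1280, 1024, 75, 32))  # 32-bit colour: ~393 MB/s
```

A fourfold drop in sustained memory traffic is plausibly the difference between smooth and laggy scrolling on an old card.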



  • carewolf
    replied
    Originally posted by torsionbar28 View Post
    No, I mean HDMI replaced Component Y-Pb-Pr video.

    Remember the Sony PS2 and Nintendo Game Cube? Component YPbPr video was the ONLY way to get HD video out of them to your HDTV, because HDMI was not invented yet.

    The timeline for consumer video interfaces is:

    1957: Composite (max res 240p)
    1988: S-Video (max res 480i)
    1997: Component Y Pb Pr (max res 1080p)
    2003: HDMI
    I was partially joking. The joke is that what you say is true where you grew up, but not everywhere. We had different plugs in Europe, in particular SCART, which could do all the formats including component video and RGB, since 1976 (though it was a HUGE plug).

    Btw, your example numbers above are rather wrong. Most of those plugs are analog; they don't have max resolutions. Composite, for instance, can easily carry NTSC or PAL, so up to 480i or 576i. I used composite on the Amiga for 320p and 640i. SCART could do 1080p without modifications to the plug (though it needed better-shielded cables for longer distances), even though it was first introduced 20 years before any content used that resolution.
    Last edited by carewolf; 28 June 2019, 03:53 AM.



  • torsionbar28
    replied
    Originally posted by gamerk2 View Post
    Nope, later DVI spec updates added audio support. However, as DVI has typically never been used outside the PC world, it's almost never been supported by products. I know of only a few monitors that supported audio over DVI.
    Nope, this is not true. Do you have a link to some documentation that identifies audio as an official addition to the published DVI standard?

    "Some DVI-D sources use non-standard extensions to output HDMI signals including audio (e.g. ATI 3000-series and NVIDIA GTX 200-series).[9] Some multimedia displays use a DVI to HDMI adapter to input the HDMI signal with audio. Exact capabilities vary by video card specifications."



    Originally posted by gamerk2 View Post
    I think VESA is forgetting HDMI 2.1 is a thing, as 2.1 supports 8k/120Hz (hell, it supports 10k/120Hz). This update is too little too late.
    Good thing HDMI and DP are not competing interfaces. Each has its own use cases.
    Last edited by torsionbar28; 27 June 2019, 03:20 PM.



  • torsionbar28
    replied
    Originally posted by carewolf View Post
    Replaced SCART and S-Video, you mean?
    No, I mean HDMI replaced Component Y-Pb-Pr video.

    Remember the Sony PS2 and Nintendo Game Cube? Component YPbPr video was the ONLY way to get HD video out of them to your HDTV, because HDMI was not invented yet.

    The timeline for consumer video interfaces is:

    1957: Composite (max res 240p)
    1988: S-Video (max res 480i)
    1997: Component Y Pb Pr (max res 1080p)
    2003: HDMI
    Last edited by torsionbar28; 27 June 2019, 03:21 PM.



  • gamerk2
    replied
    Originally posted by SyXbiT View Post
    I'm not an expert, so I'm happy to be corrected. It used to be that DVI was for monitors, and HDMI (because it could carry sound) was for TVs. Then DVI got support for sound. Then DisplayPort replaced DVI.

    DisplayPort and HDMI both transfer sound/video and are both fairly small in size (both have mini ports as well), and both handle really high refresh rates and resolutions.
    Are there pros/cons that make one better suited to TV or Monitor, or do we have two similar standards just for historical reasons?
    HDMI was created by a consortium of consumer-electronics companies to be primarily a digital connector to display HD content on Televisions, which at that time were using primarily DVI-HDCP or DVI-HDTV. It was designed as a superset of these standards, with additional YCbCr features thrown on top.

    HDMI's electrical signals are identical to DVI's, since HDMI is a superset of DVI-HDCP and DVI-HDTV. As a result, passive converters can convert between the two formats (including the rarely used audio-over-DVI) without issue.

    Displayport was designed by VESA primarily as a royalty-free display standard to replace existing PC display standards (chiefly VGA and DVI). As its electrical signaling is different, Displayport cannot passively carry DVI/HDMI signals and requires a powered (active) converter to do so.* As a result of being primarily a PC standard, Displayport is almost never used in consumer electronics, and HDMI has until very recently been the dominant connector in both fields; Displayport only started to gain traction in recent years due to having superior bandwidth to HDMI [HDMI 2.0 cannot handle 4k/60Hz HDR @ 4:4:4; Displayport can].

    *Dual-mode Displayport cables can convert single-link DVI/HDMI passively. All other conversions (including dual-link DVI) require active adapters.
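The bracketed 4K claim above is easy to check with back-of-envelope arithmetic. The sketch below assumes the standard CEA 4K60 raster (4400 x 2250 total pixels including blanking) and HDMI 2.0's 18 Gbit/s TMDS rate with 8b/10b encoding; treat the exact figures as illustrative:

```python
# Why HDMI 2.0 cannot carry 4K/60 Hz at 10 bits per component (30 bpp)
# with full 4:4:4 chroma. Assumes the standard CEA 4K60 timing of
# 4400 x 2250 total pixels (active raster plus blanking).

def data_rate_gbps(total_w: int, total_h: int, refresh_hz: int, bpp: int) -> float:
    """Link data rate needed for a given raster, in Gbit/s."""
    return total_w * total_h * refresh_hz * bpp / 1e9

required = data_rate_gbps(4400, 2250, 60, 30)   # ~17.8 Gbit/s needed
hdmi20_payload = 18.0 * 8 / 10                  # 14.4 Gbit/s after 8b/10b overhead
print(required, hdmi20_payload, required > hdmi20_payload)
```

At ~17.8 Gbit/s required versus 14.4 Gbit/s available, HDMI 2.0 has to drop to 8-bit colour or 4:2:2/4:2:0 chroma for 4K60.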

    DP is a free standard though, so for example DP's Adaptive Sync can be implemented by every vendor, meanwhile HDMI only allows these things as vendor-specific extensions (which means others can't do it unless the vendor specifically opens it up)
    Displayport Adaptive Sync utilizes a specific extension that is not technically mainlined in the DP spec yet, and as a result not all vendors support it.

    EDIT

    Per: https://www.extremetech.com/extreme/...-points-beyond

    Once-optional features like Forward Error Correction and Panel Replay are now mandatory (Panel Replay is an advanced form of Panel Self Refresh). Adaptive-Sync, however, will remain an optional feature. Manufacturers are not required to support it.
    It looks like Displayport Adaptive-Sync will remain an optional part of the standard with Displayport 2.0.

    /EDIT

    HDMI is mainlining a VRR implementation as part of HDMI 2.1's specification, and as a result every product that is HDMI 2.1 certified will support HDMI Variable Rate Refresh.
    Last edited by gamerk2; 27 June 2019, 11:41 AM.



  • gamerk2
    replied
    Originally posted by torsionbar28 View Post
    If that's true, then it isn't DVI. DVI is a published standard, and it doesn't include audio, period. What you're describing is some proprietary nVidia thing that just happens to utilize the DVI connector.

    To make an analogy, if I hack together a way to attach a printer to my PC, using USB protocol, over an HDMI cable, that doesn't mean "HDMI supports printers".
    Nope, later DVI spec updates added audio support. However, as DVI has typically never been used outside the PC world, it's almost never been supported by products. I know of only a few monitors that supported audio over DVI.



  • gamerk2
    replied
    This means that DP 2.0 is the first standard to support 8K resolution (7680 x 4320) at 60 Hz refresh rate with full-color 4:4:4 resolution, including with 30 bits per pixel (bpp) for HDR-10 support.
    I think VESA is forgetting HDMI 2.1 is a thing, as 2.1 supports 8k/120Hz (hell, it supports 10k/120Hz). This update is too little too late.
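For the 8K numbers being argued here, the same bandwidth arithmetic applies. This sketch ignores blanking intervals (so it is a lower bound on the required rate) and uses DP 2.0's UHBR20 mode (80 Gbit/s raw, 128b/132b line coding; VESA quotes ~77.37 Gbit/s effective after remaining overheads):

```python
# Sanity check on the article's claim: uncompressed payload for
# 8K (7680x4320) @ 60 Hz at 30 bpp versus DP 2.0's maximum link rate.
# Blanking intervals are ignored, so the required rate is a lower bound.

def video_payload_gbps(width: int, height: int, refresh_hz: int, bpp: int) -> float:
    """Raw video payload for active pixels only, in Gbit/s."""
    return width * height * refresh_hz * bpp / 1e9

eight_k_60 = video_payload_gbps(7680, 4320, 60, 30)  # ~59.7 Gbit/s
dp20_payload = 80.0 * 128 / 132                      # ~77.6 Gbit/s after 128b/132b
print(eight_k_60, dp20_payload, eight_k_60 < dp20_payload)
```

So 8K/60 HDR at 4:4:4 fits uncompressed in DP 2.0's link budget; the higher HDMI 2.1 modes mentioned above rely on DSC compression to go further.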



  • Quppa
    replied
    I'm still waiting for a good, affordable 8-bit 120Hz 4K ~24" panel (ideally with built-in nearest neighbour scaling from 1080p and 720p), which is achievable with DisplayPort 1.3/1.4, so I'm not holding my breath for DisplayPort 2.0 devices. It's nice to dream about a 10-bit 144Hz 4K screen, though.
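The nearest-neighbour wish makes sense because 1080p is exactly half of 4K in each dimension, so every source pixel maps to a clean 2x2 block with no interpolation blur. A minimal pure-Python sketch of that integer upscale:

```python
# Nearest-neighbour integer upscaling: each source pixel is replicated
# into a factor x factor block. 1080p -> 4K is the factor=2 case.

def nearest_neighbour_upscale(pixels: list[list[int]], factor: int) -> list[list[int]]:
    """pixels: 2D list of rows; returns the image scaled by an integer factor."""
    out = []
    for row in pixels:
        scaled_row = [p for p in row for _ in range(factor)]   # widen the row
        out.extend([scaled_row[:] for _ in range(factor)])     # repeat it vertically
    return out

# A 2x2 "image" doubled to 4x4, each pixel now a sharp 2x2 block:
print(nearest_neighbour_upscale([[1, 2], [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Most monitor scalers instead use bilinear or similar filtering, which is why integer content looks soft; a built-in nearest-neighbour mode would sidestep that.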



  • ms178
    replied
    Originally posted by torsionbar28 View Post
    Well said brother, I propose the industry standardize on a single connector for all digital communications, regardless of type or purpose. The one connector to rule them all, shall be the 50 pin Centronics. WHO'S WITH ME?!
    Haha, please not that one or ATA-Cables! But seriously, what is wrong with a single device-to-display connector? That role could be well defined after all and HDMI and DisplayPort converging here on a single connector would surely simplify the life of vendors and consumers alike. You don't have to overload it with functionality though, there is USB 4 for that.
    Last edited by ms178; 27 June 2019, 09:50 AM.



  • carewolf
    replied
    We can barely get support for DP 1.3, let alone 1.4...

    I hope USB 4 (C) will include DP 2.0, and make it a minimum for a new logo indicating DP capability.

