
DisplayPort 2.0 Published For 3x Increase In Data Bandwidth Performance


  • coder
    replied
    Originally posted by gamerk2 View Post
    1080p/120Hz/4:4:4 HDR or 4k/60Hz/4:2:2 HDR (due to HDMI bandwidth limitations). And that's native 120Hz, not interpolated.
    True 120 Hz? Not bad.

    Maybe the reason it's so cheap is because it's an older model lacking HDMI 2.1. That would enable 4k @ 4:4:4 120 Hz HDR, no problem. 2.1 also adds variable refresh rate, so that would be a slam dunk, if all you want to use it for is gaming & movies.

    Originally posted by gamerk2 View Post
    Computer displays are not special anymore.
    Yeah, but computer displays are also getting cheaper and better. OLED isn't yet making inroads there, because burn-in, but HDR support and color gamut are still improving. And if you like curved screens and ridiculous aspect ratios, you definitely ought to be shopping for a PC monitor.



  • gamerk2
    replied
    Originally posted by coder View Post
    Yeah? And what refresh rates does it support?
    1080p/120Hz/4:4:4 HDR or 4k/60Hz/4:2:2 HDR (due to HDMI bandwidth limitations). And that's native 120Hz, not interpolated.
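    Those modes line up with a quick back-of-envelope check against HDMI 2.0's roughly 14.4 Gbit/s of usable video bandwidth (a sketch only; it ignores blanking intervals, so real requirements run somewhat higher):

```python
# Uncompressed video bandwidth, ignoring blanking (so an underestimate)
def video_gbps(width, height, fps, bits_per_component, subsampling):
    # samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return width * height * fps * bits_per_component * samples / 1e9

HDMI_2_0_GBPS = 14.4  # 18 Gbit/s TMDS minus 8b/10b coding overhead

for mode in [(3840, 2160, 60, 10, "4:2:2"),    # 4k/60 HDR 4:2:2 -> ~10.0, fits
             (3840, 2160, 60, 10, "4:4:4"),    # 4k/60 HDR 4:4:4 -> ~14.9, too big
             (1920, 1080, 120, 10, "4:4:4")]:  # 1080p/120 HDR   -> ~7.5, fits
    print(mode, round(video_gbps(*mode), 1))
```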

    Computer displays are not special anymore.



  • coder
    replied
    Originally posted by carewolf View Post
My modelines have always worked fine,
    It's not that they didn't work, but just that the monitor would take a little longer to sync and maybe the image would be a bit less stable or more fuzzy. It's been so long that I'm a little foggy on exactly what the differences were.



  • carewolf
    replied
    Originally posted by coder View Post
    My custom modelines never worked as well as whatever timings the windows driver came up with. I'm so glad those days are behind us.
My modelines have always worked fine, and sometimes better than the defaults; it's just a nuisance to have to configure the monitor manually, when Xorg can do it automatically and even has the modeline in question. Somebody has just convinced people that VGA can't handle 2560x1440, so both Windows and Xorg remove it from the list of possible resolutions. Adding the modeline back manually in Xorg is how I got it back as a supported resolution, but I just used the standard modeline it had removed as not possible.
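    For reference, feeding Xorg a mode like that by hand usually looks something like this in xorg.conf (a sketch; the timing numbers are CVT reduced-blanking values as generated by `cvt -r 2560 1440 60`, and the identifier names are assumptions):

```
Section "Monitor"
    Identifier "VGA-Monitor"
    # CVT reduced-blanking timings keep the pixel clock at 241.5 MHz,
    # low enough for many VGA DACs (output of `cvt -r 2560 1440 60`)
    Modeline "2560x1440R" 241.50 2560 2608 2640 2720 1440 1443 1448 1481 +hsync -vsync
    Option "PreferredMode" "2560x1440R"
EndSection
```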



  • coder
    replied
    Originally posted by gamerk2 View Post
    Technically, they have the same use cases,
    Some A/V-specific use cases:
    • CEC (Consumer Electronics Control)
    • lip-sync (specifically calibrating & communicating the delays for a different audio and video path)
    • ARC (Audio Return Channel)
    • YCbCr 4:2:0 video
    • Embedded networking
    Some computer-specific use cases:
    • daisy-chaining displays
    • RGB
    • USB tunneling
Also, TVs are usually displaying content that has already been subject to lossy compression, whereas computers are typically displaying text or other content that's sensitive to compression artifacts. Another good example is DisplayPort 2.0's new Panel Replay feature, which only makes sense when significant parts of the screen remain static most of the time - something that's common for computers but rare for TVs. Not to mention the whole subject of DisplayPort <-> Thunderbolt 3 <-> USB4 interoperability. I could go on, but I think it's safe to say that computer video and A/V still have a decent number of distinct use cases.

    Originally posted by gamerk2 View Post
    I can get a 55" OLED TV for literally half the price of some top-end 32" panels),
    Yeah? And what refresh rates does it support?

    If you want a TV-sized gaming monitor, Nvidia will sell you one.
    Last edited by coder; 07-06-2019, 07:41 PM.



  • coder
    replied
    Originally posted by carewolf View Post
Yeah, I forced my server to provide 2560x1440 over the VGA port so that I didn't need to get a DisplayPort KVM. Xorg is not happy with it, and Windows refused me, but I made Xorg do it by giving it a custom modeline.
    My custom modelines never worked as well as whatever timings the windows driver came up with. I'm so glad those days are behind us.

    Originally posted by carewolf View Post
16 years ago I had a very late-gen CRT that could do 1920x1440 at 60Hz, and I knew even higher resolutions existed and that analog signals measure resolution in lines, so assuming the cable is shielded well enough, it should be entirely possible if the software only lets us.
    The first thing you need is a video card with a RAMDAC that can clock high enough. Then, you need a monitor that can handle it. However, to resolve the detail, your monitor also needs a fine dot pitch. Trinitron displays could resolve as much vertical resolution as you threw at them, if the beam focus was sharp enough, but horizontal resolution was still limited by the grille pitch.
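    As a rough illustration of what the RAMDAC has to sustain (a sketch with assumed GTF-style blanking overheads; exact timings vary per mode), the pixel clock is the total scan area, blanking included, times the refresh rate:

```python
# Approximate pixel clock for a CRT mode, assuming roughly 30% horizontal
# and 4% vertical blanking overhead (GTF-style timings differ slightly)
def pixel_clock_mhz(width, height, refresh_hz, h_blank=1.30, v_blank=1.04):
    return width * h_blank * height * v_blank * refresh_hz / 1e6

# The 1920x1440@60 CRT mode mentioned above needs a RAMDAC
# good for well over 200 MHz
print(round(pixel_clock_mhz(1920, 1440, 60)))  # ~224
```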

I didn't appreciate this until I tried a monochrome font (green on black) on a cheap CRT monitor. I could see the projection of the shadow mask as discrete dots on the screen, which I hadn't seen on a more expensive model. If I used secondary font colors for my terminal windows, it wasn't so bad.



  • coder
    replied
    Originally posted by gamerk2 View Post
    I think VESA is forgetting HDMI 2.1 is a thing, as 2.1 supports 8k/120Hz (hell, it supports 10k/120Hz). This update is too little too late.
I think you're forgetting that HDMI requires royalties and only supports that mode using lossy compression. For a number of reasons, this update makes good sense.
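    The raw numbers bear this out (a sketch using 10-bit RGB and ignoring blanking; the payload figures are the commonly cited ~42.6 Gbit/s for HDMI 2.1 FRL and ~77.4 Gbit/s for DP 2.0 UHBR20):

```python
# Uncompressed 10-bit RGB bandwidth, ignoring blanking
def rgb_gbps(width, height, fps, bits_per_component=10):
    return width * height * fps * bits_per_component * 3 / 1e9

HDMI_2_1_GBPS = 42.6   # FRL max payload
DP_2_0_GBPS   = 77.37  # UHBR20 max payload

print(round(rgb_gbps(7680, 4320, 60), 1))   # ~59.7: fits DP 2.0 raw, needs DSC on HDMI 2.1
print(round(rgb_gbps(7680, 4320, 120), 1))  # ~119.4: needs DSC on either link
```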



  • gamerk2
    replied
    Originally posted by torsionbar28 View Post
    Good thing HDMI and DP are not competing interfaces. Each has its own use cases.
    Technically, they have the same use cases, but are being pushed by two different groups. They both have more or less the same capabilities when it comes to audio/video transport; it's the extra processing each does that tends to differentiate them functionally.

To be perfectly honest, I don't see a need to have two competing digital standards that cover more or less the same use cases. And for people like me who game primarily on large TVs (due to displays basically topping out at 32"; never mind that I can get a 55" OLED TV for literally half the price of some top-end 32" panels), DisplayPort is simply not an option.

    Nope, this is not true. Do you have a link to some documentation that identifies audio as an official addition to the published DVI standard?
Yeah, it looks like you're right; the specification was unofficial and basically just used the unused signal pins that HDMI uses for audio. And since the electrical signals for DVI and HDMI are identical, this allows an easy mechanism to support audio over DVI.
    Last edited by gamerk2; 06-28-2019, 09:11 AM.



  • GreenReaper
    replied
    You will also be limited by the hardware at each end. In particular, the video card's RAMDAC only goes up so high. Reducing the blanking period and refresh rate can help, again if supported at both ends.



  • carewolf
    replied
    Originally posted by GreenReaper View Post
And some cards can push out 2560x1600 over VGA, but the result may not be as sharp as you hope. I run mine at 1280x1024@75Hz and it's fine if you use the fine-tuning options on the monitor. Heck, that's all the framebuffer can handle in graphics console mode (and I had to pull it down to 8-bit colour to avoid lag on scrolling).
Yeah, I forced my server to provide 2560x1440 over the VGA port so that I didn't need to get a DisplayPort KVM. Xorg is not happy with it, and Windows refused me, but I made Xorg do it by giving it a custom modeline. You can tell it is borderline, though: if the VGA cable touches any other cables or gets close to a speaker or power brick, the image is distorted. But I knew it had to be possible; 16 years ago I had a very late-gen CRT that could do 1920x1440 at 60Hz, and I knew even higher resolutions existed and that analog signals measure resolution in lines, so assuming the cable is shielded well enough, it should be entirely possible if the software only lets us.

