DisplayPort 2.1b Arriving This Spring With DP80LL Cables

  • hajj_3
    Senior Member
    • Feb 2013
    • 327

    #21
    Originally posted by KoenDG View Post
    Wasn't the whole thing with 2.1a that it wasn't a requirement to implement the entire spec? That you could call yourself "2.1a" if you implemented even a single feature and none of the rest?

    Or am I misremembering?
    Are you thinking of OpenCL?

    Comment

    • thrashwerk
      Junior Member
      • Dec 2024
      • 1

      #22
      What's with cables and terrible naming? Why 2.1b? Why not 2.2 or 3? Seems like a big enough improvement that it could've really been called DP 2.2.

      I guess we should be happy that at least it isn't as bad as USB. Hope the next one won't be DP 2.1c Gen5x13 eXtreme.

      Comment

      • Old Grouch
        Senior Member
        • Apr 2020
        • 693

        #23

        Originally posted by Alexmitter


While it's true that frame rate is lacking, it's a known fact that anything much above 30 FPS (yeah, not exactly that, but for practical purposes let's call it that) feels really weird for cinema. So the only market this could really benefit is gaming and maybe regular TV, but the benefits are complicated and often not actually there.

        https://www.quora.com/Why-does-60fps...in-video-games

        Originally posted by Noitatsidem View Post

        you people are why we'll never have smooth motion in movies & they'll forever be a slideshow.
Slideshow? More like a stutter- or judder-show.

Joking aside, people (both viewers and cinematographers) are acclimatised to the effects of the 24 fps frame rate of cinema. There was a lot of comment about Peter Jackson choosing to shoot The Hobbit at 48 fps. Some different views:

"Peter Jackson, a terrifically talented film maker and pioneer of new cinema technology, has given the world of cinema a very important, and perhaps…"

"The Hobbit: An Unexpected Journey is a roaring success. Film critics have not been as positive about The Hobbit as fans, but they positively heaped elevenses of derision upon Peter Jackson's decision to 'experiment' with High Frame Rate in such a high profile feature."
Cinematographers have been very careful to make sure that camera pans were carefully controlled (kept slow enough) so that people's brains could still 'stitch together' the successive images and participate in the illusion of motion. Sports videography does not have that opportunity: if the camera moves with the ball, the background becomes jerky, and if the camera stays still, the ball's motion becomes jerky, so you get things like artificially generated interpolation (the soap-opera effect) to compensate. At a high enough frame rate, you don't need such tricks.

I'm unfortunate in that I get nauseous if I watch standard frame rate TV in a darkened room, because my peripheral vision notices the flickering. Classic cinema 35mm film projectors used a double shutter to illuminate each frame twice, moving the film once for every two 'flashes', giving a 'frame rate' of 48 fps where each image was projected twice. This was to reduce the perception of flicker. (In each second, 24 images are recorded on the film. On projection, each image is projected once, the shutter drops (so you get a black screen), the same image is reprojected, the shutter drops again, and the film is advanced by one image while the shutter is down; then the cycle repeats, 24 times a second. So in each 24th of a second the sequence is 'image-black-image_repeat-black', giving 96 'images' per second, half of which are black, and the other half consisting of pairs of identical images.)
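The double-shutter arithmetic above can be sketched as a toy calculation (numbers taken from the description, not from any real projector spec):

```python
# Toy model of a classic 35mm double-shutter projector:
# 24 film frames per second, each frame flashed twice,
# with a black (shutter-closed) phase after every flash.
FILM_FPS = 24
FLASHES_PER_FRAME = 2

light_events = FILM_FPS * FLASHES_PER_FRAME   # 48 visible flashes per second
black_events = light_events                   # one black phase per flash
total_events = light_events + black_events    # 96 alternating phases per second

print(f"{light_events} flashes/s, {total_events} light/black phases per second")
```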



        It turns out that with LCD and OLED monitors having display characteristics different to that of CRTs, it helps to insert black frames to remove motion blur - Black Frame Insertion (BFI) - just like the shutter in cinema projectors.



"In this guide, we will explain what black frame insertion (BFI) is, how it works, and why it is a critical technique to minimize blur in gaming monitors."


        So having an LCD/OLED display technology that operates at high frame rates with half of the frames being black actually helps.

        There are good reasons why gamers like high frame rate screens.





        Comment

        • Anon'ym'
          Phoronix Member
          • Jul 2021
          • 55

          #24
I still cannot comprehend why the industry is ignoring optical cabling...

          Comment

          • schmidtbag
            Senior Member
            • Dec 2010
            • 6614

            #25
            Originally posted by Anon'ym' View Post
I still cannot comprehend why the industry is ignoring optical cabling...
            Agreed. We did it with SPDIF in the 80s. The technology exists and it's not difficult.
I understand they want to preserve backward compatibility, but then just give us an optical adapter. So basically, we get 1 m copper cables, and for anything longer you plug in an active adapter that perhaps uses standard SC fiber connectors. They could even pull off something like SPDIF did, where you get both fiber and a 3.5mm jack in the same port, so you can use whichever technology suits you best.

            Comment

            • skeevy420
              Senior Member
              • May 2017
              • 8627

              #26
              Originally posted by Old Grouch View Post

              As far as television is concerned, improving the resolution is less important than improving the frame rate. The limit of the acuity of the human eye has (effectively) been reached: the limit of the human visual system to perceive problems with the display of moving objects (both high-speed linear, and rotational) has not been reached.

              To quote: NABAmplify: Your Eyes vs. Frame Rates: What You Can (and Can’t) See

The human visual system does not have a 'frame rate' as such - it does not work like a digital video camera, or a traditional film camera with a 25 fps shutter, but it is sensitive to the discrepancies between projecting an image of moving objects at a certain frame rate and 'smooth' reality. Increasing the display frame rate reduces those discrepancies, but as the example shows, artificially contrived (pathological) situations can be generated that expose the limitations of frame-based displays.

              Note that this is not related to the lower limit of about 10 frames per second needed for images to generate perceived 'apparent motion'.
              I was just copy/pasting their information so it'd be easier for other people to find.

As far as high frame rates go, my only issue, if you'd call it that, is less with how absurd it sounds and more with how much power it'll take for a GPU to generate 4K600, with or without frame generation or upscaling, and how much power the monitor will need to refresh itself that fast. That would have to be one hell of a GPU if we're talking modern gaming. Anyone who's alternated between 30, 60, 144, and higher knows that more frames do matter for overall smoothness. There's also the whole resolution/distance/DPI aspect regarding how much we can perceive, too.
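For a sense of scale, the raw bandwidth of that 4K600 scenario can be estimated (assuming uncompressed 10-bit RGB and ignoring blanking overhead, so treat the numbers as order-of-magnitude only):

```python
# Rough uncompressed bandwidth estimate for 4K at 600 Hz.
# 10 bits per colour channel, RGB => 30 bits per pixel (an assumption
# for illustration; real links add blanking/encoding overhead).
width, height = 3840, 2160
refresh_hz = 600
bits_per_pixel = 30

bandwidth_bps = width * height * refresh_hz * bits_per_pixel
print(f"~{bandwidth_bps / 1e9:.0f} Gbit/s uncompressed")  # ≈ 149 Gbit/s

DP80_LINK_GBPS = 80  # DP80 raw link rate (UHBR20 x 4 lanes)
print("Fits in DP80 without compression:", bandwidth_bps / 1e9 <= DP80_LINK_GBPS)
```

Under these assumptions it overshoots even an 80 Gbit/s link, which is why DSC or chroma subsampling would be in play for such a mode.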

Anyhoo, I'm currently at the other end of the spectrum, playing Metal Gear Solid PSX emulated at 20/30 FPS, totally defeating the purpose of having a 1440p/144 Hz gaming monitor.

              Comment

              • Ferrum Master
                Phoronix Member
                • Feb 2024
                • 112

                #27
Cable this, adapter on that...

Make an optical DisplayPort; whatever, just use the same SFP cage. It works for networking, is fairly cheap, and cable length is not an issue.

But we have to do dumb shit all the time, don't we?

                Comment

                • anarki2
                  Senior Member
                  • Mar 2010
                  • 857

                  #28
                  Originally posted by cl333r View Post

I was actually surprised recently, while looking for an HDR TV to use as a monitor, that the cheap ones were all HDMI, none DP. Now I see why.
                  Nah.

                  The real reason is that pretty much all TV makers are members of the HDMI consortium. So to them, license fees don't really matter, it's just a matter of accounting, they're moving money from one pocket to the other.

                  Comment

                  • TheLexMachine
                    Senior Member
                    • Jan 2013
                    • 420

                    #29
                    Originally posted by bemerk View Post
Where are the attempts to get something like CEC & ARC into the DP spec, so we can finally move to a common standard for TVs and computers?
CEC is locked behind licensing and patents, which I believe are due to expire this decade. I'm not sure about eARC, but DisplayPort could probably do eARC since it handles lossless audio just fine; it would, however, require a standards revision and new chipset designs to implement it in hardware. Regardless, there's no movement on that, and the industry is pushing to get rid of DP entirely, simply because it's easier for all parties to go with HDMI.

                    Comment

                    • DumbFsck
                      Senior Member
                      • Dec 2023
                      • 325

                      #30
People talking about using optical could be better informed.

It is too expensive: the light source has to be high quality and fast-pulsing, which is expensive, and it gets hot, so you need a way to dissipate the heat and possibly actively cool it.

All of that is undesirable in consumer and "cheap" home appliances like TVs.

SPDIF is VERY different, as it has basically no bandwidth at all (by comparison).
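That gap can be put in rough numbers (the S/PDIF figure below assumes 2-channel frames of 64 bits at a 192 kHz sample rate; real-world rates vary, so this is an order-of-magnitude sketch):

```python
# Order-of-magnitude comparison of link bandwidths (assumed figures).
SPDIF_BPS = 192_000 * 64   # 2-ch, 32-bit subframes at 192 kHz: ~12.3 Mbit/s
DP80_BPS = 80e9            # DP80 raw link rate: 80 Gbit/s

ratio = DP80_BPS / SPDIF_BPS
print(f"DP80 carries roughly {ratio:,.0f}x the bits of S/PDIF")
```

So a video-rate optical link has to run thousands of times faster than the cheap LED-driven TOSLINK transmitters ever did, which is where the cost and heat come from.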

                      Comment
