DisplayPort 2.1b Arriving This Spring With DP80LL Cables
Originally posted by Alexmitter
While it's true that frame rate is lacking, it's a known fact that anything higher than 30 FPS (yeah, not quite that, but for practical purposes let's call it that) feels really weird for cinema. So the only market this could really benefit is gaming, and maybe regular TV, but the benefits are complicated and often not actually there.
https://www.quora.com/Why-does-60fps...in-video-games
Originally posted by Noitatsidem View Post
you people are why we'll never have smooth motion in movies & they'll forever be a slideshow.
Joking aside, people (both viewers and cinematographers) are acclimatised to the effects of the 24 fps frame rate of cinema. There was a lot of comment about Peter Jackson choosing to shoot The Hobbit at 48 fps. Some different views:
Peter Jackson, a terrifically talented film maker and pioneer of new cinema technology, has given the world of cinema a very important, and perhaps ...
The Hobbit: An Unexpected Journey is a roaring success. Film critics have not been as positive about The Hobbit as fans, but they positively heaped elevenses of derision upon Peter Jackson's decision to "experiment" with High Frame Rate in such a high profile feature.
Cinematographers have been very careful to make sure that camera pans were carefully controlled (kept slow enough) so that people's brains could still 'stitch together' the successive images to participate in the illusion of motion. Sports videography does not have that opportunity - if the camera moves with the ball, the background becomes jerky, and if the camera stays still, the ball motion becomes jerky - so you get things like artificially generated interpolation (the soap-opera effect) to compensate. At a high enough frame rate, you don't need such tricks.
I'm unfortunate in that I get nauseous if I watch standard frame rate TV in a darkened room, because my peripheral vision notices the flickering. Classic cinema 35mm film projectors used a double shutter to illuminate each frame twice - moving the film once for every two 'flashes' - giving a 'frame rate' of 48 fps, where each image was projected twice. This was to reduce the perception of flicker. (In each second, 24 images are recorded on the film. On projection, each image is projected once, the shutter drops (so you get a black screen), the same image is projected again, the shutter drops once more, and the film is advanced by one image while the shutter is down; then the cycle repeats, 24 times a second. So, in each 24th of a second, the sequence is 'image-black-image_repeat-black', giving 96 'images' per second, half of which are black, with the other half consisting of pairs of identical images.) Technical discussion here:
It turns out that because LCD and OLED monitors have display characteristics different from those of CRTs, it helps to insert black frames to remove motion blur - Black Frame Insertion (BFI) - just like the shutter in cinema projectors.
In this guide, we will explain what black frame insertion (BFI) is, how it works, and why it is a critical technique to minimize blur in gaming monitors.
So having an LCD/OLED display technology that operates at high frame rates with half of the frames being black actually helps.
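As a purely illustrative sketch (my own example in Python, not from the linked guide), the arithmetic behind both ideas is the same: a 24 fps film print flashed twice per frame gives 48 pulses a second, and a hypothetical 120 Hz panel showing 60 fps content with every other refresh black produces the same kind of image/black cadence electronically:

    # Projector double-shutter cadence vs. electronic Black Frame Insertion (BFI).
    film_fps = 24
    flashes_per_image = 2                        # two-blade shutter: each frame flashed twice
    print(film_fps * flashes_per_image, "light pulses per second from the projector")  # 48

    panel_hz = 120                               # hypothetical BFI-capable panel
    content_fps = 60
    refreshes_per_frame = panel_hz // content_fps
    schedule = []
    for frame in range(content_fps):             # one second of content
        schedule.append(f"frame {frame}")        # first refresh cycle shows the image
        schedule.extend(["black"] * (refreshes_per_frame - 1))  # remaining cycles are black
    print(len(schedule), "refresh cycles,", schedule.count("black"), "of them black")  # 120, 60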
There are good reasons why gamers like high frame rate screens.
Comment
Originally posted by Anon'ym' View Post
I still cannot understand why the industry is ignoring optical cabling...
I understand they want to preserve backward compatibility, but then just give us an optical adapter. So basically, we get 1 m copper cables, and for anything longer you plug in an active adapter that perhaps uses standard SC fiber connectors. Perhaps they could even pull off something like SPDIF did, where you get both fiber and a 3.5mm jack in the same port, so you can use whichever technology suits you best.
Comment
Originally posted by Old Grouch View Post
As far as television is concerned, improving the resolution is less important than improving the frame rate. The limit of the acuity of the human eye has (effectively) been reached: the limit of the human visual system to perceive problems with the display of moving objects (both high-speed linear, and rotational) has not been reached.
To quote: NABAmplify: Your Eyes vs. Frame Rates: What You Can (and Can’t) See
The human visual system does not have a 'frame rate' as such - it does not work like a digital video camera, or a traditional film camera with a 25 fps shutter - but it is sensitive to the discrepancies between projecting an image of moving objects at a certain frame rate and 'smooth' reality. Increasing the display frame rate reduces those discrepancies, but as the example shows, artificially contrived (pathological) situations can be generated that expose the limitations of frame-based displays.
Note that this is not related to the lower limit of about 10 frames per second needed for images to generate perceived 'apparent motion'.
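A rough back-of-the-envelope illustration of that kind of discrepancy (my own numbers in Python, not from the quoted article): an object crossing a 3840-pixel-wide screen in one second jumps a long way between successive frames at cinema rates, and the jump only shrinks as the frame rate climbs.

    # Per-frame jump of an object crossing a 3840-pixel-wide screen in one second.
    screen_width_px = 3840
    crossing_time_s = 1.0
    speed_px_per_s = screen_width_px / crossing_time_s

    for fps in (24, 60, 120, 240, 480):
        jump_px = speed_px_per_s / fps   # how far the object moves between successive frames
        print(f"{fps:>3} fps -> {jump_px:6.1f} px per frame")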
As far as high frame rates go, my only issue, if you'd call it that, is less with how absurd it sounds and more with how much power it'll take for a GPU to generate 4K600 (4K at 600 Hz), with or without frame generation or upscaling, and how much power the monitor will need to refresh itself that fast. That would have to be one hell of a GPU if we're talking modern gaming. Anyone who's alternated between 30, 60, 144, and higher knows that more frames do matter for overall smoothness. There's also the whole resolution, distance, and DPI aspect of how much we can perceive, too.
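On the cable side of that question, a rough uncompressed estimate (my own arithmetic in Python, assuming 10-bit RGB and ignoring blanking and protocol overhead) suggests 4K at 600 Hz wouldn't even fit an 80 Gbit/s DP80 link without DSC:

    # Rough uncompressed bandwidth for 3840x2160 at 600 Hz, 10 bits per colour channel.
    width, height, hz = 3840, 2160, 600
    bits_per_pixel = 3 * 10              # RGB, 10-bit per channel

    raw_gbps = width * height * hz * bits_per_pixel / 1e9
    print(f"~{raw_gbps:.0f} Gbit/s uncompressed")          # roughly 149 Gbit/s
    print("DP80 raw link rate: 80 Gbit/s, so DSC (or a lower refresh) would be needed")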
Anyhoo, I'm currently at the other end of the spectrum, playing Metal Gear Solid on PSX emulated at 20/30 FPS, totally defeating the purpose of having a 1440p 144 Hz gaming monitor.
Comment
Originally posted by cl333r View Post
I was actually surprised recently, while looking for an HDR TV to use as a monitor, that the cheap ones were all HDMI, none DP. Now I see why.
The real reason is that pretty much all TV makers are members of the HDMI consortium. So to them, license fees don't really matter; it's just a matter of accounting, moving money from one pocket to the other.
Comment
Originally posted by bemerk View Post
Where are the attempts to have something like CEC & ARC in the DP spec, so we can finally move to a common standard for TVs and computers?
Comment
People talking about using optical could be better informed.
It is too expensive: the light source has to be high quality and fast-pulsing, which is expensive and runs hot, so you need a way to spread the heat and possibly actively cool it.
All of that is undesirable in consumer and "cheap" home appliances like TVs.
SPDIF is VERY different, as that has basically no bandwidth at all (in comparison).
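For a sense of scale on that comparison (my own numbers in Python, taking stereo 24-bit/192 kHz PCM as a generous S/PDIF payload against the 80 Gbit/s DP80 link rate):

    # S/PDIF (TOSLINK) payload vs. a DP80 link: order-of-magnitude comparison.
    spdif_bps = 2 * 24 * 192_000   # stereo, 24-bit, 192 kHz PCM: about 9.2 Mbit/s
    dp80_bps = 80e9                # DP80: 4 lanes x 20 Gbit/s raw

    print(f"S/PDIF payload : {spdif_bps / 1e6:.1f} Mbit/s")
    print(f"DP80 link rate : {dp80_bps / 1e9:.0f} Gbit/s")
    print(f"Ratio          : about {dp80_bps / spdif_bps:,.0f}x")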
Comment