FRC, the ultimate tearfree video solution


  • #21
    Originally posted by Kano View Post
    Lots of TFTs can use 75 Hz too, even if they don't advertise that. One older FSC TFT even uses 75 Hz as its default DDC setting. Instead of 48 you could use 72.
    Many TFTs *advertise* 75Hz but very few can actually *do* 75Hz. In fact, every single laptop and desktop 75Hz-"capable" monitor I've seen first-hand actually downsamples from 75->60Hz, resulting in horrible, horrible judder and a blurred signal. It's really bad.

    Are 120Hz TFTs actually sold yet? I would really like to see one up close and personal. Unfortunately, I doubt they will be usable with stereo glasses - TFT latencies are simply too high for any serious stereo work (which is why we are using CRTs and polarized projectors at work).

    Perceptually there is a difference; only 50Hz is correct. It's not that easy to notice on normal video, but if you look at a worst-case test stream like a fast white pendulum on black @50fps, you will not perceive it as solid with the monitor @100Hz because of the two flashes per frame.
    Double-scanned output (100Hz) should be identical to 50Hz, only with less perceptible flickering. It's simple signal theory - CRT TVs have done this for years.

    If 100Hz actually looks worse than 50Hz, it might be one of the following issues:

    1. 100Hz driving the monitor to the absolute limit, resulting in degraded output (blur, geometric imperfections).

    2. The driving signal is not an exact multiple of 50Hz, resulting in judder (e.g. the monitor might be synced to 99.8Hz or 100.1Hz instead of 100Hz). This really is horrible, but can usually be fixed through custom timings (see the sketch after this list).

    3. Reduced flickering allows the eye to perceive imperfections in the video signal that would be otherwise hidden.

    I've encountered all three issues on my CRT (Nec FE991SB). Lowering the resolution a notch and playing with custom timings usually helps.
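
    Not from the original post, just a sketch of how the custom-timing fix in point 2 works: the refresh rate is simply pixel_clock / (htotal * vtotal), so you can solve for the pixel clock that gives an exact multiple of 50Hz and build a modeline from it. The timing totals below are illustrative placeholders, not values for this particular NEC monitor.

        # Sketch: solve for the pixel clock that yields an exact refresh rate.
        # The htotal/vtotal values are placeholders, not real monitor timings.
        def exact_pixel_clock(htotal, vtotal, target_hz):
            """Pixel clock in MHz needed for an exact target refresh rate."""
            return htotal * vtotal * target_hz / 1e6

        htotal, vtotal = 1712, 1063   # hypothetical 1280x1024 mode with blanking
        for hz in (50, 100):
            clk = exact_pixel_clock(htotal, vtotal, hz)
            print(f"{hz} Hz needs a {clk:.3f} MHz pixel clock "
                  f"({clk * 1e6 / (htotal * vtotal):.3f} Hz actual)")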



    • #22
      You definitely can buy this 120 Hz TFT:



      Usually available together with 3D stereo glasses.



      • #23
        Hmm, from googling a bit it seems 120Hz is the highest you can actually use right now - and it's a hardware limitation of the HDMI cable itself.

        That's still a bit more than 75 though.
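
        For what it's worth, a back-of-envelope check (my numbers, not the poster's): single-link HDMI before 1.3 tops out at a 165 MHz TMDS clock, HDMI 1.3 raised that to 340 MHz, and 1080p needs roughly the following pixel clocks, assuming standard CEA 1080p blanking (2200 x 1125 total pixels):

          # Pixel clock needed for 1080p at various refresh rates, assuming
          # CEA-style blanking (2200 x 1125 total pixels) -- an assumption.
          HTOTAL, VTOTAL = 2200, 1125
          for hz in (60, 75, 100, 120):
              mhz = HTOTAL * VTOTAL * hz / 1e6
              print(f"1080p@{hz} Hz: ~{mhz:.0f} MHz pixel clock")
          # ~149, ~186, ~248 and ~297 MHz -- so 120 Hz just fits under the
          # 340 MHz ceiling of HDMI 1.3 but is far beyond older revisions.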



        • #24
          This does not appeal to me.

          1) I have an industrial FireWire camera, which I am using at 7.5 FPS. No way is anyone going to be doing screen refresh at 7.5 FPS.

          2) What about other applications in other windows? This harks back to the days of full-screen graphics.



          • #25
            Also, when you think about it a bit longer, you could use 72 Hz for PAL too, if you play the movies at 24 instead of 25 fps. They have only been accelerated to match PAL, but basically they are 24 fps with a 4% speedup.
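
            A quick sanity check of the numbers (mine, not Kano's): 72 is an exact multiple of 24, so every film frame gets three identical scans, and undoing the PAL speedup just means slowing playback by 24/25.

              # PAL speedup math: film is 24 fps, PAL plays it at 25 fps.
              film_fps, pal_fps, refresh = 24.0, 25.0, 72.0
              print(f"PAL speedup: {(pal_fps / film_fps - 1) * 100:.2f}%")             # ~4.17%
              print(f"Scans per frame at {refresh:.0f} Hz: {refresh / film_fps:.1f}")  # 3.0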



            • #26
              Originally posted by BlackStar View Post
              Double-scanned output (100Hz) should be identical to 50Hz, only with less perceptible flickering. It's simple signal theory - CRT TVs have done this for years.
              I thought that when I first noticed it and went out of my way to find a technical explanation, but failed. I don't think simple signal theory accounts for persistence of the retinal image/perception. As for CRTs, IIRC they were forced to do 100Hz as they got so big that 50 became a problem - maybe 50Hz TVs look better with worst-case tests - I did say it's not obvious on normal material.

              If 100Hz actually looks worse than 50Hz, it might be one of the following issues:

              1. 100Hz driving the monitor to the absolute limit, resulting in degraded output (blur, geometric imperfections).
              This monitor will do 160Hz and I went as far as taking 1/125 shutter pics and they were always perfect.

              I could play the test @100fps and it was OK, play at 50fps with 50Hz = OK, 25fps with 50Hz - not OK. In fact I messed about with many refresh/fps combos and it never looked right until the fps was within about 10% of the refresh rate.

              2. The driving signal is not an exact multiple of 50Hz, resulting in judder (e.g. the monitor might be synced to 99.8Hz or 100.1Hz instead of 100Hz). This really is horrible, but can be usually fixed through custom timings.
              Quite different artifacts, which I could see while playing with fps not equal to (or half of) the refresh rate.

              3. Reduced flickering allows the eye to perceive imperfections in the video signal that would be otherwise hidden.
              What I saw (well perceived) were not imperfections - more like motion blur/slight double image.

              Maybe I need an AMD Xilleon to interpolate the intermediate frames; I guess they do that for a reason...
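
              One way to put rough numbers on that "slight double image" (my own reasoning, not something legume measured): when the eye tracks a moving object, every repeated flash of the same frame lands on a slightly different spot of the retina, offset by speed divided by refresh rate.

                # Toy model of the eye-tracking double image: the eye moves with
                # the object, the display repeats the same frame, so repeated
                # flashes land at offsets of speed/refresh on the retina.
                def ghost_offsets_px(speed_px_per_s, fps, refresh_hz):
                    flashes_per_frame = int(refresh_hz / fps)
                    return [speed_px_per_s * k / refresh_hz for k in range(flashes_per_frame)]

                speed = 1000  # px/s -- an assumed pendulum/side-step speed
                for fps, hz in ((50, 50), (50, 100), (25, 50), (30, 60)):
                    print(fps, "fps on", hz, "Hz:", ghost_offsets_px(speed, fps, hz), "px")
                # 50 fps on 100 Hz -> copies at 0 and 10 px; 25 fps on 50 Hz ->
                # 0 and 20 px; 30 fps on 60 Hz -> 0 and ~16.7 px.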



              • #27
                Originally posted by CrystalCowboy View Post
                This does not appeal to me.

                1) I have an industrial FireWire camera, which I am using at 7.5 FPS. No way is anyone going to be doing screen refresh at 7.5 FPS.
                What's 7.5 times 10?



                • #28
                  @Kano: very interesting monitor, thanks! 3ms GTG is still on the high side (compared to CRTs, which are closer to 0.x ms), but it could actually be usable for stereo - as long as no color transition passes the ~8ms mark. I'll search for reviews.

                  @legume: from your description, this sounds like a typical de-interlacing artifact, which (I think) can become more visible at higher refresh rates. Of course, this theory is somewhat shaky since you observed the same artifact in games.

                  I admit I cannot see how a 60fps-locked game can look better at a 60Hz refresh rate than at 120Hz (speaking about CRTs, obviously.)

                  Edit: personally, I think that motion interpolation techniques cause more harm than good on video. Film is shot at specific framerates (typically 24p) and any frame interpolation technique will invariably increase the error rate. Interpolation *can* look good (e.g. slow camera panning), but will introduce visible artifacts on any quick transition.
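
                  Not part of the original post, just a toy 1-D illustration of that last point, assuming the crudest interpolation (a 50/50 blend of neighbouring frames): a slow pan leaves a barely visible half-brightness sample, a fast transition leaves a wide ghost band.

                    # Blend two frames of a moving edge: the ghost band is as wide
                    # as the per-frame motion, so slow pans hide it, fast ones don't.
                    def edge_frame(width, edge_pos):
                        return [1.0 if x >= edge_pos else 0.0 for x in range(width)]

                    def blend(a, b):
                        return [(x + y) / 2 for x, y in zip(a, b)]

                    print(blend(edge_frame(12, 5), edge_frame(12, 6)))   # pan: 1 px ghost
                    print(blend(edge_frame(12, 2), edge_frame(12, 9)))   # jump: 7 px ghost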

                  Edit 2: Xilleon processors are pretty good, judging from high-end Samsung TVs, but they cannot work magic.

                  @Ant P.: laconic
                  Last edited by BlackStar; 20 July 2009, 05:34 PM.



                  • #29
                    Originally posted by BlackStar View Post
                    @legume: from your description, this sounds like a typical de-interlacing artifact, which (I think) can become more visible at higher refresh rates. Of course, this theory is somewhat shaky since you observed the same artifact in games.

                    I admit I cannot see how a 60fps-locked game can look better at a 60Hz refresh rate than at 120Hz (speaking about CRTs, obviously.)
                    I really did want to find a technical explanation for this, and would still like to, but I am out of ideas and all the testing I did points to perception.

                    The stream was interlaced at first, but I de-interlaced it and made a 50fps YUV stream, which I could frame-step and see it was OK. Using YUV also relieved the CPU for the high-fps testing (though it's big, so you need plenty of RAM and have to run it a couple of times to get it disk-cached).

                    I tried different OSes, a different PC+monitor, different players (sw/hw), different games, and the results were always the same. The game results were from forcing 30 fps on 60Hz, and you have to be trying to see the difference - face a wall with vertical detail and sidestep left/right quickly while eye-tracking the detail - 30fps on 60Hz = blurred, slightly doubled or maybe tripled detail; at 60fps it looks perfect.



                    • #30
                      I think the way to deal with it without compromising other windowed applications is to do a temporal re-sampling. We do spatial re-sampling all the time, why not do temporal as well? The GPU should be able to handle it with ease. This way we re-sample the movie to the screen refresh rate and show a new frame on each refresh.
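
                      A minimal sketch of that idea (my reading of it, assuming the simplest linear kernel): for every display refresh, blend the two nearest source frames, exactly the way bilinear filtering blends neighbouring pixels spatially. Scalars stand in for whole frames here; a real player would apply the same weights per pixel on the GPU.

                        # Temporal re-sampling sketch: one output frame per refresh,
                        # each a weighted blend of the two nearest source frames.
                        def temporal_resample(frames, source_fps, refresh_hz):
                            out = []
                            n_out = int(len(frames) * refresh_hz / source_fps)
                            for r in range(n_out):
                                pos = r * source_fps / refresh_hz   # position in source frames
                                i = int(pos)
                                w = pos - i                         # weight toward the next frame
                                nxt = frames[min(i + 1, len(frames) - 1)]
                                out.append((1 - w) * frames[i] + w * nxt)
                            return out

                        # Example: a 25 fps "movie" (brightness ramp) shown at 60 Hz.
                        print(temporal_resample([0.0, 1.0, 2.0, 3.0, 4.0], 25, 60))

                      Whether that blend looks acceptable is exactly the interpolation-artifact question BlackStar raises in #28; nearest-frame repetition and motion compensation are the alternatives at either extreme.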

