FRC, the ultimate tearfree video solution

• #16

Lots of TFTs can run at 75 Hz too, even if they don't advertise it. One older FSC TFT even uses 75 Hz as its default DDC mode. Instead of 48 you could use 72.
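To make the arithmetic behind those numbers explicit - 72 = 3 x 24 and 75 = 3 x 25 - here is a quick sketch; the helper name and candidate list are purely illustrative:

Code:
# Refresh rates that are integer multiples of the content frame rate show
# every frame an equal number of times, so there is no cadence judder.
def judder_free_refreshes(fps, candidates=(48, 50, 60, 72, 75, 100, 120)):
    return [hz for hz in candidates if hz % fps == 0]

print(judder_free_refreshes(24))  # [48, 72, 120] -> 72 works where 48 is unsupported
print(judder_free_refreshes(25))  # [50, 75, 100] -> 75 suits PAL material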



• #17

Originally posted by bridgman View Post
Is there much use of frame rates over 30? My understanding was that 24, 25 and 30 pretty much cover all of the currently available DVD and BD discs (assuming 50i and 60i are displayed as 25p and 30p respectively). I know other formats are supported, but AFAIK they aren't actually used much...

Games, mostly. Right now a PC that can put out 200 fps at max settings in most current games is within the budget of even normal people. I've seen TV ads for LCD televisions that already go that high, so apparently there is a market for this sort of thing.



• #18

@vesa

Thanks for the links - it's good to know there is hope, but it does seem a bit hit-or-miss, what with some modern monitors having their own scalers, input lag, etc. I saw a review site claiming some panels run at their own internal rate whatever the input mode (and people couldn't get their games to vsync on them).

@kano

I've often seen 75 Hz offered on VGA input, but I also read on one site that the monitor they tested was just dropping frames to achieve it. I'm not saying they all do this, of course.

@BlackStar

I have a CRT, so I can display SD 50i material at 50 or 100 Hz.

Perceptually there is a difference: only 50 Hz is correct. It's not that easy to notice on normal video, but if you look at a worst-case test stream like a fast white pendulum on black at 50 fps, you will not perceive it as solid with the monitor at 100 Hz, because of the two flashes per frame.

I can also see this effect by, say, capping a game's fps at half the monitor refresh rate.

Modern high-end TVs with high refresh rates actually interpolate intermediate frames to avoid motion blur/judder.

In practice I still use 100 Hz, as I perceive too much flicker at 50, and the current wait-for-vline vsync tends to fall apart when fps = refresh (and it's worse than just the odd skip/double frame with MPlayer).
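A rough way to see why the doubled flashes read as a double image when the eye tracks the pendulum - a toy model that assumes perfect smooth-pursuit tracking, with made-up numbers:

Code:
# With the eye tracking a moving object, every repeated flash of a held
# frame lands at a different spot on the retina. Illustrative model only.
def retinal_offsets(content_fps, refresh_hz, speed_px_per_s):
    flashes_per_frame = round(refresh_hz / content_fps)
    px_between_flashes = speed_px_per_s / refresh_hz  # eye travel per flash
    return [i * px_between_flashes for i in range(flashes_per_frame)]

print(retinal_offsets(50, 50, 1000))   # [0.0] -> one flash, solid image
print(retinal_offsets(50, 100, 1000))  # [0.0, 10.0] -> 10 px double image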



• #19

Well, INTERNALLY those TVs run at 100/200 Hz, but I highly doubt they expose those modes for direct use. Do you know how rare 120 Hz-capable TFTs are for use with Nvidia shutter glasses?



• #20

BlackStar, yes, that kind of CRT would be nice, but 1080p @ 120 Hz means a 560 MHz pixel clock. That's neither cheap nor common. One also has to put the video DACs right at the CRT gates to preserve the needed analog signal bandwidth. I once had an ECP 3100 tube projector running 1280x960 @ 50 Hz and was going to put it to use as a 100 Hz interlaced system. The graphics cards/drivers didn't support interlacing then and it would also have needed frame locking, so the idea was dropped. Nothing beats the quality of a good CRT, but good ones cost too much.

Kano, 72 Hz would be fine if the link bandwidth allows. My understanding is that 1920x1200 @ 60 Hz is the limit for single-link DVI. At smaller resolutions it is workable, though.

Ant P., I doubt that much over 120 Hz is of any real visual value, and that includes interleaved 3D stereo. That's up for debate and rigorous testing, of course.
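For anyone who wants to check a mode against the single-link DVI ceiling, a back-of-the-envelope estimate - the blanking fractions here are assumptions roughly in line with CVT reduced blanking, not real modeline values:

Code:
# Rough pixel-clock estimate from active resolution plus assumed blanking.
# Real modelines come from CVT/GTF or the monitor's EDID; these fractions
# approximate CVT reduced blanking and are for illustration only.
def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.08, v_blank_frac=0.03):
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_DVI_MHZ = 165  # single-link DVI TMDS clock ceiling

for w, h, hz in [(1920, 1200, 60), (1920, 1080, 120), (1280, 1024, 72)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"{w}x{h} @ {hz} Hz ~ {clk:.0f} MHz ({verdict} single-link DVI)")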



• #21

Originally posted by Kano View Post
Lots of TFTs can run at 75 Hz too, even if they don't advertise it. One older FSC TFT even uses 75 Hz as its default DDC mode. Instead of 48 you could use 72.

Many TFTs *advertise* 75 Hz, but very few can actually *do* 75 Hz. In fact, every single laptop and desktop 75 Hz-"capable" monitor I've seen first hand actually downsamples from 75 to 60 Hz, resulting in horrible, horrible judder and a blurred signal. It's really bad.

Are 120 Hz TFTs actually sold yet? I would really like to see one up close and personal. Unfortunately, I doubt they will be usable with stereo glasses - TFT latencies are simply too high for any serious stereo work (which is why we use CRTs and polarized projectors at work).

Originally posted by legume View Post
Perceptually there is a difference: only 50 Hz is correct. It's not that easy to notice on normal video, but if you look at a worst-case test stream like a fast white pendulum on black at 50 fps, you will not perceive it as solid with the monitor at 100 Hz, because of the two flashes per frame.

Double-scanned output (100 Hz) should be identical to 50 Hz, only with less perceptible flicker. It's simple signal theory - CRT TVs have done this for years.

If 100 Hz actually looks worse than 50 Hz, it might be one of the following issues:

1. 100 Hz driving the monitor to its absolute limit, resulting in degraded output (blur, geometric imperfections).

2. The driving signal not being an exact multiple of 50 Hz, resulting in judder (e.g. the monitor might be synced to 99.8 Hz or 100.1 Hz instead of 100 Hz). This really is horrible, but can usually be fixed through custom timings.

3. Reduced flicker allowing the eye to perceive imperfections in the video signal that would otherwise be hidden.

I've encountered all three issues on my CRT (NEC FE991SB). Lowering the resolution a notch and playing with custom timings usually helps.
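Issue 2 is easy to quantify: the mismatch between the actual refresh and the nearest exact multiple of the content rate acts as a beat frequency. A sketch with made-up example values:

Code:
# One frame gets dropped or repeated each time the display clock drifts a
# full refresh period against the content clock (issue 2 above). Example
# values are illustrative, not measurements.
def slip_period_s(content_fps, actual_refresh_hz):
    ideal_hz = round(actual_refresh_hz / content_fps) * content_fps
    beat_hz = abs(actual_refresh_hz - ideal_hz)
    return float("inf") if beat_hz == 0 else 1.0 / beat_hz

print(slip_period_s(50, 100.0))  # inf  -> perfectly locked, no judder
print(slip_period_s(50, 99.8))   # 5.0  -> a visible hiccup every 5 s
print(slip_period_s(50, 100.1))  # 10.0 -> a hiccup every 10 s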



• #22

You can definitely buy this 120 Hz TFT:

http://www.samsung.com/us/consumer/d...=LS22CMFKFV/ZA

It's usually available bundled with 3D stereo glasses.



• #23

Hmm, from googling a bit it seems 120 Hz is the highest you can actually use right now - and it's a hardware limitation of the HDMI cable itself.

That's still a bit more than 75, though.



• #24

This does not appeal to me.

1) I have an industrial FireWire camera, which I use at 7.5 FPS. No way is anyone going to run their screen refresh at 7.5 FPS.

2) What about other applications in other windows? This harkens back to the days of full-screen graphics.



• #25

Also, when you think about it a bit longer, you could use 72 Hz for PAL too, if you play the movies at 24 instead of 25 fps. They were only sped up to match PAL; basically they are 24 fps with a 4% speedup.
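The numbers, spelled out - this is just the arithmetic from the post above:

Code:
# PAL speedup arithmetic: 24 fps film played at 25 fps is a 25/24 speedup.
film_fps, pal_fps = 24, 25
speedup = pal_fps / film_fps
print(f"speedup: {(speedup - 1) * 100:.1f}%")                 # 4.2%
print(f"a 120 min film runs {120 / speedup:.1f} min on PAL")  # 115.2 min
print(f"at 72 Hz each 24 fps frame shows {72 // 24} times")   # 3 times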



• #26

Originally posted by BlackStar View Post
Double-scanned output (100 Hz) should be identical to 50 Hz, only with less perceptible flicker. It's simple signal theory - CRT TVs have done this for years.

I thought that when I first noticed it and went out of my way to find a technical explanation, but failed. I don't think simple signal theory accounts for the persistence of the retinal image/perception. As for CRTs, IIRC they were forced to do 100 Hz as they got so big that 50 became a problem - maybe 50 Hz TVs look better with worst-case tests. I did say it's not obvious on normal material.

Originally posted by BlackStar View Post
If 100 Hz actually looks worse than 50 Hz, it might be one of the following issues:

1. 100 Hz driving the monitor to its absolute limit, resulting in degraded output (blur, geometric imperfections).

This monitor will do 160 Hz, and I went as far as taking 1/125 s shutter photos - they were always perfect.

I could play the test at 100 fps and it was OK; 50 fps at 50 Hz was OK; 25 fps at 50 Hz was not OK. In fact I messed about with many refresh/fps combos and it never looked right until the fps was within about 10% of the refresh rate.

Originally posted by BlackStar View Post
2. The driving signal not being an exact multiple of 50 Hz, resulting in judder (e.g. the monitor might be synced to 99.8 Hz or 100.1 Hz instead of 100 Hz). This really is horrible, but can usually be fixed through custom timings.

Those are quite different artifacts, which I could see while playing with fps not equal to, or half of, the refresh rate.

Originally posted by BlackStar View Post
3. Reduced flicker allowing the eye to perceive imperfections in the video signal that would otherwise be hidden.

What I saw (well, perceived) were not imperfections - more like motion blur/a slight double image.

Maybe I need an AMD Xilleon to interpolate the intermediate frames; I guess they do that for a reason...

http://www.amd.com/us-en/assets/cont...inal080103.pdf



• #27

Originally posted by CrystalCowboy View Post
This does not appeal to me.

1) I have an industrial FireWire camera, which I use at 7.5 FPS. No way is anyone going to run their screen refresh at 7.5 FPS.

What's 7.5 times 10?



• #28

@Kano: very interesting monitor, thanks! 3 ms GTG is still on the high side (compared to CRTs, which are closer to 0.x ms), but it could actually be usable for stereo - as long as no color transition passes the ~8 ms mark. I'll search for reviews.
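Where the ~8 ms figure comes from - at 120 Hz each eye only gets every other frame, so a pixel has one refresh period to settle before the shutter swaps eyes. Illustrative numbers:

Code:
# Frame-time budget for 120 Hz shutter-glasses stereo; numbers illustrative.
refresh_hz = 120
frame_time_ms = 1000 / refresh_hz  # ~8.33 ms before the other eye's frame
per_eye_hz = refresh_hz / 2        # 60 Hz effective rate per eye
gtg_ms = 3.0                       # the panel's quoted grey-to-grey time
print(f"frame period {frame_time_ms:.2f} ms, {per_eye_hz:.0f} Hz per eye")
print("usable for stereo" if gtg_ms < frame_time_ms else "will ghost")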

@legume: from your description, this sounds like a typical de-interlacing artifact, which (I think) can become more visible at higher refresh rates. Of course, this theory is somewhat shaky, since you observed the same artifact in games.

I admit I cannot see how a 60 fps-locked game can look better at a 60 Hz refresh rate than at 120 Hz (speaking about CRTs, obviously).

Edit: personally, I think motion-interpolation techniques cause more harm than good on video. Film is shot at specific frame rates (typically 24p), and any frame-interpolation technique will invariably increase the error rate. Interpolation *can* look good (e.g. slow camera panning), but it will introduce visible artifacts on any quick transition.

Edit 2: Xilleon processors are pretty good, judging from high-end Samsung TVs, but they cannot work magic.

@Ant P.: laconic
Last edited by BlackStar; 07-20-2009, 05:34 PM.



• #29

Originally posted by BlackStar View Post
@legume: from your description, this sounds like a typical de-interlacing artifact, which (I think) can become more visible at higher refresh rates. Of course, this theory is somewhat shaky, since you observed the same artifact in games.

I admit I cannot see how a 60 fps-locked game can look better at a 60 Hz refresh rate than at 120 Hz (speaking about CRTs, obviously).

I really did want to find a technical explanation for this, and I would still like to, but I am out of ideas and all the testing I did points to perception.

The stream was interlaced at first, but I de-interlaced it and made a 50 fps YUV stream, which I could frame-step through and see was OK. Using YUV also relieved the CPU for the high-fps testing (though it's big, so you need plenty of RAM and have to run it a couple of times to get it disk-cached).

I tried different OSes, a different PC+monitor, different players (software and hardware), and different games, and the results were always the same. The game tests were forcing 30 fps at 60 Hz, and you have to be trying to see the difference - face a wall with vertical detail and sidestep left/right quickly while eye-tracking the detail: 30 fps at 60 Hz gives blurred, slightly doubled or maybe tripled detail; at 60 fps it looks perfect.



• #30

I think the way to deal with this without compromising other windowed applications is to do temporal re-sampling. We do spatial re-sampling all the time - why not temporal as well? A GPU should be able to handle it with ease. This way we re-sample the movie to the screen refresh rate and show a new frame on every refresh.
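A toy sketch of that idea on single-value "frames" - a plain cross-fade weighted by phase, which is far simpler than what a real motion-compensated interpolator does:

Code:
import numpy as np

# Re-sample a frame sequence from the source rate to the display rate by
# blending the two source frames that bracket each refresh instant.
# A naive cross-fade sketch, not a motion-compensated interpolator.
def resample_temporal(frames, src_fps, dst_hz):
    frames = np.asarray(frames, dtype=np.float64)
    n_out = int(len(frames) * dst_hz / src_fps)
    out = []
    for i in range(n_out):
        t = i * src_fps / dst_hz              # position on source timeline
        a = min(int(t), len(frames) - 1)      # frame just before t
        b = min(a + 1, len(frames) - 1)       # frame just after t
        w = t - int(t)                        # blend weight (phase)
        out.append((1 - w) * frames[a] + w * frames[b])
    return np.stack(out)

# Four single-pixel "frames" at 25 fps -> six frames at 37.5 Hz
print(resample_temporal([0.0, 1.0, 2.0, 3.0], 25, 37.5))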

