FRC, the ultimate tearfree video solution


  • vesa
    replied
    Originally posted by agd5f View Post
    The patches need to be synced up against git master and cleaned up before they could be merged upstream. They are currently pretty hacky.
    Ok. Currently this again seems like a manpower issue, plus the fact that they work well with the EasyVDR distribution. I'm not the author and regrettably don't have the needed skills (yet), so my aim was just to make FRC, a one-man project, a bit better known. We'll see what the future brings.



  • agd5f
    replied
    Originally posted by vesa View Post
    Any driver or hardware developers lurking here anymore? What do you think about the patches and what has to be done to get them included in the official driver tree?
    The patches need to be synced up against git master and cleaned up before they could be merged upstream. They are currently pretty hacky.

    Some suggestions:
    - The code needs to be made multi-head aware. The current patch assumes crtc 0.
    - The current code only supports pre-AVIVO chips (r1xx-r4xx). The code either needs to check and bail properly on newer chips, or support needs to be added for newer ASICs.
    - DRM version bump and checks for the new ioctl.
    - Handle different monitor types properly. Digital monitors tend to be very picky about the mode timing, so you'd need to be careful about adjusting it.

    Another thing to consider is that modesetting is moving to the kernel in newer DRMs, so it may make more sense to implement this in the DRM rather than in the DDX.
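    To make the timing-adjustment point concrete, here is a rough Python sketch (the function names and the retuning strategy are mine, not from the patches) of the arithmetic an FRC implementation juggles: the refresh rate a modeline produces, and how far vtotal can be nudged to retune it while leaving the horizontal timing, which digital monitors are pickiest about, alone:

```python
def refresh_hz(dotclock_khz, htotal, vtotal):
    """Vertical refresh implied by a modeline: pixel clock over total pixels."""
    return dotclock_khz * 1000.0 / (htotal * vtotal)

def retune_vtotal(dotclock_khz, htotal, vtotal, target_hz, max_delta=3):
    """Nudge vtotal by up to max_delta scanlines to approach target_hz,
    keeping the dot clock and horizontal timing fixed.
    Returns (new_vtotal, achieved_hz)."""
    candidates = range(vtotal - max_delta, vtotal + max_delta + 1)
    return min(
        ((vt, refresh_hz(dotclock_khz, htotal, vt)) for vt in candidates),
        key=lambda pair: abs(pair[1] - target_hz),
    )

# 1920x1080: 148500 kHz dot clock, htotal 2200, vtotal 1125 -> exactly 60 Hz.
# Retargeting 59.94 Hz (NTSC-rate video) needs only one extra blanking line:
print(retune_vtotal(148500, 2200, 1125, 59.94))  # (1126, ~59.947 Hz)
```

    The residual error (~0.007 Hz here) is why a finer-grained clock, such as the fractional-N synthesizer mentioned later in the thread, would still help.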



  • vesa
    replied
    Originally posted by FunkyRider View Post
    I think the way to deal with it without compromising other windowed applications is to do a temporal re-sampling. We do spatial re-sampling all the time, why not temporal as well? The GPU should be able to handle it with ease. This way we re-sample the movie to the screen refresh rate and show a new frame on each refresh.
    Care to explain which applications would be compromised by a frame rate change? If the change is done properly, one cannot even notice it. A working FRC makes only very minor changes. A 48 to 60 Hz jump is a bit difficult, but definitely not impossible if care is taken. Temporal resampling is a very good idea, but even broadcast engineers aren't too satisfied with the results, and they pay big money for the hardware. The same applies to vertical scaling of interlaced video. Processing errors like rounding in 8-bit video are horrible...

    Also, when properly implemented, it's just one configuration option in one's media player. Add some intelligence and FRC is enabled only when the content frame or field rate is above half the refresh rate. Like everything in FLOSS, FRC is an _option_, not a 'one size fits all' system. Usually such options do their designed work nearly perfectly (given time to iron out bugs).

    CrystalCowboy, you don't have to use a 7.5 Hz refresh. Just pick 8*7.5=60 (Hz) and run that synchronously... I doubt it is worth it, though, as eight-times 'oversampling' will result in very small jitter and no strong subharmonics.
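    The 8*7.5=60 trick generalizes: pick a refresh rate that is an exact integer multiple of the content rate, so every frame sits on screen for a whole number of refreshes. A minimal sketch (the function name and candidate list are my own, for illustration):

```python
def synced_refresh(content_fps, candidates=(50.0, 60.0, 72.0, 75.0, 100.0, 120.0)):
    """Lowest candidate refresh rate that is an exact integer multiple of
    content_fps, so each frame is shown for a whole number of refreshes."""
    for hz in sorted(candidates):
        n = round(hz / content_fps)
        if n >= 1 and abs(n * content_fps - hz) < 1e-9:
            return hz, n
    return None

print(synced_refresh(7.5))   # (60.0, 8): CrystalCowboy's camera, 8 refreshes/frame
print(synced_refresh(25.0))  # (50.0, 2): PAL video
print(synced_refresh(24.0))  # (72.0, 3): film
```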

    The following link points to a diagram of frame rate resampling:
    http://users.tkk.fi/~vsolonen/bonk/FRC.svg

    If one really _really_ wants to understand and picture it all, an Octave or Matlab model of square wave sampling will be needed. An FFT analysis shows the temporal spectrum components nicely. Those of you who are more mathematically inclined can probably figure it all out on paper and in your head too, but I'm not one of those.
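    In the same spirit, here is a rough stdlib-Python stand-in for that Octave model (a naive DFT, so no FFT library needed; modelling the display as an impulse train on "new frame presented" ticks is my own simplification):

```python
import cmath

def judder_spectrum(refresh, content_fps, seconds=1):
    """DFT magnitudes of a one-second impulse train that is 1 on refresh
    ticks where a new content frame is presented and 0 where the previous
    frame repeats. Peaks below content_fps are judder subharmonics."""
    n = refresh * seconds
    x, prev = [], -1
    for t in range(n):
        frame = (t * content_fps) // refresh   # frame index shown on tick t
        x.append(1.0 if frame != prev else 0.0)
        prev = frame
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2 + 1)]        # bin k corresponds to k/seconds Hz

# 24 fps on 60 Hz (3:2 cadence): a strong 12 Hz subharmonic appears,
# well below the 24 Hz content rate -- that is the visible judder.
mags = judder_spectrum(60, 24)
print([k for k, m in enumerate(mags) if k > 0 and m > 1.0])  # [12, 24]
```

    Running the same model with 30 fps on 60 Hz leaves only a peak at the content rate itself, i.e. no subharmonic and no judder.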

    Any driver or hardware developers lurking here anymore? What do you think about the patches and what has to be done to get them included in the official driver tree? Any chances AMD would do fractional-N type clock synthesizer in GPU firmware? That would be something
    Last edited by vesa; 21 July 2009, 09:48 AM.



  • FunkyRider
    replied
    I think the way to deal with it without compromising other windowed applications is to do a temporal re-sampling. We do spatial re-sampling all the time, why not temporal as well? The GPU should be able to handle it with ease. This way we re-sample the movie to the screen refresh rate and show a new frame on each refresh.
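    As a toy illustration of that idea (a single-pixel "video" and plain linear blending; a real GPU resampler would work per pixel and likely motion-compensate, which this sketch does not attempt):

```python
def resample_temporal(frames, src_fps, dst_hz):
    """Linearly blend the two nearest source frames for each output
    refresh -- the simplest possible temporal re-sampler. 'frames'
    holds one float per frame, standing in for a whole image."""
    out = []
    for i in range(int(len(frames) * dst_hz / src_fps)):
        pos = i * src_fps / dst_hz           # position on the source timeline
        a = min(int(pos), len(frames) - 1)
        b = min(a + 1, len(frames) - 1)
        w = pos - int(pos)                   # blend weight toward frame b
        out.append((1 - w) * frames[a] + w * frames[b])
    return out

# 24 source 'frames' re-sampled to a 60 Hz refresh -> 60 output images
print(len(resample_temporal([float(i) for i in range(24)], 24, 60)))  # 60
```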



  • legume
    replied
    Originally posted by BlackStar View Post
    @legume: from your description, this sounds like a typical de-interlacing artifact, which (I think) can become more visible at higher refresh rates. Of course, this theory is somewhat shaky, since you observed the same artifact in games.

    I admit I cannot see how a 60fps-locked game can look better at a 60Hz refresh rate than at 120Hz (speaking about CRTs, obviously).
    I really did want to find a technical explanation for this, and would still like to, but I am out of ideas and all the testing I did points to perception.

    The stream was interlaced at first, but I de-interlaced it and made a 50fps YUV stream, which I could frame-step through to check it was OK. Using YUV also relieved the CPU for the high-fps testing (though it's big, so you need plenty of RAM and have to run it a couple of times to get it disk-cached).

    I tried different OSes, a different PC+monitor, different players (sw/hw), different games, and the results were always the same. The game results were from forcing 30 fps on 60 Hz, and you have to be trying to see the difference - face a wall with vertical detail and sidestep left/right quickly while eye-tracking the detail - 30 fps on 60 Hz = blurred, slightly doubled or maybe tripled detail; at 60 fps it looks perfect.



  • BlackStar
    replied
    @Kano: very interesting monitor, thanks! 3ms GTG is still on the high side (compared to CRTs, which are closer to 0.x ms), but it could actually be usable for stereo - as long as no color transition passes the ~8ms mark. I'll search for reviews.

    @legume: from your description, this sounds like a typical de-interlacing artifact, which (I think) can become more visible at higher refresh rates. Of course, this theory is somewhat shaky, since you observed the same artifact in games.

    I admit I cannot see how a 60fps-locked game can look better at a 60Hz refresh rate than at 120Hz (speaking about CRTs, obviously).

    Edit: personally, I think that motion interpolation techniques cause more harm than good on video. Film is shot at specific frame rates (typically 24p), and any frame interpolation technique will invariably increase the error rate. Interpolation *can* look good (e.g. slow camera panning), but will introduce visible artifacts on any quick transition.

    Edit 2: Xilleon processors are pretty good, judging from high-end Samsung TVs, but they cannot work magic.

    @Ant P.: laconic
    Last edited by BlackStar; 20 July 2009, 05:34 PM.



  • Ant P.
    replied
    Originally posted by CrystalCowboy View Post
    This does not appeal to me.

    1) I have an industrial FireWire camera, which I am using at 7.5 FPS. No way is anyone going to be doing screen refresh at 7.5 FPS.
    What's 7.5 times 10?



  • legume
    replied
    Originally posted by BlackStar View Post
    Double-scanned output (100Hz) should be identical to 50Hz, only with less perceptible flickering. It's simple signal theory - CRT TVs have done this for years.
    I thought that when I first noticed it, and went out of my way to find a technical explanation, but failed. I don't think simple signal theory accounts for the persistence of the retinal image/perception. As for CRTs, IIRC they were forced to 100Hz as they got so big that 50Hz became a problem - maybe 50Hz TVs look better in worst-case tests - I did say it's not obvious on normal material.

    Originally posted by BlackStar View Post
    If 100Hz actually looks worse than 50Hz, it might be one of the following issues:

    1. 100Hz driving the monitor to the absolute limit, resulting in degraded output (blur, geometric imperfections).
    This monitor will do 160Hz and I went as far as taking 1/125 shutter pics and they were always perfect.

    I could play the test @100fps and it was OK; playing at 50fps with 50Hz = OK; 25fps with 50Hz - not OK. In fact I messed about with many refresh/fps combos, and it never looked right until the fps was within about 10% of the refresh.

    Originally posted by BlackStar View Post
    2. The driving signal is not an exact multiple of 50Hz, resulting in judder (e.g. the monitor might be synced to 99.8Hz or 100.1Hz instead of 100Hz). This really is horrible, but can usually be fixed through custom timings.
    Those are quite different artifacts, which I could see while playing with fps not equal to or half the refresh.

    Originally posted by BlackStar View Post
    3. Reduced flickering allows the eye to perceive imperfections in the video signal that would otherwise be hidden.
    What I saw (well, perceived) were not imperfections - more like motion blur / a slight double image.

    Maybe I need an AMD Xilleon to interpolate the intermediate frames; I guess they do that for a reason...



  • Kano
    replied
    Also, if you think about it a bit longer, you could use 72 Hz for PAL too, if you play the movies at 24 instead of 25 fps. They have only been sped up to match PAL; basically they are 24 fps with a 4% speedup.
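    The 4% figure is easy to check; the numbers below are just the standard PAL-speedup arithmetic:

```python
# Film telecined to PAL runs 25/24 times as fast: about a 4.2% speedup.
speedup = 25 / 24 - 1
print(f"PAL speedup: {speedup:.2%}")   # PAL speedup: 4.17%

# Undo it (play at 24 fps) and a 72 Hz mode shows each frame for
# exactly three refreshes, just as 50 Hz shows true 25 fps PAL for two.
assert 72 // 24 == 3 and 72 % 24 == 0
assert 50 // 25 == 2 and 50 % 25 == 0
```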



  • CrystalCowboy
    replied
    This does not appeal to me.

    1) I have an industrial FireWire camera, which I am using at 7.5 FPS. No way is anyone going to be doing screen refresh at 7.5 FPS.

    2) What about other applications in other windows? This harks back to the days of full-screen graphics.

