
Thread: AMD working on XvMC for r300g?

  1. #31 | Join Date: Dec 2007 | Posts: 2,371

    Quote Originally Posted by misGnomer:
    Meanwhile, for me as an end user it looks like none of the HD 4xxx or 5xxx generation cards will be reasonably usable in the near future. Maybe a cast-out 3xxx card from an upgrading Windows user...
    X1xxx = r500
    HD2xxx, HD3xxx = r600
    HD4xxx = r700
    HD5xxx = evergreen

    r600 and r700 are both well supported, so HD2xxx-HD4xxx cards are already working well. Evergreen support is just starting to roll out now.

  2. #32 | Join Date: Oct 2007 | Location: Under the bridge | Posts: 2,142

    Quote Originally Posted by popper:
    You seem to miss the point. It's just an H.264 clip that's hard to decode; it has nothing to do with a "benchmark" app or whatever.
    It's a clip that's intentionally hard to decode, in order to test decoding efficiency. If this doesn't fit the definition of a benchmark, I don't know what does.

  3. #33 | Join Date: Jul 2009 | Location: England | Posts: 103

    Quote Originally Posted by BlackStar:
    It's a clip that's intentionally hard to decode, in order to test decoding efficiency. If this doesn't fit the definition of a benchmark, I don't know what does.
    Really? So you believe they shot the clip with the intention that it would be hard to decode? That seems completely unrelated to the task of filming a nature documentary to me. I think it's more a case of someone noticing it was hard to decode and using it for that unintended purpose.

  4. #34 | Join Date: Jun 2009 | Posts: 2,929

    Quote Originally Posted by Hoodlum:
    I think it's more a case of someone noticing it was hard to decode and using it for that unintended purpose.
    This is the case with most benchmarks.

  5. #35 | Join Date: Oct 2007 | Location: Under the bridge | Posts: 2,142

    Quote Originally Posted by Hoodlum:
    So you believe they shot the clip with the intention that it would be hard to decode? That seems completely unrelated to the task of filming a nature documentary to me.
    Filming is completely unrelated to encoding. *Completely* unrelated.

    You can encode the same clip at DVD quality (plays fine), typical HD quality (~40 Mbps max, plays fine), or with the current encoding (<=100 Mbps), which may fail even on dedicated hardware.

    My Nvidia/VDPAU laptop can play every single full-HD movie fine but fails to decode this clip in real time. Does this mean that VDPAU sucks? No, it merely means that this clip is not representative of real-world HD content.
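
    To make the encoding point concrete, here is a minimal Python sketch: the filename and the exact bitrate tiers are my own assumptions, and it assumes ffmpeg with libx264 is installed.

    Code:
    # Re-encode the same source clip at three target bitrates; only the
    # encoding settings change, never the footage itself.
    # "clip.mkv" is a hypothetical filename.
    import subprocess

    for label, bitrate in [("dvd", "8M"), ("hd", "40M"), ("extreme", "100M")]:
        subprocess.run([
            "ffmpeg", "-i", "clip.mkv",
            "-c:v", "libx264",
            "-b:v", bitrate, "-maxrate", bitrate, "-bufsize", bitrate,
            "clip-" + label + ".mkv",
        ], check=True)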

  6. #36 | Join Date: Oct 2007 | Location: Toronto-ish | Posts: 7,451

    Well, if you gave me a camera and asked me to film something that would mess up a typical video encode / decode stack...

    ... it's hard to think of anything worse than birds for blowing out the motion comp.

  7. #37 | Join Date: Jul 2009 | Location: England | Posts: 103

    Quote Originally Posted by BlackStar:
    Filming is completely unrelated to encoding. *Completely* unrelated.

    You can encode the same clip at DVD quality (plays fine), typical HD quality (~40 Mbps max, plays fine), or with the current encoding (<=100 Mbps), which may fail even on dedicated hardware.
    Fair enough, but the assertion that this was the intention is still illogical. Why would they intentionally wish you a bad experience with the Blu-ray version?

    My Nvidia/VDPAU laptop can play every single full-HD movie fine but fails to decode this clip in real time. Does this mean that VDPAU sucks? No, it merely means that this clip is not representative of real-world HD content.
    The problem with this statement is the simple fact that it is real-world HD content. I should know; I own the Blu-ray. I still don't see how this is any different from Prime95 being a superb stress test for an overclocked PC (completely unintentionally).

  8. #38 | Join Date: Jul 2009 | Location: England | Posts: 103

    Just played the clip on my other PC (which is slower than my main PC and not running Linux).

    In the test PC I am using:
    - Phenom II 2.8 GHz tri-core
    - ATI Radeon HD 5770
    - 4 GB DDR3 RAM

    Pretty mid-range.

    For the test I was using:
    - Windows 7
    - Catalyst 10.1
    - MPlayer with the highest-quality post-processing (6)

    Result:
    It actually used less CPU than my own rip of the Blu-ray (26% at most). This clip uses the High@4.1 profile; I use a higher profile for my rips. For comparison, trying a totally unrelated 720p nature program (High@4.1 profile) resulted in 20% CPU maximum in the couple of minutes I watched it.
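
    If anyone wants to reproduce the decode-speed part, a minimal Python sketch (the filename is hypothetical; MPlayer's -benchmark mode decodes as fast as possible and prints timing figures):

    Code:
    # Run MPlayer in benchmark mode so only decoding speed is measured;
    # video output and audio are disabled to isolate the decoder.
    import subprocess

    subprocess.run([
        "mplayer", "-benchmark", "-nosound",
        "-vo", "null",   # no display, decoder only
        "clip.mkv",      # hypothetical filename for the test clip
    ])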

    The bitrate was posted earlier in the thread: 14.8 Mbps, which is accurate; 14.4 Mbps of that is video. 14.8 Mbps × 122 s (the duration) / 8 = 225.7 MB, while the file size is actually 216 MB, so it is undersized even for 14.8 Mbps. This is not at all unrealistic for a Blu-ray (which can go up to 40 Mbps).
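
    The arithmetic spelled out, same numbers as above:

    Code:
    # File-size sanity check for the figures quoted above.
    bitrate_mbps = 14.8        # total bitrate in megabits per second
    duration_s = 122           # clip duration in seconds
    expected_mb = bitrate_mbps * duration_s / 8   # megabits -> megabytes
    print(round(expected_mb, 1))   # 225.7 MB vs. an actual 216 MB file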

    I can provide a screenshot of this with a frame counter if you like.

  9. #39 | Join Date: Jul 2009 | Location: England | Posts: 103

    Quote Originally Posted by bridgman:
    Well, if you gave me a camera and asked me to film something that would mess up a typical video encode / decode stack...

    ... it's hard to think of anything worse than birds for blowing out the motion comp.
    True. It definitely shows you if your acceleration isn't working properly, that's for sure!

  10. #40 | Join Date: Jun 2009 | Posts: 1,132

    Quote Originally Posted by Hoodlum:
    True. It definitely shows you if your acceleration isn't working properly, that's for sure!
    No, it only shows that your decoder is prepared for a future bitrate standard. To date I haven't seen any BD or H.264 movie encoded like that. It's a good benchmark because it helps you be future-proof, besides the obvious fact that you need seriously expensive hardware to decode it. So until SSDs are the common hard disk and the masses get more processing power, I seriously doubt anyone will push that monstrosity of a bitrate to the mass market.

    In summary, it's a good reference, not a must, for now. And that "now" could be quite far off, because I really can't see the difference between plain 1080p and 1080p at a 100 Mbit bitrate on my 1080p LED 120 Hz TV. Maybe on a theater's 500" LED screen, but for the normal market 1080p is here to stay for many years.
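
    For scale, a quick Python calculation of the "future bitrate" point (the two-hour runtime is my own assumption):

    Code:
    # Storage cost of a feature-length movie at a 100 Mbps bitrate.
    bitrate_mbps = 100
    duration_s = 2 * 60 * 60                  # assumed two-hour movie
    size_gb = bitrate_mbps * duration_s / 8 / 1000   # Mbit -> MB -> GB
    print(size_gb)   # 90.0 GB, nearly twice a 50 GB dual-layer Blu-ray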
