FFmpeg's VP8 Decoder Blasts Google's Decoder


  • #11
    Standard error, sorry



    • #12
      Originally posted by EarthMind View Post
      Hopefully it won't take too long before graphic chipset producers add VP8 decoding support to their GPUs.

      I'd really like to see how much VP8 differs from H.264 in file size and quality for 720p and 1080p videos.
      From what I've seen there's no difference in quality at 720p and above. It's at 480p and below that you can notice some things. H.264 has a slight advantage in quality, but the differences aren't ones you'll notice unless you're looking for them, so the average consumer wouldn't notice anything if YouTube or Hulu suddenly switched entirely to VP8. As VP8's code improves, I feel that by the time HTML5 is finalized there will be no difference between VP8 and H.264 in quality.
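For anyone wanting to run their own side-by-side comparison, here is a rough sketch of encoding the same source at the same target bitrate with both codecs using ffmpeg (this assumes an ffmpeg build with libvpx and libx264 enabled; input.mp4 is a placeholder filename):

```shell
# Encode one source to VP8 (WebM) and H.264 at the same target bitrate,
# so the two outputs can be compared side by side.
# input.mp4 is a placeholder; substitute your own clip.
ffmpeg -i input.mp4 -c:v libvpx -b:v 2M vp8_out.webm
ffmpeg -i input.mp4 -c:v libx264 -b:v 2M h264_out.mp4
```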



      • #13
        Bitrate alone is totally BS. What's important is what you can display at that bitrate. But even that is largely irrelevant when looking at a Flash video replacement.

        What matters IS resolution performance. Maybe not for a Blu-ray replacement, but the speed of Flash video depends on the resolution: I can literally only play an MPEG-4 720p YouTube Flash video on my Eee PC 900 (660 MHz Celeron) if I zoom out in my web browser. That's the problem that needs to be tackled with VP8.
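As a back-of-the-envelope illustration of why playback cost tracks resolution, here is the raw pixel throughput at common resolutions, assuming 30 fps (actual decoder cost also depends on codec and content, but it grows roughly in proportion to this number):

```shell
# Pixels to process per second at common resolutions, assuming 30 fps.
# Decode and display cost grows roughly in proportion to this count.
echo "480p:  $((854 * 480 * 30)) pixels/s"
echo "720p:  $((1280 * 720 * 30)) pixels/s"
echo "1080p: $((1920 * 1080 * 30)) pixels/s"
```

720p works out to more than double the pixel rate of 480p, which is why shrinking the displayed video helps on slow hardware.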



        • #14
          Yep, FFmpeg devs tend to make sure that things work well on any number of cores.
          Actually, the only way I've been able to get MORE than one core used is by compiling mplayer against ffmpeg-mt, and even then some formats, like Xvid, only run single-threaded. At least, that was the case last I checked, a month or so ago.



          • #15
            Edit: Other formats, like MPEG-2 and MPEG-4, seem to use multiple cores only when they run with ffmpeg-mt and have the threads option set.
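For reference, a sketch of setting the threads option mentioned above when mplayer is built against ffmpeg-mt; `-lavdopts threads=N` passes the thread count down to libavcodec, and the filename is a placeholder:

```shell
# Ask libavcodec to decode with 4 threads. This only takes effect for
# codecs with multithreaded decoders (e.g. via ffmpeg-mt).
# video.mkv is a placeholder filename.
mplayer -lavdopts threads=4 video.mkv
```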



            • #16
              Originally posted by Kano View Post
              Well, did you see that the bitrate is only 4.5 Mbit for those videos? Of course it is 1080p, but what matters is the bitrate. YouTube will most likely not use much more, so it should be possible to decode it. But HTML5 still lacks good full-screen playback.
              Remember that VP8 is not meant to be some ultimate killer codec for uselessly high bitrates, lol.

              VP8 is meant as a replacement for Flash video on YouTube, or any other site with Flash video, for the average Joe's HTML5, so 4 to 10 Mbit is good enough.

              Besides, a bitrate of 40 Mbit is laughable: even on my 32" 1080p TV it's hard as hell to find any difference between 8 Mbit and 100 Mbit, except in some high-speed scenes.

              OK, for professionals, or people with futuristic artificial UHD eyes, 40 Mbit or more is necessary, but even these days Blu-ray is already overkill for the average Joe's crappy eyes.
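To put those bitrates in perspective, here is some rough arithmetic for the file sizes they imply over a 10-minute (600 s) clip, using size in MB ≈ Mbit/s × seconds ÷ 8:

```shell
# Approximate file size for a 10-minute (600 s) clip at the bitrates
# discussed above. 4.5 Mbit/s is written as 45/10 to keep the shell's
# integer arithmetic exact.
echo "4.5 Mbit/s: ~$((45 * 600 / 8 / 10)) MB"
echo "40 Mbit/s:  ~$((40 * 600 / 8)) MB"
echo "100 Mbit/s: ~$((100 * 600 / 8)) MB"
```

So a 40 Mbit/s stream runs close to ten times the size of a typical streaming bitrate for the same runtime.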



              • #17
                I wouldn't say 1080p is overkill... you can still easily spot the massive single-color squares in 1080p played at 1680*1050.

                I'm no videophile (and I should be wearing glasses).



                • #18
                  Originally posted by V!NCENT View Post
                  I wouldn't say 1080p is overkill... you can still easily spot the massive single-color squares in 1080p played at 1680*1050.

                  I'm no videophile (and I should be wearing glasses).

                  Well, on my TV at real 1080p (1920*1080 @ 60 Hz) it looks like it's going to jump right out of the screen, without any noticeable defects. My TV is a normal Samsung 550 series 32" (quite an average LCD TV).

                  In that case, maybe something is wrong with your settings, or there are encoding errors or something.

                  Glasses look cool, so you should wear some too XD, just for fun.

