Will H.264 Codec Support Come To Fedora? Nope.


  • #41
    The best way is to take the approach the Document Foundation did when opposing M$ Docx. And they won.
    All other options are pathetic.

    If Apple is forcefully installing h264 on its products, it is taking exactly the same approach as M$: pushing its "own" standards. You clearly CAN'T accept a patented format as the standard for video exchange. Refusing it now is what will settle this for good.

    When will people learn that patent royalties should NOT be collected from NON-COMMERCIAL applications?! You can't charge someone who does not use your work to make money; that is simply unethical!



    • #42
      Originally posted by XorEaxEax View Post
      I seriously doubt that storing both is a HUGE and EXPENSIVE undertaking, lots of sites already store the files in both the original and a streaming version (Vimeo comes to mind) and other sites have MUCH less content than youtube.
      The Mozilla people have been talking to actual companies; here are some examples:




      Originally posted by XorEaxEax View Post
      As for the future of vpx and webm, Google is obviously committed
      If they were so obviously committed, they'd have turned off h264 in Chrome when they promised to do it - at the beginning of last year. It hasn't happened.

      Originally posted by XorEaxEax View Post
      download 720p webm/vp8 and mp4/x264 videos on youtube and compare, I sure can't see any visual difference and the webm file is usually smaller.
      Provide a specific example, please.

      Also, do your own test transcoding a DVD video. I did so. Unfortunately it was more than a year ago, so I'd need to repeat the test with current versions of encoders. Back then libvpx was quite bad. Here's the worst example: original, x264, libvpx. Other examples weren't that bad; here's the entire album: http://www.imagebam.com/gallery/ylkh...csidsfo9yx1ww/. For each of the five images compared, they're always in the order of original, x264, libvpx, theora.

      libvpx has surely improved since then, but I doubt it has come close to x264. Only an actual test would say for sure though.
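
      If you want to repeat such a test, here's a minimal sketch of the mechanics (assuming mplayer is available; the title number, file names, and quality settings are placeholders, not the values from my old test):
      Code:
      # Dump a DVD title to raw y4m once, then feed the identical source to both encoders
      mplayer dvd://1 -nosound -benchmark -vo yuv4mpeg:file=title.y4m
      x264 --preset slower --crf 20 -o title-x264.mkv title.y4m
      vpxenc title.y4m -o title-vp8.webm --passes=2 --end-usage=cq --cq-level=32 --target-bitrate=100000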

      Originally posted by XorEaxEax View Post
      Looking past Google however is what webm is really about for me, as I think it opens up great opportunities for startups in video content due to it not carrying any licence costs.
      If you want to reach the broadest audience, and you do, you need h264. I don't think limiting yourself to desktop Firefox+Chrome+Opera users is a wise business decision. Especially with how the mobile space is growing.

      Originally posted by XorEaxEax View Post
      For those declaring webm/vp8 a failure because it hasn't yet taken over the web, are you stupid?
      No, just realistic. Read the message from Asa Dotzler I linked to above - webm isn't making *any* progress on the web. It's the other way around. Companies are actively saying "no" to it. That's the reality. The Mozilla folks have accepted it, after talking to companies for months.



      • #43
        Originally posted by Gusar View Post
        If they were so obviously committed, they'd have turned off h264 in Chrome when they promised to do it - at the beginning of last year. It hasn't happened.
        They are continuing to develop vpx at a high pace and also hiring more developers to work on it; how is that not commitment? As for the exact reasons for not removing h264 from Chrome, I can only speculate that they feared it would negatively impact Chrome's adoption rate. But that does not equal a lack of commitment towards webm, as its increased development pace shows. Personally I think it's better if webm gains ground on its merits as a licence-free codec rather than by preventing access to h264.


        Originally posted by Gusar View Post
        Also, do your own test transcoding a DVD video. I did so. Unfortunately it was more than a year ago, so I'd need to repeat the test with current versions of encoders.
        Yes, you do; there have been several releases since then. I too do test encodes from DVD, and when 'Duclair' was released I ran some tests and was impressed with the results. In high-motion scenes x264 was certainly better at equal file sizes (when viewing still shots, that is; live it's very hard to tell), but the difference wasn't staggering by any means. x264 output is undoubtedly better quality at equal file sizes, but the difference is not huge, and for web content it makes no perceivable difference at all.

        When I did the tests I used the new constant quality option in vpxenc (basically the same as the constant rate factor in x264, although this one works in two-pass):

        Code:
        vpxenc 1.y4m -o output.webm --best --target-bitrate=100000 --end-usage=cq --cq-level=16 --kf-min-dist=0 --kf-max-dist=360 --token-parts=2 --min-q=0 --max-q=63 --passes=2 --auto-alt-ref=1

        Never mind the inflated --target-bitrate, it's just a dummy value; the cq-level is what determines the quality.

        I compared this to the output of:
        Code:
        x264 --subme=10 --ref=16 --bframes=16 --trellis=2 --no-fast-pskip --rc-lookahead=40 --8x8dct --me=umh --merange=24 --b-adapt=2 --analyse=all --partitions=all --b-pyramid=2 --cabac --direct=auto --crf=16

        And again, while x264 was definitely better at the same file sizes, the difference was (imo) impressively small. Now, I'm no expert at encoding, and chances are there are much better options for both of these encoders; if so, please do tell.

        Originally posted by Gusar View Post
        libvpx has surely improved since then, but I doubt it has come close to x264. Only an actual test would say for sure though.
        Give it a shot; I think you'd be surprised.

        Originally posted by Gusar View Post
        If you want to reach the broadest audience, and you do, you need h264. I don't think limiting yourself to desktop Firefox+Chrome+Opera users is a wise business decision. Especially with how the mobile space is growing.
        True, but in reality you also need to offset the costs of reaching the broadest audience: h264 comes with licencing costs, webm does not.

        Originally posted by Gusar View Post
        No, just realistic. Read the message from Asa Dotzler I linked to above - webm isn't making *any* progress on the web. It's the other way around. Companies are actively saying "no" to it. That's the reality. The Mozilla folks have accepted it, after talking to companies for months.
        I don't know what Mozilla was expecting; seriously, this sounds like a cop-out excuse just because they want to get their browser into the mobile sector and therefore need to be able to use h264.

        Google offers a quality encoder for free with a perpetual patent grant for anyone to use; they add support for it on their own site, Youtube, which I can happily say is progressing great, as I haven't come across a non-playable video in quite some time (I don't have flash installed, so it's all webm for me); and they continue to pour money into improving the codec and its tools. Yet we have whiners complaining that they aren't doing enough??? Insane.



        • #44
          Ok, where to begin...

          First off, the article is biased nonsense that doesn't take the LAW into account. Fedora/RH is a **UNITED STATES** based linux distro, and is thus subject to USA LAW. That means they must respect IP or the US government will come down hard on them.

          Second, Fedora ***DOES*** support H.264, BONE STOCK, straight out of the box, no hacks, no 3rd party repos, nothing funny. Just not SOFTWARE codecs, which are pretty useless anyway. Certain hardware decoders (Broadcom CrystalHD, maybe some Intel GPUs, etc.) can decode H.264, and Fedora *DOES* include open source drivers for some of those. The reason these are permitted is that the drivers don't actually do any of the decoding, the hardware does, and the hardware is LICENSED, hence LEGAL.



          • #45
            Originally posted by smitty3268 View Post
            Google wouldn't disable it on old hardware - just new. Like when Android 5.0 comes out, for example. One of the hardware requirements to run 5.0 would be VP8 hardware acceleration. With VP8 Youtube videos, I think it might be possible. You could get places like ESPN to start serving up WebM streams along with h264. But this is a large step, and one that Google is obviously nervous about taking. It probably never will. It's exactly the kind of radical step that a company like Apple might do, to try and lock users into their format and push the market where they want it to go. And exactly the type of big step I think Google has been reluctant to take.
            It's not that bad; they have been offering the hardware decoder design for free for a while, and at least Rockchip has implemented it in many of their SoCs.
            If even many of the Chinese Android phones have hardware vp8, it wouldn't exactly be a huge investment for the big players.



            • #46
              Originally posted by XorEaxEax View Post
              When I did the tests I used the new constant quality option in vpxenc (basically the same as the constant rate factor in x264, although this one works in two-pass):
              The scale is the same as in x264? Google actually did tuning so that's the case? I kinda doubt that. So the same number probably doesn't mean the same thing to both encoders.

              Originally posted by XorEaxEax View Post
              I compared this to the output of:
              x264 --subme=10 --ref=16 --bframes=16 --trellis=2 --no-fast-pskip --rc-lookahead=40 --8x8dct --me=umh --merange=24 --b-adapt=2 --analyse=all --partitions=all --b-pyramid=2 --cabac --direct=auto --crf=16
              crf 16?? That's overkill. x264 reaches transparency somewhere around 18; real-life encodes are done with 19 or 20. If you're comparing at sizes where both encoders reach transparency, no wonder you won't see differences between them. Were the file sizes of the two videos actually the same? Usually, to compare encoders, one does 2-pass encodes to a target bitrate.
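
              To illustrate what such a comparison looks like (a sketch only; input.y4m and the flag sets are placeholders, not settings either of us necessarily used):
              Code:
              # x264: two-pass encode to a fixed bitrate (kbps)
              x264 --pass 1 --bitrate 1050 --preset slower -o /dev/null input.y4m
              x264 --pass 2 --bitrate 1050 --preset slower -o out-x264.mkv input.y4m
              # vpxenc: two-pass VBR to the same target, so the file sizes are comparable
              vpxenc input.y4m -o out-vp8.webm --passes=2 --end-usage=vbr --target-bitrate=1050 --good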

              Originally posted by XorEaxEax View Post
              Give it a shot; I think you'd be surprised.
              Ok, I gave libvpx-1.0.0 a shot. Command-line is from http://www.webmproject.org/tools/encoder-parameters/ - "2-Pass Faster VBR Encoding". Target bitrate same as with my previous test, 1050 kbps. The command-line for x264:
              Code:
              x264 --pass [1/2] --bitrate 1050 --preset slower --tune film --bframes 5
              (BTW, the x264 encode with crf 20 is under 1000 kbps)

              The result: http://www.imagebam.com/gallery/ue7f...muddrjx3yz0ze/. Now compare that with the old test; I took snapshots of the same frames as then, so you can easily compare. What I see is that libvpx is significantly better in one case, but still far from ideal in another. Could be a ratecontrol thing, but ratecontrol is also extremely important.
              One other thing: libvpx used to be helluva slow, like half the speed of x264. This has improved *a lot*, and I really mean that. I'm glad I did the test again; I now have more up-to-date experience with libvpx. I will not call it slow anymore.
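
              If anyone wants to check the speed difference themselves, a trivial sketch (file names and flags are just examples, not my exact settings):
              Code:
              # Wall-clock both encoders on the same source
              time x264 --preset slower --crf 20 -o out.mkv input.y4m
              time vpxenc input.y4m -o out.webm --good --passes=2 --end-usage=cq --cq-level=16 --target-bitrate=100000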

              PS. Last time I was asked why I didn't use the top settings for both encoders (--preset veryslow for x264 and "2-Pass Best Quality VBR Encoding" for libvpx). So this time I did. Quality was pretty much the same, while the encoding took a lot longer; that's why. The settings I chose are something I'd actually use for a real encode. Going with "better" settings only increases encode time while not producing a visibly better picture.

              Originally posted by XorEaxEax View Post
              Yet we have whiners complaining that they aren't doing enough???
              I'm whining?



              • #47
                Originally posted by Gusar View Post
                The scale is the same as in x264? Google actually did tuning so that's the case? I kinda doubt that. So the same number probably doesn't mean the same thing to both encoders.
                No, it's not. I had to change the quality numbers so that the resulting file sizes matched as closely as possible. But as I couldn't recall the exact numbers, I used the same values here to avoid the 'hey! these are not the same quality settings' comments, which would have forced me to explain that they don't map 1:1 and that I therefore had to go for the same resulting file sizes and compare those visually. Now that was pointless, as I still had to explain why anyway.
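
                In sketch form, the size-matching boils down to something like this (input.y4m is a placeholder and the cq-levels to try are arbitrary):
                Code:
                # Sweep cq-levels, then keep the output closest in size to the x264 file
                for q in 12 16 20 24; do
                    vpxenc input.y4m -o out-cq$q.webm --passes=2 --end-usage=cq --cq-level=$q --target-bitrate=100000
                done
                ls -l out-cq*.webm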

                Originally posted by Gusar View Post
                crf 16?? That's overkill. x264 reaches transparency somewhere around 18; real-life encodes are done with 19 or 20.
                Again, the number entered here was rather arbitrary. I did try 16 (and vpxenc numbers giving equal resulting file sizes) but also others; I know I went down to at least --crf 23 on x264 in my tests (iirc I did about 4 tests at different crf settings). These tests were done sometime in early February, so I don't recall the vpxenc quality settings that resulted in the same (roughly, as in +/- 2-3 MB) file sizes.

                Originally posted by Gusar View Post
                Ok, I gave libvpx-1.0.0 a shot. Command-line is from http://www.webmproject.org/tools/encoder-parameters/ - "2-Pass Faster VBR Encoding". Target bitrate same as with my previous test, 1050 kbps.
                Well, those example settings are _really_ old, as has been criticized on IRC and the mailing lists. In order to do a fair comparison you really should use the new constant quality option, particularly together with two-pass (which is really fast). Also, you get points for using Firefly footage, but please, some Inara (Morena Baccarin) shots, goddamnit!

                Originally posted by Gusar View Post
                One other thing: libvpx used to be helluva slow, like half the speed of x264. This has improved *a lot*, and I really mean that.
                Yes, I believe one or two of the past releases were pretty much focused on improving the speed of the encoder (which, as you stated, was sorely needed).

                Originally posted by Gusar View Post
                I'm whining?
                Well, it wasn't directed at you in particular but was rather a general statement; however, if the shoe fits...



                • #48
                  Originally posted by TheBlackCat View Post
                  I am pretty sure they do it by being in a country that does not allow patents on software (i.e. pretty much any country besides the U.S.).
                  Where you are based really doesn't matter a great deal, as the U.S. is a very, ahem, enthusiastic legal jurisdiction. The U.S. legal system considers itself to have fairly extensive rights to haul foreign bodies (people, companies) up in front of itself.

                  No matter where you or your company is based - Europe, Canada, Timbuktu - if you distribute software into the U.S., you are entirely subject to being summonsed to a U.S. court hearing. You can, of course, choose not to show up and to ignore any judgements that are made against you. No-one's going to get extradited for a civil case (which is what patent cases are). But if you lose a U.S. patent case, then your opponent can certainly use all the various machinery of U.S. law enforcement to try and get the money they're awarded out of you, and if any of your cash flow goes through the U.S., they might succeed.

                  Really, when it comes to patent cases - any civil law cases - the question of where anyone is based doesn't matter a whole lot. The more important thing to do is, as it so often is, to follow the money. For a start, it's almost never worth suing someone who doesn't have any money, which is why organizations that don't actually make any money usually get away with infringing patents. MPEG-LA is not going to bother suing, say, RPM Fusion for 'distributing patented software in the U.S.', because RPM Fusion doesn't have any money, and if it gets RPM Fusion shut down, something else will mushroom up in its place. It's just not worth their while. But if the potential lawsuit subject is making $10m a year in sales to the U.S., even if it's theoretically based in Nepal, you can bet they're going to sue.



                  • #49
                    Originally posted by droidhacker View Post
                    Ok, where to begin...

                    First off, the article is biased nonsense that doesn't take the LAW into account. Fedora/RH is a **UNITED STATES** based linux distro, and is thus subject to USA LAW. That means they must respect IP or the US government will come down hard on them.

                    Second, Fedora ***DOES*** support H.264, BONE STOCK, straight out of the box, no hacks, no 3rd party repos, nothing funny. Just not SOFTWARE codecs, which are pretty useless anyway. Certain hardware decoders (Broadcom CrystalHD, maybe some Intel GPUs, etc.) can decode H.264, and Fedora *DOES* include open source drivers for some of those. The reason these are permitted is that the drivers don't actually do any of the decoding, the hardware does, and the hardware is LICENSED, hence LEGAL.
                    Erm.

                    Software decoding of h.264 is perfectly useful on your typical modern PC, because they're all quite capable of doing it in software. And you seem to be rather overstating the case for hardware decoding in Fedora. crystalhd is in fact the _only_ hardware decoding that Fedora supports. There is no current open source support for hardware decoding on NVIDIA or Radeon cards; you have to use the closed-source drivers to get hardware decoding on those cards. There is open source code to do hardware decoding of some formats on Intel hardware, but our legal review determined that too much of the decoding is actually done by the driver, not by on-chip firmware, and so we couldn't legally ship those bits. The plugins for libva decoding of patented formats on Intel adapters are stripped out of the Fedora libva package for this reason.
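
                    As an aside, one way to see what decode profiles the installed VA-API driver actually advertises (a sketch; vainfo ships with libva's utilities, and the exact output varies by driver):
                    Code:
                    # List advertised VA-API profiles/entrypoints, filtered for h264
                    vainfo | grep -i h264
                    On a stock Fedora install with those plugins stripped out, you'd expect this to come back empty.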



                    • #50
                      Originally posted by AdamW View Post
                      Erm.

                      Software decoding of h.264 is perfectly useful on your typical modern PC, because they're all quite capable of doing it in software. And you seem to be rather overstating the case for hardware decoding in Fedora. crystalhd is in fact the _only_ hardware decoding that Fedora supports. There is no current open source support for hardware decoding on NVIDIA or Radeon cards; you have to use the closed-source drivers to get hardware decoding on those cards. There is open source code to do hardware decoding of some formats on Intel hardware, but our legal review determined that too much of the decoding is actually done by the driver, not by on-chip firmware, and so we couldn't legally ship those bits. The plugins for libva decoding of patented formats on Intel adapters are stripped out of the Fedora libva package for this reason.
                      Ok, I did know about nvidia/radeon not having it, but wasn't certain about intel. As for decoding in software... sure, it *can* be done, but you'll get your CPU fan spinning as fast as a dental drill to shed the heat from working so hard.

                      Hopefully the UVD stuff passes legal review soon, so that Radeon cards can be added to the list. CrystalHD and UVD have common roots, so presumably the legal issues should be similar if AMD's code goes through the same review.

