nVidia likely to remain accelerated video king?


  • #71
    It's not becoming increasingly moot. Try comparing output quality between a CPU-driven decoder and a GPU-driven decoder. The GPU decoder wins EVERY time. ATI knows this, and they've gone to great lengths to support higher HD H.264/AVC profiles than Nvidia does on Windows.

    CPU decoded HD looks like ass.



    • #72
      Originally posted by LinuxID10T View Post
      Also think of it this way... Getting an Nvidia card for GPU decode defeats the purpose due to the MASSIVE amount of heat and power they use. In the day of sub-$100 quad cores, GPU decode is becoming increasingly moot.
      For a Home Theatre PC, GPU video decode is quite important. With these machines, low power draw and very quiet operation are two big design parameters. Pairing a low-power CPU with a passively cooled, low-end graphics card whose video decode hardware picks up the slack is quite common. If the machine is running 24/7, you obviously don't want it burning heaps of power.

      An HTPC can be required to play back an HD video stream while at the very same time recording three or more other video streams from multiple tuners. That's not to mention picture-in-picture, where you could be decoding two HD video streams concurrently while recording other streams.

      I will say that for a desktop machine hardware video decode is less important, as you rightly point out given the ever-increasing amount of CPU throughput, but there are other situations where it's a great feature to have. I do hope AMD provide a public API to their UVD via fglrx.
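
      To make that last wish a bit more concrete: from an application's point of view, such a decode API on Linux typically looks like VA-API or VDPAU. Purely as an illustration (this is a minimal sketch against the generic libva interface, not AMD's UVD/fglrx API, and the file name is made up), querying which decode profiles a driver advertises might look like this:

      ```c
      /* Hypothetical sketch: list the decode profiles a VA-API driver exposes.
       * Assumes libva with X11 support; build with:
       *   gcc va_profiles.c -o va_profiles -lva -lva-x11 -lX11
       */
      #include <stdio.h>
      #include <stdlib.h>
      #include <X11/Xlib.h>
      #include <va/va.h>
      #include <va/va_x11.h>

      int main(void)
      {
          Display *x11 = XOpenDisplay(NULL);
          if (!x11) {
              fprintf(stderr, "cannot open X display\n");
              return 1;
          }

          VADisplay va = vaGetDisplay(x11);
          int major = 0, minor = 0;
          if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
              fprintf(stderr, "vaInitialize failed\n");
              return 1;
          }
          printf("VA-API %d.%d, vendor: %s\n", major, minor, vaQueryVendorString(va));

          /* Ask the driver which codec profiles it can accelerate in hardware. */
          int num = vaMaxNumProfiles(va);
          VAProfile *profiles = malloc(num * sizeof(*profiles));
          if (profiles && vaQueryConfigProfiles(va, profiles, &num) == VA_STATUS_SUCCESS) {
              for (int i = 0; i < num; i++)
                  printf("  profile id %d%s\n", (int)profiles[i],
                         profiles[i] == VAProfileH264High ? "  (H.264 High)" : "");
          }

          free(profiles);
          vaTerminate(va);
          XCloseDisplay(x11);
          return 0;
      }
      ```

      If the driver lists the H.264 High profile there, the HTPC case above (decode one stream in hardware while the CPU handles recording and everything else) stays realistic even on a very low-power box.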



      • #73
        Originally posted by IsawSparks View Post
        It's not becoming increasingly moot. Try comparing output quality between a CPU-driven decoder and a GPU-driven decoder. The GPU decoder wins EVERY time. ATI knows this, and they've gone to great lengths to support higher HD H.264/AVC profiles than Nvidia does on Windows.

        CPU decoded HD looks like ass.
        You obviously don't have a clue what you are talking about. GPU decoded video and CPU decoded video look EXACTLY the same. The quality difference you are talking about is due to video postprocessing. BTW, judging by that last line, "CPU decoded HD" must be another name for "IsawSparks".



        • #74
          Originally posted by LinuxID10T View Post
          You obviously don't have a clue what you are talking about. GPU decoded video and CPU decoded video look EXACTLY the same. The quality difference you are talking about is due to video postprocessing. BTW, judging by that last line, "CPU decoded HD" must be another name for "IsawSparks".
          They don't look exactly the same. All CPU-driven decoders sacrifice detail for performance. Dedicated decoders do not, and post processing is a requirement for FULL RATE HD, such as 1080p Blu-ray AVC @ Profile 4.1 and higher. CPU-driven HD @ profile 4.1 or higher does indeed look like ass.

          What's with all the noobs coming here and making personal attacks these days? Phoronix is a site dedicated to technical analysis; if you can't argue with facts, you don't belong here.



          • #75
            Originally posted by bridgman View Post
            There is also an important distinction between "firmware" that runs on the main CPU and "firmware" which runs on the device itself. Unfortunately even that distinction gets a bit fuzzy in cases where the device includes a general purpose CPU and what amounts to an entire driver running on the device. Sometimes that firmware is burned into the device, sometimes it has to be loaded by the driver.
            In terms of openness, no. If it's a binary blob, it's a binary blob. It doesn't matter if it's running on the CPU of some other device.

            The only safe thing is to ask lots of questions when someone talks about "firmware" or "microcode" so you can understand the impact, e.g.:

            - does it run on the main CPU or on the device?
            - how big is it, i.e. does it represent a big chunk of the driver stack?
            - is it running on a hardware state machine or on a general purpose processor?
            - etc...
            Again, it doesn't matter where the code is running. The only thing that matters, in terms of openness, is whether or not you can replace the blob and whether or not you can see what it's doing. The size really doesn't matter to anything; whether it's a very large blob or a very small blob is irrelevant. Of course, I can accept a small blob more readily than a large one if I have to, but that's only because it's much more difficult to hide things in small objects and they are much easier to examine. The type of machine it's running on (again, in terms of openness) doesn't mean anything to anyone except the machine itself. Etc...



            • #76
              I still don't understand why replaceability is a factor if you're not replacing it. How is a stable, unchanging microcode image loaded by the driver any different from the same stable, unchanging microcode image stored internally to the hardware, other than the driver "feeling dirty" because it has to touch the microcode?
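
              For what it's worth, "loaded by the driver" really is just shuffling bytes. A minimal sketch of what a Linux driver typically does (the helper name, firmware file name and register window are made up for illustration; request_firmware()/release_firmware() are the real in-kernel interfaces):

              ```c
              #include <linux/firmware.h>
              #include <linux/device.h>
              #include <linux/io.h>

              /* Hypothetical helper: fetch an opaque microcode blob from /lib/firmware
               * and copy it into the device's instruction memory. The blob never
               * executes on the host CPU; it only ever runs on the device itself. */
              static int example_load_microcode(struct device *dev, void __iomem *ucode_mem)
              {
                  const struct firmware *fw;
                  int err;

                  /* Looks for /lib/firmware/example/decoder_ucode.bin (name is made up). */
                  err = request_firmware(&fw, "example/decoder_ucode.bin", dev);
                  if (err)
                      return err;

                  /* The whole "load" step: copy the bytes across the bus, unchanged. */
                  memcpy_toio(ucode_mem, fw->data, fw->size);

                  release_firmware(fw);
                  return 0;
              }
              ```

              Whether those same bytes sit in an on-board ROM or in /lib/firmware, that copy is the only difference on the driver side.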



              • #77
                Originally posted by IsawSparks View Post
                They don't look exactly the same. All CPU-driven decoders sacrifice detail for performance. Dedicated decoders do not, and post processing is a requirement for FULL RATE HD, such as 1080p Blu-ray AVC @ Profile 4.1 and higher. CPU-driven HD @ profile 4.1 or higher does indeed look like ass.

                What's with all the noobs coming here and making personal attacks these days? Phoronix is a site dedicated to technical analysis; if you can't argue with facts, you don't belong here.
                That is COMPLETELY DEPENDENT on which decoder you use. It has only to do with the software, not the fact that the CPU is decoding it. BTW, you are the noob here, and if you were interested in technical analysis, you would know that what you are saying is completely qualitative. Where the hell did you get the idea that postprocessing is a requirement for "FULL RATE HD"? Postprocessing is done AFTER the video has been decoded, so you clearly have no idea what you are talking about. You ought to post an example if you think you are right. CPU decoders don't lose any detail. Essentially, most GPU decoders ADD detail that wasn't in the original video, but CPU decoders can do that too. By adding detail, I mean deinterlacing, interpolation, and antialiasing. Get your facts straight before you post crap like this.

                BTW, a standard is a standard; that is why if my video is purple, yours shouldn't be green.
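
                To illustrate what I mean, here is a trivial "bob" deinterlacer over one 8-bit luma plane, purely as a sketch (the function and buffer layout are made up, nothing is taken from a real decoder): this kind of postprocessing is plain software and runs the same on a CPU as anywhere else.

                ```c
                #include <stdio.h>
                #include <stdint.h>

                /* Hypothetical sketch: keep one field of an interlaced frame and rebuild
                 * the other field by averaging the kept lines above and below. */
                static void bob_deinterlace(const uint8_t *src, uint8_t *dst,
                                            int width, int height, int keep_top_field)
                {
                    for (int y = 0; y < height; y++) {
                        int kept = ((y & 1) == 0) == (keep_top_field != 0);
                        if (kept) {
                            for (int x = 0; x < width; x++)      /* kept field: copy as-is */
                                dst[y * width + x] = src[y * width + x];
                        } else {
                            int above = (y > 0) ? y - 1 : y + 1;  /* clamp at the edges */
                            int below = (y < height - 1) ? y + 1 : y - 1;
                            for (int x = 0; x < width; x++)      /* other field: interpolate */
                                dst[y * width + x] =
                                    (uint8_t)((src[above * width + x] + src[below * width + x] + 1) / 2);
                        }
                    }
                }

                int main(void)
                {
                    /* Tiny 4x4 test plane: even lines bright, odd lines dark. */
                    uint8_t src[16] = { 200,200,200,200,  10,10,10,10,
                                        200,200,200,200,  10,10,10,10 };
                    uint8_t dst[16];

                    bob_deinterlace(src, dst, 4, 4, 1);

                    for (int y = 0; y < 4; y++) {
                        for (int x = 0; x < 4; x++)
                            printf("%4d", dst[y * 4 + x]);
                        printf("\n");
                    }
                    return 0;
                }
                ```

                Whether it's this, scaling, or antialiasing, none of it needs dedicated decode hardware; it's just more or less expensive software.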



                • #78
                  Originally posted by IsawSparks View Post
                  They don't look exactly the same. All CPU-driven decoders sacrifice detail for performance. Dedicated decoders do not, and post processing is a requirement for FULL RATE HD, such as 1080p Blu-ray AVC @ Profile 4.1 and higher. CPU-driven HD @ profile 4.1 or higher does indeed look like ass.

                  What's with all the noobs coming here and making personal attacks these days? Phoronix is a site dedicated to technical analysis; if you can't argue with facts, you don't belong here.
                  AFAIK LinuxID10T is correct here, or I could say you're both right. Yes, post processing makes a huge difference in video quality, and yes, today you see much more post processing on GPU decode stacks than on CPU decode stacks, but that post processing is *not* done on the decode hardware anyway; it's done on shaders and *could* be added to a CPU decode path as well. It just hasn't been done yet, primarily because the folks doing the most aggressive postprocessing are proprietary driver developers and they use dedicated decode hardware if it's available.



                  • #79
                    Originally posted by LinuxID10T View Post
                    That is COMPLETELY DEPENDENT on which decoder you use. It has only to do with the software, not the fact that the CPU is decoding it. BTW, you are the noob here, and if you were interested in technical analysis, you would know that what you are saying is completely qualitative. Where the hell did you get the idea that postprocessing is a requirement for "FULL RATE HD"? Postprocessing is done AFTER the video has been decoded, so you clearly have no idea what you are talking about. You ought to post an example if you think you are right. CPU decoders don't lose any detail. Essentially, most GPU decoders ADD detail that wasn't in the original video, but CPU decoders can do that too. By adding detail, I mean deinterlacing, interpolation, and antialiasing. Get your facts straight before you post crap like this.

                    BTW, a standard is a standard; that is why if my video is purple, yours shouldn't be green.
                    I know what the term post processing means. Do you have a clue what the profiles are for? No, you don't. They refer to the precision of sampling and the nature of movement. Post processing in HD terms IS NOT LIKE DVD. Its use is to interpolate movement based on the profile. In DVD that stuff was done regardless of the compression rate, with the single exception of non-interlaced DVD.

                    Stop talking like you understand when you're applying DVD standards to HD AVC content. The two are not the same and your argument is full of holes.



                    • #80
                      Originally posted by IsawSparks View Post
                      I know what the term post processing means. Do you have a clue what the profiles are for? No, you don't. They refer to the precision of sampling and the nature of movement. Post processing in HD terms IS NOT LIKE DVD. Its use is to interpolate movement based on the profile. In DVD that stuff was done regardless of the compression rate, with the single exception of non-interlaced DVD.

                      Stop talking like you understand when you're applying DVD standards to HD AVC content. The two are not the same and your argument is full of holes.
                      The profiles are for encoding, silly, not decoding. All the decoder has to do is play the file. Essentially, when a decoder is said to work up to a certain profile, it means it is limited by bitrate beyond that, not that it can't do the postprocessing. Even with what you are talking about, postprocessing is optional.

                      BTW, please don't argue with Bridgman's conclusion. He works at ATI and knows far more about the topic than either of us.

