Another Look At The Latest Nouveau Gallium3D Driver

  • Originally posted by Thatguy View Post
    It doesn't make a difference; it still has to cut bitrate and dither, which costs clock cycles. Being as my video hardware is unsupported, I can only display at that resolution currently.
    I most assuredly know that it does make a difference. This is easily seen when resizing a window while the video is playing. The bigger the window, the greater the load.

    But seriously, it's a non-issue. 99% of users aren't compiling a kernel while watching a video. Usually they are just watching the video.
    No, they may not be compiling a kernel, but they may be doing things such as recording, or timeshifting while watching another show, which some capture hardware can do. Many devices rely on software encoding, and that starts adding to the load.

    Also, the bitrate of Big Buck Bunny is typical of most video streaming bitrates. I have no idea where you're coming up with the counter-argument.
    It is hardly representative of HD bitrates. Maybe for web streaming, but not for any kind of local playback. The sample I put up, however, is very representative of the video one can expect from HD rips, HD stream captures, HD personal video cameras, HD video encoders, etc. It falls within the specs of the typical AVCHD format that many consumer devices record to.

    I can even play 4 of the same video at the same size simultaneously, which was my point.
    Doesn't mean anything if you can't decode even a medium-bitrate HD stream (let alone one that is CABAC instead of CAVLC).

    In today's day and age of fast x86 CPUs, we simply don't have a huge pressing need for video acceleration.
    Oh, how wrong you are, as it allows for smaller form-factor devices with less power-hungry processors that can be freed to do other tasks. CPU decoding is like trying to use a claw hammer to pound in a railway tie spike. Sure, it can be done, but it isn't efficient at all.

    Now, if you're saying that ARM needs video acceleration, I wholeheartedly agree. But most of today's modern dual-core and up systems are completely capable of playing video with zero acceleration.
    With no post-processing and scaled-down video, sure they are, and they are extremely inefficient at it.

    and 1980x1020 isn't even all that popular.
    WHAT? Lol, you have to be kidding me. It is pretty much the standard.

    Most video on PCs is DVD or lower quality. Blu-ray players aren't even very common accessories yet.
    Oh man, you must live under a rock someplace (here I thought I lived in the boonies). HD is everywhere: from the TV you watch live, to streaming content, to devices like Blu-ray. If it isn't, ask yourself why it is that a majority of TVs and monitors are 1080p nowadays?



    • Originally posted by deanjo View Post
      I most assuredly know that it does make a difference. This is easily seen when resizing a window while the video is playing. The bigger the window, the greater the load.
      Umm, not on my PC it doesn't. But you know, I'm not going to keep having a semantic debate.



      No, they may not be compiling a kernel, but they may be doing things such as recording, or timeshifting while watching another show, which some capture hardware can do. Many devices rely on software encoding, and that starts adding to the load.

      I'd be more concerned about disk write speed if I was watching and recording video at the same time, especially with one HDD in the machine.

      It is hardly representative of HD bitrates. Maybe for web streaming, but not for any kind of local playback. The sample I put up, however, is very representative of the video one can expect from HD rips, HD stream captures, HD personal video cameras, HD video encoders, etc. It falls within the specs of the typical AVCHD format that many consumer devices record to.
      It's a 1980x1020, 16:9, 1 GB file for a 4-minute video. That's pretty damn representative of the typical bitrates found on Blu-rays.



      Doesn't mean anything if you can't decode even a medium-bitrate HD stream (let alone one that is CABAC instead of CAVLC).

      Give me a stream and I will benchmark decoding it. I am not even remotely concerned. Again, your point is moot on most modern hardware.


      Oh, how wrong you are, as it allows for smaller form-factor devices with less power-hungry processors that can be freed to do other tasks. CPU decoding is like trying to use a claw hammer to pound in a railway tie spike. Sure, it can be done, but it isn't efficient at all.
      Yes, because a 13-inch netbook screen renders beautiful 1024x780 images.

      You're making an argument against, not for, video acceleration. How HD does a 3.5-inch screen need to be?

      With no post-processing and scaled-down video, sure they are, and they are extremely inefficient at it.
      I'm not suffering.



      WHAT? Lol, you have to be kidding me. It is pretty much the standard.
      What's the standard? 480p web video? Yeah, it's pretty much standard.



      Oh man, you must live under a rock someplace (here I thought I lived in the boonies). HD is everywhere: from the TV you watch live, to streaming content, to devices like Blu-ray. If it isn't, ask yourself why it is that a majority of TVs and monitors are 1080p nowadays?

      TV and Blu-ray are the only things (aside from gaming) pushing bitrate in any direction beyond 680.

      That's how it is. The bulk of videos online are sub-480.

      I have a 65-inch TV in my living room and a 1080p cable service with 150 HD channels. I don't watch TV on my 23-inch monitor.



      • Originally posted by Thatguy View Post
        Seriously?

        I mean, I can decode 1080p MP4 just fine with 50% of one 3.2 GHz CPU core, no problem.
        Have you tried it on, say, a machine like an HTPC with an Intel E3300 while it's also recording two or three shows as well?



        • Originally posted by Thatguy View Post
          I'd be more concerned about disk write speed if I was watching and recording video at the same time, especially with one HDD in the machine.
          Shouldn't concern you at all; I regularly record 2 or 3 HD streams at once on one drive. With the cache buffers that are around today, it is not a concern.

          It's a 1980x1020, 16:9, 1 GB file for a 4-minute video. That's pretty damn representative of the typical bitrates found on Blu-rays.
          Hardly; nice stretch of the facts, however. It is a 690 MB file for a 10-minute clip with CAVLC encoding at an average of 9 Mbit/s; that is not even close to AVCHD standards, let alone Blu-ray. Can you exaggerate any more? (1080p H.264 version.)
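          The arithmetic behind that 9 Mbit/s figure is a quick sanity check anyone can reproduce. A minimal sketch (the file size and duration are the 690 MB / 10-minute numbers quoted above; the 40 Mbit/s ceiling is the maximum video bitrate allowed by the Blu-ray spec):

```python
# Back-of-the-envelope average bitrate from file size and duration.
# Numbers are taken from the post above (690 MB file, 10-minute clip).

def avg_bitrate_mbps(size_mb: float, duration_s: float) -> float:
    """Average bitrate in Mbit/s: megabytes -> megabits, divided by seconds."""
    return size_mb * 8 / duration_s

clip = avg_bitrate_mbps(690, 10 * 60)
print(f"{clip:.1f} Mbit/s")  # -> 9.2 Mbit/s, matching the figure in the post

# For comparison, Blu-ray video commonly runs far higher; the BD spec
# allows up to 40 Mbit/s for the video stream alone.
BLURAY_MAX_VIDEO_MBPS = 40
print(f"headroom vs. Blu-ray max: {BLURAY_MAX_VIDEO_MBPS / clip:.1f}x")
```

          By the same formula, a 1 GB file that really was only 4 minutes long would average roughly 33 Mbit/s, so the quoted size/length figures and the Blu-ray-grade bitrate claim cannot both be right.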

          Give me a stream and I will benchmark decoding it. I am not even remotely concerned. Again, your point is moot on most modern hardware.
          I gave you a link to a representative stream in an earlier post.

          Yes, because a 13-inch netbook screen renders beautiful 1024x780 images.
          You're making an argument against, not for, video acceleration. How HD does a 3.5-inch screen need to be?
          Lol, netbooks no, but nettops are extremely popular for hooking up as an HTPC to a 1080p home-theater TV. Nice diversion of the subject at hand.

          I'm not suffering.
          See the above posts for why; try some real HD content.

          What's the standard? 480p web video? Yeah, it's pretty much standard.
          LMFAO, you think everybody just watches YouTube?

          TV and Blu-ray are the only things (aside from gaming) pushing bitrate in any direction beyond 680.
          Lol, that's a fairly big use for video, don't you think?

          That's how it is. The bulk of videos online are sub-480.
          Again, streaming online video is not the majority of what people watch video on.

          I have a 65-inch TV in my living room and a 1080p cable service with 150 HD channels. I don't watch TV on my 23-inch monitor.
          So do I. I do, however, have a very nice low-power PC that serves as a complete media hub for playback of all HD material. The cable boxes are all hooked up to the PC, which gives me ample storage room and the ability to do simple things like edit out commercials and store shows for later use on the multiple HD TVs we have around the house, where everybody can enjoy them in whatever location they please. Probably the number one use of nettops is for HD HTPC setups.



          • Originally posted by mugginz View Post
            Have you tried it on, say, a machine like an HTPC with an Intel E3300 while it's also recording two or three shows as well?
            I can easily do that on an underclocked X2 processor @ 800 MHz and still timeshift, thanks to video decode acceleration and hardware encoding, all while consuming extremely little power.



            • Not that I'm necessarily agreeing with ThatGuy, but I do have to disagree with some of the things you're saying.

              Originally posted by deanjo View Post
              Again, streaming online video is not the majority of what people watch video on.
              Proof? I'd be shocked if that wasn't what the majority of video watched on a PC was.

              If it isn't, ask yourself why it is that a majority of TVs and monitors are 1080p nowadays?
              They aren't. The Steam survey says < 30% of monitors are 1080p or higher. The most popular resolution is 1680x1050, with 1920x1080 and 1280x1024 just behind. And that's a survey of gamers, which is almost certainly going to be biased towards people with nicer-than-average hardware.

              I know all of your hardware is 1080p, and that you do all sorts of fancy DVR stuff on your machines, but you shouldn't then assume that everyone else must be doing the same thing.



              • Originally posted by smitty3268 View Post
                Proof? I'd be shocked if that wasn't what the majority of video watched on a PC was.
                I didn't say on a PC; video as a whole.

                They aren't. The Steam survey says < 30% of monitors are 1080p or higher. The most popular resolution is 1680x1050, with 1920x1080 and 1280x1024 just behind. And that's a survey of gamers, which is almost certainly going to be biased towards people with nicer-than-average hardware.
                I am referring to items being sold in the present market. Just 3 years ago, Steam said 1024x768 and a monitor < 16" were the most popular, with 2% being capable of running 1080p. The other thing is that Steam reports what resolution people run their desktop at for gaming, not what their monitor is capable of.

                I know all of your hardware is 1080p, and that you do all sorts of fancy DVR stuff on your machines, but you shouldn't then assume that everyone else must be doing the same thing.
                I don't assume that; I'm looking at what is presently being marketed.



                • Is it worth even bothering with these any time soon (unless you're a dev) when there's still an excellent binary driver?



                  • Originally posted by jalyst View Post
                    Is it worth even bothering with these any time soon (unless you're a dev) when there's still an excellent binary driver?
                    For practical end-user needs, there is no real reason to consider anything but the binary driver.



                    • Thanks for substantiating my suspicions, Dean!
                      If you've got a minute or two, I don't suppose you could help me out here too?

                      Discrete GT_430 Vs i3-530's Intel HD

                      Thanks if you can, all the best.



                      • Originally posted by crazycheese View Post
                        Linux is not 500 OSes. It is one OS with open specs. If the BSD people want it and have the manpower, they too can port it to BSD. If they pay money to AMD and reach a specific amount, AMD will do this job for them. But AMD should start counting market-aware, not blind. This is no different from clones, respins, etc. People want, people pay: time or money. The thing is, I pay for AMD hardware and get nothing to use it efficiently. Except it works efficiently in Windows (tada!).
                        I would like to point out that Linux is a kernel, not an operating system. It is used in roughly 500 different operating systems, many of which are largely binary compatible due to the use of a GNU userland.

                        Furthermore, BSD died 16 years ago. A few forks occurred at that point, and each one is a separate OS developed in its own tree. The mission of the developers of the BSD forks is to make source code available to anyone for any purpose, including situations where improvements cannot be contributed back. Your suggestion that the developers of BSD forks can port GPL-licensed drivers to their platform is a poor one at best. Licensing-wise, the BSD fork developers cannot touch GPL-licensed drivers any more than the Linux developers can touch Windows drivers.


                        Originally posted by crazycheese View Post
                        Has RPM had its dependency hell corrected? And even with openSUSE vs RHEL, although both use RPM, they won't be compatible: different compiler versions, different kernels, different configure options, different naming. I adhere more to the Exherbo way: standardization where standardization makes sense.
                        You should try Gentoo Linux. It resolved dependency hell years ago.



                        • Originally posted by Shining Arcanine View Post
                          Your suggestion that the developers of BSD forks can port GPL-licensed drivers to their platform is a poor one at best. Licensing-wise, the BSD fork developers cannot touch GPL-licensed drivers any more than the Linux developers can touch Windows drivers.
                          It's probably worth mentioning here that Linux graphics drivers are generally X11-licensed (compatible with both BSD and GPL), not GPL-licensed. There may be exceptions, but (for example) the ATI/AMD drivers are all X11, AFAIK.



                          • Originally posted by Shining Arcanine View Post
                            I would like to point out that Linux is a kernel, not an operating system. It is used in roughly 500 different operating systems, many of which are largely binary compatible due to the use of a GNU userland.
                            There is the Linux kernel (the name is owned by Linus Torvalds) and there are Linux-kernel-based operating systems, which are very flexible, very transparent, and either compatible or able to be made compatible.

                            Originally posted by Shining Arcanine View Post
                            Furthermore, BSD died 16 years ago. A few forks occurred at that point, and each one is a separate OS developed in its own tree. The mission of the developers of the BSD forks is to make source code available to anyone for any purpose, including situations where improvements cannot be contributed back. Your suggestion that the developers of BSD forks can port GPL-licensed drivers to their platform is a poor one at best. Licensing-wise, the BSD fork developers cannot touch GPL-licensed drivers any more than the Linux developers can touch Windows drivers.
                            BSD can use the GPL license for anything they want. They have blobs in the kernel; what's the problem with GPL? Legally it is absolutely correct and possible, unless they want a pure BSD kernel. In this case they are left with a simple choice:
                            - either rewrite the component and allow it to be taken into proprietary code (which may mean legal action from the original GPL author, especially if it is a corporate entity);
                            - or accept the GPL license and disallow this subsystem from being closed down (stolen) into any proprietary code.

                            Look, they have choices, and no one denies them access to the code.

                            Originally posted by Shining Arcanine View Post
                            You should try Gentoo Linux. It resolved dependency hell years ago.
                            Oh, I have been using Gentoo for nearly 3 years now. It has many, many shortcomings. But this does not relate to the original idea I mentioned, standardization where standardization makes sense, the Exherbo way. If they want binary compatibility because it makes sense, they should ally; if it is not beneficial, they should not. If BSD wants a BSD-only gfx subsystem, they should rewrite it; if it does not make sense for them, they will be just FINE with copyleft.



                            • Originally posted by crazycheese View Post
                              There is the Linux kernel (the name is owned by Linus Torvalds) and there are Linux-kernel-based operating systems, which are very flexible, very transparent, and either compatible or able to be made compatible.


                              BSD can use the GPL license for anything they want. They have blobs in the kernel,
                              I feel a distinction between the different BSDs is necessary here. The OpenBSD people hate blobs and won't have any of it, whereas the FreeBSD people modified the OS specifically so it could run the NVIDIA blob.

                              Originally posted by crazycheese View Post
                              what's the problem with GPL? Legally it is absolutely correct and possible, unless they want a pure BSD kernel. In this case they are left with a simple choice:
                              - either rewrite the component and allow it to be taken into proprietary code (which may mean legal action from the original GPL author, especially if it is a corporate entity);
                              - or accept the GPL license and disallow this subsystem from being closed down (stolen) into any proprietary code.

                              Look, they have choices, and no one denies them access to the code.
                              Agreed.

