
Bickering Continues About NVIDIA Using DMA-BUF


  • @D0pamine

    Well, since Sonadow mentioned he uses Gnome Shell and that it's heavyweight, I'm saying it's likely not a prime suspect of performance regression.
    See here.

    Seriously, I can understand that some people don't like GNOME Shell, but it's not THAT bad; I know people (including myself) who can use it fine and happily.

    @Sonadow

    There's some info here that mostly applies to all distros regarding speeding up the radeon driver, if you or someone else is interested. I wouldn't expect performance on par with Catalyst, though.
    Use at your own risk. If you mess anything up, you should still be able to boot without 3D acceleration by adding the "nomodeset" parameter in GRUB.
    It will likely not help with video playback: without video decode acceleration, decoding is mostly done by the CPU, aside from some player-specific tweaks that may help.
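    As a quick sketch of the "nomodeset" fallback mentioned above (the file path and commands assume a Debian/Ubuntu-style GRUB setup; adjust for your distro):

```shell
# One-off: at the GRUB menu, press 'e' on the boot entry, append "nomodeset"
# to the line beginning with "linux", then boot with Ctrl+X.

# Persistent: add nomodeset to the default kernel command line, then
# regenerate the GRUB config (Debian/Ubuntu-style layout assumed).
sudo sed -i 's/^\(GRUB_CMDLINE_LINUX_DEFAULT="[^"]*\)"/\1 nomodeset"/' /etc/default/grub
sudo update-grub
```

    The one-off edit is the safer first step, since it doesn't survive a reboot.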

    Comment


    • Originally posted by Rigaldo View Post
      @D0pamine

      Well, since Sonadow mentioned he uses Gnome Shell and that it's heavyweight, I'm saying it's likely not a prime suspect of performance regression.
      See here.

      Seriously, I can understand that some people don't like GNOME Shell, but it's not THAT bad; I know people (including myself) who can use it fine and happily.

      @Sonadow

      There's some info here that mostly applies to all distros regarding speeding up the radeon driver, if you or someone else is interested. I wouldn't expect performance on par with Catalyst, though.
      Use at your own risk. If you mess anything up, you should still be able to boot without 3D acceleration by adding the "nomodeset" parameter in GRUB.
      It will likely not help with video playback: without video decode acceleration, decoding is mostly done by the CPU, aside from some player-specific tweaks that may help.
      Why I don't like gnome-shell/gnome3/unity/etc.: I could say personal preference, but it's heavy, it uses the GPU to render, and the fallback mode is almost useless.

      LXDE isn't bad, as it's very light, and Openbox looks the part without using the GPU.

      MATE is GNOME 2; it has all the lovely functions of GNOME 2, like Caja scripts for example, so I'll just use what works for me. I really don't care what everyone else uses, but if someone has a problem with their GPU whilst using GNOME 3 and someone else doesn't have the same problem whilst using something else...

      Comment


      • Originally posted by Sonadow View Post
        Still don't dare... messed-up graphics drivers are a passport to an OS reinstall. Even in Windows, where uninstalling drivers is as simple as hitting the 'uninstall' button in Control Panel, I still prefer to format the entire OS instead of just uninstalling the drivers, so that I ALWAYS start fresh from a clean slate.

        The only drivers I dare to compile in Linux are WiFi drivers, and those are for very exceptional cases only.

        It is for this reason that I favor AMD's and NVIDIA's blobs: superior performance and easy installation; all I have to do is make sure kernel-source and kernel-headers are installed, and the blob installation always succeeds.



        I switch between GNOME Shell and KDE 4. Heavyweight DEs, but still my preferred ones for their functionality and features.



        I got nothing to say to that.
        Actually, in Windows you need to use a utility like Driver Sweeper to uninstall the junk that drivers leave in the registry and elsewhere. In other words, upgrading a graphics card and failing to completely remove the old drivers is a sure way to hell.

        Comment


          • So NVIDIA hasn't got it.

            So NVIDIA hasn't got it. Development of blobs for an open-source system IS going to be problematic, because they're not welcome; they are inherently against its open-source nature. After all, if NVIDIA doesn't care about everyone's convenience and prefers to keep its source and hardware information closed, then everyone is within their rights to do the very same to NVIDIA, and to pick licenses and usage rules these nuts can't work with without a pain in the rear. If they want to cooperate, they have to be cooperative. Attempting to dictate your own rules of the game != cooperation, period.

            Mr. Torvalds, please repeat. They're really slow in understanding.

          Comment


          • Originally posted by Sonadow View Post
            wake me up when Nouveau can deliver something like THIS on ULTRA settings on any 8-month-old game:
            "Those who would give up Essential Liberty to purchase some dumb stuff with cool effects, deserve neither Liberty nor stuff with cool effects." Almost Benjamin Franklin's quote.

            Comment


            • Originally posted by 0xBADCODE View Post
              "Those who would give up Essential Liberty to purchase some dumb stuff with cool effects, deserve neither Liberty nor stuff with cool effects." Almost Benjamin Franklin's quote.
              Better yet (link):



              Going to freedom costs an effort; it's a little sacrifice. Freedom demands some sacrifices. Sometimes it's a big sacrifice, but in our case it demands only small ones.
              So, one who does not want to make any sacrifice for freedom, even a small one, does not value freedom.
              A lot of people say: "I'll switch to free software when it costs me nothing; no effort, no inconveniences"... In saying that, they show that the value they give to freedom is 0. And those fools have no freedom.
              Nothing more to add.
              Last edited by vertexSymphony; 10-21-2012, 11:22 PM.

              Comment


              • Originally posted by D0pamine View Post
                Isn't Optimus just a crappy solution to the performance hit caused by compositing?
                Not even remotely.

                Optimus allows a computer to use both the integrated GPU and a discrete GPU in an efficient manner. This is most important for laptops, where you want the discrete GPU sometimes for the speed, features, and power, but you want the integrated GPU when you're just dicking around on the Web or using office apps to conserve on power and prolong battery life.

                The Optimus approach in Windows is a giant hack, yes. Optimus on Windows uses a registry of applications to determine, at the driver level, whether an app should be directed to the discrete GPU or not. That is a pain, since you need to keep an app database up to date and have to manually add override entries to the DB (which is a few clicks in a context menu, but still) for any apps you build yourself.

                This is necessary right now because the current Direct3D API (as well as the current OpenGL API, of course) lacks a smart way to select an "appropriate" device. You can ask for a default device for a display, or you can ask for a specific device out of the list of all available devices, but there's currently no way to say "hey OS, I'm a high-performance app, give me whichever device is fastest" or "hey OS, I'm only making light use of the GPU, give me the low-power device."

                There's also the issue that a lot of apps don't support having the GPU swapped out from under them, even though the Direct3D API makes supporting that about as easy as could be expected (compared to OpenGL, which completely lacks any direct means of supporting it, since the OpenGL API assumes your configured context will never be destroyed or invalidated unless the app explicitly requests it). So the driver can't dynamically switch to the high-end GPU after an app is started (e.g., by using heuristics to determine that the app is a game or content tool that is better off on the discrete GPU).

                OS X's implementation does things equally badly: the decision of which GPU to use is tied to which API is used to do the rendering. As soon as you use OpenGL directly (or one of the higher-level GL wrappers), OS X assumes the app needs high-speed graphics and switches to the discrete GPU. You're forced to use the proprietary rendering kits Apple provides to get good-looking, high-speed graphics that stick to the low-power GPU.

                Linux has an opportunity here to beat the other OSes to the punch and deliver on an actually sane API (as an EGL extension or a custom API in Wayland/GLX) that allows apps to indicate whether they'd prefer high-performance or low-power. Hell, it might even go so far as to add some OpenGL extensions to make having the context invalidated for a forced graphics switch to actually work. Which is better anyway for things like hibernate; much better to just wipe the context away and expect the app to reinitialize it after wakeup, especially for a game with potentially 1+GB of textures and buffers and whatnot for modern graphics/hardware.

                The next version of DirectX (unfortunately not the version in Win8, which is primarily targeted at tablets that don't have switchable graphics) will include some updates to DXGI to make OS-native switchable graphics more natural, and remove the reliance on hacks like Optimus (or worse, the physical switches that you still find on some AMD-based notebooks). Given Khronos' track record with doing anything correctly or in a timely manner, hell if anyone knows when OpenGL+EGL will gain a good native solution.

                Comment


                • Originally posted by XorEaxEax View Post
                  And these markets are getting even more important for discrete GPU vendors like NVidia as the end user desktop is moving away from discrete GPU's to GPGPU solutions. Also since NVidia can't get a licence to produce their own GPGPU solutions on the x86 architecture the x86-based user desktop is becoming a dead market for them as their days there are numbered.
                  GPGPU doesn't mean what you think it means.

                  Any somewhat recent discrete NVIDIA GPU _is_ a GPGPU solution. All "GPGPU" means is "general purpose GPU." In other words, using a GPU for non-graphics tasks, by supporting CUDA, DirectCompute, OpenCL, or OpenGL 4.3 Compute Shaders (or even using normal graphics pipeline shaders for non-graphics work, which of course has been done quite a bit and is what inspired the creation of the GPGPU-oriented APIs).

                  There are some theoretical advantages to having the GPU integrated with the CPU (or more accurately, the memory bus that is these days integrated in the CPU) since it can make it faster to transfer data to/from the GPU, but those advantages are minor compared to the massive speed improvements the GPU gets by having a dedicated high-speed GDDR memory controller, and those advantages also are diminished when you take into account that newer NVIDIA (and soon AMD, I'd suspect) GPUs can directly interface with the PCI-E bus and system memory controller.

                  Comment


                  • Originally posted by GreatEmerald View Post
                    Hmm, speaking of licensing, I wonder... Would it theoretically be possible to have a "half-closed" driver? Say, open-source bits that are yours, and leave a part of the code that isn't yours as closed-source.
                    Sure, but only so long as the closed part is entirely in userspace (if you want to keep the GPL defenders on lkml happy).

                    Also, you seemed to imply that it was Windows' fault that parts of the proprietary drivers are proprietary. That is not so. You are fully allowed to publish Free drivers for Windows, GPL'd even (keeping in mind that the GPL allows you to use non-Free APIs that are part of the core OS the GPL code is running on). More than one such driver exists, although they're very rare to see (and are rarely hardware drivers).

                    Almost any third-party code in their drivers is something voluntarily licensed, excepting maybe the support code for the following point.

                    The BIG item for the video drivers is the movie-decode and DRM stuff that they must support in order to sell products to most consumers. If they accidentally open up those interfaces in the wrong way (e.g., a way that allows the DRM to be bypassed), then they are going to get hosed by the folks they licensed the patents from and promised they'd keep the DRM locked tight. Since their hardware must support these DRM mechanics to be marketable in the larger consumer GPU market, and since those interfaces are apparently still partially protected by driver black magic and an assumption that nobody will figure out the hardware interface to the video decode units, they are somewhat forced to avoid documenting said interfaces or publishing open code for them.

                    It may just be beneficial for someone to reverse engineer those and (re)illustrate the folly of security through obscurity, and either force the hardware vendors to move all the DRM protection entirely into the hardware, or (by some miracle) finally get everyone to give up on DRM altogether (good freaking luck).

                    Comment


                    • Originally posted by elanthis View Post
                      It may just be beneficial for someone to reverse engineer those and (re)illustrate the folly of security through obscurity, and either force the hardware vendors to move all the DRM protection entirely into the hardware, or (by some miracle) finally get everyone to give up on DRM altogether (good freaking luck).
                      Heck, even if they had put it entirely in hardware in the first place, HDCP itself is completely cracked now (and IIRC even before that some keys had leaked to manufacturers of HDCP-stripping devices), which means that the media companies have to get everyone to upgrade their TVs again if they want another shot at it. I'm guessing that won't happen until about 2020.

                      Comment
