Intel Sandy/Ivy Bridge Gallium3D Driver Merged


#16

Originally posted by remm View Post
Looking at their reactions, they're really annoyed by this "toy" driver, and did all they could to shoot it down. We'll see where it goes from there, but certainly Intel looks more interested in their competitive advantage than benefiting the entire Linux ecosystem.

Which is why you shouldn't buy Intel hardware.


#17

Originally posted by przemoli View Post
The Gallium3D driver for Sandy/Ivy Bridge doesn't bring anything more than the classic driver, nor does it provide performance gains.

Distros will stick to the classic Intel driver.

GPU driver devs/enthusiasts will be able to play with ilo more easily. That's all it is.

It removes the cop-out that switching to Gallium3D would take too long to be worthwhile.


#18

Originally posted by Kayden View Post
I don't know where you heard that, but the classic i965 Mesa driver is entirely under the MIT license, the exact same license used by the rest of the project. Our driver contains no GPL code whatsoever.

Matthieu Herrb's "Future of X on non-Linux systems" talk says that there is a file "intel_pm.c" that's under a GPL license. I'm guessing that this is for power management and is sufficiently tied to the Linux kernel that it becomes bound by the GPL (as a derivative work)? If not, what is it, and why is it under the GPL?


#19

Originally posted by archibald View Post
Matthieu Herrb's "Future of X on non-Linux systems" talk says that there is a file "intel_pm.c" that's under a GPL license. I'm guessing that this is for power management and is sufficiently tied to the Linux kernel that it becomes bound by the GPL (as a derivative work)? If not, what is it, and why is it under the GPL?

That file is part of the kernel, not part of the Mesa driver, so it is out of context here. The relevant licence for intel_pm.c is:

Code:
/*
 * Copyright © 2012 Intel Corporation
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice (including the next
 * paragraph) shall be included in all copies or substantial portions of the
 * Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
 * THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
 * IN THE SOFTWARE.
 */
So it is dual-licensed: GPL for inclusion in the kernel and MIT/X otherwise. If you do come across a mistake like that, please assume it was an oversight rather than malice and bring it to our attention.
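
For anyone curious how that dual licensing is expressed on the kernel side: a module declares its licence with the MODULE_LICENSE() macro, and "Dual MIT/GPL" is one of the strings the kernel recognises (the i915 driver itself uses the similar string "GPL and additional rights"). Below is a minimal illustrative skeleton, not the real i915 source:

Code:
#include <linux/module.h>
#include <linux/init.h>

/* Illustrative skeleton only. A non-GPL-compatible licence string
 * would taint the kernel and hide GPL-only symbols from the module.
 */
static int __init example_init(void)
{
        return 0;
}

static void __exit example_exit(void)
{
}

module_init(example_init);
module_exit(example_exit);

MODULE_LICENSE("Dual MIT/GPL");
MODULE_DESCRIPTION("Sketch of a dual-licensed kernel module");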


#20

Originally posted by ickle View Post
That file is part of the kernel, not part of the Mesa driver, so it is out of context here. The relevant licence for intel_pm.c is:

<snip>

So it is dual-licensed: GPL for inclusion in the kernel and MIT/X otherwise. If you do come across a mistake like that, please assume it was an oversight rather than malice and bring it to our attention.

Thank you for the clarification. I didn't assume that there was any malice involved; sorry if my message came across differently.


#21

Originally posted by Kivada View Post
Intel's GPUs don't compete anyway; the silicon just doesn't have the performance. The idea of not doing it is moronic and comes from management that doesn't understand the basics of the technology: Intel is a HARDWARE company, and not making sure every aspect of your hardware does exactly what it is capable of doing, because of your bullshitting on drivers, is only detrimental to your sales.

I would disagree with this statement, Kivada. For high-end gaming, you're right. But with Sandy Bridge and newer... at least for me, new Intel CPUs with a built-in GPU will replace everything from Nvidia and ATI mid-range and down. Upper mid-range and up I will still go discrete, because I'm assuming those will be workstations or gaming machines, but mid-range and down? Just go Intel with the integrated graphics. It really is more than enough.

*I type this from a Sandy Bridge low-voltage (aka underclocked) ultrabook, and for the most part I couldn't be happier. I will be upgrading to Broadwell or whatever the next architecture after *well is called, and I'm very excited to see the performance difference.


#22

Originally posted by Ericg View Post
I would disagree with this statement, Kivada. For high-end gaming, you're right. But with Sandy Bridge and newer... at least for me, new Intel CPUs with a built-in GPU will replace everything from Nvidia and ATI mid-range and down. Upper mid-range and up I will still go discrete, because I'm assuming those will be workstations or gaming machines, but mid-range and down? Just go Intel with the integrated graphics. It really is more than enough.

*I type this from a Sandy Bridge low-voltage (aka underclocked) ultrabook, and for the most part I couldn't be happier. I will be upgrading to Broadwell or whatever the next architecture after *well is called, and I'm very excited to see the performance difference.

I take it you haven't used any of the AMD APU systems in the same price bracket, then; they are considerably better in the graphics department than Intel's GPUs. In any case, Intel, like AMD and Nvidia, is a HARDWARE company, and the drivers should be considered a necessary part of that hardware, because what use is a driver with no hardware to make use of it?

As for Nvidia, you are right, but that is because Nvidia pigeonholed themselves by having no x86 CPU and were dependent on AMD and Intel to allow them to make motherboard chipsets. Their one way out, their only chance of still existing 10 years from now, was going to the much more competitive ARM market, but they currently have little to show for it. They are barely holding on to their GPGPU market via their early success in getting as many devs as possible on the CUDA bandwagon, knowing that if they moved to OpenCL there would be very little incentive to buy only Nvidia hardware over whatever gets the highest performance per watt at the time of purchase.

Back in 2008, Nvidia should have either bought out VIA/S3 or fought to force Intel to grant them a license to make x86 hardware. By now they'd likely have an interesting product on the market.


#23

Originally posted by Kivada View Post
I take it you haven't used any of the AMD APU systems in the same price bracket, then; they are considerably better in the graphics department than Intel's GPUs. In any case, Intel, like AMD and Nvidia, is a HARDWARE company, and the drivers should be considered a necessary part of that hardware, because what use is a driver with no hardware to make use of it?

As for Nvidia, you are right, but that is because Nvidia pigeonholed themselves by having no x86 CPU and were dependent on AMD and Intel to allow them to make motherboard chipsets. Their one way out, their only chance of still existing 10 years from now, was going to the much more competitive ARM market, but they currently have little to show for it. They are barely holding on to their GPGPU market via their early success in getting as many devs as possible on the CUDA bandwagon, knowing that if they moved to OpenCL there would be very little incentive to buy only Nvidia hardware over whatever gets the highest performance per watt at the time of purchase.

Back in 2008, Nvidia should have either bought out VIA/S3 or fought to force Intel to grant them a license to make x86 hardware. By now they'd likely have an interesting product on the market.

I haven't USED them, no, but I was under the impression (from AMD's own marketing material) that power consumption for the APUs wasn't going to hit Intel's levels until the next architecture revamp next year. Plus there's the issue of the very... lackluster open source drivers, and the closed source driver is slow to support new X releases (it's still only at 1.13 right now), which is an issue for me as an Arch user.

Meanwhile, Intel has good open source drivers on Linux and not bad closed source drivers on Windows, with all of the important laptop features covered in both drivers and immediate support for new X / kernel releases.


#24

Originally posted by Kivada View Post
I take it you haven't used any of the AMD APU systems in the same price bracket, then; they are considerably better in the graphics department than Intel's GPUs. In any case, Intel, like AMD and Nvidia, is a HARDWARE company, and the drivers should be considered a necessary part of that hardware, because what use is a driver with no hardware to make use of it?

As for Nvidia, you are right, but that is because Nvidia pigeonholed themselves by having no x86 CPU and were dependent on AMD and Intel to allow them to make motherboard chipsets. Their one way out, their only chance of still existing 10 years from now, was going to the much more competitive ARM market, but they currently have little to show for it. They are barely holding on to their GPGPU market via their early success in getting as many devs as possible on the CUDA bandwagon, knowing that if they moved to OpenCL there would be very little incentive to buy only Nvidia hardware over whatever gets the highest performance per watt at the time of purchase.

Back in 2008, Nvidia should have either bought out VIA/S3 or fought to force Intel to grant them a license to make x86 hardware. By now they'd likely have an interesting product on the market.

I always thought it would be wise for Nvidia to buy out Transmeta. Of course that didn't happen, but it would have given Nvidia a chance to get into the x86 market.


#25

Originally posted by Ericg View Post
I would disagree with this statement, Kivada. For high-end gaming, you're right. But with Sandy Bridge and newer... at least for me, new Intel CPUs with a built-in GPU will replace everything from Nvidia and ATI mid-range and down. Upper mid-range and up I will still go discrete, because I'm assuming those will be workstations or gaming machines, but mid-range and down? Just go Intel with the integrated graphics. It really is more than enough.

*I type this from a Sandy Bridge low-voltage (aka underclocked) ultrabook, and for the most part I couldn't be happier. I will be upgrading to Broadwell or whatever the next architecture after *well is called, and I'm very excited to see the performance difference.

Performance has improved greatly, to the point where the GT2 graphics on Haswell can comfortably compete with mid-range dedicated hardware, but you are still constrained by system memory. And last I checked, DDR3 is still much slower than GDDR5. Plus you do leech off system RAM when using onboard graphics; for some people that's a no-no if you need every last bit of memory available in the system.
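
To put rough numbers on that gap, here is a back-of-the-envelope sketch; the GDDR5 figures are assumed values for a typical mid-range card of the era, not measurements:

Code:
#include <stdio.h>

/* Peak theoretical bandwidth: channels x bus width (bytes)
 * x effective transfer rate (GT/s) = GB/s.
 */
static double peak_gb_per_s(int channels, int bus_bytes, double gt_per_s)
{
        return channels * bus_bytes * gt_per_s;
}

int main(void)
{
        /* Dual-channel DDR3-1600: 2 x 8 bytes x 1.6 GT/s = 25.6 GB/s */
        printf("Dual-channel DDR3-1600: %.1f GB/s\n",
               peak_gb_per_s(2, 8, 1.6));

        /* Assumed mid-range GDDR5 card: 128-bit bus (16 bytes)
         * at 5 GT/s effective = 80 GB/s, roughly three times
         * the system RAM figure above.
         */
        printf("128-bit GDDR5 @ 5 GT/s: %.1f GB/s\n",
               peak_gb_per_s(1, 16, 5.0));
        return 0;
}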

Also, there has been one very annoying issue with Intel hardware: when compared side by side, a machine using Intel's onboard graphics always seems to have a very blurred display versus the AMD and Nvidia machines, where the display appears sharp and clear. I have no idea why this is the case, though.


#26

Originally posted by Ericg View Post
I haven't USED them, no, but I was under the impression (from AMD's own marketing material) that power consumption for the APUs wasn't going to hit Intel's levels until the next architecture revamp next year. Plus there's the issue of the very... lackluster open source drivers, and the closed source driver is slow to support new X releases (it's still only at 1.13 right now), which is an issue for me as an Arch user.

Meanwhile, Intel has good open source drivers on Linux and not bad closed source drivers on Windows, with all of the important laptop features covered in both drivers and immediate support for new X / kernel releases.

I take it you haven't read any of the reviews of the hardware or of the current state of the AMD Gallium3D drivers. They are now at a point where they are better than Intel's drivers; go read them, currently on the front page of this very site.

Combine that with the fact that the Intel HD Graphics 4000 couldn't even match the performance of the first-generation APUs' Radeon HD 6550D, let alone keep up with the Radeon HD 7660D in the current models.

So yeah, keep saying Haswell will somehow be made out of magic ZOMG PWNIE farts when it hasn't even come to market, and the "reports" are all internal benchmarks where they are likely making all kinds of mods to the games to make them appear to run faster, either disabling high-quality settings for the Haswell or running a version of the game that is very poorly optimized on the non-Intel GPUs.

Why assume this? Because Intel did this for years with their compiler, disabling SSE extensions for any non-Intel CPU, which is why they were investigated for anticompetitive practices. You can see this in many Windows benchmarks: take CPUs from both companies that perform almost identically on Linux, then run Windows and something like SuperPI; with SSE disabled on the AMD parts, the Intel chip appears to be many times faster.
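
What's being described there is vendor-based CPU dispatching. A simplified sketch of the pattern (illustrative only: the compute_fast/compute_generic helpers are made-up names, and this is not Intel's actual dispatcher code):

Code:
#include <cpuid.h>   /* GCC/Clang x86 CPUID wrapper */
#include <stdbool.h>
#include <string.h>

static bool is_genuine_intel(void)
{
        unsigned eax, ebx, ecx, edx;
        char vendor[13];

        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
                return false;
        /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX. */
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
        vendor[12] = '\0';
        return strcmp(vendor, "GenuineIntel") == 0;
}

static void compute_fast(float *dst, const float *src, int n)
{
        /* Stand-in for a vectorized (SSE) code path. */
        for (int i = 0; i < n; i++)
                dst[i] = src[i] * 2.0f;
}

static void compute_generic(float *dst, const float *src, int n)
{
        /* Stand-in for the slow scalar fallback. */
        for (int i = 0; i < n; i++)
                dst[i] = src[i] * 2.0f;
}

/* Dispatch on the vendor string rather than on feature bits:
 * a non-Intel CPU gets the slow path even if it supports SSE.
 */
void compute(float *dst, const float *src, int n)
{
        if (is_genuine_intel())
                compute_fast(dst, src, n);
        else
                compute_generic(dst, src, n);
}

A fair dispatcher would test the SSE feature bits from CPUID leaf 1 instead of the vendor string.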

That reason alone is good enough to never recommend Intel hardware.


#27

Originally posted by Sonadow View Post
Performance has improved greatly, to the point where the GT2 graphics on Haswell can comfortably compete with mid-range dedicated hardware, but you are still constrained by system memory. And last I checked, DDR3 is still much slower than GDDR5. Plus you do leech off system RAM when using onboard graphics; for some people that's a no-no if you need every last bit of memory available in the system.

For AMD's APUs I remember seeing that GPU performance scaled linearly with system RAM speed; if you can overclock the system RAM to the limits of either the RAM or the memory controller, you will get much better results. Currently the fastest factory-certified DDR3 system RAM I could find clocks in at 2.8GHz, though RAM like that does indeed break the bank. I'd rather get something much cheaper in the 2.133GHz or 2.4GHz range, with a voltage no higher than 1.5V, at less than a third of the price, and overclock it myself.
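
As a rough sketch of what that linear scaling means for peak bandwidth (assuming a dual-channel DDR3 setup; real-world gains also depend on the memory controller and timings):

Code:
#include <stdio.h>

int main(void)
{
        /* Dual-channel DDR3 peak bandwidth:
         * 2 channels x 8 bytes x effective rate (GT/s).
         */
        const double rates[] = { 1.6, 1.866, 2.133, 2.4, 2.8 };
        for (unsigned i = 0; i < sizeof rates / sizeof rates[0]; i++)
                printf("DDR3-%.0f: %.1f GB/s peak\n",
                       rates[i] * 1000.0, 2 * 8 * rates[i]);
        return 0;
}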

It allows you to make quite a powerful ITX-based HTPC system if you use an ASRock FM2A85X-ITX and an A10-5800K with some 2.4GHz RAM and a WinTV-HVR-2250, since it will actually be able to handle most of the games on Desura, Steam and Gameolith at max settings.
Last edited by Kivada; 04-30-2013, 05:49 AM.


#28

Originally posted by Kivada View Post
For AMD's APUs I remember seeing that GPU performance scaled linearly with system RAM speed; if you can overclock the system RAM to the limits of either the RAM or the memory controller, you will get much better results. Currently the fastest factory-certified DDR3 system RAM I could find clocks in at 2.8GHz, though RAM like that does indeed break the bank. I'd rather get something much cheaper in the 2.133GHz or 2.4GHz range, with a voltage no higher than 1.5V, at less than a third of the price, and overclock it myself.

It allows you to make quite a powerful ITX-based HTPC system if you use an ASRock FM2A85X-ITX and an A10-5800K with some 2.4GHz RAM and a WinTV-HVR-2250, since it will actually be able to handle most of the games on Desura, Steam and Gameolith at max settings.

Ohh YEAH! Now what we need is Steam Big Picture mode and XBMC integration.
