Valve Developed An Intel Linux Vulkan GPU Driver


  • #51
    Originally posted by log0
    Yeah, they've been hacking on the Intel driver for quite some time, even before the optimization work.

    Although, Vulkan on Intel seems kinda pointless; it will be GPU-limited anyway.
    I think you're misunderstanding the point then.

    Vulkan represents a new way to design hardware architectures. It looks an awful lot like the front end of a GPU. All it requires is either coding around existing hardware or designing new hardware for it. It presents a future where whoever can stuff the most capacity into the smallest die wins. That's exactly the kind of future Intel wants.



    • #52
      Originally posted by rikkinho
      By your conclusion all APUs are pointless, but Intel iGPUs represent a much bigger market than AMD and NVIDIA together.
      That is what you say...

      My point is that Intel GPUs are pretty weak compared to their CPUs, so the performance increase from Vulkan will probably not be very noticeable.



      • #53
        Originally posted by Kano
        Well, Intel GPUs are heavily used; most common are Ivy Bridge (HD 4000, HD 2500) and Haswell (HD 4400). They are used more than the first integrated AMD (HD 8800 series), which is interesting as they are slower on Windows.

        Most likely Vulkan can speed things up on Windows as well with the right driver. Well, maybe AMD should think about this:

        http://store.steampowered.com/hwsurvey/processormfg/
        It probably counts Optimus systems as well. Only a small portion of them is actually used for 3D stuff.



        • #54
          Originally posted by duby229
          I think you're misunderstanding the point then.

          Vulkan represents a new way to design hardware architectures. It looks an awful lot like the front end of a GPU. All it requires is either coding around existing hardware or designing new hardware for it. It presents a future where whoever can stuff the most capacity into the smallest die wins. That's exactly the kind of future Intel wants.
          Lol, Mantle, Vulkan and co are all about the CPU becoming the bottleneck, which is not really an issue for Intel CPU/GPUs yet.

          If we had more powerful CPUs, nobody would give a damn.



          • #55
            Originally posted by log0
            Lol, Mantle, Vulkan and co are all about the CPU becoming the bottleneck, which is not really an issue for Intel CPU/GPUs yet.

            If we had more powerful CPUs, nobody would give a damn.
            I disagree.

            The point is that future hardware designs can be made simpler. The old paradigm, where whoever designs the most advanced hardware optimizations wins, is gone. The new paradigm, where whoever can stuff the most capacity into the smallest die wins, is about to begin.



            • #56
              The man has his head screwed on straight and that's that.

              Originally posted by d2kx
              ༼ つ ◕_◕ ༽つ Praise Lord GabeN ༼ つ ◕_◕ ༽つ



              • #57
                Originally posted by Kemosabe
                Perhaps you're right that the Gallium way is not adequate for Vulkan, but I think they chose Intel for a much simpler reason:
                it's the only official open-source driver out there...
                That, and it seems things are trending toward people not giving as much crap about whether their game is 4K hi-res pushing 60 FPS or whatever, but rather whether the game has other addictive properties... perhaps multiplayer, challenging the mind/intellect more... rather than just the glitz factor.

                NVidia can choke on their binary blob for all I care...



                • #58
                  Originally posted by haagch
                  radeon/radeonsi is not "official"?
                  Ehm ... yeees?



                  • #59
                    Originally posted by eydee
                    It probably counts Optimus systems as well. Only a small portion of them is actually used for 3D stuff.
                    It is very simple to detect whether Optimus is in use. On Linux, glxinfo would not show anything from Intel if "xrandr --setprovideroutputsource" is used. It can't be hard to check on Windows either.
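
                    For example, a rough sketch of the Linux-side check (the provider names "modesetting" and "NVIDIA-0" below are assumptions for this sketch; the real names vary per system and should be listed first):

                    # List the render/output providers to find the actual names on this machine.
                    xrandr --listproviders

                    # Assumed PRIME setup: the Intel iGPU scans out frames rendered by the discrete GPU.
                    xrandr --setprovideroutputsource modesetting NVIDIA-0

                    # If offloading is active, the renderer string no longer mentions Intel.
                    glxinfo | grep "OpenGL renderer"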



                    • #60
                      Originally posted by MartinN
                      That, and it seems things are trending toward people not giving as much crap about whether their game is 4K hi-res pushing 60 FPS or whatever, but rather whether the game has other addictive properties... perhaps multiplayer, challenging the mind/intellect more... rather than just the glitz factor.

                      NVidia can choke on their binary blob for all I care...
                      Although I mostly agree with the gist of it, there is only one thing important about 4K@60Hz: if anything with decent complexity can handle that, then all games will work at 1080p@60Hz, even those that are far from perfect. That simply means less hassle in settings, nothing else. 4K matters only insofar as I am lazy.

                      By the time I think about buying a 4K TV, graphics will be pushing 8K or 16K or whatever the next standard is.

