Valve Developed An Intel Linux Vulkan GPU Driver

  • Marc Driftmeyer
    replied
Take note, those who think Intel is some sort of great supporter of Linux: Valve has far fewer resources to leverage, and it still managed to do this work.



  • haagch
    replied
Obligatory comment that Optimus is only a problem because of Nvidia's closed-source driver.

With AMD + PRIME the problem has had a solution for some time now: you can set up in driconf which GPU a given binary should be rendered with:
    http://lists.freedesktop.org/archive...ay/060131.html
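For reference, the same per-binary GPU selection can be exercised directly from the command line with the `DRI_PRIME` environment variable, which the driconf per-application setting wraps (a sketch; output and provider count depend on your hardware):

```shell
# List the render providers Mesa/X can see (integrated + discrete)
xrandr --listproviders

# Render one binary on the discrete GPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Without the variable, the default (integrated) GPU renders
glxinfo | grep "OpenGL renderer"
```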



    Originally posted by Kemosabe View Post
    Ehm ... yeees?
It's written in large part by full-time employees of AMD and committed to Mesa from accounts with their company email addresses. If that's not official, what is?



  • mmstick
    replied
So much for the anti-Valve haters on here who would consistently claim that Valve never did anything good for FOSS and the FOSS community.



  • rikkinho
    replied
    ?

    Originally posted by Kano View Post
    @eydee

It would be better if you thought before you wrote. Bumblebee works with an on-demand second X server, as the Nvidia hardware is switched off with bbswitch while not in use. That's undetectable via glxinfo without a preloader. The officially supported variant, however, has the Nvidia driver active all the time and cannot save much energy while you aren't running games; vsync is currently not supported either. But the speed you can get is much higher than via Bumblebee. Btw, the xrandr code that Nvidia is using works not only with modesetting but also with Intel and even radeon as the output device. I used an HD 4550 and a GT 630 OEM in a Q67 Intel board to check.
With Ubuntu 12.04.5, 14.04, and 14.10 you can switch cards (some type of fixed mode), and with Intel selected the Nvidia card is off. Nvidia PRIME doesn't have sync with the monitor and is horrible to use. I have 2 Optimus laptops, 1 Intel/AMD, 1 with Intel, 1 Nvidia, 2 desktops (Nvidia and AMD), and a laptop with an APU, running Windows 7/8, Ubuntu, Manjaro, and Arch.



  • Kano
    replied
    @eydee

It would be better if you thought before you wrote. Bumblebee works with an on-demand second X server, as the Nvidia hardware is switched off with bbswitch while not in use. That's undetectable via glxinfo without a preloader. The officially supported variant, however, has the Nvidia driver active all the time and cannot save much energy while you aren't running games; vsync is currently not supported either. But the speed you can get is much higher than via Bumblebee. Btw, the xrandr code that Nvidia is using works not only with modesetting but also with Intel and even radeon as the output device. I used an HD 4550 and a GT 630 OEM in a Q67 Intel board to check.



  • rikkinho
    replied
    ??

    Originally posted by eydee View Post
We are talking about Steam. It won't even detect VRAM, not to mention Optimus. It actually has zero Optimus support; that's why people have to put Bumblebee (or whatever it is) commands into the launch options game by game. Even though Steam is patched up and ported to Linux, its heart is still written for Windows 95, and multi-GPU systems were only a dream back then.

What? Steam doesn't need to have Optimus support; Optimus is a problem of the drivers. You can launch Steam with Bumblebee or Nvidia PRIME and it will use the Nvidia card... the problem is with X, Nvidia, and Linux, not Valve.
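The per-game launch options mentioned above typically look like this on a Bumblebee setup (a sketch, assuming the standard optirun/primusrun wrappers are installed; `%command%` is Steam's placeholder for the game's own command line):

```shell
# In Steam: right-click a game -> Properties -> Launch Options
optirun %command%      # run this game through Bumblebee's classic wrapper
primusrun %command%    # same idea, via the faster primus backend

# Alternatively, start the whole Steam client on the discrete GPU,
# so every game inherits it without per-game options:
optirun steam
```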



  • eydee
    replied
    Originally posted by Kano View Post
It is very simple to detect whether Optimus is available. On Linux, glxinfo would not show anything from Intel if "xrandr --setprovideroutputsource" is used. Can't be hard to check on Windows.
We are talking about Steam. It won't even detect VRAM, not to mention Optimus. It actually has zero Optimus support; that's why people have to put Bumblebee (or whatever it is) commands into the launch options game by game. Even though Steam is patched up and ported to Linux, its heart is still written for Windows 95, and multi-GPU systems were only a dream back then.



  • gens
    replied
    Originally posted by log0 View Post
    Lol, Mantle, Vulkan and co are all about the CPU becoming the bottleneck, which is not really an issue for Intel CPU/GPUs yet.
If they share memory, then it is.



  • justmy2cents
    replied
    Originally posted by MartinN View Post
That, and it seems things are trending toward people not caring as much whether their game is 4K hi-res and pushing 60 fps, but rather whether the game has other addictive properties... perhaps multiplayer, challenging the mind/intellect more... rather than just the glitz factor.

    NVidia can choke on their binary blob for all I care...
Although I mostly agree with the gist of it, there is only one thing important about 4K@60fps: if anything with decent complexity can work that out, then all games will work at 4K@60fps, even those that are far from perfect. That simply means less hassle in settings, nothing else. 4K is important just as much as I am lazy.

By the time I think about buying a 4K TV, graphics will be pushing 8K or 16K or whatever the next standard is.



  • Kano
    replied
    Originally posted by eydee View Post
It probably counts Optimus systems as well. Only a small portion is actually being used for 3D stuff.
It is very simple to detect whether Optimus is available. On Linux, glxinfo would not show anything from Intel if "xrandr --setprovideroutputsource" is used. Can't be hard to check on Windows.
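The check described here can be sketched as follows (hedged: the provider names `modesetting` and `NVIDIA-0` are the typical ones for an Intel + Nvidia Optimus setup, but they vary; confirm yours with `xrandr --listproviders`):

```shell
# Make the Nvidia GPU render while the Intel GPU scans out the image
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto

# After that, glxinfo reports the Nvidia vendor/renderer rather than Intel,
# which is exactly the signal that Optimus offloading is active
glxinfo | grep -i "vendor\|renderer"
```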

