Linux Hybrid Graphics Will Be A Mess For A While

  • Kivada
    replied
    Originally posted by Sidicas:
    I never liked the idea of hybrid graphics to begin with...
    If they did a better job designing the chip, the laptop wouldn't need to use a completely different chip to get better battery life...

    There's no reason a graphics chip shouldn't be able to power down cores, just the same as an Intel CPU...
    Really? I just hate that AMD dropped XGP ( http://www.amd.com/us/products/techn...s/ati-xgp.aspx ), which would have been awesome for those who want just a laptop, with the option of high-end multi-GPU performance when at home.

    That, and in the pre-APU days I hoped for OpenCL on the IGP plus sideport RAM, with a decent dedicated GPU for everything else. Now I want to do the same with an A8-series APU and at least a 6670.



  • marwi509
    replied
    I think Bumblebee currently does the job all right; give it a bit more time and it might work really well.



  • alazar
    replied
    I think the Bumblebee Project (not just Bumblebee) is more mature than Ironhide; at least it has a community behind it and isn't maintained by just one person.

    I'm now running an Optimus laptop with Bumblebee installed, and it works.

    @grigi: by default the Intel GPU is used and the nvidia driver (or nouveau) is not loaded. But the card still draws power, so you need to turn it off manually or with an application that makes the right ACPI calls (see the sketch below).
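
    A minimal sketch of doing that power-off by hand, via the kernel's vgaswitcheroo interface in debugfs (driven from Python here). This assumes a kernel built with CONFIG_VGA_SWITCHEROO, a laptop the interface supports, debugfs mounted, and root privileges; the exact status strings vary by machine:

        # Power down the inactive (discrete) GPU via vgaswitcheroo.
        SWITCH = "/sys/kernel/debug/vgaswitcheroo/switch"

        # Show what the kernel sees, e.g. "0:IGD:+:Pwr:..." / "1:DIS: :Pwr:...".
        with open(SWITCH) as f:
            print(f.read())

        # Writing "OFF" powers down whichever card is not driving the display.
        with open(SWITCH, "w") as f:
            f.write("OFF")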



  • grigi
    replied
    I want to know about Optimus.

    I don't want/need the GeForce graphics. What happens when I run Linux on an Optimus-enabled laptop?
    Does the normal Intel graphics driver work?
    Is there any way of disabling the NVIDIA GPU, or is it disabled by default?

    Thanks in advance :-)



  • nzjrs
    replied
    Originally posted by airlied:
    It's funny that Michael never reports the company I work for in Ubuntu articles :-), but other work I do gets mentioned as "Red Hat does something", when really Red Hat is in no way "doing" it. It just happens to be a project I want to do, and RH makes sure I can do it.
    Dave.
    Perhaps RH should offer a free-form survey for users to bitch about graphics on Linux.

    I'm sure Michael would report that with glee.

    John

    p.s. Thanks for all your work.



  • grotgrot
    replied
    Originally posted by alexhung:
    There is such a function. AMD/ATI calls it PowerPlay.
    And Nvidia calls theirs PowerMizer. It is still inherently very difficult to design and manufacture something that can scale down really low while also powering up very high. For example, look at what ARM did with their new processor release: even though they have been working on power and performance for years, they too use two different cores rather than managing to get one core to cover the whole range. Graph at the bottom of http://arstechnica.com/gadgets/news/...uperphones.ars



  • alexhung
    replied
    Originally posted by Sidicas:
    There's no reason a graphics chip shouldn't be able to power down cores, just the same as an Intel CPU...
    There is such a function. AMD/ATI calls it PowerPlay.



  • grotgrot
    replied
    Originally posted by Sidicas:
    I never liked the idea of hybrid graphics to begin with...
    In theory it would be possible to design something that runs at very low power/performance and ramps up to much higher performance/power, but that is about as difficult as designing one vehicle that is suited both to bringing your shopping home and to hauling logs. It just doesn't make economic sense.

    Computers and phones are already full of chips with overlapping functionality and capabilities. Rather than try to find one perfect solution for each needed area of functionality that can scale the whole spectrum, it is far better to adapt to the workload and other constraints, and you'll repeatedly see this done. Processing jobs can use the CPU or GPU. Storage can use the CPU or embedded controllers. Networking can run on the card or the CPU.

    The operating systems that can do this kind of adaptation will be the ones that win. (Incidentally, this is how mainframes have operated for decades.)
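
    A small illustration of that CPU-or-GPU choice, sketched with pyopencl; the GPU-first preference and the fallback are my own assumptions, and which devices actually show up depends on the installed OpenCL drivers:

        # Pick an OpenCL device for a compute job: prefer a GPU, else fall back.
        import pyopencl as cl

        def pick_device(prefer=cl.device_type.GPU):
            for platform in cl.get_platforms():
                try:
                    devices = platform.get_devices(device_type=prefer)
                except cl.LogicError:  # this platform has no device of that type
                    devices = []
                if devices:
                    return devices[0]
            # No GPU exposed: take whatever the first platform offers (e.g. the CPU).
            return cl.get_platforms()[0].get_devices()[0]

        dev = pick_device()
        print(dev.name, cl.device_type.to_string(dev.type))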



  • Sidicas
    replied
    I never liked the idea of hybrid graphics to begin with...
    If they did a better job designing the chip, the laptop wouldn't need to use a completely different chip to get better battery life...

    There's no reason a graphics chip shouldn't be able to power down cores, just the same as an Intel CPU...
    Last edited by Sidicas; 31 October 2011, 04:07 PM.



  • 89c51
    replied
    Originally posted by airlied:
    Some of it is: the buffer sharing and the kernel drivers for USB devices.

    The X server stuff not so much, since it's in the X server. However, any solution replacing X in a desktop environment will have to consider the same issues, which I'm not sure anyone has done so far.

    Dave.
    Thanks.

    I seem to recall that the Wayland people mentioned something about multi-GPU and similar stuff on the mailing list, and that there have been discussions with the graphics devs, but that was a looong time ago.

