Mir Display Server Support Lands In SDL2

  • dee.
    replied
    Originally posted by Awesomeness View Post
    server-allocated buffers which Canonical employee Christopher Halse Rogers claims to be a requirement for "the ARM world and Android graphics stack"
    https://en.wikipedia.org/wiki/Mir_%2...e_architecture
    They're going to server-allocate the shit out of those buffers. They'll server-allocate them so hard.



  • Annabel
    replied
    Originally posted by mrugiero View Post
    Wow, solid arguments here.
    yes, it was

    Originally posted by mrugiero View Post
    What would be funny about it?
    everything



  • mrugiero
    replied
    Originally posted by Annabel View Post
    Just a reminder (with a liiiiiitle bit of trolling) that GPLv3 + CLA is better than the MIT license.
    Wow, solid arguments here.

[edit:] It would be funny to see Mir implementing the Wayland protocol
    What would be funny about it?



  • Annabel
    replied
    Just a reminder (with a liiiiiitle bit of trolling) that GPLv3 + CLA is better than the MIT license.
    Thank you for reading this. With love, Anna

[edit:] It would be funny to see Mir implementing the Wayland protocol



  • sarmad
    replied
    Originally posted by Awesomeness View Post
    server-allocated buffers which Canonical employee Christopher Halse Rogers claims to be a requirement for "the ARM world and Android graphics stack"
    https://en.wikipedia.org/wiki/Mir_%2...e_architecture
    I believe Jolla has already proven Canonical wrong by running Wayland on top of ARM, unless Canonical is willing to eventually run Mir on top of Android itself.



  • mrugiero
    replied
    Originally posted by Awesomeness View Post
    server-allocated buffers which Canonical employee Christopher Halse Rogers claims to be a requirement for "the ARM world and Android graphics stack"
    https://en.wikipedia.org/wiki/Mir_%2...e_architecture
    With Wayland, deciding where the allocation happens is up to the compositor, not the protocol. You can just as well do server-side allocation in Wayland, if you want to and your compositor supports it.
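
    To illustrate the distinction, here is a minimal sketch (plain Linux C, no actual libwayland calls — the wl_shm steps are only noted in comments) of the client-side allocation pattern that Wayland's core wl_shm interface is built on: the client creates and maps its own buffer memory, and only hands the compositor a file descriptor to it.

    ```c
    /* Sketch of client-side buffer allocation as used by Wayland's wl_shm:
     * the CLIENT allocates and maps a shared-memory pool; the compositor
     * later maps the same fd. No Wayland calls are made here. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <stdint.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void) {
        const int width = 64, height = 64, stride = width * 4;
        const size_t size = (size_t)stride * height;

        /* The client allocates the backing storage itself. */
        int fd = memfd_create("wl_shm-sketch", 0);
        if (fd < 0 || ftruncate(fd, (off_t)size) < 0) {
            perror("allocate");
            return 1;
        }
        uint32_t *pixels = mmap(NULL, size, PROT_READ | PROT_WRITE,
                                MAP_SHARED, fd, 0);
        if (pixels == MAP_FAILED) {
            perror("mmap");
            return 1;
        }

        /* The client renders into its own buffer (solid ARGB fill here). */
        for (size_t i = 0; i < size / 4; i++)
            pixels[i] = 0xff336699;

        /* In a real Wayland client, `fd` would now be handed over:
         *   wl_shm_create_pool(shm, fd, size);
         *   wl_shm_pool_create_buffer(pool, 0, width, height, stride,
         *                             WL_SHM_FORMAT_ARGB8888);
         * The allocation itself stayed on the client side throughout. */
        printf("client-allocated %zu-byte buffer at %dx%d\n",
               size, width, height);

        munmap(pixels, size);
        close(fd);
        return 0;
    }
    ```

    A server-allocating compositor would instead create this pool itself and pass buffers out to clients — which is the design choice the quoted Mir argument is about; the protocol-level mechanics (pass an fd, agree on stride and format) look much the same either way.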



  • nanonyme
    replied
    Originally posted by Awesomeness View Post
    server-allocated buffers which Canonical employee Christopher Halse Rogers claims to be a requirement for "the ARM world and Android graphics stack"
    https://en.wikipedia.org/wiki/Mir_%2...e_architecture
    Have I badly misunderstood or does this mean Mir will not support DRI3 ever? I recall having read that it was designed with client-allocated buffers in mind.



  • Awesomeness
    replied
    Originally posted by Pajn View Post
    He's just trolling. Hoping to start a flamewar.

    Everybody knows Wayland is a protocol and Mir a display server.
    Still, Mir requires signing a CLA before committing while Wayland does not. So how is that trolling on his part?



  • Awesomeness
    replied
    Originally posted by sarmad View Post
    Now the question is: does Mir provide any technical superiority over Wayland? If it doesn't then what's the point behind it?
    server-allocated buffers which Canonical employee Christopher Halse Rogers claims to be a requirement for "the ARM world and Android graphics stack"



  • Awesomeness
    replied
    Originally posted by bregma View Post
    So, you decided with intent that the Mir implementation copied the Wayland implementation, and not the other way around.
    Wayland support was committed long ago, so unless somebody invented time travel there is only one conclusion.
    That and the copyright header?

