Wayland's Weston Gets Output Scaling Support


  • Wayland's Weston Gets Output Scaling Support

    Phoronix: Wayland's Weston Gets Output Scaling Support

    Besides a new Raspberry Pi renderer for Weston, another interesting set of Wayland patches today is for providing output scaling support with Weston when using the X11 and DRM back-ends...

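    For context, the per-output scaling factor ends up being user-configurable in Weston. A minimal weston.ini sketch, assuming the patches expose it as an [output] key the way later Weston releases do, and using a hypothetical DRM connector name, would look like this:

    Code:
    # weston.ini sketch -- illustrative only
    # "HDMI-A-1" is a hypothetical connector name for the DRM back-end;
    # adjust it to match your actual output
    [output]
    name=HDMI-A-1
    scale=2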

  • #2
    Sooooo... a game that is set to run at 800x600 could still take up the full screen instead of just the ACTUAL 800x600 pixels?
    All opinions are my own, not those of my employer, if you know who they are.



    • #3
      Originally posted by Ericg
      Sooooo... a game that is set to run at 800x600 could still take up the full screen instead of just the ACTUAL 800x600 pixels?
      The actual use case is scaling windows for very high-DPI displays like Apple's Retina panels.



      • #4
        Originally posted by Ericg
        Sooooo... a game that is set to run at 800x600 could still take up the full screen instead of just the ACTUAL 800x600 pixels?
        No, that has been possible for a long, long time already.

        This is mostly about windowed applications on high-density screens, where the window would be unreadably small (not only text, but buttons, lines, all kinds of graphics) without scaling. Besides letting the user configure a default scaling factor for everything on a screen, it also allows an application to render at high density itself and bypass the default scaling, to produce finer graphics.

        So in essence, applications that do not care about high density will be scaled up according to user preference and remain readable and usable, while applications that do handle high density get the chance to produce finer graphics.

        This is not about DPI, font sizes, or keeping window sizes equal in meters on monitors with different DPI. This is simply about an integer scaling factor that will make the difference between usable and unusable for a user.
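
        To make that concrete, here is a minimal client-side sketch of how such a scale factor is typically consumed, assuming a wl_output interface (version 2+) that delivers a scale event and a wl_surface interface (version 3+) that offers set_buffer_scale. It is an illustration of the idea, not code from the patch series itself:

        Code:
        #include <stdint.h>
        #include <wayland-client.h>

        /* Scale factor advertised by the compositor for this output
         * (for example 2 on a high-density panel); defaults to 1. */
        static int32_t output_scale = 1;

        static void handle_geometry(void *data, struct wl_output *output,
                                    int32_t x, int32_t y,
                                    int32_t phys_w, int32_t phys_h, int32_t subpixel,
                                    const char *make, const char *model,
                                    int32_t transform) { }

        static void handle_mode(void *data, struct wl_output *output, uint32_t flags,
                                int32_t width, int32_t height, int32_t refresh) { }

        static void handle_done(void *data, struct wl_output *output) { }

        static void handle_scale(void *data, struct wl_output *output, int32_t factor)
        {
            output_scale = factor;   /* remember the compositor's scaling factor */
        }

        static const struct wl_output_listener output_listener = {
            .geometry = handle_geometry,
            .mode     = handle_mode,
            .done     = handle_done,
            .scale    = handle_scale,
        };

        /* Start listening for the output's scale (and other) events. */
        void watch_output(struct wl_output *output)
        {
            wl_output_add_listener(output, &output_listener, NULL);
        }

        /* A scale-aware client renders its buffer at (width * output_scale) x
         * (height * output_scale) pixels and then declares the buffer as
         * already scaled, so the compositor does not blow it up again.
         * A naive client simply skips set_buffer_scale and gets scaled up. */
        void attach_hidpi_buffer(struct wl_surface *surface, struct wl_buffer *buffer)
        {
            wl_surface_set_buffer_scale(surface, output_scale);
            wl_surface_attach(surface, buffer, 0, 0);
            wl_surface_commit(surface);
        }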



        • #5
          Which is still good; I'd love to see high-DPI displays become common on desktops. They're already there on mobile, tablets, Macs, and ChromeOS (the Pixel). Windows is holding us back for desktops and general laptops, though. Hopefully, with everything on Linux supporting high DPI and the general push towards Linux and Mac after the failure of Windows 8, we can finally start to see high-DPI non-Apple PC displays become standard.



          • #6
            Originally posted by Vash63
            Windows is holding us back for desktops and general laptops though.
            Huh? Surely what's holding us back is that there aren't any high-DPI desktop displays actually available to buy, nor is there yet a cabling standard capable of transmitting (for example) 4K images.



            • #7
              Originally posted by Wingfeather
              Huh? Surely what's holding us back is that there aren't any high-DPI desktop displays actually available to buy, nor is there yet a cabling standard capable of transmitting (for example) 4K images.


              DisplayPort has been around since 2006. I've had monitors with it personally for 3 years now. No, the reason Windows is holding us back is that no display manufacturer is willing to make a panel for the PC market that Windows does not support, and Windows on a high-DPI panel is a horrible experience. Especially WinXP, which still has considerable market share and until recently was still #1.

              Or do you really think Apple has some secret ability to make high-DPI panels for MacBook Retinas that the entire monitor industry doesn't? Apple doesn't even manufacture their own panels. Clearly, it's the OS that's holding us back.



              • #8
                Originally posted by Vash63
                Or do you really think Apple has some secret ability...
                No no, of course not. As you say, Apple don't make the panels (although they do buy up most of the production capacity for many of their components, so the rest of the industry is practically locked out).

                But we're talking about different things. Laptop-sized panels obviously exist, but to my knowledge desktop-sized panels with these sorts of resolutions do not (except for the prototypes that get shown at CES every year but never seem to amount to anything). Even if they do exist, there certainly aren't any complete displays actually available to buy. You might be right that all this is due to a lack of support from Microsoft, but that is conjecture and it isn't obvious that this is the reason. I personally think it is much more likely that the extremely high cost of these displays would make them a non-starter. Ignoring Windows entirely, I cannot imagine there wouldn't be demand from Mac owners if these were available for anything resembling reasonable money.

                I stand corrected about DisplayPort; I didn't know that v1.2 (which has not been around since 2006) had so much bandwidth. Very cool.
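
                (For a rough sense of scale: an uncompressed 3840x2160 stream at 60 Hz and 24 bits per pixel needs about 3840 × 2160 × 60 × 24 ≈ 12 Gbit/s of pixel data before blanking overhead, which fits within DisplayPort 1.2's roughly 17.3 Gbit/s of effective bandwidth, so a single cable can indeed carry a 4K desktop.)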



                • #9
                  Regardless of the specific spec, they could always use multiple cables, like some 3D displays do with two dual-link DVI-D connections making four links. I really do believe that if the market were there for high-resolution displays, it could have been done before now. I find it hard to believe we have really been stuck at around 100 PPI for so many years without progress.

                  And I'd be willing to bet money that the first Retina-class desktop display also comes from Apple, not because of technical ability but because their entire market has an OS and app platform that can handle it properly.



                  • #10
                    Originally posted by pq__
                    Originally posted by Ericg
                    Sooooo... a game that is set to run at 800x600 could still take up the full screen instead of just the ACTUAL 800x600 pixels?
                    No, that has been possible for a long, long time already.
                    I've never understood the point of Wayland, but if it lets us do that, then YES it's suddenly interesting.
