
Our Linux Graphics Survey Is Off With A Bang


  • Our Linux Graphics Survey Is Off With A Bang

    Phoronix: Our Linux Graphics Survey Is Off With A Bang

    Merely eight hours ago we launched our 2009 Linux Graphics Survey to collect some data about the popular graphics drivers and hardware being used by our Linux readers along with other metrics such as the common ways one goes about installing their driver, what X server is being used, etc. The results of this survey are interesting in their own right, but they also help developers better understand what their users are most interested in with regard to the Linux graphics stack / X.Org and provide other statistics...

    http://www.phoronix.com/vr.php?view=NzY1OA

  • #2
    @phoronix: you really want to make some of the widgets radio buttons.

    E.g. in the first question about "most": it doesn't make a lot of sense that the user can select more than one option.

    That makes the results worthless.



    • #3
      I wouldn't describe my usage of GNU/Linux as "Mainstream", since I don't run GNOME or KDE. Fluxbox, gnome-terminal, and firefox are what I usually have running. And emacs, git-gui, etc. if I'm hacking.

      If there was a "Hacker" or "old-school command-line curmudgeon" option, I'd select it, but there's not, and "Professional 2D" is the closest option. It's not that I'm doing photo editing (which would be one thing that definitely falls under pro 2D); it's just that I know how Linux works under the hood, and most of my input is via kbd, not mouse. Most of the programs I use are at most 2D. Or even 1D, since cmdline progs only have terminal input and output, which is a single stream, thus one dimension.

      I play games, but I'm more interested in having a video card that idles at low power while still being able to push some serious pixels. (I recently bought a 4670 for this reason.) So I'm reluctant to check off enthusiast gamer. I don't actually play games most nights. But I checked that option for my desktop system since I do play some serious games when I do get around to it, occasionally WoW, or Nexuiz, or Vegastrike, not just minesweeper.

      Definitely feeling like a square peg trying to fit into your survey's round holes.

      The other questions were mostly ok, although ranking options in order of importance would be nice if there were an HTML field type that supported that... #1 concern is always stability. Unix systems shouldn't crash; anything that causes system crashes makes it really inconvenient to e.g. run VirtualBox for my dad to VNC into from his desktop, or even just to have the machine as a file server. I don't want to waste money and electricity having separate physical machines for every task, but it seems like desktop use that puts any strain on the graphics drivers is incompatible with stability for ATI or Intel anyway. My old AGP NVidia machine is always quite stable.

      The reasons I went with ATI instead of NVidia this time were:
      1. ATI publishes specs for their HW. This is huge. I HATE closed-source proprietary crap that can never be properly supported on a system running only Free software. Even though with ATI the proprietary driver is essentially the only option for good 3D perf and power management on r700 hardware, and will be for some time, I will vote with my dollars for the company that supports open specs. I bought a g965 mobo when they came out in late 2006 so I could have open source 3D, and stuck with it even though I knew I could have better perf from a cheap but proprietary NV or ATI card. So I'm not just saying I care about freedom without acting on it. The tipping point in getting a 4670 was the need for a DVI output, because I had just added a VGA extension cable, and it made the analog quality noticeably bad.

      2. Idle power. I leave my computer on all the time, acting as a file server for backups and other usage, and p2p file sharing. A video card that idles at more than 15W is totally unacceptable to me. Under 10W is reasonable. It's hard to find sites that measure idle power in a useful way (e.g. include onboard graphics idle as a baseline, or some other way to find the absolute power draw by the vid card). But AFAIK, Radeon 4650/4670 are the only cards that don't suck at 3D that idle under 10W. Although I think NV is close with some card or other. I'm more willing to compromise on this than on open specs.
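The wattage thresholds above are easy to sanity-check; here is a minimal Python sketch of the yearly energy an always-on card draws at idle (the $0.10/kWh electricity rate is a hypothetical placeholder, not from the post):

```python
def annual_idle_energy_kwh(idle_watts, hours_per_day=24):
    """Energy drawn per year by a card idling at the given wattage."""
    return idle_watts * hours_per_day * 365 / 1000

# An always-on machine, as described above:
print(annual_idle_energy_kwh(15))  # 131.4 kWh/year
print(annual_idle_energy_kwh(10))  # 87.6 kWh/year

# Yearly cost at a hypothetical $0.10/kWh rate:
print(round(annual_idle_energy_kwh(15) * 0.10, 2))  # 13.14
```

So the gap between a 15 W and a 10 W idle is roughly 44 kWh per year per machine.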

      So obviously I need a driver with all the power-management features working to control a game-capable card like the 4670. I like my computer to be quiet, as well as eco-friendly. Although I do live in Canada, so the extra electric heat in the house isn't really wasting much energy, and definitely not putting extra load on air conditioning. If I needed to turn up the fan speed, I'd have to move it out of my bedroom and use kbd/mouse/audio/video extension cables, though.

      I wish it had been NVidia that had open specs, because I'd _much_ rather be using their proprietary driver than ATI's. I just found out that ATI's driver crashes Wine unless you disable UseFastTLS, so maybe it will be stable now; I was having almost daily crashes when I was gaming frequently a couple of weeks ago. NVidia has VDPAU and up-to-date everything and good 2D perf. I sometimes see not-properly-redrawn text in gnome-terminal with fglrx, but I never had that with intel. (And I don't think I saw that on my AGP 7600GT...)

      If ATI hadn't released specs, I'd probably have an NVidia card in my main desktop right now, and be happier with it from a purely pragmatic point of view. Although less happy about the state of software freedom, of course. Sometimes I half wish ATI hadn't released specs, so they wouldn't be the only choice for me. (I don't see NV releasing specs as a viable wish unless and until they get bought out by a company with a different culture. They never give an inch on that.)



      • #4
        Display range question is out of date

        The question on display resolution is "old school", you quote 4:3 resolutions in a world of 16:10 and 16:9 monitors.

        Nearly all new monitors are 16:9, even 16:10 is becoming old hat.



        • #5
          16:9 monitors suck. They are way too wide for their height, and for that reason are almost useless. (Most content scrolls vertically, not horizontally.) Most people don't spend most of their time watching 16:9 movies on their laptops.



          • #6
            Originally posted by thefirstm View Post
            16:9 monitors suck. They are way too wide for their height, and for that reason are almost useless. (Most content scrolls vertically, not horizontally.) Most people don't spend most of their time watching 16:9 movies on their laptops.
            Suck or not, they are taking over everywhere - soon you will not be able to buy anything else.



            • #7
              I know.... It seems the only reason they are coming out with these things is to artificially inflate the size of displays, while keeping the pixel count the same, or even reducing it...



              • #8
                I'd still pick a 1920x1200 over a 1600x1200, and a 1680x1050 over a 1400x1050, anytime.

                The secret is to move your task-bar to the side of the screen, giving you a few more pixels vertically and making your working area square again.



                • #9
                  1920x1200 and 1680x1050 are 16:10 resolutions.
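For anyone double-checking these ratios: the aspect ratio falls out of dividing width and height by their greatest common divisor. A quick Python sketch:

```python
from math import gcd

def aspect(width, height):
    """Reduce a pixel resolution to its simplest width:height ratio."""
    g = gcd(width, height)
    return width // g, height // g

print(aspect(1920, 1200))  # (8, 5)  i.e. 16:10
print(aspect(1680, 1050))  # (8, 5)  i.e. 16:10
print(aspect(1920, 1080))  # (16, 9)
print(aspect(1600, 1200))  # (4, 3)
```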



                  • #10
                    true, but last time I checked, they're still widely available (compared to 4:3).



                    • #11
                      Originally posted by rohcQaH View Post
                      true, but last time I checked, they're still widely available (compared to 4:3).
                      Yeah, and 16:10 less widely than 16:9



                      • #12
                        Originally posted by sabriah View Post
                        Yeah, and 16:10 less widely than 16:9

                        Very true.

                        Nearly all new desktop monitors are 1920x1080, and even cheap notebooks and new super netbooks (11.5") are 1366x768. The choices are shrinking.

                        It would appear that the manufacturers have decided that it is cheaper to stick with one aspect ratio and fewer options. It undoubtedly also means economies of scale with TV panels.

                        On a plus note, my new 24" Viewsonic 1920x1080 cost a lot less than my 22" 1680x1050 monitor, which is only one year old and now graces my daughter's computer.



                        • #13
                          I was the one who posted the suggestion for the resolution question.

                          Think not about the actual resolution, but about the range. The intent of the question was to let people indicate the size of monitor in use, not the exact resolution. The fact that I started with 4:3 resolutions as boundaries is completely irrelevant.

                          I could have just talked about "megapixels" and given a range there.
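As a rough illustration of that framing (the example resolutions below are my own picks, not from the survey), "megapixels" is just width times height:

```python
def megapixels(width, height):
    """Total pixel count of a resolution, in millions of pixels."""
    return width * height / 1_000_000

for w, h in [(1280, 1024), (1600, 1200), (1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}: {megapixels(w, h):.2f} MP")
```

Bucketing answers by megapixel ranges would group 1600x1200 and 1920x1080 together, which is exactly the aspect-ratio-agnostic grouping described above.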

                          The idea is to allow Michael to ensure that particular attention is paid to the resolutions that are common. But I am expecting bimodal results, with a bump at 1600x1200 (ish) and then another around 2560-ish.

                          Regards,

                          Matthew



                          • #14
                            Originally posted by mtippett View Post
                            The idea is to allow Michael to ensure that particular attention is paid to the resolutions that are common. But I am expecting bimodal results, with a bump at 1600x1200 (ish) and then another around 2560-ish.
                            For the love of $DEITY, stop. Really, just... don't. I f**in' hate it when an app/website reads a >monitor resolution< and adjusts itself to that, completely ignoring that
                            • I may have a non-hiding taskbar on the side
                            • the browser has panels on one side and tabs on the other
                            • The app is just not running maximized
                            What you want to know is the window size and proportions. I'd be a lot happier if more apps properly auto-adjusted when the resolution is changed or a new monitor is added - how many times have I seen an app that pops up its tooltips/dialogs on the primary monitor regardless of where its window actually is?

                            What I'd like even more is for apps with multiple configurable toolbars (Opera, various IDEs, office suites) to be able to hold different settings for landscape and portrait mode, then switch between them as the monitor is rotated.

                            On the unrelated note of 16:9 vs. 16:10 - I'm apparently one user sticking out of the average, but I want 16:10 exactly because I watch movies. Who in their right mind wants the movie to take up the whole screen, only to have it covered by player controls/subtitles? We don't all watch movies in our native language, you know.



                            • #15
                              Is there a way we can see live survey responses?

                              Anonymous read-only access to the MySQL server, perhaps?

