A Q&A Panel About Contributing To X.Org & Open-Source

  • A Q&A Panel About Contributing To X.Org & Open-Source

    Phoronix: A Q&A Panel About Contributing To X.Org & Open-Source

    Next week at XDC2011 Chicago there will be a rather unique discussion taking place that's quite different from what normally goes on at this annual X.Org Developers' Conference. There is going to be a moderated panel discussion (tentatively titled "Contributing to X.Org and Open-Source") about contributing to X.Org, the Linux kernel, Mesa, and open-source software in general. For those who are not in the Chicago area, this session will be broadcast on the Internet...


  • #2
    Work against API fragmentation

    Ok so here's my question, from a Linux perspective obviously:

    Over the decades, the X Window System used to be a common denominator for many Unix variants. However, today we see heavy fragmentation on multiple levels. For example, in the video decoding and rendering area, there are multiple standards and de-facto standards such as VDPAU, VA-API, XvMC, XVideo, and OpenGL. Some augment each other, others are mutually exclusive. On the other hand, there are DirectX and Quartz. If I were to code a video streaming application, I'd certainly know which platform to dread or even avoid. So yet again on Linux, more appears to be less. Can X.Org play an active role in unifying all these valuable contributions in a way that delivers frames per second to the end user (rather than seconds per frame)?

    Another example for fragmentation is the old binary blob vs. OSS discussion, of course.



    • #3
      Right now, by design, the X server sees and logs the correct panel size and resolution... and then throws it out and pulls numbers out of its ass to make 96 DPI.
      So, apparently I have a 20-inch laptop (so good luck trying to zoom 100% and get one inch to equal one inch)... and my fonts are headache-inducingly tiny. When I saw that, I said "screw this", and just booted back into Windows to get rid of the headache.

      When can we get somebody to undo this bullshit?

      The X server, starting with 1.7, ignores the physical size reported by the EDID or in xorg.conf and instead calculates one from the screen resolution and an assumed 96 DPI. This is rather annoying for users of high-DPI screens. GNOME and KDE used to (and may still) set 96 DPI by default in their own settings; we should check whether they still do, and if so, let them handle this. I don't think X should be handling this.

      It's not an Ubuntu-specific bug, but the Ubuntu bug report contains some additional comments that should've really been posted upstream.
      Last edited by DanaG; 05 September 2011, 05:33 PM.
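
      To make the complaint above concrete, here is a small sketch (plain Python, not X.org code) of the arithmetic involved. The panel dimensions are hypothetical example values, not taken from the thread: a 1920x1200 panel that the EDID reports as 331x207 mm. The real DPI follows from pixels and physical size; X >= 1.7 instead works backwards from 96 DPI, which implies a much larger fictitious panel.

      ```python
      MM_PER_INCH = 25.4

      def dpi(pixels: int, size_mm: float) -> float:
          """Dots per inch along one axis, from pixel count and physical size in mm."""
          return pixels * MM_PER_INCH / size_mm

      # Hypothetical laptop panel: 1920x1200 pixels, EDID reports 331x207 mm.
      print(round(dpi(1920, 331)))            # true horizontal DPI: 147

      # X >= 1.7 discards the 331 mm and derives whatever width yields 96 DPI:
      implied_width_mm = 1920 * MM_PER_INCH / 96
      print(round(implied_width_mm))          # 508 mm, i.e. a much bigger "panel"
      ```

      At 147 real DPI, UI elements sized for 96 DPI come out roughly two-thirds of their intended physical size, which is the "headache-inducingly tiny fonts" effect described above.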



      • #4
        Originally posted by DanaG
        Right now, by design, the X server sees and logs the correct panel size and resolution... and then throws it out and pulls numbers out of its ass to make 96 DPI. [...]
        This isn't a technical-support Q&A.
        Michael Larabel
        https://www.michaellarabel.com/



        • #5
          Looking forward to it

          I'm an IIT student and on the executive board of the ACM chapter there. We really appreciate that you're hosting this panel. Our ACM chapter has just been re-established, and we're hoping to drive interest by starting small projects or by helping students find open-source projects they can contribute to. This panel will help with questions we may have and show how big open-source development is.

          I personally am looking forward to this talk. I had considered contributing to Lucene several times, but never took the time to look into how to go about contributing or what needed to be done.

          Thanks again,
          Mark McGuire
          ACM



          • #6
            What does it look like?

            I'm curious about what kind of software techniques are required to understand the code or contribute. I see lots of register assignments when I look at the code and wonder what else is there. Can we contribute math or design-pattern implementations?



            • #7
              It depends on how far up the stack you are. Near the bottom of the stack you mostly need to understand how the hardware works, but as you get closer to the APIs (e.g. OpenGL) it's probably fair to say that (a) things get more abstract and (b) understanding how the API is used becomes at least as important as understanding the hardware.

              There are a couple of levels of compiler involved, if that helps: one going from the API programming language to a common-ish intermediate representation (IR), and another going from that IR to hardware instructions.
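
              The two compile stages described above can be sketched with a deliberately tiny toy pipeline. This is illustrative Python, not actual driver code; the function names and the "IR" and "ISA" formats are invented for the example. In a real graphics driver the analogous stages would be, say, GLSL source lowered to an IR, and that IR lowered to GPU machine instructions.

              ```python
              def to_ir(expr: str):
                  """Stage 1: lower a source expression 'a OP b' into simple three-address IR tuples."""
                  a, op, b = expr.split()
                  return [("load", "r0", a),       # fetch first operand into a register
                          ("load", "r1", b),       # fetch second operand into a register
                          (op, "r2", "r0", "r1")]  # apply the operator, result in r2

              def to_isa(ir):
                  """Stage 2: lower IR tuples into textual 'hardware' instructions."""
                  mnemonics = {"load": "MOV", "+": "ADD", "*": "MUL"}
                  return [f"{mnemonics[op]} " + ", ".join(args) for op, *args in ir]

              # Source -> IR -> "machine code", mirroring the two compiler levels.
              print(to_isa(to_ir("x + y")))  # ['MOV r0, x', 'MOV r1, y', 'ADD r2, r0, r1']
              ```

              The point of the split is the same as in real drivers: the first stage only needs to know the source language, the second only the target hardware, so either side can be swapped out independently.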
