Intel Wants YOUR Linux Questions, Feedback


  • One complaint / feature request I have is about the lack of standardization between the DRM drivers when it comes to many of the sysfs/debugfs interfaces.
    Yeah, I totally agree with you, Michael. The lack of standardization within the DRM subsystem is disappointing sometimes. There are some standards, like connector properties or I2C interaction to some extent, but beyond that things are pretty much incompatible.
    About GPU scheduling and turbo modes: Ben, Jesse and I were working on an idea for a common scheduling/power/turbo interface in sysfs last year. It hasn't gone anywhere yet, but I still hope we'll be able to propose something in the coming months. If we have such features, and we do want to control them from userspace, some standardization would certainly be welcome here.
    When we have something ready or announced, I'll certainly let you know. I can't promise any deadlines for now, but hopefully it won't take long.

    What I really miss is a tear-free driver (Xv output, Flash). I'm quite surprised that so many months after Sandy Bridge launched, it still suffers from tearing issues.
    The tearing issue on Sandy Bridge is the infamous https://bugs.freedesktop.org/show_bug.cgi?id=37686 bug. Fixing it requires changes in the kernel, libdrm and xf86-video-intel. The changes are small but pretty invasive – for instance, they change the way we deal with GPU-submitted batchbuffers – and we haven't yet had time to solve this properly. Sorry!
    It is on my TODO list for this quarter though, so I hope to get it fixed at some point.
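
    For the curious: tear-free output ultimately means presenting buffers with a vblank-synchronized page flip rather than a blit. A hedged sketch against the existing libdrm API – present() is just an illustrative helper, and fd, crtc_id and fb_id are assumed to come from the usual drmModeGetResources()/framebuffer setup, which is omitted:

    #include <stdint.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    static void page_flip_handler(int fd, unsigned int frame,
                                  unsigned int sec, unsigned int usec,
                                  void *data)
    {
        /* The old buffer is off-screen now; it is safe to render into it. */
    }

    int present(int fd, uint32_t crtc_id, uint32_t fb_id)
    {
        drmEventContext evctx = {
            .version = DRM_EVENT_CONTEXT_VERSION,
            .page_flip_handler = page_flip_handler,
        };

        /* Queue the flip; it completes on the next vblank, hence no tearing. */
        if (drmModePageFlip(fd, crtc_id, fb_id,
                            DRM_MODE_PAGE_FLIP_EVENT, NULL))
            return -1;

        /* Wait for the kernel's flip-complete event before reusing buffers. */
        return drmHandleEvent(fd, &evctx);
    }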

    Even today a 5700MHD cannot match what a GM965 was doing 3-4 years ago. When are we going to see some improvements?
    If you check the performance history of Intel GPUs over the years, you'll see that performance improves with each generation of chips. Nvidia and ATI had a great head start on us, and their approach to graphics is very different from ours. So integrated GPUs won't be a match for dedicated cards for quite some time (for many reasons – die size, power requirements, the GPU architecture itself, and so on)…
    Building a GPU that matches the know-how and power of Nvidia's and ATI's requires millions of work-hours. It is no accident that only two high-end GPU makers remain today (Nvidia and ATI), while 3dfx, S3, Matrox, VIA and other smaller players have disappeared. Intel's focus has always been on integrated graphics, not on top performance. With this in mind, Intel's GPU evolution has only just begun – just compare the numbers for gen2, 3, 4, 5, 6 (and, in a few weeks, 7) Intel GPUs on the very same benchmarks to see how things change with each generation. Let's wait and see what happens in the future…

    • I don't care much about meeting shiny new OpenGL specifications, or about fps figures going up in newer releases (as long as the chip in question can render a decent desktop environment, that is). After all, we are talking about bread-and-butter graphics chips.

      I've come to enjoy the comfort of configuration-free, fast-switching KMS. It doesn't look so good, though, on external displays connected to closed-lid notebooks. It all worked great up to 2.6.32; then, after some ACPI lid-detection policy change, my X200s suddenly stopped choosing the native external display resolution. Also, at some point, external display detection (with a DVI-DP adapter) stopped working at all. Probably that isn't a common enough setup to be well tested, who knows? Anyway, there's an open bug, including a patch that I need to apply to every kernel since 2.6.33 to get my desired behaviour back: https://bugzilla.kernel.org/show_bug.cgi?id=27622

      I've got two stationary setups for my ThinkPad X200s (LVDS native 1440x900):
      -) X200s closed-lid startup @ Ultrabase [DP] -> [DP] Eizo (native 1920x1200)
      Resolution ALWAYS wrong* without the patch; detection works (mostly)
      -) X200s closed-lid startup @ Ultrabase [DP] -> [DP]->[DVI-D] -> [DVI-D] Lenovo ThinkVision (native 1920x1200)
      Resolution ALWAYS wrong* without the patch
      Detection completely non-functional; only BIOS detection immediately at startup works. If I miss that fraction of a second, I have to reboot all over again, and the Lenovo BIOS is really, really fast. I've become quite good at this, but the first boot still fails about 30% of the time.

      * "Resolution always wrong" basically means that console output is aligned to the upper-left corner and limited to the LVDS-native size, while in X the LVDS-native resolution is blown up to the full size of the external display, looking gross.
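
      In case it helps anyone debugging a similar setup: the connector state the driver hands to X can be queried directly through libdrm. A minimal sketch, assuming /dev/dri/card0 and sufficient permissions (compile with -ldrm):

      #include <fcntl.h>
      #include <stdio.h>
      #include <xf86drm.h>
      #include <xf86drmMode.h>

      int main(void)
      {
          int fd = open("/dev/dri/card0", O_RDWR);
          drmModeRes *res = drmModeGetResources(fd);

          if (fd < 0 || !res)
              return 1;

          for (int i = 0; i < res->count_connectors; i++) {
              drmModeConnector *c = drmModeGetConnector(fd, res->connectors[i]);
              if (!c)
                  continue;
              /* Report each connector's detection state and first mode. */
              printf("connector %u: %s, %d modes%s%s\n",
                     c->connector_id,
                     c->connection == DRM_MODE_CONNECTED ? "connected"
                                                         : "disconnected",
                     c->count_modes,
                     c->count_modes ? ", first " : "",
                     c->count_modes ? c->modes[0].name : "");
              drmModeFreeConnector(c);
          }
          drmModeFreeResources(res);
          return 0;
      }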

      • Originally posted by eugeni_dodonov View Post

        In general, if you do not experience random hangs and graphics corruption when rc6 is enabled – I mean, issues which you haven't seen previously – you shouldn't have any issues with it at all. But if you do, we'll be very interested in hearing about it, because we are on a quest to locate machines which can reliably reproduce rc6-related problems. But so far, I'd say that around 99% of all machines should "just work" with it enabled.

        Hi Eugeni, you should look at the Asus UX31E laptop.
        Here are some Ubuntu users who experience one to four machine shutdowns with rc6 enabled on kernel 3.2 (selected posts):
        http://ubuntuforums.org/showpost.php...&postcount=495
        http://ubuntuforums.org/showpost.php...&postcount=497
        http://ubuntuforums.org/showpost.php...&postcount=500

        As I'll get my UX31E tomorrow, I'm really interested in a fix.
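
        For anyone wanting to confirm whether rc6 is actually enabled before filing a report, a hedged sketch that reads the module parameter of the 3.2-era i915 driver; the parameter name and its semantics may differ on other kernel versions:

        #include <stdio.h>

        int main(void)
        {
            /* -1 = per-chipset default, 0 = disabled, 1 = enabled
             * (3.2-era semantics; may change between kernels). */
            FILE *f = fopen("/sys/module/i915/parameters/i915_enable_rc6", "r");
            int val;

            if (!f) {
                fprintf(stderr, "parameter not readable (permissions, "
                                "or a different kernel?)\n");
                return 1;
            }
            if (fscanf(f, "%d", &val) == 1)
                printf("i915_enable_rc6 = %d (%s)\n", val,
                       val > 0 ? "enabled" :
                       val == 0 ? "disabled" : "chipset default");
            fclose(f);
            return 0;
        }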

        • Eugeni, do you happen to know if/when the work of Gwenole would be included in upstream mplayer? Wider acceptance of VA-API would IMHO be beneficial to the wider adoption of Intel platforms, reaching those who use the PC mainly for multimedia.

          http://gitorious.org/vaapi/mplayer/c.../hwaccel-vaapi
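
          For reference, the VA-API bring-up that a VA-enabled mplayer build performs before any decoding looks roughly like this – a hedged sketch against libva's X11 entry points, error handling trimmed (link with -lva -lva-x11 -lX11):

          #include <stdio.h>
          #include <X11/Xlib.h>
          #include <va/va.h>
          #include <va/va_x11.h>

          int main(void)
          {
              Display *x11 = XOpenDisplay(NULL);
              VADisplay va;
              int major, minor;

              if (!x11)
                  return 1;
              /* Bind libva to the X11 display and load the backend driver. */
              va = vaGetDisplay(x11);
              if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
                  fprintf(stderr, "vaInitialize failed\n");
                  return 1;
              }
              printf("VA-API %d.%d: %s\n", major, minor,
                     vaQueryVendorString(va));
              vaTerminate(va);
              XCloseDisplay(x11);
              return 0;
          }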

          • Originally posted by eugeni_dodonov View Post
            In general, if you do not experience random hangs and graphics corruption when rc6 is enabled – I mean, issues which you haven't seen previously – you shouldn't have any issues with it at all. But if you do, we'll be very interested in hearing about it, because we are on a quest to locate machines which can reliably reproduce rc6-related problems. But so far, I'd say that around 99% of all machines should "just work" with it enabled.
            I have now enabled RC6 on my X200s with a GM45 chip – in case it's of any use – because I was already experiencing hangs (X dropping back to the console) after several hours of use with the DRM code of various 3.2-rc* kernels, and toward the end I didn't have those anymore. Let's see what it does with the latest 3.3 DRM commits.

            • My wishes

              1) Proper support for the blur effect in KDE
              2) Using KWin for regression testing
              Last edited by schnelle; 01-17-2012, 09:46 AM.

              • When you look at

                http://intellinuxgraphics.org/h264.html

                then the links (for mplayer) are a bit outdated (libva 1.0.15 will be split up too). Also, does "Intel® HD Graphics" there refer only to s1156, or to the s1155 Pentium/Celeron as well? Or is it just like the HD 2000/3000 but without the encoding parts?

                • Originally posted by ejmarkow View Post
                  I have the following from dmesg:

                  $ dmesg | grep -i drm
                  [drm] Initialized drm 1.1.0 20060810
                  [drm] Supports vblank timestamp caching Rev 1 (10.10.2010).
                  [drm] Driver supports precise vblank timestamp query.
                  [drm:intel_framebuffer_init] *ERROR* unsupported pixel format
                  Could you please try applying this patch (http://lists.freedesktop.org/archive...ry/014466.html) and rebooting with drm.debug=0x04? It should tell you which pixel format you are missing.
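
                  For context, a simplified, hedged sketch of the check behind that error – it mirrors the shape of the framebuffer-creation path, not the driver's exact code, and check_pixel_format() is an illustrative name: any fourcc the driver does not handle is rejected with -EINVAL.

                  #include <stdint.h>
                  #include <errno.h>
                  #include <drm/drm_fourcc.h>

                  static int check_pixel_format(uint32_t pixel_format)
                  {
                      switch (pixel_format) {
                      case DRM_FORMAT_RGB565:
                      case DRM_FORMAT_XRGB8888:
                      case DRM_FORMAT_ARGB8888:
                          return 0;        /* widely supported formats */
                      default:
                          /* The branch behind "unsupported pixel format";
                           * drm.debug logging reveals the offending fourcc. */
                          return -EINVAL;
                      }
                  }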

                  • Do you happen to know if/when the work of Gwenole would be included in upstream mplayer?
                    No, according to Gwenole, this won't happen anytime soon, unfortunately.

                    However, while we are on the topic of VA-API, I believe the news about VA-API support for Medfield chips got missed by Phoronix… It is available at http://cgit.freedesktop.org/vaapi/pvr-driver/, and should support the Penwell and Medfield platforms.

                    • Originally posted by eugeni_dodonov View Post
                      In general, if you do not experience random hangs and graphics corruption when rc6 is enabled – I mean, issues which you haven't seen previously – you shouldn't have any issues with it at all. But if you do, we'll be very interested in hearing about it, because we are on a quest to locate machines which can reliably reproduce rc6-related problems. But so far, I'd say that around 99% of all machines should "just work" with it enabled.
                      Originally posted by lemsto View Post
                      Hi Eugeni, you should look at the Asus UX31E laptop.
                      Here are some Ubuntu users who experience one to four machine shutdowns with rc6 enabled on kernel 3.2 (selected posts):
                      [...]
                      Bug opened
                      https://bugs.freedesktop.org/show_bug.cgi?id=44867

                      • External graphics card over Thunderbolt

                        I don't know whether this needs support from Intel, but I would like to know what is needed to get an external graphics card running over Thunderbolt in the (perhaps remote) future.
                        If it needs work from Intel, is there any plan to support it?

                        Thanks for your answers.

                        Greetings.

                        Markus

                        • Originally posted by eugeni_dodonov View Post
                          Full OpenGL 3.x support requires some cooperation from the hardware. It is available on Ivy Bridge, yes, so GL 3.x should be supported on that architecture with hardware acceleration. For previous generations (Sandy Bridge and Ironlake), some parts of it must go through software-only implementations.
                          Would you elaborate on which parts require software-only implementations? Will this prevent the hardware from running Unigine Oilrush?
                          Last edited by Shining Arcanine; 01-18-2012, 05:21 AM.

                          • Why should anything prevent Oil Rush from running? You only need floating-point textures and S3TC; that game does not require OpenGL 3.0 at all. If you want to use the Unigine engine with tessellation, you need OpenGL 4, which could in theory be implemented on Ivy Bridge. But tessellation is so slow on low-end chips that you would get a slideshow. Remember that the first Unigine Heaven releases could use an ATI-specific tessellation extension on the older HD 4000 series as well; it worked, but at something like 7 fps, and I don't expect more from an integrated GPU. Newer Unigine releases dropped support for that extension, so there you need a full OpenGL 4 card. But Oil Rush looks so simple that tessellation would not help at all – for that game you definitely do not need an OpenGL 4 card.
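
                            A hedged sketch of the capability check that actually matters here – float textures and S3TC rather than a blanket GL 3.0 version test. It assumes a current GL context created elsewhere (SDL, GLUT, …); has_ext() and report_caps() are illustrative helper names:

                            #include <stdio.h>
                            #include <string.h>
                            #include <GL/gl.h>

                            /* Naive substring test - good enough for a sketch. */
                            static int has_ext(const char *name)
                            {
                                const char *exts =
                                    (const char *)glGetString(GL_EXTENSIONS);
                                return exts && strstr(exts, name) != NULL;
                            }

                            void report_caps(void)
                            {
                                printf("GL_VERSION: %s\n",
                                       (const char *)glGetString(GL_VERSION));
                                printf("float textures: %s\n",
                                       has_ext("GL_ARB_texture_float")
                                           ? "yes" : "no");
                                printf("S3TC: %s\n",
                                       has_ext("GL_EXT_texture_compression_s3tc")
                                           ? "yes" : "no");
                            }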

                            • Long-standing brightness problem with GL games, e.g. OpenArena

                              It would be great if the drivers could be tested in gaming scenarios. I have a fairly new Dell desktop with an "Intel Corporation 4 Series Chipset Integrated Graphics Controller (rev 03)", and I'm running Kubuntu 11.10 with the i965 driver.
                              Desktop effects work fine and I have no complaints about the graphics in most scenarios, but when I play OpenArena the screen is very dark and the brightness controls have no effect. Google shows this to be a common problem for gamers on Intel graphics, and I haven't seen anything like a solution. Needless to say, I've used Nvidia graphics cards for gaming, and there the brightness controls work fine in OpenArena. Is there any hope on the horizon for gamers on Intel graphics?
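
                              For what it's worth, SDL 1.2 games such as OpenArena typically drive the brightness slider through the X server's gamma ramp. A hedged sketch of the equivalent XF86VidMode call (link with -lXxf86vm) – if the driver rejects or ignores this, the in-game slider does nothing:

                              #include <stdio.h>
                              #include <X11/Xlib.h>
                              #include <X11/extensions/xf86vmode.h>

                              int main(void)
                              {
                                  Display *dpy = XOpenDisplay(NULL);
                                  /* Brighter-than-default gamma on all channels. */
                                  XF86VidModeGamma gamma = { 1.5f, 1.5f, 1.5f };

                                  if (!dpy)
                                      return 1;
                                  if (!XF86VidModeSetGamma(dpy, DefaultScreen(dpy),
                                                           &gamma))
                                      fprintf(stderr, "gamma change rejected\n");
                                  XCloseDisplay(dpy);
                                  return 0;
                              }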

                              • To all the people who say that dropping old hardware support is reasonable because new hardware is so cheap:

                                Please note that in many countries people often buy older hardware because it is very cheap (so they can actually afford it), has a good CPU and enough memory, and can handle browsing, watching movies, office work and so on with just the onboard graphics. Such a machine can be bought for less than $50 (often $30). Spending an additional $20 is often not an option, economically (it makes the hardware 50% more expensive) or physically (such computers often have no AGP/PCIe slots, or need low-profile cards, which are much harder to find or more expensive). There are also lots of ever-cheaper laptops with Intel graphics, and sorry, their graphics chips cannot be upgraded. Despite being old, these machines perform well for everyday needs, and because they are cheap, more people can have their own portable or personal computers. Not to mention that buying new hardware creates more electronic waste, so reusing old hardware is a good thing.

                                Often there is no real difference between using an old graphics chip and a new one in a PC – most people don't play games or need much compositing. It also doesn't really matter whether my graphics draws 20 W or 50 W (with onboard Intel graphics the difference is more like 2-5 W), because other components (CPU, monitor, HDD) draw more. The price of electricity also varies from place to place, and the savings are not immediate – you need more money up front to buy better hardware in order to save later. That is all very clever, but not always possible, because people often don't have the spare money for that bigger investment.

                                Dropping support for old hardware is not the best idea, even if you think just buying new hardware would be the better option – and for many people it is simply not possible.

                                Nobody is asking to run Crysis on old hardware. We are just asking that old hardware keep the same level of features it has today, with software bugs fixed and maintenance continuing in newer kernels and X.Org. For me it is enough that 2D acceleration, XVideo, xrandr and suspend/resume keep working. And unfortunately, with the VESA driver, most of that support would be missing on older hardware!
