
Marking DRI1 Drivers As Legacy & "Broken" Being Debated


  • #21
    Originally posted by slacka View Post
    This totally undermines the most common argument made by FLOSS proponents. They always justify the constant ABI breakage, claiming that if the hardware drivers were mainlined, the kernel devs would take responsibility for their maintenance. They can't really have it both ways. If they really want to talk the talk and kill DRI1, they should rewrite the drivers for DRI3.
    It's hard to rewrite drivers when no one has the hardware. You're basically asking them to rewrite a complicated low-level driver with zero ability to test it.

    Also - constant ABI breakage? These drivers have been working for years, and importantly, after this news they will still continue working. You just have to explicitly enable an option to make sure they are compiled into the kernel. That's hardly constant ABI breakage; it's the exact opposite, with the ABI still being kept the same even now.
    Last edited by smitty3268; 27 August 2016, 01:08 AM.

    Comment


    • #22
      Originally posted by andre30correia View Post
      Intel iGPUs are good in the laptop world: low power with good 3D and video acceleration, and Iris Pro even reaches the mid-range of laptop GPUs.
      We are talking about an Intel GPU from 1999 that sucked hard.

      Comment


      • #23
        Originally posted by slacka View Post
        This totally undermines the most common argument made by FLOSS proponents.
        Nope.

        They always justify the constant ABI breakage, claiming that if the hardware drivers were mainlined, the kernel devs would take responsibility for their maintenance.
        This is valid only for hardware that is still popular enough to be worth maintaining.
        Random crap GPUs from 15 years ago that no one uses anymore can be put to rest, just as ancient 386 CPU support was dropped in 2012.

        They can't really have it both ways. If they really want to talk the talk and kill DRI1, they should rewrite the drivers for DRI3.
        This is just a deprecation flag so that people building kernels can easily disable DRI1 and all its drivers with a single switch; the drivers aren't being dropped, and neither is the DRI1 interface.
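        A single Kconfig switch is enough to express that kind of gate. The fragment below is only an illustrative sketch, not the actual patch; the option name DRM_LEGACY and the Savage entry are assumptions:

        ```kconfig
        # Hypothetical sketch: one bool that gates every DRI1 driver.
        config DRM_LEGACY
                bool "Enable legacy (userspace-modesetting DRI1) drivers"
                depends on DRM
                default n

        # Each old driver then simply depends on the switch, so
        # disabling DRM_LEGACY hides all of them at once:
        config DRM_SAVAGE
                tristate "Savage video cards"
                depends on DRM && DRM_LEGACY
        ```

        With that in place, a kernel builder flips one option to drop every DRI1 driver from the build, while anyone who still needs one can keep it.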

        Comment


        • #24
          Originally posted by Inopia View Post

          Is it really worth it to have support for 15-year-old hardware in the bleeding edge kernel? Also, they're not going to remove the support yet, but just flag it as deprecated, so you can still build it (if it still works) in case you need it.
          I only asked whether there would still be a high-resolution console if the DRI1 driver is not compiled into the kernel. I will likely check that myself next week, but I think it works even without the DRI1 driver. My use for that computer is only CLI and no networking, so it would not be a big deal even if it couldn't use the latest kernel.

          Anyway, I like my software fresh. And if I really wanted to use DRI3 or whatever on that laptop, I would rewrite the driver myself. Doesn't seem worth it, though. Although, this makes me wonder if it would be a good exercise and introduction to graphics drivers programming in Linux.

          Comment


          • #25
            Originally posted by Tomin View Post
            I only asked whether there would still be a high-resolution console if the DRI1 driver is not compiled into the kernel.
            AFAIK that's dealt with by framebuffer drivers. The DRI drivers are there to expose actual hardware acceleration.
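            A quick way to check that on a given machine (a sketch; /proc/fb is standard Linux, but the output varies per box):

            ```shell
            # List registered framebuffer devices. An fbdev entry such as
            # "0 VESA VGA" means the console has a framebuffer even with no
            # DRI/DRM driver loaded; on kernels built without framebuffer
            # support the file is simply absent.
            if [ -e /proc/fb ]; then
                cat /proc/fb
            else
                echo "no framebuffer registered"
            fi
            ```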

            And if I really wanted to use DRI3 or whatever on that laptop, I would rewrite the driver myself. Doesn't seem worth it, though. Although, this makes me wonder if it would be a good exercise and introduction to graphics drivers programming in Linux.
            If you feel the urge to develop a driver, can I push you a little towards developing a driver for something that is still on sale and may benefit a sizeable number of others too? Like Mali GPUs.

            Comment


            • #26
              In my opinion, a generic driver that supports all of these by offering modesetting, and nothing else, should be good enough. Then the old drivers could be dropped. It is very unlikely that anyone needs 3D acceleration with these; nothing runs on these cards anymore, and ancient Linux software usually doesn't run on modern distros.
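              On the userspace side, the generic X.Org "modesetting" driver already plays that role on top of any kernel driver that provides KMS. A minimal xorg.conf fragment selecting it would look like this (the Identifier name is a hypothetical placeholder):

              ```
              Section "Device"
                  Identifier "GenericGPU"   # hypothetical name
                  Driver     "modesetting"  # generic DDX, requires kernel KMS
              EndSection
              ```

              The missing piece for cards like these would be the kernel-side KMS driver; the old DRI1 hardware predates KMS, which is why a new generic driver would have to be written.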

              Comment


              • #27
                And that is the very definition of "retrograde" :|

                Comment


                • #28
                  Originally posted by starshipeleven View Post
                  AFAIK that's dealt with by framebuffer drivers. The DRI drivers are there to expose actual hardware acceleration.
                  Thanks. That's pretty much what I thought, and easily enough for my needs.

                  Originally posted by starshipeleven View Post
                  If you feel the urge to develop a driver, can I push you a little towards developing a driver for something that is still on sale and may benefit a sizeable number of others too? Like Mali GPUs.
                  Haha, that would be more useful indeed, but I doubt there is more documentation for those than for Savage (which already has an open source driver that needs to be rewritten). That Savage probably doesn't support much of OpenGL anyway (I'd be surprised if it did 2.0), so it's not really worth anything any more (and from what I've heard, it never really was).

                  Comment


                  • #29
                    Maybe they should take a symbolic approach and drop these on Oct 2, 2017, once the S3TC patent expires.

                    I played Rayman 2 on an S3 Savage back in the day, and that 3D game required only 4 MB of VRAM.

                    But these days we talk about gigabytes of VRAM, with new consoles coming too... so by 2017, games will probably require a thousand times that much VRAM as a minimum.

                    Comment
