Rust-Written Apple DRM Linux Kernel Driver Renders First Cube


  • #31
    Originally posted by dremon_nl View Post
    Would be endless fun to watch Thompson and Ritchie waiting for ANSI C standard before they write a single line of Unix code.
    The fact is that these AT&T guys were pretty much the authors of C and UNIX. They *were* the standard. We don't have that luxury these days.



    • #32
      Originally posted by Dukenukemx View Post

      If people have made many claims, then those people seem to be correct. The first M1 Mac was released in November 2020, which is nearly 2 years ago. Since then there's barely a functioning distro of Linux for them. 2 years into this and you celebrate a spinning cube. I guarantee you that once OpenGL 2.1 is "finished" you'll probably run Quake or Doom 3 at horrible, buggy frame rates. There's nothing wrong with holding Apple accountable for not doing anything to help out. There's a lot of minor stuff Apple could do that would help move the project along a lot faster.
      Why shouldn't we celebrate a spinning cube? I'm not sure you grasp the amount of work, ingenuity, collaboration, and talent that went into it. More devices supported by Linux == a larger Linux user base. Unfortunately, not every company is going to give you what you ask for. Linux would be dead right now (hell, it wouldn't even have started) if we mandated that every vendor support and maintain its kernel drivers. Also, I'm pretty sure kernel hackers like Hector enjoy hacking on new exotic hardware; that's kind of their thing. I mean, he enjoys developing drivers, other developers enjoy collaborating with the project, others enjoy following the project... and then there are people like you.
      Also, accelerated graphics are the one thing keeping many people from using these devices as their daily driver. Normal users don't give a crap about Quake or Doom. Not sure why you think everyone wants to game on these systems. Normal users just want to be able to use a desktop and watch videos without experiencing screen tearing (and high cpu usage from software-rendered graphics).



      • #33
        Interesting point...
        What's the required OpenGL version for a modern DE like KDE or GNOME to run with accelerated graphics instead of on the CPU?
        Is OpenGL 2.1 enough, or does it need something higher like 3.x?



        • #34
          Originally posted by akira128 View Post
          Why shouldn't we celebrate a spinning cube? I'm not sure you grasp the amount of work, ingenuity, collaboration, and talent that went into it. More devices supported by Linux == a larger Linux user base. Unfortunately, not every company is going to give you what you ask for. Linux would be dead right now (hell, it wouldn't even have started) if we mandated that every vendor support and maintain its kernel drivers.
          This isn't Linux from 15 years ago, when the only way to get Linux running on something was a lot of reverse engineering and people using their free time to make it work. And by "work" I mean not very well, either. To this day I still have trouble getting Linux working acceptably on my old PowerBook G4 with ATI Radeon 9600 graphics. So many bugs that were just never corrected. Today we have Linus Torvalds, who called out Nvidia for not open-sourcing their drivers. Today AMD and Nvidia, as well as nearly all the hardware manufacturers, will donate code to the Linux project so that their stuff works. Apple should do the same, and be held accountable for not matching the standards of the rest of the industry.
          Also, I'm pretty sure kernel hackers like Hector enjoy hacking on new exotic hardware; that's kind of their thing. I mean, he enjoys developing drivers, other developers enjoy collaborating with the project, others enjoy following the project... and then there are people like you.
          People like me are calling it what it is. Without Apple's involvement this will be a fun project for Hector and others, and nothing more.
          Also, accelerated graphics are the one thing keeping many people from using these devices as their daily driver.
          Webcam, Bluetooth, sleep mode... you know, not important features. I think they recently fixed Bluetooth, but you can't use it and WiFi at the same time.
          Normal users don't give a crap about Quake or Doom. Not sure why you think everyone wants to game on these systems.
          Because that and porn are what people use their computers for the most. Why do you think Gabe Newell left Microsoft to start his own gaming company?
          https://youtube.com/clip/UgkxfqDlhOf...sf2q64P8E7aXFW
          Normal users just want to be able to use a desktop and watch videos without experiencing screen tearing (and high cpu usage from software-rendered graphics).
          I think that's more you than them. If that were true then Apple would have a much larger market share than they do right now.



          • #35
            Originally posted by marlock View Post
            Interesting point...
            What's the required OpenGL version for a modern DE like KDE or GNOME to run with accelerated graphics instead of on the CPU?
            Is OpenGL 2.1 enough, or does it need something higher like 3.x?
            I don't know for sure, but I believe they generally support a GLES 2 mode to allow running on embedded/mobile devices, which lines up pretty closely with GL 2.1 and which I imagine the drivers will support pretty quickly. For actual desktop GL I'd guess the requirements might be higher than that.
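            To make the "is 2.1 enough" question concrete: a compositor typically parses the major.minor version out of the driver's GL version string (obtained via glGetString(GL_VERSION)) and compares it against its minimum. A small std-only sketch against a sample Mesa-style string (the sample string and helper name are illustrative, not from any real compositor):

            ```rust
            fn parse_gl_version(s: &str) -> Option<(u32, u32)> {
                // "OpenGL version string: 2.1 Mesa 22.2.0" -> (2, 1)
                let ver = s.rsplit(": ").next()?.split_whitespace().next()?;
                let mut parts = ver.split('.');
                Some((parts.next()?.parse().ok()?, parts.next()?.parse().ok()?))
            }

            fn main() {
                let sample = "OpenGL version string: 2.1 Mesa 22.2.0";
                let v = parse_gl_version(sample);
                // A compositor requiring GL >= 2.1 would accept this driver;
                // tuple comparison is lexicographic, so (3, 0) >= (2, 1) too.
                println!("meets GL 2.1: {}", v >= Some((2, 1)));
            }
            ```

            On a real system you'd compare whatever `glxinfo` reports for the active driver.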



            • #36
              Originally posted by Dukenukemx View Post
              People like me are calling it what it is. Without Apple's involvement this will be a fun project for Hector and others, and nothing more.
              Linus Torvalds is using an M2 MacBook Air to build aarch64 kernels on, and I know developers who are using these systems for the same purpose. Some distros like CentOS don't allow you to cross-compile (official) packages, which means official packages have to be compiled on a native arch. So M1/M2 systems are really attractive to (some) developers and package maintainers for that reason... but there are a million other reasons. The performance/power ratio, build quality, and aesthetics are pretty damn good, and that's enough to attract more mainstream users. Aside from the video driver, what severe shortcomings exist that only Apple alone can fix?
              Have you seen these benchmarks? https://www.phoronix.com/review/apple-m2-linux
              It seems like it's already a tad beyond just a fun little project that won't amount to anything serious at this point.
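              The native-vs-cross distinction above boils down to the host architecture. A trivial std-only sketch of the check (the make invocations in the comments are the usual kernel-build conventions, shown only as an example):

              ```rust
              fn main() {
                  // On an M1/M2 running Linux this prints "aarch64", where the
                  // kernel builds natively: `make defconfig && make -j$(nproc)`.
                  // On an "x86_64" host you'd need a cross toolchain instead:
                  // `make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- defconfig`.
                  println!("host arch: {}", std::env::consts::ARCH);
              }
              ```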
              Originally posted by Dukenukemx View Post
              Webcam, Bluetooth, sleep mode, you know not important features. I think they recently fixed Bluetooth but you can't use it with WiFi at the same time.
              Bluetooth has been working for months, and yes, there's a 2.4GHz WiFi/Bluetooth coexistence issue that's currently being worked on, but I think you're digressing a bit here. There's a difference between being fully functional and being usable. You were mocking people for celebrating a spinning cube. But that spinning cube represents progress, as in: this project is progressing in the direction of becoming fully functional. I think most people could get by without a webcam. Although let's be honest, even when/if these systems become fully supported by Linux, you'll just find something else to complain about... because that's your thing.
              Complain, complain, complain... don't develop sh**t... don't do sh**t... complain some more.
              Originally posted by Dukenukemx View Post
              Because that and porn at what people use their computers the most.
              I'm pretty sure aarch64 Linux users represent less than 0.1% of the entire gaming market. I just don't get who would be buying an M1/M2 system for gaming on Linux. That sounds ridiculous.
              Originally posted by Dukenukemx View Post
              I think that's more you than them.
              So let me get this straight. You're saying that the general population of computer users would care more about what frame rates they're getting in Doom than they would about having a stable desktop experience. Again... I don't really know what to say, except that that's pretty damn insane.
              Last edited by akira128; 25 September 2022, 10:33 PM.



              • #37
                Originally posted by Eberhardt View Post
                IMO that's ideal: Rust is not stuck with bad design decisions of the past but still maintains perfect backwards compatibility.
                Kinda. The standard library cannot break compatibility even across editions, and it is already saddled with a few legacy bad decisions.
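                A concrete example of that constraint: std APIs that turned out to be mistakes get deprecated but never removed, because the one standard library is shared by every edition. `std::env::home_dir` (deprecated since Rust 1.29 over its surprising Windows behavior) still compiles today:

                ```rust
                fn main() {
                    // Deprecated years ago, but it can never be removed:
                    // removing it would break old code on every edition,
                    // so it only carries a #[deprecated] attribute.
                    #[allow(deprecated)]
                    let home = std::env::home_dir();
                    println!("home_dir() -> {:?}", home);
                }
                ```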



                • #38
                  Originally posted by Dukenukemx View Post
                  It has yet to be seen how OpenGL 2.1 will even perform on Apple Silicon before the end of the year. Having a single person working on this isn't going to get anywhere near the results of other OpenGL drivers on Linux. Without Apple spending money and helping out, this isn't going to be usable for many years. Even then, only usable on M1/M2, because by the time this is usable on the M1/M2, we'll be on the M4 or M5 with enough changes to graphics that it'll take more years to get them working 100%.
                  You again... People like you want an RTX 4000 with DLSS 3.0, whose fake frames save 25% power, which is important for notebooks/laptops.

                  RDNA3 also has FSR3 hardware, similar to the DLSS AI cores, that is 8% faster than the shader version of FSR 2.1 and saves a lot of power by rendering at a lower resolution and outputting at a higher one. FSR 2.1, for example, removed all the ghosting in the sample videos I watched, so FSR 3.0 with native AI hardware to compute it will be good.

                  Apple, with their PowerVR (Imagination Technologies) heritage, is horribly outdated. They chose it 10 years ago because tile-based rendering saved power...

                  I am 100% sure that in future designs like the Apple M4/M5, Apple will no longer use PowerVR GPU tech and will instead follow the many ARM companies like Samsung and license the AMD RDNA3 design.

                  To be honest, Apple was not interested in RDNA1 or RDNA2 because the power consumption was not better than their own design, but RDNA3 is better.

                  Qualcomm, with their old ATI GPU tech, has the same problem: they have GPU SoCs with similar performance and similar power consumption, but the problem is that Qualcomm needs many more transistors for the same result. More transistors clocked at a lower speed, like 650MHz, while the AMD design has fewer transistors clocked at 1400MHz; the performance and power consumption are the same. Saving transistors while also getting ray-tracing hardware acceleration is a big plus.

                  If Nvidia gets an RDNA license, they get FSR 3.0 hardware, which also saves a lot of power.

                  If you are a company like Apple or Samsung or Qualcomm, and you can save a lot of transistors, get higher clock speeds, and get important features, then licensing the AMD tech is the right way. The license would in fact pay for itself by the transistor count alone.

                  And if Apple does this, it would also instantly fix their Linux GPU driver problem, because the open-source driver already exists.

                  Phantom circuit Sequence Reducer Dyslexia



                  • #39
                    Originally posted by Developer12 View Post
                    People have made many claims like yours about this project and time and time again have been proven wrong.
                    He has a very small corridor of interests... his point of view is based on gaming and multicore benchmarks.
                    Apple hardware has no feature aside from tile-based rendering (but all modern GPU designs have this; AMD did it with Vega640) to make gaming efficient on notebooks.
                    That's why there are YouTube videos that "prove" Apple hardware is not good for gaming; of course, they are all biased because they use x86 Windows games emulated on ARM with Rosetta 2...
                    And there are also 6nm AMD notebooks that have similar multicore benchmark results.

                    Outside of this, Apple is much better: Apple wins all single-core benchmarks, and Apple has many more ASICs for different tasks, which saves a lot of energy and is fast at things like video-editing software and so on.

                    Apple needs something like RDNA3/FSR 3.0 or RTX 4000 DLSS 3.0 to make gaming on notebooks efficient, with fake frames to lower power consumption by 25%, and rendering at a lower resolution and upscaling to a higher resolution to save more power.

                    FSR 2.0 vs 2.1 alone eliminated all the ghosting in the videos I saw on YouTube; RDNA3/FSR 3.0 no longer renders it in shaders and instead uses FMA/matrix cores to make it faster and more efficient.

                    Apple could license AMD GPU tech to get this.



                    • #40
                      Originally posted by Dukenukemx View Post
                      If people have made many claims, then those people seem to be correct. The first M1 Mac was released in November 2020, which is nearly 2 years ago. Since then there's barely a functioning distro of Linux for them. 2 years into this and you celebrate a spinning cube. I guarantee you that once OpenGL 2.1 is "finished" you'll probably run Quake or Doom 3 at horrible, buggy frame rates. There's nothing wrong with holding Apple accountable for not doing anything to help out. There's a lot of minor stuff Apple could do that would help move the project along a lot faster.
                      "There's nothing wrong with holding Apple accountable for not doing anything to help out."

                      This has been proven wrong multiple times.

                      Also, you want a gaming laptop; why should Apple focus on being a gaming laptop?

                      If Apple wanted to make a gaming laptop, they would need to license RDNA3: ray-tracing hardware, FSR 3.0 upscaling hardware.

                      "The first M1 Mac was released in November 2020, which is nearly 2 years ago."

                      That is not the relevant metric at all, because it was clear to everyone that the first machine after an ISA change would always be the hardest...

                      The M2 support came online much faster than the M1 support did, and that is the important metric, not the M1 dates.


