Speeding Up The Linux Kernel With Your GPU


  • #21
    Um, wait. Can anybody explain to me, slowly, what I am missing here? Okay, I have a lot of work, so I was too "lazy" to read the links. But doesn't such a thing need drivers to access the hardware? And if this is all done in the kernel... oh, wait. Where are Nvidia's free-as-in-freedom (L)GPL/BSD/MIT drivers? (nouveau doesn't count)
    Stop TCPA, stupid software patents and corrupt politicians!

    • #22
      Originally posted by allquixotic
      Maybe software RAID could somehow be accelerated by the GPU, although you'd need a very large stripe size for it to be worth it. With, say, RAID-5, you might want to be able to calculate parity bits faster. If you factor in GPU setup latency and the GPU can still do that faster than the CPU, that's great -- go for it. But what about the vast majority of people who either don't use RAID, or use hardware RAID that offloads those calculations to dedicated hardware anyway?
      This has already been researched and implemented:
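
      For a sense of what would be offloaded: RAID-5 parity is just a byte-wise XOR across the data blocks of a stripe, so every output byte is independent of the others. A minimal C sketch of the computation (hypothetical function name and buffer layout, not taken from any of the projects above):

      Code:
      #include <stddef.h>
      #include <stdint.h>

      /* RAID-5 parity: P = D0 ^ D1 ^ ... ^ D(n-1), byte by byte.
       * Each parity[i] depends only on column i of the stripe, which is
       * what makes this loop easy to split across thousands of GPU threads. */
      static void raid5_parity(const uint8_t *const *blocks, size_t n_blocks,
                               size_t block_len, uint8_t *parity)
      {
          for (size_t i = 0; i < block_len; i++) {
              uint8_t p = 0;
              for (size_t b = 0; b < n_blocks; b++)
                  p ^= blocks[b][i];
              parity[i] = p;
          }
      }

      Whether a GPU wins is exactly the stripe-size question quoted above: the XOR itself is trivial, so the PCIe transfer and kernel-launch latency have to be amortized over a large enough stripe.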

      • #23
        Silly question:
        Can't GPUs be made to assist the CPU in software rendering (3D, video) via a standard instruction set extension (say 'SSE42' or 'x88')? Wouldn't that allow us to get rid of all the 'graphics driver' mess and have such things HW accelerated independently of the specific hardware?

        • #24
          Originally posted by not.sure
          Silly question:
          Can't GPUs be made to assist the CPU in software rendering (3D, video) via a standard instruction set extension (say 'SSE42' or 'x88')? Wouldn't that allow us to get rid of all the 'graphics driver' mess and have such things HW accelerated independently of the specific hardware?
          Yup, it could in theory, but it would be much slower and less efficient. That is basically what Intel's Larrabee was trying to do.

          • #25
            Originally posted by deanjo
            Yup, it could in theory, but it would be much slower and less efficient. That is basically what Intel's Larrabee was trying to do.
            Lol, no... Larrabee was some sort of CPU design that allowed you to do vector calculations in the CPU. These vector registers could aid in 3D rendering, but that's about it, basically.

            I still have that Intel paper somewhere in my Gmail account in case you don't believe me...

            • #26
              Originally posted by V!NCENT
              Lol, no... Larrabee was some sort of CPU design that allowed you to do vector calculations in the CPU. These vector registers could aid in 3D rendering, but that's about it, basically.

              I still have that Intel paper somewhere in my Gmail account in case you don't believe me...
              You are arguing, but saying the same thing. SSE and the like brought vector-specific registers to x86, much like AltiVec did for the PPC. In fact, AVX is an effort to further improve on those capabilities.
              Last edited by deanjo; 08 May 2011, 05:22 PM.
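
              To make the distinction concrete: "vector registers" means one instruction operating on a whole register of packed data, still entirely on the CPU. A minimal sketch using standard SSE2 intrinsics (illustrative only; AVX widens the same model to 256-bit registers):

              Code:
              #include <emmintrin.h>  /* SSE2 intrinsics */
              #include <stddef.h>
              #include <stdint.h>

              /* XOR two buffers 16 bytes at a time: each _mm_xor_si128 call
               * processes 128 bits in one instruction instead of 16 scalar
               * byte operations. AltiVec's vec_xor on PPC is the same idea. */
              static void xor_sse2(const uint8_t *a, const uint8_t *b,
                                   uint8_t *out, size_t len)
              {
                  size_t i;
                  for (i = 0; i + 16 <= len; i += 16) {
                      __m128i va = _mm_loadu_si128((const __m128i *)(a + i));
                      __m128i vb = _mm_loadu_si128((const __m128i *)(b + i));
                      _mm_storeu_si128((__m128i *)(out + i), _mm_xor_si128(va, vb));
                  }
                  for (; i < len; i++)  /* scalar tail */
                      out[i] = a[i] ^ b[i];
              }

              None of this touches the GPU; it is register-level data parallelism inside the CPU core, which is the capability being discussed here.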

              • #27
                Originally posted by deanjo
                You are arguing, but saying the same thing. SSE and the like brought vector-specific registers to x86, much like AltiVec did for the PPC. In fact, AVX is an effort to further improve on those capabilities.
                Wasn't the thing we were arguing about whether the CPU should offload these calculations to the GPU via a standardised instruction set, rather than replacing them with the CPU's own instruction sets?

                I thought you were saying that Larrabee offloaded them. I meant to say that it does them itself.

                • #28
                  Just to put this into perspective, this project is all about nvidia trying to turn the Linux kernel into their own personal proprietary blob. Drop dead, nvidia!

                  • #29
                    Originally posted by droidhacker
                    Just to put this into perspective, this project is all about nvidia trying to turn the Linux kernel into their own personal proprietary blob. Drop dead, nvidia!
                    No. nVidia needs Linux to manage their hardware. Don't forget that they are working on a CPU that can never match AMD's or Intel's. Linus would never accept that, and nVidia knows this.

                    Linux is key to their hardware adoption in this regard and therefore they can't and won't do that.

                    • #30
                      Originally posted by droidhacker
                      Just to put this into perspective, this project is all about nvidia trying to turn the Linux kernel into their own personal proprietary blob. Drop dead, nvidia!
                      Seeing as the Linux kernel is GPL'ed, I don't see how this could be possible... no conspiracy theories, please...
