One Million Dollars For A Shader-Based LGPLv3 GPU


  • #21
    Originally posted by Kivada View Post
    Even their best-case scenario isn't OpenGL 2.1 capable. Not much use for a shader GPU, then, since all the fun programmable shader stuff is in OpenGL 3+.

    So again, what will be learned that can't be learned from the now-public-domain chip designs and the failed OSS GPU project that preceded this?
    Learning with a project like this is about hardware design, not OpenGL programming. You don't need a special card to learn how to write OpenGL code.



    • #22
      It's a capitalist world...
      If a license-free design can't compete on technology and performance, it will do badly...
      Even with the cost of paying for licenses, the incumbents will come out ahead...

      Linux is a success story because you only need a computer to improve it.
      If building the hardware were required, it would be a disaster.



      • #23
        Originally posted by tarceri View Post
        Learning with a project like this is about hardware design, not OpenGL programming. You don't need a special card to learn how to write OpenGL code.
        Which was my point: what will be learned from this that can't already be learned from GPU designs whose patents have expired? Because that is what this looks like: a very old GPU design implemented on an FPGA.

        And yes, the supported OpenGL version matters, as it gives you an idea of how advanced their hardware design is. So far, all they've shown is that they will be making a GPU design no different from a copy of a very old GPU with expired patents.

        There are already companies making updated i586 CPUs because the patents have expired; there's no reason you couldn't take GPUs of the same era and make clones, which is all this project seems like it's going to do.



        • #24
          Originally posted by Kivada View Post
          Which was my point: what will be learned from this that can't already be learned from GPU designs whose patents have expired? Because that is what this looks like: a very old GPU design implemented on an FPGA.

          And yes, the supported OpenGL version matters, as it gives you an idea of how advanced their hardware design is. So far, all they've shown is that they will be making a GPU design no different from a copy of a very old GPU with expired patents.

          There are already companies making updated i586 CPUs because the patents have expired; there's no reason you couldn't take GPUs of the same era and make clones, which is all this project seems like it's going to do.

          What are you talking about? You can't just clone an old GPU; you still need a copy of the design. Patents also stop you from using certain features until they expire, but copyright still exists, so even if it were possible to just magically clone the GPU, you still couldn't make an exact copy.



          • #25
            Originally posted by Kivada View Post
            There are already companies making updated i586 CPUs because the patents have expired; there's no reason you couldn't take GPUs of the same era and make clones, which is all this project seems like it's going to do.
            The point is that companies normally don't release the HW designs of their GPUs, so there's nothing to clone. That's what makes this project different.



            • #26
              Originally posted by tarceri View Post
              Francis, I just wanted to say that you should probably spend some time dumbing things down a little. For example, you need to explain the concepts more thoroughly for people who are interested in the idea but don't understand how it all works.

              Some things I recommend expanding on:
              - What is an FPGA, and why would people be interested in it?
              - What is Verilog?
              - What is the difference between an ASIC and FPGA?
              - Any other things that are basic to understanding what the project is about.

              Your campaign at the moment seems targeted only at those who already know what these technologies are about, and that limits your potential backers. In my opinion, you need to add a bit more marketing to your campaign. I think you should add a whole new section, "Why back this project?", where you really sell the idea not only to education/developer/graphics enthusiasts but also to people who are interested in open source yet not technical. There are many people out there who are not programmers but are still very interested in supporting open source.

              Finally, I've added your campaign to Reddit: http://www.reddit.com/r/linux/commen...u_kickstarter/
              You should keep an eye on the comments; they will give you a good idea of how to refine your campaign page to answer the common questions people have.

              Good luck
              Thanks. I'm working on addressing these issues. I'm a hardware engineer in the thick of FPGA design daily, so I probably write too much like an engineer.

              I'm also working on solutions for people who play with Raspberry Pi or Arduino boards, to see what is available in FPGA daughter cards (there are some). Being day #1, I've been slammed with questions and comments, but I will try to get to them in the next few days.

              A few things so far:
              1) Thanks to everyone for the support, and I understand the people who aren't on board. My biggest concern with doing this was that it's a rather niche product.
              2) I did honestly try to research the licensing before the release, and I chose the LGPL on the advice of a software engineer I am friends with. I liken the GPU to a linked library, and I would like anyone to be able to use it. Want to pair it with an ARM core? Go right ahead. Want to build a completely open-source SoC? Go ahead. But... and this is my sticking point... if you modify it, you have to make the modifications available. If you leave it alone, you just need to provide the source. I know the GPL is viral, but I thought the LGPL would fit the bill. Please correct me if I am wrong.
              3) We didn't try to do hardware for a few reasons. I don't think we could have a one-size-fits-all approach: the 2D part will fit in a small FPGA, but the 3D part needs more. I think hackers might want a Cyclone V SoC or Xilinx Zynq to play with. I personally would love the time to take the SPARC T1/T2, pair it with this, and see if I could squeeze it into a Stratix V or Virtex 7, but those would cost $10K apiece.

              Thanks for the support, and I understand the non-support (not sure of the best word) from some. I'll try my best to follow up on anything posted here, or you can message me here or via Kickstarter. Whether we make it or not, this is certainly a learning experience.

              Thanks,
              Frank



              • #27
                Originally posted by fbruno View Post
                Thanks. I'm working on addressing these issues. I'm a hardware engineer in the thick of FPGA design daily, so I probably write too much like an engineer.

                I'm also working on solutions for people who play with Raspberry Pi or Arduino boards, to see what is available in FPGA daughter cards (there are some). Being day #1, I've been slammed with questions and comments, but I will try to get to them in the next few days.

                A few things so far:
                1) Thanks to everyone for the support, and I understand the people who aren't on board. My biggest concern with doing this was that it's a rather niche product.
                2) I did honestly try to research the licensing before the release, and I chose the LGPL on the advice of a software engineer I am friends with. I liken the GPU to a linked library, and I would like anyone to be able to use it. Want to pair it with an ARM core? Go right ahead. Want to build a completely open-source SoC? Go ahead. But... and this is my sticking point... if you modify it, you have to make the modifications available. If you leave it alone, you just need to provide the source. I know the GPL is viral, but I thought the LGPL would fit the bill. Please correct me if I am wrong.
                3) We didn't try to do hardware for a few reasons. I don't think we could have a one-size-fits-all approach: the 2D part will fit in a small FPGA, but the 3D part needs more. I think hackers might want a Cyclone V SoC or Xilinx Zynq to play with. I personally would love the time to take the SPARC T1/T2, pair it with this, and see if I could squeeze it into a Stratix V or Virtex 7, but those would cost $10K apiece.

                Thanks for the support, and I understand the non-support (not sure of the best word) from some. I'll try my best to follow up on anything posted here, or you can message me here or via Kickstarter. Whether we make it or not, this is certainly a learning experience.

                Thanks,
                Frank
                You could pair it with a LEON4 SPARC instead; it's much smaller and probably faster... a T1 on an FPGA probably runs like mud. It's slow even as an ASIC: lots of bandwidth, but no single-threaded throughput (I own a T2000 for playing with :-) )... It would also be pretty cool to be able to use this board as a Sun console on the T2000, or on any Sun with PCI. Or maybe even a custom SBus card for my SPARCstation 20 ;-)... apparently you can get SBus connectors on eBay these days.



                • #28
                  Originally posted by Kivada View Post
                  Which was my point: what will be learned from this that can't already be learned from GPU designs whose patents have expired?
                  Why do you think this is about "learning" something? It's about getting the hardware design for a GPU, not learning. Just like adding GL4 support into Mesa won't teach us anything we don't already know from other drivers, but will simply provide an open source implementation for it.

                  I'm not going to contribute, but I do think it's an interesting idea in theory. It reminds me of my college days, when I was playing around with Verilog a bit.

                  Edit: by the way, companies don't put Verilog code in their patent applications. They make them as generic as possible, like any good patent application, so that they can claim as much as possible while giving away as few details as possible. And the patents cover small individual pieces of the hardware, not anything even approaching a fully working implementation.
                  Last edited by smitty3268; 10 October 2013, 01:21 AM.



                  • #29
                    Originally posted by tarceri View Post
                    What are you talking about? You can't just clone an old GPU; you still need a copy of the design. Patents also stop you from using certain features until they expire, but copyright still exists, so even if it were possible to just magically clone the GPU, you still couldn't make an exact copy.
                    The patents on the i586 have expired, and there are now unlicensed Intel-compatible i586 chips on the market. The same can be done for anything once it's out of patent.



                    • #30
                      Originally posted by smitty3268 View Post
                      Why do you think this is about "learning" something? It's about getting the hardware design for a GPU, not learning. Just like adding GL4 support into Mesa won't teach us anything we don't already know from other drivers, but will simply provide an open source implementation for it.
                      Educational value was what a few people were saying it was for; I'm saying the money would be better spent hiring more full-time Gallium3D devs.

                      This GPU, open as it may be, has no potential market, especially when you see the limits of where they say they can take its tech on their Kickstarter page. Even ARM GPUs have complete modern feature sets, unlike this project's high-end goals, which are only 2001-era OpenGL 1.4/DirectX 8.1 features. That's not even enough to run a composited desktop anymore, since those require OpenGL 2.x.


                      TL;DR

                      An open-source GPU using 2001 tech that nobody will buy, or hiring more devs to make better drivers for existing hardware that people are already buying?
                      Last edited by Kivada; 11 October 2013, 05:19 AM.
