
Kickstarter-Based Open-Source GPU Launches


  • #21
    Originally posted by boxie View Post
    I did - the main reason is that, if successful, there will be a lot more people hacking on graphics. How can this be a bad thing for the open source drivers if there are more people around working on them?

    I can think of many little projects that could benefit from a small open source GPU in the DIY community.

    There is also a small chance that this sort of project might show the way (at least in some part) for other hardware companies wanting a better driver stack.

    *always the optimist*
    This would need to be at least OpenGL 3 to even be a fart in the wind; their goals fall far short of that, and as such it has no future.

    And that's me being as optimistic as I can possibly be about the project.



    • #22
      Originally posted by Kivada View Post
      Their best-case scenario is 2001-era OpenGL 1.4-class hardware; you can get that on eBay for $5 TODAY with OSS drivers, in PCI, AGP, and probably even ISA if you dig.

      Unless there's some HUGE untapped market for ancient GPUs and i586 or UltraSPARC CPUs, this project is a waste of time.

      The million would be far better spent on hiring more devs for the GPUs people actually want to buy.
      Originally posted by Kivada View Post
      From the $400,000 goal


      That's OpenGL 1.4, from 2001, so "shader based" apparently means vertex and pixel shaders.
      Originally posted by Kivada View Post
      This would need to be at least OpenGL 3 to even be a fart in the wind; their goals fall far short of that, and as such it has no future.

      And that's me being as optimistic as I can possibly be about the project.
      Please read things properly before spreading FUD.

      Universal Shader

      This is our ultimate stretch goal and requires a complete redesign. It's something we have been wanting to do for years, but didn't have the resources. This would allow us to create a complete open source implementation of a modern day graphics accelerator.

      If we receive more than the above, it will allow us to devote more time and effort to the project and we'll be able to release code sooner.

      This is new design work and our anticipated delivery would be Q2 2015.
      The only thing I would say in your favour is that the campaign page needs a huge overhaul; I can see how you may have missed this if you were just skimming over the goals. For a target of $1 million, they probably should have shelled out a couple of hundred to get some marketing help.



      • #23
        Originally posted by intellivision View Post
        Using the LGPL for hardware seems to be outside the scope of that license.
        Has the FSF approved this sort of use or given any legal advice to that effect?
        How do you define static and dynamic linking with hardware? How do you make sure that a library can be swapped out with one of another version?
        It's not out of scope, and the FSF does approve the use of the (L)GPL for any project where the concept of "source code" is applicable. The FSF defines source code as the preferred form of a work for making modifications to it; in the case of a hardware chip, that would be the Verilog source code.

        I guess this would mean that someone can't take the source, modify it, and build a chip based on it without releasing the changes. The linking part, I guess, would mean that you can use this in conjunction with proprietary hardware, or put proprietary chips on the same PCB, which is probably a necessity anyway.



        • #24
          @Bruno

          Have you contacted the FSF about this? They could give it some free publicity, as it seems fully endorsable under their conditions.



          • #25
            I did email Dr. Stallman about this, but I'm not sure I made a good impression. I mixed up free software vs. open source. I'll try to follow up with him.

            My understanding of the LGPL was that if someone changed the graphics code, they would have to release the changes; if they don't change it, they should be free to use it just like a library linked into their design. Technically, you can compile a block standalone and link it in later, so it should satisfy the LGPL. That was our intention.

            I'm jumping around a bunch of forums and messages, but I'll try to address anything here if you'd like. We wanted to give back to the community, but also to fund development of the Unified shader.

            There is little in the way of professional open source hardware. There is the Sun T1/T2.



            • #26
              Originally posted by fbruno View Post
              I did email Dr. Stallman about this, but I'm not sure I made a good impression. I mixed up free software vs. open source. I'll try to follow up with him.

              My understanding of the LGPL was that if someone changed the graphics code, they would have to release the changes; if they don't change it, they should be free to use it just like a library linked into their design. Technically, you can compile a block standalone and link it in later, so it should satisfy the LGPL. That was our intention.

              I'm jumping around a bunch of forums and messages, but I'll try to address anything here if you'd like. We wanted to give back to the community, but also to fund development of the Unified shader.

              There is little in the way of professional open source hardware. There is the Sun T1/T2.
              You'd be better off emailing Linus Torvalds... seeing as he sort of despises Nvidia ;-)

              As far as Stallman goes I've heard he's a really nice guy unless you touch a pet peeve of his... which it seems you did lol.

              If any of the open-source-supporting companies weren't hypocrites, they'd throw some money at this project... and maybe they will.

              Also, regarding the Sun T1/T2, I'll mention it again as I did before: those are huge, bloated designs... you can in fact buy 100 MHz Leon2FT SPARC processors from China, last I looked, and they have a PCI interface too. I'd have to buy one and make a board for it before I could be certain, but I imagine that would be better/cheaper/faster than a 100 MHz/100 MIPS Sun T1/T2.

              Also, considering that your GPU probably has a 128-bit datapath... isn't that suboptimal for FPGAs, since wide datapaths waste a lot of resources on the FPGA?



              • #27
                Originally posted by cb88 View Post
                You'd be better off emailing Linus Torvalds... seeing as he sort of despises Nvidia ;-)

                As far as Stallman goes I've heard he's a really nice guy unless you touch a pet peeve of his... which it seems you did lol.

                If any of the open-source-supporting companies weren't hypocrites, they'd throw some money at this project... and maybe they will.

                Also, regarding the Sun T1/T2, I'll mention it again as I did before: those are huge, bloated designs... you can in fact buy 100 MHz Leon2FT SPARC processors from China, last I looked, and they have a PCI interface too. I'd have to buy one and make a board for it before I could be certain, but I imagine that would be better/cheaper/faster than a 100 MHz/100 MIPS Sun T1/T2.

                Also, considering that your GPU probably has a 128-bit datapath... isn't that suboptimal for FPGAs, since wide datapaths waste a lot of resources on the FPGA?
                I'll try to contact Linus. I'm sure Stallman is OK, but I waffled a bit on the license, so I'm not sure whether he'll support it as LGPL. I'll try, though.

                True about the LEON.

                A 128-bit datapath isn't that bad in an FPGA. In fact, by the time you get to memory it's often 128 bits or wider, since the hard-IP memory controllers run many times faster than the FPGA fabric can. In 2D, our internal datapath is 128 bits, but it's shallow. In 3D, we work on one pixel per cycle, so the maximum size is 32 bits. We cache the 3D pixels in the 2D datapath until we reach 128 bits, or cross a 128-bit boundary in X, or cross a Y boundary.
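
                Very roughly, that caching works like the sketch below. This is just illustrative C of the buffer-and-flush idea, not our actual Verilog, and every name in it is made up.

                Code:
                /* Toy model of caching one-pixel-per-cycle 3D output into 128-bit words.
                 * Illustrative C only, not the project's Verilog; all names are invented. */
                #include <stdint.h>
                #include <stdio.h>

                #define PIXELS_PER_WORD 4               /* 128 bits / 32 bits per pixel */

                struct coalescer {
                    uint32_t data[PIXELS_PER_WORD];     /* buffered pixel colors */
                    unsigned valid;                     /* one valid bit per buffered pixel */
                    int base_x, y;                      /* which aligned 128-bit slot is held */
                    int holding;                        /* anything buffered at all? */
                };

                /* Stand-in for the write into the 2D datapath / memory. */
                static void flush(struct coalescer *c)
                {
                    if (!c->holding)
                        return;
                    printf("128-bit write at x=%d y=%d, valid mask 0x%x\n",
                           c->base_x, c->y, c->valid);
                    c->valid = 0;
                    c->holding = 0;
                }

                /* One 3D pixel per cycle: buffer it until the word fills up, or until
                 * the pixel crosses into the next 128-bit slot in X, or Y changes. */
                static void write_pixel(struct coalescer *c, int x, int y, uint32_t color)
                {
                    int base_x = x & ~(PIXELS_PER_WORD - 1);

                    if (c->holding && (base_x != c->base_x || y != c->y))
                        flush(c);

                    c->data[x & (PIXELS_PER_WORD - 1)] = color;
                    c->valid |= 1u << (x & (PIXELS_PER_WORD - 1));
                    c->base_x = base_x;
                    c->y = y;
                    c->holding = 1;

                    if (c->valid == (1u << PIXELS_PER_WORD) - 1)
                        flush(c);
                }

                int main(void)
                {
                    struct coalescer c = {0};
                    /* A short horizontal span: pixels 2..6 on line 10, then one on line 11. */
                    for (int x = 2; x <= 6; x++)
                        write_pixel(&c, x, 10, 0xffffffffu);
                    write_pixel(&c, 3, 11, 0xff0000ffu);
                    flush(&c);                          /* drain whatever is left */
                    return 0;
                }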

                In fact, in the FPGA, we have been doing much more than we did in the original ASIC, as internal memory is almost free. We carry a much wider bus of data along with each pixel to maximize performance. (It's not fully implemented yet as we are debugging some issues).
                -Frank



                • #28
                  Over the past 10 years, I have shown people that display could be done BIOS-free (and coined the term "modesetting"), played one of the most central roles in freeing ATI graphics, then went on to provide the graphics bit for booting a consumer motherboard to a full Linux with a working text console (on coreboot), and then moved on to ARM GPUs and blew that wide open too.

                  In this time the Open Graphics Project made big promises and even provided some people with some boards. This, IMHO, was a project with quite a lot of promise, and it would definitely have benefited from things like Kickstarter. It had a pretty honest premise from the get-go, but people were still enraged when it turned out that the main driver of the project had ties to an FPGA manufacturer. My knowledge of this is hazy, but I still find that reaction a bit overblown.

                  Perhaps due to the freeing of ATI, or perhaps because this sort of project really had no future with such expensive FPGAs, the project pretty much died, which perhaps is a shame. It simply could not compete with free Intel and AMD drivers and cheap hardware.

                  Today, though, even though Kickstarter is around, the market has shifted even further. Hardware has gotten a lot cheaper, and open source support has gotten a lot better. We are actually in a position where the last thing we need is more hardware, and some Verilog is absolutely out of the question. We need to work on supporting the hardware we have, where we have already exposed a great deal but still need quite some work. This project seems like the last thing we need.

                  Plus, all of this smells like a dying company trying to make a quick buck on their IP by using the marketing machine that is Kickstarter... There have been too many Kickstarter projects like that already, and I hope that people are getting smarter about it.
