Building Linux With LLVM/Clang Excites The Embedded World


  • #21
    Originally posted by stiiixy View Post
    I think you're missing a critical piece of the logic in your own argument. If AMD hadn't tried out LLVM, they would never have known, and now the whole world knows a little more: at least in an older version, LLVM isn't the be-all and end-all of compilation, and for the time being GCC pretty much is.
    From what I understood, it has been clear from the very beginning that LLVM isn't great at generating VLIW code. It is more or less generic, while VLIWs are awfully custom things with very specific requirements on instruction-stream generation, something LLVM never took into account in its design and implementation. The result is quite predictable: the developers wasted a lot of effort just getting LLVM to work at all, and while LLVM isn't great at code optimization on its own, on VLIW it seems to be downright awful. The old custom code generator STILL performs better, even though nobody has spent nearly as much time working on it.

    To me it looks like a fault in the decision/design phase: they used LLVM because of the buzz around it, failing to understand it was a troublesome path. It's not even LLVM's fault as such. It's VLIW, and VLIW requires a custom approach. LLVM knows nothing about VLIW and doesn't really help deal with things like this. Instead it makes things more complicated (being a generic solution intended to handle a dozen unneeded scenarios), raises the bar for anyone who wants to enter development, requires upstream syncs, etc., etc., and ... at the end of the day it FAILS TO PROVIDE REASONABLE PERFORMANCE (and who needs a 3D driver with terrible performance, huh?). You see, Vadim managed to seriously improve FPS in the custom code generator because it is simple. Had LLVM been in its place, that would have been unlikely, as it requires far more time to get the same results.

    And worst of all: this could have been evaluated quickly. It was fairly predictable. Sure, AMD currently has awful management issues, but this particular stubbornness is just hard to explain. They literally WASTED YEARS without any usable outcome that provides user-visible value. What about the effort-to-results ratio? And why could a fairly inactive dev come along, quickly improve the OLD code, and beat all this LLVM idiocy hollow in terms of performance, without wasting years of hard work or spending months waiting on upstream? It's a drastic demonstration of what happens when someone decides to "improve these damn user-visible results" rather than "use LLVM" (or whatever other fashionable thing you have).

    Let's just make some quote:
    I spent some time last year studying the LLVM infrastructure and R600
    LLVM backend and trying to improve it, but after all I came to the
    conclusion that for me it might be easier to implement all that I wanted
    in the custom backend.
    Or, for those interested, you can read the whole thread at http://lists.freedesktop.org/archive...ry/034547.html along with all the replies. It's a really striking story of how LLVM fetishism can make development suboptimal and fruitless. I think it will be yet another shameful page in AMD engineering: they had a chance and missed it for incredibly silly reasons.

    P.S. Ah, they mumble that LLVM is needed for OpenCL. But it looks like they fail to understand that nobody needs a SLOW OpenCL implementation. It's just pointless. Either you roll out an OPTIMIZED implementation, or you waste the hardware's potential and lose the market.
    Last edited by 0xBADCODE; 10 March 2013, 05:46 PM.

    Comment


    • #22
      Originally posted by LightBit View Post
      But still perform worse than AMD open source and Nouveau, because their GPUs suck.
      They manage to do an amazing job given that their hardware is crap. AMD, on the other hand, has very good hardware and ... lousy drivers. The proprietary ones are just a bundle of trouble; the open-source ones are slow. And you see, they managed to literally waste years working on drivers without real speed improvements, until some outside guy came in and, very, very quickly, pushed FPS in the Unigine demo to levels the LLVM backend can't even dream of so far. An absolutely impressive demonstration of how wildly the effort-to-results ratio can vary depending on people's goals and the quality of management and judgment.
      Last edited by 0xBADCODE; 10 March 2013, 06:03 PM.

      Comment


      • #23
        Originally posted by erendorn View Post
        LOOOL so much irony in this post

        I thought that GPL was about being "viral", and forcing "freedom" onto people?
        Technically speaking the GPL IS viral and IS a bit like a cancer: one tiny little library that does something cool can force the GPL onto other projects if they use it. And once THAT project is under the GPL, any of ITS libraries fall under the GPL too, and it just keeps going, UNLESS those libraries were already under a different license that is GPL-compatible.

        The reality is that the BSD-style license ("Here's code, I don't care what you do with it") is the MOST free license available, because it has zero restrictions, conditions, or strings attached. But it does carry the danger of a company coming along, taking the code, making it better, and shipping a closed-source program using that code without contributing anything back.

        And from that concern came the GPL: "Here's code, do whatever you want with it, modify the crap out of it for all I care, but if you distribute a program to other people then you have to provide the source so that others can benefit." And that distribute part is key: if a company takes a GPL library and modifies it but keeps the modifications internal, i.e. never shows them to the outside world and keeps them within the company, then it doesn't have to release the source.
        All opinions are my own not those of my employer if you know who they are.

        Comment


        • #24
          Originally posted by Ericg View Post
          Technically speaking the GPL IS viral and IS a bit like a cancer: one tiny little library that does something cool can force the GPL onto other projects if they use it.
          Perfectly fair by my taste: if you want to use someone's shared work, share your own work as well. Otherwise you're clearly a parasite, sir: you want to consume without giving anything back. So yes, it's a viral cancer, one that either kills nasty parasites or converts them into allies. Quite a good cure for those who don't want hundreds of parasites on them. You see, while the BSDs are showing us what EPIC FAIL looks like, GPLed Linux has managed to grow strong. I can only assume that what is bad for parasites is perfectly fine for COOPERATIVE people. And why should anyone bother about the convenience of parasites? Sure, there is the option of throwing some code out and not caring whether anyone using it commits anything back. However, that resembles dumping toxic waste once the original owner no longer needs the hazardous substance. Most notably, the Apache Software Foundation has become the most common place where proprietary corporations dump their toxic waste^W^W ahem, ex-proprietary projects they no longer need. But the most ironic part is that even proprietary corporations prefer Linux these days and utterly do not care about the BSDs' fate, because Linux allows them to earn $$$ and the BSDs are troublesome at that. It's very amusing to watch one ex-ally give another the boot just because business values money$$$ and nothing else.

          Comment


          • #25
            I wasn't saying that the BSD license was better, don't misunderstand me. Companies have very legitimate concerns about the GPL: if a programmer looks at GPL code, learns something new from it, and then uses that new idea (but not the exact code itself) in the company's product... does that fall under the GPL's "derived work" clause? Maybe yes, maybe no; I don't think it has ever been contested in court.

            Overall, I think most companies that contribute back to BSD projects do so either out of goodwill, quid pro quo, or maintenance burden. When you have a lot of out-of-tree patches, it becomes a big deal to maintain them all on your own; it's easier to push them upstream and then just let upstream maintain them for you.

            For the most part I think the best idea is to default your license to the LGPL (a nice middle ground: the parts they took from YOU they have to keep open, including changes, but their own custom code can be whatever they want).

            Comment


            • #26
              Originally posted by 0xBADCODE View Post
              You see, while the BSDs are showing us what EPIC FAIL looks like, GPLed Linux has managed to grow strong.
              Really? Where would Linux be without BSD- (or the somewhat similar MIT-) licensed software, like Xorg (or, in the future, Wayland) and much more?
              Don't be a hypocrite: remove that software from your Linux system and then tell us again how strong your GPLed Linux is.

              Comment


              • #27
                Originally posted by 0xBADCODE View Post
                Perfectly fair by my taste: if you want to use someone's shared work, share your own work as well. Otherwise you're clearly a parasite
                Perhaps I want to share my code under more liberal terms, or under another copyleft license like the CDDL or MPL 2.0.
                The problem with the GPL is that it's 'the GPL way' or no way at all, at the expense of other copyleft licenses.

                Also, your aside about BSD systems and the Apache Foundation is unwarranted and unrelated.
                However, if you do want something to think about: Apache OpenOffice has gotten 40 million downloads of version 3.4 to date, not bad for 'toxic waste'

                Comment


                • #28
                  Originally posted by dee. View Post
                  Nothing personal - I would bash any other patent troll just the same.
                  If you grasped the concept of a patent troll, and actually got off your own ass and patented something people think is worth copying, I doubt you'd utter such nonsense.

                  Comment


                  • #29
                    Originally posted by intellivision View Post
                    Also, your aside about BSD systems and the Apache Foundation is unwarranted and unrelated.
                    Unwarranted and unrelated, maybe, but I have been noticing a trend: other than Apache itself, the Apache Foundation does seem to be getting a lot of "Well, it's a dead project, but here's some code" donations. Harmony comes to mind.

                    Comment


                    • #30
                      Originally posted by 0xBADCODE View Post
                      They manage to do an amazing job given that their hardware is crap. AMD, on the other hand, has very good hardware and ... lousy drivers. The proprietary ones are just a bundle of trouble; the open-source ones are slow. And you see, they managed to literally waste years working on drivers without real speed improvements, until some outside guy came in and, very, very quickly, pushed FPS in the Unigine demo to levels the LLVM backend can't even dream of so far. An absolutely impressive demonstration of how wildly the effort-to-results ratio can vary depending on people's goals and the quality of management and judgment.
                      It is harder to write a good driver for AMD and Nvidia GPUs because they are more complex. AMD and Nvidia also release a new series of GPUs every year, so the developers have a lot more work. The Nouveau devs are even doing it for free.

                      Comment
