NVIDIA Developer Talks Openly About Linux Support


  • Originally posted by myxal View Post
    *chuckle*
    *tags the post "yodawg"*
    Lolwhut?

    I should get a patent on patent trolling and then troll every patent troll with my patent troll patent, n stuff yo

    Aiiiiiiight, fo' shizzle my dizzle!

    I got dem transistor in my silicon, yo. Doin' threads n shit!

    2009, yeah, uhu... latawr dawg!


    Alright STOP! HAMMERTIME!
    Last edited by V!NCENT; 24 October 2009, 05:07 AM.



    • For me, Nvidia died the day AMD bought ATI. The open-source strategy was then only a matter of time, and it happened fast.

      Saying Nouveau would be great while not giving the developers any documentation is marketing bullshit; they block it as much as they can. The only way to be less cooperative would be to sue them, but 1. they likely would not win and 2. the marketing damage would be extremely big. I hope nobody develops any big stuff for Nouveau. Good open-source support for Nvidia hardware built on a massive reverse-engineering effort would be silly, because it takes pressure off the company. With the same or less work you could turn the AMD drivers, which have docs, into an excellent driver with good 3D speed (the other stuff has worked nearly perfectly for a few weeks now anyway, except for VDPAU-like features).

      Nvidia has hard work to do to avoid dying in the next few years. They are under heavy attack from better and cheaper graphics cards from AMD. If CUDA doesn't succeed (and it won't) and they don't get a big foothold in GPGPU computing, they will at the very least shrink massively, or go bankrupt.

      And in this situation, the missing open-source support and the lost sales that come with it could be the drop that makes the barrel overflow, or at least speed up the process (I don't know if that adage is common internationally; it's a German saying).

      Nvidia isn't in a strong position anymore, so let's ignore them (don't improve Nouveau). The open-source community has the stronger position, so just wait until they release docs/source or until they die.
      Last edited by blackiwid; 26 October 2009, 09:55 AM.



      • Damn, it's obvious this site has a bias towards AMD/ATI, but reading these comments... I'll probably get flamed to atoms for saying this (or, like before, get accused of working for Nvidia), but despite the wonderful step ATI has taken, Nvidia really does have solid Linux support for workstation OpenGL. The most important use of a GPU is NOT Compiz/Q3-engine.

        I personally have great respect for Bridgeman (representing OSS ATI) and I already know my next system will be AMD+ATI, but this is for my home, so an OpenGL crash won't jeopardize a deadline, and I'll probably just buy an Nvidia card if it gets to be too much. ATI would want me to buy a FirePro, but that is simply stupid. Talk about artificial limitation...

        So even though I would love to use pure FOSS right down to the drivers, until I can use modern OpenGL (without the artificial "desktop/workstation" BS), the prospect of open-source ATI goodness with, say, a 4770 looks 2-3 years away, and Nvidia, no matter how boringly corporate and closed-source they are, lets me actually use those billions of transistors in the meanwhile.



        • Originally posted by blackiwid View Post
          And in this situation, the missing open-source support and the lost sales that come with it could be the drop that makes the barrel overflow, or at least speed up the process (I don't know if that adage is common internationally; it's a German saying)
          It's a Dutch saying too, "the last drop that causes the bucket to overflow", but it's used to indicate that tension has built up and one last tiny incident makes the situation explode.

          Nvidia isn't in a strong position anymore, so let's ignore them (don't improve Nouveau). The open-source community has the stronger position, so just wait until they release docs/source or until they die.
          That's questionable, because they have a nice deal with Apple, and they could always be bought by either Apple or Intel. But I too am starting to recognise nvidia as the 3dfx of today. I don't know if it's just me looking through a magnifying glass at FLOSS and ray tracing too much, which could be a simple case of 'peer review group doesn't equal the rest of the world', but oh well...

          For all you know, nVidia could be sitting on some major kick-ass new GPU blueprints that are almost ready to be released, and it's the GeForce 6800 all over again... who knows?

          Originally posted by numasan View Post
          Damn, it's obvious this site has a bias towards AMD/ATI, but reading these comments... I'll probably get flamed to atoms for saying this (or, like before, get accused of working for Nvidia), but despite the wonderful step ATI has taken, Nvidia really does have solid Linux support for workstation OpenGL.
          I think you are mixing up the wrong things here and jumping to conclusions. This website is geared towards Linux, and Linux users like FLOSS, so it's only natural that people are +1 towards ATI and -1 towards nVidia now that nVidia has come out publicly saying they will not jump on the FLOSS bandwagon, an endlessly long time after replying that they were considering it. That is naturally a nail in the coffin for them. Of course they rule with their long-standing driver support for Linux, but it's proprietary... You are not going to be flamed by any sane person though, because nVidia != Microsoft.
          Last edited by V!NCENT; 26 October 2009, 11:30 AM.



          • Originally posted by V!NCENT View Post
            That's questionable, because they have a nice deal with Apple
            Today's rumors report that the next MacBook generation should feature AMD graphics cards.

            But I'm sure nVidia can survive this loss (maybe the generation after that will feature nV again)



            • Originally posted by TeoLinuX View Post
              Today's rumors report that the next MacBook generation should feature AMD graphics cards.

              But I'm sure nVidia can survive this loss (maybe the generation after that will feature nV again)
              It was Steve Jobs who said: "Microsoft doesn't have to lose in order for Apple to win." Translated: nVidia isn't going to be unsuccessful if Apple also makes a deal with AMD to ship ATI GPUs.



              • Originally posted by V!NCENT View Post
                It was Steve Jobs who said: "Microsoft doesn't have to lose in order for Apple to win." Translated: nVidia isn't going to be unsuccessful if Apple also makes a deal with AMD to ship ATI GPUs.
                Perhaps. But Nvidia has an increasingly hard battle to fight: Intel owns the low-end 'I don't game' chipset market, AMD owns the 'I play old games occasionally' chipset market, there are no more Intel chipset sales (though I've no idea whether they ever made much money from chipsets, because the only people I know who had Nvidia chipsets were always complaining about them), and their discrete GPUs must cost more to build than ATI's; last time I looked at benchmarks, ATI seemed to get substantially higher shader performance per transistor.

                I doubt they'll go bust any time soon, but I think they're in for a few lean years before they find a new niche. I'll be buying an Nvidia card for my next Windows PC, but only because my video editing software doesn't support anything else. The only Nvidia chip I'm likely to buy for Linux any time soon is an Ion so I can play HD files in MythTV.



                • Originally posted by movieman View Post
                  Perhaps. But Nvidia has an increasingly hard battle to fight: Intel owns the low-end 'I don't game' chipset market [...]
                  But what if Intel has an even harder battle to fight for the 'I do game' market? If nVidia's stock price were to drop like crazy, it would be wise, in my opinion, for Intel, who is sitting on a shitload of money, to buy out nVidia just before its death.

                  What are your thoughts on this? You have said you worked in the graphics industry for 10 years, so you were probably involved with deals that did not involve "Can I overclock my video card if I buy that cooler? A BIOS? I am 12 years old, what is this?"



                  • Also, you're forgetting the Zune deal with Tegra. I guess Tegra will power iPhones soon too.



                    • Originally posted by V!NCENT View Post
                      But what if Intel has an even harder battle to fight for the 'I do game' market?
                      Intel are pushing Larrabee for the high-end graphics market, so they don't really need Nvidia's hardware (or, at least, they presumably believe that Larrabee will be better/cheaper/more profitable). If Nvidia were about to go bust I suspect they'd wait and pick up the laid-off engineers cheap for Intel projects; didn't they hire many of the laid-off 3Dlabs engineers for Larrabee? Must be cheaper than buying the company, and wouldn't cause any problems with anti-trust laws... plus in the company buy-outs I've been through most of the other company's hardware was abandoned and many of their engineers left, so you're probably better off just hiring the people who want to work for you in any case.

                      That said, Nvidia's patent stash would probably be worth a few bucks if Intel don't already have a generic cross-licensing deal with Nvidia.

