NVIDIA To Begin Publishing Open GPU Documentation

  • #46
    Originally posted by Yorgos View Post
    the first reason is hidden inside the 3.11 kernel
    Dynamic power management for Radeon?

    Last time I tried Nouveau (1 year ago), it halved the battery life of my laptop.
    On the plus side, VGA output worked right out of the box (and I could configure desktop sharing through KDE's settings), something that has NEVER worked with the blob a life-saver on presentations!



    • #47
Leaving aside the details, I think this is quite significant. A lot of big players are very unhappy with Microsoft, which uses its monopoly power to straitjacket any opposition. But maybe things are starting to break: a lot of Microsoft's so-called partners are seeing the chance to break out.



      • #48
        Originally posted by MartinN View Post
        I used to build my own gaming rigs... in retrospect, this is a waste of time - unless you're in it for the hobbyist/enthusiastic aspect, in the end it is always more expensive than buying an xbox, nintendo, PS, whatever..... plus, there are titles that are game console-specific that are the most interesting games to play (Zelda, for instance)... will likely never see a port to anything other than Nintendo. Your gaming rig will also be obsolete in about 6 months, while consoles last several years before they are updated.

        If you want to play games - buy a console. For work, buy a laptop with integrated GPUs (such as Intel's HD/Iris). Let us stop this dogma that one computer has to do it all and do it best - it shouldn't - unless you're willing to fork over a premium, and most people aren't.
        Games look way better on a PC and are more open (eg. custom maps/mods) so you can't really say that... but we're getting off-topic here.



        • #49
Ooooh... Nvidia is feeling the heat? Good for Nvidia! But what I wonder is, what hasn't been released that developers would like/need to have?



          • #50
            Originally posted by CrvenaZvezda View Post
But what I wonder is, what hasn't been released that developers would like/need to have?
Pretty much everything... so far, anyway.



              • #52
                Originally posted by gamerk2 View Post
                Disagree. Due to power draw, APU's are never going to advance past entry-level performance. Desktops and high end systems are still going to need dedicated GPU's as a result.
                The average desktop user spends their day using office apps, surfing the web, chatting on Facebook, watching cat videos on YouTube, etc. Desktops do not need GPU performance that exceeds the PS4 and Xbox One (which most people would not define as "entry level"), or even the 13” Macbook Retina.

I remember exactly the same arguments about every bit of hardware that has been integrated. How many people still buy discrete sound cards? Discrete Ethernet cards? Yes, they still exist, but they are a very small minority. The integrated part doesn't need to be better, it just needs to be cheaper and "good enough".

                Edit: Seems the new 21” Haswell iMac has an integrated GPU - "In terms of processing power the base $1,299 USD 21.5-inch model trades last generation's NVIDIA GeForce GT 640M for the on-die Iris Pro GPU inside the Intel Haswell 2.7 GHz quad-core Core i5-4570R processor." http://arstechnica.com/apple/2013/09...-and-802-11ac/ Surely an iMac is powerful enough for the typical desktop user.
                Last edited by chrisb; 09-24-2013, 07:38 PM.



                • #53
                  Originally posted by chrisb View Post
                  My guess: NVIDIA Loses Huge GPU Order Due To Linux Blob combined with NY Times: NSA Should Be Barred From Requiring Companies To Introduce Surveillance Backdoors. Given the revelations about the NSA being able to force US companies to include backdoors in their software, foreign governments would have to be crazy to keep on using closed source binaries from US vendors. There is no way China can trust US software now. Losing an order for over 10 million GPUs is pretty bad, losing every future foreign government order would be a disaster.

                  The other thing threatening Nvidia is APUs. Intel has an onboard GPU. AMD has an onboard GPU. Integration can result in lower power and potentially higher performance through unified memory. Apple already moved the Macbook 11" and 13" to integrated GPU, both Sony and MS next-gen consoles are integrated GPU... The writing is on the wall for discrete cards in consumer systems. This only leaves Nvidia with the possibility of licensing their core for ARM mobile, or selling high performance cards to select customers. They need to adapt fast because people aren't going to pay the discrete card premium once Sony+MS show the world that cheaper integrated graphics is good enough for the top videogames consoles.
I question the validity of the article, though. I know it's the NYT, but still... surveillance backdoors are nothing new. I worry that such backdoors make all security measures on a machine useless to anyone with the know-how. So really, are they doing themselves any favours? They're just allowing everyone to read every machine. How is the world safer when every machine is made vulnerable just because they want access?

Onboard APUs and GPUs are going to force Nvidia to make their own CPUs. Nvidia has always been about big performance, so maybe this is the start of more open source to come? I honestly think they are doing this in prep for Valve's new Steambox.



                  • #54
                    My head just exploded



                    • #55
There was some speculation a while ago about AMD drivers, that maybe they want to stop making Catalyst for GNU/Linux. Maybe it's the same thing here?

We're in an important moment of GNU/Linux history: X is going out the door and there are two contenders waiting to take its place. For GPU makers that means more work to support the new stuff, probably more work than it took for X. So maybe both AMD and NVIDIA are looking into stopping their proprietary drivers for GNU/Linux and instead throwing out the docs, plus keeping one or two guys to support the FLOSS developers.

It would be cheaper for the companies (fewer people employed, and the community does all the work), and the FLOSS community would be happy because the drivers would be free. Everybody wins.

                      (these are just my thoughts)



                      • #56
                        Well I'll be ****ed...



                        • #57
                          Originally posted by Calinou View Post
                          Games look way better on a PC and are more open (eg. custom maps/mods) so you can't really say that... but we're getting off-topic here.
How much more do people pay for a better-looking game? Are better-looking games worth the investment?



                          • #58
                            There's room for improvement

                            My experience with NVIDIA supporting their own f......ing hardware on Linux is "We're working on it".

                            https://devtalk.nvidia.com/default/t...rd-xv-support/

                            They talked the talk, let's see if they walk the walk. There's definitely room for improvement.



                            • #59
                              Originally posted by gamerk2 View Post
                              Disagree. Due to power draw, APU's are never going to advance past entry-level performance. Desktops and high end systems are still going to need dedicated GPU's as a result.
                              LOL

                              You couldn't be more wrong, seriously...

With each die shrink, the GPU portion of the APU die will get bigger and bigger. I anticipate a share like 60-70% a few years ahead.

Plus, there is no real reason why TDPs for consumer APUs have to stay in the range of 100W... why not 150 or 200 watts? If that means no discrete GPU, why not?

Discrete GPUs will go the way of discrete FPUs. And unless NVIDIA's Project Denver works out, it is dead...



                              • #60
                                http://lists.freedesktop.org/archive...er/014495.html

IMHO that's a rather honest post concerning the motivation behind
the plans to release some functionality as binary-only firmware.

Promising (the honesty); I like that.
