NVIDIA Says It Has No Plans To Support Wayland


  • #71
    Originally posted by pingufunkybeat View Post
    Who says that?

    The open source driver should run almost every native 3d Linux app out there. For the rest, there are bug reports.
    http://xorg.freedesktop.org/wiki/RadeonFeature - See Evergreen
    http://xorg.freedesktop.org/wiki/RadeonProgram - R800 is missing entirely, and many R700 entries are still marked GARBAGE

    Comment


    • #72
      Originally posted by Chrome View Post
      http://xorg.freedesktop.org/wiki/RadeonFeature - See Evergreen
      http://xorg.freedesktop.org/wiki/RadeonProgram - R800 is missing entirely, and many R700 entries are still marked GARBAGE
      Have you looked at the Mesa version those tests were done with? Most of the GARBAGE entries are from Mesa 7.7, which is ancient.

      Comment


      • #73
        If Mesa 7.9 is the first non-garbage driver for R700, then it arrived two years late. R700 was released in 2008.

        Comment


        • #74
          Good thing we live in the present, then ; )

          Comment


          • #75
            Originally posted by Chrome View Post
            It shows that most things work.

            Only the advanced functions like MSAA and Crossfire do not work.

            http://xorg.freedesktop.org/wiki/RadeonProgram - R800 is missing entirely, and many R700 entries are still marked GARBAGE
            All the GARBAGE entries for native games are related to S3TC texture compression, which is unfortunately patented.

            There is a workaround for r500 and lower, but it hasn't been figured out for r600+, unfortunately.

            And that's exactly four native games which don't work.

            Comment


            • #76
              Originally posted by VinzC View Post
              It is interesting how "We have no plans to support Wayland." gets translated into "NVIDIA is claiming they will not support Wayland."

              :rofl:

              Ah, human factor!
              QFT

              NVIDIA has said many times that they don't want to spend time writing drivers/software for rapidly changing software/APIs. Prove to them that Wayland is not an experiment but is actually the present of Linux graphics, and they will support it.
              It's impossible to know right now whether Wayland will ever become the standard.

              Comment


              • #77
                Originally posted by TemplarGR View Post
                You must have been using Linux for the duration of this era and not following hardware review sites, because the reality vastly differs from that... I have gamed on many configurations personally, and ATI won hands down when compared to similar NVIDIA offerings.
                Guess you missed everything from the GF 6+ series up to the GTX 2x0 series.

                Reality check: Game developers want market share for a feature to reach critical mass before supporting it. If NVIDIA, due to lack of engineering power, can't support it in time, they will hold back; sometimes they will even be bribed (Assassin's Creed, anyone?) to hold back support...
                Reality check: Lol, lack of engineering power? Come on now, give your head a shake. Engineering power has never been a concern at Nvidia. It's all about putting your development resources where they are needed and will benefit most. It's no secret that Nvidia at the time was concentrating on their next-gen architecture, and DX 10.1 didn't bring anything earth-shattering to the table with DX11 just around the corner. As far as bribing goes, get your facts straight. From the developer's mouth:

                TR: What specific factors led to DX10.1 support's removal in patch 1?

                Beauchemin: Our DX10.1 implementation was not properly done and we didn't want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.
                Nvidia spokesman Ken Brown told us unequivocally that "no money changed hands" as a result of Ubisoft joining Nvidia's "The Way It's Meant To Be Played" program, because that program is entirely a co-marketing arrangement.
                Ubisoft spokesman Michael Beadle confirmed to us that Ubisoft's participation in TWIMTBP was strictly a co-marketing deal, and that Nvidia did not pay Ubisoft any money as a result of the deal. He seconded Brown's statement that Nvidia did not influence the decision to remove DirectX 10.1 support from the first patch for Assassin's Creed. Beadle said the decision was made strictly by the game's development team.
                http://techreport.com/discussions.x/14707

                Another reason is that most games today are console ports, so their level is at Dx9 anyway...
                Thanks for proving that DX 10.1 support was hardly a "must have" feature.

                Not to this degree.
                Chipsets, GPUs, processors. AMD does it a lot as well, as does Intel.

                You must be joking... Let's see if AMD pays game devs to cut features from the competition, a la Batman: Arkham Asylum, then we can compare them...
                Get your facts straight. The Batman AA issue was that Nvidia sent down their engineers to help out with the anti-aliasing code. It was Nvidia's IP. AMD had every opportunity to do the same themselves, or to license the AA code from Nvidia. You can't cut a feature that AMD never implemented during development.

                And from my findings, NVIDIA's GPUs are bad. I have witnessed 4 NVIDIA GPUs die, 3 of them before completing their second year. None was overclocked. The only problem I ever faced on ATI GPUs was with the game TrackMania United; it just crashed repeatedly on an ATI 3850 (and frustrated me because I couldn't explain to them that their computer was fine, the damn game is designed for NVIDIA cards...). My own HD3870 has served me well for 3 years; I even played Crysis at 1680x1050 on very high (I know I sometimes got lag, but it was playable).
                And my findings are completely different, but on par with what the local computer stores find here. Having owned several generations of Nvidia cards starting with the TNT and ranging through pretty much every series (TNT, TNT2, GF2, GF2 MX, GF 4200 and 4600s, GF 6600GT, GF 7600GT, GF 8200, GF 8800GTs, GF GTX 275s...), every single one of them, with the exception of the GF2 MX (which I modded into a Quadro), still works absolutely fine. I also have AMD Radeons (7200, X1950, HD 2900XT, HD 3800), and of all of those only two still work (the 7000 and the 2900), and those have not been pushed nearly as hard as the Nvidia cards.

                I could scientifically prove you are trolling based on this comment. For example, one could wonder: if you are not interested in opensource, why use Linux/GNU in the first place?

                Another question could be: if you are not interested in Linux because of licensing but because of its features, what are those features? Gaming? OpenCL maybe, or CUDA? Video acceleration? Photoshop? 3D design? What?
                Yes, it is all about features for me:

                -freedom of choice
                -great development packages
                -security through obscurity (I don't like running an OS with a big target painted on it)
                -great customized configuration options (configuration, not ricer "OMG adding --holyshit-this-is-an-awesome compiler flags")
                -centralized software selection (aka package repos)
                -the KDE desktop
                -being able to spin my own distro
                -efficient use of hardware (like not having to buy another edition for multiprocessor support), better task handling, stability, not having to reboot for a bloody update
                -support for things like x86-64 and proper memory management before Windows had it
                -modularity
                -much, much more that has absolutely no bearing on whether the OS is opensource or not, as none of it requires diving into someone else's source.

                In other words, everything an OS is supposed to do: operate the system. It works for me, not me working for the OS. If opensource were such a key factor for most people, the most popular distros would be source-based, and that is far from the reality. As far as what software I use, it does not matter whether it is opensource or not; it has to do the tasks that I want, nothing more, nothing less. Yes, I do use CUDA, OpenCL, Pro/E, Maya, video acceleration, Nero 4 Linux. I also develop my own applications for things like video media manipulation. Being able to utilize VDPAU, for example, in my h264 editor has been a godsend. As has being able to spoof my HTPCs so that the telco sees them as IPTV boxes, ridding myself of the shitty, featureless UI that they provide. I can go on and on, and virtually none of the reasons why I use Linux relate to it being opensource. I'd be just as happy if I had to go onto a website to configure a kernel and out popped a kernel of my configuration for download.

                People who need to buy a discrete consumer GPU want to game with it 99% of the time (in this era of decent integrated GPUs). If they were professionals, they would buy workstation GPUs, not consumer ones, unless they were really poor.
                Many times the consumer cards are good enough for what the end user needs. Maya and Pro/E, for example, work fine on a consumer Nvidia card with the blob, but trying to use them with any open source driver is next to impossible.

                So, people that game wouldn't use Linux opensource drivers (or ATI blobs) anyway, unless they were masochists.
                Fixed that for ya.

                One should never forget a golden rule: always try to think outside the box, just don't forget what the box is for. You are arguing about a binary driver for Linux while forgetting what Linux is for...
                Actually you are forgetting what linux is for.

                In 1991, in Helsinki, Linus Torvalds began a project that later became the Linux kernel. It was initially a terminal emulator, which Torvalds used to access the large UNIX servers of the university. He wrote the program specifically for the hardware he was using and independent of an operating system because he wanted to use the functions of his new PC with an 80386 processor.
                Wow, Linus wanted to fully utilize his hardware, imagine that. It's the same reason why people run nvidia blobs.

                Performance will improve rapidly from now on, I believe. It took much time to build the basic foundation, but now end-user improvements should be more noticeable.
                While it may improve, a generic one-size-fits-all driver will never come close to what an optimized solution can do. As far as a big improvement in performance goes, I'd have to see it to believe it first. So far nothing is even within reaching distance of what the blobs offer in terms of performance and features. I didn't buy a modern performance video card for Linux use with open drivers just to get slapped around by a three-generation-old card with blobs, so I could delude myself into a warm fuzzy feeling that somehow the opensource solution was better.

                Modern integrated GPUs should be able to handle anything opensource thrown at them: compositing effects, opensource games, etc.

                Only Linux workstation professionals really need discrete GPUs.
                For your uses it sounds like an iPad would fit all your needs; it does all the basics as well. Hell, even a web-enabled TV would probably suffice for your target audience.

                No you don't. This is not simply a radical change. It is a change that will require:

                1) Either changing the NVIDIA blob structure entirely so it sits on top of Nouveau, or replacing Nouveau's userland entirely with its own. This will certainly hurt performance AND features.

                2) Providing proprietary KMS functionality inside the Linux kernel. This has to be opensource to prevent problems, or it will require replacing the memory manager with NVIDIA's, which will make Catalyst look like a dream come true...
                Neither of these items is even close to the complexity of building a new driver for a new architecture. Nor does anything require it to be opensource to function properly. If anything, history with the blobs has shown us that Nvidia has been able to handle any curveball the opensource community throws at it in an effort to force them to their will.

                This is not something as simple as CUDA, especially since the Linux blob just "borrowed" it from the Windows version (people give CUDA much more credit than it deserves; it is not exotic, just a compiler for a different architecture). This will require major Linux-centered work.
                I can only chuckle at the insanity that you post. Being able to "borrow" from the Windows driver is a harder accomplishment than you give it credit for, but it nevertheless shows that Nvidia has a great codebase for it to be so portable to other platforms.
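
                For what it's worth, "just a compiler for a different architecture" looks roughly like this in practice: CUDA source is ordinary C/C++ plus a couple of extensions (__global__, the <<<...>>> launch syntax) compiled by nvcc. The snippet below is only a minimal illustrative sketch with made-up names, not code from any real driver or project:

                // Minimal CUDA sketch (hypothetical example): scale an array on the GPU.
                #include <cstdio>
                #include <cuda_runtime.h>

                __global__ void scale(float *data, float factor, int n)
                {
                    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
                    if (i < n)
                        data[i] *= factor;
                }

                int main()
                {
                    const int n = 1024;
                    float host[n];
                    for (int i = 0; i < n; ++i) host[i] = 1.0f;

                    float *dev = 0;
                    cudaMalloc((void **)&dev, n * sizeof(float));             // allocate on the GPU
                    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

                    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);            // launch the kernel

                    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
                    cudaFree(dev);

                    printf("host[0] = %f\n", host[0]);                        // expect 2.000000
                    return 0;
                }

                The same source builds with nvcc on both Windows and Linux; what differs per platform is the driver stack underneath it, which is exactly the portability being argued about here.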

                Seriously, do you own NVIDIA stock? If you do not, and are interested in knowing thyself, visit this:

                http://encyclopediadramatica.com/Fanboy
                Actually I own 5000 shares of AMD stock, no Nvidia stock. I just hope there are more guys like you, and like the ones in your link, to make up for the losses incurred so that I can at least get my initial investment back. Thank God for my Apple shares, I suppose.

                Comment


                • #78
                  Originally posted by BlackStar View Post
                  AMD's upcoming Bobcat processors should be fast enough for HD (plus they have hardware decode).
                  Hardware decode has been present in ATI hardware for a long time now, but it is still in shambles on Linux. What makes you think Bobcat is going to enjoy a better fate?

                  Comment


                  • #79
                    Originally posted by Kano View Post
                    @TemplarGR

                    Using the latest Flash 10.2 beta, YouTube HD works when the PC is fast enough (as that does not use a very high bitrate, the PC can be a bit slower than it would need to be for Blu-ray content). But you really forget about the many Atom-based netbooks and low-budget PCs with slow CPUs.
                    As BlackStar told you, this is a strange use case you mention here. Current netbooks shouldn't be used for viewing HD content. Their screens and storage space are not meant for it.

                    Of course, it could be convenient if now and then we could watch an HD video on our netbooks (I own one too), for example if we downloaded a movie in HD and didn't want to bother resampling it first. But this is a corner case at the moment. Not even on Windows can current netbooks handle HD content...

                    Comment


                    • #80
                      @deanjo

                      We have a real fanboy here... What a shock. If I had the time to waste I would answer your latest post in detail, but I see no point in it. I have already said what needed to be said.

                      Some quick things though:

                      I can only chuckle at the insanity that you post. Being able to "borrow" from the Windows driver is a harder accomplishment than you give it credit for, but it nevertheless shows that Nvidia has a great codebase for it to be so portable to other platforms.
                      Being able to borrow CUDA code from Windows is simple copy-paste work. The only thing that needs modification is the part that interacts with the X server, which is trivial. More work was spent on testing the thing than on actually developing it...

                      Reality check: Lol, lack of engineering power? Come on now, give your head a shake. Engineering power has never been a concern at Nvidia. It's all about putting your development resources where they are needed and will benefit most. It's no secret that Nvidia at the time was concentrating on their next-gen architecture, and DX 10.1 didn't bring anything earth-shattering to the table with DX11 just around the corner. As far as bribing goes, get your facts straight. From the developer's mouth:

                      Ubisoft spokesman Michael Beadle confirmed to us that Ubisoft's participation in TWIMTBP was strictly a co-marketing deal, and that Nvidia did not pay Ubisoft any money as a result of the deal. He seconded Brown's statement that Nvidia did not influence the decision to remove DirectX 10.1 support from the first patch for Assassin's Creed. Beadle said the decision was made strictly by the game's development team.

                      Get your facts straight. The Batman AA issue was that Nvidia sent down their engineers to help out with the anti-aliasing code. It was Nvidia's IP. AMD had every opportunity to do the same themselves, or to license the AA code from Nvidia. You can't cut a feature that AMD never implemented during development.
                      Sure, and I suppose you take everything the government and the media tell you at face value, because official statements would never lie to you, right? They are all good, honest people just struggling to help humanity... You must be American...

                      The reality of the matter is that AA is a neutral, open technology that should be available to you regardless of vendor. Batman's developers didn't really need any help from NVIDIA to do it; countless devs before them had accomplished it without that help. Your IQ has to be under 90 to believe otherwise...

                      And Ubisoft, don't get me started please... The statements you quote here are a joke. The feature worked fine, because I had used it both on ATI and NVIDIA hardware. It just gave ATI GPUs a boost, and NVIDIA couldn't have that. Even if it is true that no money was given by NVIDIA, which I don't really believe, a marketing deal is worth a lot of money, you know...

                      Anyway, you are a fanboy, and arguing with one is bad practice. Have fun using poor products based on sentiment instead of logic. Unlike you, if AMD ever pulls the same crap as NVIDIA, I am switching to Intel permanently.

                      Comment
