AMD gets beaten by NVidia, any plans?


  • #91
    Originally posted by Qaridarium

Intel is better than NVIDIA...
VIA is also better than NVIDIA...
    Try gaming on one of those chipsets, and we'll see how you feel.

    FWIW -- I don't care if drivers are open or closed, I just want them to work.

    Comment


    • #92
      Originally posted by energyman View Post
      http://novembertech.blogspot.com/200...late-much.html

Read it. ATI beating NVIDIA in GPGPU.
That post compares ATI's R500 architecture with NVIDIA's G70 (the G80 hadn't been released yet in October 2006). Both companies' GPU architectures have changed completely since then.

      Comment


      • #93
        Originally posted by Qaridarium
Yes, why not?

Intel is better than NVIDIA...
VIA is also better than NVIDIA...
AMD is better than NVIDIA...
3dfx is better than NVIDIA...

Oh... NVIDIA bought 3dfx... OK, 3dfx was better than NVIDIA in the past!

NVIDIA is the last anti-open-source holdout... NVIDIA should die!
1. Intel: from a usability perspective, Intel is by far the worst of the three major GPU vendors, probably better only than VIA. Their hardware sits at the bottom of the IGP heap, although it is starting to catch up to NVIDIA's slowest IGPs. Their last few driver releases (some 2.3 versions, 2.4, 2.5) have been absolutely terrible, causing major performance regressions in already weak parts and wreaking havoc on suspend and resume. The performance gap between Intel parts on Windows and on Linux is much bigger than it ever was with the pre-8.41 ATi fglrx drivers that everybody loved to hate. I applaud Intel for actively contributing to the Xorg driver and publishing their hardware documentation, but they've got a real mess on their hands right now that still isn't completely fixed even with the 2.6 driver, GEM, and kernel 2.6.28. I know they'll eventually sort it out, but right now they're not doing well.

2. VIA: VIA's parts don't have anywhere near the performance of NVIDIA's, and it's much harder to get video playback acceleration working on VIA parts with openchrome than with NVIDIA's drivers and XvMC/VDPAU. I'd put them at the bottom of the barrel for Linux hardware, as they have lackluster parts and aren't all that open-source friendly.

3. AMD/ATi: You are probably right that AMD is better than NVIDIA, as AMD has hardware and proprietary drivers competitive with NVIDIA's and also works with the open-source community.

4. 3dfx: they were bought out by NVIDIA a looong time ago.

        And I have also used a pretty decent assortment of parts:

        VIA
        - MVP4 (Trident) IGP

        Intel
        - i810E IGP
        - i945GM IGP
        - GM45 IGP

        NVIDIA
        - TNT2 M64
        - 6200 TC PCIe
        - 6200 PCI

        ATi
        - Radeon Mobility 9000
        - Radeon x1900GT
        - Radeon HD 3850
        Last edited by MU_Engineer; 04 February 2009, 11:56 PM.

        Comment


        • #94
          Originally posted by Qaridarium
I have never had NVIDIA-like problems with my Intel onboard VGAs.

I lost my 17" notebook because the GeForce 8600 has a bad NVIDIA chip. NVIDIA did it wrong, not Intel!

I also have a notebook with both Intel AND NVIDIA in it. The Intel one is better on Linux: no driver install is needed, and it doesn't have stability problems. The NVIDIA one gets too HOT!!!

The only area where NVIDIA beats Intel is "speed".
The problem with some of the G84/G86 parts is not with the chip itself but with the solder used to attach the chip to the substrate. It is a common problem that affected lots of ICs in the transition to lead-free solder mandated by the EU RoHS directive. It is what caused the Xbox 360 Red Ring of Death, among other problems.

You may be able to get an Intel IGP working out of the box more easily on most distributions since it has an Xorg driver, but that Xorg driver is far worse than NVIDIA's proprietary driver in terms of features and performance. It is trivial to install the NVIDIA driver to get full functionality out of the hardware, while it is highly non-trivial to build the latest Intel driver, kernel, DRI, and Mesa from git to get semi-working 3D out of Intel IGPs at the moment.

Come on, VIA is better because they turned into an open-source company!
They have announced open-source initiatives but have not delivered much yet. I won't believe they're committed to providing open-source anything until I see the kind of significant proof that AMD and Intel have given.

          Comment


          • #95
            Originally posted by jonnycat26 View Post
            Try gaming on one of those chipsets, and we'll see how you feel.

What's wrong with them? I can even play Fallout 3 at medium detail on my Radeon X1600XT. On a GF6800, Far Cry just looked horrible, so I could say: try gaming on it. Compositing is faster using the open-source drivers than using the NVIDIA binary blob on a GF7600 (I had that card only one day, so I didn't test it thoroughly).

            FWIW -- I don't care if drivers are open or closed, I just want them to work.
            I do.

            Comment


            • #96
              Originally posted by Vighy View Post
So you are just telling each other "from my point of view I am right", but nobody tries to establish a common point of view that everybody can agree on.

You are not discussing: you are talking to the walls.
              QFT.

If you care about FOSS, get an ATi card* or an Intel IGP.
If you want the best driver (especially feature-wise) available right now, get an NVIDIA card.
There's no definitive answer to this; everybody's got to decide on their own.

* And don't be stupid and buy a card not yet supported by the FOSS drivers and then complain about the quality of the proprietary driver.

              Comment


              • #97
                Originally posted by kraftman
What's wrong with them? I can even play Fallout 3 at medium detail on my Radeon X1600XT. On a GF6800, Far Cry just looked horrible, so I could say: try gaming on it. Compositing is faster using the open-source drivers than using the NVIDIA binary blob on a GF7600 (I had that card only one day, so I didn't test it thoroughly).
Please read again what jonnycat26 posted before. I don't think that's what he meant.

                Originally posted by Qaridarium
Well... before I had an AMD, I had NVIDIA... and I can't say that Linux works better on NVIDIA.

Only Wine, NOT LINUX, works better on NVIDIA!
Please use proper English, because you're confusing many readers with your posts.

IMHO, the post above this one seems more reasonable to me.
                Last edited by t.s.; 08 February 2009, 06:09 AM.

                Comment


                • #98
                  Originally posted by Qaridarium
In 2009... a 4870 makes more fps per watt than a GTX 280!!!

In 2009 an RV740 at 40 nm will make much more fps per "watt" than a GTX 295!!!!

LMAO, uh-huh; meanwhile your 4870 is consuming 68.4 watts web surfing and doing desktop work while the GTX 280 is consuming 43.7 watts doing the same tasks. Also, depending on the game, there are many instances where your 4870 has a lower fps per "watt" than the GTX 280.

                  We dissect the power consumption of 106 different graphics cards in single- and dual-board configurations to help you pick the right PSU.
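To put numbers on the metric being argued here, below is a minimal sketch of the fps-per-watt arithmetic. The two idle draws (68.4 W and 43.7 W) are the figures quoted above; the load power and framerate values are invented placeholders, not measurements.

Code:
# Sketch of the fps-per-watt argument. Idle draws are the figures quoted
# in the post above; load power and fps are hypothetical placeholders.
cards = {
    "HD 4870": {"idle_w": 68.4, "load_w": 130.0, "fps": 60.0},
    "GTX 280": {"idle_w": 43.7, "load_w": 178.0, "fps": 65.0},
}

HOURS_IDLE, HOURS_GAMING = 6.0, 2.0  # assumed daily usage mix

for name, c in cards.items():
    fps_per_watt = c["fps"] / c["load_w"]  # the metric only covers load
    wh_per_day = c["idle_w"] * HOURS_IDLE + c["load_w"] * HOURS_GAMING
    print(f"{name}: {fps_per_watt:.2f} fps/W under load, {wh_per_day:.0f} Wh/day")

With these placeholder load numbers the 4870 wins on fps per watt under load, yet the GTX 280 uses less total energy over a day dominated by idle desktop use, which is exactly the point of the comparison.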

                  Comment


                  • #99
                    fps-per-watt is about as useful a measurement as bogomips.

                    The only meaningful measurement is the power needed to run 2D and 3D at the monitor's refresh rate.

                    Comment


• #100
  Originally posted by Ant P. View Post
                      fps-per-watt is about as useful a measurement as bogomips.

                      The only meaningful measurement is the power needed to run 2D and 3D at the monitor's refresh rate.
Not completely true. What if the card has a hard time reaching the vsync maximum framerate at the monitor's native resolution, or vsync simply isn't desired?
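To illustrate the two competing metrics side by side, here is a rough hypothetical sketch. Every number in it is invented, and the assumption that power draw scales linearly with the fraction of frames rendered is a deliberate simplification.

Code:
REFRESH_HZ = 60.0  # monitor refresh rate

# (uncapped fps, watts at full load) -- all numbers invented for illustration
cards = {
    "card A": (90.0, 150.0),  # fast enough to hold vsync
    "card B": (45.0, 80.0),   # cannot reach the refresh rate
}

for name, (fps, watts) in cards.items():
    if fps >= REFRESH_HZ:
        # Ant P.'s metric: watts needed to sustain the refresh rate, assuming
        # (simplistically) that draw scales with the fraction of frames rendered.
        print(f"{name}: ~{watts * REFRESH_HZ / fps:.0f} W to hold {REFRESH_HZ:.0f} Hz")
    else:
        # The objection above: the card never reaches the refresh rate, so the
        # metric is undefined and fps-per-watt is the only comparison left.
        print(f"{name}: tops out at {fps:.0f} fps; {fps / watts:.2f} fps/W at best")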

                      Comment
