
Any update: HD Radeon 4xxx vs. Nvidia 260?


  • #51
    Originally posted by deanjo View Post
    Sorry, but you're wrong. This has been done for years. For example, you cannot run an SLI setup on an ATI chipset board, and vice versa. You cannot use an off-the-shelf video card in a Mac. Hell, you can't even make a chipset for another vendor's CPU unless you have licensing rights. Then there is also the fact that some devices (RAID cards and IDE controllers come to mind) will actually blacklist drives or drop them to a lower performance mode than others to address compatibility issues. Another example of this is Nvidia cards having to drop to 1x AGP mode on the old Irongate AMD chipsets.

    Sorry, but limiting functionality and features is very commonplace in the PC world, and it is not illegal at all.
    I believe we're answering different questions. You're thinking of the wider action of restricting one's products (be they hardware or software) to a particular platform. This is generally fine but subject to some legal constraints. I'm talking about the specific extension of the PCI-E standard to restrict competitor products (specifically video cards).

    PCI-E is a problem here because it is a trademark of the PCI-SIG consortium. If your board says it is PCI-E compatible and uses the PCI-E logo, it has to work with all PCI-E peripherals. Why? Well, as I said, if you truly want to restrict competitor peripherals on your board, you would do one of the following:

    1. Make a proprietary standard the sole implemented one on your motherboards and make your cards capable of working on both it and regular PCI-E. If you do this and still certify your board as PCI-E, this is illegal because it is the same embrace-and-extend strategy Microsoft used with Java.

    2. Install a controller chip on the PCI-E bus to intercept signals and detect the device ID, passing through only signals from IDs registered to your own cards. Not only is this utterly impractical, it's also anticompetitive in that it targets a certain subset of devices that should work with your boards. An example of this being deemed illegal is the telecoms who owned the phone lines restricting DSL access to their own services.

    To address the examples you gave:

    SLI on an ATI chipset: ATI does not feature the SLI sticker or license SLI from Nvidia. It does not even claim to support it. Nor is Nvidia under any obligation (afaik) to sell a license to ATI, because there isn't a patent-sharing agreement between Nvidia and ATI. SLI is not an open standard.

    Video cards on a Mac: Apple doesn't write drivers for that video card for its own OS. There's nothing preventing you from installing Linux on a Mac and using an add-in card with open-source drivers. Nothing restricts you; Apple just doesn't go out of its way to help you. There's a difference there.

    CPUs and patents: This isn't an issue. Yes, rights for CPU sockets need to be licensed. PCI-E also needs to be licensed, but it's available to anyone in the consortium. Your motherboard most likely says "PCI-E certified" on the box, along with whatever socket has been licensed. Obviously, if you don't license something, you can't claim to support it, but then that's your (the vendor's) fault.

    RAID controllers: I'm not too well-read on these devices, but it sounds like someone isn't implementing a standard correctly, or that there is no standard. In that case, dropping to another mode is doing the device a favour by supporting hardware that was improperly made or never certified.

    Same thing I'd think with Nvidia and Irongate.

    I remember that when I worked on the .NET Framework team at Microsoft, we always had a lawyer in attendance to make sure our work under ECMA licensing did not place too onerous an implementation burden on projects like Mono. You can see how seriously standards are taken, given the wide compatibility of devices on PCI-E.


    • #52
      I was about to make a new topic asking for people's thoughts on the 260 vs 4xxx, but I'll just use this topic instead.

      I'm using AMD64 Gentoo with an HD4850, ati-drivers 9.9-r2, and xorg-server

      I'm getting quite sick of the shoddy performance. Conky keeps flickering in the background with and without compositing, and resizing or restoring windows lags.

      I'm now considering moving my Steam games over to Gentoo and playing them through Wine so I don't have to keep switching between OSes.

      So here's my question: is it worth switching over to the GTX260? I knew there were going to be issues with the 4850 under Linux, but my standards have risen while using it. I would really like good 3D performance without flickering 2D windows.

      Will it be worth shelling out $150 (minus whatever I sell my 4850 for) for an nVidia card, or just suck it up and wait for the drivers to improve?


      • #53
        Conky flickering seems to be a common problem with the default settings, but it seems to be fixable:

        The key seems to be enabling double buffering and making Conky not draw to the root window. Double buffering may also need the "dbe" module loaded in xorg.conf. The flickering does not seem to be specific to any vendor's hardware or drivers.
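        For anyone hitting this, the usual fix sketches out roughly like the following in ~/.conkyrc (option names from Conky's classic config syntax; whether you also need an own_window_type setting depends on your window manager):

        ```text
        # ~/.conkyrc -- draw into Conky's own window instead of the root window
        own_window yes
        # draw each frame off-screen first to eliminate flicker
        double_buffer yes
        ```

        If double buffering has no effect, make sure the X server loads the DBE extension:

        ```text
        # xorg.conf
        Section "Module"
            Load "dbe"    # double-buffer extension used by Conky's double_buffer
        EndSection
        ```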

        A performance optimization patch ("107 don't backfill") was removed between the 1.5.x and 1.6 X server versions in order to fix a problem on another vendor's hardware, but the result was laggy minimize/maximize with XAA acceleration (i.e. with fglrx). Most distros offer a version of the X server with that patch restored, eliminating the delays.
        Last edited by bridgman; 19 September 2009, 08:48 PM.


        • #54
          Wine + Nvidia is of course a much better combination than Wine + fglrx, as most developers use Nvidia. I really want to see a GT300 card for Linux. A GTX 260 or 275 is very interesting now; for the price you also get VDPAU for HD videos, which makes them look much better.


          • #55
            That one fixed the problem.

            Originally posted by bridgman View Post
            A performance optimization patch ("107 don't backfill") was removed between the 1.5.x and 1.6 X server versions in order to fix a problem on another vendor's hardware, but the result was laggy minimize/maximize with XAA acceleration (i.e. with fglrx). Most distros offer a version of the X server with that patch restored, eliminating the delays.
            I see, I'll look into it.

            Thanks for the speedy replies, much appreciated.




              • #57
                My GeForce GTX260 arrived a couple days ago, but crapped out on me. It's being RMA'd right now.

                On the GTX260, Team Fortress 2 manages to run under Wine sometimes (depending on the phase of the moon, I guess). UT2004 gets about 70-80 FPS, and 2D doesn't have that great a framerate in Compiz Fusion either. My Radeon HD4850 got smoother compositing with fglrx, aside from the maximize delay, which I still have to fix. UT2004 performance under fglrx is a different story, though...

                I remember my old GeForce 7900GTX having great 2D and 3D. Are the drivers just not up to snuff yet with the GTX 200 series? Does anyone have the same experience with their GTX260? There's no way UT2004 should only be getting 70-80 FPS; my 7900GTX got around that framerate, not to mention Compiz. I think I remembered to turn off compositing before I started UT2004, but I'll have to double-check that before I complain some more.

                I have a feeling I'm doing something wrong with the drivers or settings on my GTX260; however, Ubuntu and Gentoo seemed to have about the same 2D and Team Fortress 2 performance. I tried both the 185.* and 190.* drivers under both OSes, and performance seemed about the same on each.
                Last edited by Mardok; 26 September 2009, 05:28 PM.