AMD To Drop Radeon HD 2000/3000/4000 Catalyst Support


  • Originally posted by Death Knight View Post
    Here is another thread about open source GPUs for Linux. It looks like NVIDIA is doing the best job here, even without supporting open source directly the way AMD does.

    http://www.reddit.com/r/linux/commen...rformance_with

    I have been thinking about this for a while. Why did AMD need to support open source on Linux at all?
    Why did they hire programmers (two programmers!) to produce code for so many different GPUs?

    I think they probably feel that they are really behind the other brands and are going to lose their share of Linux users...
    So they wanted to "look" Linux compatible and put their weight behind the FOSS drivers, opened specs, etc. They made solid improvements and took good steps. Thanks for that.
    But they seem to forget about being "behind", and the gap is not going to close by hiring just two developers.
    They need MANY more developers to support their GPUs... Dropping Catalyst support for old cards? No problem, if they are going to spend the maintenance money on the FOSS drivers instead... I would even be happy about that.

    AMD would be better off hiring 10 more FOSS developers and dropping Catalyst support for Linux entirely!
    Do you agree?
    They'll never do that.

    Reason 1: Catalyst runs on Windows, and many (most!) of the programmers they pay to work on Catalyst are focused on Windows.
    Reason 2: They couldn't get the Gallium3D open source stack running well as a replacement Windows driver even if they tried. The main reason is that all of the encumbered code (UVD, probably the heavy Catalyst AI optimizations, probably stereoscopic 3D, probably CrossFire, i.e. most of the things they refuse to bring to the open drivers) would never be allowed to be open sourced. Other companies' intellectual property is inside Catalyst, and that magic sauce will never be allowed into the open drivers.
    Reason 3: Because of Reasons 1 and 2, they cannot stop developing Catalyst for Windows if they want to keep supporting the OS of choice of 90% of the PC market.
    Reason 4: Because of Reason 3, they can't divert funds (which really just comes down to salaried employees) away from Catalyst to focus on the open drivers. They have to spend "extra" money on the open drivers, and they don't have a lot of extra because Intel is owning them on the CPU front and they're still spending heinous cash developing new CPUs.

    Kinda wish the ATI/AMD merger had never happened... well, I think some of AMD's more open-source-friendly culture influenced ATI positively, but on the other hand, ATI is basically the only viable business unit keeping AMD afloat. Their CPUs are an absolute joke. The ATI graphics cards must be their cash cow, keeping them from going bankrupt (yet), because there they have an actually competitive product.



    • Originally posted by Death Knight View Post
      AMD would be better off hiring 10 more FOSS developers and dropping Catalyst support for Linux entirely!
      Do you agree?
      I fully agree. Better yet, 20.



      • Originally posted by allquixotic View Post
        Kinda wish the ATI/AMD merger had never happened... well, I think some of AMD's more open-source-friendly culture influenced ATI positively, but on the other hand, ATI is basically the only viable business unit keeping AMD afloat. Their CPUs are an absolute joke. The ATI graphics cards must be their cash cow, keeping them from going bankrupt (yet), because there they have an actually competitive product.
        Meh, the APUs are doing pretty well; they made AMD competitive in the mobile market again, which is more or less where the consumer market is headed, since most people just want a laptop these days instead of a full desktop. It also helps that in the consumer market the CPU stopped being relevant around the time dual cores reached the low-end consumer. The most demanding things these consumers want to do are maybe play some Sims or WoWcrack, neither of which needs a lot of CPU, but when the GPU sucks, they notice. They may not attribute the game looking like crap and skipping to the GPU, but they notice a GPU that isn't up to snuff much more easily than they notice an extra few seconds shaved off their MP3 rips.



        • About moving to NVIDIA from AMD - you're comparing their OSS drivers. In that case, yes, the AMD OSS drivers are in better shape support-wise. But when comparing the binary blobs, NVIDIA has the upper hand - once again, look at the Wine statistics. FGLRX can't even get any Unreal Engine 3 game past the menu, while NVIDIA users give those games Platinum rankings and report near-perfect performance... So yes, if you want to support OSS, then sticking with AMD could be a good idea. But if you want to actually play games with the binary blob - not so much, unfortunately.

          Originally posted by Kivada View Post
          @GreatEmerald

          First, most of the target market for super-high-res screens is the high-end gamer crowd, who upgrade their GPU every 12-18 months. They have already moved from the HD 6950s they BIOS-modded into HD 6970s on to HD 7970s; when high-res screens hit, they'll be waiting for the first model to reach some magic level of low latency before replacing their 1920x1200 or 2560x1600 screen or screens.

          Furthermore, your CPU makes very little difference at very high resolutions and GPU settings, if you are quadrupling 1920x1080, as would be the case at 3840x2160 - a jump from 2,073,600 pixels to 8,294,400 pixels. You see it here all the time with Larabel stupidly using ioquake3-based games on modern hardware like i7s and GTX 580s: with games with such low hardware requirements, the CPU ends up being the limiting factor. If you look at a really GPU-intensive game, though, you quickly find that the CPU is no longer the limiting factor as you increase the resolution and graphics settings. If you max out a game like Metro 2033's GPU settings at 2560x1600, the GPU hits its limits long before the CPU hits its capacity to keep feeding the GPU up-to-date location data for everything on the stage you're on. Increasing the resolution only makes the GPU cry uncle that much faster, as the work the CPU has to do doesn't change much between 640x480 and 2560x1600.

          Yeah, that's part of the reason to move to higher-resolution screens: no more need for AA. AF you might still need, to give things in the distance a sense of depth, but even your 2D experience becomes a lot better, as fonts are much crisper and easier to read at very high pixel densities.
          That's not the high-end gamer market, that's the enthusiast market... Because, like I said, there is no need to upgrade from the HD 4000 series to play high-end games today. Anyone who does upgrade is basically wasting their money, or already has something crazy like a 3+ monitor setup, which means they're enthusiasts.

          Like I said, my case is special, since I also record my games. So a jump from 2,073,600 pixels to 8,294,400 pixels would bring the framerate down from 30 FPS to 7 FPS, due to the CPU not being able to handle that many pixels in real time. For those who don't record, though, it's easier. Then again, even when I don't record, Mass Effect 3 sometimes drops frames due to the CPU, since animations, all UnrealScript, and some other things are purely CPU-based. But, of course, increasing the resolution would make the GPU the bottleneck under non-recording circumstances.
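
          As a back-of-the-envelope check, here is the arithmetic behind those numbers (a minimal sketch in Python; the assumption that the CPU-bound recording framerate falls roughly in proportion to the pixel count comes from this post, not from a benchmark):

          ```python
          # Pixel counts for the two resolutions discussed above.
          full_hd = 1920 * 1080   # 2,073,600 pixels
          uhd = 3840 * 2160       # 8,294,400 pixels

          print(uhd / full_hd)    # 4.0 -- exactly a quadrupling

          # Assumption (from the post, not measured): while recording, the
          # framerate is CPU-bound and scales inversely with the pixel count.
          recording_fps_1080p = 30
          print(recording_fps_1080p * full_hd / uhd)  # 7.5 -- close to the quoted 7 FPS
          ```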

          I hope they can improve DPI support by then, though. Right now, changing the DPI on Linux only changes the font size, not the size of any graphical elements, even if they use SVGs for them...



          • Originally posted by GreatEmerald View Post
            I hope they can improve DPI support by then, though. Right now, changing the DPI on Linux only changes the font size, not the size of any graphical elements, even if they use SVGs for them...
            KDE's Oxygen handles high DPI impressively well. I have a 135 PPI (pixels per inch) netbook. Its only problem regarding the pixel density is Firefox's policy of forcing 96 PPI, which makes web browsing a difficult task.
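
            For what it's worth, the PPI figure falls straight out of the panel geometry, and the mismatch with Firefox's hard-coded 96 PPI is what makes pages render at the wrong physical size. A minimal sketch of the math (the 11.6-inch 1366x768 panel is a hypothetical example, since the post doesn't give the netbook's resolution or size; it just happens to land near 135 PPI):

            ```python
            import math

            # PPI = diagonal resolution in pixels / diagonal size in inches.
            def ppi(width_px, height_px, diagonal_inches):
                return math.hypot(width_px, height_px) / diagonal_inches

            # Hypothetical 11.6" 1366x768 netbook panel (not from the post).
            panel_ppi = ppi(1366, 768, 11.6)
            print(round(panel_ppi))             # ~135

            # With Firefox assuming 96 PPI, content renders at 96/135 of its
            # intended physical size; a page zoom of about 135/96 ~ 141% compensates.
            print(round(100 * panel_ppi / 96))  # ~141 (percent zoom)
            ```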



            • Originally posted by GreatEmerald View Post
              About moving to NVIDIA from AMD - you're comparing their OSS drivers. In that case, yes, the AMD OSS drivers are in better shape support-wise. But when comparing the binary blobs, NVIDIA has the upper hand - once again, look at the Wine statistics. FGLRX can't even get any Unreal Engine 3 game past the menu, while NVIDIA users give those games Platinum rankings and report near-perfect performance... So yes, if you want to support OSS, then sticking with AMD could be a good idea. But if you want to actually play games with the binary blob - not so much, unfortunately.
              I don't care about the blob for either company and I care even less about Wine, PlayOnLinux, Crossover or Cedega compatibility.

              Originally posted by GreatEmerald View Post
              That's not the high-end gamer market, that's the enthusiast market... Because, like I said, there is no need to upgrade from the HD 4000 series to play high-end games today. Anyone who does upgrade is basically wasting their money, or already has something crazy like a 3+ monitor setup, which means they're enthusiasts.
              Try the PC-only titles, not console ports like Mass Effect 3. Max out the graphics settings on titles like Metro 2033, Aliens vs. Predator, Hard Reset, Just Cause 2, Deus Ex: Human Revolution, DiRT 3, Crysis Warhead, Mafia II, Batman: Arkham City, Total War: Shogun 2, STALKER: Call of Pripyat - and the list of Windows games goes on - and they will barely break a 30 FPS average on an HD 4890 at 1920x1080 to 2560x1600 with no AA to 4x AA, if that, let alone the 60 FPS that high-end gamers demand. The guys playing these games refuse to compromise by lowering settings, which is why they are always buying the best they can get their hands on; some have 3-4 top-end cards pushing a single monitor...

              Sure, if you are talking about Linux-only titles, almost all of them can be played just fine on a $100-130 GPU.



              • Originally posted by Hirager View Post
                KDE's Oxygen handles high DPI impressively well. I have a 135 PPI (pixels per inch) netbook. Its only problem regarding the pixel density is Firefox's policy of forcing 96 PPI, which makes web browsing a difficult task.
                Block the ads, collapse their placeholders, and hit Ctrl+"+" to scale up the page. It scales everything - graphics, videos, Flash, etc. - unlike most browsers, which only increase the font size and cause text to go places it shouldn't. It works great here at 133 PPI, and Firefox remembers the zoom setting for every site and automatically applies it site-wide.

                Also, Image Zoom is a really nice extension.



                • Sad news

                  That's really bad news. The main problem is that there's no alternative: today it's impossible to get Linux running well on cards like the Radeon 9xxx series (pre-HD), which are still technically perfectly capable.
                  The current open source drivers don't even allow YouTube videos to run smoothly on this kind of graphics card.
                  Since it's impossible to upgrade the GPU in most notebooks, the number of Linux users will only decrease.



                  • Originally posted by Kivada View Post
                    Block the ads, collapse their placeholders, and hit Ctrl+"+" to scale up the page. It scales everything - graphics, videos, Flash, etc. - unlike most browsers, which only increase the font size and cause text to go places it shouldn't. It works great here at 133 PPI, and Firefox remembers the zoom setting for every site and automatically applies it site-wide.

                    Also, Image Zoom is a really nice extension.
                    Alas, I have multiple machines that I have to keep synced up, settings and add-ons included. There is no place for this kind of manipulation. Hah, I think I'll just file a bug report about this issue.



                    • Originally posted by adrenochrome View Post
                      That's really bad news. The main problem is that there's no alternative: today it's impossible to get Linux running well on cards like the Radeon 9xxx series (pre-HD), which are still technically perfectly capable.
                      The current open source drivers don't even allow YouTube videos to run smoothly on this kind of graphics card.
                      Since it's impossible to upgrade the GPU in most notebooks, the number of Linux users will only decrease.
                      You're making it sound as if the old drivers were deliberately broken. The problem with the drivers you mentioned is that they were written against the old, now-deprecated DRI infrastructure, while all current drivers target DRI2. No one wanted to port the old drivers, and the maintenance cost was too high. Nothing like this will ever happen again!

