AMD To Drop Radeon HD 2000/3000/4000 Catalyst Support


  • Originally posted by Sonadow View Post
    All I see is this:

    AMD has
    - committed money to develop OSS drivers and released partial specifications of some of their chipsets for the community to work on
    - committed themselves to develop OSS drivers
    - a subpar OSS driver which falls behind the reverse-engineered Nouveau driver

    NVIDIA has
    - openly commented that they will never touch OSS drivers

    and AMD gets all the hate just because of one bolded point.


    To quote one person who made this statement in the forum somewhere in the early pages:



    Wake up your bloody idea; AMD is doing just that. And where has the community gone? Most have hauled their sorry butts off to Nouveau and left AMD to handle the OSS driver development themselves with a tiny bunch of X developers. Look at the news postings on the website: what is all the effort on now? Nouveau. Nouveau. Nouveau.

    This is irrefutable proof that the OSS community whining for 'OSS drivers' is nothing but a bunch of hypocrites who only know how to TALK TALK TALK TALK TALK AND MAKE EMPTY PROMISES. The community 'will do the rest'? What a load of bald-faced LIES. Don't ever make that statement again, because you SURE AS HELL have no intention to do so.

    You want a PROPER working OSS Radeon driver? Write it YOURSELF. If you can't, you have no right to bitch about the state of the Radeon driver.

    AMD should never have opened up their drivers if this is the response they are going to get.

    PS: Kudos to the devs who actually continued giving their all in the OSS Radeon driver development even after AMD released the partial specifications.

    Well put, nvidia is laughing all the way to the bank.
    They don't have to lift a finger to help the OSS guys, and yet people are 'going green' because there will be no binary blobs for older hardware, even though AMD still has tech docs out, which is much more than nvidia has EVER done.

    And now most of the clowns in this thread crying that the sky is falling are going to "punish" AMD/ATI by buying nvidia gear, because AMD/ATI had the gall to release tech docs to help the OSS driver guys and nvidia hasn't.

    Yeah, smart move there!

    Michael Larabel should point out the reasons why the OSS nvidia driver is better than the OSS AMD driver, and how they are getting better results with no tech doc releases from nvidia compared to what AMD has released. Is it only a manpower issue, or is there more to it?



    • @GreatEmerald

      First, most of the people that are the target market for super high-res screens are the high-end gamers, who upgrade their GPU every 12-18 months. They've already moved from the HD6950s that they BIOS-modded into HD6970s on to HD7970s; when high-res screens hit, they'll be waiting for the first model to hit some magic level of low latency before replacing their 1920x1200 or 2560x1600 screen or screens.

      The general consumer will be roped in with QFHD HDTV, brought to you only via Blu-ray+, since the US internet infrastructure couldn't possibly handle transferring that kind of video (none of the ISPs want to upgrade, ever) and the movie studios are drooling at the opportunity to offer yet another media standard that you have to pay $30 a movie for. Think about how they sold HDTV and 3D HDTV to the general public, with most of the gear sold not even being able to handle 1920x1080.

      Furthermore, your CPU makes very little difference at very high resolutions and GPU settings: going from 1920x1080 to 3840x2160 quadruples the pixel count, a jump from 2,073,600 to 8,294,400 pixels (quick sketch at the end of this post). You see it here all the time with Larabel stupidly using IOQuake3-based games on modern hardware like i7s and GTX580s; with games with such low hardware requirements, the CPU ends up being the limiting factor. Look at a really GPU-intensive game, though, and you quickly find that the CPU is no longer the limiting factor as you increase the resolution and graphics settings. If you max out a game like Metro 2033's GPU settings at 2560x1600, the GPU hits its limits long before the CPU hits its capacity to keep feeding the GPU up-to-date location data for everything on the stage you're on. Increasing the resolution only makes the GPU cry uncle that much faster, as the work the CPU has to do doesn't change much between 640x480 and 2560x1600.

      Yeah, that's part of the reason to move to higher-resolution screens: no more need for AA. AF you might still need, to give things in the distance a sense of depth, but even your 2D experience becomes a lot better, as fonts will be much crisper and easier to read at very high pixel densities.

      Yeah, I'm on 1600x1200 at 15", or 133 PPI, but I'm currently sitting 4-5 feet back from the screen, so I have the site zoomed so I don't have to squint.
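
      A quick sketch of the arithmetic above, in plain Python (the resolutions are the ones named in this post; treating GPU work as proportional to pixel count while CPU work stays flat is the post's rule of thumb, not a measurement):

      Code:
      # Pixel counts for the resolutions mentioned in this post.
      base = 1920 * 1080  # 2,073,600 pixels
      for name, (w, h) in {
          "640x480": (640, 480),
          "1920x1080": (1920, 1080),
          "2560x1600": (2560, 1600),
          "3840x2160": (3840, 2160),
      }.items():
          pixels = w * h
          # GPU-side work (shading, fill rate) grows roughly with pixel count,
          # while CPU-side work (game logic, draw-call setup) barely changes.
          print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x the pixels of 1920x1080")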
      Last edited by Kivada; 04-21-2012, 04:09 PM.



      • Originally posted by vortex View Post
        Michael Larabel should point out the reasons why the OSS nvidia driver is better than the OSS AMD driver, and how they are getting better results with no tech doc releases from nvidia compared to what AMD has released. Is it only a manpower issue, or is there more to it?
        Uhuh. While I'm no driver dev, what I do know of the Gallium3D drivers is that the Intel, AMD and Gallium3D devs are doing all the heavy lifting of actually implementing the features in the stack via state trackers (toy sketch at the end of this post); the Nouveau guys then just have to figure out how to make their driver spit out the same thing the Nvidia blob does when given the same OpenGL commands the Gallium3D drivers would be given. That is a lot less work than where Nouveau started before the move to the Gallium3D stack, when they had to do pretty much everything themselves.

        Let's also not forget Linux's large installed base of Nvidia users, because for most of the decade their cards were the only ones that actually worked for gaming; your choices were Intel garbage or an R200-based Radeon card.

        These days those people are zealots because they have a psychologically vested interest, even though the only thing Nvidia has ever done for them is provide VDPAU as a way to keep them from switching to AMD.

        I'm not against Nvidia, I'm just pro open source. I actually wanted Nvidia to merge with VIA and produce their own x86-64 APUs like AMD is doing, as it'd be better for the market as a whole to have three major players in the x86 game and keep Intel on its toes. If they had done so in 2008, by now they'd have had enough time to cook up a pretty good low-to-mid-range system based around a midrange GPU, their own chipset and an updated version of the Nano. Sure, they probably wouldn't work on Linux all that well given the history of both Nvidia and VIA/S3, but on Windows they'd be pretty good.
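
        As a toy analogy of that division of labor (illustrative Python only, not real Gallium3D code or its API): the shared state tracker does the API-level work once, and each hardware driver only fills in a small backend.

        Code:
        # Toy analogy, NOT real Gallium3D code: one shared "state tracker"
        # translates API calls into generic pipe commands; each hardware
        # driver only implements a small backend.
        class PipeBackend:
            """What a hardware driver (radeon, nouveau, ...) must provide."""
            def emit(self, command: str) -> None:
                raise NotImplementedError

        class StateTracker:
            """Shared layer: turns GL-style calls into generic pipe commands."""
            def __init__(self, backend: PipeBackend) -> None:
                self.backend = backend

            def gl_draw_arrays(self, mode: str, first: int, count: int) -> None:
                # The API-level heavy lifting lives here, once, for every driver.
                self.backend.emit(f"draw {mode} vertices {first}..{first + count - 1}")

        class NouveauBackend(PipeBackend):
            def emit(self, command: str) -> None:
                # The per-GPU part: produce whatever the hardware actually wants.
                print(f"[pushbuf] {command}")

        StateTracker(NouveauBackend()).gl_draw_arrays("TRIANGLES", 0, 3)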



        • For what it's worth...

          Originally posted by RealNC View Post
          Something that works.
          I've just bought an HD6450, and its OSS drivers run World of Warcraft "out of the box" on Fedora 16. Once I'd installed libtxc_dxtn.so, anyway. Fedora 16 uses Mesa 7.11.2, and it's very playable.
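
          If you're curious whether your driver exposes what libtxc_dxtn.so provides, here's a minimal sketch (it assumes the glxinfo utility is installed; GL_EXT_texture_compression_s3tc is the relevant extension string):

          Code:
          import subprocess

          # Ask the active GLX driver what it reports (requires the glxinfo tool).
          out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout

          for line in out.splitlines():
              # e.g. "OpenGL renderer string: ..." / "OpenGL version string: ..."
              if line.startswith(("OpenGL renderer", "OpenGL version")):
                  print(line.strip())

          if "GL_EXT_texture_compression_s3tc" in out:
              print("S3TC available: DXT-compressed game textures should work.")
          else:
              print("S3TC missing: install libtxc_dxtn (RPM Fusion carries it for Fedora).")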



          • Here is another thread about open source GPU drivers for Linux. Looks like nVidia is doing the best at this job, even without supporting open source directly like AMD does.

            http://www.reddit.com/r/linux/commen...rformance_with

            I have been thinking about it for a while. Why did AMD need to support open source on Linux?
            Why did they hire programmers (2 programmers! Laughs!) to produce code for so many different GPUs?

            I think they probably feel that they are really behind the other brands and are gonna lose their share of Linux users...
            So they wanted to "look" Linux-compatible and put their weight behind the FOSS drivers. Opened specs, etc. They made solid improvements. Made good steps. Thanks for that.
            But it looks like they forgot about being "behind", and the gap isn't gonna close by hiring just two developers.
            They need MANY more developers to support their GPUs... Dropping Catalyst support for old cards? No problem, if they are gonna spend the maintenance money on the FOSS drivers... I'd be happy about that too.

            AMD would be better off hiring 10 more FOSS developers and dropping Catalyst support for Linux entirely!
            Do you agree?



            • Originally posted by Death Knight View Post
              Here is another thread about open source GPU drivers for Linux. Looks like nVidia is doing the best at this job, even without supporting open source directly like AMD does.

              http://www.reddit.com/r/linux/commen...rformance_with

              I have been thinking about it for a while. Why did AMD need to support open source on Linux?
              Why did they hire programmers (2 programmers! Laughs!) to produce code for so many different GPUs?

              I think they probably feel that they are really behind the other brands and are gonna lose their share of Linux users...
              So they wanted to "look" Linux-compatible and put their weight behind the FOSS drivers. Opened specs, etc. They made solid improvements. Made good steps. Thanks for that.
              But it looks like they forgot about being "behind", and the gap isn't gonna close by hiring just two developers.
              They need MANY more developers to support their GPUs... Dropping Catalyst support for old cards? No problem, if they are gonna spend the maintenance money on the FOSS drivers... I'd be happy about that too.

              AMD would be better off hiring 10 more FOSS developers and dropping Catalyst support for Linux entirely!
              Do you agree?
              They'll never do that.

              Reason 1: Catalyst runs on Windows, and many (most!) of the programmers they pay to work on Catalyst are focused on Windows.
              Reason 2: They couldn't get the Gallium3D open source stack running well as a replacement Windows driver even if they tried. The main reason is that all the encumbered code (UVD, probably the Catalyst AI heavy optimizations, probably stereoscopic 3D, probably CrossFire, i.e. most of the things they refuse to bring to the open drivers) would never be allowed to be open sourced. Other companies' intellectual property is inside Catalyst, and that magic sauce will never be allowed into the open drivers.
              Reason 3: Because of Reasons 1 and 2, it is impossible not to keep developing Catalyst for Windows if they want to continue supporting 90% of the PC market's OS of choice.
              Reason 4: Because of Reason 3, they can't divert funds (which really just comes down to salaried employees) away from Catalyst to focus on the open drivers. They have to spend "extra" money on the open drivers, and they don't have a lot of extra because Intel is owning them on the CPU front and they're still spending heinous amounts of cash developing new CPUs.

              Kinda wish the ATI/AMD merger had never happened... Well, I think some of AMD's more open-source-friendly culture influenced ATI positively, but on the other hand, ATI is basically the only viable business unit keeping AMD afloat. Their CPUs are an absolute joke. The ATI graphics cards must be their cash cow, keeping them from going bankrupt (yet), because there they have an actually competitive product.



              • Originally posted by Death Knight View Post
                AMD would be better off hiring 10 more FOSS developers and dropping Catalyst support for Linux entirely!
                Do you agree?
                I fully agree. Better yet, 20.



                • Originally posted by allquixotic View Post
                  Kinda wish the ATI/AMD merger had never happened... Well, I think some of AMD's more open-source-friendly culture influenced ATI positively, but on the other hand, ATI is basically the only viable business unit keeping AMD afloat. Their CPUs are an absolute joke. The ATI graphics cards must be their cash cow, keeping them from going bankrupt (yet), because there they have an actually competitive product.
                  Meh, the APUs are doing pretty well; they made them competitive in the mobile market again, which is more or less where the consumer market is going. Most people just want a laptop these days instead of a full desktop. It also helps that in the consumer market the CPU stopped being relevant around the time dual cores became available to the low-end consumer; the most demanding thing these consumers want to do is maybe play some Sims or WoWcrack, neither of which needs a lot of CPU. But when the GPU sucks they notice. They may not attribute the game looking like crap and skipping to the GPU, but they notice when it isn't up to snuff much more easily than they notice an extra few seconds shaved off their MP3 rips.



                  • About moving to NVIDIA from AMD: you're comparing their OSS drivers. In that case, yes, the AMD OSS drivers are in better shape support-wise. But when comparing the binary blobs, NVIDIA has the upper hand; once again, look at the Wine statistics. FGLRX can't even launch any Unreal Engine 3 game past the menu, while NVIDIA users give those games Platinum rankings and report near-perfect performance... So yes, if you want to support OSS, then sticking with AMD could be a good idea. But if you want to actually play games with the binary blob, not so much, unfortunately.

                    Originally posted by Kivada View Post
                    @GreatEmerald

                    First, most of the people that are the target market for super high-res screens are the high-end gamers, who upgrade their GPU every 12-18 months. They've already moved from the HD6950s that they BIOS-modded into HD6970s on to HD7970s; when high-res screens hit, they'll be waiting for the first model to hit some magic level of low latency before replacing their 1920x1200 or 2560x1600 screen or screens.

                    Furthermore, your CPU makes very little difference at very high resolutions and GPU settings: going from 1920x1080 to 3840x2160 quadruples the pixel count, a jump from 2,073,600 to 8,294,400 pixels. You see it here all the time with Larabel stupidly using IOQuake3-based games on modern hardware like i7s and GTX580s; with games with such low hardware requirements, the CPU ends up being the limiting factor. Look at a really GPU-intensive game, though, and you quickly find that the CPU is no longer the limiting factor as you increase the resolution and graphics settings. If you max out a game like Metro 2033's GPU settings at 2560x1600, the GPU hits its limits long before the CPU hits its capacity to keep feeding the GPU up-to-date location data for everything on the stage you're on. Increasing the resolution only makes the GPU cry uncle that much faster, as the work the CPU has to do doesn't change much between 640x480 and 2560x1600.

                    Yeah, that's part of the reason to move to higher-resolution screens: no more need for AA. AF you might still need, to give things in the distance a sense of depth, but even your 2D experience becomes a lot better, as fonts will be much crisper and easier to read at very high pixel densities.
                    That's not the high-end gamer market, that's the enthusiast market... Because, like I said, there is no need to upgrade from an HD4000 to play high-end games today. Anyone who does upgrade is basically wasting their money, or already has something crazy like a 3+ monitor setup, which means they're an enthusiast.

                    Like I said, my case is special, since I also record my games. So a jump from 2,073,600 pixels to 8,294,400 pixels would bring the framerate down from 30 FPS to about 7 FPS, due to the CPU not being able to handle that many pixels in real time (see the sketch at the end of this post). For those who don't record, though, it's easier. Then again, even when I don't record, Mass Effect 3 sometimes drops frames due to the CPU, since animations, all the UnrealScript and some other things are purely CPU-based. But, of course, increasing the resolution would make the GPU the bottleneck under non-recording circumstances.

                    I hope they can improve DPI support by then, though. Right now, changing the DPI on Linux only changes the font size, and not the size of any graphical elements, even if they use SVGs for them...
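
                    The 30-to-7 figure above is simple linear scaling with pixel count. A naive sketch (assuming per-frame recording cost scales linearly with pixels, which real capture pipelines only roughly do):

                    Code:
                    def scaled_fps(fps, old_res, new_res):
                        # Assume per-frame recording cost is proportional to pixel count.
                        old_pixels = old_res[0] * old_res[1]
                        new_pixels = new_res[0] * new_res[1]
                        return fps * old_pixels / new_pixels

                    print(scaled_fps(30, (1920, 1080), (3840, 2160)))  # -> 7.5 FPS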



                    • Originally posted by GreatEmerald View Post
                      I hope they can improve DPI support by then, though. Right now, changing the DPI on Linux only changes the font size, and not the size of any graphical elements, even if they use SVGs for them...
                      KDE's Oxygen handles high DPI impressively well. I have a 135 PPI (pixels per inch) netbook. Its only problem regarding pixel density is Firefox's policy of forcing 96 PPI, which makes web browsing a difficult task.
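
                      For reference, both PPI figures in this thread drop out of the usual diagonal formula. A small sketch (the 11.6" 1366x768 panel is a hypothetical geometry that happens to land near 135 PPI):

                      Code:
                      import math

                      def ppi(width_px, height_px, diagonal_inches):
                          # Pixels along the diagonal divided by the diagonal in inches.
                          return math.hypot(width_px, height_px) / diagonal_inches

                      print(round(ppi(1600, 1200, 15)))   # 133 -- the 15" 1600x1200 screen above
                      print(round(ppi(1366, 768, 11.6)))  # 135 -- hypothetical netbook panel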



                      • Originally posted by GreatEmerald View Post
                        About moving to NVIDIA from AMD: you're comparing their OSS drivers. In that case, yes, the AMD OSS drivers are in better shape support-wise. But when comparing the binary blobs, NVIDIA has the upper hand; once again, look at the Wine statistics. FGLRX can't even launch any Unreal Engine 3 game past the menu, while NVIDIA users give those games Platinum rankings and report near-perfect performance... So yes, if you want to support OSS, then sticking with AMD could be a good idea. But if you want to actually play games with the binary blob, not so much, unfortunately.
                        I don't care about the blob from either company, and I care even less about Wine, PlayOnLinux, CrossOver or Cedega compatibility.

                        Originally posted by GreatEmerald View Post
                        That's not the high-end gamer market, that's the enthusiast market... Because, like I said, there is no need to upgrade from an HD4000 to play high-end games today. Anyone who does upgrade is basically wasting their money, or already has something crazy like a 3+ monitor setup, which means they're an enthusiast.
                        Try the PC-only titles, not console ports like Mass Effect 3. Max out the graphics settings on titles like Metro 2033, Aliens vs. Predator, Hard Reset, Just Cause 2, Deus Ex: Human Revolution, DiRT 3, Crysis Warhead, Mafia II, Batman: Arkham City, Total War: Shogun 2, STALKER: Call of Pripyat (and the list of Windows games goes on), and an HD4890 will barely break a 30 FPS average, let alone the 60 FPS high-end gamers demand at 1920x1080 to 2560x1600 with no AA to 4x AA, if that. The guys playing these games refuse to compromise by lowering settings, hence why they are always buying the best they can get their hands on; some have 3-4 top-end cards pushing a single monitor...

                        Sure, if you are talking only about Linux titles, almost all of them can be played just fine on a $100-130 GPU.



                        • Originally posted by Hirager View Post
                          KDE's Oxygen handles high DPI impressively well. I have a 135 PPI (pixels per inch) netbook. Its only problem regarding pixel density is Firefox's policy of forcing 96 PPI, which makes web browsing a difficult task.
                          Block the ads, collapse their placeholders, and hit Ctrl+"+" to scale up the page. It scales everything: graphics, videos, Flash, etc., unlike most browsers that only increase the font size, which causes text to go places it shouldn't. It works great here at 133 PPI, and Firefox remembers the zoom setting for every site and automatically applies it site-wide.

                          Also, Image Zoom is a really nice extension.



                          • sad news

                            That's really bad news. The main problem is that there's no alternative: today it's impossible to make Linux run well on cards like the Radeon 9xxx (pre-HD), which are still technically perfectly capable.
                            The current open source drivers don't even allow YouTube videos to run smoothly on this kind of graphics card.
                            Since it's impossible to upgrade the GPU in most notebooks, the number of Linux users will only decrease.



                            • Originally posted by Kivada View Post
                              Block the ads, collapse their placeholders, and hit Ctrl+"+" to scale up the page. It scales everything: graphics, videos, Flash, etc., unlike most browsers that only increase the font size, which causes text to go places it shouldn't. It works great here at 133 PPI, and Firefox remembers the zoom setting for every site and automatically applies it site-wide.

                              Also, Image Zoom is a really nice extension.
                              Alas, I have multiple machines on which I have to keep everything synced up, settings and add-ons included. There is no place for this kind of manipulation. Hah, I think I'll just file a bug report for this issue.



                              • Originally posted by adrenochrome View Post
                                That's really bad news. The main problem is that there's no alternative: today it's impossible to make Linux run well on cards like the Radeon 9xxx (pre-HD), which are still technically perfectly capable.
                                The current open source drivers don't even allow YouTube videos to run smoothly on this kind of graphics card.
                                Since it's impossible to upgrade the GPU in most notebooks, the number of Linux users will only decrease.
                                You're talking as if the old drivers were deliberately destroyed. The problem with the drivers you mention is that they were written against the old, now-deprecated DRI infrastructure, while all the current drivers are DRI2-compliant. No one wanted to port the old drivers, and the maintenance cost was too big. Nothing like this will ever happen again!

