I Miss My MacBook Pro, Buggy Iris Graphics Gives Headaches


  • #61
    Originally posted by zanny
    All the way through, the correct answer is to buy hardware supported by your OS. Duh. It is actually a terrible ideology that the Linux community pushes that says "no, please don't worry about your hardware, it will work one day!" No, it won't. The hardware is the responsibility of the device manufacturer, and all the thousands of man-hours of reverse engineering the community has put into making these piles of refuse work only give companies a reason never to support Linux, and to keep making brilliant people waste their time doing the work for them.
    I buy AMD GPUs. Why? AMD supports Linux with proper free drivers.
    You know you contradict yourself there, right?



    • #62
      You know you contradict yourself there, right?
      Um, how? I didn't buy an SI part until it had comprehensive, up-to-date support, and even then I only bought one a year after release to be sure of compatibility (a 7870). Before that, I had a 5850 that worked well for a few years, and I only upgraded when I had comprehensive, stable driver support for the newer cards.

      And yeah, graphics cards are one of the most limited-choice markets. But feel free to let me know what other options I have for discrete graphics cards with free-software drivers, from a vendor that contributes more developers than AMD.



      • #63
        Originally posted by zanny
        Um, how? I didn't buy an SI part until it had comprehensive, up-to-date support, and even then I only bought one a year after release to be sure of compatibility (a 7870). Before that, I had a 5850 that worked well for a few years, and I only upgraded when I had comprehensive, stable driver support for the newer cards.
        None of those devices is fully supported to its hardware capabilities with the free drivers, however, so you are still buying hardware that is not living up to its capabilities. Heck, the 290 series is still without working support in the free drivers, and if you choose to use the blob you gain features and make better use of the hardware, but the card is still being held back in terms of performance. Like it or hate it, your best overall video hardware, in terms of actually being able to use its full capabilities, is an Nvidia card. Unless you yourself plan on actually coding that support, using the source code and documentation to develop better card support, there is really little to no advantage to going the free-driver route, and you give up a lot of functionality.



        • #64
          Originally posted by zanny
          But feel free to let me know what other options I have for discrete graphics cards with free-software drivers, from a vendor that contributes more developers than AMD.
          I think this is what you want



          • #65
            Originally posted by deanjo
            Sorry, gotta call pure 100% BS on that one. I've never had an upgrade on OS X take longer than 30 minutes, dating all the way back to 10.4, and some of those were on pokey single-core x86s and G4s/G5s. Trust me, I've handled well over 1500 OS X upgrades.
            That's not what I've seen for years. I think it's probably due to developers having to upgrade their dev environments.

            There is plenty of server hardware out there for it, from SANs and 10GbE Ethernet to rackmounts, RAID cards, etc.
            What I meant was Xserve boxes or the like. I mean, sure, we can all get a Synology NAS and plug it into a Mac, but when it comes to running a server farm, not even Pixar uses OS X (yes, I know that is a dated link, but there has been no indication of change). Back when they were selling Xserve boxes, they consistently got hammered by Linux, Solaris, and even Windows on performance. I also understand the latest OS X home server product got severely neutered a few years back too, although I don't have any experience with it.

            I know, or know of, hundreds of companies that deploy on Linux clusters. I know of zero that deploy on OS X clusters. Why do we keep training devs to use iTunes when they should know their deployment environment better?

            In any event, my post was pointing out that there are many benefits to running Linux as your main OS. For me, they significantly outweigh any disadvantages, especially since I pick stable, tested hardware up front. And I believe that can be true for a lot more people, especially developers.
            Last edited by deppman; 29 May 2014, 09:05 PM.



            • #66
              Originally posted by deppman
              That's not what I've seen for years. I think it's probably due to developers having to upgrade their dev environments.
              Dude, those upgrades were done in dev departments. There is a big difference in deployment times between someone who actually knows how to deploy something properly and someone kludging around trying to figure out a deployment because of their unfamiliarity with the system. Ask a Windows guru to deploy Linux upgrades and he is likely to find it slow and kludgy as well; the same goes for any deployment where the person is working on an infrastructure and OS foreign to them.


              What I meant was Xserve boxes or the like. I mean, sure, we can all get a Synology NAS and plug it into a Mac, but when it comes to running a server farm, not even Pixar uses OS X (yes, I know that is a dated link, but there has been no indication of change). Back when they were selling Xserve boxes, they consistently got hammered by Linux, Solaris, and even Windows on performance. I also understand the latest OS X home server product got severely neutered a few years back too, although I don't have any experience with it.

              I know, or know of, hundreds of companies that deploy on Linux clusters. I know of zero that deploy on OS X clusters. Why do we keep training devs to use iTunes when they should know how to use the command line better?
              Oh, you should know a few; Facebook and Mozilla, for example, among quite a few others. The hardware is out there to go big as well; Sonnet alone has quite a selection, but it is far from the only player.

              Last edited by deanjo; 29 May 2014, 09:20 PM.



              • #67
                Originally posted by deanjo
                Dude, those upgrades were done in dev departments.
                That doesn't make the point any less valid. The bottom line is that the Linux upgrades I've seen take hours, while the OS X upgrades take the devs days.

                Oh, you should know a few; Facebook and Mozilla, for example, among quite a few others. The hardware is out there to go big as well; Sonnet alone has quite a selection, but it is far from the only player.
                Yes, they might have a tiny niche for certain server applications, although I wouldn't be lining up to convert my compute clusters to Mac minis any time soon. The point is that there are far more developers deploying on Linux than on OS X. And a lot of them would benefit if they actually used Linux as their development environment.

                It's not about dissing Apple, but about pointing out that a Linux client has real, tangible benefits that can make it a better choice as a client OS.



                • #68
                  Originally posted by zanny
                  If I set it higher I'd get tessellation and all-surface reflection that could cripple my framerate, though.
                  The OpenGL renderer of the 4A Engine doesn't support tessellation, at least for now.
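                  (As an aside, driver-side support is a separate question from engine support. Here is a rough, untested C sketch of how you could probe what your own driver exposes, using GLFW for the context; this is just an illustration, not how the 4A Engine detects it:)

                  /* probe.c: check whether the current OpenGL driver exposes
                   * tessellation (core in GL 4.0, or GL_ARB_tessellation_shader).
                   * Build with: cc probe.c -lglfw -lGL */
                  #include <stdio.h>
                  #include <string.h>
                  #include <GLFW/glfw3.h>

                  int main(void)
                  {
                      if (!glfwInit())
                          return 1;
                      glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE); /* no window on screen */
                      GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
                      if (!win) { glfwTerminate(); return 1; }
                      glfwMakeContextCurrent(win);

                      /* Legacy extension string; fine in a compatibility context. */
                      const char *ext = (const char *)glGetString(GL_EXTENSIONS);
                      int tess = ext && strstr(ext, "GL_ARB_tessellation_shader");
                      printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
                      printf("tessellation: %s\n", tess ? "available" : "not available");

                      glfwTerminate();
                      return 0;
                  }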



                  • #69
                    Originally posted by RussianNeuroMancer
                    The OpenGL renderer of the 4A Engine doesn't support tessellation, at least for now.
                    They do some kind of depth texture effect, maybe depth of field, but it nukes the framerate either way.

                    you are still buying hardware that is not living up to its capabilities
                    I guess a definition of "sufficient capabilities" is relevant here.

                    For all intents and purposes, for my x86 CPU to "work", it needs to provide complete support for the x86-64 instruction set and all the extension sets it's rated for (in the case of Haswell, that would be AVX2, etc.).
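                    (For example, a quick sanity check along these lines; an untested sketch using GCC's __builtin_cpu_supports, which Clang provides as well:)

                    /* features.c: confirm the CPU exposes the extension sets a
                     * Haswell part is rated for. Build with: cc features.c */
                    #include <stdio.h>

                    int main(void)
                    {
                        __builtin_cpu_init(); /* populate the CPU feature data */

                        printf("avx2: %s\n", __builtin_cpu_supports("avx2") ? "yes" : "no");
                        printf("fma : %s\n", __builtin_cpu_supports("fma") ? "yes" : "no");
                        printf("bmi2: %s\n", __builtin_cpu_supports("bmi2") ? "yes" : "no");
                        return 0;
                    }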

                    For my GPU to work, I would expect the same. Radeon SI has an open ISA published, as do Intel parts, and if I can use the ISA to completely control the device, I consider that sufficient.

                    I believe I've heard that AMD has hidden some of their instructions for internal use. If that is true, I'd agree the cards are not as open as the Intel ones.

                    The OpenGL driver sitting on top of all that is kind of just a legacy mess, in my opinion. I am grateful they support it, as they also do with GCC, as does Intel, etc. But for my needs, a fully programmable, openly documented device meets my criteria.
                    Last edited by zanny; 30 May 2014, 02:34 PM.

