
A Revived Linux Driver To Be Attempted For The ATI RAGE 128


  • #21
    Originally posted by nanonyme View Post

    Maybe old but not *that* old. I'd say hardware older than what r600 driver handles (ballpark of ten years old) is not really useful for modern world purposes.
    True, but ~r600 is still extremely complex to jump into at the deep end. When the crap hits the fan, we really will need to start from scratch and build our way up again. It is going to be horrible!

    Plus, I don't think modern GPUs will exist for the consumer to buy. I think you will have to stream everything to and from the "cloud". I would take a slower card over that any day, especially when offline haha.

    Once the digital world locks itself down and regresses enough, the old stuff won't actually be at too much of a disadvantage. A good comparison is that even to this very day, you can achieve so much more productivity on Windows 3.1 or *DOS than you can on a locked-down iOS.
    Last edited by kpedersen; 09 June 2018, 07:01 AM.



    • #22
      I still have a «brand new» pair of AGP ATi Radeon 200/300 video cards from just before ATi was bought by AMD, but no AGP motherboard to put them in.



      • #23
        Originally posted by onicsis View Post
        I still have a «brand new» pair of AGP ATi Radeon 200/300 video cards from just before ATi was bought by AMD, but no AGP motherboard to put them in.


        Quite a few of them work in the ALBATRON ATOP AGP-to-PCI-Express adapter. This is partly why really old AGP stuff is not dying out quickly: people are putting these cards into modern PCIe motherboards via bridge cards. So it's either an old AGP motherboard or a new motherboard with a bridge card converting PCIe to AGP. Various places still sell the bridge cards.



        • #24
          This is good news, and I hope he succeeds with OpenChrome, his reusable DRM work, and R128. Old computers - as long as they aren't power-wasters - can still be put to good use, even if these cards just work as a not-entirely-headless monitor adapter for some server system. Low power consumption, nearly no heat added to the system, and it gives you 80x25 text or some graphical resolution on screen so you can get stuff done.
          Stop TCPA, stupid software patents and corrupt politicians!



          • #25
            It does look fancy, but it seems to require a lot more circuitry than PCI-to-PCIe solutions. And:
            "Requires driver installation"
            That made me a little wary. I might check the kernel config for relevant options, if I can find something there. Moreover, will this work on all mainboards at a low level, i.e. before the OS loads?



            • #26
              Originally posted by kpedersen View Post
              Plus, I don't think modern GPUs will exist for the consumer to buy. I think you will have to stream everything to and from the "cloud". I would take a slower card over that any day, especially when offline haha.
              It might and might not go that way. Internet bandwidth is not an infinite resource. It's already quite stretched and borderline bottlenecked on a global scale. Putting "everything" into the cloud might not be an option.



              • #27
                We already had this bandwidth problem with streamed video; it was fixed by colocating servers with residential ISPs, although that works best for near-monopolies like YouTube, Netflix, etc. "Streaming" is a misnomer in that case, as these are in fact downloads where we watch the video as it downloads.
                In some ways, game streaming would be relatively easy if you have a short path to the customer, since a single file server in your server farm can store every PS4 game and every Xbox game.

                What sucks is that you need low latency and low jitter.
                What helps is that many people watch others playing instead of playing themselves, and you can stream that with higher latency and a more bandwidth-efficient encoding.
                One solution is to have a communal building next to the datacenter to house hundreds of no-life gamers! That might work even in the middle of a desert.



                • #28
                  Originally posted by aht0 View Post
                  It might and might not go that way. Internet bandwidth is not an infinite resource. It's already quite stretched and borderline bottlenecked on a global scale. Putting "everything" into the cloud might not be an option.
                  Oh, I am certainly not saying it will be a good experience for us lowly peasants, but I am almost 100% certain companies will make it go that way, for greed and control. It is already becoming quite commonplace for a computer to be unusable offline.

                  Also, an additional note: didn't Mesa recently drop OpenGL 2.0 support for the Intel GMA 945 and below? So in theory the old ATI Rage 128 actually has the same OpenGL support as my GMA 915 card, which is still in heavy use haha.
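For anyone wanting to compare what GL level a given card/driver combination actually exposes, `glxinfo` prints an "OpenGL version string" line that can be checked between machines. A minimal sketch that pulls the version number out of such output (the sample line below is illustrative, not captured from real hardware):

```python
import re

def gl_version(glxinfo_output: str) -> str:
    """Extract the GL version number (e.g. '1.3') from glxinfo output."""
    m = re.search(r"OpenGL version string:\s*(\S+)", glxinfo_output)
    return m.group(1) if m else "unknown"

# Hypothetical excerpt; in practice you would feed in the output of
# running `glxinfo` on the machine in question.
sample = "OpenGL version string: 1.3 Mesa 18.0.5"
print(gl_version(sample))  # -> 1.3
```

On a live system the same check is just `glxinfo | grep "OpenGL version"`.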



                  • #29
                    Originally posted by kpedersen View Post

                    I can't help but feel that if the open-source community were ever going to take their lives into their own hands and start creating their own hardware, the very old chips will be the best place to start. The more knowledge and support we have for them now will put us in the best position for then.
                    I don't see this ever happening. For one, how would you generate enough volume to cover manufacturing costs?

                    Basically, there is no point having fantastic open-source driver support for the most modern card that we can only use on locked down Windows, Android, iOS anyway (yes, this will almost certainly happen (I predict around 2025?) unless FOSS can start generating hardware).
                    What we have seen over the last few years is just the opposite, with both AMD and Intel being very open. The one holdout that really matters is the ARM world, where GPU support really sucks. At least at the chip level, the world is currently going in the opposite direction from what you imagine.

                    So right now we have the simpler option of just making a PCB to support these chips, yet I've yet to see a long-term project have much success here outside of the Raspberry Pi. If the FOSS world really has these concerns, it is already possible to design an open system board. That hasn't happened with any great success, so how would you design a GPU that would gain enough traction to pay for itself?

                    Over the years a number of so-called open RISC CPUs have appeared, none of which could be considered a success. Even the current poster child has yet to impress with system availability. So I'm not inclined to stay up waiting for FOSS hardware.



                    • #30
                      Originally posted by kpedersen View Post

                      True, but ~r600 is still extremely complex to jump into at the deep end. When the crap hits the fan, we really will need to start from scratch and build our way up again. It is going to be horrible!
                      If crap did indeed hit the fan, I really doubt many would be worried about their computers. People in the cities would be wiped out. At best a few farmers in the country would survive, and it wouldn't be easy for them.
                      Plus, I don't think modern GPUs will exist for the consumer to buy. I think you will have to stream everything to and from the "cloud". I would take a slower card over that any day, especially when offline haha.
                      I really don't see that happening. For one, streaming is a terrible solution for reliable computing; second, there will never be enough bandwidth.
                      Once the digital world locks itself down and regresses enough, the old stuff won't actually be at too much of a disadvantage.
                      Baloney! As long as unlocked hardware exists, it makes good sense to keep your systems current, technology-wise.
                      A good comparison is that even to this very day, you can achieve so much more productivity on Windows 3.1 or *DOS than you can on a locked-down iOS.
                      This is more ignorance than I can rationally believe comes from someone familiar with computing hardware. First off, many of us buy iPhones precisely because they are locked down. After all, a cell phone is a critical communications device for many of us.

                      Beyond that, you can't seriously compare hardware from the days of DOS with a modern cell phone's hardware. The hardware in the iPhone runs rings around most of the hardware I've ever owned, and that goes all the way back to a VIC-20. It is certainly faster than the Windows 3.1, Windows 95, and even the first NT machines I had, and far better than those machines were when I installed Linux on them in that era. I might also remind you that iOS is an OS light-years ahead of DOS or even those early Windows variants.

                      In any event, the iPhone is only locked down to a certain degree. As a developer I can load whatever I want onto the machine, within what makes sense on the platform. If I want, I can make an app a free download on the App Store for everybody. And if a developer follows Apple's guidelines, anybody who downloads an App Store app has a certain degree of confidence that it will not compromise their iPhone. You can call that locked down, but for many of us it is a bit of security.

                      In any event, the point here is that whining about locked-down hardware is nonsense if that is what the customer wants, especially when unlocked hardware is freely available. The sky isn't falling yet, so let's not claim it is.

