GCC 10 Compiler Drops IBM Cell Broadband Engine SPU Support

  • #11
    Cell was interesting. It was a response to the needs of its time. It didn't stick, but it introduced some new thinking in compute hardware design. Not every design that responds to a problem catches on, and that's true of most innovations.



    • #12
      Originally posted by s_j_newbury View Post
      I don't really understand why GCC would drop support for a relatively modern, powerful (even if flawed) ubiquitous hardware platform that's still used by enthusiasts just because the vendor no longer supports it. Since when was GCC a vendor oriented compiler project? Is it just a lack of maintainers for the code in question?
      Actively developed, complex pieces of software like GCC require constant work on the CPU targets to keep up with changes. If nobody is willing to maintain any particular CPU target support, that support will eventually be deprecated and subsequently removed. Which is what happened here. The GCC community is under no obligation to maintain support for dead targets indefinitely.

      That Sony prevented third-party OSes on the PS3 was probably also quite an effective way to drastically reduce volunteer interest in working on that target.



      • #13
        The PS3 is jailbroken, though; nothing stops anybody from staying on a working firmware version instead of updating to a "fixed" one, or from downgrading (which may require a dongle?). It would be great if Sony would open up the platform, but it really isn't likely.

        It is interesting hardware for the "homebrew scene".

        Playstation Development Wiki - PS5, PS4, PS3, PS2, PS1, PSP, Vita Information



        • #14
          Originally posted by jabl View Post

          Actively developed, complex pieces of software like GCC require constant work on the CPU targets to keep up with changes. If nobody is willing to maintain any particular CPU target support, that support will eventually be deprecated and subsequently removed. Which is what happened here. The GCC community is under no obligation to maintain support for dead targets indefinitely.
          Of course this is true, but maybe they just needed to draw attention to the fact that it needs a maintainer; or maybe that's part of the intention behind dropping the support, although deprecating it first might have been sufficient.


          That Sony prevented third-party OSes on the PS3 was probably also quite an effective way to drastically reduce volunteer interest in working on that target.
          No doubt Sony destroyed the hobbyist/non-gaming development communities back when they removed "Other OS"; it certainly stopped me from working on it at the time, since I also wanted to be able to play games on PSN. I even bought the Linux Dev Kit for the PS2 back in the day and made good use of it, so it's not like I'm just complaining about a hypothetical: I bought the PS3 with the plan to develop/hack with Linux on it!



          • #15
            Originally posted by s_j_newbury View Post
            Of course this is true, but maybe they just needed to draw attention to the fact that it needs a maintainer; or maybe that's part of the intention behind dropping the support, although deprecating it first might have been sufficient.
            Yes, the GCC rule is that functionality is deprecated for one major release before it's removed, precisely to let the larger community know what's coming and to give somebody time to take over maintenance. Accordingly, it was mentioned in the release notes of the last major release: https://gcc.gnu.org/gcc-9/changes.html

            Now it has been removed from GCC trunk, which will eventually become GCC 10. Though if someone steps up to maintain it, I'm sure the removal can still be reverted.



            • #16
              Originally posted by s_j_newbury View Post
              I don't really understand why GCC would drop support for a relatively modern, powerful (even if flawed) ubiquitous hardware platform that's still used by enthusiasts just because the vendor no longer supports it. Since when was GCC a vendor oriented compiler project? Is it just a lack of maintainers for the code in question?
              This has been the trend lately in the Linux kernel world. Linux used to be the OS of choice for repurposing old hardware; not so much anymore. NetBSD is probably the only OS left that targets all kinds of old and obscure hardware.

              The main reason is probably related to Linux's popularity. Back in the day, it took years before new hardware was supported on Linux, because the folks writing the drivers were mainly hobbyists working in their free time. Now that there's big corporate sponsorship, it's a double-edged sword. On the one hand, when a company puts full-time devs on Linux kernel drivers, we get drivers that are robust and complete when (or even before) the hardware is released. On the other hand, those companies don't want to pay those devs to maintain the drivers until the end of time. So unless someone else steps up and volunteers, the drivers become "abandonware" and get dropped from the kernel.

              Personally, I think the kernel and toolchain folks should keep support for old hardware, unless it starts failing to compile, or some glaring flaw is discovered in it. I don't see how it's harming anything to keep the old hardware support in there. But then again, I'm not a developer.
              Last edited by torsionbar28; 04 September 2019, 02:38 PM.



              • #17
                Originally posted by LoveRPi View Post
                Insanely high performance architecture that developers were too lazy to design for. IBM did everything right except make the barriers to entry low enough for average devs but cache coherency needs to die.
                Nailed it! Cell proved what's possible when hardware stops trying to bend over backward to cater to poor code. IMO, that was the last generation of consoles that actually leapfrogged PCs in any meaningful way.

                So, are any indie devs still making PS3 games or demos? I don't know if the PS3 will ever attain this sort of vintage status, but devs are still working with far more obscure hardware:

                Members of Carnegie Mellon University's computer club have somehow managed to not only obtain a working GCE Vectrex, but create an incredible 64K audiovisual demo on the obscure, 30-year-old game console.

                There's a good Vectrex emulator, in case you can't scrounge a working HW unit.



                • #18
                  Originally posted by Alex/AT View Post
                  Insanely "high" performance POWER-based architecture with insane development and optimization costs plus no good optimization compatibility with existing software (SPEs are 'units in themselves' that are just pieces of compute power like GPU units that cannot run generalized software). The end result (death of the platform) was pretty predictable.
                  The architecture was originally designed for 90nm, when transistors were still precious. They segregated the SPEs because a more complicated instruction set would have been too complex to implement given the silicon budget. However, as I said before, it did force explicit synchronization instead of relying on cache coherency, which reduced programmers' bad habits and actually forced them to think and plan.
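
                  To make that concrete, here is a minimal sketch of the explicit-synchronization style being described, assuming the Cell SDK's spu_mfcio.h intrinsics (the kind of code the SPU backend now dropped from GCC used to compile); the buffer name, chunk size, and effective address are hypothetical:

                  ```c
                  #include <spu_mfcio.h>  /* Cell SDK MFC (DMA) intrinsics */

                  #define CHUNK 4096

                  /* SPE local store buffer: there is no cache, so all data is staged
                     in and out of the 256 KiB local store explicitly, and DMA buffers
                     should be aligned. */
                  static char buf[CHUNK] __attribute__((aligned(128)));

                  void process_chunk(unsigned long long ea)  /* ea = main-memory address */
                  {
                      const unsigned int tag = 0;

                      /* Explicit transfer: pull one chunk from main memory into local store. */
                      mfc_get(buf, ea, CHUNK, tag, 0, 0);

                      /* Explicit synchronization: block until that DMA tag completes.
                         This is the step cache coherency would otherwise hide from you. */
                      mfc_write_tag_mask(1 << tag);
                      mfc_read_tag_status_all();

                      /* ... compute on buf in local store ... */

                      /* Push the result back to main memory, again explicitly. */
                      mfc_put(buf, ea, CHUNK, tag, 0, 0);
                      mfc_write_tag_mask(1 << tag);
                      mfc_read_tag_status_all();
                  }
                  ```

                  Double-buffering (kicking off the DMA for the next chunk while computing on the current one) was the standard way to hide transfer latency, which is exactly the kind of up-front planning the model forced on you.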

                  If they had to re-implement the architecture today, I would imagine they could address numerous deficiencies and avoid the limitations of that era. You can clearly see many of the architecture's design philosophies integrated into chips today. Together with a more reconfigurable crossbar switch, rings, a heterogeneous ISA, and chiplets, it could produce processors that would be real monsters in real-world applications.



                  • #19
                    Originally posted by edwaleni View Post
                    Cell was interesting. It was a response to the needs of its time.
                    It was ahead of its time, really. If OpenCL had been around when it launched (not to mention SYCL), it'd have probably gained much more traction among developers.

                    As one might infer from that, more capable GPUs obviated the need for quite such a thing.
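
                    For what it's worth, here is a minimal sketch of the OpenCL 1.x host-side flow in C (the kernel source, buffer size, and stripped-out error handling are my own illustration, not anything Cell-specific). The farm-work-out-to-compute-units model the SPEs demanded is wrapped here in a portable API, which is roughly what Cell was missing at launch:

                    ```c
                    #include <CL/cl.h>
                    #include <stdio.h>

                    /* A trivial kernel: double every element in place. */
                    static const char *src =
                        "__kernel void scale(__global float *v) {\n"
                        "    v[get_global_id(0)] *= 2.0f;\n"
                        "}\n";

                    int main(void)
                    {
                        enum { N = 1024 };
                        float data[N];
                        for (int i = 0; i < N; i++) data[i] = (float)i;

                        /* Pick whatever compute device the platform offers. */
                        cl_platform_id plat;
                        cl_device_id dev;
                        clGetPlatformIDs(1, &plat, NULL);
                        clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

                        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
                        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

                        /* Explicit data movement, like the SPE model, but portable. */
                        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                                    sizeof(data), data, NULL);

                        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
                        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
                        cl_kernel k = clCreateKernel(prog, "scale", NULL);
                        clSetKernelArg(k, 0, sizeof(buf), &buf);

                        /* Farm the work out to the device... */
                        size_t global = N;
                        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

                        /* ...and read the results back, blocking until done. */
                        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
                        printf("data[3] = %f\n", data[3]);  /* expect 6.0 */
                        return 0;
                    }
                    ```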



                    • #20
                      Originally posted by Alex/AT View Post
                      Insanely "high" performance POWER-based architecture with insane development and optimization costs plus no good optimization compatibility with existing software (SPEs are 'units in themselves' that are just pieces of compute power like GPU units that cannot run generalized software). The end result (death of the platform) was pretty predictable.
                      Well, IBM was positioning it for HPC. Had GPUs not come along, I think it would've definitely spawned further generations for that market.

                      Originally posted by Alex/AT View Post
                      Also, the general mainstream software compiler optimization for PUs without branch prediction and active microcode execution thread management is a pain, and all such platforms are pretty much doomed.
                      Well, you could try to say the same for GPUs. Except it turns out that if the raw capability is great enough, developers are quite willing to jump through some hoops to harness it.

