GCC 10 Compiler Drops IBM Cell Broadband Engine SPU Support


  • coder
    replied
    Originally posted by edwaleni View Post
    Cell was interesting. It was a response to the needs of its time.
    It was ahead of its time, really. If OpenCL had been around when it launched (not to mention SYCL), it would probably have gained much more traction among developers.

    As one might infer from that, more capable GPUs obviated the need for quite such a thing.



  • LoveRPi
    replied
    Originally posted by Alex/AT View Post
    Insanely "high" performance POWER-based architecture with insane development and optimization costs plus no good optimization compatibility with existing software (SPEs are 'units in themselves' that are just pieces of compute power like GPU units that cannot run generalized software). The end result (death of the platform) was pretty predictable.
    The architecture was originally designed for 90nm, where transistors were still precious. They segregated the SPEs because a more complicated instruction set would have been too complex to implement given the silicon budget. However, as I said before, it did force explicit synchronization instead of relying on cache coherency, which curbed programmers' bad habits and actually forced them to think and plan.

    If they had to re-implement the architecture today, I imagine they could address numerous deficiencies and avoid the limitations of that era. You can clearly see many design philosophies of the architecture integrated into chips today. Together with a more reconfigurable crossbar switch, rings, a heterogeneous ISA, and chiplets, it could produce processors that would be real monsters in real-world applications.
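    The explicit-synchronization model described above can be sketched in plain C. This is purely an illustration: `dma_get`, `dma_put`, and `LS_CHUNK` are made-up names, and `memcpy` stands in for the SPU toolchain's real DMA intrinsics (`mfc_get`/`mfc_put` plus tag-status waits). The point is the programming model, not the API: the programmer, not a coherent cache, decides what data is resident in the local store and when.

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Illustration of the SPE programming model: each SPE owns a small
     * local store and must move data in and out with explicit DMA
     * transfers (mfc_get/mfc_put on real hardware) instead of relying
     * on cache coherency. memcpy stands in for the DMA engine here. */

    #define LS_CHUNK 4          /* elements per local-store buffer (toy size) */

    static void dma_get(float *local, const float *main_mem, int n) {
        memcpy(local, main_mem, n * sizeof *local);   /* mfc_get + tag wait */
    }

    static void dma_put(float *main_mem, const float *local, int n) {
        memcpy(main_mem, local, n * sizeof *local);   /* mfc_put + tag wait */
    }

    int main(void) {
        float input[8]  = {1, 2, 3, 4, 5, 6, 7, 8};
        float output[8] = {0};
        float ls[LS_CHUNK];                /* the SPE's private local store */

        /* Stream the array through the local store one chunk at a time:
         * the programmer schedules every transfer explicitly. */
        for (int base = 0; base < 8; base += LS_CHUNK) {
            dma_get(ls, input + base, LS_CHUNK);       /* pull chunk in   */
            for (int i = 0; i < LS_CHUNK; i++)
                ls[i] *= 2.0f;                         /* compute locally */
            dma_put(output + base, ls, LS_CHUNK);      /* push result out */
        }

        for (int i = 0; i < 8; i++)
            printf("%g ", output[i]);
        printf("\n");
        return 0;
    }
    ```

    On real hardware the compute loop would overlap with the next chunk's transfer (double buffering), which is exactly the kind of planning the SPE design forced on programmers.
    
    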



  • coder
    replied
    Originally posted by LoveRPi View Post
    Insanely high performance architecture that developers were too lazy to design for. IBM did everything right except make the barriers to entry low enough for average devs but cache coherency needs to die.
    Nailed it! Cell proved what's possible when hardware stops bending over backward to cater to poor code. IMO, that was the last generation of consoles that actually leapfrogged PCs in any meaningful way.

    So, are any indie devs still making PS3 games or demos? I don't know if the PS3 will ever attain this sort of vintage status, but devs are still working with far more obscure hardware:

    https://www.theverge.com/2012/5/1/29...trex-demoscene
    http://www.pouet.net/prod.php?which=59227

    There's a good Vectrex emulator, in case you can't scrounge up a working hardware unit.



  • torsionbar28
    replied
    Originally posted by s_j_newbury View Post
    I don't really understand why GCC would drop support for a relatively modern, powerful (even if flawed) ubiquitous hardware platform that's still used by enthusiasts just because the vendor no longer supports it. Since when was GCC a vendor oriented compiler project? Is it just a lack of maintainers for the code in question?
    This has been the trend lately in the Linux kernel world. Linux used to be the OS of choice for re-purposing old hardware; not so much any more. NetBSD is probably the only OS left that targets all kinds of old and obscure hardware.

    The main reason is probably Linux's popularity. Back in the day, it took years before new hardware was supported on Linux, because the folks writing the drivers were mainly hobbyists working in their free time. Now that there's big corporate sponsorship, it's a double-edged sword. On the one hand, when a company puts full-time devs on Linux kernel drivers, we get drivers that are robust and complete when (or even before) the hardware is released. On the other hand, those companies don't want to pay those devs indefinitely to maintain the drivers. So unless someone else steps up and volunteers, the drivers become abandonware and get dropped from the kernel.

    Personally, I think the kernel and toolchain folks should keep support for old hardware unless it starts failing to compile or some glaring flaw is discovered in it. I don't see how keeping the old hardware support in there harms anything. But then again, I'm not a developer.
    Last edited by torsionbar28; 04 September 2019, 02:38 PM.



  • jabl
    replied
    Originally posted by s_j_newbury View Post
    Of course this is true, but maybe they just needed to draw attention to the fact it needs a maintainer, or maybe that's part of the intention with dropping the support, although deprecating it first might have been sufficient.
    Yes, the GCC rule is that functionality is deprecated for one major release before it's removed, precisely to let the wider community know what's coming and give somebody time to take over maintenance. Accordingly, the deprecation was mentioned in the release notes of the last major release: https://gcc.gnu.org/gcc-9/changes.html

    Now it has been removed from GCC trunk, which will eventually become GCC 10. Though if someone steps up to maintain it, I'm sure the removal can still be reverted.



  • s_j_newbury
    replied
    Originally posted by jabl View Post

    Actively developed, complex pieces of software like GCC require constant work on the CPU targets to keep up with changes. If nobody is willing to maintain any particular CPU target support, that support will eventually be deprecated and subsequently removed. Which is what happened here. The GCC community is under no obligation to maintain support for dead targets indefinitely.
    Of course this is true, but maybe they just needed to draw attention to the fact it needs a maintainer, or maybe that's part of the intention with dropping the support, although deprecating it first might have been sufficient.


    That Sony prevented 3rd party OS'es on the PS3 was probably also a quite effective tool to drastically reduce any volunteer interest to work on that target.
    No doubt Sony destroyed the hobbyist/non-gaming development communities when they removed "Other OS". It certainly stopped me from working on it at the time, since I also wanted to be able to play games on PSN. I even bought the Linux Dev Kit for the PS2 back in the day and made good use of it, so it's not like I'm complaining about a hypothetical: I bought the PS3 with the plan to develop and hack with Linux on it!



  • s_j_newbury
    replied
    The PS3 is jailbroken, though; nothing stops anybody from declining to update their firmware to a "fixed" version, or from downgrading to a working one (which may require a dongle). It would be great if Sony would open up the platform, but that really isn't likely.

    It is interesting hardware for the "homebrew scene".

    https://www.psdevwiki.com/ps3/



  • jabl
    replied
    Originally posted by s_j_newbury View Post
    I don't really understand why GCC would drop support for a relatively modern, powerful (even if flawed) ubiquitous hardware platform that's still used by enthusiasts just because the vendor no longer supports it. Since when was GCC a vendor oriented compiler project? Is it just a lack of maintainers for the code in question?
    Actively developed, complex pieces of software like GCC require constant work on the CPU targets to keep up with changes. If nobody is willing to maintain any particular CPU target support, that support will eventually be deprecated and subsequently removed. Which is what happened here. The GCC community is under no obligation to maintain support for dead targets indefinitely.

    That Sony prevented 3rd party OS'es on the PS3 was probably also a quite effective tool to drastically reduce any volunteer interest to work on that target.



  • edwaleni
    replied
    Cell was interesting. It was a response to the needs of its time. It didn't stick, but it exposed some new thinking in compute hardware design. Not all designs that respond to a problem catch on. But that happens with most innovations.



  • kpedersen
    replied
    I only worked on one project involving the PS3 hardware (porting a lesser-known game title). I found it really interesting, but didn't get much of a chance to delve deep into the hardware or the SPUs (the cheap port of the game didn't require many resources). I would love to have a play again (we even still have one of the devkits lying around).

    Slightly unrelated (because access to the SPUs is restricted), but I don't believe Sony has much to lose by reinstating the third-party OS functionality. I doubt they want to spend funds on programmers to make such a patch, though.

