Facebook Is Aiming To Make Compilers Faster Using Machine Learning With CompilerGym
-
Neural networks are revolutionary, but overrated. Still, the idea looks promising as long as there's Natural Language Processing piped into it to document its methods, so the result can be turned into a hand-crafted algorithm and written up as a scientific paper. Otherwise, when you trust magic without knowing how it works, you get results like this.
- Likes 3
-
Originally posted by bregma View Post
(3) how faster compilers equate with faster programs is beyond my understanding.
- Likes 1
Leave a comment:
-
>Our goal is to be a catalyst for using ML to make compilers faster, which is important as poorly optimized programs are slow and consume too many computing resources as well as too much energy, limiting applications of energy-efficient edge devices and making data centers less environmentally friendly
Are they going to work on deleting Electron?
- Likes 3
Leave a comment:
-
Pretty cool. I once read a blog post about GitHub Copilot that hypothesized that AI built into whatever the next big language is will be its defining feature, the way safety is the main feature of Rust. I think there's some credence to that.
Originally posted by bregma View Post
This seems very confusing. Their goal is to have the compiler produce more "optimized" (for what?) programs by calling out to some Python programs that iterate over combinatoric solutions using inscrutable and unknown heuristics, and that is somehow supposed to speed up the compiler. It strikes me that (1) adding a whole lot of out-of-process processing is unlikely to make compilation faster, (2) it is unclear how the correctness of the output of an ML-generated algorithm laden with intellectual debt could be proven, and (3) how faster compilers equate with faster programs is beyond my understanding.
I guess the proof will be in the pudding. Or at least, the results will be in the pudding. I wouldn't trust my life or my liability to ML-generated software. Asimov's laws are fiction.
At least a lot of the AI world is using embeddings now, which is probably as close to symbolic representation as you can get in a connectionist algorithm.
Leave a comment:
-
The best way to optimize code is to stop using shit like Electron… and come back to a reasonable use of low-level coding.
- Likes 15
Leave a comment:
-
This seems very confusing. Their goal is to have the compiler produce more "optimized" (for what?) programs by calling out to some Python programs that iterate over combinatoric solutions using inscrutable and unknown heuristics, and that is somehow supposed to speed up the compiler. It strikes me that (1) adding a whole lot of out-of-process processing is unlikely to make compilation faster, (2) it is unclear how the correctness of the output of an ML-generated algorithm laden with intellectual debt could be proven, and (3) how faster compilers equate with faster programs is beyond my understanding.
I guess the proof will be in the pudding. Or at least, the results will be in the pudding. I wouldn't trust my life or my liability to ML-generated software. Asimov's laws are fiction.
- Likes 5
Leave a comment:
-
Originally posted by ddriver View Post
Great, maybe they can use AI to figure out how to not be such a sh1tty company.
Also, having used React, I somehow doubt performance and efficiency are true priorities.
- Likes 5
Leave a comment:
-
GCC also needs to get configurable phase ordering so it can benefit from such work. The problem with GCC right now is that the optimization phases break each other or are simply not applied because they happen in the wrong order. Often it would be necessary to run an optimization phase multiple times throughout the optimization process, and with reinforcement learning you could figure out the best pipeline setup for your application, and find a better default setting as well.
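For what it's worth, CompilerGym already frames exactly this phase-ordering problem as a gym-style environment for LLVM, where each action appends a pass to the pipeline and the reward tracks code size against -Oz. Below is a minimal random-search sketch that a real reinforcement-learning agent would replace; the identifiers "llvm-v0", "cbench-v1/qsort", "Autophase" and "IrInstructionCountOz" follow CompilerGym's published examples, and the four-tuple step() return is an assumption about the installed version.

# Random search over LLVM pass orderings using CompilerGym's gym-style API.
import compiler_gym

env = compiler_gym.make(
    "llvm-v0",                            # LLVM phase-ordering environment
    benchmark="cbench-v1/qsort",          # program whose IR we optimize
    observation_space="Autophase",        # numeric features of the current IR
    reward_space="IrInstructionCountOz",  # reward relative to -Oz code size
)

best_reward, best_sequence = float("-inf"), []
for episode in range(10):        # a learned policy would replace this outer loop
    env.reset()
    sequence, total = [], 0.0
    for _ in range(50):          # each action appends one optimization pass
        action = env.action_space.sample()
        _, reward, done, _ = env.step(action)
        sequence.append(action)
        total += reward
        if done:
            break
    if total > best_reward:
        best_reward, best_sequence = total, sequence

print(f"best cumulative reward: {best_reward:.3f}")
print("best pass sequence (action indices):", best_sequence)
env.close()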
- Likes 3
Leave a comment:
-
I hope these efforts get mainlined directly into compilers in some way, and that we see much better optimization come out of them.
- Likes 1
Leave a comment: