Parallelizing GCC's Internals Continues To Be Worked On & Showing Promising Potential
Originally posted by Clive McCarthy:
"The use of very large source files (anything above 64kB) is a clear indication of poor software design"

And by the way, the compiler compiles not just the source file but the source plus all of its includes, which is often measured in megabytes. From the slides: "gimple-match.c: 100358 lines of C++ (GCC 10.0.0)". So they are basically trying to make GCC compile itself faster. (You won't find that file in the repo; it's autogenerated. Is that poor or rich software design?)
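A minimal way to see this for yourself (assuming g++ is installed; tu.cpp is just an example file name):

```cpp
// tu.cpp -- a trivial translation unit that balloons after preprocessing.
// Measure what the compiler actually sees:
//   g++ -E tu.cpp | wc -l    # line count after all #includes are expanded
//   g++ -E tu.cpp | wc -c    # byte count; typically far past the 64kB mark
#include <string>
#include <vector>

int main() {
    // Two standard headers already pull in tens of thousands of lines, so
    // the input the compiler works on is nothing like the on-disk file size.
    std::vector<std::string> v{"hello"};
    return static_cast<int>(v.size());
}
```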
Originally posted by atomsymbol:
"C/C++ #include directives, in particular the current way of processing #include directives by C/C++ compilers, are poor software and system design."
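For context on why that processing model is expensive: #include is textual paste, so include guards only prevent double expansion within one translation unit; every separate .cpp that includes a header still re-tokenizes and re-parses its full text. A minimal sketch (file names hypothetical):

```cpp
// common.h -- the guard stops double inclusion *within* one TU...
#ifndef COMMON_H
#define COMMON_H
#include <string>  // ...but the header is still pasted in as raw text
inline std::string greet() { return "hello"; }
#endif

// a.cpp
#include "common.h"  // common.h (and <string>) parsed here...
std::string from_a() { return greet() + " from a"; }

// b.cpp
#include "common.h"  // ...and parsed again from scratch for this TU
std::string from_b() { return greet() + " from b"; }
```

Precompiled headers and C++20 modules are the usual mitigations, though neither changes how a plain #include is processed.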
Originally posted by jacob:
"1.6x speedup is something that users will certainly notice. This is great news."

And even this 1.09x part isn't passing the testsuite yet.
Originally posted by Clive McCarthy:

Michael, you write:
"One of the most interesting Google Summer of Code projects this year was the student effort to work on better parallelizing GCC's internals to deal with better performance particularly when dealing with very large source files. Fortunately -- given today's desktop CPUs even ramping up their core counts -- this parallel GCC effort is being continued."

The use of very large source files (anything above 64kB) is a clear indication of poor software design and a lack of modularity. It should not be encouraged by speeding up the compiler. The linker needs to be fast, but the front end is already capable of using all the cores one has available, since separate source files can be compiled in parallel. Parallelizing something that can already be run in parallel is a waste of effort.

Clive.
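The pushback elsewhere in the thread rests on a scheduling fact worth making concrete: per-file parallelism (make -j) only keeps every core busy while there are enough similarly sized translation units, and total wall time is bounded below by the single largest file. A small self-contained sketch (the compile times are invented numbers, not measurements):

```cpp
// makespan.cpp -- why one huge translation unit caps build-level parallelism.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical per-file compile times in seconds; the 100 s job stands
    // in for a giant autogenerated file like gimple-match.c.
    std::vector<double> jobs{100.0, 5.0, 7.0, 4.0, 6.0, 8.0};
    const int cores = 8;

    double total   = std::accumulate(jobs.begin(), jobs.end(), 0.0);
    double largest = *std::max_element(jobs.begin(), jobs.end());

    // With per-file parallelism (make -j), wall time cannot drop below the
    // ideal even split or the single largest job, whichever is bigger.
    double bound = std::max(total / cores, largest);

    std::cout << "total work:      " << total         << " s\n"
              << "ideal split:     " << total / cores << " s\n"
              << "largest TU:      " << largest       << " s\n"
              << "wall-time bound: " << bound         << " s\n";
    // Here the bound is the 100 s file: the other cores go idle, which is
    // exactly the gap intra-file parallelization in GCC tries to close.
}
```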