Originally posted by discordian
Oracle Ships GraalVM 1.0 To "Run Programs Faster Anywhere"
Eclipse OMR is Apache 2.0, but GraalVM does much more. AOT compilation for the JVM is finally here: I first heard Android was going to get it roughly five years ago, and that Oracle was working on it maybe two years ago.
You can optimize Java by sticking to static methods and primitive types, and Scala with @specialized; in both, be careful about allocation via new and mutable var. Even naive code can get within 2x-3x of C++ performance, and probably in less than half the time it took the C++ developer. Figures from this ~6-year-old paper: https://research.google.com/pubs/pub37122.html
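As a minimal sketch of the @specialized trick mentioned above (class and method names here are illustrative, not from the thread): the annotation asks scalac to emit primitive variants of a generic class, so Int or Double payloads skip java.lang.Integer/Double boxing on the JVM.

```scala
// A tiny generic container specialized for Int and Double.
// Without @specialized, every value of type T would be boxed.
class Box[@specialized(Int, Double) T](val value: T) {
  def map[@specialized(Int, Double) U](f: T => U): Box[U] = new Box(f(value))
}

object SpecializedDemo {
  def main(args: Array[String]): Unit = {
    val b = new Box(21)         // picks the Int-specialized variant
    val doubled = b.map(_ * 2)  // primitive arithmetic, no boxing
    println(doubled.value)      // prints 42
  }
}
```

The same caution from the post applies: each call to new still allocates a Box on the heap, so in hot loops you would avoid the wrapper entirely rather than rely on specialization alone.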
The cost of using the heap and getting this performance is roughly 3x the RAM (there is an Uppsala paper on GC tuning), which GraalVM also reduces. That resource overhead is larger than the jump from assembly to compiled languages, so maybe we're seeing diminishing returns from higher-level languages versus raw performance. But yeah, AOT is a big step for Java and the JVM. Embedded Java on something like a Raspberry Pi is looking much better, and the ability to run LLVM bitcode should be interesting.

Last edited by audir8; 19 April 2018, 11:37 PM.
Originally posted by jacob
Good on them for using the GPLv2, but otherwise I say wait and see. I know nothing about the project, but judging by the announcement, this actually sounds like a pretty terrible piece of bloatware/hypeware. Software that promises to be all things to everyone has never turned out to be any good so far.
Take TruffleRuby, an implementation of Ruby leveraging Graal. If you dig through this post, you'll see phenomenal improvements, even compared to JRuby, which also runs on the JVM: https://medium.com/square-corner-blo...t-91a5c864dd10 "After warming up, TruffleRuby pulls well ahead of the pack with its highly optimized GraalVM JIT. (See also TruffleRuby with SubstrateVM, which significantly improves startup time with only slightly less speed once warmed up.)" You'll see the warmup is really quick, and for server-side workloads that warmup is pretty negligible.
Also worth a look: https://pragtob.wordpress.com/2017/0...-a-year-later/ which writes a Go-playing AI in Ruby and benchmarks it across multiple Ruby implementations.