JNI only adds a few clock cycles to every call. On my machine it was like 5-10 nanoseconds.
Originally Posted by snadrus
So if your function call takes 1 millisecond without JNI, it will take 1.000005 milliseconds with JNI.
IMO this is negligible overhead. Unless of course you are calling very short-running JNI functions (a few nanoseconds each) in a tight loop.
But if that is being done, you are most certainly doing it VERY WRONG.
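To put those numbers in perspective, here's a back-of-the-envelope sketch. The ~5 ns overhead and the 1 ms / few-ns call durations are just the figures quoted in this thread, not measurements:

```java
public class JniOverheadMath {
    public static void main(String[] args) {
        // Figures quoted in the thread (assumptions, not benchmarks):
        double overheadNs  = 5;         // ~5 ns JNI call overhead
        double longCallNs  = 1_000_000; // a native call doing 1 ms of real work
        double shortCallNs = 5;         // a trivial native call (~a few ns)

        // For the long call, the overhead is a rounding error...
        System.out.printf("long call:  %.4f%% overhead%n",
                100 * overheadNs / longCallNs);   // 0.0005%

        // ...but for the short call, it doubles the total cost.
        System.out.printf("short call: %.0f%% overhead%n",
                100 * overheadNs / shortCallNs);  // 100%
    }
}
```

Which is why the usual advice is to batch work into fewer, coarser native calls rather than crossing the JNI boundary per element in a tight loop.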
What's maybe also worth mentioning is that Wayland supports multi-seat for local server-client setups, and for long-distance networking there's now a common API, HTML5, that can handle remote applications. That works on almost any OS and platform, even natively in Internet Explorer on Windows, without requiring X.org.
The only thing you can't do with Wayland is play Doom 3 out of the box, but seriously, why the hell would you want that?
I have no basis to judge whether Wayland is a good standard, but I'm hoping for convergence between the various Linux display servers. It would be cool to be able to use the same drivers for Linux, Android and Chrome OS, for example. If Android adopts Wayland, drivers won't be a problem; GPU vendors will be falling over each other to offer Wayland compatibility. Would be nice if some of that fruit could fall onto Linux. Om nom nom nom.