Every time someone asks "Can Wayland run on <whatever hardware, software and/or country>?", it invites the counter-question "What do you mean by Wayland?".
Wayland is a concept and a protocol, and in that sense it does not run at all. How does, say, HTTP run? On the other hand, we have the Wayland core library, which contains the protocol specifications and the necessary implementation of the protocol in C. Still, it is only the protocol. I can easily claim that Wayland can run on anything that can a) render and show images, and b) pass images between processes. Whether that makes sense in some real-world case, or whether an implementation exists, are completely different matters.
Weston, the reference implementation of a Wayland server, does have specific requirements depending on the backend used, and whether you want it to integrate properly with client applications using EGL.
As for performance benchmarking: if you have a properly written application, its framerate will always be capped to the display refresh rate. This is built into the protocol. If you try to avoid the cap, you may end up waiting in eglSwapBuffers(), or EGL may do triple buffering, in which case the Wayland server is again involved only at the display refresh rate, at most. You simply cannot update the display more often than its refresh rate. Therefore a benchmark based on application framerate mostly measures the performance of the application, if anything. To measure server performance, you need something other than an application framerate.
You would also need to define what "performance" means. Only after that can you think about how to measure it.