Why More Companies Don't Contribute To X.Org
-
I believe that X is misunderstood too, and I do not believe Wayland is what will make Linux better for the desktop. It is too bad that most people, even on these forums, do not understand GNU/Linux at all. There are many Ubuntu kids and know-it-alls spreading FUD on the internet...
What we really need *for the desktop* is better open-source drivers plus better compositing managers. We should be moving more and more rendering work onto the GPU. GPUs are everywhere these days; it is not possible to buy a current system, even a netbook, without at least OpenGL 1.4 support.
Within two years, all Intel and AMD desktop/laptop/netbook CPUs will include an on-die GPU (this is already happening; it is just that Bulldozer will need a second revision to become a Fusion product, hence the two-year period). These will be at least OpenGL 2.1 capable, with enough speed and RAM to play modern games decently.
So what will give us speed, power savings, and eye candy is having all rendering happen on the GPU. We will arrive at that point, and whether we have X or Wayland will be largely irrelevant.
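For anyone who wants to check what OpenGL level their stack actually reports, a minimal sketch of parsing the version string (the sample line below is made up for illustration; on a real box you would pipe the output of `glxinfo` in instead):

```shell
# Extract the OpenGL version number from a glxinfo-style line.
# Sample input used here so the parsing stands on its own.
sample='OpenGL version string: 2.1 Mesa 7.9-devel'
version=$(printf '%s\n' "$sample" |
  sed -n 's/^OpenGL version string: \([0-9.]*\).*/\1/p')
echo "$version"
# → 2.1
```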
-
I'm surprised X hasn't taken off in the thin-client market. Given how cheap it is to build powerful multicore machines for use as central servers, I would think it would be very cost-effective for schools and businesses to deploy something like NX on top of X.Org instead of buying hundreds of expensive, way overpowered workstations. Nothing else could, in principle, perform as well as X in that area.
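The reason X fits that niche is that the protocol is network-transparent: a client draws on whatever server `$DISPLAY` names, in `host:display.screen` form. A small sketch of splitting such a value (the hostname "bigserver" is made up; NX and `ssh -X` set a forwarded value like this for you):

```shell
# Split a DISPLAY value into the X server's host and the display number.
d='bigserver:10.0'        # hypothetical forwarded display
host=${d%%:*}             # part before the colon: the server's host
display=${d#*:}           # display (and screen) number after the colon
echo "host=$host display=$display"
# → host=bigserver display=10.0
```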
-
Originally posted by Smorg View PostI'm surprised X hasn't taken off in the thin client market. With the cheapness of building powerful multicore machines for use as mainframes I would think it would be really cost effective for schools and businesses to leverage something like NX based upon XOrg instead of buying hundreds of expensive way overpowered workstations. Nothing else could in principle perform as well as X in that area.
Comment
-
Originally posted by smitty3268: While I agree this was in poor taste, I don't see how you could call this "tormenting." It's not like they strapped them down and started waterboarding or anything. Taunting, maybe.
-
Spreading the FUD...
Originally posted by TemplarGR: I believe that X is misunderstood too, and I do not believe Wayland is what will make Linux better for the desktop. It is too bad that most people, even on these forums, do not understand GNU/Linux at all. There are many Ubuntu kids and know-it-alls spreading FUD on the internet...
What we really need *for the desktop* is better open-source drivers plus better compositing managers. We should be moving more and more rendering work onto the GPU. GPUs are everywhere these days; it is not possible to buy a current system, even a netbook, without at least OpenGL 1.4 support.
Within two years, all Intel and AMD desktop/laptop/netbook CPUs will include an on-die GPU (this is already happening; it is just that Bulldozer will need a second revision to become a Fusion product, hence the two-year period). These will be at least OpenGL 2.1 capable, with enough speed and RAM to play modern games decently.
So what will give us speed, power savings, and eye candy is having all rendering happen on the GPU. We will arrive at that point, and whether we have X or Wayland will be largely irrelevant.
Yes, I agree we need better open-source drivers, but we also need better closed-source ones (mainly in the Catalyst camp). But the first thing you need to know is how to program these drivers without infringing patents (in the ATI camp) or having to reverse-engineer the hardware (Nouveau).
That is problem number one.
About Wayland: is it REALLY an Ubuntu project?! I think the "hacker culture" damages Linux's reputation...
The other problem we're talking about is how hard X is to program. Wayland is already hosted by Freedesktop.org, and maintaining an old architecture turns into a hard, resource-hungry project (even more so when we're talking about open-source projects). Starting from scratch (Wayland) is sometimes a better alternative than making significant changes to an existing project (X and X.Org).
Good open-source drivers shouldn't stop at OpenGL 2.1 support (that's like getting DX8-level gaming support, only ten years later); they should have the whole OpenGL implementation WORKING (up to OpenGL 4, at least). And that will still take a lot of time in the open-source camp...
But I agree with you on some points... I also believe most notebooks in the near future will have on-die GPUs, but those will be low-performance parts...
Cheers
-
Originally posted by evolution: Yes, I agree we need better open-source drivers, but we also need better closed-source ones (mainly in the Catalyst camp). But the first thing you need to know is how to program these drivers without infringing patents (in the ATI camp) or having to reverse-engineer the hardware (Nouveau).
That is problem number one.
Even Microsoft does not allow shit like that to happen. With Vista and Windows 7 they forced Nvidia and friends to move that out of the kernel. This is one of the reasons why Nvidia drivers sucked early on under Vista.
About Wayland: is it REALLY an Ubuntu project?! I think the "hacker culture" damages Linux's reputation...
The other problem we're talking about is how hard X is to program. Wayland is already hosted by Freedesktop.org, and maintaining an old architecture turns into a hard, resource-hungry project (even more so when we're talking about open-source projects). Starting from scratch (Wayland) is sometimes a better alternative than making significant changes to an existing project (X and X.Org).
It's just that the X Server driver model sucks.
We can keep using X11 applications, but we need to get rid of the X Server controlling the hardware. It's like having your web browser control the hardware. Except your web browser always runs as root.
Think about that.
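You can see the point for yourself in a process listing: on the classic setup, the display server that drives the hardware is owned by root. A sketch using a made-up sample `ps`-style line (on a live system you would feed it the real `ps aux` output):

```shell
# Pull the owning user out of a sample process-listing line for Xorg.
ps_line='root      1234  2.0  1.1 /usr/bin/Xorg :0 -nolisten tcp'
owner=$(printf '%s\n' "$ps_line" | awk '/Xorg/ {print $1}')
echo "$owner"
# → root
```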
Good open-source drivers shouldn't stop at OpenGL 2.1 support (that's like getting DX8-level gaming support, only ten years later); they should have the whole OpenGL implementation WORKING (up to OpenGL 4, at least). And that will still take a lot of time in the open-source camp...
But I agree with you on some points... I also believe most notebooks in the near future will have on-die GPUs, but those will be low-performance parts...
There are HUGE performance and programming advantages to having the GPU on the same piece of silicon. The GPU runs faster, there is massively less latency between the CPU and the GPU, it's more energy efficient, and it's cheaper. All sorts of advantages. The GPGPU concept is a slam dunk in terms of "It's going to happen".
Almost nobody runs a discrete graphics card any more unless they have a specific need for higher-performance graphics, or need something like 3 monitors, or whatever. And that is only because GPGPU is in its infancy. Hell, it hasn't even begun.
The deal is that for most workloads you're not going to see any advantage to having more than 4 or 8 general-purpose cores.
To improve performance beyond that, you're going to start seeing processor cores that are specifically designed for different workloads.
That's effectively what a GPU is nowadays. There is no such thing as "hardware acceleration" anymore; "acceleration" just means that your OpenGL software stack is processed on both the GPU and the CPU.
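Where the GL stack actually runs shows up in the renderer string. A sketch of classifying it (the sample string is made up; on a real machine it comes from `glxinfo`):

```shell
# Classify a glxinfo-style renderer line: a software rasterizer means the
# whole OpenGL stack is being executed on the CPU.
renderer='OpenGL renderer string: Software Rasterizer'
case $renderer in
  *"Software Rasterizer"*) echo "GL stack runs on the CPU" ;;
  *)                       echo "GL stack runs (at least partly) on the GPU" ;;
esac
# → GL stack runs on the CPU
```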
Think about this:
Do you want to load proprietary drivers just to be able to use your processor?
-
Think about this:
Do you want to load proprietary drivers just to be able to use your processor?
No. Almost everything will have an on-die GPU. Already the vast majority of systems use IGPs. Intel outsells AMD/ATI and Nvidia, probably combined. And even then, I am sure that until Intel kicked Nvidia in the balls in the motherboard market, IGPs were the largest consumers of Nvidia GPUs.
It's just that the X Server driver model sucks.
We can keep using X11 applications, but we need to get rid of the X Server controlling the hardware. It's like having your web browser control the hardware. Except your web browser always runs as root.
Cheers
-
Originally posted by evolution: Yes, I know the market share of Intel IGPs is bigger than ATI and Nvidia combined... Although there will always be a market for discrete GPUs while we still need them for heavy graphics demands or GPGPU (for instance)... Integrated GPUs are the way to go in the mobile market, but in the desktop one, I don't think so... (personal opinion, of course)
On-die GPUs provide many speed advantages. It is believed, for example, that Llano will be able to compete even with lower-mainstream discrete parts. And we are talking about the first generation of products of this kind.
One other thing to consider is that some changes are needed to optimize for on-die GPUs. You need to optimize in order to harness the minimal latency an on-die GPU provides. But as the world moves to on-die GPUs, these optimizations will come.
After a while, the next step will be to scrap discrete GPUs altogether. Yup, that's right. The next step will be to use multiple APUs (CPU+GPU) instead of CPU plus discrete GPU when you need higher performance. Need to play Crysis 4? Add another APU, either in a second socket on the board and/or on another PCI Express card with an APU on board... This will be the future at some point... It makes sense.