Is Windows 7 Actually Faster Than Ubuntu 10.04?


  • Originally posted by allquixotic View Post
    The summary judgment I pass on all open source 3d graphics drivers to date is too little, too late. Even in their mature, finished form, like with the Intel G965 drivers that have been in development for many years, they are too little; and they're much, much too late (the G965 hardware I used to use has since been replaced with a more modern system).

    By the time the open source radeon stack can do interesting things like OpenGL 2.1 with a HD5970, I'll be ready to upgrade to a HD6000 series. Only the people who upgrade once every 6-8 years are going to see any benefit from this unimpressive rate of development.

    <snip>

    But when you're fighting against Moore's Law as you are with hardware drivers, FOSS simply can't keep up -- until the people creating the hardware start to put a proper priority (i.e. >= 50% of the development manpower) into open source drivers. Til then, Linux and other FOSS operating systems will be behind the curve with the most complicated device drivers such as 3d.
    Not sure I agree with this. You're looking at a level of activity that only started a couple of years ago and declaring the initial results "too late", which is fair, but I would argue that the reason for that lateness is "starting too late", not the inability to keep up with new hardware introduction.

    Using ATI hardware as an example, in 2007 there was decent 2D support for the 3xx/4xx GPUs (ie the 2002-2004 SKUs), experimental 3D for the same parts, plus the initial RE'ed avivo driver for 5xx. The only fully supported parts were r2xx and earlier, which were ~6 years old at the time. This matches what you are saying.

    2-1/2 years later, the graphics stack has been rebuilt around a common memory manager, which is a critical prerequisite for further 3D work, 2D has been largely rewritten to make use of the 3D engine, *and* hardware support has gone from being ~6 years behind to ~6 months behind on average, in the sense that Evergreen has had modesetting support for a while (3 months behind), power management on EG is happening in sync with older parts (6 months behind), but acceleration is just starting to light up now (9 months behind).

    Measuring status at the start of a project is fine, and it would be totally fair to say that open source development was 6 years behind in 2007 when the project started, but given the amount of catch-up in the last couple of years it's really hard to argue that your "years behind" statement applies today or will apply in the future.



    • Originally posted by Shining Arcanine View Post
      A while back, I recall looking into what options are used and the key 4 or 5 options that tend to define a kernel fitting well into a server role or a desktop role were set to the choices that would best suit a server. I believe that those are the defaults from kernel.org.
      Yes, there are only a few settings, but they are not all simple Y/N toggles; the timer frequency, for example, has several possible values. The defaults you get even in a vanilla kernel are not optimized for either a server or a desktop role but are a compromise between the two. You might want to look at part of the discussion here for an indication of why the defaults are chosen:

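To make the point concrete, the handful of options being discussed live in the kernel's .config. A minimal sketch below writes an illustrative fragment (the values are the usual middle-ground choices, not authoritative vanilla defaults; on a real system you would grep /boot/config-$(uname -r) instead):

```shell
# Illustrative .config fragment: the options most often cited when tuning a
# kernel toward a "server" vs "desktop" role.  HZ=250 and voluntary
# preemption are the compromise choices (HZ=100 / no preemption lean
# server; HZ=1000 / full preemption lean desktop).
cat > /tmp/config-fragment <<'EOF'
CONFIG_HZ_250=y
CONFIG_HZ=250
# CONFIG_PREEMPT is not set
CONFIG_PREEMPT_VOLUNTARY=y
CONFIG_NO_HZ=y
EOF

# On a running distro kernel, inspect the real file instead:
#   grep -E 'CONFIG_HZ=|CONFIG_PREEMPT' "/boot/config-$(uname -r)"
grep -E 'CONFIG_HZ=|CONFIG_PREEMPT_VOLUNTARY=' /tmp/config-fragment
```

Distro kernels typically flip exactly these few switches away from the compromise, which is why a self-built kernel tuned for one role can feel different from the stock one.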



      • Originally posted by deanjo View Post
        I don't think it has to be as "modern" as D3D10+ / GL3.x+ since many games are still written at the D3D 9.0c / GL 2.1 level. If an OS is capable of running multiple renderers, chances are that the default renderer used in that game/app will be the one that offers better performance. I'm simply saying that if a Brand A vs Brand B comparison is to be done, it shouldn't be limited to the lowest common denominator. If you do, it's like someone taking a UDMA 66 drive, slapping it on a 40-wire IDE cable and then benching it.
        Yeah, I agree with that: benchmarks should test as many applications as possible to extract valid results, including legacy applications. The issue is that the sample size is rather small and heavily biased against OpenGL: no one is actually shipping OpenGL 3.x/4.x software right now (except Unigine, perhaps), so we might be testing modern D3D10+ pipelines against legacy OpenGL pipelines - you can't really draw solid conclusions that way. That's why I would prefer GL3.x vs D3D10 tests, for example.

        That said, I wouldn't put too much weight on the "default renderer" argument. More often than not, the default is D3D simply because Windows doesn't ship OpenGL ICDs by default (and because Intel's Windows OpenGL stack sucks, but that's another story entirely). Google is building its WebGL implementation on top of D3D for precisely those reasons (how sad is that?)



        • BTW, has anybody bothered to figure out the bang-per-buck comparison for the SPECViewPerf tests yet? You don't get a whole lot of value with the dual Opteron 2384 + FirePro V8800 system, considering you could build the complete i7 + 9800GTX+ system for less than the price of the V8800 alone.



          • Hello.
            Can you tell me how Phoronix gets the hardware to perform the tests? Is the hardware donated by an organization or company? If so, which one?
            Thank you.



            • Originally posted by YAFU View Post
              Hello.
              Can you tell me how Phoronix gets the hardware to perform the tests? Is the hardware donated by an organization or company? If so, which one?
              Thank you.
              Pretty sure it's a mix. Some AIBs send their wares in, but I believe most of the hardware is paid for out of pocket.



              • Originally posted by BlackStar View Post
                You are mixing up API functionality with API design. Please don't; they are not the same thing: OpenGL 3.3/4.0 is modern in functionality but legacy-ridden in design. The original OpenGL 3.0 proposal would have fixed both issues in one go, but the design was scrapped as too ambitious. The resulting spec only fixed half of the problem (functionality), leaving the broken design intact.

                Why is the design broken?....
                My first post! Maybe my last!

                Having just started to learn OpenGL, I found your comment genuinely useful. You state that the design is 'broken'. Design is the 'how'; the how of a specification. I have recently read the OpenGL specs, but find them to be little more than an API functionality spec. It is hard to gain an understanding of the overall process intended. Threading is hardly mentioned in the glspec32.core spec.

                So, do you think that the design is 'broken|old' because the specs are also in need of updating to reflect modern GPUs and CPUs: multi-processors and threaded software techniques?



                • Originally posted by Vanir View Post
                  My first post! Maybe my last!

                  Having just started to learn OpenGL, I found your comment genuinely useful. You state that the design is 'broken'. Design is the 'how'; the how of a specification. I have recently read the OpenGL specs, but find them to be little more than an API functionality spec. It is hard to gain an understanding of the overall process intended. Threading is hardly mentioned in the glspec32.core spec.

                  So, do you think that the design is 'broken|old' because the specs are also in need of updating to reflect modern GPUs and CPUs: multi-processors and threaded software techniques?
                  I think the spec somewhere mentions that it very consciously doesn't specify how it should be implemented. That's entirely up to the hardware manufacturers. The spec can only make sure that it doesn't get in the way of a particular implementation.



                   • I see all this flaming about DX, OpenGL, etc., and nothing about the real conclusion you're all omitting: if one can reasonably play games on Ubuntu, and maybe even better on other Linux distros, why should anyone pay for a Windows license at all?

                     Maybe because some titles haven't made it to Linux yet? Because some people believe that paying for Windows and those extra FPS is worth it? Who knows?

                     My point is that the article proves that Linux is competently able to handle games, as long as you **don't** have an Intel IGP. Period. Want more details? More benchmarks? More options enabled/disabled? Fine. But I believe that more benchmarks will not change the conclusion of the article. At least, that's my opinion.



                    • Originally posted by Caveira View Post
                      My point is that the article proves that Linux is competently able to handle games, as long as you **don't** have an Intel IGP.
                      Not sure how you can draw a line like that when some setups fail to even run the applications, not to mention that the tests were done with the vendors' closed source drivers. Other facets of gaming have to be considered as well, such as audio.

                      Yes, under ideal circumstances and hardware configurations Linux can keep up with Windows in gaming, but those circumstances depend heavily on outside third-party closed source support. We still, however, do not have a good X vs Y comparison, because not all real-life usage capabilities and functions were explored.

