Some Good & Bad News For The Nouveau Driver

  • mmmbop
    replied
    grammar

    apologies for being condescending, etc. but it has been bugging me:

    Michael's use of the past perfect tense, when the past simple tense is more appropriate, bugs me every time.

    E.g.:
    In the end, only the GeForce 8 series hardware had played well for us with the latest Linux 2.6.37 kernel ...

    The word 'had' in that sentence implies that what was once true during testing is no longer the case. Instead, consider this:

    In the end, only the GeForce 8 series hardware played well for us with the latest Linux 2.6.37 kernel ...


    It's easy to come up with examples just by searching for the word 'had'. E.g. from later on in that page:
    All four of these cards had run fine using the open-source code ...
    => All four of these cards ran fine using the open-source code ...


    I'm not trying to say that I have perfect grammar, that I expect all of you to be native English speakers, or that I don't appreciate the hard work and value of this site. I just see room for improvement. (Like how the new design is so much better than the old one; thanks for that!)

    & Michael isn't the only one. From the comments:
    Have you ever wrote a graphical driver? Because you appear to have no clue about it.
    => Have you ever written a graphical driver? ...

    Sorry to nitpick.



  • squirrl
    replied

    display driver howto


    It'd be constructive if the developers would take a hiatus and put together an instructional HOWTO.

    It took me six months to learn how to export models from Blender3D, then load and display them in OpenGL.

    I had several topics to grasp: Python, OpenGL, GLUT, and finally GCC idiosyncrasies.

    With 3D driver development, it's probably safe to assume at least 10 topics.



  • MuPuF
    replied
    tessio: The current focus of the pscnv devs is to get a working driver for Fermi cards. You'll need to wait before seeing any results.

    If you look carefully at the Mesa commit log, you may find something of interest.



  • tessio
    replied
    And how about pscnv performance?



  • MuPuF
    replied
    ChrisXY: I felt the same way until I was given a laptop with an nVidia card. I tried nouveau and liked it to the point of contributing.

    Actually, we are having some fun trying to understand the craziness of nVidia's hardware design. My heart is deeply on the AMD side, but there is a real challenge with nVidia too.

    I guess that if some new developers came and actively contributed to radeon, more documentation could be released and more work could be done under NDA before the release of new GPUs.

    This isn't something we can do with nVidia. We are trying to catch up and have good support by the time the blob stops supporting your card.

    As for reasons why nVidia would release documentation: I don't think Wayland is incompatible with the current blob architecture. Embedded devices? Well, having the blob or an open-source driver isn't going to change anything user-wise, so...



  • ChrisXY
    replied
    Originally posted by r1348
    How are the nouveau devs supposed to make a performant driver without any proper documentation?
    They shouldn't. Why do the hard work for a company that clearly doesn't want you to do it?
    They should work on the radeon drivers instead. Clearly they are highly skilled, so they could probably speed up radeon development considerably once they got used to it.
    Then, when a really good open-source radeon driver exists and Wayland is about to replace X in some distributions used for embedded devices etc., nVidia might consider releasing documentation, and THEN they can do their work there.



  • MuPuF
    replied
    Originally posted by monraaf
    It always amuses me that when you are clueless about what's going on, you don't investigate but just come up with some wild theory. I would have expected a little more clue from somebody who has been reporting on Linux graphics for so many years.

    My take on it is that VDrift tries to use GLSL shaders, and when they fail to compile (as is the case with Mesa), it falls back to a different rendering path, which is faster at the cost of lower graphics quality.
    Compared to other journalists, I think Michael is very good.
    I understand he lacks time for that, but a little OpenGL programming wouldn't hurt him.



  • monraaf
    replied
    Originally posted by phoronix
    The VDrift racing game continues to be peculiar. With our ATI comparison the lower-end Radeon GPUs on the Gallium3D driver had outperformed the Catalyst driver, which was similarly the case on the NVIDIA side too. All four GeForce 8 graphics cards had ran VDrift at 1920 x 1080 much faster with the Gallium3D driver than did the NVIDIA driver. The NVIDIA driver results led us to believe there is an issue within NVIDIA's driver or Mesa-specific optimizations within VDrift itself.
    It always amuses me that when you are clueless about what's going on, you don't investigate but just come up with some wild theory. I would have expected a little more clue from somebody who has been reporting on Linux graphics for so many years.

    My take on it is that VDrift tries to use GLSL shaders, and when they fail to compile (as is the case with Mesa), it falls back to a different rendering path, which is faster at the cost of lower graphics quality.



  • MuPuF
    replied
    Thunderbird: Indeed! So far, we've seen that the memory clock is often a big bottleneck.

    Michael: If you want to upclock the card to the maximum frequency, please contact me. I'll tell you what to do.
    In the meantime, could you provide us with the clock speeds of each card compared to its maximum clocks? In-game performance is almost linearly bound to the memory speed.



  • misiu_mp
    replied
    Those results exceed the expectations of all of us who know anything about the graphics driver situation in Linux. They also raise even greater expectations for the future.

    Now, given the hacker nature of the Linux community, in my opinion the free driver developers should put more effort into making efficient GPGPU computing available in Linux. Think of all the opportunities widespread OpenCL support could bring to the community.
    I would be happier with 50% OpenGL and 85% OpenCL performance than vice versa.

