NVIDIA's Working On A New Driver Architecture?

  • DanL
    replied
    Originally posted by elanthis View Post
    Everyone would be happier still though if NVIDIA could just fit into the existing infrastructure.
    It's called nouveau.
    I, for one, would be happier if "the existing infrastructure" could catch up to NVIDIA. I would autoejaculate if ATI got on board the VDPAU train instead of saying, "NIH (Not Invented Here), so here's our crappy XvBA implementation."
    And for Intel users, there should be massive work on a Gallium driver, though I'm not seeing a whole lot of commits there...
    Santa, where are you?!

  • deanjo
    replied
    Originally posted by Shining Arcanine View Post
    It is the 260 drivers.
    Not even close.

  • mirv
    replied
    Originally posted by Havner View Post
    Sorry for the double post, but apparently I can't edit a post after 1 minute.

    Just wanted to add that the article title suggests some revolutionary changes might be under way, while after reading the quote it just looks like they are refactoring - and not necessarily the whole driver, it might be just the part that was mentioned.
    Yes, I'll have to agree with you there, but hopefully there'll be more news later to clear things up. Otherwise, it's just an NVIDIA dev saying "working to make the driver better", which really shouldn't be news to anyone.

  • mirv
    replied
    Originally posted by Fenrin View Post
    In some games, e.g. Regnum Online, there was a huge performance regression between the NVIDIA 19x and the 25x/26x drivers. Regnum Online uses shaders, but the regression exists even with shaders disabled (here is a very short thread in the NVIDIA forum about this issue; it was closed very early).

    I wonder why, and also whether other, more advanced Linux games are affected by this performance regression.
    If NVIDIA is internally rewriting the driver around a new architecture, the focus is on getting said new architecture working first and on performance afterwards, so I wouldn't really call it a performance regression - it's not a bug. The benefit, of course, is that the driver should be easier to maintain in the future and will likely be able to handle and provide new features more easily.

  • Fenrin
    replied
    In some games, e.g. Regnum Online, there was a huge performance regression between the NVIDIA 19x and the 25x/26x drivers. Regnum Online uses shaders, but the regression exists even with shaders disabled (here is a very short thread in the NVIDIA forum about this issue; it was closed very early).

    I wonder why, and also whether other, more advanced Linux games are affected by this performance regression.

  • Havner
    replied
    Sorry for the double post, but apparently I can't edit a post after 1 minute.

    Just wanted to add that the article title suggests some revolutionary changes might be under way, while after reading the quote it just looks like they are refactoring - and not necessarily the whole driver, it might be just the part that was mentioned.

  • Havner
    replied
    Originally posted by elanthis View Post
    Sounds like a lot more than just slowly refactoring their existing driver.
    Precisely.

    A whole big article just because an NVIDIA engineer put the word "architecture" in a sentence. Not even "new architecture". Two pages of pure speculation. And then you wonder why officials don't want to comment on various things, when they have to watch every single word they say.

  • myxal
    replied
    @hdas: Intriguing. Sounds like I should have given the FX5200 another whirr before ditching it. Ah well...

  • hdas
    replied
    @myxal and kayosiii: Yes, the nvidia-settings GUI application shipped with the driver does on-the-fly stuff like multi-monitor setups.

    As a side note, if you need a command-line utility that is more convenient and functional (for people like me), check out this awesome tool called "disper": http://willem.engen.nl/projects/disper/ .
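
    For anyone curious, this is roughly how it gets used from the command line (flag names from memory, so double-check them against disper --help on your install):

        disper -l    # list the displays the driver currently detects
        disper -s    # switch back to the primary display only
        disper -c    # clone the desktop onto every connected display
        disper -e    # extend the desktop across all connected displays

    The changes apply to the running session, so no X restart is needed.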

  • kayosiii
    replied
    Originally posted by myxal View Post
    Does the app make adjustments on-the-fly, though (which is what I mean by run-time)? Last time I saw it (6 months ago) it was basically a beefed-up xorg.conf editor frontend, and any adjustments beyond resolution changes required restarting X.
    It seems to work that way for me. I use it when I hook my laptop up to an external screen, and it can do this without requiring an X restart. It can also be used to modify your X config settings for startup, but other than that everything works live.
