
Thread: NVIDIA 169.04 Driver Improvements

  1. #1
    Join Date
    Jan 2007
    Posts
    14,378

    Default NVIDIA 169.04 Driver Improvements

    Phoronix: NVIDIA 169.04 Driver Improvements

    Last week NVIDIA introduced the 169.04 Beta Linux driver for their GeForce and Quadro graphics cards. This X.Org driver contained a number of GeForce 8 fixes, initial support for the GeForce 8800GT graphics card, monitoring of PowerMizer state information, and other changes. What we had not yet tested at that time, however, was a performance comparison between the new driver and the previous 100.14.23 driver. Our undocumented finding is that the 169.04 Beta driver does deliver performance improvements for the GeForce 8 series on Linux.

    http://www.phoronix.com/vr.php?view=11476

  2. #2
    Join Date
    Aug 2007
    Posts
    21

    Default

    The coolest feature of this release is the XRender improvement. The speed-up is clearly visible in the render_bench application; in some tests it is up to a 128x increase on a GeForce 6150. Since 169.04, working on a 2D KDE desktop feels much more comfortable.

    169.04 is the best driver since the release that introduced XvMC.

    The sad thing about Phoronix and every other IT news site is that they run 3D tests only. I would like to see, some day, a feature review of GPUs in terms of video and 2D performance/features
    (XRender; XAA; EXA; Xv maximum resolution and image formats; XvMC-accelerated movie formats; DPMS).

    It would be interesting to see whether recent NVIDIA hardware with the 169.04 driver is faster in 2D than a Matrox Parhelia with the recent mtx driver, a Radeon with the recent fglrx driver, an S3 Chrome 20 with the recently released Linux driver (an initial release, but with XvMC!), or Intel.

    There are test tools for 2D:
    render_bench (XRender performance)
    x11perf
    xvinfo, to check the Xv capabilities
    xvmcinfo, to check the XvMC capabilities
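    To show the kind of answer these tools give, here is a minimal shell sketch that pulls the Xv scaler limit out of xvinfo-style output. The sample line below is an assumption about xvinfo's report format; on a live X session you would pipe `xvinfo` itself instead of the sample string.

    ```shell
    # Hedged sketch: extract the "maximum XvImage size" line as xvinfo
    # reports it. The sample text stands in for real xvinfo output.
    sample='  maximum XvImage size: 2046 x 2046'
    max_size=$(printf '%s\n' "$sample" | awk -F': ' '/maximum XvImage size/ {print $2}')
    echo "$max_size"   # -> 2046 x 2046
    ```

    On a real system, replace the `printf` with `xvinfo` (x11perf and render_bench need a running X server, so they are not shown here).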

  3. #3
    Join Date
    Sep 2007
    Posts
    158

    Default

    I think we can thank ATI for these recent improvements

  4. #4

    Default

    Quote Originally Posted by remm View Post
    I think we can thank ATI for these recent improvements
    I am not so sure about that...

  5. #5
    Join Date
    Oct 2007
    Posts
    370

    Default

    Quote Originally Posted by zbiggy View Post
    There are test tools for 2D:
    render_bench (XRender performance)
    x11perf
    xvinfo, to check the Xv capabilities
    xvmcinfo, to check the XvMC capabilities
    xvinfo says the maximum size is 2046 x 2046.

    I have a couple of questions about this: is that the maximum size of video you can play back via Xv, or is it the maximum OUTPUT size of Xv? What I mean is, if I have a monitor capable of 2560x1600 and I try to play a 1920x1080 video, will Xv be unable to scale the 1080p video up to 2560x1440? Or is it just that the input video resolution can be at most 2046x2046?

  6. #6
    Join Date
    Jan 2007
    Posts
    138

    Default

    It means the largest display size it can handle is 2046x2046.
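    Assuming that reading is right (2046x2046 is the output limit, not the input limit), the arithmetic behind the scaling question above works out like this:

    ```shell
    # Hedged sketch: aspect-preserving upscale of a 1080p source to a
    # 2560-wide monitor, then clamped to the Xv limit from xvinfo.
    src_w=1920 src_h=1080            # 1080p source
    mon_w=2560                       # monitor width
    # Height that keeps 16:9 at the new width: 2560 * 1080 / 1920 = 1440
    want_h=$(( mon_w * src_h / src_w ))
    echo "wanted output: ${mon_w}x${want_h}"   # -> wanted output: 2560x1440
    xv_max=2046                      # limit reported by xvinfo
    out_w=$(( mon_w > xv_max ? xv_max : mon_w ))
    echo "Xv would cap the width at ${out_w}"
    ```

    So a 1080p video fits the input side comfortably, but a 2560-wide scaled output would be capped at 2046 pixels if the limit applies to output.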

  7. #7
    Join Date
    Oct 2007
    Posts
    370

    Default

    Do you know if it can handle more on a card supporting dual-link DVI? Otherwise that limit kind of sucks.

  8. #8
    Join Date
    Dec 2007
    Posts
    677

    Default

    Does the video card adaptively decide when it needs to go to a higher performance level? I couldn't find an option to set it.
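    For anyone else hunting for that switch: the 169.04 release notes only mention *monitoring* of PowerMizer state, and on drivers of that era the state shows up in nvidia-settings. Whether it can be forced rather than just observed is unclear to me. A hedged way to look for it (the exact attribute name in 169.04 is an assumption, hence the broad grep rather than a specific query):

    ```shell
    # Hedged sketch: dump every attribute nvidia-settings knows about and
    # look for anything PowerMizer-related. Requires a running X session
    # with the NVIDIA driver loaded.
    nvidia-settings -q all 2>/dev/null | grep -i powermizer
    ```

    If nothing comes back, the state may only be visible in the nvidia-settings GUI on this driver version.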
