
Thread: HD 3850 AGP lockup with OpenGL load

  1. #21
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,572

    Default

    I doubt you will see any significant performance difference between AGP8x and AGP4x. Even 2x won't make that much difference.

    And GLXGEARS IS NOT A FREAKIN' BENCHMARK

    Seriously, all glxgears does is draw flat-shaded triangles and clear the screen a lot. No textures, no shaders, none of the things that modern GPUs spend their transistor budget on. An old card will usually outrun a new card on that kind of test.
    Last edited by bridgman; 01-25-2009 at 12:45 PM.

  2. #22
    Join Date
    Aug 2007
    Posts
    6,675

    Default

    Well, it is a throughput benchmark. Just raw power, nothing special.

  3. #23
    Join Date
    Jan 2009
    Posts
    18

    Default

    By the way, my GPU speed is always 324 MHz, although the normal speed should be 670 MHz. Can I set the GPU speed to 670 manually?

  4. #24
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,572

    Default

    Quote Originally Posted by Kano View Post
    Well, it is a throughput benchmark. Just raw power, nothing special.
    Agreed, but "throughput" means different things to different people. Glxgears only really measures ROP performance, and for the last few years the trend has been towards relatively fewer ROPs and relatively more shaders and texture engines. Driver optimizations have also tended towards more code which improves performance with complex workloads at the expense of longer code paths for trivial workloads.

    I think we're already at the point where software rendering is faster than low end GPUs when running glxgears. On more complex workloads, of course, it's a different story.
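
    If you want to sanity-check that on your own machine, a rough comparison is below (just a sketch: LIBGL_ALWAYS_SOFTWARE is a Mesa environment variable, so it only forces the software rasterizer on the open-source driver stack, not on fglrx):

    Code:
    # hardware-accelerated run with whatever driver is active
    glxgears

    # Mesa software rasterizer for comparison (open-source stack only)
    LIBGL_ALWAYS_SOFTWARE=1 glxgears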

    Quote Originally Posted by monomakh View Post
    By the way, my GPU speed is always 324 MHz, although the normal speed should be 670 MHz. Can I set the GPU speed to 670 manually?
    The chip should switch to full speed automatically when it sees a significant 3D workload. It's possible that glxgears doesn't use enough of the chip to trip the clock-switching logic, though I'm not sure. I guess that would be the ultimate insult for a benchmark: the driver doesn't even think it's worth speeding up the clocks.
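
    For what it's worth, if you want to check or force the clocks by hand, something along these lines may work (a sketch only; it assumes your Catalyst release exposes the Overdrive options through aticonfig, and the memory value below is just a placeholder for whatever --od-getclocks reports):

    Code:
    # show current engine/memory clocks while a 3D app is running
    aticonfig --adapter=0 --od-getclocks

    # optionally enable Overdrive and request the stock 670 MHz core clock
    # (828 MHz memory is a placeholder -- use your card's real figure)
    aticonfig --od-enable
    aticonfig --od-setclocks=670,828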
    Last edited by bridgman; 01-25-2009 at 05:47 PM.

  5. #25
    Join Date
    Jan 2008
    Posts
    294

    Default

    Quote Originally Posted by monomakh View Post
    I have the same video card and the same problem - it hangs in AGP 8x mode.
    My system is:
    Hardware:
    CPU: Athlon XP 2200+
    Mainboard: Soltek SL-75FRN3-L on nForce2 chipset
    RAM: 768 MB
    Video card: Sapphire HD 3850 AGP
    PSU: only 300 W, but the video card doesn't start with my other 400 W unit.

    OS: OpenSuse 11.1 x86 (2.6.27.7-9)
    Catalyst 8.12 (8.10 and earlier can't be installed on OpenSuse 11.1; the 9.1 beta has the same problem).
    I've got the same card on an Abit nForce2 board and it doesn't hang - there are differences, though.

    You are about 150 W short on PSU power, and I've also never seriously used my card with kernels newer than 2.6.24.7 because I get glitching.

    I also don't run GL apps with composite on, as that was a bit flaky - glxinfo | grep render should report direct rendering: Yes.
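
    For reference, the check looks something like this (output abridged; the renderer string will vary with the driver in use):

    Code:
    glxinfo | grep -i render
    # direct rendering: Yes
    # OpenGL renderer string: ATI Radeon HD 3850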

  6. #26
    Join Date
    Jan 2009
    Posts
    18

    Default

    Composite was off when I ran all my tests. Today I tried enabling composite and got somewhat lower FPS, but the difference is not serious.
    Direct rendering is set to Yes.
    As I said before, my system doesn't start with my 400 W PSU and this video card, so I've gone back to the 300 W PSU. Maybe I will try powering the video card from the 300 W PSU and the motherboard and other devices from the 400 W PSU, but first I want to run some tests and even install Windows. If everything works in Windows, then the problem is not the PSU.

  7. #27
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,572

    Default

    Your 400 W supply may have more power on the +5 rail and even less on the +12 rail; that's not unusual. The board draws most of its power from the +12 rail.

    You need to check the per-rail amperage ratings on each PSU and compare that to what the board requires. There's a bit of wiggle room if you have a really basic PC (one hard disk, no floppy, etc.) but not that much.
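
    To make that concrete with made-up label numbers (the amperages below are purely hypothetical - read the real ones off each PSU's sticker): watts per rail is just volts times amps, so a higher total rating doesn't guarantee more +12V capacity.

    Code:
    # hypothetical sticker values
    #   300W unit: +12V @ 19A  ->  12 * 19 = 228W on the +12V rail
    #   400W unit: +12V @ 14A  ->  12 * 14 = 168W on the +12V rail
    echo "$((12 * 19))W vs $((12 * 14))W available on +12V"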

  8. #28
    Join Date
    Jan 2009
    Posts
    18

    Default

    I've downloaded GL Excess (an OpenGL benchmark program) and run it under Wine.
    I have a notebook with a Quadro NVS 140M (much slower than the HD 3850) with the same OS and GL Excess installed.
    GL Excess will only start in windowed mode on the HD 3850, although there's no problem with fullscreen mode on the Nvidia.
    The scores for the HD 3850 (first number) were much better than for the NVS 140M (second number) in fill rate (21553/2829), polygon count (13522/3893) and VRAM tests (4812/2607); only the CPU-FPU test (1763/3849) was worse, which is normal since my CPU is slower than the one in the notebook.
    Immediately after the test I went to Catalyst Control Center, but the GPU was still running at only 324 MHz.
    I will try to see how my card works on Windows... ugh... I need to find free space on the HDD to install it.

  9. #29
    Join Date
    Jan 2008
    Posts
    294

    Default

    Quote Originally Posted by monomakh View Post
    Immediately after the test I went to Catalyst Control Center, but the GPU was still running at only 324 MHz.
    You won't see the full clock rate after the test has finished; you can make a little script like this -

    Code:
    #!/bin/sh
    # poll the GPU clocks and temperature every few seconds
    while true ; do
            aticonfig --adapter=0 --od-getclocks
            aticonfig --adapter=0 --od-gettemperature
            sleep 3
    done
    Paste that into an editor and save it as, say, poll-card.
    chmod +x poll-card and run it from an xterm/konsole/whatever:
    ./poll-card > card-out
    Run your benchmark, then Ctrl-C the script when you've finished, and grep card-out for whatever lines you are interested in.
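
    An example run might look like this (the "Current Clocks" pattern is a guess at the aticonfig output wording - adjust the grep to whatever actually shows up in card-out):

    Code:
    ./poll-card > card-out &     # start logging in the background
    glxgears                     # or whatever benchmark you're watching; Ctrl-C when done
    kill %1                      # stop the logger
    grep -i "current clocks" card-out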

  10. #30
    Join Date
    Jan 2009
    Posts
    18

    Default

    aticonfig --adapter=0 --od-getclocks
    ERROR - Could not find library: libatiadlxx.so

    ll /usr/lib/libatiadlxx.so
    -rw-r--r-- 1 root root 164529 Jan 25 20:30 /usr/lib/libatiadlxx.so

    By the way, I have installed Windows XP SP2. The GPU always runs at 669 MHz there by default, not 324 MHz. The benchmark results were noticeably better than on Linux (except the VRAM test). The biggest differences were in the polygon count and CPU/FPU tests.
