
ECS NVIDIA GeForce GT 240 512MB

Published on 22 January 2010
Written by Michael Larabel
Page 3 of 9

The ECS GeForce GT 240 was installed in a system with an Intel Core i7 920 processor overclocked to 3.60GHz, an ASRock X58 SuperComputer motherboard, 3GB of DDR3 system memory, and a 320GB Seagate ST3320620AS SATA HDD. On the software side was Ubuntu 9.10 (x86_64) with the Linux 2.6.31 kernel, X Server 1.6.4, and the NVIDIA 195.30 beta display driver.

The GeForce GT 240 booted up just fine with the binary NVIDIA graphics driver and properly mode-set to 2560 x 1600. The PCI device ID for this graphics card was 0x0ca3, while the PCI vendor ID is of course 0x10de. Compiz was working and OpenGL applications ran just fine. PowerMizer also had no problems dynamically changing the performance levels based upon load, but trying to overclock the GPU with CoolBits brought the first sign of a problem. When attempting to change the 3D clock frequencies manually or via the auto-detect feature, the clocks would not budge from 550MHz for the GPU core and 1700MHz for the GDDR5 memory. Attempting to apply any other core/memory clock values would not take: we could not overclock (or underclock) this ECS GeForce GT 240 under Linux, and all attempts to do so failed. This is either a bug within NVIDIA's binary Linux driver, an issue with the video BIOS on this ECS graphics card, or some combination thereof.
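For readers wanting to confirm which GPU the driver has bound to, the vendor:device pair can be pulled out of `lspci -nn` output. Here is a minimal Python sketch; the sample line below is an illustrative `lspci` string for a GeForce GT 240, not output captured from this test system:

```python
import re

# Matches the trailing [vendor:device] ID pair that `lspci -nn` appends,
# e.g. "[10de:0ca3]" for an NVIDIA (0x10de) GeForce GT 240 (0x0ca3).
PCI_ID_RE = re.compile(r"\[([0-9a-f]{4}):([0-9a-f]{4})\]\s*$")

def pci_ids(lspci_line):
    """Return (vendor_id, device_id) from one line of `lspci -nn` output."""
    match = PCI_ID_RE.search(lspci_line)
    if match is None:
        raise ValueError("no [vendor:device] pair found: %r" % lspci_line)
    return match.group(1), match.group(2)

# Illustrative line (assumed lspci -nn formatting, not from the review system):
sample = ("01:00.0 VGA compatible controller [0300]: "
          "NVIDIA Corporation GT215 [GeForce GT 240] [10de:0ca3]")
print(pci_ids(sample))  # -> ('10de', '0ca3')
```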

Due to this overclocking issue and another set of problems to be mentioned in this article, the ECS GeForce GT 240 was then installed in an entirely different AMD-based system running the NVIDIA 190.53 stable Linux driver rather than the 195 series beta. With this driver, CoolBits claimed to work, and the optimal 3D clocks it found were 720MHz for the graphics processor and 2040MHz for the memory. These optimal clocks are rather high and equate to the GPU running 30% faster and the memory 20% above its rated speed. These values seemed to apply fine, but in the testing that followed there was no difference in performance compared to the stock speeds. When polling the NV-CONTROL extension to see what the clocks were reading, they were back at their stock frequencies.
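For reference, the CoolBits controls of that driver era are enabled with Option "Coolbits" "1" in the xorg.conf "Device" section, after which the clocks can be driven and polled through the NV-CONTROL attributes via nvidia-settings. A sketch of the procedure, using the clock values from this review (attribute names as exposed by drivers of that era; check them against your driver's nvidia-settings documentation):

```
# Enable overclocking via NV-CONTROL, then request the "optimal" 3D clocks
nvidia-settings -a [gpu:0]/GPUOverclockingState=1
nvidia-settings -a [gpu:0]/GPU3DClockFreqs=720,2040

# Poll the current clocks afterwards; on this card they read back at stock
nvidia-settings -q [gpu:0]/GPUCurrentClockFreqs
```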
