ECS NVIDIA GeForce GT 240 512MB

Written by Michael Larabel
Published on 22 January 2010
Page 3 of 9

The ECS GeForce GT 240 was installed into a system with an Intel Core i7 920 processor overclocked to 3.60GHz, an ASRock X58 SuperComputer motherboard, 3GB of DDR3 system memory, and a 320GB Seagate ST3320620AS SATA HDD. On the software side was Ubuntu 9.10 (x86_64) with the Linux 2.6.31 kernel, X Server 1.6.4, and the NVIDIA 195.30 beta display driver.

The GeForce GT 240 booted up just fine with the binary NVIDIA graphics driver and properly mode-set to 2560 x 1600. The PCI device ID for this graphics card is 0x0ca3 while the PCI vendor ID is of course NVIDIA's 0x10de. Compiz was working and OpenGL applications ran without issue. PowerMizer also had no problems dynamically changing the performance levels based upon load, but the first sign of trouble appeared when we tried using CoolBits to overclock the GPU. Whether attempting to change the 3D clock frequencies manually or using the auto-detect feature, the clocks would not move from 550MHz for the GPU core and 1700MHz for the GDDR5 memory; applying any other core/memory clock values simply would not take. We could not overclock (or underclock) this ECS GeForce GT 240 graphics card under Linux. This is either a bug within NVIDIA's binary Linux driver, an issue with the video BIOS on this ECS graphics card, or some combination thereof.
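For those wishing to try the same thing, here is a minimal sketch of the process we followed; the Coolbits option and the attribute names (GPUOverclockingState, GPU3DClockFreqs) come from the NV-CONTROL interface of drivers in this era, so treat the exact names as illustrative rather than definitive. CoolBits is first enabled in the Device section of /etc/X11/xorg.conf, X is restarted, and the clocks are then set from a terminal:

    # /etc/X11/xorg.conf -- enable CoolBits clock controls
    Section "Device"
        Identifier "Videocard0"
        Driver     "nvidia"
        Option     "Coolbits" "1"
    EndSection

    # After restarting X, attempt to apply the 3D core/memory clocks (in MHz)
    nvidia-settings -a GPUOverclockingState=1 -a GPU3DClockFreqs=720,2040

On this Core i7 system, the clocks refused to budge from 550/1700MHz no matter what values were passed.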

Due to this overclocking issue and another set of problems to be mentioned in this article, the ECS GeForce GT 240 graphics card was then installed in an entirely different, AMD-based system running the NVIDIA 190.53 stable Linux driver rather than the 195 series beta. With this driver, CoolBits claimed to work and the optimal 3D clocks that it found were 720MHz for the graphics processor and 2040MHz for the memory. These optimal clocks are rather high: they equate to the GPU running 30% faster and the memory 20% above its rated speed. The values seemed to apply fine, but in the testing that followed there was no difference in performance compared to the stock speeds. When polling the NV-CONTROL extension to see what the clocks actually read, they were back at their stock frequencies.
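Verifying what the GPU is actually running at is straightforward; a quick sketch, assuming the GPU3DClockFreqs and GPUCurrentClockFreqs attributes exposed by NV-CONTROL in these driver releases:

    # Compare the clocks requested against what the GPU actually runs at
    nvidia-settings -q GPU3DClockFreqs
    nvidia-settings -q GPUCurrentClockFreqs

    # Had the overclock stuck, the second query would report 720,2040
    # under 3D load rather than the stock 550,1700.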

Latest Articles & Reviews
  1. Radeon Linux Benchmarks: Catalyst 15.3 Beta vs. Linux 4.0 + Mesa 10.6-devel
  2. Trying Out The Modern Linux Desktops With 4 Monitors + AMD/NVIDIA Graphics
  3. Turning A Basement Into A Big Linux Server Room
  4. NVIDIA's $1000+ GeForce GTX TITAN X Delivers Maximum Linux Performance
  5. OS X 10.10 vs. Ubuntu 15.04 vs. Fedora 21 Tests: Linux Sweeps The Board
  6. The New Place Where Linux Code Is Constantly Being Benchmarked
Latest Linux News
  1. VirtualBox 5.0 Now In Beta, Adds PV To Windows/Linux Guests
  2. Go Language Improvements Coming For Ubuntu 15.04
  3. The Big SuperTuxKart Update Is Almost Ready
  4. Blender 2.74 Brings Many Improvements
  5. Qt Creator 3.4 Is Near
  6. Allwinner: "We Are Taking Initiative Actions Internally"
  7. It's Been Five Years Since The Phoronix Visit To Chernobyl
  8. Vulkan, The New Linux Server Room & BioShock Won Linux Users In March
  9. Debian 8.0 Jessie Gets A Release Date
  10. Firefox 37 Coming Today With Heartbeat, HTTPS Bing
Most Viewed News This Week
  1. The Big Features Of The Linux 4.0 Kernel
  2. Systemd Developers Did NOT Fork The Linux Kernel
  3. Improved OpenCL Support For Blender's Cycles Renderer
  4. Open-Source Driver Fans Will Love NVIDIA's New OpenGL Demo
  5. Allwinner Continues Jerking Around The Open-Source Community
  6. GNOME 3.16 Released: It's Their Best Release Yet
  7. Ubuntu 15.04 Final Beta Released
  8. Nuclide: Facebook's New Unified IDE