GeForce GTX 680 On Linux
Now in terms of the NVIDIA GeForce GTX 680 on Linux, assuming you're using the official NVIDIA Linux binary driver you should be in good shape. As with previous generations of hardware, NVIDIA's official Linux driver is at near feature parity with the NVIDIA Windows driver, and its performance is also competitive with the Windows performance.
One of the Linux feature limitations that first appeared in the NVIDIA driver during Fermi enablement was the lack of overclocking support for the new hardware. With Fermi the architecture became more complicated, and proper overclocking support never made it into the NVIDIA Linux driver; it stands as a low-priority item. For older NVIDIA GeForce hardware, core/memory overclocking support still remains in the NVIDIA binary driver when enabling the CoolBits option in the xorg.conf, after which point the clock speeds can easily be manipulated through the NVIDIA Settings panel. With Kepler, there is likewise no overclocking support under Linux. It's possible it could come in a future driver update, but there's no guarantee, so for now you're left to run the graphics card at whatever speeds are set in the video BIOS.
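For those on the older GeForce hardware where this still works, the CoolBits option goes in the Device section of the xorg.conf. As a rough sketch (the Identifier string here is just a placeholder, and which CoolBits value unlocks which controls has varied across driver releases):

```
Section "Device"
    Identifier "GeForce"
    Driver     "nvidia"
    # Enables the clock frequency controls in nvidia-settings on
    # supported (pre-Fermi) GPUs; ignored for Fermi/Kepler hardware.
    Option     "Coolbits" "1"
EndSection
```

After restarting the X server, the clock sliders appear in the NVIDIA Settings panel.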
At least PowerMizer does work with the binary driver for automatically switching between performance levels. While PowerMizer works, after running the tests I realized there was a problem with what it reports: the third (highest) performance level indicates a 705MHz core clock, 3004MHz memory clock, and 1411MHz processor clock. The GDDR5 memory clock is right, but the rest are not; the graphics core clock is some 300MHz too low. Performance level two also reports the same clocks as performance level three. The Phoronix Test Suite, which reads its values from nvidia-settings (in the case of clock frequencies, via the "GPU3DClockFreqs" attribute), likewise found the GK104 core topping out at 705MHz rather than 1006MHz.
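To illustrate, querying the attribute from the command line looks something like `nvidia-settings -q GPU3DClockFreqs`, which prints the core and memory clocks as a "core,memory" pair in MHz. A minimal Python sketch of pulling the two values out of that output (the sample string below is a hypothetical reconstruction of what the driver was reporting for the GTX 680, not a captured log):

```python
import re

def parse_3d_clock_freqs(output):
    """Extract (core_mhz, memory_mhz) from the output of
    `nvidia-settings -q GPU3DClockFreqs`, which reports the
    pair as "core,memory" in MHz. Returns None if no match."""
    match = re.search(r"GPU3DClockFreqs.*?:\s*(\d+),(\d+)", output)
    if match is None:
        return None
    return int(match.group(1)), int(match.group(2))

# Hypothetical sample of the under-reported GTX 680 clocks:
sample = "Attribute 'GPU3DClockFreqs' (phoronix:0[gpu:0]): 705,3004."
print(parse_3d_clock_freqs(sample))  # core shows 705 MHz, not 1006 MHz
```

This is essentially what the Phoronix Test Suite does when recording clock frequencies for its system logs.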
In contacting the NVIDIA Linux team, they investigated and at first suspected a defective video BIOS or some other issue. In the end, however, the NVIDIA Linux developers believe the card is operating correctly; it's just not being reported as such. With Kepler, each of the GPU's performance levels covers a range of frequencies, and they think the driver is basically just showing the low-end values of that range. The reporting should be improved in a future release. For more details see this news posting.