Therefore, I'm assuming that WebCL uses a separate implementation of OpenCL and will require a separate kernel compiler? I don't know whether graphics cards have memory segmentation, but if they don't, that would be the only way, and it would lead to binaries with somewhat decreased performance. I've searched online before, but I couldn't find anything regarding graphics card memory segmentation.
Originally Posted by smitty3268
The main form of protection is provided by the IOMMU on the northbridge, which prevents the graphics card from snooping any CPU address space it isn't expressly allowed to access.
Originally Posted by vadix
Then there is protection at the API / driver level (allowing an app to execute kernels only on memory it has access to). The good thing is that most APIs don't allow completely random reads/writes anywhere in memory. In theory that would be technically possible, but in practice OpenCL enforces that you allocate buffers and provide them to your kernel.
Litecoin mining, that would be.
Originally Posted by Veerappan
Bitcoin mining is now in the hands of specialized hardware (ASICs) that is *vastly* more performant than graphics cards: GPUs get in the range of megahashes per second, ASICs get in the range of gigahashes per second, and the latest "huge pile of ASICs in a big server case" rigs reach the terahash-per-second range. A GPU is useless next to that.
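For reference, Bitcoin's proof-of-work is just SHA-256 applied twice to the block header, which is exactly why it maps so well onto fixed-function ASICs. A minimal sketch in Python (simplified: a real header is an 80-byte structure and the target is derived from the difficulty bits, neither of which is modeled here):

```python
import hashlib

def double_sha256(header: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def mine(header_base: bytes, target: int, max_nonce: int = 1_000_000):
    """Toy mining loop: try nonces until the hash, read as an integer,
    falls below the target. Returns the winning nonce or None."""
    for nonce in range(max_nonce):
        digest = double_sha256(header_base + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "little") < target:
            return nonce
    return None
```

An ASIC does nothing but this double-SHA-256 pipeline in silicon, millions of instances in parallel, which is why it outclasses a general-purpose GPU by orders of magnitude on this one workload.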
Litecoin uses a different mining algorithm (scrypt instead of double SHA-256) and is thus heavily memory-dependent. Newer generations of hardware don't make such a big jump forward: while it's mined on GPUs, big high-end CPU clusters are still competitive, and future ASICs look like they will be in the same performance range as GPUs, only a bit more economical on energy.
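The memory dependence is easy to see from scrypt's parameters. Assuming Litecoin's standard parameters (N=1024, r=1, p=1, with the block header serving as both password and salt), each hash needs a scratch buffer of roughly 128·N·r bytes, about 128 KiB, versus the few dozen bytes of state SHA-256 keeps. A sketch using Python's standard `hashlib.scrypt` (available with OpenSSL 1.1+):

```python
import hashlib

def litecoin_pow_hash(header: bytes) -> bytes:
    # Litecoin-style proof-of-work: scrypt with N=1024, r=1, p=1,
    # using the block header as both the password and the salt.
    return hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

# scrypt's memory cost: the mixing step keeps N blocks of 128*r bytes alive,
# so an ASIC can't shrink the per-core footprint the way it can for SHA-256.
N, r = 1024, 1
scratch_bytes = 128 * N * r
print(f"scrypt scratch space per hash: {scratch_bytes // 1024} KiB")  # 128 KiB
```

That per-hash working set is what keeps commodity hardware with fast RAM competitive and blunts the ASIC advantage.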
Originally Posted by curaga
More seriously, that is indeed going to be a big problem, especially since more and more online devices are portable (laptops, tablets, smartphones), and the impact such in-browser malware has on battery life is going to be huge.