You'd think they'd start with graphics calculations and only then think about spreadsheet calculations. Let the graphical stuff be calculated by the GPU, not the CPU!
But the last time I tried to compile their shaders (several months ago), they didn't work and looked like they broke parts of the CL standard... though it was hard to tell, given that it was all CUDA code ported/translated through pre-processor macros.
Well, that explains it. It seems NVidia's OpenCL implementation is as forgiving as its OpenGL implementation -- definitely not good. At least not good if development is done against these drivers.
First, what *is* OpenCL going to be used for in LibreOffice? As far as I know, it just doesn't do all that much heavy number crunching. Not that I'm against it; I'm just curious.
Secondly, as for AMD not helping Blender: hopefully AMD will run into some of the same bugs in their OpenCL implementation that are making your OpenCL code fail, will have to fix them, and your OpenCL code will then work on AMD GPUs.
And again: the Blender guys blame AMD, but to me it's not clear whether the fault really lies in AMD's OpenCL implementation, or whether NVidia's and Intel's implementations simply permit scenarios the spec doesn't define.
And lastly, my point of view as a user: the Blender guys should implement a workaround so that it works, and stop crying that AMD stole their candy.
Oh my lord don't argue about stupid stuff. Apparently you missed out on the rising popularity of hyperbole. He isn't trying to literally say only 3 people use Blender, only pointing out the vast difference in numbers of users between a general-purpose software suite and a specialized tool.
Originally posted by curaga
This picture sums it up with great eloquence.
You're using the fucking wrong tool for the job if your spreadsheet is that heavy.
Not at all. GPGPU usage makes things faster and it saves on battery life. Even if your spreadsheet doesn't _need_ OpenCL it's still beneficial to use it.
It's also disingenuous to claim that big spreadsheets are wrong. What _is_ the right tool for the job if you need a regular office worker to handle large volumes of tabular formulas? Have every office hire an extra $100k/year software developer to sit around in case someone needs a large data set to be processed and then spend 12 weeks designing, developing, and eventually supporting a specialized database schema and interface for what might just be a one-off use?
Modern hardware includes GPUs integrated directly into the CPU itself. You almost can't buy them separately anymore. It's irresponsible to write software that can't make use of GPGPU where there's an advantage to doing so.
Not at all. GPGPU usage makes things faster and it saves on battery life.
Very much situation dependent. I'd be inclined to think it increases power use in the spreadsheet scenario, what with having to power up more shader cores and move data to VRAM.
It's irresponsible to write software that can't make use of GPGPU where there's an advantage to doing so.
Just like autovectorization: once CPUs have fully integrated GPU units with a unified address space and no penalty for using the GPU, it's the compiler's job to make use of it with existing code.
It's irresponsible to require random app foo developers to learn a domain-specific language for a minority use case, when the compiler should be able to do a good enough job for them.