I have a question which may be a pipe dream, but I wanted to know if any of my fellow Stack Overflowers could help me with it.
In the company I work for, we do billions of image manipulations each month. Basically, we take a massive image, slice it into 256×256-pixel tiles, color-quantize them, and save them as PNGs, then move on to the next mammoth image. We employ a number of techniques to do this as quickly as possible, and currently it IS very fast, but I think there is a chance we could make it stellar in speed.
The application itself is .NET 2.0. It loops through the bytes in the large image, reading out the bytes for each smaller image, and uses GDI to save each one after it has run through a quantization algorithm. We have dozens of machines running this application, and all of them have Nvidia GeForce 8 video cards (or better).
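To give a clearer picture, the per-tile loop is roughly like the C# sketch below (heavily simplified; `SliceAndSave` and `Quantize` are illustrative names, not our actual code, and the real quantizer is obviously not a no-op):

    using System;
    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;

    class TileSlicer
    {
        // Carve the huge source bitmap into 256x256 tiles, quantize each tile,
        // and write it out as a PNG via GDI+ (System.Drawing).
        static void SliceAndSave(Bitmap source, string outputDir)
        {
            const int tileSize = 256;

            for (int y = 0; y < source.Height; y += tileSize)
            {
                for (int x = 0; x < source.Width; x += tileSize)
                {
                    // Clamp the tile size at the right/bottom edges of the image.
                    int w = Math.Min(tileSize, source.Width - x);
                    int h = Math.Min(tileSize, source.Height - y);

                    using (Bitmap tile = source.Clone(new Rectangle(x, y, w, h), source.PixelFormat))
                    using (Bitmap quantized = Quantize(tile))
                    {
                        string name = string.Format("tile_{0}_{1}.png", x, y);
                        quantized.Save(Path.Combine(outputDir, name), ImageFormat.Png);
                    }
                }
            }
        }

        // Placeholder for our color-quantization pass; here it just copies the tile.
        static Bitmap Quantize(Bitmap tile)
        {
            return (Bitmap)tile.Clone();
        }
    }

The slicing, the quantization, and the PNG encoding are the three steps I am wondering about offloading.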
Is there a way I can use the GPU instead of the CPU to perform any or all of the tasks above? If so, how do I go about it? Unfortunately I have never coded anything like this before, so if anyone can help me, I might need it explained quite thoroughly (and slowly).