We are having the following problem: a series of Core Image filters runs constantly in our program. When testing on my MacBook Pro, Core Image schedules all of the graphics computation on the GPU, as expected. On a Mac Pro, however, CI uses the CPUs! This is a problem, as we need them for other processing. [1]

The question now is: can one tell CI to run exclusively on the GPU?

[1] Both machines are current models. The Mac Pro has 8 cores.

+1  A: 

Core Image always uses the GPU when it can, unless software rendering is explicitly requested. Check your console logs: Core Image should print a message saying why it fell back to the software renderer.
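
If you want to rule out the software path explicitly, you can pass the documented useSoftwareRenderer option when creating your CIContext. A minimal sketch using the modern Metal-backed API (the same option key existed on the CGL-backed contexts of that era):

    import CoreImage
    import Metal

    // A minimal sketch: create a CIContext pinned to the GPU.
    // Assumes a Metal-capable device is available.
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("No Metal device available")
    }
    let context = CIContext(mtlDevice: device, options: [
        .useSoftwareRenderer: false  // disallow the CPU rendering path
    ])

Note that this is a hint, not a guarantee: Core Image can still fall back for individual filters it cannot run on the GPU, which is why the log check above matters.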

Another option is to try a binary search to find the offending filter. Strip stuff out of your chain until you can tell what's causing the problem.
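
As a rough illustration of that bisection, here is a hypothetical Swift sketch; filters and inputImage stand in for your own chain:

    import CoreImage

    // Hypothetical helper: render only a slice of the filter chain,
    // so each half of the chain can be tested in isolation.
    func output(of filters: ArraySlice<CIFilter>, from input: CIImage) -> CIImage? {
        var image = input
        for filter in filters {
            filter.setValue(image, forKey: kCIInputImageKey)
            guard let next = filter.outputImage else { return nil }
            image = next
        }
        return image
    }

    // Render with the first half, then the second half, and keep
    // halving whichever slice still triggers the CPU fallback:
    // let half = filters.count / 2
    // _ = output(of: filters[..<half], from: inputImage)
    // _ = output(of: filters[half...], from: inputImage)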

See this thread for someone else debugging a problem like this.

Ken