I use QTKit with a QTCaptureMovieFileOutput to capture from my iSight camera. Below is a short excerpt of the code that sets the compression options:
NSEnumerator *connectionEnumerator = [[mCaptureMovieFileOutput connections] objectEnumerator];
QTCaptureConnection *connection;
while ((connection = [connectionEnumerator nextObject])) {
    NSString *mediaType = [connection mediaType];
    QTCompressionOptions *compressionOptions = nil;
    if ([mediaType isEqualToString:QTMediaTypeVideo]) {
        compressionOptions = [QTCompressionOptions compressionOptionsWithIdentifier:@"QTCompressionOptionsSD480SizeH264Video"];
    } else if ([mediaType isEqualToString:QTMediaTypeSound]) {
        compressionOptions = [QTCompressionOptions compressionOptionsWithIdentifier:@"QTCompressionOptionsHighQualityAACAudio"];
    }
    [mCaptureMovieFileOutput setCompressionOptions:compressionOptions forConnection:connection];
}
// Decrease the frame rate to be nicer to the CPU
// (a minimum frame interval of 0.2 s caps capture at 5 fps):
[mCaptureMovieFileOutput setMinimumVideoFrameInterval:0.2];
// Looks like we're good to go.
[mCaptureSession startRunning];
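For context, the excerpt above assumes the session, device input, and movie file output have already been wired up. My setup looks roughly like this (a sketch; the ivar names mCaptureSession, mCaptureDeviceInput, and mCaptureMovieFileOutput, the output path, and the error handling are placeholders, not literal code from my project):

#import <QTKit/QTKit.h>

NSError *error = nil;
mCaptureSession = [[QTCaptureSession alloc] init];

// Open the default video device (the built-in iSight).
QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
if (![device open:&error]) { /* handle error */ }

// Attach the device as an input to the session.
mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
if (![mCaptureSession addInput:mCaptureDeviceInput error:&error]) { /* handle error */ }

// Attach the movie file output and point it at a file.
mCaptureMovieFileOutput = [[QTCaptureMovieFileOutput alloc] init];
if (![mCaptureSession addOutput:mCaptureMovieFileOutput error:&error]) { /* handle error */ }
[mCaptureMovieFileOutput recordToOutputFileURL:[NSURL fileURLWithPath:@"/tmp/capture.mov"]];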
My problem is that this code eats a lot of CPU (about 60% usage), even though it otherwise works fine. I did some optimization: I used [mCaptureMovieFileOutput setMinimumVideoFrameInterval:0.2] to drop frames, and the speed is a lot better.
Still, I cannot believe this cannot be faster. I have read about OpenCL and other hardware-accelerated options, but QuickTime seems to use hardware acceleration only for playback. Is there any way to enable optimizations in QTKit, or is there another framework for capturing from the iSight camera with fast compression that doesn't consume so much CPU power?
This QTKit approach works, but it is not suitable for real-time applications (e.g. with additional video processing). On the iPhone or the new iPod touch the same kind of capture runs perfectly with less CPU power. But what about OS X?
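To illustrate the kind of real-time processing I mean: instead of compressing to a file, one could receive uncompressed frames via a QTCaptureDecompressedVideoOutput delegate and process each frame as it arrives. This is only a rough sketch (MyFrameProcessor is a hypothetical class, and the wiring assumes the mCaptureSession from above):

#import <QTKit/QTKit.h>

// Hypothetical delegate that receives each decompressed frame.
@interface MyFrameProcessor : NSObject
@end

@implementation MyFrameProcessor
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    // Per-frame video processing would go here;
    // videoFrame is a CVImageBufferRef holding the raw pixels.
}
@end

// Wiring it into the session (inside some setup method):
// QTCaptureDecompressedVideoOutput *videoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
// [videoOutput setDelegate:[[MyFrameProcessor alloc] init]];
// NSError *error = nil;
// if (![mCaptureSession addOutput:videoOutput error:&error]) { /* handle error */ }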