In my code I'm trying to show a UIWebView as a page is loading, and then, when it's done, capture an image from the web view to cache and display later (so I don't have to reload and re-render the web page). I have something along the lines of:
// Render the web view's layer into a bitmap context
CGContextRef context = CGBitmapContextCreate(…);
[[webView layer] renderInContext:context];

// Snapshot the context's contents into a UIImage
CGImageRef imageRef = CGBitmapContextCreateImage(context);
UIImage *image = [UIImage imageWithCGImage:imageRef];

// Core Graphics objects aren't autoreleased; release them explicitly
CGImageRelease(imageRef);
CGContextRelease(context);
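For reference, the same capture can be written with UIKit's image-context helpers, which set up the bitmap context and screen scale for you; this is a sketch, not necessarily what my code does, and the method name is illustrative:

// Sketch: snapshot a UIWebView into a UIImage using UIKit's
// UIGraphicsBeginImageContextWithOptions instead of a hand-built
// CGBitmapContext. A scale of 0.0 means "use the device's scale".
- (UIImage *)snapshotOfWebView:(UIWebView *)webView {
    UIGraphicsBeginImageContextWithOptions(webView.bounds.size, YES, 0.0);
    [[webView layer] renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}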
The problem I'm running into is that, due to UIWebView's tiling, sometimes only half of the page has been rendered into the context by the time I capture the image. Is there a way to detect, or block on, UIWebView's background rendering thread so that I can grab the image only after all of the rendering has finished?
UPDATE: It may be that thread race conditions were a red herring (at any rate, the documentation doesn't make clear whether UIWebView's custom layer, or a CATiledLayer in general, blocks on its background threads). This may instead have been an invalidation issue (despite several sorts of calls to setNeedsDisplay on both the UIWebView and its layer). Changing the bounds of the UIWebView before rendering it appears to have eliminated the "not drawing the whole thing" problem.
I still ran into a problem where a few tiles were drawn at the old scale, but calling renderInContext: twice seems to have mitigated that sufficiently.
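The workaround described above might be sketched as follows; the specific bounds nudge and the double render are empirical guesses at what forces the tiled layer to redraw, not documented behavior:

// Sketch of the workaround: perturb the web view's bounds to force the
// tiled layer to invalidate, then render twice so stale tiles get redrawn.
CGRect bounds = webView.bounds;
webView.bounds = CGRectInset(bounds, 1.0f, 1.0f);  // force invalidation
webView.bounds = bounds;                            // restore original size

UIGraphicsBeginImageContextWithOptions(bounds.size, YES, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[[webView layer] renderInContext:ctx];  // first pass: may include stale tiles
[[webView layer] renderInContext:ctx];  // second pass: mitigates old-scale tiles
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();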