I have a UINavigationController with two UIViewControllers on it (A and B). From A, I push B onto the stack. Then the user re-orients the device. I need to move some things around the screen (buttons, etc.) on A so they're visible in the new orientation.
I find that -shouldAutorotateToInterfaceOrientation: is getting called on both A and B (and returning YES), but -willRotateToInterfaceOrientation: / -didRotateFromInterfaceOrientation: are only getting called on the visible view controller (B). When B gets popped off the stack, A is shown in the new (correct) orientation, but without the buttons getting moved as needed.
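For reference, both A and B handle the rotation check the same way — roughly this (a minimal sketch; my real methods just return YES as described above):

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation {
    // Both A and B report that rotation to any orientation is allowed.
    return YES;
}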
To solve this, I find myself implementing the following pattern:
In the header file:

@interface A : UIViewController {
    // ...
    UIInterfaceOrientation layoutOrientation; // the orientation the view was last laid out for
}
// ...
- (void)orientationChanged;
@end
In the .m file:

- (void)viewDidLoad {
    // ...
    layoutOrientation = self.interfaceOrientation;
}

- (void)viewWillAppear:(BOOL)animated {
    // ...
    // The orientation may have changed while this view was covered by B.
    if (layoutOrientation != self.interfaceOrientation) {
        [self orientationChanged];
    }
}

- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation {
    [self orientationChanged];
}

- (void)orientationChanged {
    // move my buttons
    layoutOrientation = self.interfaceOrientation;
}
Essentially, I'm checking in -viewWillAppear: whether the orientation changed while the view was off-screen, and doing the work to update the UI there if needed. It works just fine, but it seems (a) tedious and (b) leads to a lot of duplicated code across my various classes like A. I could fix (b) by moving the code to a common superclass, but this still feels like something I shouldn't have to do.
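For example, (b) might end up looking something like this — just a sketch, where OrientationAwareViewController is a hypothetical name, the method bodies above move into it unchanged, and subclasses override -orientationChanged:

// Hypothetical common superclass that owns the orientation bookkeeping.
@interface OrientationAwareViewController : UIViewController {
    UIInterfaceOrientation layoutOrientation;
}
- (void)orientationChanged; // subclasses override this to move their buttons
@end

// A (and classes like it) would then just subclass it:
@interface A : OrientationAwareViewController
// ...
@end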
Is there a better way of moving my buttons around on views that are not top-most on the navigation stack? My views come from .xibs, in case there's something in Interface Builder I need to check. Or should I just be designing my views so that they don't need buttons moved around when the orientation changes?
Thanks!