I have a simple function that gets called from multiple places to help deal with layout in an iPad app during orientation changes. It looks like this:
    - (void)getWidthAndHeightForOrientation:(UIInterfaceOrientation)orientation {
        // Log the raw integer value of the orientation that was passed in
        NSLog(@"New Orientation: %ld", (long)orientation);
    }
And I call it in various places like this:
    [self getWidthAndHeightForOrientation:[[UIDevice currentDevice] orientation]];
The function normally contains some simple code that runs if the orientation is portrait or landscape. Unfortunately, it wasn't working as expected: when the app launches in what should be orientation 1 (portrait), I get 0 as the result. Later, if the function is called in the same way but the device has never been rotated, I get back a value of 5. What do these values mean, and why would the orientation property return them?
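To illustrate, here is a simplified sketch of what the full function does (layoutWidth and layoutHeight stand in for my real properties, and the sizes are just placeholders):

    - (void)getWidthAndHeightForOrientation:(UIInterfaceOrientation)orientation {
        NSLog(@"New Orientation: %ld", (long)orientation);
        if (UIInterfaceOrientationIsPortrait(orientation)) {
            // Runs for the two portrait orientations
            self.layoutWidth  = 768.0f;
            self.layoutHeight = 1024.0f;
        } else if (UIInterfaceOrientationIsLandscape(orientation)) {
            // Runs for the two landscape orientations
            self.layoutWidth  = 1024.0f;
            self.layoutHeight = 768.0f;
        }
        // When 0 or 5 comes through, neither branch matches,
        // so the layout values never get updated.
    }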
In short: why would [[UIDevice currentDevice] orientation] ever return 0 or 5 instead of a value between 1 and 4?
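For reference, this is roughly how the UIDeviceOrientation enum is declared in UIKit's UIDevice.h (the numeric comments are mine):

    typedef NS_ENUM(NSInteger, UIDeviceOrientation) {
        UIDeviceOrientationUnknown,            // 0
        UIDeviceOrientationPortrait,           // 1 - home button at the bottom
        UIDeviceOrientationPortraitUpsideDown, // 2 - home button at the top
        UIDeviceOrientationLandscapeLeft,      // 3 - home button on the right
        UIDeviceOrientationLandscapeRight,     // 4 - home button on the left
        UIDeviceOrientationFaceUp,             // 5 - flat, screen facing up
        UIDeviceOrientationFaceDown            // 6 - flat, screen facing down
    };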