I think your problem is in your frame adjustment. My gut tells me it has something to do with the logic being inside shouldAutorotateToInterfaceOrientation, because that method is called before the rotation is actually done. Try doing your math the way they do it in this article and see if that helps.
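Roughly, the idea looks like this (a minimal sketch, not the article's code; the contentView property and the frame values are placeholders for whatever you are repositioning):

    // In your UIViewController subclass.
    - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
        // Only answer yes/no here -- the rotation has not happened yet,
        // so the view still reflects the old orientation.
        return YES;
    }

    - (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                             duration:(NSTimeInterval)duration {
        // This runs inside the rotation animation block, after the new
        // orientation is known, so frame math done here animates correctly.
        if (UIInterfaceOrientationIsLandscape(toInterfaceOrientation)) {
            self.contentView.frame = CGRectMake(0, 0, 480, 300);
        } else {
            self.contentView.frame = CGRectMake(0, 0, 320, 460);
        }
    }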
I am actually still struggling with the same problem (a year later ... need iOS 3.0 compatibility). The rotation part was easy. However, the views and buttons don't seem to process touch and move/drag events. I've verified that the window (UIWindow) is receiving the touch events and at the right locations.
It's as if the last third of the screen (i.e. the region from x = 320 to x = 480) doesn't propagate events to receivers after a rotation from portrait to landscape.
The example in the link by slf doesn't help since the rotated view controller doesn't have responders.
Any ideas?
I've fixed the problem I was having. I basically needed to call [self.navigationController.view setNeedsLayout].
The way I understand this (which may be incorrect) is that self.navigationController.view.frame was the same as self.view.frame, and both were equal to (x=0, y=0, width=320, height=480). I then rotated self.view by M_PI/2 and did a number of frame manipulations on select self.view.subviews to get everything to animate/position/scale correctly.
That worked out okay, but the navigation controller was not willing to deliver touch events to the parts of self.view that were to the right of x = 320. In essence, if self.navigationController.view.clipsToBounds were true, it might not even have shown that part of self.view.
Anyway, calling setNeedsLayout on the navigation controller's view resolved the issue. I hope this helps someone else. I suspect that was also the problem SorinA was having with buttons not getting touch events.
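For anyone else hitting this, the sequence was roughly the following sketch (the exact frames and subview adjustments in my code were more involved; treat the numbers as illustrative):

    // Inside the view controller, when switching to a manual landscape layout.
    // Rotate the controller's view 90 degrees around its center, then size it
    // so it fills the portrait-oriented window in landscape.
    self.view.transform = CGAffineTransformMakeRotation(M_PI / 2);
    self.view.bounds = CGRectMake(0, 0, 480, 320);   // swapped width/height
    self.view.center = CGPointMake(160, 240);        // center of the 320x480 window

    // ...frame manipulations on select self.view.subviews go here...

    // Without this, the navigation controller's view kept its old layout and
    // stopped delivering touches to the part of self.view beyond x = 320.
    [self.navigationController.view setNeedsLayout];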