I have a question about device rotation and how Cinder handles it.
I see in the examples that if you want to use the device in landscape mode, you need to call glRotate before drawing. I don't like this solution, because if you want to interact with your objects you also have to rotate the touch positions to match. I also see nothing that detects whether the screen is in landscape or portrait orientation.
But I have seen applications, like the beautiful Planetary, that handle this rotation very well.
So here is my question:
How can you rotate the whole app (drawing, touches, etc.) the way an Objective-C app would, just by calling a single method?
I have vertical and horizontal layouts for iOS apps. In my version of Cinder (which needs a more recent merge with master soon), I've added orientation events. When the orientation changes, I draw the scene into an FBO and use its texture to rotate a fullscreen rectangle for a smooth transition.
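A minimal, self-contained sketch of the transition logic this describes, assuming a simple per-frame ease toward a target angle. The Orientation enum, the angle assignments, and the easing factor are my own invention, not Cinder API; the FBO part (render the scene to an FBO, then draw its texture as a rotated fullscreen quad) is only indicated in comments.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical orientation enum; Cinder exposes its own names for these.
enum class Orientation { Portrait, LandscapeLeft, PortraitUpsideDown, LandscapeRight };

// Assumed target rotation angle (degrees) for each orientation.
float targetAngleFor( Orientation o )
{
    switch( o ) {
        case Orientation::Portrait:           return 0.0f;
        case Orientation::LandscapeLeft:      return 90.0f;
        case Orientation::PortraitUpsideDown: return 180.0f;
        case Orientation::LandscapeRight:     return 270.0f;
    }
    return 0.0f;
}

// Called once per frame: ease the current angle toward the target.
// In the app, this angle would be applied when drawing the FBO's texture
// as a fullscreen quad, e.g.:
//   fbo.bindFramebuffer(); /* draw scene */ fbo.unbindFramebuffer();
//   /* rotate by 'angle', then draw fbo.getTexture() fullscreen */
float stepAngle( float current, float target, float ease = 0.2f )
{
    return current + ( target - current ) * ease;
}
```

Because only the final composited texture is rotated, the scene itself can keep drawing in its own coordinate space during the transition.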
About the touch events: I was asking whether rotated images are still hit-tested at their displayed position on screen, i.e. whether Rectf::contains would return true when the finger is inside the rectangle as displayed. Sorry if I'm being confusing.
Ok, it seems that this is not what I was looking for.
When you rotate the screen 90 degrees, the x axis becomes vertical and the y axis horizontal. Because the image is rotated, the touch positions no longer correspond to the displayed positions.
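The mismatch can be undone by mapping each raw touch back into the rotated content's coordinate space. Here is a hedged sketch, assuming the content is drawn rotated 90 degrees clockwise (translate by the portrait width, then rotate); the function name, the Point struct, and the clockwise convention are my own assumptions, not anything from Cinder:

```cpp
#include <cassert>

// Minimal point type for the sketch.
struct Point { float x, y; };

// Map a raw touch, reported in portrait device coordinates, into the
// coordinate space of content drawn rotated 90 degrees clockwise for
// landscape. portraitW is the device screen width in portrait; the
// content canvas is then portraitH wide and portraitW tall.
Point touchToLandscape( Point raw, float portraitW )
{
    // Drawing with translate(portraitW, 0) then rotate(90) sends a
    // content point (u, v) to screen position (portraitW - v, u).
    // Inverting that mapping recovers the content position:
    return { raw.y, portraitW - raw.x };
}
```

With the touch converted this way, an unmodified Rectf::contains-style test against the content-space rectangle gives the expected result.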
I personally have some wrapper code that adjusts the coordinates of the input event based on the rotation. The code below is what I use to convert an unrotated position to a correctly rotated one (it probably won't work exactly as is; I have a special enum for orientations, and actualWidth and actualHeight already reflect the orientation):