I had a fun little problem in my work-in-progress iOS app recently. The app recognises a pan gesture on the main view, but also has some buttons as subviews of the main view. Think of a white rectangle that receives pan gestures, with a UIButton in the middle of that view.
Everything worked fine in the simulator. Panning worked. Tapping the button worked. Not so on a real device.
On a real device, tapping the button would often trigger the pan gesture's action instead, as I had code bound to the end of the pan gesture to perform a task. The pan gesture wasn't visibly panning anything either, so it wasn't obvious at first that the pan gesture was occurring at all.
The reality is that when a human finger taps a button on a real iOS device, the touch can easily be detected as a slight pan. A "smudge" gesture, if you will. If there was any movement at all in the tapping finger, the pan gesture would trigger instead of the button's action.
The solution was simple once I understood UIGestureRecognizer: you need to tell the recognizer to ignore new touches that begin in the subviews. It seems rather counter-intuitive that we should have to do this at all, given that UIControls are "primary" UI elements that the user interacts with, and having their behaviour messed with by gestures in the parent view hierarchy is completely pointless from what I can tell. You would expect any touch that starts on a UIControl to be handled by that control as a priority. Anyway, on to the solution.
You need to set a delegate on the gesture recognizer, and then in that delegate implement gestureRecognizer(_:shouldReceive:).
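A minimal sketch of that delegate method in Swift, assuming the view controller that owns the pan gesture also acts as its delegate (MyViewController is an illustrative name, not from the original post):

```swift
import UIKit

extension MyViewController: UIGestureRecognizerDelegate {

    // Called before the recognizer is handed a new touch. Returning
    // false makes the pan recognizer ignore touches that begin on a
    // UIControl, so button taps are never swallowed by a slight "smudge".
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldReceive touch: UITouch) -> Bool {
        return !(touch.view is UIControl)
    }
}
```

Remember to assign the delegate when you create the recognizer (e.g. `panRecognizer.delegate = self`), or this method will never be consulted.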
This is one solution: test whether the topmost view that received the touch is a UIControl. You will need to change this to whatever is most appropriate for your UI. You may want to simply check whether the touched view is a subview of the view containing the gesture recognizer, though that could be too expensive if you have deeply nested views.
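If you prefer the subview-based check, one cheap variant (a sketch of my own, not from the original post) is to accept only touches that land directly on the recognizer's own view:

```swift
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                       shouldReceive touch: UITouch) -> Bool {
    // gestureRecognizer.view is the view the recognizer is attached to;
    // any touch that begins on one of its subviews (button or otherwise)
    // is ignored by the pan recognizer.
    return touch.view === gestureRecognizer.view
}
```

This variant also sidesteps the cost concern, since it is a single identity comparison rather than a walk up a nested view hierarchy.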