This thread is privately moderated by Jack Crossfire, who may elect to delete unwanted replies.
Dec 18, 2013, 03:58 AM
Registered User
Jack Crossfire's Avatar
Thread OP

Android touch screen frustrations

So making a 2 stick controller on a touch screen requires detecting 2 simultaneous touch points. The challenge with a stick controller is keeping a touch point corresponding to a stick applied to that stick, no matter where the touch point moves. If the touch point corresponding to the cyclic were reassigned to the throttle whenever it drifted to the left side of the screen, it would be disastrous.

Basically, the Goog has tried at least 3 methods for handling multitouch in the Android API. There were ACTION_DOWN, ACTION_POINTER_2_DOWN, ACTION_POINTER_3_DOWN ... macros. Then they tried some really painful ACTION_POINTER_ID_SHIFT, ACTION_POINTER_ID_MASK macros, which Eclipse says are now deprecated. There's no evidence of what the current, accepted way of doing it is.

The problem is the only way of knowing the current state of the touch screen is to trap a serial stream of MotionEvent events, then construct a table of all the touch spots on your own. There's no way to poll the touch screen or get the current state of the screen with a bunch of gets.

So you get a serial stream of MotionEvent events. The MotionEvent contains an x,y coordinate for every touch point, but there is only 1 action variable describing whether a single touch point in that list was pressed or released. For every press or release of a touch point, a new MotionEvent has to be received with the x,y coordinates for all the points & an action applying to 1 point.

If there is only 1 touch point, the action variable applies to it & the enumerations are ACTION_DOWN, ACTION_UP. If there are multiple touch points, the enumerations are ACTION_POINTER_DOWN, ACTION_POINTER_UP. You have to extract the IDs of all the touch points, then match the pointer index embedded in the action variable to the ID of the relevant touch point.
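The bit fiddling above can be sketched in plain Java. This is a minimal sketch, not the Android API itself: the class & method names are made up, but the constant values are copied from android.view.MotionEvent so the decoding matches what an onTouchEvent() handler would do.

```java
// Hypothetical decoder for a raw MotionEvent action int.
// Constant values copied from android.view.MotionEvent.
public class ActionDecoder {
    static final int ACTION_MASK = 0xff;
    static final int ACTION_DOWN = 0;
    static final int ACTION_UP = 1;
    static final int ACTION_POINTER_DOWN = 5;
    static final int ACTION_POINTER_UP = 6;
    // These replaced the deprecated ACTION_POINTER_ID_MASK / ACTION_POINTER_ID_SHIFT
    static final int ACTION_POINTER_INDEX_MASK = 0xff00;
    static final int ACTION_POINTER_INDEX_SHIFT = 8;

    // Which kind of action this event carries
    static int maskedAction(int action) {
        return action & ACTION_MASK;
    }

    // Which entry in the event's pointer list the action applies to
    static int pointerIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }
}
```

For example, the old ACTION_POINTER_2_DOWN constant was 0x105: the low byte decodes to ACTION_POINTER_DOWN & the high byte decodes to pointer index 1, which is why the old per-pointer macros could be retired.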

It requires writing every event handler twice: once for the single touch point case & once for the multi touch point case. Each widget needs to record the ID of the touch point it's handling.

The Goog tries to automatically detect which touch point belongs to which finger. If it moves inside a certain distance, it's the same finger. Whatever touch point is controlling the right stick stays applied to that stick, wherever it moves on the screen.
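The table-of-touch-points bookkeeping described above can be sketched in plain Java. Everything here is hypothetical (names, screen width, the half-screen claiming rule); a real Android app would feed these methods from onTouchEvent() via getPointerId(), getX() & getY().

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of keeping each stick locked to the pointer ID that first
// touched it, no matter where that pointer later drifts.
public class StickTracker {
    static final int SCREEN_W = 800;            // assumed screen width
    int leftId = -1, rightId = -1;              // pointer IDs owning each stick
    Map<Integer, float[]> points = new HashMap<>();  // id -> {x, y}

    void down(int id, float x, float y) {
        points.put(id, new float[]{x, y});
        // Claim a stick based on which half of the screen was touched 1st
        if (x < SCREEN_W / 2) { if (leftId < 0) leftId = id; }
        else                  { if (rightId < 0) rightId = id; }
    }

    void move(int id, float x, float y) {
        // The ID stays bound to its stick even if x crosses the midline
        points.put(id, new float[]{x, y});
    }

    void up(int id) {
        points.remove(id);
        if (id == leftId) leftId = -1;
        if (id == rightId) rightId = -1;
    }

    boolean rightStickOwns(int id) { return id == rightId; }
}
```

Because the binding is keyed on the pointer ID rather than on screen position, dragging the right stick's finger across the midline never hands it to the left stick, which is exactly the disaster case described above.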
Dec 18, 2013, 04:37 AM
Registered User
There are tons of games that seem to do it no problem.

Ps, not really programming savvy, but it is much nicer when the screen is split into two halves, and your center reference for each finger is simply where you first touch that half of the screen. It makes control much more fluid than a projected "stick"
