In Unity3D, it's really easy to do something when the user clicks on an object: just add a script to that object defining a method named OnMouseDown() and put your behaviour in it. However, if you want your game to support multi-touch on mobile devices or other devices with a touch screen, Unity doesn't provide an OnMouseDown() equivalent for touches. You have to manually loop through Input.touches and check their states, which can quickly become a burden if you have different kinds of objects with different behaviours.
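To show what that manual loop looks like, here is a minimal sketch of the boilerplate Unity forces on you without a manager. Looking up the touched object via Physics2D.OverlapPoint is one common approach; the class name and the dispatch details here are illustrative, not part of the sample project.

```csharp
using UnityEngine;

// Illustrative sketch: every script that cares about touches would need
// some variation of this Update loop, duplicated per behaviour.
public class ManualTouchLoop : MonoBehaviour
{
    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            // Find which 2D collider (if any) sits under this touch.
            Vector2 world = Camera.main.ScreenToWorldPoint(touch.position);
            Collider2D hit = Physics2D.OverlapPoint(world);
            if (hit == null) continue;

            switch (touch.phase)
            {
                case TouchPhase.Began:
                    // touch started over this collider
                    break;
                case TouchPhase.Moved:
                case TouchPhase.Stationary:
                    // touch held or moved while over this collider
                    break;
                case TouchPhase.Ended:
                case TouchPhase.Canceled:
                    // touch released
                    break;
            }
        }
    }
}
```

Multiply this by several object types with different behaviours and the appeal of centralizing the dispatch becomes obvious.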
That's why I made the MultiTouchInputManager class, which manages the user's touch events and forwards them to the appropriate object detected under the user's touch. The object, which inherits from an abstract MultiTouchObject class, can then implement the TouchDown, TouchPressed and TouchUp events to easily build custom behaviours like drag and drop.
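As a rough idea of what a subclass might look like, here is a hedged sketch assuming MultiTouchObject exposes TouchDown, TouchPressed and TouchUp as overridable methods receiving the dispatched Touch. The exact signatures in the sample project may differ; the highlight behaviour is just an example.

```csharp
using UnityEngine;

// Hypothetical subclass: highlights a sprite while it is being touched.
public class HighlightOnTouch : MultiTouchObject
{
    private SpriteRenderer sprite;

    void Awake()
    {
        sprite = GetComponent<SpriteRenderer>();
    }

    protected override void TouchDown(Touch touch)
    {
        sprite.color = Color.yellow; // highlight when the touch begins
    }

    protected override void TouchPressed(Touch touch)
    {
        // called while the touch stays on this object, e.g. to follow
        // the finger for a drag
    }

    protected override void TouchUp(Touch touch)
    {
        sprite.color = Color.white; // restore on release
    }
}
```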
The MultiTouchInputManager also works in tandem with the MultiTouchCameraDrag script, which allows panning and pinch-zooming of an orthographic camera. That, along with physics-based and non-physics draggable object classes inheriting from MultiTouchObject, makes a great starter pack for any 2D multi-touch-enabled project.
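The pinch-zoom idea behind a script like MultiTouchCameraDrag can be sketched as follows; this is not the actual implementation, just the usual technique of scaling the camera's orthographic size by the ratio of finger distances between frames.

```csharp
using UnityEngine;

// Illustrative sketch of two-finger pinch zoom for an orthographic camera.
public class PinchZoomSketch : MonoBehaviour
{
    public float minSize = 2f;   // closest zoom (assumed bounds)
    public float maxSize = 20f;  // farthest zoom

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch a = Input.GetTouch(0);
        Touch b = Input.GetTouch(1);

        // Distance between fingers last frame vs. this frame.
        float prevDist = ((a.position - a.deltaPosition) -
                          (b.position - b.deltaPosition)).magnitude;
        float currDist = (a.position - b.position).magnitude;
        if (currDist <= Mathf.Epsilon) return;

        // Fingers moving apart -> ratio < 1 after inversion -> zoom in
        // (a smaller orthographic size shows a smaller, magnified area).
        Camera cam = GetComponent<Camera>();
        cam.orthographicSize = Mathf.Clamp(
            cam.orthographicSize * prevDist / currDist, minSize, maxSize);
    }
}
```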
How to use the MultiTouchInputManager:
- Add the MultiTouchInputManager script component to the Camera object or to an empty object.
- Add objects to your scene with scripts that inherit from MultiTouchObject. The sample project already includes MultiTouchDraggableObject (non-physics drag and drop) and MultiTouchPhysic2DDraggableObject (drag and drop via physics forces). You can use them directly or inherit from them for custom drag-and-drop behaviours.
- Adjust the touch priority field of your objects to establish which object takes priority when multiple objects overlap under the user's touch.
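The priority resolution described in the last step could be implemented along these lines. This is a hypothetical sketch: the field name touchPriority and the helper class are assumptions, not the sample project's actual code.

```csharp
using System.Linq;
using UnityEngine;

// Hypothetical helper: among all colliders under a touch point, pick the
// MultiTouchObject with the highest priority value (field name assumed).
public static class TouchPriorityResolver
{
    public static MultiTouchObject Resolve(Vector2 worldPoint)
    {
        return Physics2D.OverlapPointAll(worldPoint)
            .Select(c => c.GetComponent<MultiTouchObject>())
            .Where(o => o != null)
            .OrderByDescending(o => o.touchPriority)
            .FirstOrDefault();
    }
}
```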
Sample project download link: MultiTouchInputManager