3D Clickable Objects in Unity C#: Interesting Problem / Solution using IClickable Interface

(I yanked this directly out of a Google Plus post I made, and have tried to refine it a bit.)

I ran into something a little interesting and came up with a semi-elegant solution in a very OO tradition (in my humble opinion). What follows is a fairly detailed history of the problem, so you'll know how I ended up where I did. If you have OnMouseDown woes with 3D objects, maybe consider this solution.


The Problem

While designing a 3D level selection screen that allows the user to navigate by touch/mouse, select a level, or go back to the home screen, I had nothing but issues getting "clicking" to work. That is, selecting one of the on-screen options was finicky to begin with, and became downright problematic as time went on. The sequence of events went something like this:

1. Implemented clickable objects with OnMouseDown (being the newbie I am to Unity). It worked in the editor and on Android, but I quickly learned that this is a poor practice. Doesn’t matter though, because…

2. Began implementation of a draggable, navigable scene. This went very smoothly in the editor, and didn't go well at all in the Android environment. It doesn't matter though, because through some trial and error I figured it out, and then…

3. While I was implementing raycast clicking for Android, no matter what I clicked, the game would "click" the most recently added item. I pretty quickly figured out why, but the important detail is that…

4. OnMouseDown stopped working on the entire scene. I can’t quite pinpoint when this started, but suddenly none of my clickables were clicking. This was very puzzling, and some online searching revealed that OnMouseDown is not a great method. At this point, I had my idea…


THE SOLUTION

I knew I wanted to use raycasting anyway. It would be a better solution for Android, and testing would be easier if I implemented a uniform method of "clicking" across both my editor and Android environments. To accommodate this, I created the simple IClickable interface. This monolithic script follows in its entirety:


using UnityEngine;
using System.Collections;

public interface IClickable
{
    void Click();
}


I know, it’s quite the behemoth. Now, when I create my clickable main menu button, or level button, or whatever, I declare the class as follows:


public class MainMenuButtonBlock : MonoBehaviour, IClickable {


And within the class, I implement the interface:

#region IClickable implementation
public void Click()
{
    Application.LoadLevel(GameConstants.MAINMENU);
}
#endregion
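Any other clickable works the same way. For instance, a level button might look like this (LevelButtonBlock, the levelIndex field, and loading by index are my illustrative assumptions, not code from the original):

```csharp
using UnityEngine;

// Hypothetical level-select button; the class name, field, and
// level-loading call are illustrative assumptions.
public class LevelButtonBlock : MonoBehaviour, IClickable
{
    // Set in the Inspector to the level this button should load.
    public int levelIndex;

    #region IClickable implementation
    public void Click()
    {
        // Same legacy loading API used elsewhere in this post.
        Application.LoadLevel(levelIndex);
    }
    #endregion
}
```

The point is that each button carries its own Click() behavior, and the scene controller never needs to know which concrete class it hit.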


I implement this interface on scripts attached to the items that I want clickable, essentially creating a Click event handler. Now, I need to invoke the handler. This is also simple:


In the Update method of a game object controlling the scene logic, I have an #if…#elif compiler directive, the first branch for UNITY_EDITOR, the second for UNITY_ANDROID. The only difference between the two is that the editor branch checks the mouse state and position, while the Android branch checks touch input. Once I have a position, I cast a ray, check the collisions, and see if I've hit one of my clickable objects:
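That directive split might be sketched roughly like this (a minimal sketch, assuming a simple one-finger tap; the exact touch logic is yours to choose):

```csharp
// Inside Update() of the scene-controller script.
#if UNITY_EDITOR
    // Editor: a "click" is the left mouse button going down this frame.
    if (!Input.GetMouseButtonDown(0))
        return;
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
#elif UNITY_ANDROID
    // Android: a "click" is a touch that just began this frame.
    if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
        return;
    Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
#endif
```

Either branch produces the ray that feeds the Physics.Raycast call.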


RaycastHit hit = new RaycastHit();
// ...TOUCH LOGIC OF YOUR CHOOSING TO CREATE THE RAY...
if (Physics.Raycast(ray, out hit))
{
    GameObject clicked = hit.collider.gameObject;
    // **Here's the important part**:
    IClickable clickedItem = clicked.GetComponent(typeof(IClickable)) as IClickable;
    if (clickedItem != null)
        clickedItem.Click();
}


Just tested in both environments, and I'm good to go. Hopefully this helps somebody!