I've built a robot that uses a Nexus tablet in place of its face. Players interact with the robot by touching various parts of his body, completing circuits connected to the MakeyMakey.
I'm also developing a game with Construct 2 allowing players to play as the robot, using his hands, feet, and other buttons to control him. AppMobi and CocoonJS are the two current platforms available for exporting mobile applications out of Construct 2.
Unfortunately, both platforms support only touch input, while the MakeyMakey outputs only standard key presses (W, A, S, D, the arrow keys, etc.). I'm struggling to find a way to make the MakeyMakey substitute for touch input or otherwise communicate with the game.
I've approached the developers of both mobile platforms, and neither has an ETA for supporting any input other than touch.
I've found I can emulate touch input directly on the device with a shell script:

```shell
sendevent /dev/input/event0 3 57 56
sendevent /dev/input/event0 3 48 4
sendevent /dev/input/event0 3 53 1266
sendevent /dev/input/event0 3 54 34
sendevent /dev/input/event0 0 0 0
sendevent /dev/input/event0 3 57 4294967295
sendevent /dev/input/event0 0 0 0
```
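For illustration, that sendevent sequence could be wrapped in a small parameterized script. This is only a sketch: the device path `/dev/input/event0` and the multitouch event codes are assumptions that vary per device (check yours with `getevent -p`), and the script is shown in dry-run mode (`SENDEVENT=echo`) so it just prints the commands; on the tablet you would set `SENDEVENT=sendevent`.

```shell
#!/bin/sh
# Sketch: emit a synthetic tap at (x, y) using the sendevent sequence above.
# Assumptions: /dev/input/event0 is the touchscreen (verify with `getevent -p`)
# and codes 57/48/53/54 match this device's multitouch protocol.
SENDEVENT=echo              # dry-run: print commands; use `sendevent` on-device
DEV=/dev/input/event0

tap() {
  x=$1; y=$2
  $SENDEVENT "$DEV" 3 57 56           # EV_ABS ABS_MT_TRACKING_ID: finger down
  $SENDEVENT "$DEV" 3 48 4            # ABS_MT_TOUCH_MAJOR: contact size
  $SENDEVENT "$DEV" 3 53 "$x"         # ABS_MT_POSITION_X
  $SENDEVENT "$DEV" 3 54 "$y"         # ABS_MT_POSITION_Y
  $SENDEVENT "$DEV" 0 0 0             # SYN_REPORT: commit the touch-down
  $SENDEVENT "$DEV" 3 57 4294967295   # tracking id -1: finger up
  $SENDEVENT "$DEV" 0 0 0             # SYN_REPORT: commit the touch-up
}

tap 1266 34
```

A script like this would let a single file handle taps at any coordinate instead of hard-coding one position per button.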
However, I don't know of a way to trigger such a script from the key presses the MakeyMakey sends.
I've found that Tasker for Android can run the script(!), but the app cannot detect key presses, so I have no way to trigger the task.
I sense that digging into the Arduino code and customizing the MakeyMakey itself may be the best course of action, but I'm unsure whether it can output touch events, or key presses that Tasker will recognize.