If you really want to use the Arduino, the first step is getting it working with the touchscreen the way you intend to use it. Write Arduino functions for all the basic required actions:

- A touchEvent interrupt handler that records the location and timestamp of each touch event.
- A drawScreen function that paints the screen from an array of pixel_t.
- Frequently used image maps (menus, buttons, background) stored in Arduino flash, if you have the space, for quick display.
- A text2imageMap function.
- Matrix math functions for moving your small element image maps around within the main pixel array, handling overlaps, off-screen clipping, and so on.
- An analyzeTouch function that parses a generic touchEvent into a logical action such as buttonPush, swipe, scale, rotate, or areaSelect, passing along the original timestamp.
- For each action (menu selection, for example): if the display should always show a fixed new set of choices based on a given choice, code that into the Arduino.
- Functions to store and read back state or configuration data from EEPROM.

Basically, code your entire user interface into the Arduino (if it will fit) so that it can display every required element and parse all meaningful touch information, discarding the rest. Don't implement any actual functionality of the selections in the Arduino; just record the selections.
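To make the image-map math concrete, here is a minimal sketch of a blit routine that copies a small image map into the main pixel array with off-screen clipping. Everything here is an assumption for illustration: pixel_t as 16-bit RGB565 (common on Arduino TFT shields), the 320x240 dimensions, and the blit name itself. Overlap is handled simply by letting later blits overwrite earlier ones, which is enough for opaque elements.

```cpp
#include <cstdint>

typedef uint16_t pixel_t;          // assumption: 16-bit RGB565, common on TFT shields

const int SCREEN_W = 320;          // hypothetical screen dimensions
const int SCREEN_H = 240;

// Note: a full 320x240 16-bit frame is ~150 KB and will not fit in most
// Arduino SRAM; on the real board you would blit to a smaller buffer or
// straight to the display controller. A full frame is used here for clarity.
pixel_t frame[SCREEN_H][SCREEN_W];

// Copy a small image map into the frame at (x, y), clipping anything that
// falls off screen. Negative x/y are fine; only the visible part is drawn.
void blit(const pixel_t *img, int imgW, int imgH, int x, int y)
{
    for (int row = 0; row < imgH; ++row) {
        int fy = y + row;
        if (fy < 0 || fy >= SCREEN_H) continue;     // clip top/bottom
        for (int col = 0; col < imgW; ++col) {
            int fx = x + col;
            if (fx < 0 || fx >= SCREEN_W) continue; // clip left/right
            frame[fy][fx] = img[row * imgW + col];
        }
    }
}
```

drawScreen would then push frame (or the dirty part of it) out to the display after all the blits for the current state are done.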
Next step: devise a protocol. Figure out which commands sent to the Arduino should trigger display of which stored image maps, and where on the screen; how to receive a serialized pixel_t screen and pass it to drawScreen; and how to send out notices of logical actions with their timestamps.
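As one illustration of what such a protocol could look like, here is a minimal byte-framed sketch. The frame layout, the 0xA5 start byte, the opcodes, and the XOR checksum are all assumptions chosen for simplicity, not any standard; replace them with whatever suits your link.

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical one-byte opcodes; adjust to your own command set.
enum Cmd : uint8_t {
    CMD_SHOW_IMAGE = 0x01,  // host -> Arduino: draw stored image map id at (x, y)
    CMD_ACTION     = 0x80   // Arduino -> host: logical action + timestamp
};

// Frame layout: [0xA5][opcode][len][payload...][XOR checksum]
size_t encodeFrame(uint8_t *out, uint8_t opcode, const uint8_t *payload, uint8_t len)
{
    out[0] = 0xA5;
    out[1] = opcode;
    out[2] = len;
    uint8_t sum = opcode ^ len;
    for (uint8_t i = 0; i < len; ++i) {
        out[3 + i] = payload[i];
        sum ^= payload[i];
    }
    out[3 + len] = sum;
    return (size_t)len + 4;
}

// Returns true and fills opcode/payload/len if the frame checks out.
bool decodeFrame(const uint8_t *in, size_t n,
                 uint8_t *opcode, const uint8_t **payload, uint8_t *len)
{
    if (n < 4 || in[0] != 0xA5) return false;
    uint8_t l = in[2];
    if (n < (size_t)l + 4) return false;
    uint8_t sum = in[1] ^ l;
    for (uint8_t i = 0; i < l; ++i) sum ^= in[3 + i];
    if (sum != in[3 + l]) return false;
    *opcode  = in[1];
    *payload = in + 3;
    *len     = l;
    return true;
}
```

A serialized pixel_t screen would just be a long payload (or a sequence of frames, one per row) under its own opcode, handed to drawScreen once assembled.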
Now code your BeagleBone, PC, or really any other processor to interface with your protocol.
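On the host side, the main chore is pulling well-formed frames out of the raw serial byte stream, since bytes can arrive with garbage or mid-frame. A sketch of that, assuming a [0xA5][opcode][len][payload][XOR checksum] layout and a hypothetical 0x80 action opcode carrying one action-kind byte plus a little-endian 32-bit timestamp (all of which should be swapped for whatever protocol you actually devised):

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

struct Action {
    uint8_t  kind;         // e.g. buttonPush, swipe, ... (your own encoding)
    uint32_t timestampMs;  // the original touch timestamp, passed through
};

// Scan a raw byte buffer (as read from the serial port) for valid action
// frames, resynchronizing past garbage and bad checksums, and stopping at
// an incomplete trailing frame so it can be retried once more bytes arrive.
std::vector<Action> extractActions(const uint8_t *buf, size_t n)
{
    std::vector<Action> out;
    size_t i = 0;
    while (i + 4 <= n) {
        if (buf[i] != 0xA5) { ++i; continue; }          // resync on garbage
        uint8_t len = buf[i + 2];
        if (i + 4 + len > n) break;                     // incomplete frame
        uint8_t sum = buf[i + 1] ^ len;
        for (uint8_t k = 0; k < len; ++k) sum ^= buf[i + 3 + k];
        if (sum != buf[i + 3 + len]) { ++i; continue; } // bad checksum, resync
        if (buf[i + 1] == 0x80 && len == 5) {           // action frame (assumption)
            const uint8_t *p = buf + i + 3;
            uint32_t t = (uint32_t)p[1] | ((uint32_t)p[2] << 8) |
                         ((uint32_t)p[3] << 16) | ((uint32_t)p[4] << 24);
            out.push_back({ p[0], t });
        }
        i += (size_t)len + 4;                           // skip past good frame
    }
    return out;
}
```

The actual functionality of each selection lives here on the host: dispatch on Action::kind and do the real work, while the Arduino stays a dumb display-and-touch front end.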
Next, share your work. I’d love to see it.