My robot mapping project: how to get the rotation angle of my drone

I am a programmer but not an EE. I am learning EE and doing a robotics project on my own for one of my professors at school. The goal is to create a mapping drone. I have done a lot of research on the topic, and it seems localization is a major problem to attack in robotics. I know I will need some help with this project, and I am somewhat new to using Arduino, but I am getting the hang of it.
My robot is tank style and has brushless motors with encoders on them. My idea was to use the feedback from the motors to calculate a linear vector and then use the angle of orientation to approximate the x,y pose of the drone. What I would really like to know from you Arduino and robot pros is: what would be the best way to get the angle of orientation of my robot? My teacher is funding this, so buying additional sensors isn't a problem. I just want to know the angle the robot is facing so I can combine it with the total distance traveled to update the robot's x,y position. That way the sensor data I collect at each point can be sensibly added to the map it is creating, and the robot will know where it is relative to its starting point at any given time. Thank you for your advice.
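Something like this is what I had in mind for the position update: a minimal dead-reckoning sketch, assuming the per-loop distance comes from the encoders and the heading comes from whatever orientation sensor I end up with (all names here are just placeholders):

```cpp
#include <math.h>

// Estimated pose, updated every control loop (units: meters, radians).
float x = 0.0, y = 0.0;

// deltaDist: distance traveled since the last update (from the encoders).
// headingRad: current heading in radians (from an orientation sensor, TBD).
void updatePose(float deltaDist, float headingRad) {
  x += deltaDist * cos(headingRad);  // project the step onto the x axis
  y += deltaDist * sin(headingRad);  // project the step onto the y axis
}
```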

Have you looked into GPS units? A bit pricey, but they should give you what you need to map an area.

I have, but I didn't want to use them because this is a prelude to entering a competition where the robots are supposed to operate in a GPS-free zone. I was just hoping to use some trig functions to update the x, y. I'm just unsure how to actually measure the angle. I have looked into magnetometers and accelerometers, but I'm unsure if they will actually be able to calculate the angle my robot has rotated.

The next best thing is a compass sensor (the GY-26 compass sensor module). I have not used one, but check it out and search the forum; there is plenty on that sensor.
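Note that modules like the GY-26 report a heading directly (check the datasheet for the exact serial/I2C protocol). If you instead go with a raw 3-axis magnetometer, the heading roughly falls out of the horizontal field components, something like this untested sketch (sensor assumed level, no tilt compensation or calibration, variable names made up):

```cpp
#include <math.h>

// Convert raw horizontal magnetometer readings to a heading in degrees.
// Assumes the sensor is level; tilt compensation and hard-iron calibration
// are omitted for brevity. mx, my are hypothetical raw axis readings.
float headingDegrees(float mx, float my) {
  float heading = atan2(my, mx);          // angle of the horizontal field, radians
  if (heading < 0) heading += 2 * M_PI;   // wrap into [0, 2*pi)
  return heading * 180.0 / M_PI;          // degrees, relative to the sensor's x axis
}
```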

My robot is tank style and has brushless motors with encoders on them. My idea was to use the feedback from the motors to calculate a linear vector and then use the angle of orientation to approximate the x,y pose of the drone.

How are you going to account for surface slippage if you use the motor encoders in your position calculations?

Here is a reasonably complete analysis of differential steering (there are others): A Tutorial and Elementary Trajectory Model for the Differential Steering System. Tank treads are much harder to model than wheels because of the slippage.
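For reference, the elementary model in that tutorial boils down to something like the sketch below: per-loop left/right track distances from the encoders, a known track separation (the value here is made up), and a small-arc approximation. Slippage is exactly what this ignores, which is why the estimate drifts on treads.

```cpp
#include <math.h>

// Elementary differential-steering odometry.
const float TRACK_SEPARATION = 0.20;  // distance between tracks, meters (assumed)

float x = 0.0, y = 0.0, theta = 0.0;  // pose estimate (meters, radians)

// dLeft, dRight: distance each track moved since the last update (from encoders)
void updateOdometry(float dLeft, float dRight) {
  float dCenter = (dLeft + dRight) / 2.0;              // forward motion of the robot center
  float dTheta  = (dRight - dLeft) / TRACK_SEPARATION; // change in heading
  x += dCenter * cos(theta + dTheta / 2.0);            // advance along the average heading
  y += dCenter * sin(theta + dTheta / 2.0);
  theta += dTheta;
}
```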

zoomkat:
How are you going to account for surface slippage if you use the motor encoders in your position calculations?

I hadn't really thought about surface slippage. This is the first robot I have built; I'm more of a programmer. I haven't really been able to come up with any other good ways to actually localize. Do you have any suggestions on the best way for me to localize? Maybe combine the encoder data with accelerometer data? Localizing has really just thrown me for a loop. I really appreciate the feedback and any ideas.
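For example, I was picturing something like a complementary filter to blend a fast-but-drifting rotation estimate (integrated gyro rate, or the heading change implied by the encoders) with an absolute reference such as a compass. Just a rough sketch with made-up names; angle wrap-around at 0/2*pi isn't handled here:

```cpp
// Complementary filter: ALPHA near 1 trusts the short-term prediction more,
// while the (1 - ALPHA) share of the compass reading pulls drift back out.
const float ALPHA = 0.98;

float fusedHeading = 0.0;  // radians

// gyroRate: angular rate in rad/s, dt: loop period in seconds,
// compassHeading: absolute heading in radians from the magnetometer
void fuseHeading(float gyroRate, float dt, float compassHeading) {
  float predicted = fusedHeading + gyroRate * dt;                      // short-term prediction
  fusedHeading = ALPHA * predicted + (1.0 - ALPHA) * compassHeading;   // drift correction
}
```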

Localizing has really just thrown me for a loop.

Welcome to the club! The determination of position and orientation of a robot is MUCH harder than it seems at first.

I haven't really been able to come up with any other good ways to actually localize.

If there were a really easy/simple way, you would already see everyone using it.