I've looked at several navigation methods, such as SLAM and WAVEFRONT, and frankly, they're extremely complex. It seems to me that a lot of the concrete implementations rely on odometry. However, odometry is generally unreliable.
Is there a way to calculate distance moved based on original positioning without utilizing odometry?
Yeah - it's called SLAM - but you already knew that.
I've seen some setups using a Kinect to recognize glyphs and such, but that's a little expensive for me. It also relies on sticking glyphs/beacons everywhere. I'd prefer non-beacon-based navigation/mapping.
This is SLAM too. The only difference is that true SLAM relies on recognising rooms and such based on their layout, and on the robot's position within that layout as remembered from previous mapping runs, whereas a glyph/beacon system does more or less the same thing, except it utilises an "artificial" and "known" feature to identify the location.
I've had some thoughts about utilizing sonar 'topography', i.e., defining a region by sweeping a PING sensor around on a servo. Since I have a heading from a compass unit, I can assign that sweep to an array. If I move forward, say, 10 cm (which I determine with the PING sensor), I take another sweep. If I turn, I know by how much via my compass heading, and so on and so forth.
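To make the sweep idea concrete, here's a minimal sketch of turning one servo sweep into world-frame points you could store in a map. It assumes distances in cm, angles in degrees, and a heading convention of 0° along +x with angles increasing counter-clockwise; the function name and parameters are mine, not from any particular library:

```python
import math

def sweep_to_points(readings, servo_angles_deg, robot_x, robot_y, heading_deg):
    """Convert one sonar sweep into world-frame (x, y) points.

    readings         -- PING distances in cm (None where no echo returned)
    servo_angles_deg -- servo angle of each reading, relative to robot front
    robot_x, robot_y -- current position estimate in cm
    heading_deg      -- absolute compass heading of the robot front
    """
    points = []
    for dist, servo_deg in zip(readings, servo_angles_deg):
        if dist is None:
            continue  # no echo at this servo angle; skip it
        # Absolute bearing of this reading = robot heading + servo offset
        rad = math.radians(heading_deg + servo_deg)
        points.append((robot_x + dist * math.cos(rad),
                       robot_y + dist * math.sin(rad)))
    return points
```

Accumulating these points across sweeps, shifted by your estimated motion between sweeps, gives you a crude point-cloud map; the hard part (which is where SLAM comes in) is correcting the drift in those position estimates.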
I feel like I'm on to something, but I don't quite know what. Maybe mapping/navigation based on wall following?
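For what a wall follower might look like in its simplest form, here's a hedged sketch of a bang-bang steering decision based on a side-facing range reading. The target distance and deadband values are arbitrary placeholders; a real robot would more likely use proportional control on the error:

```python
def wall_follow_step(side_dist_cm, target_cm=20.0, deadband_cm=2.0):
    """Choose a steering action to hold a wall at target_cm on the right.

    Returns 'right' (steer toward the wall), 'left' (steer away),
    or 'straight'. Bang-bang control: crude but easy to reason about.
    """
    error = side_dist_cm - target_cm
    if error > deadband_cm:
        return 'right'    # too far from the wall, close the gap
    if error < -deadband_cm:
        return 'left'     # too close, open the gap
    return 'straight'     # within the deadband, hold course
```

Combined with the compass and sweep idea above, logging which action you took at each step would give you a rough trace of the wall's outline.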
This, too, is SLAM, whether you know/understand it or not. I suggest you read further on SLAM, and hopefully you can understand the mathematics behind it (don't ask me, I've never worked with it, and I agree it is very complex). If the mathematics are beyond your current knowledge, you'll need to learn enough of that to understand what is going on. One little step at a time; don't expect to know it all in one shot.