Finding the error in accelerometer readings

Hello,

I’m trying to calibrate the accelerometer readings of my MPU6050 IMU by finding the zero offset. I placed the IMU flat on a surface and followed this tutorial: Arduino and MPU6050 Accelerometer and Gyroscope Tutorial.
But I can’t figure out what this code means:

AccErrorX = AccErrorX + ((atan((AccY) / sqrt(pow((AccX), 2) + pow((AccZ), 2))) * 180 / PI));
AccErrorY = AccErrorY + ((atan(-1 * (AccX) / sqrt(pow((AccY), 2) + pow((AccZ), 2))) * 180 / PI));
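
For context, if I’m reading the tutorial right, these two lines sit inside a loop that averages 200 readings. Here is a rough sketch of that context as I understand it (readAccel() is a placeholder I made up for whatever fills AccX/AccY/AccZ with scaled readings in g; it is not a real library call):

// Rough sketch of the tutorial's calibration loop, as I understand it (not verbatim).
// readAccel() is a made-up placeholder, not a real library function.
void readAccel(float &x, float &y, float &z);  // fills in scaled readings, in g

float AccErrorX = 0, AccErrorY = 0;

void calibrateAccel() {
  for (int c = 0; c < 200; c++) {
    float AccX, AccY, AccZ;
    readAccel(AccX, AccY, AccZ);

    // The two lines I'm asking about:
    AccErrorX = AccErrorX + ((atan((AccY) / sqrt(pow((AccX), 2) + pow((AccZ), 2))) * 180 / PI));
    AccErrorY = AccErrorY + ((atan(-1 * (AccX) / sqrt(pow((AccY), 2) + pow((AccZ), 2))) * 180 / PI));
  }

  AccErrorX = AccErrorX / 200;  // average over the 200 samples
  AccErrorY = AccErrorY / 200;
}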

I managed to figure out this part:

sqrt(pow((AccX), 2) + pow((AccZ), 2))

which is like applying the Pythagorean theorem to get the length of the hypotenuse formed by the X and Z components. But then, how can an arctangent be taken between the Y reading and this hypotenuse? It doesn’t make sense to me.
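
To make the question concrete, here is a quick desk check in plain C++ with made-up readings in g (the board nominally tilted about 10 degrees around the X axis):

#include <math.h>
#include <stdio.h>

int main() {
  // Made-up reading: gravity mostly on Z, a little on Y from the tilt
  float AccX = 0.0, AccY = 0.17, AccZ = 0.98;

  // The Pythagoras part I understood: length of the X and Z components
  float len = sqrt(pow(AccX, 2) + pow(AccZ, 2));

  // The part I don't get: arctan of the Y reading over that length,
  // converted from radians to degrees
  float angleX = atan(AccY / len) * 180.0 / M_PI;

  printf("len = %.3f, angleX = %.1f deg\n", len, angleX);  // prints ~9.8 deg
  return 0;
}

So with my made-up numbers the formula turns the reading into an angle of roughly 9.8 degrees, but I still don’t see why arctan of one axis over that hypotenuse is the right construction.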

A much better tutorial: https://thecavepearlproject.org/2015/05/22/calibrating-any-compass-or-accelerometer-for-arduino/