Finding the error in accelerometer readings

Hello,

I'm trying to calibrate the accelerometer readings of my MPU6050 IMU by finding the zero offset. I placed the IMU flat on a surface and followed this tutorial: https://howtomechatronics.com/tutorials/arduino/arduino-and-mpu6050-accelerometer-and-gyroscope-tutorial/
But I can't figure out what this code means:

AccErrorX = AccErrorX + ((atan((AccY) / sqrt(pow((AccX), 2) + pow((AccZ), 2))) * 180 / PI));
AccErrorY = AccErrorY + ((atan(-1 * (AccX) / sqrt(pow((AccY), 2) + pow((AccZ), 2))) * 180 / PI));
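
From what I can tell, these two lines sit inside the tutorial's error-calculation function, which averages a couple hundred readings while the board sits flat. Here is a sketch of how I understand the surrounding code (my own paraphrase and naming, e.g. calibrateAccel(), not the tutorial's exact source; the 16384.0 divisor assumes the MPU6050's default ±2 g range):

#include <Wire.h>

const int MPU = 0x68;                      // MPU6050 I2C address
float AccX, AccY, AccZ;
float AccErrorX = 0, AccErrorY = 0;

void calibrateAccel() {
  for (int c = 0; c < 200; c++) {
    Wire.beginTransmission(MPU);
    Wire.write(0x3B);                      // first accelerometer register (ACCEL_XOUT_H)
    Wire.endTransmission(false);
    Wire.requestFrom(MPU, 6, true);        // 6 bytes: X, Y, Z, two bytes each
    AccX = (Wire.read() << 8 | Wire.read()) / 16384.0;
    AccY = (Wire.read() << 8 | Wire.read()) / 16384.0;
    AccZ = (Wire.read() << 8 | Wire.read()) / 16384.0;
    // the two lines I'm asking about, accumulated over all samples:
    AccErrorX += atan(AccY / sqrt(pow(AccX, 2) + pow(AccZ, 2))) * 180 / PI;
    AccErrorY += atan(-1 * AccX / sqrt(pow(AccY, 2) + pow(AccZ, 2))) * 180 / PI;
  }
  AccErrorX /= 200;                        // average over the 200 samples; with the
  AccErrorY /= 200;                        // board flat, the leftover is the offset
}

void setup() {
  Serial.begin(9600);
  Wire.begin();
  Wire.beginTransmission(MPU);
  Wire.write(0x6B);                        // PWR_MGMT_1 register
  Wire.write(0);                           // wake the MPU6050 up
  Wire.endTransmission(true);
  calibrateAccel();
  Serial.print("AccErrorX: "); Serial.println(AccErrorX);
  Serial.print("AccErrorY: "); Serial.println(AccErrorY);
}

void loop() {}

So the loop itself makes sense to me (average 200 flat readings, use the leftover as the offset); it's the atan expression I'm stuck on.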

I managed to figure out this part:

sqrt(pow((AccY), 2) + pow((AccZ), 2))

which is like applying the Pythagorean theorem to get the length of the hypotenuse between the Y and Z axes. But then, how does taking the arctan of one axis over this hypotenuse give an angle? It doesn't make sense to me.
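
To poke at it, I evaluated the second expression standalone with made-up readings (plain C++; the values are hypothetical) for a board tilted 30° about the Y axis, so gravity splits between the X and Z axes:

#include <math.h>
#include <stdio.h>

int main() {
  const double PI = 3.14159265358979;
  // Hypothetical readings in g: board tilted 30 degrees about the Y axis,
  // so AccX = -sin(30) = -0.5, AccY = 0, AccZ = cos(30) ~ 0.866
  double AccX = -0.5, AccY = 0.0, AccZ = 0.866;
  double angle = atan(-1 * AccX / sqrt(pow(AccY, 2) + pow(AccZ, 2))) * 180.0 / PI;
  printf("angle = %.1f degrees\n", angle);   // prints: angle = 30.0 degrees
  return 0;
}

It does hand back the 30° tilt, so the expression clearly recovers an angle; what I'm missing is the geometry of why one axis over the hypotenuse of the other two, run through arctan, produces it.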

Much better tutorial: "How to calibrate a compass (and accelerometer) with Arduino" (Underwater Arduino Data Loggers).