Hi, I want to make a pan-tilt camera using an ESP32-CAM (AI Thinker) and an FTDI adapter. I want the servos to move/track the object according to its position (x, y).
I use Edge Impulse for the machine learning part, and I uploaded photos of the object I want to track and identify.
First, I took the existing Edge Impulse code and, in order to find the center point, added a line that calculates the center using the width and height of the bounding_boxes.
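The idea was roughly something like this (not my exact line, just a sketch; bb is one entry of result.bounding_boxes):

int centerX = bb.x + bb.width / 2;    // horizontal center of the box, in pixels
int centerY = bb.y + bb.height / 2;   // vertical center of the box, in pixels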
But the servos don't move accordingly; they are always a bit off.
Then shawn_edgeimpulse from the Edge Impulse team answered my question:
Bounding box information (x, y, width, height) are given in pixels. So, when you calculate center coordinates (centerX and centerY), those are pixels (from 0 to whatever the width/height of your image is). Assuming you are using the Arduino servo library, the Servo.write() method expects an angle in degrees (0 to 180). You need to translate center coordinates in pixels to an angle to pass to your servos.
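If I understand him correctly, the translation would be something like this (a rough sketch, assuming my 48x48 model input and standard 0-180 degree servos; map() is the built-in Arduino function and pixelToAngle is just a name I made up):

// scale a pixel coordinate (0 .. image size) to a servo angle (0 .. 180 degrees)
int pixelToAngle(int pixel, int image_size) {
    return map(pixel, 0, image_size, 0, 180);
}

// e.g. with the center coordinates of a bounding box:
// servoX.write(pixelToAngle(centerX, 48));   // pan
// servoY.write(pixelToAngle(centerY, 48));   // tilt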
So I modified the code to look like this, but it is still not working ;/
.
.
.
int servo_min_angle = 0;      // servo range in degrees
int servo_max_angle = 180;
int image_width = 48;         // model input size in pixels
int image_height = 48;
.
.
.
void loop()
{
    // go through every bounding box returned by the classifier
    for (size_t ix = 0; ix < result.bounding_boxes_count; ix++) {
        auto bb = result.bounding_boxes[ix];
        if (bb.value == 0) {
            continue;   // skip empty detections
        }
        // map the box position in pixels to a servo angle in degrees
        int servoX_angle = map(bb.x, 0, image_width, servo_min_angle, servo_max_angle);
        int servoY_angle = map(bb.y, 0, image_height, servo_min_angle, servo_max_angle);
        servoX.write(servoX_angle);
        servoY.write(servoY_angle);
        ei_printf(" %s (%f) [ x: %u, y: %u, width: %u, height: %u ]\n", bb.label, bb.value, bb.x, bb.y, bb.width, bb.height);
    }
}