Hi, I have a Pro Micro that I want to use in a mouse project. My first step is to move the mouse cursor to the centre of the screen. I use the following simple algorithm to do this:
- Define HRES and VRES
- Move the cursor HRES pixels right to hit the right edge of the screen.
- Move the cursor VRES pixels up to hit the top of the screen.
- Now we are at a known location: the top right corner of the screen.
- Move cursor left HRES/2 pixels.
- Move cursor down VRES/2 pixels.
Here is the code that implements this:
```cpp
#include <Mouse.h>

#define MOVE_DELAY 1  // in milliseconds
#define HRES 1920
#define VRES 1080

void setup() {
  Mouse.begin();
}

void recenter() {
  // First get to the top right corner
  for (int i = 0; i < HRES; i++) {
    Mouse.move(1, 0);
    delay(MOVE_DELAY);
  }
  for (int i = 0; i < VRES; i++) {
    Mouse.move(0, -1);  // negative y is up in HID mouse reports
    delay(MOVE_DELAY);
  }
  delay(1000);

  // Then walk back to the centre
  for (int i = 0; i < HRES / 2; i++) {
    Mouse.move(-1, 0);
    delay(MOVE_DELAY);
  }
  for (int i = 0; i < VRES / 2; i++) {
    Mouse.move(0, 1);
    delay(MOVE_DELAY);
  }
  delay(10000);
}

void loop() {
  recenter();
}
```
I am seeing the following problem. If I set MOVE_DELAY to 10, the algorithm works. However, if I set it to 1, it appears to overshoot: when the cursor moves back from the top right corner, it passes the centre of the screen and ends up too far to the left and too far down (too close to the top left corner). I find this very strange, don't you?
Why does MOVE_DELAY = 1 cause this overshoot?