Microbit V2 delay slows down when using Visual Studio Code

Hi,
I'm seeing a strange phenomenon when using the microbit V2 with Visual Studio Code in debug mode: the delay() function becomes very slow after a few seconds. The demo program below blinks an LED with the expected 1000 ms delay for about 3 seconds, but after that the delay grows to roughly 60 seconds per blink. This must have something to do with Visual Studio Code, because when I run the same code without it, delay() does not slow down and the LED keeps blinking at the expected 1-second rate.
Any ideas what might be causing this behavior ?

Kind regards,
Rob van der Ouderaa

#include <Arduino.h>

#define ROW 25   // LED matrix row pin
#define COL 3    // LED matrix column pin

void setup() {
  pinMode(ROW, OUTPUT);
  digitalWrite(ROW, HIGH);
  pinMode(COL, OUTPUT);
  digitalWrite(COL, LOW);
}

void loop() {
  digitalWrite(COL, LOW);
  delay(1000);                       // wait for a second
  digitalWrite(COL, HIGH);
  delay(1000);                       // wait for a second
}
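To quantify the slowdown, I could measure how long delay() actually takes using millis(). A minimal diagnostic sketch might look like the following (assuming the Serial object is available in the micro:bit V2 Arduino core; note that if the debugger also stalls the timer feeding millis(), both readings would be skewed):

```cpp
#include <Arduino.h>

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long t0 = millis();
  delay(1000);                      // request a 1000 ms delay
  unsigned long elapsed = millis() - t0;
  Serial.print("delay(1000) took ");
  Serial.print(elapsed);
  Serial.println(" ms");            // values growing well past 1000 would confirm the slowdown
}
```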

What does this mean?

It means that I start the microbit program from the Visual Studio Code IDE (https://code.visualstudio.com/). It allows me to debug the code running on the microbit: I can single-step through the source code, halt execution, and examine the program while it is running on the microbit.

So you are not using the Arduino IDE? Is the micro bit even Arduino compatible? From a quick search I haven't seen any C++ usage, just Python and some limited visual block thingy.

Sure, the microbit V2 is designed to be Arduino programmable. Arduino is basically just C++ with additional libraries and a very thin layer that calls setup() and loop(). By using the Visual Studio Code IDE and including <Arduino.h> you can program it just like in the Arduino IDE, but with the benefit of a better editor and a real-time debugger. Cool stuff :-).
