I doubt TFLite is still supported. Most, if not all, of the Arduino effort was abandoned several years ago, and the GitHub repository hasn't been updated in over 2 years.
You are much better off learning this stuff using a laptop and Python. There are countless tutorials.
I built my model in Python on Google Colab. I only convert the model to TFLite for inference on the Arduino Nano. The model is built using a sliding window, because I collected the data using the Arduino Nano's accelerometer, capturing 5 distinct gestures, with 6 samples per gesture.
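For reference, the sliding-window idea can be sketched as a small host-side helper (plain C++; countWindows is an illustrative name, not part of the sketch or any library):

```cpp
#include <cstddef>

// Number of complete windows of length `windowSize` that fit into a
// recording of `totalSamples` samples when the window advances by
// `stride` samples each step (stride == windowSize means no overlap).
size_t countWindows(size_t totalSamples, size_t windowSize, size_t stride) {
  if (totalSamples < windowSize || stride == 0) return 0;
  return (totalSamples - windowSize) / stride + 1;
}
```

With 100 recorded samples and a window of 50, a stride of 50 yields 2 non-overlapping windows, while a stride of 25 yields 3 overlapping ones.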
for (int i = 0; i < outputSize; i++) {
  // Ensure output index is within range
  if (i >= outputSize) {
    Serial.println("Error: Output tensor index out of range!");
    continue;
  }
Then you should have no problem explaining how the if (i >= outputSize) { block gets executed, as it apparently does from the output you've shown. I've looked at your code snippet, and it's not apparent to me how i could ever be greater than or equal to outputSize in the loop. Are you passing it by reference to the modelGetOutput function (which isn't shown) and modifying it there? There doesn't seem to be anywhere in the loop body where it's modified outside of the for statement.
Please, as you can see from the code, I did most of the coding myself. I only used ChatGPT when I encountered the error and used its version. However, I saved my main code somewhere for situations like this. Please help if you can.
In the example you will find the relevant constant declaration and this code at the beginning of loop():
// see constant declaration!
const float accelerationThreshold = 2.5; // Threshold (in G values) to detect a "gesture" start

// ... later in the example ...

// wait for a significant movement
while (true) {
  if (IMU.accelerationAvailable()) {
    // read linear acceleration
    IMU.readAcceleration(aX, aY, aZ);
    // compute absolute value of total acceleration
    float aSum = fabs(aX) + fabs(aY) + fabs(aZ);
    // if total absolute acceleration is over the threshold a gesture has started
    if (aSum >= accelerationThreshold) {
      samplesRead = 0; // init samples counter
      break;           // exit from waiting cycle
    }
  }
}
Samples are only taken when the sum of the absolute values of aX, aY and aZ reaches or exceeds a given threshold. Using fabs() is mandatory because the values can be negative.
Your sketch seems to be a modified version of the example, but this essential part is missing ...
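The trigger condition can be isolated into a small pure function and checked on a host machine (a sketch only; gestureStarted is an illustrative name, not part of the library):

```cpp
#include <cmath>

// A gesture is considered started when the sum of the absolute
// acceleration components reaches the threshold (2.5 G, as in the
// example sketch).
bool gestureStarted(float aX, float aY, float aZ, float threshold = 2.5f) {
  float aSum = std::fabs(aX) + std::fabs(aY) + std::fabs(aZ);
  return aSum >= threshold;
}
```

Resting flat, the board reads roughly (0, 0, 1) G, a sum well below 2.5, so sampling only begins on a deliberate movement.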
In your sketch there are several unnecessary (but not hindering) lines:
This can be deleted, since expectedInputSize and actualInputSize are calculated from the same constants windowSize and features. They will always be equal (unless you expect a defective controller or compiler):
// const int windowSize = 50; // Number of timesteps
// const int features = 3; // 3 features: x, y, z
// const int expectedInputSize = windowSize * features;
int actualInputSize = windowSize * features;
if (expectedInputSize != actualInputSize) {
  Serial.println("Error: Model input size mismatch!");
  Serial.print("Expected: "); Serial.println(expectedInputSize);
  Serial.print("Actual: "); Serial.println(actualInputSize);
  return;
}
This if clause does not make sense:
// const int outputSize = 5; // Expected number of output classes
for (int i = 0; i < outputSize; i++) {
  // Ensure output index is within range
  if (i >= outputSize) {
    Serial.println("Error: Output tensor index out of range!");
    continue;
  }
As outputSize is a constant and i is only controlled by the for loop, the value of i can never become equal to or greater than outputSize inside the loop body. The if clause can be deleted.
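That the guard is dead code can even be demonstrated on a host machine (illustrative only; countGuardHits is not from the sketch):

```cpp
// Counts how many times the redundant guard `i >= outputSize` fires
// inside the loop. Because `i` runs from 0 to outputSize - 1 and is
// not modified anywhere else, the count is always 0.
int countGuardHits(int outputSize) {
  int hits = 0;
  for (int i = 0; i < outputSize; i++) {
    if (i >= outputSize) {
      hits++; // unreachable
    }
  }
  return hits;
}
```

Whatever value outputSize takes, the branch never executes, so removing it changes nothing.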
If you apply the functionality as in the example, the problems should be resolved.
#include <Arduino_BMI270_BMM150.h> // IMU Sensor Library for Arduino Nano 33 BLE Rev.2
#include <ArduTFLite.h>
#include "model.h"

const float accelerationThreshold = 2.5; // Threshold (in G values) to detect a "gesture" start
const int numSamples = 50;               // Number of samples for a single gesture
int samplesRead;                         // sample counter
const int inputLength = 150;             // dimension of input tensor (3 values * 50 samples)

constexpr int tensorArenaSize = 8 * 1024;
alignas(16) byte tensorArena[tensorArenaSize];

// a simple table to map gesture labels
const char* GESTURES[] = {
  "Horizontal Shake",
  "Vertical Shake",
  "Letter S",
  "Circle",
  "Letter M"
};

#define NUM_GESTURES (sizeof(GESTURES) / sizeof(GESTURES[0]))

void setup() {
  Serial.begin(9600);
  while (!Serial);

  // init IMU sensor
  if (!IMU.begin()) {
    Serial.println("IMU sensor init failed!");
    while (true); // stop program here.
  }

  // print IMU sampling frequency
  Serial.print("Accelerometer sampling frequency = ");
  Serial.print(IMU.accelerationSampleRate());
  Serial.println(" Hz");
  Serial.println();

  Serial.println("Init model..");
  if (!modelInit(model, tensorArena, tensorArenaSize)) {
    Serial.println("Model initialization failed!");
    while (true);
  }
  Serial.println("Model initialization done.");
}

void loop() {
  float aX, aY, aZ;

  // wait for a significant movement
  while (true) {
    if (IMU.accelerationAvailable()) {
      // read linear acceleration
      IMU.readAcceleration(aX, aY, aZ);
      // compute absolute value of total acceleration
      float aSum = fabs(aX) + fabs(aY) + fabs(aZ);
      // if total absolute acceleration is over the threshold a gesture has started
      if (aSum >= accelerationThreshold) {
        samplesRead = 0; // init samples counter
        break;           // exit from waiting cycle
      }
    }
  }

  // reading cycle of all samples for current gesture
  while (samplesRead < numSamples) {
    // check if a sample is available
    if (IMU.accelerationAvailable()) {
      // read acceleration values
      IMU.readAcceleration(aX, aY, aZ);
      // normalize sensor data because model was trained using normalized data
      aX = (aX + 4.0) / 8.0;
      aY = (aY + 4.0) / 8.0;
      aZ = (aZ + 4.0) / 8.0;
      // put the 3 values of current sample in the proper position
      // in the input tensor of the model
      modelSetInput(aX, samplesRead * 3 + 0);
      modelSetInput(aY, samplesRead * 3 + 1);
      modelSetInput(aZ, samplesRead * 3 + 2);
      samplesRead++;
      // once all samples are acquired, run inference
      if (samplesRead == numSamples) {
        if (!modelRunInference()) {
          Serial.println("RunInference Failed!");
          return;
        }
        // get output values and print as percentage
        for (int i = 0; i < NUM_GESTURES; i++) {
          Serial.print(GESTURES[i]);
          Serial.print(": ");
          Serial.print(modelGetOutput(i) * 100, 2);
          Serial.println("%");
        }
        Serial.println();
      }
    }
  }
}
I got the following output:
Accelerometer sampling frequency = 99.84 Hz
Init model..
Model initialization done.
Input tensor index out of range!
Letter S: Output tensor index out of range!
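As an aside, the (a + 4.0) / 8.0 step in the sketch maps raw readings in the [-4, 4] G range onto [0, 1]. The same formula as a host-side sketch (normalizeG is an illustrative name):

```cpp
// Maps a raw accelerometer reading in [-4, 4] G to [0, 1],
// matching the (a + 4.0) / 8.0 normalization used in the sketch.
// This must mirror whatever scaling was applied to the training data.
float normalizeG(float a) {
  return (a + 4.0f) / 8.0f;
}
```

So -4 G maps to 0, 0 G to 0.5, and +4 G to 1, which only matches the model if the training data in Colab was scaled the same way.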
Please note that I will not have time for an in-depth analysis of this quite extensive library.
You may try to increase the buffer size as mentioned above.
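For example, something like the following (16 KB is a guess, not a verified value; uint8_t stands in for Arduino's byte type so the fragment compiles on a host):

```cpp
#include <cstdint>

// Tensor arena doubled from 8 KB to 16 KB; increase further if
// inference still misbehaves and the board's RAM allows it.
constexpr int tensorArenaSize = 16 * 1024;
alignas(16) uint8_t tensorArena[tensorArenaSize];
```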
If that does not help, you may also check that the model you are using really fits your application. If you use the model.h delivered with the example code, I assume it has been set up and trained for just two(!) gestures, not five ...
Thank you for your time and replies. I'm using this sketch with my own trained model.h, but I will increase the buffer size and see if it works for me.