Mac can't handle the amount of serial data that Windows can

Hi everyone

So, I'm building an OSC controller with a Teensy 3.6. I'm reading 16 fader (slide potentiometer) values and sending them to a computer whenever a fader is moved. The code works as expected on Windows machines, but Macs don't seem to be able to handle the amount of data I'm sending. If I make big jumps with a fader there is a very noticeable lag, and if I do that with more than one fader it sometimes takes the Mac seconds to process all the sent values. Sometimes it recovers and I can receive a single fader's signal normally, but sometimes the lag persists or gets continually worse, even if I don't make any crazy movements.
I see that behavior both in the software the control messages are meant for (Max) and in the Arduino serial monitor. I have tried it on two different MacBooks.

Reducing the resolution of the ADC (and hence the amount of data sent) helps, but I have to reduce it by about 2/3 (!!) for it to work as smoothly as on Windows (and I have yet to see so much as a stutter on a Windows machine).

I'm assuming Apple's USB implementation doesn't just generally suck, so what am I missing? What am I doing wrong?

#include <OSCMessage.h>
#include <SLIPEncodedUSBSerial.h>
#include <ResponsiveAnalogRead.h>

const int faderAmount = 16;
const int analogInputPins[16] = {A14, A13, A16, A17, A18, A19, A20, A1, A2, A3, A4, A5, A6, A7, A8, A9};
const char* oscAddresses[16] = {"/fader1", "/fader2", "/fader3", "/fader4", "/fader5", "/fader6", "/fader7",
                                "/fader8", "/fader9", "/fader10", "/fader11", "/fader12", "/fader13", "/fader14",
                                "/fader15", "/fader16"};

class Fader {
  public:
    Fader(int _inputPin, const char* _oscAddress);
    void handleInput(SLIPEncodedUSBSerial& SLIPSerial);

  private:
    int inputPin;
    const char* oscAddress;
    ResponsiveAnalogRead analog;
};

Fader::Fader(int _inputPin, const char* _oscAddress)
  : inputPin(_inputPin), oscAddress(_oscAddress), analog(_inputPin, true)
{
  pinMode(inputPin, INPUT);   // not strictly required: analogRead() configures the pin
  analog.enableEdgeSnap();    // lets the smoothed value reach the extremes quickly
}

void Fader::handleInput(SLIPEncodedUSBSerial& SLIPSerial) {
  analog.update();

  /* send an OSC message only when the smoothed reading has changed */
  if (analog.hasChanged()) {
    OSCMessage msg(oscAddress);
    msg.add(analog.getValue());
    SLIPSerial.beginPacket();
    msg.send(SLIPSerial);
    SLIPSerial.endPacket();
  }
}

const int ledPin = 13;
Fader* fader[faderAmount];
SLIPEncodedUSBSerial SLIPSerial(Serial);

void setup() {
  pinMode(ledPin, OUTPUT);
  SLIPSerial.begin(9600);  // the baud rate is ignored by the Teensy's native USB serial

  /* initialize the fader objects */
  for (int i = 0; i < faderAmount; i++) {
    fader[i] = new Fader(analogInputPins[i], oscAddresses[i]);
  }

  digitalWrite(ledPin, HIGH);  // turn on the LED so we know the controller is powered
}

void loop() {
  for (int i = 0; i < faderAmount; i++) {
    fader[i]->handleInput(SLIPSerial);
  }
}

How old is your Mac? Most MacBooks are said to have far superior USB integration compared to PCs.

What is this mysterious software you are running on the computer?

Why do you think that reducing the resolution reduces the data? If the data type isn't smaller, then the amount of data isn't smaller. Reducing the resolution just reduces the size of the number. If the new number doesn't fit into a smaller data type, then there is no reduction in the data transferred.

The Macs I tested this on are from 2015.

The controller is supposed to control Max 7.

What I mean by reducing data is that when the resolution of the ADC is lower (say 8-bit instead of 10-bit) and I jump from bottom to top with a fader, then for that jump only 256 values are sent to the computer instead of 1024. Hence less load on the computer...

But an int is an int is an int, and an int is 16 bits. It doesn't matter what value you set it to; an int is 16 bits. If you reduce the resolution so as to only use the values 0-255 instead of 0-1023, you are sending 0x00FF instead of 0x03FF. You are still sending the same number of bits to the Mac, so the UART on the Arduino and the serial connection on the computer handle the same number of bits per second.

Per the reviews and discussions, USB integration in Macs of the past decade is superior to that in PCs: something about being closer to the bus, with fewer chipsets to pass through between the port and the CPU. Admittedly, I haven't followed the details of things like this for decades, so I can only relay what I have read in published sources (not blogs or forums).

Are you running macOS or Windows on the Mac?

The USB 3.0 connection (high transfer rate) on my Mac consistently outperforms my AMD PC; both are running Windows.

But an int is an int is an int, and an int is 16 bits. It doesn't matter what value you set it to; an int is 16 bits. If you reduce the resolution so as to only use the values 0-255 instead of 0-1023, you are sending 0x00FF instead of 0x03FF. You are still sending the same number of bits to the Mac, so the UART on the Arduino and the serial connection on the computer handle the same number of bits per second.

I completely agree on the first part! But I don't mind sending integers; it's the number of integers I'm sending that's problematic. Sending 256 integers per second instead of 1024 (times however many faders I'm moving) is what makes the problem go away.

I think that's just what I'm going to do for now, until I'm smarter and have more experience with these kinds of things :slight_smile: Thank you for your help anyway!

fricia:
I completely agree on the first part! But I don't mind sending integers; it's the number of integers I'm sending that's problematic. Sending 256 integers per second instead of 1024 (times however many faders I'm moving) is what makes the problem go away.

I think that's just what I'm going to do for now, until I'm smarter and have more experience with these kinds of things :slight_smile:

You need to slow down and think about what is going on.

Changing the resolution of the ADC does not change the number of integers; it changes the values of the integers. If you could change the resolution to 2 bits, it would still send one 16-bit integer per fader.

The changes you are describing have no impact on the number of integers or the amount of data being transmitted. If there is actually less data being sent, you are doing something else that you have not posted (and I repeat, you have not written anything to indicate that).

Changing the resolution of the ADC does not change the number of integers; it changes the values of the integers. If you could change the resolution to 2 bits, it would still send one 16-bit integer per fader.

Yes, absolutely! But let's say I move 3 faders from bottom to top in 500 ms. The Teensy sends 3 × 1024 integers to the computer in those 500 ms. With 8-bit ADC resolution it only sends 3 × 256 integers in 500 ms: less data in the same amount of time. That's what I was talking about, and that's what "solves" the problem.
Or I could just add a delay after every loop to achieve a similar result... which is what I've done and accepted as a solution.

Thank you all for your help!

fricia:
Yes, absolutely! But let's say I move 3 faders from bottom to top in 500 ms. The Teensy sends 3 × 1024 integers to the computer in those 500 ms. With 8-bit ADC resolution it only sends 3 × 256 integers in 500 ms: less data in the same amount of time. That's what I was talking about, and that's what "solves" the problem.
Or I could just add a delay after every loop to achieve a similar result... which is what I've done and accepted as a solution.

Thank you all for your help!

Absolutely not. Where are you getting these numbers?

fricia:
Hi everyone

So, I'm building an OSC controller with a Teensy 3.6. I'm reading 16 fader (slide potentiometer) values and sending them to a computer whenever a fader is moved.

Each fader is one ADC value irrespective of the resolution. If your system has 16 faders and you move one, you send 16 integers (one integer per fader) to the computer. If you move 16 faders then you send 16 integers (one per fader). No change in the number of integers sent.

If you use 10-bit resolution, then you send 16 integers each with a value between 0 and 1023. If you use 8-bit resolution, then you send 16 integers with values between 0 and 255. No change in the number of integers sent.

Can you post the code for the ResponsiveAnalogRead library used in your original code, please? I would like to know how it is doing the reads.

What I suspect is changing when the resolution is changed is not the response to the computer but the time it takes to update (read) each of the analog inputs. The delay you are seeing is not on the Teensy output side (the connection to the PC) but on the Teensy analog input side (fader to Teensy via the ADC). Remember that a 10-bit ADC conversion takes longer than an 8-bit one, and you have a bunch of ADC reads to do. By reducing the resolution you get more updates to the computer per second, thereby making the fader display on the computer screen smoother.

Could Max 7 be poorly implemented on macOS?