How can I control the output voltage linearly over time?

Hey guys,
I am facing a problem controlling a proportional control valve: there should be a mode where the user can input a pressure rate, which is transformed into a voltage rate, and the voltage increases linearly so that the pressure rises linearly. However, since the Arduino executes very fast and I don't know the clock speed exactly, I can only control the rate with delay() or millis() commands. This leads to a quite bumpy increase, which can falsify the results. An even bigger problem with a delay: it adds a fixed amount of time, so I can only estimate how much delay I need to make one cycle take 1 second, because raising the voltage also takes a short amount of time. Are there any other commands to control the voltage output of the A0 (DAC) pin linearly over time, or do you have a different solution?
The code snippet that matters:

  while (currentPressure < 6.0) {                     // max 6 bar pressure
    targetVoltage = targetVoltage + pressureRate * 0.01 * conversionFactor;  // 0.01 s per step
    if (targetVoltage > 6.0 * conversionFactor) targetVoltage = 6.0 * conversionFactor;
    analogWrite(dacPin, targetVoltage * 1023.0 / 5.0);
    delay(10);
  }

Greetings

How is currentPressure measured?
What is conversionFactor?
What is pressureRate?

a quite bumpy increase

What is the desired bump?

currentPressure is measured from the actual value delivered by the valve. The conversion factor is only there to convert the user input (in pressure/second) into voltage/second for the Arduino. The pressure rate is the speed at which the pressure is supposed to increase, e.g. 0.1 bar/s or 10 psi/s.
The desired bump should be eliminated or at least as small as possible.

Why? Are you banned from using delayMicroseconds() and micros()?

Are you sure that is what causes the bumpiness? The DAC defaults to 8 bits resolution, so only 256 levels are possible. Using analogWriteResolution(12) you can increase this to 4096 values.

I guess I haven't understood the analog/digital conversion yet. I am already working with 10-bit resolution, but I don't know how this affects the bumpiness. Maybe you can explain it to me?
Of course I am not banned from delayMicroseconds() :smiley: but the problem remains: adding microseconds can be useful to estimate one cycle of "increase", but I want one cycle to take an exact value of e.g. 10 ms. Or am I wrong about this?

Greetings

No, by default it is 8 bits. You posted all the code that matters in your original post, and analogWriteResolution() is not there.

The higher the resolution, the higher the number of distinct voltage levels that can be produced by the DAC and so the bumps between levels will be smaller. But there will always be bumps. A DAC can't produce a truly analog voltage, but a high enough resolution DAC can produce something that is close enough to a true analog voltage.

It can't be eliminated.
If you want small as possible then set the resolution to 12 bits and only increase targetVoltage one bit at a time.

This (1023.0) assumes a 10 bit DAC. It should actually be 1024.0 (AFAIK).

You can use micros() to measure the elapsed time more exactly: record micros() at the beginning of the loop, then use another while at the end to wait until micros() exceeds startTime + interval.

That is a really really really easy thing to look up.

F_CPU may tell you..