Can SoftwareSerial cause instability of a sensor node?

Hello Arduino fellows,

I ran into a problem with a LoRa-based sensor node that reads a sensor over a UART interface. I use the standard SoftwareSerial library at 9600 baud. The MCU sleeps to save power, with the sleep time set to 15 minutes. The node runs for about two weeks without any problems, but then suddenly the sensor reading is always zero and, at the same time, the sending interval drops to roughly one minute. This one-minute value appears nowhere in the code. Could SoftwareSerial be causing instability on the ATmega MCU?

Why do you think it is SoftwareSerial that causes the problem?

The problem is likely in the code that you forgot to post.

mbobinger:
I use the standard SoftwareSerial library with a baudrate of 9600. …The sensor can run two weeks without any problems … Could it be a problem of the SoftwareSerial that causes instability

Almost certainly not. The answer lies in your secret code, or maybe in the equally secret power supply.