Looks like if SoftwareSerial fills its buffer, it sets an overflow flag and discards the incoming byte.
You might be better off using hardware serial for the camera. Software serial consumes far more processor cycles than hardware serial: the hardware UART only needs to interrupt once when a byte is received, while software serial has to do timing-critical work for every single bit.
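For instance (a minimal sketch, assuming a board with a spare hardware UART such as a Mega, where the camera would sit on Serial1, pins 18/19 — you haven't said which board you have):

```cpp
// Camera on Serial1 (hardware UART), Serial kept free for USB debug.
void setup() {
  Serial.begin(115200);   // debug link to the PC
  Serial1.begin(115200);  // camera
}

void loop() {
  // The UART interrupt has already buffered any received bytes;
  // all the loop does is drain them.
  while (Serial1.available()) {
    Serial.write(Serial1.read());  // echo to the PC for now
  }
}
```

On an Uno there's only the one hardware port, and it's shared with the USB connection, so this trick needs a board with a second UART.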
At 115200 baud, with a 10-bit frame (start bit, 8 data bits, stop bit), you are getting a byte every 1/11520 seconds (every ~86 µs). That isn't a lot of time to be processing it, especially if you are also chewing up processor cycles handling every one of those bits.
But let's do one thing at a time. Read a batch into an array. Print it. Check it's OK. You won't need the debugging displays later, will you?
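Something along these lines, say — the batch size is an assumption, and I'm still assuming the camera is on Serial1:

```cpp
// Collect up to BATCH bytes from the camera, then dump them in hex
// so you can check them by eye. The hex printing is just a
// debugging display; drop it once things look right.
const size_t BATCH = 32;  // assumed size, pick what suits the camera
uint8_t buf[BATCH];

void setup() {
  Serial.begin(115200);
  Serial1.begin(115200);
}

void loop() {
  size_t n = 0;
  while (n < BATCH && Serial1.available()) {
    buf[n++] = (uint8_t)Serial1.read();
  }
  for (size_t i = 0; i < n; i++) {
    Serial.print(buf[i], HEX);
    Serial.print(' ');
  }
  if (n > 0) Serial.println();
}
```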
The processor runs 16 clock cycles per microsecond, so you get about 1376 clock cycles (16 × 86) per incoming byte. Some instructions take 1 cycle, some take 2, a few take more, so you can do a reasonable amount in the time between bytes.
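If you want that budget spelled out, here's the same arithmetic as a sketch (F_CPU is defined by the Arduino toolchain; the exact figure lands near 1389 because the 86 µs is rounded down):

```cpp
const unsigned long FRAME_BITS = 10UL;  // start + 8 data + stop

void setup() {
  Serial.begin(115200);
  unsigned long bytesPerSec   = 115200UL / FRAME_BITS;   // 11520 bytes/s
  unsigned long usPerByte     = 1000000UL / bytesPerSec; // 86 (really 86.8)
  unsigned long cyclesPerByte = F_CPU / bytesPerSec;     // 1388 on a 16 MHz AVR
  Serial.print(F("us per byte: "));     Serial.println(usPerByte);
  Serial.print(F("cycles per byte: ")); Serial.println(cyclesPerByte);
}

void loop() {}
```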
Again though, using hardware serial gives you somewhat more leeway.