Frequency and interrupts


I have a general question about how interrupts work. Say I have two interrupts, Ia and Ib. Ia fires at a very high frequency (10 MHz) and Ib at around 200 Hz. I would be calling Ib from within Ia, and this call happens at random times, based on certain results in Ia.

So, my question is: how badly would that low-frequency interrupt affect the overall system? Is there any way I can reduce the delay caused by Ib? Maybe by calling it only once every 500 ms or something like that?

I'm confused by your use of the term "interrupt" - I don't think you're using it correctly. You don't call an interrupt from software, which is what you describe doing - interrupts are triggered by external events.

Can you explain exactly what you're hoping to do?

I have a program which has the CAN protocol in it. This code runs at 10 MHz (the CPU frequency). I need to receive roll, pitch, and yaw information from an IMU via UART. However, the IMU sends data at only 200 Hz (side note: the microcontroller's UART can receive at up to 8.5 MHz). I need to use this IMU information inside the CAN-related code, which contains other protocols as well, but I want to receive it in such a way that the overall loop frequency is affected as little as possible.

If you've got a 10 MHz serial stream, and the UART only works at 8.5 MHz, you've got a problem.

I can make the UART work at 10 MHz, but the IMU only sends data at 200 Hz.