When developing code to interface with networks of devices in a home or commercial setting, do you prefer server-side or node-side computing?
In other words, does your node process its own data and decide which action to execute, or does it send the data to a server application that does the computation and issues a command back to the node telling it which action to take?
For example: I see a lot of automated watering systems with soil temperature and humidity sensors. Would you rather the node read the temperature and humidity and decide what action to take? Or would you rather the node send the data to your server and have the server decide which commands to issue under different conditions? How do you decide?
In my case I have a network of computers and microcontroller-based devices. I prefer to do as much as I can on the microcontrollers so that they aren't dependent on the computer being on and working; the microcontrollers draw less power and are more reliable. I'm also far better at programming in C++ than in Python, which is what I primarily use on the computers.
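To take your watering example, the node-side version is entirely self-contained: the microcontroller reads its sensor and drives the valve itself, with no server in the loop. A minimal Arduino-style sketch of the idea (the pin numbers, thresholds, and timings here are made up for illustration):

```cpp
const int MOISTURE_PIN   = A0;  // analog soil-moisture sensor (hypothetical wiring)
const int VALVE_PIN      = 7;   // relay driving the water valve (hypothetical wiring)
const int DRY_THRESHOLD  = 400; // raw ADC reading treated as "dry"; polarity
                                // depends on the sensor, so calibrate for yours

void setup() {
  pinMode(VALVE_PIN, OUTPUT);
  digitalWrite(VALVE_PIN, LOW); // valve closed on boot
}

void loop() {
  int moisture = analogRead(MOISTURE_PIN); // 0-1023 on a 10-bit ADC
  if (moisture < DRY_THRESHOLD) {
    digitalWrite(VALVE_PIN, HIGH); // open the valve
    delay(30000UL);                // water for 30 seconds
    digitalWrite(VALVE_PIN, LOW);  // close the valve
  }
  delay(60000UL); // check again in a minute
}
```

If the computer is off, unreachable, or crashed, the plants still get watered.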
However, there are cases where it makes more sense to do processor-intensive computing on the computer because the microcontroller-based devices aren't powerful enough. I recently started setting up a security camera system using Raspberry Pi Zeros. I decided it was better to use the Zeros only for streaming the video feeds, and to do the motion-detection processing on my computer, which also handles notifications and stores the video where motion occurs. The reason is that the Zeros can barely stream high-resolution video at a reasonable frame rate and have no spare processing power for anything more.
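On the computer side, the motion detection boils down to differencing each frame against the previous one. Here's a rough sketch of that idea using OpenCV's C++ API (the stream URL and thresholds are placeholders, and I actually do this part in Python, but the logic is the same):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    // Hypothetical MJPEG stream served by one of the Pi Zeros
    cv::VideoCapture cap("http://camera1.local:8080/stream.mjpg");
    if (!cap.isOpened()) {
        std::cerr << "Could not open stream\n";
        return 1;
    }

    cv::Mat frame, gray, prev, diff;
    const int kPixelThreshold  = 25;   // per-pixel change that counts as motion
    const int kMotionThreshold = 5000; // changed pixels before we call it motion

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::GaussianBlur(gray, gray, cv::Size(21, 21), 0); // suppress sensor noise
        if (prev.empty()) { prev = gray.clone(); continue; }

        cv::absdiff(prev, gray, diff);
        cv::threshold(diff, diff, kPixelThreshold, 255, cv::THRESH_BINARY);
        if (cv::countNonZero(diff) > kMotionThreshold) {
            // In the real system this is where recording would start
            // and a notification would go out.
            std::cout << "Motion detected\n";
        }
        prev = gray.clone();
    }
    return 0;
}
```

OpenCV also ships proper background-subtraction models (e.g. cv::createBackgroundSubtractorMOG2) if plain frame differencing turns out to be too noisy.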