I have 12,000 lines of C++ code from a Zero project that I want to move to the UNO Q. I have split the real-time code off to the microcontroller, and I want the Python script to communicate over sockets with a C++ program and an apache2 web server running outside of the Docker environment. I have a test socket server Python script and a test client Python script that both run successfully from the Debian command line. However, as soon as I start up App Lab, it sends data to the test server program and closes the socket. It sends the following: GET / HTTP/1.1\r\nHost: 127.0.0.1:8218\r\nUser-Agent: Go-http-client/1.1\r\nAccept-Encoding: gzip\r\nConnection: close\r\n\r\n
I have the same problem whether I use SBC mode or am connected via USB. Any ideas?
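That payload looks like an HTTP probe (note the Go-http-client user agent), so one workaround on the server side is simply to recognize and discard connections that open with an HTTP request line. This is just a sketch of a test server, not App Lab's actual behavior; the helper name `looks_like_http_probe` and the port are my own choices:

```python
import socket

# The data App Lab sends, verbatim from the thread above:
PROBE = (b"GET / HTTP/1.1\r\nHost: 127.0.0.1:8218\r\n"
         b"User-Agent: Go-http-client/1.1\r\nAccept-Encoding: gzip\r\n"
         b"Connection: close\r\n\r\n")

def looks_like_http_probe(data: bytes) -> bool:
    """Heuristic: treat any payload whose first line is an HTTP request line as a probe."""
    first_line = data.split(b"\r\n", 1)[0]
    return first_line.endswith(b"HTTP/1.1") or first_line.endswith(b"HTTP/1.0")

def serve(host="127.0.0.1", port=8218):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(4096)
                if looks_like_http_probe(data):
                    continue  # silently drop the probe connection
                # ... handle real client traffic here ...
```

This keeps the test server alive across App Lab's probe instead of treating it as a real client.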
I'd wager it has to do with how App Lab runs its scripts: inside a dedicated Docker container. When it does this, your Python script is not running in the same network namespace as it does on the CLI, so a target of 127.0.0.1 points at the container itself, and the connection only works if the server is also inside that same container.
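To illustrate the namespace point: from inside a container, the Debian host is usually reachable via `host.docker.internal` (if Docker was configured to map it) or via the default bridge gateway, commonly `172.17.0.1` on Linux. Both addresses are assumptions about the container's setup, not something App Lab documents; a client could try candidates in order:

```python
import socket

# Inside a container, 127.0.0.1 is the container itself, not the Debian host.
# Candidate addresses for reaching the host (availability depends on the Docker setup):
CANDIDATE_HOSTS = [
    "host.docker.internal",  # only resolves if the container maps it
    "172.17.0.1",            # default docker0 bridge gateway on Linux
]

def connect_to_host_server(port, hosts=CANDIDATE_HOSTS, timeout=2.0):
    """Try each candidate address until one accepts the connection."""
    last_err = OSError("no candidate hosts")
    for host in hosts:
        try:
            return socket.create_connection((host, port), timeout=timeout)
        except OSError as err:
            last_err = err
    raise last_err
```

Whether either candidate works from App Lab's container depends on how it wires up networking, so this is only a diagnostic sketch.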
I believe this post here:
should help you with your problem.
Of course, you could just run the Python script outside of the App Lab environment.
Thanks. I spent a long time trying to figure out how to put something in the Docker command line, but it seems to be programmatically generated by App Lab. There is an app-compose.yaml file created in the .cache directory in the App Lab project, but a new one gets created every time you run the project.
I need to run inside the App Lab environment because my Python script is using the Arduino Bridge calls to communicate with the microcontroller board.
I was able to work around it by setting the slider to start the project automatically on startup, but as soon as I close the socket and try to bind again, App Lab grabs the socket and I get a socket in use error.
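If the "socket in use" error appears when your own script closes and immediately rebinds, setting `SO_REUSEADDR` before `bind()` usually avoids it, since a socket lingering in TIME_WAIT otherwise blocks the rebind. Note this won't help if a different process (App Lab itself) is actually holding the port. A minimal sketch:

```python
import socket

def make_listener(port, host="0.0.0.0"):
    """Create a listening socket that can rebind immediately after a restart."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Without SO_REUSEADDR, a recently closed socket in TIME_WAIT makes the
    # next bind() on the same port fail with "Address already in use".
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()
    return srv
```

If the error persists even with this, it is worth checking (e.g. with `ss -tlnp`) which process actually owns the port.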