Sharing my experience and tutorial below as it may be of use to some.
I thought I would try hooking up a web cam to my Yun and stream it over WiFi. Basically I am building a WiFi controlled robot and want to mount a camera which can be viewed over WiFi as well.
This is my experience:
- It does work once you get everything configured
- Fairly tricky to set up if you don’t know how
- The CPU offers little speed for encoding, so you end up with laggy video. I measured about 4 seconds of lag, and it gets progressively worse the longer the encoder runs.
- No mjpg-streamer or gstreamer binary is available (unless you want to custom compile, which I don’t), so I used ffserver
- Could only get it working with flv/swf format and at low resolution
- Unless you want to tinker with your operating system a bit and move your installs to the SD card, you will use about 85% of your Yun’s storage for the binaries and libraries
After this experiment, I would advise against trying to encode and stream live video over WiFi using the Yun. The processor is simply not up to this kind of task.
For this task, I have decided to use a separate security camera with inbuilt WiFi for video and run the Yun as my controller to issue commands to the robot over WiFi.
I would love to hear from anyone who can get this running more smoothly.
Here are the instructions below if anyone is interested:
- Join your Yun to your WLAN
- SSH into OpenWrt via the IP address or the arduino.local hostname
- Check your camera’s compatibility (some are UVC, some are GSPCA, some not supported at all). I would suggest taking a look here: http://wiki.openwrt.org/doc/howto/usb.video
- Install either the UVC driver or GSPCA (if not already installed) e.g. opkg install kmod-video-uvc
- Plug your camera into the USB port (run dmesg to check that your camera was detected and the driver loaded correctly). I used a Microsoft LifeCam HD-3000
- Install ffmpeg via opkg install ffmpeg
- Install ffserver via opkg install ffserver
- Install the video4linux2 package via opkg install v4l2
- Install nano package so you can edit text (I like nano) - opkg install nano
- Add your micro SD card (it should appear as /dev/sda1 by default)
- Create a folder /mnt/sda1 (mkdir /mnt/sda1)
- Mount your SD card: mount /dev/sda1 /mnt/sda1
- Create an ffserver config file under /etc using nano (nano /etc/ffserver.conf). You may need to modify this file if your mount paths are different
- Enter and save the following config:
Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 10
MaxBandwidth 50000
RTSPPort 9090
NoDaemon

<Feed videofeed.ffm>
File /mnt/sda1/videofeed.ffm
# You should match the above to your SD card's mount point
FileMaxSize 10M
ACL allow 127.0.0.1
</Feed>

<Stream video.swf>
# coming from live feed 'videofeed'
#Format mjpeg
Feed videofeed.ffm
Format flv
AVOptionVideo flags +global_header
VideoCodec flv
VideoBitRate 128
VideoBufferSize 2000
VideoFrameRate 4
VideoSize 320x240
VideoQMin 1
VideoQMax 3
ACL allow 192.168.0.0 192.168.255.255
NoAudio
</Stream>

ACL allow localhost
- Use CTRL + O to write your file, then CTRL + X to exit nano
- Run the ffserver command over SSH (since we are not running in daemon mode, it will occupy the SSH session). If you get an error, check your config file. Don’t close this SSH window yet
- Open a second SSH session and run the following command:
ffmpeg -v verbose -r 5 -s 320x240 -f v4l2 -i /dev/video0 -c:v flv http://localhost:8090/videofeed.ffm
- Hopefully the stream will start running without errors. If there are errors, review your setup as per the steps above
- Open your web browser to http://arduino.local:8090/video.swf
- Your video should start showing
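For convenience, the steps above can be collected into a single script. This is only a sketch, assuming the package names, the /dev/sda1 mount point, and the default /etc/ffserver.conf config path used in this post; adjust it for your own Yun.

```shell
#!/bin/sh
# Sketch of the setup and streaming steps from this post.
# Assumptions (from the post, verify for your setup): the UVC driver,
# the SD card at /dev/sda1, and a config already saved at /etc/ffserver.conf.
set -e

setup_packages() {
  opkg update
  opkg install kmod-video-uvc        # or the GSPCA driver for your camera
  opkg install ffmpeg ffserver v4l2
}

mount_sdcard() {
  mkdir -p /mnt/sda1                 # micro SD usually shows up as /dev/sda1
  mount /dev/sda1 /mnt/sda1
}

start_stream() {
  # ffserver reads /etc/ffserver.conf by default; start it in the
  # background so a single SSH session can run both halves.
  ffserver &
  sleep 2                            # give ffserver a moment to bind port 8090
  ffmpeg -v verbose -r 5 -s 320x240 -f v4l2 -i /dev/video0 \
    -c:v flv http://localhost:8090/videofeed.ffm
}

# Only attempt this on an OpenWrt box that actually has opkg:
if command -v opkg >/dev/null 2>&1; then
  setup_packages
  mount_sdcard
  start_stream
fi
```

Backgrounding ffserver with & is just a convenience here; running it in its own SSH session, as described above, makes its error output easier to read.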
Hopefully I haven’t forgotten any steps above.
As I said, it’s not perfect, but it does work. If anyone can improve on this or wants to cross-compile mjpg-streamer, please let me know.
Also on a side note, if anyone wants to see what binaries are available, you can view the list here: