Guide to Setting Up a Streaming Webcam on the Yun

Ok, I saw what you were saying, muh, and it helped me with my project, thank you. One last thing: where do you find the information about what parameters to pass and what they mean? I'm trying to find info on that.

By the way, is there an easy way to save a snapshot programmatically on the Yun's SD card? If I go to http://yunip:8080/?action=snapshot it takes a snapshot in my browser, but it's not saved on my SD card.

Thank you again


For capturing a single image, I suggest using fswebcam:
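A minimal sketch of what that could look like, assuming the SD card is mounted at /mnt/sda1 as described elsewhere in this thread (the device node, resolution, and output path are example values, not requirements):

```shell
# Install fswebcam from the opkg feeds (one-time step)
opkg update
opkg install fswebcam

# Grab a single 640x480 JPEG from the first camera and write it
# straight to the SD card instead of to the browser
fswebcam -d /dev/video0 -r 640x480 --jpeg 85 /mnt/sda1/snapshot.jpg
```

This would also answer the earlier question about saving a snapshot programmatically, since the same command can be run from a cron job or a script instead of hitting the ?action=snapshot URL.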

Hi there:

Can anybody recommend a webcam with infrared LEDs, suitable for streaming with the Arduino Yun in the dark?


Hi, a couple of things:

  1. The opkg command for ffmpeg has a typo. Copying and pasting it gives an error. The mistake is easy to spot, but please correct it in the first post:

root@Arduino:/mnt/sda1# opgk install ffmpeg
-ash: opgk: not found

  2. I don't know if I have a new version, but the v4l2 package is no longer available:

root@Arduino:/mnt/sda1# opkg install v4l2
Unknown package 'v4l2'.
Collected errors:
 * opkg_install_cmd: Cannot install package v4l2.
root@Arduino:/mnt/sda1#

I found two similar-looking packages: libv4l_0.8.6-2_ar71xx.ipk and v4l-utils_0.8.6-2_ar71xx.ipk. Are they the same?
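One way to settle questions like this is to ask opkg itself what the feeds actually provide (exact package names will vary by image version):

```shell
# Refresh the package lists, then search for anything v4l-related
opkg update
opkg list | grep -i v4l
```

For what it's worth, the two packages are not the same: libv4l is the runtime library that programs link against, while v4l-utils ships command-line tools, so they complement each other rather than duplicate.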

Thanks for the post and help

Thank you for everyone's work on this subject! Has anyone attempted to use a web cam that has built-in hardware encoding in order to reduce the workload on the AR9331? I also wonder if H.264 compression would allow for better results for any given Internet upload bandwidth constraints, as compared to MJPEG. Unfortunately my new Yun is still under the Christmas tree.

This is the info I have come across so far, for utilizing hardware-based H.264 compression:,_streaming_H.264

As a side note, I don't know if the problem mentioned in the comments section for the 2nd url above has been sorted out, regarding the Logitech C920's variable frame rate issue in low light conditions. In the same comments section, "Ricky" mentions the good results he achieved for his robot car project (I realize his board may have been a 1 GHz ARM as compared to the AR9331's 400 MHz MIPS, but still). Cheers.

Thanks, mjpg_streamer is working well. I have a Logitech HD Webcam C310.

The stream is only working in the Firefox browser for now.

In the future, would it be better for me to create a separate topic regarding hardware-based encoding/streaming attempts or continue to post info in this topic? Thank you.

The previous URLs I mentioned culminated in this interesting project/writeup - full credits to Alexandru Csete:

In the above page's comments section "Joris Pragt" describes his attempt to use a TP-Link router running OpenWrt but he ran into problems due to the router only having 32 MB RAM. Here's hoping the Yun's 64 MB is sufficient to allow for 1920x1080 @ 30 fps.

I haven't seen any discussion of using the H.264 protocol in this forum. Once you unwrap your Yun and start to play with it, a new thread about H.264 would probably be a good idea. If you get it working yourself, someone else will appreciate your experience; if not, someone will probably help.

While I used info from this thread to get my webcam going, it is getting long, with separate discussions going on, and it can be hard to pick out the parts that apply. When to start a new discussion is not an easy decision, but if what you ask is answered in another thread, someone will usually post a link to the answer.

MadScience: Tested and working like a charm. Video streaming smoothly at 5 fps, 640x480 resolution. Well done, mate.

If anyone else is interested, I would suggest using this build of mjpg-streamer over ffmpeg and ffserver, as the throughput is much faster.

Instructions are below. Credit to fibasile for the binary upload of mjpg-streamer:

- Join your Yun to your WLAN and SSH into OpenWrt via its IP or the arduino.local hostname.
- Check your camera's compatibility (some are UVC, some are GSPCA, some are not supported at all). I would suggest taking a look here: At this time only UVC-driver cameras are supported.
- Install the UVC driver if not already installed, e.g. opkg install kmod-video-uvc.
- Plug your camera into the USB slot (type dmesg to see if your camera is detected and the driver is working correctly). I used a Microsoft LifeCam HD-3000.
- Add your micro SD card (it should appear as /dev/sda1 by default).
- Create a mount point: mkdir /mnt/sda1.
- Mount your SD card: mount /dev/sda1 /mnt/sda1.
- Use wget to download the mjpg-streamer binary (you can't use it with Dropbox as it redirects to HTTPS; this module is not installed by default). I have uploaded it here for convenience:
- Install the package (I installed it from my root folder ~): opkg install mjpg-streamer.ipk.
- View the config options; I just ran it with the following command: mjpg_streamer -i " -d /dev/video0 -r 640x480" -o " -p 8080 -w /mnt/share" (/mnt/share is my SD card). You can also set it to start upon boot.
- Open your web browser at http://arduino.local:8080/?action=stream for a stream, or http://arduino.local:8080/?action=snapshot for a single snapshot.
- Enjoy!
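For the "start upon boot" part, one low-tech option (assuming a stock OpenWrt image where /etc/rc.local is still executed at the end of boot) is:

```shell
# Append to /etc/rc.local, BEFORE the final "exit 0" line.
# The trailing "&" backgrounds the streamer so boot can finish.
mjpg_streamer -i " -d /dev/video0 -r 640x480" \
              -o " -p 8080 -w /mnt/share" &
```

The /etc/init.d service approach described elsewhere in this thread is cleaner if your image ships the mjpg-streamer package with its own init script.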

Hi, first of all thank you very much for your explanation; it worked very well, with the same camera as you. But I don't really understand the purpose of the SD card. Is it something like temporary memory to store a little bit of the stream?

Thanks in advance !

cristiansaavedra: 1. The opkg command for ffmpeg has a typo. Copying and pasting it gives an error. The mistake is easy to spot, but please correct it in the first post:

root@Arduino:/mnt/sda1# opgk install ffmpeg
-ash: opgk: not found

You have written "opgk" instead of "opkg", so the error message is expected.

Hi, I would like to know if there is a way to start the stream from the Arduino program, because right now I have to manually connect over SSH and enter the following command line:

mjpg_streamer -i " -d /dev/video0 -r 640x480" -o " -p 8080 -w /mnt/share"

I would like to execute this command line from the Arduino, or from my Android application. Is there a way? Thank you.

I did it using this library:

Source code:

package com.iha.wcc.job.ssh;

import android.content.Context;
import android.os.AsyncTask;
import android.widget.Toast;
import com.jcraft.jsch.*;
import java.util.Properties;

/**
 * Run an SSH command using the JSch library.
 */
public class SshTask extends AsyncTask<String, Integer, Boolean> {
    private static Session session;
    private static Channel channel;

    private Context context;
    private String host;
    private String user;
    private String password;
    private String command;

    /**
     * Constructor that loads the information needed to connect over SSH.
     * @param context   Activity context, used to display messages.
     * @param host      Host IP to reach.
     * @param user      SSH user name.
     * @param password  SSH user password.
     * @param command   SSH command to execute.
     */
    public SshTask(Context context, String host, String user, String password, String command){
        this.context = context;
        this.host = host;
        this.user = user;
        this.password = password;
        this.command = command;
    }

    @Override
    protected Boolean doInBackground(String... arg0) {
        JSch jsch = new JSch();

        // Skip host key verification (acceptable on a trusted LAN).
        Properties config = new Properties();
        config.put("StrictHostKeyChecking", "no");

        try {
            Session session = jsch.getSession(user, host, 22);
            session.setPassword(password);
            session.setConfig(config);
            session.connect();

            // Open an "exec" channel and run the command remotely.
            ChannelExec channel = (ChannelExec) session.openChannel("exec");
            channel.setCommand(command);
            channel.connect();

            // Keep references so disconnect() can shut the stream down later.
            SshTask.session = session;
            SshTask.channel = channel;

            return true;
        } catch (JSchException e) {
            return false;
        }
    }

    @Override
    protected void onPostExecute(Boolean success){
        if (success) {
            Toast.makeText(this.context, "Video stream successfully started.", Toast.LENGTH_SHORT).show();
        } else {
            Toast.makeText(this.context, "Unable to start the camera video stream.", Toast.LENGTH_LONG).show();
        }
    }

    /**
     * Shutdown the camera video stream.
     */
    public static void disconnect(){
        if (SshTask.channel != null) {
            SshTask.channel.disconnect();
        }
        if (SshTask.session != null) {
            SshTask.session.disconnect();
        }
    }
}

opkg update
opkg install nano

nano /etc/config/mjpg-streamer

config mjpg-streamer core   
option enabled      "1" 
option device       "/dev/video0"   
option resolution   "640x480"   
option fps      "30"    
option www      "/www/webcam"   
option port     "8080"

/etc/init.d/mjpg-streamer enable
/etc/init.d/mjpg-streamer stop
/etc/init.d/mjpg-streamer start

Access the video at http://arduino.local:8080
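Since /etc/config/mjpg-streamer is a standard UCI config file, the same options can be changed from the command line with uci instead of nano (section and option names are taken from the config snippet above; double-check the paths on your own image):

```shell
# Drop the resolution and frame rate, then restart the service
uci set mjpg-streamer.core.resolution='320x240'
uci set mjpg-streamer.core.fps='15'
uci commit mjpg-streamer
/etc/init.d/mjpg-streamer restart
```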

The init.d way is nice :)

BTW Vadorequest, can't you do that with runShellCommand?

And what about the [u]audio[/u]?

Many cams have a mic. So, can we stream the audio at the same time?


Good question, I didn't try to do that. Should be possible.

Vadorequest: Good question, I didn't try to do that. Should be possible.

When you have an IP webcam, it is possible. I had one.

However, I think the audio only worked in one browser, Internet Explorer I think... not sure.

I have been using Microsoft HD-3000 webcam with success.

Now I want to use another UVC webcam. However, it does not work.

Must I do something different to use another cam? Do I need to tell the driver that the new cam should be used now?
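A first debugging step (nothing Yun-specific beyond what's already in this thread) is to confirm that a kernel driver actually bound to the new camera:

```shell
# Watch the kernel log right after plugging the camera in
dmesg | tail -n 20

# A working UVC camera should create a video device node
ls -l /dev/video*
```

If no /dev/video* node appears, the camera probably needs a driver other than kmod-video-uvc (for instance a GSPCA one), which matches the compatibility warning earlier in the thread.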

BTW, no ideas on audio streaming?

Merry Christmas,


I have the streaming setup to my Yun and can view in browser.

So how do I get this stream to a website streaming service?

I have searched around and cannot make heads or tails of how to do this.

demolishun: ... So how do I get this stream to a website streaming service? ...

What do you mean?