How to build a streaming server with a webcam and a Raspberry Pi
This article describes how to set up a live streaming server with ffmpeg and ffserver using a webcam and a Raspberry Pi; it's easy enough. Unfortunately, as far as I know, the default ffmpeg package segfaults when used to stream a live video feed. For this reason we need to compile it ourselves.
It takes some time (the make command in particular), so be sure you have enough time to spare.
Here is the recipe:
- Configured webcam
How to cook:
We need to install ffmpeg, but as mentioned we have to compile it from source; for this reason we first modify the APT package manager's sources list:
Add the following lines into /etc/apt/sources.list
deb-src http://www.deb-multimedia.org wheezy main
deb http://www.deb-multimedia.org wheezy main non-free
In a previous version of this article the first of the two apt repos listed above pointed to Sid. I have since repeated the procedure using Wheezy and it works, so I have updated this part; the reader is invited to use Wheezy as well. After saving the changes, run apt-get update so the new repository is picked up, then install the keyring:
apt-get install deb-multimedia-keyring
Remove or comment the second line from /etc/apt/sources.list
#deb http://www.deb-multimedia.org wheezy main non-free
apt-get source ffmpeg-dmo
Check the actual version and change into the newly created directory. At the time of writing (2013-01-15) it is:
./configure --enable-filter=movie --enable-avfilter
make
sudo make install
If you concatenate the commands and run them as root, you will have enough time to get a coffee… outside.
Concatenated commands (if you already ran the three commands above, you don't need this!):
./configure --enable-filter=movie --enable-avfilter && make && make install
sudo is needed if you are NOT root.
We need to create a configuration file for ffserver; we will place it in /etc/ and call it ffserver.conf.
The content of this file is as follows:
Port 80
BindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 50000
CustomLog -
#NoDaemon

<Feed feed1.ffm>
file /tmp/webcam.ffm
FileMaxSize 10M
</Feed>

#We are going to use flash format (sorry internet explorer and iphone/pad/pod/whatever)
<Stream test.swf>
Feed feed1.ffm
Format swf
VideoFrameRate 4
VideoSize 320x240
VideoBitRate 64
VideoBufferSize 10
VideoQMin 6
VideoQMax 31
#VideoIntraOnly
NoAudio
</Stream>

<Stream stat.html>
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
#FaviconURL http://blog.giuseppeurso.net/favicon.ico
</Stream>

# Redirect index.html to the appropriate site
<Redirect index.html>
URL http://blog.giuseppeurso.net/
</Redirect>
Please note that port 80 is used here; if you already have a webserver instance listening on that port, change it to a more convenient one, 8080 for example.
Although the manufacturer claims a resolution of 720p, I know for sure it is interpolated by the camera's poor chip, degrading the resulting quality. That's why you will get better results if you use a resolution as close as possible to the camera's native resolution.
Also, keep in mind that this parameter, together with the framerate and bitrate values, can determine a bad quality stream.
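For example, assuming a camera with a native resolution of 352x288 (this value, and all the numbers below, are illustrative assumptions; check your camera's specifications), the stream section of ffserver.conf might be tuned like this:

```
<Stream test.swf>
Feed feed1.ffm
Format swf
VideoFrameRate 10
VideoSize 352x288
VideoBitRate 256
VideoBufferSize 40
VideoQMin 3
VideoQMax 15
NoAudio
</Stream>
```

Lower VideoQMin/VideoQMax values mean better quality at the cost of bandwidth, so raise the bitrate accordingly.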
Now it’s time to start ffserver with the command
ffserver -f /etc/ffserver.conf &
executed as root: this will start ffserver and send it to the background. You can check that it is up by opening stat.html (defined in the configuration above) from the Pi itself.
Then we need to start capturing video with ffmpeg and transmit the feed to ffserver at the configured path:
ffmpeg -v verbose -r 15 -s 320x240 -f video4linux2 -i /dev/video0 http://localhost/feed1.ffm &
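Before launching the capture, you may want to confirm that the webcam device node actually exists (assuming /dev/video0, as in the command above; some setups expose the camera as /dev/video1 instead):

```shell
# Check that the webcam device node is present before starting the capture
if [ -e /dev/video0 ]; then
    echo "webcam found at /dev/video0"
else
    echo "no webcam at /dev/video0 - check dmesg or try /dev/video1"
fi
```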
You are advised to save this command into a .sh shell script file and chmod +x it; then you can start streaming with a single command or, better, turn it into a startup script.
An ideal startup shell script would do:
killall ffserver
killall ffmpeg
ffserver -f /etc/ffserver.conf
ffmpeg -v verbose -r 10 -s 352x240 -f video4linux2 -i /dev/video0 http://localhost/feed1.ffm &
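To run the script at boot on a Wheezy-era Raspbian, one simple option is to call it from /etc/rc.local before the final exit 0 (the script name and path below are assumptions; adjust them to wherever you saved your script):

```
# /etc/rc.local (excerpt)
/home/pi/start-stream.sh &
exit 0
```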
Point your browser to the Raspberry Pi IP address, specifying protocol, port, and the stream name from the configuration, for example: http://<raspberry-pi-ip>:80/test.swf