Streaming Your Desktop With Audio And Webcam Overlay In A Browser Using ffmpeg, crtmpserver And Flowplayer
1 Preliminary note
This tutorial is based on Ubuntu Lucid, but will work on later releases as well with small changes. I will show how I stream my desktop with audio from PulseAudio and a webcam from video4linux2. I will also show how to configure crtmpserver and Flowplayer so you can watch the live stream from a web browser. In this scenario I use separate hosts for the tools: Flowplayer is installed on a server running Apache, crtmpserver is on a separate server, and ffmpeg is installed on the streaming desktop.
2 Tools used
ffmpeg:
This is the audio and video encoder. I will use it to capture the desktop with x11grab, audio from PulseAudio and the webcam from video4linux2. The codecs used in this tutorial are libx264 for video and libfaac for audio. The format is flv and the output is a TCP unicast to the RTMP server. ffmpeg has great documentation, which can be found here.
crtmpserver:
crtmpserver will use the flvplayback application to feed Flowplayer the live stream. Documentation can be found here.
flowplayer:
This is an open source Flash player. We will use it to play the RTMP live stream in a browser. An alternative is JW Player, but I stuck with Flowplayer because of the great documentation found here.
3 Installation
ffmpeg:
You can either install ffmpeg from the repository or compile it from source. I highly recommend compiling it from source, as the package in the Ubuntu repository does not include all the required features. In fact, I believe you won't be able to use this guide if you use the one from the repository. I will not include the installation guide here because there is a very nice guide on ffmpeg's website. Remember to follow the guide for the distro you are using; I'm linking the Ubuntu 10.04 guide here.
crtmpserver:
Again, this is already documented, so I will not include the installation guide here. Follow the instructions described here.
flowplayer:
This is not really installed. You can choose to link to the player on Flowplayer's website, or download it to your webserver and link to the path where you placed it. I recommend downloading the player. For this guide you will need the Flowplayer itself, the API, the RTMP plugin and jQuery. For the player and the API look here. The RTMP plugin is found here; for jQuery look here.
4 Configuration
ffmpeg:
I wish to explain how each of the options I use works, so that you get a better understanding of how ffmpeg works, and so it is easier for you to adapt the command to your own needs. I will first show the entire command, then go into the options in more detail.
ffmpeg -f alsa -i pulse -f x11grab -s 1680x1050 -r 30 -i :0.0+0,0 -vf "movie=/dev/video0:f=video4linux2, scale=240:-1, fps, setpts=PTS-STARTPTS [movie]; [in][movie] overlay=main_w-overlay_w-2:main_h-overlay_h-2 [out]" -vcodec libx264 -crf 20 -preset veryfast -minrate 150k -maxrate 500k -s 960x540 -acodec libfaac -ar 44100 -ab 96000 -threads 0 -f flv - | tee name.flv | ffmpeg -i - -codec copy -f flv -metadata streamName=livestream tcp://x.x.y.y:1234
As you can see, it looks quite daunting at first glance, but don't worry: I will explain what everything does to the best of my ability, and also provide links with further documentation.
The ffmpeg syntax goes like this:
ffmpeg [input options] -i input [output options] output
Let's start with the first input, pulse, and its option:
-f alsa -i pulse
For more information on using alsa with ffmpeg look here.
Here we use alsa as the format, and pulse as the input. This will grab audio from whatever source is active at the moment. If you install PulseAudio Volume Control (pavucontrol), you can easily change the source by clicking the Recording tab and selecting the source you want to use.
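Before adding any video, it can help to verify that ffmpeg actually captures the right PulseAudio source. A minimal sketch, assuming a running PulseAudio session (the filename audio-test.wav is just an example):

```shell
# Record 5 seconds of audio from the currently active PulseAudio source.
ffmpeg -f alsa -i pulse -t 5 audio-test.wav

# Play it back to confirm the right source was captured.
ffplay audio-test.wav
```

If you hear silence, open pavucontrol's Recording tab while ffmpeg is running and switch the source there.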
So far so easy. The next input we have is our desktop:
-f x11grab -s 1680x1050 -r 30 -i :0.0+0,0
For more information on x11grab look here.
We select x11grab as the format with "-f x11grab". Then we decide how large a portion of the display to grab with "-s 1680x1050". Third, we tell ffmpeg to force 30 frames per second with "-r 30", and lastly we select the input, which is display :0.0 starting at position 0,0 (the top left corner).
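The screen capture can likewise be tested on its own. This sketch assumes an X session on display :0.0 and the same 1680x1050 geometry; adjust both to your setup:

```shell
# Grab 5 seconds of the desktop only (no audio, no webcam yet).
# Change -s and the :0.0+0,0 offset to match your display and region.
ffmpeg -f x11grab -s 1680x1050 -r 30 -i :0.0+0,0 -t 5 desktop-test.mkv
```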
Next we will place the webcam as an overlay on top of the desktop. To do this we use libavfilter through the -vf option. This is a very powerful tool and I will not go too much into detail on it. You can find all the documentation you need here.
-vf "movie=/dev/video0:f=video4linux2, scale=240:-1, fps, setpts=PTS-STARTPTS [movie]; [in][movie] overlay=main_w-overlay_w-2:main_h-overlay_h-2 [out]"
The options for -vf are enclosed in double quotes. First we set a name for the input to be used, in this case "movie". This is just a label, and you can choose whatever you want. The name is then bound to an input, in our case the webcam:
movie=/dev/video0:f=video4linux2
Next option is used to scale the input down to a certain size. I use -1 to keep the original aspect ratio of the input, and set the width to 240 pixels.
scale=240:-1
I use the fps filter to force a constant frame rate. I did this because I had issues with the webcam lagging behind the desktop, causing it to go out of sync. It might not be necessary, but I have it there to be safe.
The last option is used to set the presentation timestamp on the input, starting with 0.
setpts=PTS-STARTPTS
This concludes the input options, and we close them off with [movie]; followed by [in][movie]
Here we place the actual overlay settings.
overlay=main_w-overlay_w-2:main_h-overlay_h-2 [out]"
This places the input as an overlay 2 pixels from the right edge and 2 pixels from the bottom edge. We close it off with [out]
For more information on how the overlay works, look here.
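If you want to check the overlay placement before going live, one option (assuming the same webcam device and X session as above) is a short test encode with just the capture and the filter:

```shell
# 5-second test: the desktop with the webcam overlaid bottom-right,
# written to a local file so you can inspect size and position.
ffmpeg -f x11grab -s 1680x1050 -r 30 -i :0.0+0,0 \
  -vf "movie=/dev/video0:f=video4linux2, scale=240:-1, fps, setpts=PTS-STARTPTS [movie]; \
       [in][movie] overlay=main_w-overlay_w-2:main_h-overlay_h-2 [out]" \
  -t 5 overlay-test.mkv
```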
We are done with all the input stuff; let's look at how we encode it.
I use libx264 for the video codec and libfaac for the audio codec.
-vcodec libx264 -crf 20 -preset veryfast -minrate 150k -maxrate 500k -s 960x540
As mentioned, libx264 is my video codec of choice. "-crf 20" is the rate control method, attempting to give constant quality.
"-preset veryfast" is one of the preset configurations; if you want to know more about it, take a look here.
"-s 960x540" will set the output resolution. Increasing it will give better quality, but also increase the bitrate and use more CPU to encode.
I try to limit the bitrate to between 150 kb/s and 500 kb/s. It will still go over the maxrate if it needs to, but it will prevent huge spikes.
-acodec libfaac -ar 44100 -ab 96000
For audio I set the sample rate to 44100 Hz and the bitrate to 96k. This gives me good enough quality and requires less bandwidth and CPU.
"-threads 0" is an option to make use of all your CPU cores. I believe it's on by default, but it doesn't hurt to have it there.
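You can also try these encoding settings on an existing video file, independent of any capture devices; input.mp4 below is just a placeholder for any local file. This doubles as a check that your ffmpeg build has libx264 and libfaac enabled:

```shell
# Transcode a local file with the same codec settings as the stream.
ffmpeg -i input.mp4 -vcodec libx264 -crf 20 -preset veryfast \
  -minrate 150k -maxrate 500k -s 960x540 \
  -acodec libfaac -ar 44100 -ab 96000 -threads 0 -f flv encode-test.flv
```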
Now you could choose your output format and output and be done. Unfortunately for me, I had to make a dirty workaround to be able to record the session at the same time as I'm streaming. The reason I can't just use a second output is that it would not include the avfilter, since that applies only to the first output and would need to be added to the second output as well. This won't work because /dev/video0 is already in use by the first output (Device busy).
Therefore, instead of outputting as I normally would, I output to standard out, use tee to copy everything to a file, then start a second ffmpeg process that reads from standard in, copies the encoded streams and outputs them to the RTMP server.
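The pipe structure itself is plain shell plumbing, and can be sketched with generic commands: tee writes its stdin to a file and passes an identical copy on to the next command in the pipeline.

```shell
# Stand-in for "ffmpeg ... -f flv - | tee name.flv | ffmpeg -i - ...":
# the producer's output is saved to recording.bin AND forwarded downstream.
printf 'encoded-stream' | tee recording.bin | wc -c
```

In the real command, the first ffmpeg is the producer, name.flv plays the role of recording.bin, and the second ffmpeg is the downstream consumer.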
If you don't need to record at the same time, just finish the command like this:
-f flv -metadata streamName=livestream tcp://x.x.y.y:1234
I use flv as the output format, since we will be using flvplayback on the RTMP server. "-metadata streamName=livestream" sets the name of the live stream. If we don't give it a name here, crtmpserver will choose a random name, making it hard for us to specify the stream in Flowplayer.
Replace x.x.y.y with your crtmpserver IP address, and set the port number. I use 1234, but you can use whatever you want, really.
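For reference, assembling all the pieces discussed above into the single, non-recording command gives (x.x.y.y again standing in for your crtmpserver address):

```shell
ffmpeg -f alsa -i pulse -f x11grab -s 1680x1050 -r 30 -i :0.0+0,0 \
  -vf "movie=/dev/video0:f=video4linux2, scale=240:-1, fps, setpts=PTS-STARTPTS [movie]; \
       [in][movie] overlay=main_w-overlay_w-2:main_h-overlay_h-2 [out]" \
  -vcodec libx264 -crf 20 -preset veryfast -minrate 150k -maxrate 500k -s 960x540 \
  -acodec libfaac -ar 44100 -ab 96000 -threads 0 \
  -f flv -metadata streamName=livestream tcp://x.x.y.y:1234
```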
If you do indeed need to record, and don't find a more elegant solution (please let me know if you do!), you need to be aware that when you close ffmpeg, the name.flv file will be unseekable. No worries though, it's an easy fix. Here's how I do it:
ffmpeg -i name.flv -codec copy -f mp4 name.mp4
This places everything in a new container, and while we're at it, why not place it in something better than flv! It's now seekable, but unfortunately Flowplayer will not be able to play the file before it is fully buffered. This is because the mp4 format places the moov atom at the end of the file.
You can fix this using qt-faststart.
For this you need to have python installed.
Download qt-faststart with git:
git clone https://github.com/danielgtaylor/qtfaststart.git
To install use the setup.py script:
cd qtfaststart
python setup.py install
You can now use qtfaststart to move the moov atom to the beginning of the file like this:
qtfaststart input.mp4 output.mp4
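To confirm the atom was actually moved, this Python port of qt-faststart can (if I recall its options correctly; check qtfaststart --help) print a file's top-level atoms. After the fix, moov should appear before mdat:

```shell
# List the top-level atoms; moov before mdat means web-ready playback.
qtfaststart --list output.mp4
```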
This configuration aims for decent quality at a low bitrate. For presentations and slide-shows, the bitrate is about 200 kb/s and the quality is very acceptable. When playing videos the bitrate goes up to 1 Mb/s. You can of course change these settings to fit your needs.
This concludes the ffmpeg configuration.
crtmpserver:
Once compiled, navigate to /crtmpserver/builders/cmake/crtmpserver.
You need to edit crtmpserver.lua.
Find the part description="FLV Playback Sample"; below it are the acceptors you need to edit.
{ ip="0.0.0.0", port=1234, protocol="inboundLiveFlv", waitForMetadata=true, },
Insert this in the acceptors field, or edit the one already there. The important thing here is to specify the same port you are using in ffmpeg.
When done you can start the crtmpserver. Do this from the cmake folder.
cd ..
./crtmpserver/crtmpserver ./crtmpserver/crtmpserver.lua
If you get errors, something is most likely wrong with the configuration file. If not, you will be placed in the console.
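A quick sanity check, assuming net-tools is installed on the server, is to confirm the inboundLiveFlv acceptor is actually listening on the port you configured:

```shell
# Should show a LISTEN entry on 0.0.0.0:1234 if the acceptor is up.
netstat -lnt | grep 1234
```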
More information here.
That concludes the crtmpserver configuration.
flowplayer:
This can also be tricky, but luckily there is good documentation on the Flowplayer site. I'll only explain how to embed the player and use the RTMP plugin.
First off you need to add the path to flowplayer in the head:
<script src="http://releases.flowplayer.org/js/flowplayer-3.2.10.min.js"></script>
I place the path to jQuery in the body:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7/jquery.js" type="text/javascript" charset="utf-8"></script>
I have a class called player with some fancy options to make it look a bit cooler.
<style>
a.player {
  display:block;
  width:900px;
  height:500px;
  text-align:center;
  color:#fff;
  text-decoration:none;
  cursor:pointer;
  background:#000 url(/media/img/global/gradient/h500.png) repeat-x 0 0;
  background:-moz-linear-gradient(top, rgba(55, 102, 152, 0.9), rgba(6, 6, 6, 0.9));
  -moz-box-shadow:0 0 40px rgba(100, 118, 173, 0.5);
}
</style>
I'm not going into the details of the class I'm using; you can make your own class if you want to. The Flowplayer element needs a class that sets the width and height, because without them you won't see the player.
Now for the actual Flowplayer configuration. The script must not be within a div that has HTML inside it, so to be safe just place it in the body.
<script>
$(function() {
  $f("stream", "flowplayer/flowplayer-3.2.11.swf", {
    play: {
      opacity: 0.0,
      label: null,        // label text; by default there is no text
      replayLabel: null,  // label text at end of video clip
    },
    clip: {
      url: 'livestream',
      live: true,
      autoPlay: true,
      provider: 'influxis',
    },
    canvas: {
      backgroundImage: 'url(image/offline.png)'
    },
    plugins: {
      influxis: {
        url: "flowplayer/flowplayer.rtmp-3.2.10.swf",
        netConnectionUrl: 'rtmp://x.x.x.x/flvplayback'
      }
    }
  });
});
</script>
Some important things here.
$f("stream", "flowplayer/flowplayer-3.2.11.swf", {
This defines the player; "stream" is the id we use to link it to a div element later on.
After this comes the configuration, there are four sections, but you only need clip and plugins.
In "play" I remove the buffer and play icons inside the player screen, as they're not really necessary for a live stream.
In "clip" I set the url to livestream; this is the metadata name we used in ffmpeg. I tell Flowplayer it's a live stream with "live: true," and make the stream start automatically with "autoPlay: true,". The "provider: 'influxis'," is referenced in the plugins section.
In "canvas" I set a background image so people will understand when the stream is offline.
In "plugins", under the provider influxis, I set the path to the RTMP plugin with "url". The "netConnectionUrl" tells Flowplayer where our RTMP server is. Remember to include /flvplayback so that crtmpserver knows where to look for the stream!
Now we can embed this in a div some place, or just in body like this:
<div class="player" id="stream" style="float:left"></div>
Notice that I use id="stream" here to link it up to the script we made earlier. It also uses the player class we made.
Now you should be able to start your ffmpeg command and watch the stream in your browser.
Conclusion
This tutorial became a bit longer than I expected, but I hope it will help somebody out there. Please let me know if you find something I did wrong, or if there is something that can be improved upon. Thanks for reading!