Screen sharing with WebRTC?

The chrome.tabCapture API is available for Chrome apps and extensions.

This makes it possible to capture the visible area of the tab as a stream which can be used locally or shared via RTCPeerConnection's addStream().
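
As a rough illustration, here is a minimal sketch of the extension side, assuming the "tabCapture" permission is declared in manifest.json and that signaling for the peer connection is handled elsewhere:

// Runs inside the extension (typically in response to a user action,
// e.g. clicking the browser action); requires the "tabCapture" permission.
chrome.tabCapture.capture({ audio: false, video: true }, function (stream) {
  if (!stream) {
    console.error('Tab capture failed:', chrome.runtime.lastError.message);
    return;
  }
  // Illustrative only: a real app needs ICE servers and offer/answer signaling.
  var pc = new webkitRTCPeerConnection({ iceServers: [] });
  pc.addStream(stream); // share the captured tab like a camera feed
});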

For more information see the WebRTC Tab Content Capture proposal.

Screensharing was initially supported for 'normal' web pages using getUserMedia with the chromeMediaSource constraint, but this has since been disallowed.

EDIT 1 April 2015: Updated now that Chrome supports screen sharing only from Chrome apps and extensions.


I know I am answering a bit late, but I hope it helps those who stumble upon this page, if not the OP.

At this moment, both Firefox and Chrome support sharing the entire screen, or part of it (an application window you can select), with peers through WebRTC as a MediaStream, just like your camera/microphone feed; there is no option yet to let the other party take control of your desktop. Beyond that, there is another catch: your website has to be served over https, and in both Firefox and Chrome users have to install an extension.

You can give it a try in Muaz Khan's Screen-sharing Demo; the page contains the required extensions too.

P.S.: If you do not want to install an extension to run the demo in Firefox (there is no way to avoid the extension in Chrome), you just need to modify two flags (a code sketch follows the list):

  • go to about:config
  • set media.getusermedia.screensharing.enabled to true
  • add *.webrtc-experiment.com to the media.getusermedia.screensharing.allowed_domains flag
  • refresh the demo page and click the share-screen button
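
With those flags set, requesting the screen from page script is just a getUserMedia call with a screen media source. A minimal sketch, assuming Firefox's prefixed API of the time and a <video> element on the page:

navigator.mozGetUserMedia(
  { audio: false, video: { mediaSource: 'screen' } },
  function (stream) {
    var video = document.querySelector('video');
    video.mozSrcObject = stream; // attach the screen stream for a local preview
    video.play();
  },
  function (err) {
    console.error('Screen capture was denied or failed:', err);
  }
);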

You probably know that screen capture (not tabCapture) is available in Chrome Canary (26+). We recently published a demo at: https://screensharing.azurewebsites.net

Note that you need to run it under https:// and request the stream with the following video constraint:

video: {
  mandatory: {
    chromeMediaSource: 'screen'
  }
}
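
Putting that constraint into a complete call, here is a minimal sketch, assuming the prefixed webkitGetUserMedia of Canary builds from that era and a <video> element on an https page:

navigator.webkitGetUserMedia(
  {
    audio: false,
    video: { mandatory: { chromeMediaSource: 'screen' } }
  },
  function (stream) {
    // Local preview; the same stream could be added to a peer connection instead.
    document.querySelector('video').src = URL.createObjectURL(stream);
  },
  function (err) {
    console.error('Screen capture failed:', err);
  }
);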

You can also find an example here: https://html5-demos.appspot.com/static/getusermedia/screenshare.html


To the best of my knowledge, it's not possible right now with any of the browsers, though the Google Chrome team has said that they eventually intend to support this scenario (see the "Screensharing" bullet point on their roadmap); I suspect that this means other browsers will eventually follow, presumably with IE and Safari bringing up the rear. But all of that is probably out somewhere past February, which is when they're supposed to finalize the current WebRTC standard and ship production bits. (Hopefully Microsoft's last-minute spanner in the works doesn't screw that up.)

It's possible that I've missed something recent, but I've been following the project pretty carefully, and I don't think screensharing has even made it into Chrome Canary yet, let alone dev/beta/stable. Opera is the only browser that has been keeping pace with Chrome on its WebRTC implementation (Firefox seems to be about six months behind), and I haven't seen anything from that team about screensharing either.

I've been told there is one way to do it right now: write your own webcam driver, so that your local screen appears to the WebRTC getUserMedia() API as just another video source. I don't know of anybody who has done this, and of course it would require installing the driver on the machine in question. By the time all is said and done, it would probably be easier just to use VNC or something along those lines.