WebRTC and WebSockets: is there a difference?

I'm assuming that WebRTC is an API that encodes/decodes audio and video, while the communication between the server and the clients is done via WebSockets or some other network protocol. Is that right? I'm a bit confused. Does WebRTC have its own communications protocol?


There are two sides to WebRTC.

  1. JavaScript APIs (getUserMedia) that allow an app to access the camera and microphone hardware (see the first sketch after this list). You can use this access to simply display the stream locally (perhaps applying effects), or to send the stream over the network. You could send the data to your server, or you could use...
  2. PeerConnection, an API that allows browsers to establish direct peer-to-peer connections (see the second sketch after this list). You can establish a connection directly to someone else's browser and exchange data directly. This is very useful for high-bandwidth data like video, where you don't want your server to have to relay large amounts of data.
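
For example, here's a minimal sketch of part 1 in TypeScript; the "preview" element id is just an assumption for the example:

```typescript
// Capture camera + microphone with getUserMedia and play the
// stream back locally in a <video id="preview"> element.
async function showLocalPreview(): Promise<MediaStream> {
  // Prompts the user for permission to use the camera and microphone.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  // Display the captured stream locally; no network involved yet.
  const video = document.getElementById("preview") as HTMLVideoElement;
  video.srcObject = stream;
  await video.play();

  return stream; // Later handed to a PeerConnection (part 2).
}
```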
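
And a matching sketch of part 2, assuming a second `<video id="remote">` element for the incoming stream. How the offer actually reaches the other browser is deliberately left out here; that's the signaling question covered at the end of this answer:

```typescript
// Send the captured media (and a full-duplex data channel)
// directly to another browser via RTCPeerConnection.
async function startPeerConnection(stream: MediaStream): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection();

  // Send the local camera/microphone tracks to the remote peer.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }

  // A data channel gives browser-to-browser messaging without
  // the data passing through your server.
  const channel = pc.createDataChannel("chat");
  channel.onmessage = (event) => console.log("peer says:", event.data);

  // Render whatever media the remote peer sends back.
  pc.ontrack = (event) => {
    const remote = document.getElementById("remote") as HTMLVideoElement;
    remote.srcObject = event.streams[0];
  };

  return pc;
}
```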

Take a look at the demos to see both parts of WebRTC in action.

So in a nutshell:

  • WebSockets allow full-duplex communication between a browser and a web server (sketched just below this list).
  • WebRTC's PeerConnection allows full-duplex communication between two browsers.
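
To make the first bullet concrete, here's what the WebSocket side looks like; the URL is a placeholder:

```typescript
// A persistent, full-duplex browser-to-server connection.
const ws = new WebSocket("wss://example.com/socket");

ws.onopen = () => {
  // Either side can send at any time once the connection is open.
  ws.send(JSON.stringify({ type: "hello" }));
};

ws.onmessage = (event) => {
  // The server can push messages without the browser polling.
  console.log("server says:", event.data);
};
```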

WebRTC uses RTP (a UDP-based protocol) for the media transport, but requires an out-of-band signaling channel to set up the communication. One option for that signaling channel is a WebSocket.
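
To illustrate, here's a rough sketch of WebSocket signaling that ties the two together. The JSON message shapes ("offer"/"answer"/"candidate") are my own convention; WebRTC deliberately doesn't mandate a signaling format:

```typescript
// Exchange SDP offers/answers and ICE candidates through the server
// over a WebSocket; the media itself then flows peer-to-peer.
async function signalOverWebSocket(pc: RTCPeerConnection, ws: WebSocket): Promise<void> {
  // Relay our ICE candidates to the remote peer via the server.
  pc.onicecandidate = (event) => {
    if (event.candidate) {
      ws.send(JSON.stringify({ type: "candidate", candidate: event.candidate }));
    }
  };

  ws.onmessage = async (event) => {
    const msg = JSON.parse(event.data);
    if (msg.type === "offer") {
      // Callee side: accept the offer and reply with an answer.
      await pc.setRemoteDescription(msg.sdp);
      const answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);
      ws.send(JSON.stringify({ type: "answer", sdp: pc.localDescription }));
    } else if (msg.type === "answer") {
      await pc.setRemoteDescription(msg.sdp);
    } else if (msg.type === "candidate") {
      await pc.addIceCandidate(msg.candidate);
    }
  };

  // Caller side: create and send the initial offer. (In a real app
  // only the initiating peer would run this part.)
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  ws.send(JSON.stringify({ type: "offer", sdp: pc.localDescription }));
}
```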