Stream a web camera with WebRTC

Tested on Ubuntu 17.10

In this tutorial we first create a streaming source by:

  • Acquiring media content from a local (USB) camera;
  • Encapsulating it in an RTP stream;
  • Creating a pstreamer P2P overlay;
  • Injecting the RTP stream into the overlay.

Then, we launch PeerStreamer-ng to serve the P2P stream to users through a WebRTC-capable browser. Refer to the Building section for instructions on getting PeerStreamer-ng up and running.

The source

The following script captures a USB camera (/dev/video0) with FFmpeg and streams it locally over RTP. Note that, in order to serve the stream through WebRTC, FFmpeg must transcode the video to VP8. The script also launches a source instance of pstreamer, which takes this RTP stream as input and distributes it to the attaching peers through the P2P overlay.
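Before running the full script, you may want to verify that FFmpeg can grab your camera and encode it to VP8. A minimal sanity check, assuming a Video4Linux2 device at /dev/video0 (this check is not part of the script itself), could look like this:

# Record about 5 seconds from the camera, encode it to VP8 and save it as WebM
ffmpeg -f v4l2 -i /dev/video0 -vcodec libvpx -deadline realtime -an -t 5 test.webm

# Inspect the result to confirm the video stream is VP8
ffprobe test.webm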

Overall, this script creates a PeerStreamer-ng channel at IP address 127.0.0.1 on port 6000 and saves its description in a local channel list file called channels.csv, which we will later feed to PeerStreamer-ng.

NAME="My Channel"
VIDEO=/dev/video0
SDPFILE="file://${PWD}/channel.sdp"
SOURCE_PEER_PORT=6000
RTP_BASE_PORT=5002
HOST_EXT_IP=127.0.0.1

echo "$NAME,$HOST_EXT_IP,$SOURCE_PEER_PORT,QUALITY,http://$HOST_EXT_IP:8000/channel.sdp" > channels.csv

# Transcode the camera feed to VP8 (required for WebRTC) and stream it over RTP,
# writing the session description to $SDPFILE
ffmpeg -re -i ${VIDEO} -vcodec libvpx -deadline realtime -an -f rtp rtp://127.0.0.1:$((RTP_BASE_PORT+2))\
	-sdp_file $SDPFILE &
FFMPEG_PID=$!

# Launch a pstreamer source that takes the RTP stream as input (rtp chunkiser)
# and distributes it over the P2P overlay
Libs/pstreamer/pstreamer -p 0 -c "iface=lo,port=$SOURCE_PEER_PORT,chunkiser=rtp,audio=$RTP_BASE_PORT,\
	video=$((RTP_BASE_PORT+2)),addr=127.0.0.1,max_delay_ms=50" &
SOURCE_PEER_PID=$!

# Stop both processes when the script is interrupted or exits
trap "kill $SOURCE_PEER_PID $FFMPEG_PID" SIGINT SIGTERM EXIT

wait

You can download the script above from the project site.

PeerStreamer-ng

The following command launches PeerStreamer-ng and starts its HTTP interface. Pages are served on port 3000 by default (this can be overridden through command-line flags; see the command-line help).

./peerstreamer-ng -c channels.csv
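Once it is running, a quick check from another terminal (assuming the default port 3000) should return a response from the embedded web server:

curl http://127.0.0.1:3000/

You can then point a WebRTC-capable browser at the same address to play the stream.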