Overview
RTSP (Real Time Streaming Protocol) is a network control protocol for establishing and controlling media sessions between a client and a streaming server. It's commonly used by IP cameras, surveillance systems, and professional broadcast equipment.
Key Benefits
Widely supported: Standard protocol for IP cameras and surveillance
Low overhead: Efficient streaming with minimal latency
Bidirectional: Supports commands and control (play, pause, seek)
Transport flexibility: Works over UDP, TCP, or HTTP
Professional grade: Used in broadcast and production environments
Codec agnostic: Supports H.264, H.265, MJPEG, and more
Pulling RTSP Streams
Ant Media Server can pull RTSP streams from cameras and convert them to WebRTC, HLS, DASH, or RTMP.
Source: StreamFetcher.java
Via REST API
Create Stream Source
# Add RTSP stream source
curl -X POST "https://your-server.com:5443/WebRTCAppEE/rest/v2/broadcasts/create" \
-H "Content-Type: application/json" \
-d '{
"name": "IP Camera 1",
"streamId": "camera1",
"type": "streamSource",
"streamUrl": "rtsp://192.168.1.100:554/stream1"
}'
# With authentication
curl -X POST "https://your-server.com:5443/WebRTCAppEE/rest/v2/broadcasts/create" \
-H "Content-Type: application/json" \
-d '{
"name": "Authenticated Camera",
"streamId": "camera2",
"type": "streamSource",
"streamUrl": "rtsp://username:password@192.168.1.100:554/stream"
}'
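Once created, a stream source is started and stopped through the same REST API. A minimal Node.js sketch — the server address and stream IDs are placeholders, and Node 18+ is assumed for the built-in fetch:

```javascript
// Build the start/stop control endpoints for a stream source.
// SERVER is a placeholder; replace with your deployment's base URL.
const SERVER = "https://your-server.com:5443/WebRTCAppEE";

function controlUrl(streamId, action) {
  // action is "start" or "stop"
  return `${SERVER}/rest/v2/broadcasts/${encodeURIComponent(streamId)}/${action}`;
}

// Usage (Node 18+):
// await fetch(controlUrl("camera1", "start"), { method: "POST" });
// await fetch(controlUrl("camera1", "stop"),  { method: "POST" });
console.log(controlUrl("camera1", "start"));
```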
Via Web Panel
Navigate to Ant Media Server web panel
Go to Application → New Live Stream → Stream Source
Enter stream details:
Name: Camera name
Stream URL: rtsp://192.168.1.100:554/stream
Stream Type: Stream Source
Click Create
Click Start Broadcasting
Common Camera URL Patterns
Exact RTSP paths vary by vendor — Hikvision, Dahua, Axis, Foscam, Ubiquiti/UniFi, and Reolink each use different defaults — so check your camera's manual. The generic format is:
rtsp://[username:password@]host[:port]/path
# Examples:
rtsp://192.168.1.100:554/stream1
rtsp://admin:password@192.168.1.100:554/live/ch0
rtsp://camera.local/axis-media/media.amp
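A sketch of building vendor URLs from commonly documented default paths. The paths below are widely published defaults, not guarantees — firmware versions change them, so verify against your camera's manual:

```javascript
// Commonly documented default RTSP paths per vendor (assumptions to
// verify against your camera's documentation).
const DEFAULT_PATHS = {
  hikvision: "/Streaming/Channels/101",               // main stream, channel 1
  dahua:     "/cam/realmonitor?channel=1&subtype=0",  // main stream
  axis:      "/axis-media/media.amp",
};

function cameraUrl(vendor, host, port = 554) {
  return `rtsp://${host}:${port}${DEFAULT_PATHS[vendor]}`;
}

console.log(cameraUrl("hikvision", "192.168.1.100"));
```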
Playing RTSP Streams
Once pulled into Ant Media Server, RTSP streams are available as:
WebRTC Player
HLS Player
FFplay
VLC
<script src="https://your-server.com:5443/WebRTCAppEE/js/webrtc_adaptor.js"></script>
<video id="remoteVideo" autoplay controls></video>
<script>
const webRTCAdaptor = new WebRTCAdaptor({
  websocket_url: "wss://your-server.com:5443/WebRTCAppEE/websocket",
  mediaConstraints: { video: false, audio: false },
  peerconnection_config: {
    iceServers: [{ urls: "stun:stun1.l.google.com:19302" }]
  },
  sdp_constraints: {
    OfferToReceiveAudio: true,
    OfferToReceiveVideo: true
  },
  remoteVideoId: "remoteVideo",
  callback: (info) => {
    if (info === "initialized") {
      webRTCAdaptor.play("camera1");
    }
  }
});
</script>
Configuration Options
RTSP URL Parameters
Ant Media Server supports RTSP URL parameters for advanced configuration:
rtsp://camera.ip/stream?param=value
# Allowed media types (video, audio, or both)
rtsp://192.168.1.100:554/stream?allowed_media_types=video
rtsp://192.168.1.100:554/stream?allowed_media_types=audio
rtsp://192.168.1.100:554/stream?allowed_media_types=video+audio
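A small helper for appending this parameter while respecting any existing query string (a sketch; the parameter name comes from the StreamFetcher source referenced below):

```javascript
// Append allowed_media_types to an RTSP URL, using "&" if the URL
// already carries a query string and "?" otherwise.
function withMediaTypes(url, types) {
  const sep = url.includes("?") ? "&" : "?";
  return `${url}${sep}allowed_media_types=${types.join("+")}`;
}

console.log(withMediaTypes("rtsp://192.168.1.100:554/stream", ["video"]));
```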
Source: StreamFetcher.java:176-199
Transport Protocol
# RTSP transport options
rtsp_transport=tcp   # TCP (more reliable, higher latency)
rtsp_transport=udp   # UDP (lower latency, less reliable)
rtsp_transport=http  # HTTP tunneling (firewall-friendly)
Timeout Settings
# Stream fetcher buffer time (milliseconds; 0 disables buffering)
settings.streamFetcherBufferTime=0
# Connection timeout
connection_timeout=10000000  # 10 seconds in microseconds
Advanced FFmpeg Options
# Pull RTSP with custom options
ffmpeg -rtsp_transport tcp \
-allowed_media_types video \
-i rtsp://192.168.1.100:554/stream \
-c copy \
-f flv rtmp://localhost/WebRTCAppEE/camera1
# With buffering control
ffmpeg -rtsp_transport tcp \
-rtsp_flags prefer_tcp \
-fflags nobuffer \
-flags low_delay \
-i rtsp://192.168.1.100:554/stream \
-c:v copy -c:a copy \
-f flv rtmp://localhost/WebRTCAppEE/camera1
Troubleshooting
Connection Issues
Cannot Connect to Camera:
# Test RTSP connectivity
ffprobe -rtsp_transport tcp rtsp://192.168.1.100:554/stream
# Check camera is reachable
ping 192.168.1.100
# Test port
telnet 192.168.1.100 554
nc -zv 192.168.1.100 554
# Try different transports
ffplay -rtsp_transport tcp rtsp://192.168.1.100:554/stream
ffplay -rtsp_transport udp rtsp://192.168.1.100:554/stream
Authentication Errors:
# Verify credentials
ffprobe rtsp://admin:password@192.168.1.100:554/stream
# URL-encode special characters in the password
# Example: "p@ssword" becomes "p%40ssword"
rtsp://admin:p%40ssword@192.168.1.100:554/stream
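In Node.js, encodeURIComponent performs this encoding:

```javascript
// Percent-encode RTSP credentials so reserved characters ("@", ":",
// "/") cannot break URL parsing on the server side.
function rtspUrl(user, password, host, path) {
  return `rtsp://${encodeURIComponent(user)}:${encodeURIComponent(password)}@${host}${path}`;
}

console.log(rtspUrl("admin", "p@ssword", "192.168.1.100:554", "/stream"));
// "@" in the password is encoded as %40
```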
Timeout Issues:
# Increase the socket timeout (-stimeout takes microseconds;
# recent FFmpeg versions use -timeout for RTSP instead)
ffmpeg -stimeout 5000000 \
-rtsp_transport tcp \
-i rtsp://192.168.1.100:554/stream \
-c copy output.mp4
# Check server logs
tail -f /usr/local/antmedia/log/antmedia-error.log | grep -i rtsp
Stream Quality Issues
Pixelation/Artifacts:
# Check stream codec info
ffprobe -v error -select_streams v:0 \
-show_entries stream=codec_name,width,height,bit_rate \
rtsp://192.168.1.100:554/stream
# Transcode if needed
ffmpeg -i rtsp://192.168.1.100:554/stream \
-c:v libx264 -preset fast -b:v 2000k \
-c:a aac -b:a 128k \
-f flv rtmp://localhost/WebRTCAppEE/camera1
Packet Loss (UDP):
# Switch to TCP
ffplay -rtsp_transport tcp rtsp://192.168.1.100:554/stream
# Increase buffer
ffplay -rtsp_transport udp \
-buffer_size 1024000 \
rtsp://192.168.1.100:554/stream
Audio/Video Sync Issues:
# Check timestamps
ffprobe -show_frames rtsp://192.168.1.100:554/stream | grep -E "pkt_pts_time|media_type"
# Fix sync with re-encoding
ffmpeg -i rtsp://192.168.1.100:554/stream \
-async 1 \
-vsync 1 \
-c:v libx264 -c:a aac \
-f flv rtmp://localhost/WebRTCAppEE/camera1
Reduce Latency:
# Disable buffering
settings.streamFetcherBufferTime=0
# Use TCP for reliability
rtsp_transport=tcp
# Minimize HLS latency
settings.hlsTime=2
settings.hlsListSize=3
Handle Multiple Streams:
# Monitor system resources
top -p $(pgrep -d ',' java)
# Check stream count
curl http://localhost:5080/WebRTCAppEE/rest/v2/broadcasts/count
# Optimize encoder settings for IP cameras
settings.encoderSettings=[]
# Use direct copy when possible
Best Practices
Production Recommendations
Use TCP transport : More reliable than UDP for most networks
Monitor stream health : Set up webhooks for stream failures
Implement reconnection : Configure automatic restart on failure
Use substreams : Pull lower resolution for bandwidth savings
Secure credentials : Use environment variables or vault for passwords
Network isolation : Keep cameras on separate VLAN
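The reconnection recommendation above can be sketched as a simple poll against the REST API. This is a hedged sketch: the "broadcasting" status value and the GET endpoint shape are assumptions to verify against your server version's REST reference:

```javascript
// Decide whether a stream source needs a restart based on the
// broadcast object returned by GET /rest/v2/broadcasts/{id}.
// The "broadcasting" status string is an assumption to verify.
function needsRestart(broadcast) {
  return broadcast == null || broadcast.status !== "broadcasting";
}

// Poll loop sketch (Node 18+; SERVER is your deployment's base URL):
// setInterval(async () => {
//   const res = await fetch(`${SERVER}/rest/v2/broadcasts/camera1`);
//   if (needsRestart(await res.json())) {
//     await fetch(`${SERVER}/rest/v2/broadcasts/camera1/start`, { method: "POST" });
//   }
// }, 30000);
```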
Camera Configuration
Optimal Settings:
Resolution : 1280x720 or 1920x1080
Frame Rate : 15-30 fps
Bitrate : 1000-4000 Kbps
Codec : H.264 (baseline or main profile)
GOP : 60 frames (2 seconds at 30fps)
Audio : AAC 128kbps (if needed)
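The GOP recommendation follows directly from the frame rate: keyframe interval in frames = fps × keyframe period in seconds. A quick check:

```javascript
// GOP size for a given keyframe period; a 2-second period aligns
// keyframes with the settings.hlsTime=2 segment length used above.
function gopSize(fps, seconds = 2) {
  return Math.round(fps * seconds);
}

console.log(gopSize(30)); // 60 frames at 30 fps
```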
Disable on Camera:
Motion detection overlays
Timestamp overlays
Privacy masks (if not needed)
Audio (if not required)
Security
Change Default Credentials
Network Segmentation
Monitor Access
# Never use default passwords
# Common defaults to avoid:
admin:admin
admin:12345
root:root
admin:(empty)
Integration Examples
Motion Detection Integration
// Start/stop camera based on motion detection
const axios = require('axios');

const startCamera = async (cameraId) => {
  await axios.post(
    `https://your-server.com:5443/WebRTCAppEE/rest/v2/broadcasts/${cameraId}/start`
  );
  console.log(`Camera ${cameraId} started`);
};

const stopCamera = async (cameraId) => {
  await axios.post(
    `https://your-server.com:5443/WebRTCAppEE/rest/v2/broadcasts/${cameraId}/stop`
  );
  console.log(`Camera ${cameraId} stopped`);
};

// Motion detected
motionSensor.on('motion', () => {
  startCamera('camera1');
  // Stop after 5 minutes
  setTimeout(() => stopCamera('camera1'), 5 * 60 * 1000);
});
Recording Schedule
#!/bin/bash
# Start cameras during business hours
START_HOUR=9
END_HOUR=18
CURRENT_HOUR=$(date +%H)

if [ "$CURRENT_HOUR" -ge "$START_HOUR" ] && [ "$CURRENT_HOUR" -lt "$END_HOUR" ]; then
  # Start all cameras
  for i in {1..10}; do
    curl -X POST "https://your-server.com:5443/WebRTCAppEE/rest/v2/broadcasts/camera$i/start"
  done
else
  # Stop all cameras
  for i in {1..10}; do
    curl -X POST "https://your-server.com:5443/WebRTCAppEE/rest/v2/broadcasts/camera$i/stop"
  done
fi
Multi-Camera Dashboard
<!DOCTYPE html>
<html>
<head>
  <title>Camera Dashboard</title>
  <style>
    .camera-grid {
      display: grid;
      grid-template-columns: repeat(auto-fit, minmax(400px, 1fr));
      gap: 10px;
      padding: 10px;
    }
    video { width: 100%; height: auto; }
  </style>
</head>
<body>
  <div class="camera-grid" id="cameras"></div>
  <script src="https://your-server.com:5443/WebRTCAppEE/js/webrtc_adaptor.js"></script>
  <script>
    const cameras = ['camera1', 'camera2', 'camera3', 'camera4'];
    const grid = document.getElementById('cameras');

    cameras.forEach(cameraId => {
      const container = document.createElement('div');
      container.innerHTML = `
        <h3>${cameraId}</h3>
        <video id="${cameraId}" autoplay controls></video>
      `;
      grid.appendChild(container);

      // Initialize WebRTC for each camera
      const adaptor = new WebRTCAdaptor({
        websocket_url: "wss://your-server.com:5443/WebRTCAppEE/websocket",
        mediaConstraints: { video: false, audio: false },
        remoteVideoId: cameraId,
        callback: (info) => {
          if (info === "initialized") {
            adaptor.play(cameraId);
          }
        }
      });
    });
  </script>
</body>
</html>
RTSP Server (Publishing)
RTSP server functionality for accepting RTSP streams (publishing) requires custom development or third-party integration. Ant Media Server natively supports RTSP as a client (pulling) but not as an RTSP server (receiving).
For RTSP publishing, consider:
Use RTMP for publishing to Ant Media Server
Use WebRTC for browser-based publishing
Use SRT for reliable contribution feeds
Deploy an RTSP server (like rtsp-simple-server) and pull from it
Resources