X-Git-Url: https://git.sesse.net/?a=blobdiff_plain;f=doc%2Fprotocols.texi;h=cc35982e77aadd165cdaff04b463194dee222fb2;hb=64db1a82d663958593f66ff7bf351d6a670e51a4;hp=a9ef517b24c5bf74594843cc07dc15d8d9222f3a;hpb=61c089a81b76df63c7a141f51aba7de03841489f;p=ffmpeg

diff --git a/doc/protocols.texi b/doc/protocols.texi
index a9ef517b24c..cc35982e77a 100644
--- a/doc/protocols.texi
+++ b/doc/protocols.texi
@@ -1,10 +1,10 @@
@chapter Protocols
@c man begin PROTOCOLS

-Protocols are configured elements in FFmpeg which allow to access
+Protocols are configured elements in Libav which allow access to
resources which require the use of a particular protocol.

-When you configure your FFmpeg build, all the supported protocols are
+When you configure your Libav build, all the supported protocols are
enabled by default. You can list all available ones using the
configure option "--list-protocols".

@@ -19,6 +19,22 @@ supported protocols.

A description of the currently available protocols follows.

+@section applehttp
+
+Read an Apple HTTP Live Streaming compliant segmented stream as
+a uniform one. The M3U8 playlists describing the segments can be
+remote HTTP resources or local files, accessed using the standard
+file protocol.
+HTTP is the default; a specific protocol can be selected by specifying
+"+@var{proto}" after the applehttp URI scheme name, where @var{proto}
+is either "file" or "http".
+
+@example
+applehttp://host/path/to/remote/resource.m3u8
+applehttp+http://host/path/to/remote/resource.m3u8
+applehttp+file://path/to/local/resource.m3u8
+@end example
+
@section concat

Physical concatenation protocol.

@@ -36,10 +52,10 @@ resource to be concatenated, each one possibly specifying a distinct
protocol.

For example to read a sequence of files @file{split1.mpeg},
-@file{split2.mpeg}, @file{split3.mpeg} with @file{ffplay} use the
+@file{split2.mpeg}, @file{split3.mpeg} with @file{avplay} use the
command:
@example
-ffplay concat:split1.mpeg\|split2.mpeg\|split3.mpeg
+avplay concat:split1.mpeg\|split2.mpeg\|split3.mpeg
@end example

Note that you may need to escape the character "|" which is special for
@@ -167,10 +183,10 @@ application specified in @var{app}, may be prefixed by "mp4:".

@end table

-For example to read with @file{ffplay} a multimedia resource named
+For example to read with @file{avplay} a multimedia resource named
"sample" from the application "vod" from an RTMP server "myserver":
@example
-ffplay rtmp://myserver/vod/sample
+avplay rtmp://myserver/vod/sample
@end example

@section rtmp, rtmpe, rtmps, rtmpt, rtmpte

@@ -208,9 +224,9 @@ For example, to stream a file in real-time to an RTMP server using
@example
ffmpeg -re -i myfile -f flv rtmp://myserver/live/mystream
@end example

-To play the same stream using @file{ffplay}:
+To play the same stream using @file{avplay}:
@example
-ffplay "rtmp://myserver/live/mystream live=1"
+avplay "rtmp://myserver/live/mystream live=1"
@end example

@section rtp

@@ -226,7 +242,7 @@ data transferred over RDT).

The muxer can be used to send a stream using RTSP ANNOUNCE to a server
supporting it (currently Darwin Streaming Server and Mischa Spiegelmock's
-RTSP server, @url{http://github.com/revmischa/rtsp-server}).
+@uref{http://github.com/revmischa/rtsp-server, RTSP server}).

The required syntax for a RTSP url is:
@example
@@ -251,6 +267,9 @@ Use UDP multicast as lower transport protocol.
@item http
Use HTTP tunneling as lower transport protocol, which is useful for
passing proxies.
+
+@item filter_src
+Accept packets only from the negotiated peer address and port (see the
+example below this table).

@end table
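For instance, assuming @code{filter_src} is appended to the URL query in the
same way as the lower transport options above, a command along the following
lines could be used to ignore packets arriving from any other source:

@example
avplay rtsp://server/video.mp4?filter_src
@end example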
Multiple lower transport protocols may be specified, in that case they are
@@ -262,7 +281,7 @@ When receiving data over UDP, the demuxer tries to reorder received packets
order for this to be enabled, a maximum delay must be specified in the
@code{max_delay} field of AVFormatContext.

-When watching multi-bitrate Real-RTSP streams with @file{ffplay}, the
+When watching multi-bitrate Real-RTSP streams with @file{avplay}, the
streams to display can be chosen with @code{-vst} @var{n} and
@code{-ast} @var{n} for video and audio respectively, and can be
switched on the fly by pressing @code{v} and @code{a}.
@@ -272,13 +291,13 @@ Example command lines:
To watch a stream over UDP, with a max reordering delay of 0.5 seconds:

@example
-ffplay -max_delay 500000 rtsp://server/video.mp4?udp
+avplay -max_delay 500000 rtsp://server/video.mp4?udp
@end example

To watch a stream tunneled over HTTP:

@example
-ffplay rtsp://server/video.mp4?http
+avplay rtsp://server/video.mp4?http
@end example

To send a stream in realtime to a RTSP server, for others to watch:
@@ -290,10 +309,12 @@ ffmpeg -re -i @var{input} -f rtsp -muxdelay 0.1 rtsp://server/live.sdp
@section sap

Session Announcement Protocol (RFC 2974). This is not technically a
-protocol handler in libavformat, it is a muxer.
+protocol handler in libavformat; it is a muxer and demuxer.
It is used for signalling of RTP streams, by announcing the SDP for the
streams regularly on a separate port.

+@subsection Muxer
+
The syntax for a SAP url given to the muxer is:
@example
sap://@var{destination}[:@var{port}][?@var{options}]
@@ -325,6 +346,8 @@ If set to 1, send all RTP streams on the same port pair. If zero (the
default), all streams are sent on unique ports, with each stream on a port 2
numbers higher than the previous.
VLC/Live555 requires this to be set to 1, to be able to receive the stream.
+The RTP stack in libavformat for receiving requires all streams to be sent
+on unique ports.
@end table

Example command lines follow.

@@ -335,10 +358,67 @@ To broadcast a stream on the local subnet, for watching in VLC:
ffmpeg -re -i @var{input} -f sap sap://224.0.0.255?same_port=1
@end example

+Similarly, for watching in avplay:
+
+@example
+ffmpeg -re -i @var{input} -f sap sap://224.0.0.255
+@end example
+
+And for watching in avplay, over IPv6:
+
+@example
+ffmpeg -re -i @var{input} -f sap sap://[ff0e::1:2:3:4]
+@end example
+
+@subsection Demuxer
+
+The syntax for a SAP url given to the demuxer is:
+@example
+sap://[@var{address}][:@var{port}]
+@end example
+
+@var{address} is the multicast address to listen for announcements on;
+if omitted, the default 224.2.127.254 (sap.mcast.net) is used. @var{port}
+is the port that is listened on, 9875 if omitted.
+
+The demuxer listens for announcements on the given address and port.
+Once an announcement is received, it tries to receive that particular stream.
+
+Example command lines follow.
+
+To play back the first stream announced on the normal SAP multicast address:
+
+@example
+avplay sap://
+@end example
+
+To play back the first stream announced on the default IPv6 SAP multicast
+address:
+
+@example
+avplay sap://[ff0e::2:7ffe]
+@end example
+
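The address and port can also be given explicitly. For instance, listening on
the default SAP address and port spelled out in full is equivalent to the
plain @code{sap://} form above:

@example
avplay sap://224.2.127.254:9875
@end example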
@section tcp

Transmission Control Protocol.

+The required syntax for a TCP url is:
+@example
+tcp://@var{hostname}:@var{port}[?@var{options}]
+@end example
+
+@table @option
+
+@item listen
+Listen for an incoming connection.
+
+@example
+ffmpeg -i @var{input} -f @var{format} tcp://@var{hostname}:@var{port}?listen
+avplay tcp://@var{hostname}:@var{port}
+@end example
+
+@end table
+
@section udp

User Datagram Protocol.
@@ -370,10 +450,14 @@ set the time to live value (for multicast only)

@item connect=@var{1|0}
Initialize the UDP socket with @code{connect()}. In this case, the
-destination address can't be changed with udp_set_remote_url later.
+destination address can't be changed with ff_udp_set_remote_url later.
+If the destination address isn't known at the start, this option can
+be specified in ff_udp_set_remote_url, too.
This allows finding out the source address for the packets with getsockname,
and makes writes return with AVERROR(ECONNREFUSED) if "destination
unreachable" is received.
+For receiving, this gives the benefit of only receiving packets from
+the specified peer address/port.
@end table

Some usage examples of the udp protocol with @file{ffmpeg} follow.
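For instance, the @code{connect} option described above might be exercised
with a command along these lines (an illustrative sketch; @var{input},
@var{hostname} and @var{port} are placeholders, and @code{mpegts} is just
one possible output format):

@example
ffmpeg -re -i @var{input} -f mpegts udp://@var{hostname}:@var{port}?connect=1
@end example

With the socket connected, writes return AVERROR(ECONNREFUSED) when the
receiver reports the destination as unreachable, as described above.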