X-Git-Url: https://git.sesse.net/?a=blobdiff_plain;f=doc%2Fffmpeg-doc.texi;h=e43e87ea0ace5d77371b46b0baffc8c9b2fc04a9;hb=7a8bfa5d674922d4413d403b059fe183deb7ddbe;hp=0f62bdad71a7c1343c07bac7e24d4e79d6a97a05;hpb=bbc35f515f9f699067b6337003faebd02781acd3;p=ffmpeg diff --git a/doc/ffmpeg-doc.texi b/doc/ffmpeg-doc.texi index 0f62bdad71a..e43e87ea0ac 100644 --- a/doc/ffmpeg-doc.texi +++ b/doc/ffmpeg-doc.texi @@ -7,8 +7,18 @@ @sp 3 @end titlepage +@chapter Synopsis -@chapter Introduction +The generic syntax is: + +@example +@c man begin SYNOPSIS +ffmpeg [[infile options][@option{-i} @var{infile}]]... @{[outfile options] @var{outfile}@}... +@c man end +@end example + +@chapter Description +@c man begin DESCRIPTION FFmpeg is a very fast video and audio converter. It can also grab from a live audio/video source. @@ -21,139 +31,6 @@ bitrate you want. FFmpeg can also convert from any sample rate to any other, and resize video on the fly with a high quality polyphase filter. -@chapter Quick Start - -@c man begin EXAMPLES -@section Video and Audio grabbing - -FFmpeg can grab video and audio from devices given that you specify the input -format and device. - -@example -ffmpeg -f oss -i /dev/dsp -f video4linux2 -i /dev/video0 /tmp/out.mpg -@end example - -Note that you must activate the right video source and channel before -launching FFmpeg with any TV viewer such as xawtv -(@url{http://bytesex.org/xawtv/}) by Gerd Knorr. You also -have to set the audio recording levels correctly with a -standard mixer. - -@section X11 grabbing - -FFmpeg can grab the X11 display. - -@example -ffmpeg -f x11grab -s cif -i :0.0 /tmp/out.mpg -@end example - -0.0 is display.screen number of your X11 server, same as -the DISPLAY environment variable. - -@example -ffmpeg -f x11grab -s cif -i :0.0+10,20 /tmp/out.mpg -@end example - -0.0 is display.screen number of your X11 server, same as the DISPLAY environment -variable. 10 is the x-offset and 20 the y-offset for the grabbing. - -@section Video and Audio file format conversion - -* FFmpeg can use any supported file format and protocol as input: - -Examples: - -* You can use YUV files as input: - -@example -ffmpeg -i /tmp/test%d.Y /tmp/out.mpg -@end example - -It will use the files: -@example -/tmp/test0.Y, /tmp/test0.U, /tmp/test0.V, -/tmp/test1.Y, /tmp/test1.U, /tmp/test1.V, etc... -@end example - -The Y files use twice the resolution of the U and V files. They are -raw files, without header. They can be generated by all decent video -decoders. You must specify the size of the image with the @option{-s} option -if FFmpeg cannot guess it. - -* You can input from a raw YUV420P file: - -@example -ffmpeg -i /tmp/test.yuv /tmp/out.avi -@end example - -test.yuv is a file containing raw YUV planar data. Each frame is composed -of the Y plane followed by the U and V planes at half vertical and -horizontal resolution. - -* You can output to a raw YUV420P file: - -@example -ffmpeg -i mydivx.avi hugefile.yuv -@end example - -* You can set several input files and output files: - -@example -ffmpeg -i /tmp/a.wav -s 640x480 -i /tmp/a.yuv /tmp/a.mpg -@end example - -Converts the audio file a.wav and the raw YUV video file a.yuv -to MPEG file a.mpg. - -* You can also do audio and video conversions at the same time: - -@example -ffmpeg -i /tmp/a.wav -ar 22050 /tmp/a.mp2 -@end example - -Converts a.wav to MPEG audio at 22050Hz sample rate. 
- -* You can encode to several formats at the same time and define a -mapping from input stream to output streams: - -@example -ffmpeg -i /tmp/a.wav -ab 64k /tmp/a.mp2 -ab 128k /tmp/b.mp2 -map 0:0 -map 0:0 -@end example - -Converts a.wav to a.mp2 at 64 kbits and to b.mp2 at 128 kbits. '-map -file:index' specifies which input stream is used for each output -stream, in the order of the definition of output streams. - -* You can transcode decrypted VOBs: - -@example -ffmpeg -i snatch_1.vob -f avi -vcodec mpeg4 -b 800k -g 300 -bf 2 -acodec libmp3lame -ab 128k snatch.avi -@end example - -This is a typical DVD ripping example; the input is a VOB file, the -output an AVI file with MPEG-4 video and MP3 audio. Note that in this -command we use B-frames so the MPEG-4 stream is DivX5 compatible, and -GOP size is 300 which means one intra frame every 10 seconds for 29.97fps -input video. Furthermore, the audio stream is MP3-encoded so you need -to enable LAME support by passing @code{--enable-libmp3lame} to configure. -The mapping is particularly useful for DVD transcoding -to get the desired audio language. - -NOTE: To see the supported input formats, use @code{ffmpeg -formats}. -@c man end - -@chapter Invocation - -@section Syntax - -The generic syntax is: - -@example -@c man begin SYNOPSIS -ffmpeg [[infile options][@option{-i} @var{infile}]]... @{[outfile options] @var{outfile}@}... -@c man end -@end example -@c man begin DESCRIPTION As a general rule, options are applied to the next specified file. Therefore, order is important, and you can have the same option on the command line multiple times. Each occurrence is @@ -164,19 +41,15 @@ then applied to the next input or output file. ffmpeg -i input.avi -b 64k output.avi @end example -* To force the frame rate of the input and output file to 24 fps: -@example -ffmpeg -r 24 -i input.avi output.avi -@end example - * To force the frame rate of the output file to 24 fps: @example ffmpeg -i input.avi -r 24 output.avi @end example -* To force the frame rate of input file to 1 fps and the output file to 24 fps: +* To force the frame rate of the input file (valid for raw formats only) +to 1 fps and the frame rate of the output file to 24 fps: @example -ffmpeg -r 1 -i input.avi -r 24 output.avi +ffmpeg -r 1 -i input.m2v -r 24 output.avi @end example The format option may be needed for raw input files. @@ -184,23 +57,17 @@ The format option may be needed for raw input files. By default, FFmpeg tries to convert as losslessly as possible: It uses the same audio and video parameters for the outputs as the one specified for the inputs. -@c man end -@c man begin OPTIONS -@section Main options +@c man end DESCRIPTION -@table @option -@item -L -Show license. +@chapter Options +@c man begin OPTIONS -@item -h -Show help. +@include fftools-common-opts.texi -@item -version -Show version. +@section Main options -@item -formats -Show available formats, codecs, protocols, ... +@table @option @item -f @var{fmt} Force format. @@ -231,29 +98,25 @@ The offset is added to the timestamps of the input files. Specifying a positive offset means that the corresponding streams are delayed by 'offset' seconds. -@item -title @var{string} -Set the title. - @item -timestamp @var{time} -Set the timestamp. - -@item -author @var{string} -Set the author. - -@item -copyright @var{string} -Set the copyright. - -@item -comment @var{string} -Set the comment. - -@item -album @var{string} -Set the album. +Set the recording timestamp in the container. 
+The syntax for @var{time} is: +@example +now|([(YYYY-MM-DD|YYYYMMDD)[T|t| ]]((HH[:MM[:SS[.m...]]])|(HH[MM[SS[.m...]]]))[Z|z]) +@end example +If the value is "now" it takes the current time. +Time is local time unless 'Z' or 'z' is appended, in which case it is +interpreted as UTC. +If the year-month-day part is not specified it takes the current +year-month-day. -@item -track @var{number} -Set the track. +@item -metadata @var{key}=@var{value} +Set a metadata key/value pair. -@item -year @var{number} -Set the year. +For example, for setting the title in the output file: +@example +ffmpeg -i in.avi -metadata title="my title" out.flv +@end example @item -v @var{number} Set the logging verbosity level. @@ -309,6 +172,8 @@ The following abbreviations are recognized: 352x288 @item 4cif 704x576 +@item 16cif +1408x1152 @item qqvga 160x120 @item qvga @@ -361,31 +226,30 @@ The following abbreviations are recognized: @item -aspect @var{aspect} Set aspect ratio (4:3, 16:9 or 1.3333, 1.7777). -@item -croptop @var{size} +@item -croptop @var{size} (deprecated - use the crop filter instead) Set top crop band size (in pixels). -@item -cropbottom @var{size} +@item -cropbottom @var{size} (deprecated - use the crop filter instead) Set bottom crop band size (in pixels). -@item -cropleft @var{size} +@item -cropleft @var{size} (deprecated - use the crop filter instead) Set left crop band size (in pixels). -@item -cropright @var{size} +@item -cropright @var{size} (deprecated - use the crop filter instead) Set right crop band size (in pixels). @item -padtop @var{size} -Set top pad band size (in pixels). @item -padbottom @var{size} -Set bottom pad band size (in pixels). @item -padleft @var{size} -Set left pad band size (in pixels). @item -padright @var{size} -Set right pad band size (in pixels). @item -padcolor @var{hex_color} -Set color of padded bands. The value for padcolor is expressed -as a six digit hexadecimal number where the first two digits -represent red, the middle two digits green and last two digits -blue (default = 000000 (black)). +All the pad options have been removed. Use -vf +pad=width:height:x:y:color instead. @item -vn Disable video recording. @item -bt @var{tolerance} -Set video bitrate tolerance (in bit/s). +Set video bitrate tolerance (in bits, default 4000k). +Has a minimum value of: (target_bitrate/target_framerate). +In 1-pass mode, bitrate tolerance specifies how far ratecontrol is +willing to deviate from the target average bitrate value. This is +not related to min/max bitrate. Lowering tolerance too much has +an adverse effect on quality. @item -maxrate @var{bitrate} Set max video bitrate (in bit/s). Requires -bufsize to be set. @@ -405,17 +269,36 @@ tell that the raw codec data must be copied as is. Use same video quality as source (implies VBR). @item -pass @var{n} -Select the pass number (1 or 2). It is useful to do two pass -encoding. The statistics of the video are recorded in the first -pass and the video is generated at the exact requested bitrate -in the second pass. +Select the pass number (1 or 2). It is used to do two-pass +video encoding. The statistics of the video are recorded in the first +pass into a log file (see also the option -passlogfile), +and in the second pass that log file is used to generate the video +at the exact requested bitrate. 
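+
+For example (the file names, codec and bitrate below are only
+placeholders, not recommendations), a complete two-pass run repeats the
+same encoding options in both invocations and changes only the pass
+number:
+@example
+ffmpeg -i input.avi -vcodec mpeg4 -b 1000k -pass 1 output.avi
+ffmpeg -i input.avi -vcodec mpeg4 -b 1000k -pass 2 output.avi
+@end example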
+On pass 1, you may just deactivate audio and set output to null, +examples for Windows and Unix: +@example +ffmpeg -i foo.mov -vcodec libxvid -pass 1 -an -f rawvideo -y NUL +ffmpeg -i foo.mov -vcodec libxvid -pass 1 -an -f rawvideo -y /dev/null +@end example -@item -passlogfile @var{file} -Set two pass logfile name to @var{file}. +@item -passlogfile @var{prefix} +Set two-pass log file name prefix to @var{prefix}, the default file name +prefix is ``ffmpeg2pass''. The complete file name will be +@file{PREFIX-N.log}, where N is a number specific to the output +stream. @item -newvideo Add a new video stream to the current output stream. +@item -vlang @var{code} +Set the ISO 639 language code (3 letters) of the current video stream. + +@item -vf @var{filter_graph} +@var{filter_graph} is a description of the filter graph to apply to +the input video. +Use the option "-filters" to show all the available filters (including +also sources and sinks). + @end table @section Advanced Video Options @@ -425,7 +308,7 @@ Add a new video stream to the current output stream. Set pixel format. Use 'list' as parameter to show all the supported pixel formats. @item -sws_flags @var{flags} -Set SwScaler flags (only available when compiled with swscale support). +Set SwScaler flags. @item -g @var{gop_size} Set the group of pictures size. @item -intra @@ -441,9 +324,10 @@ maximum video quantizer scale (VBR) @item -qdiff @var{q} maximum difference between the quantizer scales (VBR) @item -qblur @var{blur} -video quantizer scale blur (VBR) +video quantizer scale blur (VBR) (range 0.0 - 1.0) @item -qcomp @var{compression} -video quantizer scale compression (VBR) +video quantizer scale compression (VBR) (default 0.5). +Constant of ratecontrol equation. Recommended range for default rc_eq: 0.0-1.0 @item -lmin @var{lambda} minimum video lagrange factor (VBR) @@ -599,9 +483,6 @@ Calculate PSNR of compressed frames. Dump video coding statistics to @file{vstats_HHMMSS.log}. @item -vstats_file @var{file} Dump video coding statistics to @var{file}. -@item -vhook @var{module} -Insert video processing @var{module}. @var{module} contains the module -name and its parameters separated by spaces. @item -top @var{n} top=1/bottom=0/auto=-1 field first @item -dc @var{precision} @@ -611,7 +492,10 @@ Force video tag/fourcc. @item -qphist Show QP histogram. @item -vbsf @var{bitstream_filter} -Bitstream filters available are "dump_extra", "remove_extra", "noise". +Bitstream filters available are "dump_extra", "remove_extra", "noise", "h264_mp4toannexb", "imxdump", "mjpegadump". +@example +ffmpeg -i h264.mp4 -vcodec copy -vbsf h264_mp4toannexb -an out.h264 +@end example @end table @section Audio Options @@ -623,8 +507,13 @@ Set the number of audio frames to record. Set the audio sampling frequency (default = 44100 Hz). @item -ab @var{bitrate} Set the audio bitrate in bit/s (default = 64k). +@item -aq @var{q} +Set the audio quality (codec-specific, VBR). @item -ac @var{channels} -Set the number of audio channels (default = 1). +Set the number of audio channels. For input streams it is set by +default to 1, for output streams it is set by default to the same +number of audio channels in input. If the input file has audio streams +with different channel count, the behaviour is undefined. @item -an Disable audio recording. @item -acodec @var{codec} @@ -664,6 +553,13 @@ Force subtitle codec ('copy' to copy stream). Add a new subtitle stream to the current output stream. 
@item -slang @var{code} Set the ISO 639 language code (3 letters) of the current subtitle stream. +@item -sn +Disable subtitle recording. +@item -sbsf @var{bitstream_filter} +Bitstream filters available are "mov2textsub", "text2movsub". +@example +ffmpeg -i file.mov -an -vn -sbsf mov2textsub -scodec copy -f rawvideo sub.txt +@end example @end table @section Audio/Video grab options @@ -680,16 +576,20 @@ Synchronize read on input. @section Advanced options @table @option -@item -map input stream id[:input stream id] +@item -map @var{input_stream_id}[:@var{sync_stream_id}] Set stream mapping from input streams to output streams. Just enumerate the input streams in the order you want them in the output. -[input stream id] sets the (input) stream to sync against. +@var{sync_stream_id} if specified sets the input stream to sync +against. @item -map_meta_data @var{outfile}:@var{infile} Set meta data information of @var{outfile} from @var{infile}. @item -debug Print specific debug info. @item -benchmark -Add timings for benchmarking. +Show benchmarking information at the end of an encode. +Shows CPU time used and maximum memory consumption. +Maximum memory consumption is not supported on all systems, +it will usually display as 0 if not supported. @item -dump Dump each input packet. @item -hex @@ -697,7 +597,7 @@ When dumping packets, also dump the payload. @item -bitexact Only use bit exact algorithms (for codec testing). @item -ps @var{size} -Set packet size in bits. +Set RTP payload size in bytes. @item -re Read input at native frame rate. Mainly used to simulate a grab device. @item -loop_input @@ -709,8 +609,15 @@ Repeatedly loop output for formats that support looping such as animated GIF @item -threads @var{count} Thread count. @item -vsync @var{parameter} -Video sync method. Video will be stretched/squeezed to match the timestamps, -it is done by duplicating and dropping frames. With -map you can select from +Video sync method. +0 Each frame is passed with its timestamp from the demuxer to the muxer +1 Frames will be duplicated and dropped to achieve exactly the requested + constant framerate. +2 Frames are passed through with their timestamp or dropped so as to prevent + 2 frames from having the same timestamp +-1 Chooses between 1 and 2 depending on muxer capabilities. This is the default method. + +With -map you can select from which stream the timestamps should be taken. You can leave either video or audio unchanged and sync the remaining stream(s) to the unchanged one. @item -async @var{samples_per_second} @@ -728,9 +635,51 @@ Timestamp discontinuity delta threshold. Set the maximum demux-decode delay. @item -muxpreload @var{seconds} Set the initial demux-decode delay. +@item -streamid @var{output-stream-index}:@var{new-value} +Assign a new value to a stream's stream-id field in the next output file. +All stream-id fields are reset to default for each output file. + +For example, to set the stream 0 PID to 33 and the stream 1 PID to 36 for +an output mpegts file: +@example +ffmpeg -i infile -streamid 0:33 -streamid 1:36 out.ts +@end example @end table -@node FFmpeg formula evaluator +@section Preset files + +A preset file contains a sequence of @var{option}=@var{value} pairs, +one for each line, specifying a sequence of options which would be +awkward to specify on the command line. Lines starting with the hash +('#') character are ignored and are used to provide comments. Check +the @file{ffpresets} directory in the FFmpeg source tree for examples. 
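+
+As an illustration only, such a file could look like the following
+(the keys are standard libavcodec options, and the values are
+arbitrary, not recommended settings):
+@example
+# illustrative video preset: two B-frames, GOP size of 300
+bf=2
+g=300
+qmin=2
+qmax=31
+@end example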
+ +Preset files are specified with the @code{vpre}, @code{apre}, +@code{spre}, and @code{fpre} options. The @code{fpre} option takes the +filename of the preset instead of a preset name as input and can be +used for any kind of codec. For the @code{vpre}, @code{apre}, and +@code{spre} options, the options specified in a preset file are +applied to the currently selected codec of the same type as the preset +option. + +The argument passed to the @code{vpre}, @code{apre}, and @code{spre} +preset options identifies the preset file to use according to the +following rules: + +First ffmpeg searches for a file named @var{arg}.ffpreset in the +directories @file{$FFMPEG_DATADIR} (if set), and @file{$HOME/.ffmpeg}, and in +the datadir defined at configuration time (usually @file{PREFIX/share/ffmpeg}) +in that order. For example, if the argument is @code{libx264-max}, it will +search for the file @file{libx264-max.ffpreset}. + +If no such file is found, then ffmpeg will search for a file named +@var{codec_name}-@var{arg}.ffpreset in the above-mentioned +directories, where @var{codec_name} is the name of the codec to which +the preset file options will be applied. For example, if you select +the video codec with @code{-vcodec libx264} and use @code{-vpre max}, +then it will search for the file @file{libx264-max.ffpreset}. + +@anchor{FFmpeg formula evaluator} @section FFmpeg formula evaluator When evaluating a rate control string, FFmpeg uses an internal formula @@ -742,6 +691,9 @@ The following binary operators are available: @code{+}, @code{-}, The following unary operators are available: @code{+}, @code{-}, @code{(...)}. +The following statements are available: @code{ld}, @code{st}, +@code{while}. + The following functions are available: @table @var @item sinh(x) @@ -750,16 +702,22 @@ The following functions are available: @item sin(x) @item cos(x) @item tan(x) +@item atan(x) +@item asin(x) +@item acos(x) @item exp(x) @item log(x) +@item abs(x) @item squish(x) @item gauss(x) -@item abs(x) +@item mod(x, y) @item max(x, y) @item min(x, y) +@item eq(x, y) +@item gte(x, y) @item gt(x, y) +@item lte(x, y) @item lt(x, y) -@item eq(x, y) @item bits2qp(bits) @item qp2bits(qp) @end table @@ -790,38 +748,12 @@ The following constants are available: @c man end -@ignore - -@setfilename ffmpeg -@settitle FFmpeg video converter - -@c man begin SEEALSO -ffserver(1), ffplay(1) and the HTML documentation of @file{ffmpeg}. -@c man end - -@c man begin AUTHOR -Fabrice Bellard -@c man end - -@end ignore - -@section Protocols - -The file name can be @file{-} to read from standard input or to write -to standard output. - -FFmpeg also handles many protocols specified with an URL syntax. - -Use 'ffmpeg -formats' to see a list of the supported protocols. - -The protocol @code{http:} is currently used only to communicate with -FFserver (see the FFserver documentation). When FFmpeg will be a -video player it will also be used for streaming :-) - @chapter Tips +@c man begin TIPS @itemize -@item For streaming at very low bitrate application, use a low frame rate +@item +For streaming at very low bitrate application, use a low frame rate and a small GOP size. This is especially true for RealVideo where the Linux player does not seem to be very fast, so it can miss frames. An example is: @@ -830,30 +762,216 @@ frames. 
An example is: ffmpeg -g 3 -r 3 -t 10 -b 50k -s qcif -f rv10 /tmp/b.rm @end example -@item The parameter 'q' which is displayed while encoding is the current +@item +The parameter 'q' which is displayed while encoding is the current quantizer. The value 1 indicates that a very good quality could be achieved. The value 31 indicates the worst quality. If q=31 appears too often, it means that the encoder cannot compress enough to meet your bitrate. You must either increase the bitrate, decrease the frame rate or decrease the frame size. -@item If your computer is not fast enough, you can speed up the +@item +If your computer is not fast enough, you can speed up the compression at the expense of the compression ratio. You can use '-me zero' to speed up motion estimation, and '-intra' to disable motion estimation completely (you have only I-frames, which means it is about as good as JPEG compression). -@item To have very low audio bitrates, reduce the sampling frequency -(down to 22050 kHz for MPEG audio, 22050 or 11025 for AC3). +@item +To have very low audio bitrates, reduce the sampling frequency +(down to 22050 Hz for MPEG audio, 22050 or 11025 for AC-3). -@item To have a constant quality (but a variable bitrate), use the option +@item +To have a constant quality (but a variable bitrate), use the option '-qscale n' when 'n' is between 1 (excellent quality) and 31 (worst quality). -@item When converting video files, you can use the '-sameq' option which +@item +When converting video files, you can use the '-sameq' option which uses the same quality factor in the encoder as in the decoder. It allows almost lossless encoding. @end itemize +@c man end TIPS + +@chapter Examples +@c man begin EXAMPLES + +@section Video and Audio grabbing + +FFmpeg can grab video and audio from devices given that you specify the input +format and device. + +@example +ffmpeg -f oss -i /dev/dsp -f video4linux2 -i /dev/video0 /tmp/out.mpg +@end example + +Note that you must activate the right video source and channel before +launching FFmpeg with any TV viewer such as xawtv +(@url{http://linux.bytesex.org/xawtv/}) by Gerd Knorr. You also +have to set the audio recording levels correctly with a +standard mixer. + +@section X11 grabbing + +FFmpeg can grab the X11 display. + +@example +ffmpeg -f x11grab -s cif -r 25 -i :0.0 /tmp/out.mpg +@end example + +0.0 is display.screen number of your X11 server, same as +the DISPLAY environment variable. + +@example +ffmpeg -f x11grab -s cif -r 25 -i :0.0+10,20 /tmp/out.mpg +@end example + +0.0 is display.screen number of your X11 server, same as the DISPLAY environment +variable. 10 is the x-offset and 20 the y-offset for the grabbing. + +@section Video and Audio file format conversion + +* FFmpeg can use any supported file format and protocol as input: + +Examples: + +* You can use YUV files as input: + +@example +ffmpeg -i /tmp/test%d.Y /tmp/out.mpg +@end example + +It will use the files: +@example +/tmp/test0.Y, /tmp/test0.U, /tmp/test0.V, +/tmp/test1.Y, /tmp/test1.U, /tmp/test1.V, etc... +@end example + +The Y files use twice the resolution of the U and V files. They are +raw files, without header. They can be generated by all decent video +decoders. You must specify the size of the image with the @option{-s} option +if FFmpeg cannot guess it. + +* You can input from a raw YUV420P file: + +@example +ffmpeg -i /tmp/test.yuv /tmp/out.avi +@end example + +test.yuv is a file containing raw YUV planar data. 
Each frame is composed +of the Y plane followed by the U and V planes at half vertical and +horizontal resolution. + +* You can output to a raw YUV420P file: + +@example +ffmpeg -i mydivx.avi hugefile.yuv +@end example + +* You can set several input files and output files: + +@example +ffmpeg -i /tmp/a.wav -s 640x480 -i /tmp/a.yuv /tmp/a.mpg +@end example + +Converts the audio file a.wav and the raw YUV video file a.yuv +to MPEG file a.mpg. + +* You can also do audio and video conversions at the same time: + +@example +ffmpeg -i /tmp/a.wav -ar 22050 /tmp/a.mp2 +@end example + +Converts a.wav to MPEG audio at 22050 Hz sample rate. + +* You can encode to several formats at the same time and define a +mapping from input stream to output streams: + +@example +ffmpeg -i /tmp/a.wav -ab 64k /tmp/a.mp2 -ab 128k /tmp/b.mp2 -map 0:0 -map 0:0 +@end example + +Converts a.wav to a.mp2 at 64 kbits and to b.mp2 at 128 kbits. '-map +file:index' specifies which input stream is used for each output +stream, in the order of the definition of output streams. + +* You can transcode decrypted VOBs: + +@example +ffmpeg -i snatch_1.vob -f avi -vcodec mpeg4 -b 800k -g 300 -bf 2 -acodec libmp3lame -ab 128k snatch.avi +@end example + +This is a typical DVD ripping example; the input is a VOB file, the +output an AVI file with MPEG-4 video and MP3 audio. Note that in this +command we use B-frames so the MPEG-4 stream is DivX5 compatible, and +GOP size is 300 which means one intra frame every 10 seconds for 29.97fps +input video. Furthermore, the audio stream is MP3-encoded so you need +to enable LAME support by passing @code{--enable-libmp3lame} to configure. +The mapping is particularly useful for DVD transcoding +to get the desired audio language. + +NOTE: To see the supported input formats, use @code{ffmpeg -formats}. + +* You can extract images from a video, or create a video from many images: + +For extracting images from a video: +@example +ffmpeg -i foo.avi -r 1 -s WxH -f image2 foo-%03d.jpeg +@end example + +This will extract one video frame per second from the video and will +output them in files named @file{foo-001.jpeg}, @file{foo-002.jpeg}, +etc. Images will be rescaled to fit the new WxH values. + +If you want to extract just a limited number of frames, you can use the +above command in combination with the -vframes or -t option, or in +combination with -ss to start extracting from a certain point in time. + +For creating a video from many images: +@example +ffmpeg -f image2 -i foo-%03d.jpeg -r 12 -s WxH foo.avi +@end example + +The syntax @code{foo-%03d.jpeg} specifies to use a decimal number +composed of three digits padded with zeroes to express the sequence +number. It is the same syntax supported by the C printf function, but +only formats accepting a normal integer are suitable. + +* You can put many streams of the same type in the output: + +@example +ffmpeg -i test1.avi -i test2.avi -vcodec copy -acodec copy -vcodec copy -acodec copy test12.avi -newvideo -newaudio +@end example + +In addition to the first video and audio streams, the resulting +output file @file{test12.avi} will contain the second video +and the second audio stream found in the input streams list. + +The @code{-newvideo}, @code{-newaudio} and @code{-newsubtitle} +options have to be specified immediately after the name of the output +file to which you want to add them. 
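+
+* You can cut a section out of a file without re-encoding it, by
+combining stream copy with the @option{-ss} and @option{-t} options
+(the file names and offsets below are only an illustration):
+
+@example
+ffmpeg -i input.avi -ss 60 -t 30 -acodec copy -vcodec copy clip.avi
+@end example
+
+This copies 30 seconds of audio and video starting one minute into
+@file{input.avi}. Because the streams are copied rather than
+re-encoded, the cut can only start near a keyframe, so the actual start
+time may differ slightly from the requested one.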
+@c man end EXAMPLES + +@include indevs.texi +@include outdevs.texi +@include protocols.texi +@include filters.texi + +@ignore + +@setfilename ffmpeg +@settitle FFmpeg video converter + +@c man begin SEEALSO +ffplay(1), ffprobe(1), ffserver(1) and the FFmpeg HTML documentation +@c man end + +@c man begin AUTHORS +The FFmpeg developers +@c man end + +@end ignore @bye