approaches for streaming: **transcoded** or **direct**.
+.. _transcoded-streaming:
+
Transcoded streaming
--------------------
-Transcoded streaming was the only option supported before 1.3.0,
-and in many ways the conceptually simplest from Nageru's point of
+Transcoded streaming is in many ways the conceptually simplest from Nageru's point of
view. In this mode, Nageru outputs its “digital intermediate”
H.264 stream (see :ref:`digital-intermediate`), and you are
responsible for transcoding it into a format that is suitable
for your end users. In particular, you will probably want to set “--http-audio-codec”
and “--http-audio-bitrate” to something your mux can transport
(see below for more information about audio transcoding).
-The stream be transcoded by a number of programs, most notably
-`VLC <http://www.videolan.org/>`_. Here's an example line
-transcoding to 1.5 Mbit/sec H.264 suitable for streaming to
-most browsers in the \<video\> tag::
-
- while :; do
- vlc -I dummy -v --network-caching 3000 \
- http://http://yourserver.example.org:9095/stream.nut vlc://quit \
- --sout '#transcode{vcodec=h264,vb=1500,acodec=mp4a,aenc=fdkaac,ab=128}:std{mux=ffmpeg{mux=mp4},access=http{mime=video/mp4},dst=:1994}' \
- --sout-avformat-options '{movflags=empty_moov+frag_keyframe+default_base_moof}' \
- --sout-x264-vbv-maxrate 1500 --sout-x264-vbv-bufsize 1500 --sout-mux-caching 3000 \
- --sout-x264-keyint 50 --sout-mux-caching 3000 \
- --sout-x264-tune film --sout-x264-preset slow
- sleep 1
- done
-
-(The for loop is to restart VLC if Nageru should be restarted.
-You can make similar loops around the other example scripts,
-or you can e.g. make systemd units for transcoding if you wish.)
-
-Of course, you may need to adjust the bitrate (and then also
-the VBV settings) and preset for your content and CPU usage.
+The stream can be transcoded by a number of programs, such as
+`VLC <http://www.videolan.org/>`_, but Nageru also has its own
+transcoder called **Kaeru**, named after the
+Japanese verb *kaeru* (換える), meaning roughly to replace or exchange.
+Kaeru is a command-line tool that is designed to transcode Nageru's streams.
+Since it reuses Nageru's decoding and encoding code, it can do almost everything you can do
+with :ref:`direct encoding <direct-encoding>`, including x264 speed control
+and Metacube output (see the section on :ref:`Cubemap integration <cubemap>` below).
+
+Using Kaeru is similar to launching Nageru; e.g., to rescale a stream to 848x480
+and output it as a 1.5 Mbit/sec H.264 stream suitable for most browsers in a \<video\> tag::
+
+ ./kaeru -w 848 -h 480 --http-mux mp4 --http-audio-codec aac --http-audio-bitrate 128 \
+ --x264-bitrate 1500 http://yourserver.example.org:9095/stream.nut
+
1.5 Mbit/sec is in the lower end of the spectrum for good
720p60 conference video (most TV channels use 12-15 Mbit/sec
for the same format).
Another variation popular today is to stream using segmented HTTP;
you can use e.g. the ffmpeg command-line tool to segment the output
-created by VLC into a stream that will be accepted by most smartphones::
+created by Kaeru into an HLS stream that will be accepted by most smartphones::

 ffmpeg -i http://127.0.0.1:1994/ -codec copy -f hls \
   -hls_time 2 -hls_wrap 100 -bsf:v h264_mp4toannexb \
   -hls_segment_filename $NAME-hls-%03d.ts stream.m3u8

+Or, of course, you can use FFmpeg to do the transcoding if you wish.
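+
+For instance, a rough FFmpeg equivalent of the Kaeru example above could
+look like this (a sketch only; the output filename, bitrates and preset
+are illustrative, and should be adjusted for your content and CPU usage)::
+
+  ffmpeg -i http://yourserver.example.org:9095/stream.nut \
+    -vf scale=848:480 -c:v libx264 -preset veryfast \
+    -b:v 1500k -maxrate 1500k -bufsize 1500k \
+    -c:a aac -b:a 128k \
+    -movflags empty_moov+frag_keyframe+default_base_moof \
+    output.mp4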
+
+
+.. _direct-encoding:
Direct encoding
---------------
the stream stored to disk is still the full-quality QSV stream.
Using Nageru's built-in x264 support is strongly preferable to
-running VLC on the same machine, since it saves one H.264 decoding
+running an external transcoder on the same machine, since it saves one H.264 decoding
step, and also uses *speed control*. Speed control automatically
turns x264's quality setting up and down to use up all remaining
CPU power after Nageru itself has taken what it needs (but no more);
built-in x264 output is enabled with the “--http-x264-video” flag; e.g.::

 ./nageru --http-x264-video --x264-preset veryfast --x264-tune film \
- --http-mux mp4 --http-audio-codec libfdk_aac --http-audio-bitrate 128
-
-Note the use here of the MP4 mux and AAC audio. “libfdk_aac” signals
-the use of Fraunhofer's `FDK-AAC <https://github.com/mstorsjo/fdk-aac>`_ encoder
-from Android; it yields significantly better sound quality than e.g. FAAC,
-and it is open source, but under a somewhat cumbersome license. For this
-reason, most distributions do not compile FFmpeg with the FDK-AAC codec,
-so you will need to compile FFmpeg yourself, or use a worse codec.
+ --http-mux mp4 --http-audio-codec aac --http-audio-bitrate 128
-For speed control, you can use::
+Note the use here of the MP4 mux and AAC audio. For speed control, you can use::

 ./nageru --x264-speedcontrol --x264-tune film \
- --http-mux mp4 --http-audio-codec libfdk_aac --http-audio-bitrate 128
+ --http-mux mp4 --http-audio-codec aac --http-audio-bitrate 128

You can change the x264 bitrate on-the-fly from the video menu; this is primarily useful
if your network conditions change abruptly.
-A particular note about the MP4 mux: If you plan to stream for long periods
-continuously (more than about 12–24 hours), the 32-bit timestamps may wrap
-around with the default timebase Nageru is using. If so, please add the
-“--http-coarse-timebase” flag.
+.. _cubemap:
Cubemap integration
-------------------
Even with built-in x264 support, Nageru is not particularly efficient
for delivering streams to end users. For this, a natural choice is
-`Cubemap <http://cubemap.sesse.net/>`_; Cubemap scales without problems
+`Cubemap <http://cubemap.sesse.net/>`__; Cubemap scales without problems
to multiple 10 Gbit/sec NICs on a quite normal machine, and you can easily
add multiple Cubemap servers if so needed. Nageru has native support for
-Cubemap's *Metacube2* transport encoding; simply add “.metacube” to
+Cubemap's *Metacube2* transport encoding; to use it, add “.metacube”
to the end of the URL, e.g. with a cubemap.config fragment like this::

 stream /stream.mp4 src=http://yourserver.example.org:9094/stream.mp4.metacube pacing_rate_kbit=3000 force_prebuffer=1500000

Note that you will want a pacing rate of about 2:1 to your real average
bitrate, in order to provide some headroom for temporary spikes (the default allows
spikes of 2x the nominal bitrate, but only on a one-second basis) and
-TCP retransmits. See the cubemap documentation for more information about
+TCP retransmits. See the Cubemap documentation for more information about
how to set up pacing.
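+
+As a worked example of the 2:1 rule: for a stream averaging 2 Mbit/sec, you
+would set a pacing rate of about 4 Mbit/sec, e.g. (numbers illustrative only)::
+
+ stream /stream.mp4 src=http://yourserver.example.org:9094/stream.mp4.metacube pacing_rate_kbit=4000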
+
+Single-camera stream
+--------------------
+
+In addition to the regular mixed stream, you can
+siphon out MJPEG streams consisting of a single camera only. This is useful
+either for running a very cheap secondary stream (say, a static overview camera
+that you would like to show on a separate screen somewhere), or for simple
+monitoring during debugging.
+
+The URL for said stream is “http://yourserver.example.org:9095/feeds/N.mp4”,
+where N is the card index (starting from zero). The feed is in MJPEG format
+and MP4 mux, regardless of other settings, just like the multicamera mux
+for Futatabi. (You are allowed to use a card that is not part of the
+multicamera mux, if you have limited the number of such cards.) For
+more technical details, see :ref:`futatabiformat`. Kaeru can transcode such
+streams to a more manageable bitrate/format, if you wish.
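+
+For example, a hypothetical Kaeru invocation for transcoding the first
+card's feed to a 500 kbit/sec H.264 stream (reusing only flags shown in
+the Kaeru example earlier; the bitrate is arbitrary) could be::
+
+  ./kaeru --http-mux mp4 --x264-bitrate 500 \
+    http://yourserver.example.org:9095/feeds/0.mp4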