- Proper sound support: Syncing of multiple unrelated sources through
high-quality resampling, freely selectable input, cue out for headphones,
dynamic range compression, simple EQ (low-cut), level meters conforming
to EBU R128.
- Theme engine encapsulating the design demands of each individual event.
You can use DRM instead of X11, to use a non-Intel GPU for rendering but still use
Quick Sync (by giving e.g. “--va-display /dev/dri/renderD128”).
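As a concrete sketch, such an invocation could look like the following (the device path is only an example; check /dev/dri/ for the render nodes on your machine):

```shell
# Render on the default (non-Intel) GPU, but point VA-API at the Intel
# iGPU's render node so Quick Sync is still available for encoding.
# /dev/dri/renderD128 is an example path, not a guaranteed one.
./nageru --va-display /dev/dri/renderD128
```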
- Two or more Blackmagic USB3 or PCI cards, either HDMI or SDI.
  The PCI cards need Blackmagic's own drivers installed. The USB3 cards
  are driven through the “bmusb” driver embedded in bmusb/, using libusb-1.0.
  If you want zerocopy USB, you need libusb 1.0.21-rc1 or newer,
  as well as a recent kernel (4.6.0 or newer). Zerocopy USB helps not only
  for performance, but also for stability.
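A quick way to check whether your running kernel meets the zerocopy requirement is a small shell sketch like this (the version-comparison helper is mine, not part of Nageru):

```shell
# Check whether the running kernel is new enough (>= 4.6.0) for zerocopy USB.
# version_at_least is a generic helper using sort -V; nothing Nageru-specific.
version_at_least() {
    # True (exit 0) if $1 >= $2, comparing dotted version strings.
    [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

if version_at_least "$(uname -r | cut -d- -f1)" 4.6.0; then
    echo "kernel OK for zerocopy USB"
else
    echo "kernel too old for zerocopy USB"
fi
```

The same helper works for checking libusb's version string if you extract it from pkg-config.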
- Movit, my GPU-based video filter library (https://movit.sesse.net).
  You will need at least version 1.3.1.
- Qt 5.5 or newer for the GUI.
- libmicrohttpd for the embedded web server.
- x264 for encoding high-quality video suitable for streaming to end users.

- ffmpeg for muxing, and for encoding audio. You will need at least
  version 3.1.
- Working OpenGL; Movit works with almost any modern OpenGL implementation.
  Nageru has been tested with Intel on Mesa (you want 11.2 or newer, due
  to critical stability bugfixes), and with NVIDIA's proprietary drivers.
  AMD's proprietary drivers (fglrx) are known not to work due to driver bugs;
  I am in contact with AMD to try to get this resolved.
- libzita-resampler, for resampling sound sources so that they are in sync
between sources, and also for oversampling for the peak meter.
git submodule update --init
apt install qtbase5-dev qt5-default pkg-config libmicrohttpd-dev \
libusb-1.0-0-dev liblua5.2-dev libzita-resampler-dev libva-dev \
libavcodec-dev libavformat-dev libswscale-dev libavresample-dev \
libmovit-dev libegl1-mesa-dev libasound2-dev libx264-dev

Exceptions as of July 2016:

- libusb 1.0.21-rc1 is not yet in stretch or sid; you need to fetch it
  from experimental.
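For reference, fetching a package from experimental usually looks something like this (a sketch; the mirror, and your existing sources.list setup, may differ):

```shell
# Add the experimental suite and install libusb from it.
# Assumes a Debian system with sudo; adjust the mirror to taste.
echo 'deb http://deb.debian.org/debian experimental main' | \
    sudo tee /etc/apt/sources.list.d/experimental.list
sudo apt update
sudo apt install -t experimental libusb-1.0-0 libusb-1.0-0-dev
```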
The patches/ directory contains a patch that helps zita-resampler performance.
It is meant for upstream, but was not in at the time Nageru was released.
It is taken to be by Steinar H. Gunderson <sesse@google.com> (i.e., my ex-work
email), and under the same license as zita-resampler itself.
To start it, just hook up your equipment, type “make” and then “./nageru”.
It is strongly recommended to have the rights to run at real-time priority;
it helps performance. The same goes for PulseAudio.
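Granting those rights is typically done through PAM limits; a sketch (assuming your user is in a group called “realtime”, which is an example name, not something Nageru requires) could be:

```
# /etc/security/limits.d/99-realtime.conf (example; group name is arbitrary)
@realtime  -  rtprio   95
@realtime  -  memlock  unlimited
```

Log out and back in for the new limits to take effect.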
Nageru will open a HTTP server at port 9095, where you can extract a live
H264+PCM signal in nut mux (e.g. http://127.0.0.1:9095/stream.nut).
It is probably too high bitrate (~25 Mbit/sec depending on content) to send to
users, but you can easily send it around in your internal network and then
transcode it in e.g. VLC. A copy of the stream (separately muxed) will also
be saved live to local disk.
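As an illustration, re-encoding the internal stream down to a more sendable bitrate could be done with something like this (the codec and bitrate choices are examples, not recommendations):

```shell
# Pull the high-bitrate stream from Nageru and transcode it to roughly
# 3 Mbit/sec H.264 for distribution; container and flags are illustrative.
ffmpeg -i http://127.0.0.1:9095/stream.nut \
    -c:v libx264 -preset veryfast -b:v 3M -c:a aac -b:a 128k \
    -f mpegts out.ts
```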
If you have a fast CPU (typically a quadcore desktop; most laptops will spend
most of their CPU on running Nageru itself), you can use x264 for the outgoing
stream instead of Quick Sync; it is much better quality for the same bitrate,
and also has proper bitrate controls. Simply add --http-x264-video on the
command line. (You may also need to add something like "--x264-preset veryfast",
since the default "medium" preset might be too CPU-intensive, but YMMV.)
The stream saved to disk will still be the Quick Sync-encoded stream, as it is
typically higher bitrate and thus also higher quality. Note that if you add
".metacube" at the end of the URL (e.g. "http://127.0.0.1:9095/stream.ts.metacube"),
you will get a stream suitable for streaming through the Cubemap video reflector
(cubemap.sesse.net). A typical example would be:

  ./nageru --http-x264-video --x264-preset veryfast --x264-tune film \
    --http-mux mp4 --http-audio-codec libfdk_aac --http-audio-bitrate 128

If you are comfortable with using all your remaining CPU power on the machine
for x264, try --x264-speedcontrol, which will try to adjust the preset
dynamically for maximum quality, at the expense of somewhat higher delay.

See --help for more information on options in general.
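Combining speed control with the x264 output described above might look like:

```shell
# Let x264 adapt its preset dynamically to the available CPU headroom.
./nageru --http-x264-video --x264-speedcontrol
```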
The name “Nageru” is a play on the Japanese verb 投げる (nageru), which means
to throw or cast. (I also later learned that it could mean to face defeat or