X-Git-Url: https://git.sesse.net/?a=blobdiff_plain;f=hdmisdi.rst;h=a04c62ba3d09336680d80acb5d6936a2f75e90a8;hb=HEAD;hp=64fc924675b93bb31dc4c04a5d95c89fc7ad3958;hpb=116d29711ce2a5067a41fd5605148910e91a3fb9;p=nageru-docs

diff --git a/hdmisdi.rst b/hdmisdi.rst
index 64fc924..a04c62b 100644
--- a/hdmisdi.rst
+++ b/hdmisdi.rst
@@ -8,16 +8,17 @@ the stream on another PC, but for many uses, the end-to-end latency is
 too high, and you might not want to involve a full extra PC just for this
 anyway.
 
-Thus, since 1.5.0, Nageru supports using a spare output card for HDMI/SDI
+Thus, Nageru supports using a spare output card for HDMI/SDI
 output, turning it into a simple, reasonably low-latency audio/video switcher.
 
 
 Setting up HDMI/SDI output
 --------------------------
 
-Turning on HDMI/SDI output is simple; just right-click on the live view and
+To turn on HDMI/SDI output, right-click on the live view and
 select the output card. (Equivalently, you can access the same functionality
-from the *Video* menu in the regular menu bar.) Currently, this is supported
+from the *Video* menu in the regular menu bar, or you can give the
+*--output-card=* parameter on the command line.) Currently, this is supported
 for DeckLink cards only (PCI/Thunderbolt), as the precise output protocol
 for the Intensity Shuttle cards is still unknown. The stream and recording will
 keep running just as before.
@@ -25,11 +26,23 @@ keep running just as before.
 A video mode will automatically be picked for you, favoring 59.94 fps if
 possible, but you can change the mode on-the-fly to something else if you'd
 like, as long as the resolution matches with what you've set up at program start.
-Note that whenever HDMI/SDI output is active, the output card will be the
+
+
+Unsynchronized HDMI/SDI output
+------------------------------
+
+By default, whenever HDMI/SDI output is active, the output card will be the
 master clock; you cannot change it to any of the input cards. This also means
 that the frame rate you choose here will determine the frame rate for the
 stream.
 
+In Nageru 2.1.0 or newer, you can use the flag *--output-card-unsynchronized*
+to counteract this (there is currently no way to do it from the GUI).
+This is useful if you want just a monitor output without synchronizing
+your entire stream chain to the output card (i.e., you want to keep
+some other camera as the master). Sound support is untested, and is
+probably going to crackle a fair bit.
+
 
 A note on latency
 -----------------
@@ -163,12 +176,14 @@ Audio and video queue lengths do not need to match exactly; the two streams
 and for HDMI/SDI output.
 
 
+.. _measuring-latency:
+
 Measuring latency
 .................
 
 In order to optimize latency, it can be useful to measure it, but for most
 people, it's hard to measure delays precisely enough to distinguish reliably
-between e.g. 70 and 80 milliseconds by eye alone. Nageru gives you some simple
+between e.g. 70 and 80 milliseconds by eye alone. Nageru gives you some
 tools that will help.
 
 The most direct is the flag *--print-video-latency*. This samples, for every
@@ -193,5 +208,42 @@ lowest and highest will be printed. Do note that the measurement is still done
 over a single *output* frame; it is *not* a measurement over the last 100
 output frames, even though the statistics are only printed every 100th.
 
-TODO: Write something about time codes here.
-
+For more precise measurements, you can use Prometheus metrics to get percentiles
+for all of these points, which are measured over all frames (over a one-minute
+window). This yields more precise information than sampling every 100 frames,
+but setting up Prometheus and a graphing tool is a bit more work, and usually not
+worth it for simple measurement. For more information, see :doc:`monitoring`.
+
+Another trick that can be useful in some situations is *looping* your signal,
+i.e., connecting your output back into your input. This allows you to measure
+delays that don't happen within Nageru itself, like any external converters,
+delays in the input driver, etc. (It can also act as a sanity check to make
+sure your A/V chain passes the signal through without quality degradation,
+if you first set up a static picture as a signal and then switch to the loop
+input to verify that the signal stays stable without e.g. color shifts [#]_.
+See the section on :doc:`the frame analyzer <analyzer>` for other ways of
+debugging signal integrity.)
+
+For this, the *timecode output* is useful; you can turn it on from the *Video*
+menu, or through the command-line flag *--timecode-stream*. (You can also
+output it to standard output with the flag *--timecode-stdout*.) It contains
+some information about frame numbers and current time of day; if you activate
+it, switch to the loop input and then deactivate it while still holding the
+loop input active, the timecode will start repeating with roughly the
+same length as your latency. (It can't be an exact measurement, as delay is
+frequently fractional, and a loop length cannot be.) The easiest way to find
+the actual length is to look at the recorded video file by e.g. dumping each
+frame to an image file and looking at the sequence.
+
+In general, using Nageru's own latency measurement is both the simplest and
+the most precise. However, the timecode is a useful supplement, since it
+can also test external factors, such as network stream latency.
+
+.. [#] If you actually try this with Nageru, you will see some
+   dark “specks” slowly appear in the image. This is a consequence of
+   small roundoff errors accumulating over time, combined with Nageru's
+   static dither pattern that causes rounding to happen in the same
+   direction each time. The dithering used by Nageru is a tradeoff between
+   many factors, and overall helps image quality much more than it
+   hurts, but in the specific case of an ever-looping signal, it will
+   cause such artifacts.
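
As a concrete illustration of the flags discussed in this patch: a monitor-style
HDMI/SDI output that does not take over as master clock could be started roughly
like this (the binary name and the card index 1 are assumptions for the example,
not taken from the patch; which index your output card gets depends on your
setup)::

    ./nageru --output-card=1 --output-card-unsynchronized

Dropping *--output-card-unsynchronized* gives the default behavior described
above, where the output card becomes the master clock and therefore also decides
the frame rate of the stream.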
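
For the timecode loop measurement described above, one way to read the repeating
timecodes off the recording is to dump every frame to an image file with ffmpeg
and step through the images (the file name is only a placeholder; substitute
whatever your actual recording is called)::

    ffmpeg -i recording.nut frame%05d.png

Counting the number of frames between two identical timecodes and dividing by the
output frame rate gives a rough estimate of the loop latency; as noted above, it
cannot be exact, since the true delay is usually a fractional number of frames.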