Gaining real-time access to the wide array of video sources at an incident location has always been one of the “golden rings” of interest for on-scene Command. We largely trust our eyes and value visual confirmation as we consider the criticality and impacts of the decisions that need to be made. For incident-based video sources and the associated live streaming, low latency influences the sense of immediacy of the video streams used for decision support.
In most Public Safety and/or military scenarios, low-latency streaming is absolutely mission critical, as any delay in the streamed video can have serious implications for coordinated, time-sensitive responses to events unfolding in real time. Some types of live video require lower latency than others, but few would dispute that “fire control” is one of the areas where six seconds of latency dramatically impacts the safety of a “fire” command.
Latency Explained
A simple way to think about latency is the idea of “delay”: the time it takes to move a frame of video from point A to point B. Video latency, as a more specific application of the term, describes the time between a frame of video being captured and that frame being displayed on the end user’s laptop or mobile screen. It’s also important to consider the factors that introduce latency into a video stream:
- Bandwidth: 5G will minimize the bandwidth issue on local subnets, but until it is fully deployed everywhere (or your agency has the financial wherewithal to deploy a private CBRS network), most of us are still relying on 4G/LTE. Furthermore, while 5G provides lightning-fast interconnects, backhauls coming off the 5G local network may still run at 4G speeds. While bandwidth does play a big role in the latency equation, newer encoders with more efficient compression can help mitigate its impact. More on that in a moment.
- Encoding/Decoding: Encoders compress the imagery the camera sees into a bitstream that can be packetized and moved across an IP network, while decoders transform that data back into video your compute device can display. Encoders and video formats are essentially sets of algorithms that “do the math” for conversion and compression, and encode/decode time varies with the encoder used to package your drone or body-worn video.
- Format: The format of the video can be boiled down to frame rate and image quality. Sending 30 frames per second of 1080p video consumes more compute and bandwidth than sending 4 frames per second of the same video; the back-of-the-envelope sketch after this list illustrates the trade-off. For many in the intelligence, surveillance, and reconnaissance business, image quality trumps frame rate.
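To make the bandwidth and format trade-offs concrete, here is a rough, illustrative calculation. The uplink speed and bits-per-pixel compression target below are assumptions for the sake of the sketch, not measurements of any particular encoder or carrier network.

```python
# Back-of-the-envelope look at how frame rate, image quality, and uplink
# bandwidth interact. All numbers are illustrative assumptions.

UPLINK_MBPS = 8.0           # assumed usable 4G/LTE uplink
BITS_PER_PIXEL = 0.08       # rough H.264-class compression target (assumption)
WIDTH, HEIGHT = 1920, 1080  # 1080p frame

def encoded_bitrate_mbps(fps: float, bits_per_pixel: float = BITS_PER_PIXEL) -> float:
    """Approximate encoded bitrate in Mbps for 1080p video at the given frame rate."""
    return WIDTH * HEIGHT * bits_per_pixel * fps / 1_000_000

for fps in (30, 4):
    rate = encoded_bitrate_mbps(fps)
    # How long one second of encoded video takes to push through the uplink.
    # Values near (or above) 1.0 mean the stream can't keep up, so buffers --
    # and therefore latency -- grow.
    uplink_share = rate / UPLINK_MBPS
    print(f"{fps:>2} fps 1080p ~= {rate:.1f} Mbps "
          f"-> {uplink_share:.2f} s of uplink time per second of video")
```

With these assumed numbers, 1080p at 30 fps lands around 5 Mbps while 4 fps of the same imagery is well under 1 Mbps, which is why dropping frame rate (while keeping image quality) is a common lever for ISR-style streams.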
Much can be done by twisting the “knobs and dials” of your encoding platform, your network service provider’s platform, and/or the type of connection. Yet the elephant in the room is the streaming protocol and the role it plays in providing a good viewing experience. It’s the primary thing we hear from our customers in the field, and we have spent a great deal of time exploring protocols, encoders, and other technology enablers that cut down on video latency.
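To see why the protocol tends to dominate, it helps to sketch a rough latency budget. The figures below are assumed, order-of-magnitude values: segmented HTTP delivery (HLS/DASH-style players typically buffer several multi-second segments) versus a low-latency push protocol (SRT/WebRTC-style) that keeps only a small jitter buffer.

```python
# Illustrative latency budget showing why the streaming protocol tends to
# dominate end-to-end ("glass-to-glass") delay. Every figure is an assumed,
# typical-order-of-magnitude value, not a benchmark.

COMMON_MS = {
    "camera capture":  33,   # one frame interval at 30 fps
    "encode":          50,
    "network transit": 80,   # cellular uplink + internet
    "decode + render": 50,
}

PROTOCOL_BUFFER_MS = {
    # Segmented HTTP delivery: the player typically buffers several
    # multi-second segments before playback starts.
    "segmented HTTP (6 s segments x 3 buffered)": 3 * 6000,
    # Low-latency push protocols keep only a small jitter buffer.
    "low-latency push (small jitter buffer)":     200,
}

base_ms = sum(COMMON_MS.values())
for protocol, buffer_ms in PROTOCOL_BUFFER_MS.items():
    total_ms = base_ms + buffer_ms
    print(f"{protocol}: ~{total_ms / 1000:.1f} s glass-to-glass")
```

Under these assumptions the capture, encode, network, and decode stages add up to a few hundred milliseconds, while the protocol’s buffering strategy decides whether the viewer is seconds behind or fractions of a second behind.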
Announcing the new Blueforce WarpVideo Plugin for BlueforceTACTICAL Android (soon iOS)
The Blueforce WarpVideo Plugin for BlueforceTACTICAL turns your Android (and soon iOS) mobile device into an ultra-low-latency video streaming device, allowing you to leverage hardware you already own and carry. The plugin is now in BETA, and our early test results show sub-one-second latency (approaching 500 ms) over standard Verizon 4G networks. When enabled, the plugin streams live video from your mobile device and makes it securely sharable with other mobile Blueforce users, BlueforceCOMMAND, and other standards-based enterprise systems.
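For readers who want to sanity-check latency numbers like these on their own gear, one simple approach is to stamp each frame with its capture time and compare it against the receiver’s clock at display time. The sketch below is a hypothetical illustration of that technique, not the WarpVideo API; it assumes the sender and receiver clocks are synchronized (e.g., via NTP).

```python
# Minimal sketch of one way to quantify glass-to-glass latency: stamp each
# frame with the sender's capture time and compare against the receiver's
# clock when the frame is displayed. Clock synchronization between the two
# ends is assumed; the frame-metadata scheme here is hypothetical.

import time

def stamp_frame(frame_bytes: bytes) -> dict:
    """Sender side: attach a capture timestamp (ms since epoch) to the frame."""
    return {"captured_at_ms": time.time() * 1000, "payload": frame_bytes}

def measure_latency_ms(stamped_frame: dict) -> float:
    """Receiver side: call at the moment the frame is actually displayed."""
    return time.time() * 1000 - stamped_frame["captured_at_ms"]

if __name__ == "__main__":
    frame = stamp_frame(b"\x00" * 1024)  # stand-in for an encoded frame
    time.sleep(0.5)                      # simulate encode + network + decode delay
    print(f"glass-to-glass: {measure_latency_ms(frame):.0f} ms")
```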
The backend video streaming engine can be stood up in minutes via a single click in the AWS Marketplace and provides multi-point distribution as well as recording for later playback. We also designed the streaming engine for forward deployment on 5G mobile edge compute (MEC), and it can run within a Docker container, which means it can run on a whole host of body-worn, small-compute (e.g., Raspberry Pi 4), vehicle, and/or cloud compute devices, enabling austere use when no cloud backhaul is available.
To learn more or request a demo, click the button below or send an email to info@blueforcedev.com.