FRAME SKIPPING ULTRA LOW DELAY VIDEO TRANSMISSION SYSTEM

CHALLENGE

Camera systems, combined with visual tracking techniques, can replace conventional sensors (e.g. radar sensors). To make camera systems a reliable solution for implementation in fast feedback loops, the sensor-to-controller latency needs to be drastically reduced.

INNOVATION

This invention achieves a mean latency (the time difference between a visible event taking place and the event being displayed on the screen) down to 21.2 ms (1). The low latency is achieved by using a high-frame-rate camera in combination with frame skipping and buffer preemption.

  1. FRAME SKIPPING is based on the idea of selecting relevant frames and sending them to the encoder, while non-relevant frames are not processed further.
  2. PREEMPTION reduces the transmission delay in the case where a relevant frame is ready to be transmitted after encoding, but a previous regular frame is still queued in the encoder buffer. Low latency is achieved by flushing the regular frames from the buffer (Figure 1).
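The interplay of the two mechanisms can be illustrated with a minimal sketch of a transmission queue. All names (`PreemptiveEncoderQueue`, `submit`, `transmit_next`) are hypothetical and only serve to clarify the idea; they are not part of the invention's actual implementation.

```python
from collections import deque


class PreemptiveEncoderQueue:
    """Illustrative sketch: frame skipping plus buffer preemption."""

    def __init__(self):
        # Encoded frames awaiting transmission, as (frame_id, relevant) pairs.
        self.buffer = deque()

    def submit(self, frame_id, relevant, skip_regular=False):
        if not relevant and skip_regular:
            # Frame skipping: a non-relevant frame is not processed further.
            return
        if relevant:
            # Preemption: flush queued regular frames so the relevant
            # frame can be transmitted in a shorter time.
            self.buffer = deque(f for f in self.buffer if f[1])
        self.buffer.append((frame_id, relevant))

    def transmit_next(self):
        # Transmit the oldest frame still in the buffer, if any.
        return self.buffer.popleft() if self.buffer else None
```

For example, if a regular frame is queued and a relevant frame then arrives, the regular frame is flushed and the relevant frame is transmitted first, mirroring the dashed-line cancellation in Figure 1.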

COMMERCIAL OPPORTUNITIES

The invention can be applied to visualising and processing low-delay video, with the purpose of remotely controlling a mobile device in a confined and complex environment. Figure 2 presents an example showing the remote control of a drone based on frame skipping and preemption. Further applications lie in the fields of autonomous driving and human-robot cooperation.

DEVELOPMENT STATUS

A prototype is in preparation.


Figure 1: Frame transmission without and with preemption, adapted from (1). With preemption the regular frame is canceled from the encoder buffer (dashed line in the right panel) in order to transmit the relevant frame in a shorter time.


Figure 2: The video from the camera mounted on the drone is encoded and sent with minimal delay to a server. When a relevant frame arrives (red square), the previous regular frame is flushed (blue square with blue dashed line); compare with Figure 1. On the server, the images are processed and utilized to localize the current position of the drone and to analyze the surrounding environment. The server sends the results of the processing back to the drone and triggers the next action.

REFERENCES

(1) IEEE Transactions on Multimedia, doi: 10.1109/TMM.2017.2726189 (published online).
(2) DE 10 2015 121 148 A1; WO 2017/093205 A1.

Dr. Tobia Mancabelli
E-Mail: tmancabelli@baypat.de
Phone: +49 (0) 89 5480177 - 11
Reference Number: B75157