MISC-TN-015: Proof-of-Concept of an industrial, high-frame-rate video recording/streaming system

Applies to: Machine Learning


==History==

{| class="wikitable"
! Version !! Date !! Notes
|-
| 1.0.0 || August 2020 || First public release
|}

==Introduction==

This Technical Note (TN for short) illustrates a Proof-of-Concept (PoC) that DAVE Embedded Systems made for a customer operating in the industrial automation market. The goal was to build a prototype of a high-frame-rate video recording/streaming system. In a typical scenario, illustrated in the following picture, this device would be used in fast automatic manufacturing lines for two purposes:

* remote monitoring
* detailed off-line "post-mortem" failure analysis.


Fig. 1: Typical usage scenario


In essence, the system consists of a high-frame-rate image sensor (*) shooting a specific area of the line. The frames captured by the sensor are fed to an embedded platform for further processing, as detailed in the following sections.


(*) Resolution and frame rate of this stream have to be carefully determined as a function of the characteristics of the scene to shoot, first and foremost the speed of the moving objects framed by the sensor and its lens. In the case under discussion, the customer specified a resolution of 1280x720, a frame rate of 300 fps, and the use of a global shutter.

==Functionalities==

Streaming capability is used to monitor the production line remotely. Under normal operation, this is enough for the human operators to get an overview of the line while it is working. For this purpose, a simple low-frame-rate video stream (25 fps or so), which can be viewed on a remote device, does the job.

The most interesting functionality, however, is the recording capability associated with alarm events. As shown in the previous image, the production line is governed by a Programmable Logic Controller (PLC), which is interfaced to several actuators and sensors. Of course, the line may be subject to different kinds of faults. The most severe ones, such as a major mechanical failure, can lead to the automatic stop of the line. Thanks to the aforementioned sensors, the PLC is aware of such faulty conditions. In these situations, it triggers an alarm signal directed to the video recording system. Whenever an alarm is detected, the recording system saves (on a persistent storage device) the high-frame-rate footage showing what happened right before and right after the alarm event. Automation engineers and maintenance personnel can afterwards leverage this fine-grained sequence of frames to analyze in detail the scene around the occurrence of the alarm event, searching for its root cause (this process is also referred to as post-mortem analysis).

==Software implementation==

Figure 1 also shows a simplified block diagram of the application software architecture that was developed to implement this solution. The application is a multi-threaded program. The high-level business logic is coded in a finite state machine (FSM), which interacts with the threads. Each thread takes care of a particular task.

During normal operation, the high-frame-rate stream generated by the image sensor is acquired by thread T1; this stream is indicated by the red flow in the previous picture. T1 also stores the incoming frames into the alarm buffer and passes them to thread T4 after a frame-rate down-conversion (the resulting low-frame-rate stream is denoted in green). In parallel, T4 creates a compressed video stream to be transmitted over the local network.
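
To give a concrete idea of how T1 could combine buffering and rate conversion, here is a minimal, purely illustrative sketch: it keeps the most recent frames of the time window (11 s, see the next section) in a circular buffer and forwards one frame out of every 12 (300 fps to 25 fps) to the encoding thread. All identifiers, the decimation factor, and the queue-based hand-off to T4 are assumptions made for illustration, not the actual implementation.

<syntaxhighlight lang="cpp">
// Illustrative sketch of the T1 acquisition loop: names, decimation factor
// and hand-off mechanism are assumptions, not the real implementation.
#include <cstddef>
#include <cstdint>
#include <deque>
#include <mutex>
#include <vector>

using Frame = std::vector<std::uint8_t>;       // one 1280x720, 8-bpp frame

constexpr std::size_t kFps          = 300;     // sensor frame rate
constexpr std::size_t kWindowSecs   = 11;      // size of the alarm time window
constexpr std::size_t kDecimation   = 12;      // 300 fps -> 25 fps
constexpr std::size_t kBufferFrames = kFps * kWindowSecs;

std::deque<Frame> alarmBuffer;                 // circular alarm buffer
std::deque<Frame> lowRateQueue;                // frames handed over to T4
std::mutex bufMtx, queueMtx;

// Called by T1 for every frame delivered by the sensor.
void onFrame(Frame frame, std::uint64_t frameIndex) {
    {
        std::lock_guard<std::mutex> lk(bufMtx);
        alarmBuffer.push_back(frame);          // copy into the alarm buffer
        if (alarmBuffer.size() > kBufferFrames)
            alarmBuffer.pop_front();           // drop the oldest frame
    }
    if (frameIndex % kDecimation == 0) {       // down-rate conversion
        std::lock_guard<std::mutex> lk(queueMtx);
        lowRateQueue.push_back(std::move(frame));  // hand over to T4
    }
}

int main() {
    for (std::uint64_t i = 0; i < 24; ++i)
        onFrame(Frame(1280 * 720, 0), i);      // feed dummy 8-bpp frames
    return 0;
}
</syntaxhighlight>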

When an alarm is detected by thread T3, T2—which is usually idle—is enabled. Once the alarm buffer is filled, this thread stores it persistently on a solid-state drive (SSD).
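
The dump performed by T2 could then look like the following sketch, in which the thread sleeps on a condition variable and, once woken up, writes the raw frames to a file on the SSD. The file path, the raw on-disk format, and the synchronization details are illustrative assumptions only.

<syntaxhighlight lang="cpp">
// Illustrative sketch of the T2 "dump to SSD" step for a single alarm; all
// names, the raw format and the wake-up mechanism are assumptions.
#include <condition_variable>
#include <cstdint>
#include <deque>
#include <fstream>
#include <mutex>
#include <thread>
#include <vector>

using Frame = std::vector<std::uint8_t>;

std::deque<Frame> alarmBuffer;        // filled by T1 (see previous sketch)
std::mutex bufMtx;
std::condition_variable bufferReady;  // signaled once the time window is complete
bool dumpRequested = false;

// Body of thread T2: normally idle, it wakes up when the alarm buffer
// holds the whole time window and stores it persistently.
void dumpOnce() {
    std::unique_lock<std::mutex> lk(bufMtx);
    bufferReady.wait(lk, [] { return dumpRequested; });
    std::ofstream out("/mnt/ssd/alarm_0001.raw", std::ios::binary);  // hypothetical path
    for (const Frame& f : alarmBuffer)
        out.write(reinterpret_cast<const char*>(f.data()),
                  static_cast<std::streamsize>(f.size()));
    dumpRequested = false;            // back to idle until the next alarm
}

int main() {
    {   // pretend T1 has already filled the buffer with a few frames
        std::lock_guard<std::mutex> lk(bufMtx);
        alarmBuffer.assign(3, Frame(1280 * 720, 0));
    }
    std::thread t2(dumpOnce);
    {   // what the alarm-handling logic would do once the window is complete
        std::lock_guard<std::mutex> lk(bufMtx);
        dumpRequested = true;
    }
    bufferReady.notify_one();
    t2.join();
    return 0;
}
</syntaxhighlight>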

The application also integrates a web interface (thread T5) that allows the user to supervise and control the recording/streaming system. For example, it can be used to enable/disable the alarm recording functionality and to read statistical information.
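
As a sketch of the kind of control surface T5 might expose, the web layer could translate incoming requests into simple commands acting on shared state. The command names and the statistics shown below are purely hypothetical.

<syntaxhighlight lang="cpp">
// Hypothetical control commands and statistics behind the web interface (T5);
// the actual command set is not detailed in this note.
#include <atomic>
#include <cstdint>
#include <cstdio>
#include <string>

std::atomic<bool>          alarmRecordingEnabled{true};
std::atomic<std::uint64_t> framesAcquired{0};
std::atomic<std::uint64_t> alarmsRecorded{0};

// Called by T5 for each request received from the web interface.
std::string handleCommand(const std::string& cmd) {
    if (cmd == "enable")  { alarmRecordingEnabled = true;  return "OK"; }
    if (cmd == "disable") { alarmRecordingEnabled = false; return "OK"; }
    if (cmd == "stats") {
        return "frames=" + std::to_string(framesAcquired.load()) +
               " alarms=" + std::to_string(alarmsRecorded.load());
    }
    return "ERROR: unknown command";
}

int main() {
    std::printf("%s\n", handleCommand("stats").c_str());
    std::printf("%s\n", handleCommand("disable").c_str());
    return 0;
}
</syntaxhighlight>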

===Sizing the alarm buffer===

The alarm buffer's size is related to the size of the time window surrounding the alarm event, as depicted in the following image.


[[File:MISC-TN-015-alarm-buffer.png|thumb|center|Fig.2: Time window surrounding an alarm event]]


Let t0 be the time associated with the occurrence of an alarm. t0-tB is the time before the alarm to be recorded. tA-t0 is the time after the alarm to be recorded. Consequently, tA-tB is the size of the entire window to be recorded. For example, for the PoC described here:

* t0-tB = 8s
* tA-t0 = 3s
* tA-tB = 11s

Once the size of the time window is known, it is straightforward to determine how much RAM is required for the buffer. In the case under discussion, the size of a frame is approximately 1280x720x8bpp = 921600 bytes. One second of recording thus takes 921600x300 ≈ 264 MByte. In conclusion, the buffer has to be at least 264x11 = 2904 MByte to contain all the required frames.
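
The same calculation can be expressed in a few lines of code. The sketch below assumes that the "MByte" figures above are binary megabytes (MiB); the small difference with respect to the rounded 264x11 = 2904 MByte value is only due to rounding.

<syntaxhighlight lang="cpp">
// Alarm-buffer sizing for the values used in this PoC (1280x720, 8 bpp,
// 300 fps, 11 s time window).
#include <cstdint>
#include <cstdio>

int main() {
    const std::uint64_t width = 1280, height = 720, bytesPerPixel = 1;  // 8 bpp
    const std::uint64_t fps = 300;
    const std::uint64_t windowSeconds = 11;          // tA - tB = 8 s + 3 s

    const std::uint64_t frameBytes  = width * height * bytesPerPixel;   // 921600
    const std::uint64_t secondBytes = frameBytes * fps;                 // ~264 MiB
    const std::uint64_t bufferBytes = secondBytes * windowSeconds;      // ~2.8 GiB

    std::printf("frame:  %llu bytes\n",
                static_cast<unsigned long long>(frameBytes));
    std::printf("1 s:    %.1f MiB\n", secondBytes / (1024.0 * 1024.0));
    std::printf("buffer: %.1f MiB (%.2f GiB)\n",
                bufferBytes / (1024.0 * 1024.0),
                bufferBytes / (1024.0 * 1024.0 * 1024.0));
    return 0;
}
</syntaxhighlight>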


When an alarm is detected, the FSM, in turn, enters "alarm mode." This mode is used to finish filling the alarm buffer and to store its content persistently on the SSD, as described above.

===Alarm recordings===

==Future work==

Possible future developments include:

* ML
* alarm detection latency

==Credits==