In essence, the system consists of a high-frame-rate image sensor (*) interfaced to a Linux-based embedded platform. The sensor frames a specific area of the line and sends a constant-rate flow of frames to a processing platform (also denoted as PP in the rest of the document) for further processing, as detailed in the following sections.
Streaming capability is used to monitor the production line remotely. Under normal operation, this is enough for the human operators to get an overview of the line while it is working. For this purpose, a simple low-frame-rate video stream (for example, 25 fps), which can be viewed on a remote device, does the job.
The most interesting functionality, however, is related to the recording capability associated with alarm events. As shown in the previous image, the production line is governed by a Programmable Logic Controller (PLC), which is interfaced to several actuators and sensors. Of course, the line may be subject to different kinds of faults. The most severe, such as a major mechanical failure, can lead to the automatic stop of the line. Thanks to the aforementioned sensors, the PLC is notified in real time of such faulty conditions. In these situations, it triggers an alarm signal directed to the video recording system. Whenever an alarm is detected, the recording system saves (on a persistent storage device) high-frame-rate footage showing what happened right before and right after the alarm event. Automation engineers and maintenance personnel can then leverage this fine-grained sequence of frames to analyze in detail the scene around the occurrence of the alarm event, searching for its root cause (this process is also referred to as ''post-mortem analysis'').
==Software implementation==
[[#fig1|Figure 1]] also shows a simplified block diagram of the application software architecture that was developed to implement this solution. The application is a multi-threaded program, which runs on the processing platform. The high-level business logic is implemented as a finite state machine (FSM), which interacts with the threads. Each thread takes care of a particular task.
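As a rough illustration of this architecture, the following C++ sketch models the FSM and its interaction with the threads. The state names and transitions are assumptions made for the sake of the example; they are not taken from the actual application code.
<pre>
// Simplified model of the FSM driving the recording logic. State names and
// transitions are hypothetical and only meant to illustrate the idea.
#include <atomic>
#include <iostream>

enum class State { Idle, Streaming, AlarmTriggered, SavingBuffer };

class RecorderFsm {
public:
    void on_start()       { if (state_ == State::Idle)           state_ = State::Streaming; }
    void on_alarm()       { if (state_ == State::Streaming)      state_ = State::AlarmTriggered; }
    void on_buffer_full() { if (state_ == State::AlarmTriggered) state_ = State::SavingBuffer; }
    void on_save_done()   { if (state_ == State::SavingBuffer)   state_ = State::Streaming; }
    State state() const   { return state_; }
private:
    std::atomic<State> state_{State::Idle};   // written by the FSM, read by the threads
};

int main() {
    RecorderFsm fsm;
    fsm.on_start();        // T1/T4 start acquiring and streaming
    fsm.on_alarm();        // T3 reports an alarm coming from the PLC
    fsm.on_buffer_full();  // the alarm buffer now covers [t_B, t_A]
    fsm.on_save_done();    // T2 has finished writing to the SSD
    std::cout << "final state: " << static_cast<int>(fsm.state()) << std::endl;
    return 0;
}
</pre>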
During normal operation, the high-frame-rate stream generated by the image sensor is acquired by thread T1. This stream is indicated by the red flow in the previous picture. T1 also stores the frames coming from the sensor into an alarm buffer and passes them to thread T4 after a down-rate conversion (the low-frame-rate stream is denoted in green). In parallel, T4 creates a compressed video stream to be transmitted over the local network.
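The following C++ sketch gives an idea of how the acquisition path of T1 could look: every captured frame is pushed into a fixed-size circular alarm buffer, while only one frame out of N is forwarded to the streaming thread. The data structures and the simulated capture loop are assumptions made for illustration purposes only.
<pre>
// Sketch of the acquisition path handled by T1: every frame goes into a
// fixed-size circular alarm buffer, while only 1 frame out of 12 is forwarded
// to the streaming thread (T4). Data structures are illustrative only.
#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

struct Frame {
    std::vector<uint8_t> pixels;   // 1280x720, 8 bpp
    uint32_t timestamp;            // sensor free-running counter (20 ns ticks)
};

class AlarmBuffer {
public:
    explicit AlarmBuffer(std::size_t capacity) : capacity_(capacity) {}
    void push(const Frame& f) {
        if (frames_.size() == capacity_) frames_.pop_front();   // overwrite the oldest frame
        frames_.push_back(f);
    }
    const std::deque<Frame>& frames() const { return frames_; }
private:
    std::size_t capacity_;
    std::deque<Frame> frames_;
};

// Down-rate conversion: 300 fps from the sensor, ~25 fps towards the encoder.
constexpr long kDecimation = 300 / 25;   // keep 1 frame out of 12

void t1_iteration(const Frame& captured, AlarmBuffer& buf,
                  std::deque<Frame>& to_encoder, long& counter) {
    buf.push(captured);                              // red flow: full-rate alarm buffer
    if (counter++ % kDecimation == 0)
        to_encoder.push_back(captured);              // green flow: low-frame-rate stream
}

int main() {
    AlarmBuffer buf(3300);                           // ~11 s at 300 fps
    std::deque<Frame> to_encoder;
    long counter = 0;
    Frame f{std::vector<uint8_t>(1280 * 720), 0};
    for (int i = 0; i < 300; ++i) {                  // simulate one second of capture
        f.timestamp += 166667;                       // ~3.33 ms expressed in 20 ns ticks
        t1_iteration(f, buf, to_encoder, counter);
    }
    return 0;                                        // buf: 300 frames, to_encoder: 25 frames
}
</pre>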
When an alarm is detected by thread T3, thread T2—which is usually idle—is enabled. Once the alarm buffer is filled, this thread stores it persistently on a solid-state drive (SSD).
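A minimal sketch of this idle/wake-up pattern is shown below. The condition-variable handshake is an assumption; the real application may coordinate T2 and T3 differently.
<pre>
// Sketch of the idle/wake-up pattern: T2 sleeps on a condition variable and is
// woken up once the alarm buffer is complete, then persists it to the SSD.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <thread>

std::mutex m;
std::condition_variable cv;
bool buffer_ready = false;   // set once all frames in [t_B, t_A] have been captured

void t2_worker() {
    std::unique_lock<std::mutex> lk(m);
    cv.wait(lk, [] { return buffer_ready; });   // T2 stays idle until notified
    // Here the real application would walk the alarm buffer and write the
    // frames (plus the generated video file) to the dedicated SSD.
    std::cout << "T2: alarm buffer persisted" << std::endl;
}

int main() {
    std::thread t2(t2_worker);
    {
        std::lock_guard<std::mutex> lk(m);      // simulate T3 detecting the alarm
        buffer_ready = true;                    // and the buffer filling up
    }
    cv.notify_one();
    t2.join();
    return 0;
}
</pre>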
The application also integrates a web interface (thread T5) that makes it possible to supervise and control the recording/streaming system. For example, it can be used to enable/disable the alarm recording functionality and to read statistical information.
Let t<sub>0</sub> be the time associated with the occurrence of an alarm. t<sub>0</sub>-t<sub>B</sub> is the time window before the alarm to be recorded. t<sub>A</sub>-t<sub>0</sub> is the time window after the alarm to be recorded. Consequently, t<sub>A</sub>-t<sub>B</sub> is the size of the entire window to be recorded. Specifically, for the PoC described here:
*t<sub>0</sub>-t<sub>B</sub> = 8s
*t<sub>A</sub>-t<sub>0</sub> = 3s
*t<sub>A</sub>-t<sub>B</sub> = 11s.
Once the size of the time window is known, it is straightforward to determine how much RAM is required for the buffer. In the case under discussion, the size of a frame is approximately 1280x720x8bpp=921600 bytes. One second of recording at 300 fps is thus 921600x300≈264 MByte. Therefore, the buffer has to be at least 264x11=2904 MByte to contain all the required frames.
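The same arithmetic can be expressed in a few lines of C++, which makes it easy to adapt the calculation to other frame sizes or window lengths:
<pre>
// Worked version of the buffer sizing above. Frame geometry (1280x720, 8 bpp),
// the 300 fps rate and the 11 s window come from the calculation in the text.
#include <cstdio>

int main() {
    const double frame_bytes   = 1280.0 * 720.0;          // 8 bpp = 1 byte per pixel
    const double fps           = 300.0;
    const double window_s      = 11.0;                    // t_A - t_B
    const double bytes_per_sec = frame_bytes * fps;
    const double buffer_bytes  = bytes_per_sec * window_s;

    std::printf("frame size : %.0f bytes\n", frame_bytes);                 // 921600
    std::printf("one second : %.0f MByte\n", bytes_per_sec / (1 << 20));   // ~264
    std::printf("11 s buffer: %.0f MByte\n", buffer_bytes / (1 << 20));    // ~2900
    return 0;
}
</pre>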
===Alarm recordings===
The processing platform is equipped with a dedicated SSD that is used to store alarm buffers only. The operating system and the application software are stored on a different flash memory instead. Of course, the size of the SSD should be determined depending on the maximum number of alarms to be stored at the same time. The directory containing the files associated with the alarm events is also shared via the [https://en.wikipedia.org/wiki/Samba_(software) Samba (SMB) protocol]. This also makes it easy to access these files from Windows PCs connected to the same local network.
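As a rough sizing aid, the snippet below multiplies the per-alarm footprint derived above by an assumed retention requirement. The number of alarms is a made-up example; the real footprint also includes BMP headers and the generated video file.
<pre>
// Rough SSD sizing: the drive must hold the worst-case number of alarms kept
// at the same time. The per-alarm footprint is derived from the buffer size
// computed above; the retention requirement (20 alarms) is a made-up example.
#include <cstdio>

int main() {
    const double per_alarm_mbyte = 2904.0;   // raw frame data per alarm; BMP headers
                                             // and the video file add a bit more
    const int    max_alarms      = 20;       // assumed retention requirement
    std::printf("minimum SSD capacity: ~%.0f GByte\n",
                per_alarm_mbyte * max_alarms / 1024.0);
    return 0;
}
</pre>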
For each alarm, a separate subdirectory is created. In such a subdirectory, the following files are stored:
* the individual frames, saved as BMP files;
* a video file generated from the individual frames.
It is worth mentioning the naming scheme used for the BMP files. From the programming perspective, each frame retrieved from the image sensor is an instance of a class (the application software makes use of the object-oriented programming (OOP) model). Interestingly, these instances include a timestamp besides raw pixel data. This timestamp, which has a 20ns resolution, is based on a free-running counter integrated into the image sensor and clocked by a local clock. Every time a new frame is captured, it is associated with the current value of this counter. That being the case, timestamps are not that useful if taken separately, because they are expressed as an absolute value, which is not related to any usable reference clock. They can be extremely valuable, however, if used as relative quantities, for instance when it comes to determining the elapsed time between two frames. For instance, let's assume that we want to measure the time (ΔT) between two frames, say F1 and F2. Let TS1 and TS2 be the associated timestamps, respectively. To calculate ΔT expressed in ns, we just need to compute (TS2-TS1)x20. An example of the use of such timestamps is described in [[#Real-timeness and frame dropping|this section]].
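To make the calculation concrete, here is a minimal C++ sketch of the elapsed-time computation. Only the 20ns tick comes from the description above; the 32-bit counter width and the wrap-around handling are assumptions.
<pre>
// Elapsed time between two frames from their sensor timestamps. Only the 20 ns
// tick comes from the description above; the 32-bit counter width and the
// wrap-around handling are assumptions.
#include <cstdint>
#include <cstdio>

constexpr uint64_t kTickNs = 20;   // one counter tick = 20 ns

uint64_t delta_ns(uint32_t ts1, uint32_t ts2) {
    // Unsigned subtraction transparently handles a single counter wrap-around.
    return static_cast<uint64_t>(static_cast<uint32_t>(ts2 - ts1)) * kTickNs;
}

int main() {
    const uint32_t ts1 = 1000000;   // timestamp of frame F1
    const uint32_t ts2 = 1166667;   // timestamp of frame F2
    std::printf("dT = %llu ns\n",   // ~3.33 ms, i.e. one frame period at 300 fps
                static_cast<unsigned long long>(delta_ns(ts1, ts2)));
    return 0;
}
</pre>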
When saving the frames as BMP files, though, the timestamp value would be lost. To avoid losing this precious information, file names are formatted such that timestamps are preserved too. See for example the following list, which refers to an alarm triggered on August 6th, 2020 at 10:30:08 Central European Summer Time (CEST):
<pre class="board-terminal">
sysadmin@hfrcpoc1-0:/mnt/alarms/2020-08-06_10.30.08_CEST$ ll
</pre>
The name of the subdirectory (<code>2020-08-06_10.30.08_CEST</code>) is self-explanatory. File name formatting is a bit trickier. It looks like this:
<pre>
<progressive counter>_<time offset relative to alarm frame>.bmp
</pre>
The first part is a progressive counter, starting from 000000. In other words, the image whose file name is something like 000000_x.bmp refers to the frame captured at the beginning of the recording window (t<sub>B</sub>). At the other end, the image whose file name starts with the highest counter refers to the frame captured at the end of the recording window (t<sub>A</sub>). This scheme is convenient for automatically processing the frames because it makes it very easy to order them.
The second part of the image file names is the relative time offset (expressed in seconds) with respect to the "alarm frame." The alarm frame, in turn, is the first frame captured after the detection of an alarm signal; in other words, it is supposed to be temporally the closest to the event causing the alarm. Consequently, the alarm frame is always named <code>n_+0.000000000.bmp</code>, and it is associated with t<sub>0</sub>. In the example shown above, the alarm frame's name is <code>002424_+0.000000000.bmp</code>. This rule makes it straightforward to determine how close a frame is to the alarm event. For instance, see the following image, which is a screenshot captured on the processing platform itself.
Just by looking at the file name, one can understand that the frame shown was captured about 1.58s before the alarm event. Of course, the same image can be viewed on a remote Windows PC too, by accessing the shared directory via SMB.
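For completeness, the naming scheme can also be parsed programmatically, for instance when post-processing the frames. The following C++ sketch is just an illustration; apart from the alarm frame shown above, the sample file names are made up.
<pre>
// Parsing a frame file name back into its two components. Apart from the alarm
// frame 002424_+0.000000000.bmp, the sample names below are made up.
#include <cstdio>
#include <string>

struct FrameName {
    long   counter;    // position inside the recording window (000000 = t_B)
    double offset_s;   // seconds relative to the alarm frame (t_0)
};

bool parse_frame_name(const std::string& name, FrameName& out) {
    // Expected layout: <6-digit counter>_<signed offset>.bmp
    return std::sscanf(name.c_str(), "%ld_%lf.bmp", &out.counter, &out.offset_s) == 2;
}

int main() {
    const char* samples[] = {
        "002424_+0.000000000.bmp",   // the alarm frame (t_0)
        "000000_-8.000000000.bmp"    // hypothetical first frame of the window (t_B)
    };
    for (const char* s : samples) {
        FrameName f{};
        if (parse_frame_name(s, f))
            std::printf("%s -> counter=%06ld, offset=%+.9f s\n", s, f.counter, f.offset_s);
    }
    return 0;
}
</pre>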