ML-TN-005 — Real-time Social Distancing and video streaming on Orca SBC
History
Version | Date | Notes
1.0.0 | January 2022 | First public release
Introduction
This Technical Note (TN for short) illustrates an interesting use case of the i.MX8M Plus-powered Orca Single Board Computer. In a nutshell, this example demonstrates the following functionalities, which are typical for many video processing applications:
- Capturing live video streams from professional MIPI CSI-2 camera modules
- Hardware-accelerated color space conversion (Graphics Processing Unit, GPU)
- Hardware-accelerated real-time inferencing (Neural Processing Unit, NPU)
- Hardware-accelerated video encoding (Video Processing Unit, VPU)
- Hardware-accelerated video decoding (Video Processing Unit, VPU)
- Hardware-accelerated GUI rendering (Graphics Processing Unit, GPU)
Testbed
The testbed consists of two Orca SBCs connected as shown in the following diagram.
Orca SBC #1 is also interfaced to the following camera modules by Basler:
Both SBCs run Yocto Linux based on NXP's 5.4.70 BSP.
Implementation
Camera module interfacing
The camera modules are connected to the MIPI CSI-2 ports of the i.MX8M Plus; each uses 4 MIPI lanes.
The camera modules are configured as follows:
- camera module #1 (daA2500-60mc)
- resolution: 640x360
- frame rate: 15fps
- camera module #2 (daA3840-30mc)
- resolution: 1280x720
- frame rate: 30fps
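On NXP's i.MX8M Plus BSP, this kind of capture-and-convert chain is typically expressed as a GStreamer pipeline. The following is a minimal, hypothetical sketch for camera module #1, using the GPU/G2D-based converter from NXP's imx-gst1.0-plugin set; the device node, caps, and output format are assumptions, not taken from the actual application:

```shell
# Sketch only: capture 640x360 @ 15 fps from camera #1 (device node assumed)
# and let the GPU/G2D converter perform the color space conversion.
gst-launch-1.0 -v \
  v4l2src device=/dev/video0 ! \
  "video/x-raw,width=640,height=360,framerate=15/1" ! \
  imxvideoconvert_g2d ! "video/x-raw,format=RGBA" ! \
  fakesink
```

In a real deployment, `fakesink` would be replaced by the element that hands frames to the inference application.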
Video stream processing
Stream #1
The stream from camera #1 is the one processed by the Social Distancing (SD) algorithm, which is described in detail in this document. As such, this stream is fed to a neural network running on the NPU. The stream is also color-space converted to meet the input requirements of the network; to offload the Arm Cortex-A53 cores, this conversion is performed by the GPU.
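At its core, the per-frame distancing check reduces to comparing pairwise distances between detected people. The following is a minimal, illustrative sketch, not the actual algorithm: the function name, the 2 m threshold, and the assumption that detections have already been mapped from pixels to ground-plane coordinates are all hypothetical.

```python
from itertools import combinations
from math import dist  # Python 3.8+

def distancing_violations(centroids, min_dist=2.0):
    """Return index pairs of detections closer than min_dist.

    centroids: list of (x, y) positions in metres, i.e. already mapped
    from image pixels to ground-plane coordinates by some calibration.
    """
    return [
        (i, j)
        for (i, a), (j, b) in combinations(enumerate(centroids), 2)
        if dist(a, b) < min_dist
    ]

# Example: three detected people; only the first two are too close.
people = [(0.0, 0.0), (1.0, 0.5), (5.0, 5.0)]
print(distancing_violations(people))  # [(0, 1)]
```

The result of such a check is what drives the overlays drawn on the output stream and the parameters reported to the GUI.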
The output stream of the SD algorithm is hardware-encoded before being streamed over a Gigabit Ethernet connection. On the Orca SBC #2 side, the stream is hardware-decoded and displayed on an HDMI monitor.
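The encode/stream and decode/display legs can be sketched with NXP's VPU GStreamer elements (`vpuenc_h264`, `vpudec`). This is a hypothetical configuration: the RTP/H.264 payloading, host address, port, and the use of `videotestsrc` as a stand-in for the SD output are all assumptions.

```shell
# Orca SBC #1 (sketch): VPU-encode the processed stream and send it as RTP/H.264.
gst-launch-1.0 videotestsrc ! "video/x-raw,width=640,height=360" ! \
  vpuenc_h264 ! rtph264pay ! udpsink host=192.168.0.2 port=5000

# Orca SBC #2 (sketch): receive, VPU-decode and display on the HDMI monitor.
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! \
  rtph264depay ! h264parse ! vpudec ! waylandsink
```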
Stream #2
Stream #2 originates at the daA3840-30mc camera module. It is then hardware-encoded and streamed by Orca SBC #1. Finally, it is received by Orca SBC #2, which hardware-decodes and displays it.
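Since stream #2 goes straight from the camera to the encoder, capture and encoding can live in a single pipeline. Again a hypothetical sketch: the device node, address, and port are assumptions.

```shell
# Orca SBC #1 (sketch): capture from the daA3840-30mc at 1280x720 @ 30 fps
# and VPU-encode in one pipeline; the receiver side mirrors stream #1.
gst-launch-1.0 v4l2src device=/dev/video1 ! \
  "video/x-raw,width=1280,height=720,framerate=30/1" ! \
  vpuenc_h264 ! rtph264pay ! udpsink host=192.168.0.2 port=5001
```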
GUI application
A GUI application runs on Orca SBC #2 as well. This application is engineered with Crank Storyboard 6 and shows some parameters of the Social Distancing algorithm while it operates. The GUI application communicates with the Social Distancing application (which is written in Python) through the Storyboard IO API. According to the official documentation, Storyboard IO, formerly known as GREIO, provides a platform-independent communication API that allows inter-task and inter-process queued message passing; it is primarily used to allow external communication with a Storyboard application. For more details, please see this article.
Testing
Functional tests were conducted in an environment that mimics a real-world scenario. The goal is to use the SD algorithm to detect people in a hazardous area where a cobot operates and where, consequently, only authorized personnel are allowed to enter. The cobot shown below is part of the actual ATE test benches of DAVE Embedded Systems' manufacturing department.
Orca SBC #1 is installed near the cobot, while Orca SBC #2 is located in a different room, where human operators can monitor the test bench remotely. Of course, more advanced notification mechanisms could also be implemented to alert operators when people are detected in the hazardous area.