ML-TN-005 — Real-time Social Distancing and video streaming on Orca SBC

From DAVE Developer's Wiki

Latest revision as of 11:15, 27 January 2022

Applies to: Machine Learning


History

Version | Date         | Notes
1.0.0   | January 2022 | First public release

Introduction

This Technical Note (TN for short) illustrates an interesting use case of the i.MX8M Plus-powered Orca Single Board Computer. In a nutshell, this example demonstrates the following functionalities, which are typical for many video processing applications:

  • Capturing live video streams from professional MIPI CSI-2 camera modules
  • Hardware-accelerated color space conversion (Graphics Processing Unit, GPU)
  • Hardware-accelerated real-time inferencing (Neural Processing Unit, NPU)
  • Hardware-accelerated video encoding (Video Processing Unit, VPU)
  • Hardware-accelerated video decoding (Video Processing Unit, VPU)
  • Hardware-accelerated GUI (Graphics Processing Unit, GPU)

Testbed

The testbed consists of two Orca SBCs connected as shown in the following diagram.

Figure: Dual Camera Module with Machine Learning Processing Unit
Figure: Test Bed block diagram
Figure: Decoded and Display Module

Orca SBC #1 is also interfaced to the following camera modules by Basler:

  • daA2500-60mc
  • daA3840-30mc

Both SBCs run Yocto Linux based on the 5.4.70 BSP by NXP.

Implementation

Camera modules interfacing

The camera modules are connected to the MIPI CSI-2 ports of the i.MX8M Plus. Each one uses 4 MIPI lanes.

The camera modules are configured as follows:

  • camera module #1 (daA2500-60mc)
    • resolution: 640x360
    • frame rate: 15fps
  • camera module #2 (daA3840-30mc)
    • resolution: 1280x720
    • frame rate: 30fps
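These figures also help explain why hardware-accelerated encoding matters: even at these moderate resolutions, the raw data rates are substantial. The back-of-the-envelope calculation below assumes 12 bits (1.5 bytes) per pixel, as in YUV 4:2:0 formats; this is an assumption, as the actual pixel formats depend on the camera and pipeline configuration.

```python
# Back-of-the-envelope raw data rate for the two configured streams.
# Assumes 1.5 bytes per pixel (YUV 4:2:0); the actual formats used on
# the testbed may differ.

BYTES_PER_PIXEL = 1.5  # YUV 4:2:0

def raw_rate_mbit_s(width, height, fps):
    """Uncompressed data rate of a video stream in Mbit/s."""
    return width * height * BYTES_PER_PIXEL * fps * 8 / 1e6

stream1 = raw_rate_mbit_s(640, 360, 15)    # camera module #1
stream2 = raw_rate_mbit_s(1280, 720, 30)   # camera module #2

print(f"stream #1: {stream1:.1f} Mbit/s")  # 41.5 Mbit/s
print(f"stream #2: {stream2:.1f} Mbit/s")  # 331.8 Mbit/s
```

Combined, the two raw streams amount to roughly 373 Mbit/s before any protocol overhead, a large share of a Gigabit Ethernet link, which motivates offloading compression to the VPU before streaming.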

Video streams processing

Stream #1

The stream from camera #1 is the one processed by the Social Distancing (SD) algorithm, which is described in detail in this document. As such, it is fed to an NPU-powered neural network. The stream is also color-space converted to fulfill the input requirements of the network; to offload the ARM Cortex-A53 cores, this conversion is performed by the GPU.
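As a rough illustration of what such a conversion involves, the sketch below applies the BT.601 full-range YUV-to-RGB equations to a single pixel in plain Python. The source format and coefficients are assumptions made for the example; on the Orca SBC, this per-pixel work runs on the GPU, not the CPU.

```python
# Illustrative YUV -> RGB conversion for one pixel, using the BT.601
# full-range equations (an assumption; the actual pipeline's source
# format and coefficients may differ). On the testbed this kind of
# per-pixel work is offloaded to the GPU.

def yuv_to_rgb(y, u, v):
    """Convert one full-range 8-bit YUV (YCbCr) pixel to 8-bit RGB."""
    d, e = u - 128, v - 128
    r = y + 1.402 * e
    g = y - 0.344136 * d - 0.714136 * e
    b = y + 1.772 * d
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # mid-gray -> (128, 128, 128)
print(yuv_to_rgb(255, 128, 128))  # white    -> (255, 255, 255)
```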

The output stream of the SD algorithm is hardware-encoded before being streamed over a Gigabit Ethernet connection. On the Orca SBC #2 side, the stream is hardware-decoded and displayed on an HDMI monitor.

Stream #2

Stream #2 originates at the daA3840-30mc camera module. It is hardware-encoded and streamed by the Orca SBC #1, then received by the Orca SBC #2, which hardware-decodes and displays it.

GUI application

A GUI application runs on the Orca SBC #2 as well. This application is engineered with Crank Storyboard 6 and shows some parameters of the Social Distancing algorithm while it operates. The GUI application communicates remotely with the Social Distancing application, which is written in Python, through the Storyboard IO API. According to the official documentation, "the Storyboard IO API, formerly known as GREIO, provides a platform independent communication API that allows inter-task and inter-process queued message passing. This is primarily used to allow external communication with a Storyboard application". For more details, please see this article.
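Conceptually, this kind of queued message passing can be pictured as in the sketch below. Note that it does not use the real Storyboard IO (GREIO) API: the queue, the event name, and the payload fields are all made-up stand-ins, only meant to show how an SD application might push parameter updates that a GUI side consumes.

```python
# Conceptual stand-in for Storyboard IO-style queued message passing.
# Everything here (the queue, the event name, the payload fields) is
# hypothetical; the real application uses the Storyboard IO API.
import json
import queue

channel = queue.Queue()  # stand-in for the Storyboard IO channel

def send_event(name, payload):
    """SD application side: enqueue a named event with a JSON payload."""
    channel.put(json.dumps({"event": name, "data": payload}))

def receive_event():
    """GUI side: dequeue and decode the next event."""
    return json.loads(channel.get())

# Hypothetical parameter update pushed by the SD application:
send_event("sd_update", {"people": 3, "violations": 1})
msg = receive_event()
print(msg["event"], msg["data"])
```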

Testing

Functional tests were conducted in an environment that mimics a real-world scenario. The goal is to use the SD algorithm for people detection in a hazardous area where a cobot operates and thus where only authorized personnel are allowed to enter. The cobot shown below is part of the actual ATE test benches of DAVE Embedded Systems' manufacturing department.

Orca SBC #1 is installed in the proximity of the cobot, while Orca SBC #2 is located in a different room where human operators can monitor the test bench remotely. Of course, advanced notification mechanisms could also be implemented to alert operators if people are detected in the hazardous area.
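As a sketch of how such a notification hook could look, the snippet below raises an alarm whenever a detection centroid falls inside a rectangular hazardous area. All names, coordinates, and detections are hypothetical; a real deployment would derive the zone from the camera setup and hook the check into the SD application's detection loop.

```python
# Hypothetical notification hook: flag an alarm when any detected
# person's centroid falls inside a rectangular hazardous area.
# Zone coordinates and detections below are made up, in pixel units.

HAZARD_ZONE = (200, 100, 500, 400)  # x_min, y_min, x_max, y_max

def in_zone(cx, cy, zone=HAZARD_ZONE):
    """True if the centroid (cx, cy) lies inside the hazardous area."""
    x0, y0, x1, y1 = zone
    return x0 <= cx <= x1 and y0 <= cy <= y1

def check_detections(centroids):
    """Return the centroids that violate the hazardous area."""
    return [c for c in centroids if in_zone(*c)]

detections = [(120, 80), (350, 250), (510, 390)]  # hypothetical
intruders = check_detections(detections)
if intruders:
    print(f"ALARM: {len(intruders)} person(s) in hazardous area")
```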

Video: Simultaneous dual video capturing and encoding plus Machine Learning People detector on NXP i.MX8MP