
DAVE Developer's Wiki β

ML-TN-005 — Real-time Social Distancing and video streaming on Orca SBC

Revision as of 15:47, 7 December 2021 by U0001

Applies to: Machine Learning



History

Version | Date          | Notes
1.0.0   | December 2021 | First public release

Introduction

This Technical Note (TN for short) illustrates an interesting use case of the i.MX8M Plus-powered Orca Single Board Computer. In a nutshell, this example demonstrates the following functionalities, which are typical for many video processing applications:

  • Capturing live video streams from MIPI CSI-2 camera modules
  • Hardware-accelerated color space conversion (GPU)
  • Hardware-accelerated real-time inferencing (NPU)
  • Hardware-accelerated video encoding (two streams)
  • Hardware-accelerated video decoding (two streams)
  • Hardware-accelerated GUI (GPU).

Testbed

The testbed consists of two Orca SBCs connected as shown in the following diagram.



Orca SBC #1 is also interfaced to the following Basler camera modules:

  • daA2500-60mc
  • daA3840-30mc

Both SBCs run Yocto Linux based on NXP's 5.4.70 BSP.

Implementation

Camera modules interfacing

The camera modules are connected to the MIPI CSI-2 ports of the i.MX8M Plus. Each one uses 4 MIPI data lanes.

The camera modules are configured as follows:

  • camera module #1: daA2500-60mc
    • resolution: 640x360
    • frame rate: 15 fps
  • camera module #2: daA3840-30mc
    • resolution: 1280x720
    • frame rate: 30 fps
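The configuration above could be applied with the standard `v4l2-ctl` utility. The following sketch only builds the invocations; the device nodes (`/dev/video0`, `/dev/video1`) and the UYVY pixel format are assumptions, since the actual nodes and formats depend on the BSP's media graph.

```python
# Sketch: build v4l2-ctl invocations for the two capture devices.
# Device nodes and pixel format are illustrative assumptions.
import shlex

CAMERAS = {
    "/dev/video0": {"width": 640,  "height": 360,  "fps": 15},  # daA2500-60mc
    "/dev/video1": {"width": 1280, "height": 720,  "fps": 30},  # daA3840-30mc
}

def v4l2_commands(cameras):
    """Return the v4l2-ctl command lines that set format and frame rate."""
    cmds = []
    for dev, cfg in cameras.items():
        # Set capture resolution and pixel format
        cmds.append(
            f"v4l2-ctl -d {dev} "
            f"--set-fmt-video=width={cfg['width']},height={cfg['height']},pixelformat=UYVY"
        )
        # Set the streaming frame rate
        cmds.append(f"v4l2-ctl -d {dev} --set-parm={cfg['fps']}")
    return [shlex.split(c) for c in cmds]
```

Each returned list can be passed to `subprocess.run` on the target.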

Video streams processing

Stream #1

The stream from camera module #1 is the one submitted to the Social Distancing (SD) algorithm, which is described in detail in this document. As such, this stream is fed to an NPU-powered neural network. It is also color-space converted to fulfill the input requirements of the network; to offload the ARM Cortex-A53 cores, the conversion is performed by the GPU.
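A GStreamer pipeline of this kind could assemble the capture and conversion stages. `v4l2src` and `imxvideoconvert_g2d` (the GPU 2D-core color-space converter from NXP's i.MX plugins) are real elements; the output format, appsink name, and exact caps are illustrative assumptions.

```python
# Sketch of the capture + GPU color-space-conversion leg of stream #1,
# expressed as a gst-launch-style pipeline string.
def stream1_pipeline(dev="/dev/video0", width=640, height=360, fps=15):
    return (
        f"v4l2src device={dev} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "imxvideoconvert_g2d ! "     # GPU-accelerated CSC, offloads the A53 cores
        "video/x-raw,format=RGBA ! " # assumed format expected by the network
        "appsink name=sd_input"      # frames pulled by the Python SD application
    )
```

The SD application would pull buffers from the `sd_input` appsink and hand them to the NPU-backed inference engine.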

The output stream of the SD algorithm is hardware-encoded before being streamed over a Gigabit Ethernet connection. On the Orca SBC #2 side, the stream is hardware-decoded and visualized on an HDMI monitor.
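The encode/stream and receive/decode legs could look like the following gst-launch-style strings. `vpuenc_h264` and `vpudec` are the VPU encoder/decoder elements from NXP's i.MX plugins; the RTP-over-UDP transport, the port, and the receiver address are illustrative assumptions, not details stated in this note.

```python
# Sketch of the sender (Orca SBC #1) and receiver (Orca SBC #2) pipelines.
SBC2_ADDR = "192.168.0.2"   # hypothetical address of Orca SBC #2
PORT = 5000                 # hypothetical RTP port

sender = (
    "appsrc name=sd_output ! "     # annotated frames from the SD algorithm
    "vpuenc_h264 ! rtph264pay ! "  # hardware H.264 encoding on the VPU
    f"udpsink host={SBC2_ADDR} port={PORT}"
)

receiver = (
    f"udpsrc port={PORT} "
    'caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! '
    "rtph264depay ! h264parse ! "
    "vpudec ! "                    # hardware H.264 decoding on the VPU
    "waylandsink"                  # rendered on the HDMI monitor
)
```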

Stream #2

Stream #2 originates at the daA3840-30mc camera module. It is then hardware-encoded and streamed by Orca SBC #1. Finally, it is received by Orca SBC #2, which hardware-decodes and visualizes it.

GUI

A GUI application also runs on Orca SBC #1. This application is engineered with Crank Storyboard 6 and shows some parameters of the Social Distancing algorithm while it operates. The GUI application communicates with the SD application — which is written in Python — through the Storyboard IO API. According to the official documentation, the Storyboard IO API, formerly known as GREIO, provides a platform-independent communication API that allows inter-task and inter-process queued message passing. This is primarily used to allow external communication with a Storyboard application.
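The actual link uses Crank's Storyboard IO channel; the snippet below only illustrates the queued key/value message-passing pattern with the Python standard library. The event name, keys, and helper are hypothetical, not the real Storyboard IO API.

```python
# Illustrative sketch of queued message passing between the SD application
# and a GUI consumer. This is NOT the Storyboard IO API; it only shows the
# pattern of pushing named data-change events onto a channel.
import queue

sbio_channel = queue.Queue()  # stands in for a named Storyboard IO channel

def publish_sd_stats(q, people_count, violations):
    """Push an event the GUI side would map onto its model variables."""
    q.put({
        "event": "sd_update",                 # hypothetical event name
        "data": {"people_count": people_count,
                 "violations": violations},
    })

# The SD loop would call this after each processed frame:
publish_sd_stats(sbio_channel, people_count=5, violations=1)
msg = sbio_channel.get_nowait()               # consumed by the GUI side
```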

Testing

Functional tests were conducted in an environment that mimics a real-world scenario. TBD

TBD: insert video