ML-TN-005 — Real-time Social Distancing and video streaming on Orca SBC

History

Version | Date          | Notes
--------|---------------|----------------------
1.0.0   | December 2021 | First public release

Introduction

This Technical Note (TN for short) illustrates an interesting use case of the i.MX8M Plus-powered Orca Single Board Computer. In a nutshell, this example demonstrates the following functionalities, which are typical of many video processing applications:

  • Capturing live video streams from MIPI CSI-2 cameras
  • Hardware-accelerated color space conversion (GPU-powered)
  • Hardware-accelerated real-time inference (NPU-powered)
  • Hardware-accelerated video encoding (two streams)
  • Hardware-accelerated video decoding (two streams)
  • Hardware-accelerated GUI rendering (GPU-powered)

Testbed

The testbed consists of two Orca SBCs connected as shown in the following diagram.


[Figure: OrcaSBC-demo-dual-camera2.png]


Orca SBC #1 is also connected to the following Basler camera modules:

  • daA2500-60mc
  • daA3840-30mc

Both SBCs run Yocto Linux based on NXP's 5.4.70 BSP.

Implementation

Camera module interfacing

The camera modules are connected to the MIPI CSI-2 ports of the i.MX8M Plus. Each one uses 4 MIPI data lanes.

The camera modules are configured as follows (some parameters are still TBD); a minimal capture sketch is shown after the list:

  • camera module #1: daA2500-60mc
    • resolution: 640x360
    • frame rate: 15 fps
  • camera module #2: daA3840-30mc
    • resolution: TBD
    • frame rate: TBD
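
As a reference, the following minimal Python/GStreamer sketch shows how a capture pipeline matching the camera module #1 settings above could be set up. The device node (/dev/video0) and the use of a plain v4l2src element are assumptions (the Basler modules may require a vendor-specific capture stack); the sketch is not taken from the actual demo code.

 #!/usr/bin/env python3
 # Capture sketch for camera module #1 (640x360 @ 15 fps).
 # /dev/video0 and the bare v4l2src element are assumptions.
 import gi
 gi.require_version("Gst", "1.0")
 from gi.repository import Gst

 Gst.init(None)
 pipeline = Gst.parse_launch(
     "v4l2src device=/dev/video0 ! "
     "video/x-raw,width=640,height=360,framerate=15/1 ! "
     "fakesink"
 )
 pipeline.set_state(Gst.State.PLAYING)
 # Run until an error or EOS is posted on the pipeline bus.
 bus = pipeline.get_bus()
 bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                        Gst.MessageType.ERROR | Gst.MessageType.EOS)
 pipeline.set_state(Gst.State.NULL)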

Video stream processing

Stream #1

The stream from camera module #1 is the one submitted to the Social Distancing (SD) algorithm, which is described in detail in this document. As such, this stream is fed to an NPU-powered neural network. The stream is also color-space converted to fulfill the input requirements of the network; in order to offload the ARM Cortex-A53 cores, the conversion is performed by the GPU.
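
As a rough illustration of this stage, the sketch below combines a GPU-backed conversion element with an NPU-delegated TensorFlow Lite interpreter. The imxvideoconvert_g2d element comes from NXP's i.MX GStreamer plugins; the delegate library path (/usr/lib/libvx_delegate.so), the model file name (sd_model.tflite), and the RGBA format are assumptions that vary with the BSP release and are not taken from the demo code.

 #!/usr/bin/env python3
 # Sketch: GPU color space conversion + NPU-accelerated TFLite inference.
 # Paths, model name, and caps are assumptions (see the note above).
 import gi
 gi.require_version("Gst", "1.0")
 from gi.repository import Gst
 import numpy as np
 import tflite_runtime.interpreter as tflite

 Gst.init(None)

 # TFLite interpreter with the NPU external delegate (path varies per BSP).
 delegate = tflite.load_delegate("/usr/lib/libvx_delegate.so")
 interpreter = tflite.Interpreter(model_path="sd_model.tflite",
                                  experimental_delegates=[delegate])
 interpreter.allocate_tensors()
 input_details = interpreter.get_input_details()[0]

 # imxvideoconvert_g2d performs the color space conversion on the 2D GPU,
 # keeping the Cortex-A53 cores free.
 pipeline = Gst.parse_launch(
     "v4l2src device=/dev/video0 ! "
     "video/x-raw,width=640,height=360,framerate=15/1 ! "
     "imxvideoconvert_g2d ! video/x-raw,format=RGBA ! "
     "appsink name=sink emit-signals=true max-buffers=1 drop=true"
 )

 def on_new_sample(sink):
     sample = sink.emit("pull-sample")
     buf = sample.get_buffer()
     ok, info = buf.map(Gst.MapFlags.READ)
     if ok:
         frame = np.frombuffer(info.data, dtype=np.uint8)
         # Resizing/reshaping 'frame' to input_details["shape"], then
         # interpreter.set_tensor(...) / interpreter.invoke(), is omitted.
         buf.unmap(info)
     return Gst.FlowReturn.OK

 pipeline.get_by_name("sink").connect("new-sample", on_new_sample)
 pipeline.set_state(Gst.State.PLAYING)
 bus = pipeline.get_bus()
 bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                        Gst.MessageType.ERROR | Gst.MessageType.EOS)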

The output stream of the SD algorithm is hardware-encoded before being streamed over a Gigabit Ethernet connection. On the Orca SBC #2 side, the stream is hardware-decoded and visualized on an HDMI monitor.
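
The transport leg could look like the following sketch, which assumes RTP over UDP, NXP's VPU elements (vpuenc_h264, vpudec), and placeholder addresses/ports; the actual demo may use a different codec or protocol. The two pipelines run on different boards.

 #!/usr/bin/env python3
 # Sketch of the stream #1 transport leg (RTP/UDP and element names assumed).
 import gi
 gi.require_version("Gst", "1.0")
 from gi.repository import Gst

 Gst.init(None)

 # Orca SBC #1: frames produced by the SD application are pushed into the
 # 'src' appsrc (not shown), VPU-encoded, and sent to SBC #2.
 sender = Gst.parse_launch(
     'appsrc name=src is-live=true format=time '
     'caps="video/x-raw,format=NV12,width=640,height=360,framerate=15/1" ! '
     "vpuenc_h264 ! rtph264pay ! udpsink host=192.168.0.2 port=5000"
 )

 # Orca SBC #2: VPU decoding and display on the HDMI monitor.
 receiver = Gst.parse_launch(
     'udpsrc port=5000 caps="application/x-rtp,media=video,'
     'encoding-name=H264,payload=96" ! '
     "rtph264depay ! h264parse ! vpudec ! waylandsink"
 )

 # On each board, the relevant pipeline would be started with
 # sender.set_state(Gst.State.PLAYING) or receiver.set_state(Gst.State.PLAYING).

Splitting the work this way keeps encoding and decoding on the respective VPUs, leaving the CPU cores mostly idle.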

Stream #2

Stream #2 originates at the daA3840-30mc camera module. It is then hardware-encoded and streamed by Orca SBC #1. Finally, it is received by Orca SBC #2, which hardware-decodes and displays it.
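
Since no inference stage is involved here, the corresponding sketch is shorter: camera buffers go straight to the VPU encoder. Again, the device node and the RTP/UDP transport are assumptions, not the demo's actual setup.

 #!/usr/bin/env python3
 # Sketch of the stream #2 sender on Orca SBC #1 (device node and
 # transport are assumptions).
 import gi
 gi.require_version("Gst", "1.0")
 from gi.repository import Gst

 Gst.init(None)

 # Capture from the daA3840-30mc, VPU-encode, and stream to SBC #2.
 sender = Gst.parse_launch(
     "v4l2src device=/dev/video1 ! "
     "vpuenc_h264 ! rtph264pay ! udpsink host=192.168.0.2 port=5001"
 )
 sender.set_state(Gst.State.PLAYING)
 bus = sender.get_bus()
 bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                        Gst.MessageType.ERROR | Gst.MessageType.EOS)

The receiver pipeline on Orca SBC #2 is analogous to the stream #1 one, listening on a different UDP port.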

GUI

A GUI application also runs on Orca SBC #1. This application is engineered with Crank Storyboard 6 and shows some parameters of the Social Distancing algorithm while it operates. The GUI application communicates with the SD application, which is written in Python, through the Storyboard IO API. According to the official documentation, Storyboard IO (formerly known as GREIO) "provides a platform independent communication API that allows inter-task and inter-process queued message passing"; it is primarily used to allow external communication with a Storyboard application.
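
Storyboard IO ships as a C library (libgreio), which a Python application can reach through ctypes. The sketch below is a minimal, hypothetical example of pushing one SD parameter to the GUI: the channel name, event name, payload layout, and the GRE_IO_TYPE_WRONLY value are assumptions, and the gre_io_* signatures should be verified against the greio.h header shipped with Storyboard.

 #!/usr/bin/env python3
 # Hypothetical sketch: sending an SD parameter to the Storyboard GUI
 # through Storyboard IO (libgreio) via ctypes. Verify names and
 # signatures against greio.h before use.
 import ctypes

 greio = ctypes.CDLL("libgreio.so")
 greio.gre_io_open.restype = ctypes.c_void_p
 greio.gre_io_serialize.restype = ctypes.c_void_p

 GRE_IO_TYPE_WRONLY = 1  # assumed value; take the real one from greio.h

 # Open the channel the Storyboard application listens on (hypothetical name).
 handle = greio.gre_io_open(b"sd_demo_channel", GRE_IO_TYPE_WRONLY)

 # Serialize one 4-byte unsigned field and send it as a custom event.
 violations = ctypes.c_uint32(3)
 buf = greio.gre_io_serialize(None, None, b"sd.update", b"4u1 violations",
                              ctypes.byref(violations),
                              ctypes.sizeof(violations))
 greio.gre_io_send(ctypes.c_void_p(handle), ctypes.c_void_p(buf))
 greio.gre_io_close(ctypes.c_void_p(handle))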

Testing

TBD