
DAVE Developer's Wiki

MISC-TN-011: Running an Azure-generated TensorFlow Lite model on Mito8M SoM using NXP eIQ

Revision as of 09:23, 25 March 2020 by U0001

Info Box
Applies to MITO 8M
Applies to Machine Learning
This technical note was validated against specific versions of hardware and software. What is described here may not work with other versions.


History

Version | Date       | Notes
1.0.0   | March 2020 | First public release

Introduction

In Technical Note SBCX-TN-005 (TN for short), a simple image classifier was implemented on the Axel Lite SoM.

TN MISC-TN-010 illustrates how to run the NXP eIQ Machine Learning software on the i.MX8M-powered Mito8M SoM.

This article combines the results of the two TNs just mentioned. In other words, it describes how to run the same image classifier used in SBCX-TN-005 on top of the eIQ software stack. The outcome is an optimized image-classification application, written in C++, that runs on the Mito8M SoM and makes use of the eIQ software stack.

Workflow and resulting block diagram

The following picture shows the block diagram of the resulting application and part of the workflow used to build it.

First of all, the TensorFlow (TF) model generated with Microsoft Azure Custom Vision was converted into the TensorFlow Lite (TFL) format.
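The conversion step can be sketched with the standard TensorFlow Lite converter API. The toy Keras model below is only a stand-in, since the actual model exported by Azure Custom Vision is not reproduced here; in the real workflow the exported model would be loaded with, for example, tf.lite.TFLiteConverter.from_saved_model().

```python
import tensorflow as tf

# Stand-in for the Azure Custom Vision export: a tiny classifier
# with an image-like input and a two-class softmax output.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.layers.GlobalAveragePooling2D()(inputs)
outputs = tf.keras.layers.Dense(2, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

# Convert the TF model to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Save the .tflite file to be deployed on the edge device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting .tflite file is what the C++ application described below loads on the target.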

Then, a new C++ application was written, using the examples provided by TFL as starting points. After debugging this application on a host PC, it was migrated to the edge device (a Mito8M-powered platform, in our case), where it was natively built. The eIQ root file system, in fact, also provides a native C++ compiler.
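The inference sequence the C++ application performs (load model, allocate tensors, set input, invoke, read output) is the same one exposed by the TFLite interpreter in any language binding. The following Python sketch illustrates those steps; the in-memory toy model is an assumption used only to keep the example self-contained, standing in for the converted Custom Vision model.

```python
import numpy as np
import tensorflow as tf

# Stand-in model (assumption): in the real application the
# interpreter would load the converted Custom Vision .tflite file.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.layers.GlobalAveragePooling2D()(inputs)
outputs = tf.keras.layers.Dense(2, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# 1. Build the interpreter and allocate tensor buffers.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# 2. Feed an input image (here an all-zero placeholder frame).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)

# 3. Run inference and read back the class scores.
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])
```

In the C++ version the equivalent calls are made through tflite::FlatBufferModel and tflite::Interpreter, following the label_image example shipped with TensorFlow Lite.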

Running the application