ML-TN-001 - AI at the edge: comparison of different embedded platforms - Part 5

{{InfoBoxTop}}
{{AppliesToMachineLearning}}
{{InfoBoxBottom}}
__FORCETOC__

==History==
{| class="wikitable" border="1"
!Version
!Date
!Notes
|-
|1.0.0
|November 2020
|First public release
|}
 
==Introduction==
 
This Technical Note (TN for short) belongs to the series introduced [[ML-TN-001_-_AI_at_the_edge:_comparison_of_different_embedded_platforms_-_Part_1|here]].
 
This article compares the performance of a Machine Learning-based classification application when it is accelerated by two different Neural Processing Units (NPUs): the NXP i.MX8M Plus NPU and the [https://coral.ai/products/accelerator/ Google Coral Edge TPU].
 
Originally, the idea was to use the classifier described in [[ML-TN-001_-_AI_at_the_edge:_comparison_of_different_embedded_platforms_-_Part_1#Reference_application_.231:_fruit_classifier|this section]], which was already tested with the i.MX8M Plus NPU as described [[ML-TN-001_-_AI_at_the_edge:_comparison_of_different_embedded_platforms_-_Part_4|in this TN]]. However, this idea had to be discarded because of the unexpected difficulties detailed in the following sections.
 
==Test bed==
 
As stated previously, it was not possible to use the fruit classifier application for testing: the Edge TPU compiler was unable to handle the model because of its flatten layers, which at the time of writing [https://coral.ai/docs/edgetpu/models-intro/#supported-operations were not listed among the supported operations].
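
For reference, the sketch below outlines the usual workflow for preparing a model for the Edge TPU: full-integer post-training quantization with the TensorFlow Lite converter, followed by the <code>edgetpu_compiler</code> step. The file names, input resolution, and calibration data are illustrative assumptions only and are not taken from the actual fruit classifier.

<pre>
import numpy as np
import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed input resolution; not taken from the real model


def representative_dataset():
    # Yield a handful of calibration samples so the converter can estimate
    # activation ranges. Real images should be used here instead of random data.
    for _ in range(100):
        yield [np.random.rand(1, *IMG_SIZE, 3).astype(np.float32)]


# Hypothetical file name: the trained Keras classifier to be deployed.
model = tf.keras.models.load_model("fruit_classifier.h5")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full-integer quantization: the Edge TPU only executes int8 operations.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("fruit_classifier_int8.tflite", "wb") as f:
    f.write(converter.convert())

# The quantized model is then passed to the Edge TPU compiler:
#
#     edgetpu_compiler fruit_classifier_int8.tflite
#
# Any operation the compiler cannot map to the Edge TPU (such as the flatten
# layers of the fruit classifier, at the time of writing) prevents the model
# from being fully accelerated.
</pre>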
 