ML-TN-001 - AI at the edge: comparison of different embedded platforms - Part 3

Info Box
Applies to: MITO 8M
Work in progress


History

Version | Date           | Notes
1.0.0   | September 2020 | First public release

Introduction

This Technical Note (TN for short) details the execution of the same inference application described here on the Xilinx Zynq UltraScale+ MPSoC ZCU104 Evaluation Kit. The results achieved are also compared with those produced by the platforms considered in the previous articles of this series.
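The body of the TN details how the application is actually deployed on the board; purely as a rough illustration of what launching an inference on the ZCU104 typically looks like, the sketch below uses the Vitis AI runtime (VART) Python API to execute a compiled .xmodel on the device's DPU. The model path, input shape, and data types are placeholders, and the use of VART here is an assumption for illustration, not necessarily the exact flow adopted in this series.

# Hypothetical sketch: running a compiled model on the ZCU104's DPU with the
# Vitis AI runtime (VART) Python API. Paths, shapes, and dtypes are placeholders.
import numpy as np
import vart
import xir

def run_once(xmodel_path, image):
    # Load the compiled model and pick the subgraph mapped to the DPU
    graph = xir.Graph.deserialize(xmodel_path)
    subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
    dpu_subgraph = next(s for s in subgraphs
                        if s.has_attr("device") and s.get_attr("device").upper() == "DPU")

    runner = vart.Runner.create_runner(dpu_subgraph, "run")
    in_dims = tuple(runner.get_input_tensors()[0].dims)    # e.g. (1, 224, 224, 3)
    out_dims = tuple(runner.get_output_tensors()[0].dims)

    # Input/output buffers; the dtype must match what the model was compiled for
    input_buf = np.asarray(image, dtype=np.float32).reshape(in_dims)
    output_buf = np.empty(out_dims, dtype=np.float32)

    # Submit the job to the DPU and wait for completion
    job_id = runner.execute_async([input_buf], [output_buf])
    runner.wait(job_id)
    return output_buf

if __name__ == "__main__":
    # Dummy input just to exercise the pipeline
    result = run_once("model.xmodel", np.zeros((1, 224, 224, 3)))
    print("Predicted class:", int(np.argmax(result)))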