Target platform 'DLXCKU5PE' is not supported for quantization.

KH on 26 Sep 2025
Commented: KH on 6 Nov 2025 at 8:28
Hi,
DLXCKU5PE is my self-generated deep learning processor bitstream with an int8 data type. When I try to validate the quantized deep learning network on my FPGA platform, the error above occurs.
How can I solve this problem? Or is this function only supported on official evaluation boards?
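For context, a minimal sketch of the int8 validation flow that produces this error; the network, datastores, and interface settings below are placeholders:
% Quantize the network for FPGA execution (net, calDS, valDS are placeholder variables)
quantObj = dlquantizer(net, 'ExecutionEnvironment', 'FPGA');
calibrate(quantObj, calDS);
% Point validation at the custom bitstream -- this is where the
% "not supported for quantization" error is raised
hTarget = dlhdl.Target('Xilinx', 'Interface', 'Ethernet');
options = dlquantizationOptions('Bitstream', 'DLXCKU5PE', 'Target', hTarget);
results = validate(quantObj, valDS, options);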

Accepted Answer

Anjaneyulu Bairi on 16 Oct 2025 at 4:34
Hi,
This error usually arises in Deep Learning HDL Toolbox when:
  • The target FPGA platform you selected (DLXCKU5PE) is not officially supported by the quantization workflow in MATLAB/HDL Coder/Deep Learning HDL Toolbox.
  • The platform is either custom or not included in the list of supported boards for quantized deployment.
The following steps might help:
1. Check Supported Platforms
  • Verify that your board and bitstream appear in the list that the Deep Learning HDL Toolbox documentation names as supported for quantized (int8) deployment.
2. Custom Board Registration
  • For custom boards, you may need to create a custom platform registration using the dlhdl.Target and dlhdl.Board classes, but quantization support may still be limited.
3. Try Float Deployment
  • If quantized (int8) deployment is not supported, you may be able to deploy your network using single (floating-point) precision instead; see the sketch after this list.
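A minimal sketch of the single-precision fallback, assuming a trained network in net, a sample input in inputImg, and an Ethernet connection to the board; the bitstream file name and the board-specific settings are placeholders that depend on your custom board registration:
% Rebuild the custom deep learning processor in single precision instead of int8
hPC = dlhdl.ProcessorConfig;           % start from the default processor configuration
hPC.ProcessorDataType = 'single';      % floating point avoids the quantization path
% ... apply the custom board / reference design settings from the board
%     registration guide here (board-specific, omitted) ...
dlhdl.buildProcessor(hPC);             % regenerates the bitstream with HDL Coder/Vivado

% Deploy and run the unquantized network on the regenerated bitstream
% ('dlprocessor.bit' is the assumed name of the generated bitstream file)
hTarget = dlhdl.Target('Xilinx', 'Interface', 'Ethernet');
hW = dlhdl.Workflow('Network', net, 'Bitstream', 'dlprocessor.bit', 'Target', hTarget);
hW.compile;
hW.deploy;
prediction = hW.predict(inputImg, 'Profile', 'on');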
I hope this helps!
  1 Comment
KH on 6 Nov 2025 at 8:28
Thanks,
I am glad to receive your reply.
I have solved this problem by following the guide at
MATLAB can now work with my platform.
However, the accuracy drops by roughly 4%. I’m currently looking into ways to recover this loss.


More Answers (0)
