Design Pixel-Streaming Algorithms for Hardware Targeting

Workflow Overview

Hardware-targeted video processing designs use a pixel-streaming video format. Serial processing is efficient for hardware designs because it requires less memory to store pixel data during computation. Vision HDL Toolbox™ blocks use a streaming pixel interface with an accompanying control signal bus. The control signals indicate the relative location of each pixel within the image or video frame. The interface mimics the timing of a video system, including inactive intervals between frames. This protocol allows each block to operate independently of image size and format and makes the design more resilient to video timing errors.
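To make the streaming protocol concrete, here is a minimal Python sketch (not Toolbox code) of a serializer that emits one pixel per cycle together with a control structure whose fields mirror those of the Vision HDL Toolbox pixelcontrol bus (hStart, hEnd, vStart, vEnd, valid). The `h_blank` parameter and the helper names are illustrative assumptions; real video timing also includes vertical blanking and porch intervals.

```python
from dataclasses import dataclass

@dataclass
class PixelCtrl:
    # Fields mirror the Vision HDL Toolbox pixelcontrol bus.
    h_start: bool  # first valid pixel in a line
    h_end: bool    # last valid pixel in a line
    v_start: bool  # first valid pixel in a frame
    v_end: bool    # last valid pixel in a frame
    valid: bool    # this cycle carries active pixel data

def stream_frame(frame, h_blank=4):
    """Serialize a 2-D frame into (pixel, ctrl) pairs, inserting
    h_blank invalid cycles after each line to mimic the inactive
    intervals of a real video system."""
    rows, cols = len(frame), len(frame[0])
    for r in range(rows):
        for c in range(cols):
            yield frame[r][c], PixelCtrl(
                h_start=(c == 0),
                h_end=(c == cols - 1),
                v_start=(r == 0 and c == 0),
                v_end=(r == rows - 1 and c == cols - 1),
                valid=True,
            )
        # Inactive interval between lines: data is ignored downstream.
        for _ in range(h_blank):
            yield 0, PixelCtrl(False, False, False, False, False)
```

Because each downstream block reads only the control signals, it needs no knowledge of the frame dimensions, which is why the same hardware operates on any supported resolution.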

The model templates for pixel-streaming algorithms help you get started. The templates convert frames to pixel streams and pixel streams back to frames. To design an algorithm for FPGA deployment, add Vision HDL Toolbox blocks to the template. To open a template, see Create Model Using Simulink Templates.

Each template has five shaded areas:

  • Source — Contains the video data sources for your model. The Video Capture block provides either live camera data or a test pattern from the hardware device. Alternatively, you can select a data file input.

  • Conversion (1) — Converts frame-based video data into a pixel stream. The Image Frame To Pixels block maps the resolutions supported by the Video Capture block to custom timing parameters on the Frame To Pixels block (from Vision HDL Toolbox). The model also flattens the color components as required by the FPGA-targeting tools. The targeting tools recognize the control bus and flatten its signals in the generated HDL code.

  • Algorithm — Contains the subsystem where you design your algorithms. This subsystem can include ports for physical board signals. It can also include ports connected to AXI-Lite registers, which you can control from the ARM® processor. The algorithm can optionally connect to external memory, such as a frame buffer; see Model External Memory Interfaces.

  • Conversion (2) — Converts the pixel stream back to frames and reconstructs the color components.

  • Display — Contains the display and evaluation of the results of the video processing algorithm.
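The two conversion regions are inverse operations: the first serializes and flattens a frame, and the second reconstructs it. A minimal Python sketch of that round trip, with hypothetical helper names standing in for the Frame To Pixels and Pixels To Frame blocks:

```python
def frame_to_pixels(frame):
    """Flatten a rows x cols frame of [R, G, B] pixels into a 1-D
    stream of color-component tuples, as the FPGA-targeting tools
    expect flattened signals."""
    return [tuple(px) for row in frame for px in row]

def pixels_to_frame(stream, rows, cols):
    """Rebuild the 2-D frame from the stream. The hardware blocks
    derive the frame boundaries from the control signals; here the
    dimensions are passed explicitly for simplicity."""
    assert len(stream) == rows * cols
    return [[list(stream[r * cols + c]) for c in range(cols)]
            for r in range(rows)]
```

The round-trip invariant, `pixels_to_frame(frame_to_pixels(f), rows, cols) == f`, is what lets you verify the Algorithm subsystem against a frame-based reference in Simulink before targeting hardware.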


The reference design on the Zynq® device requires the same video resolution and color format for the entire data path. The resolution you select on the Video Capture block must match that of your camera input. The design you target to the user logic section of the FPGA must not modify the frame size or color space of the video stream.

When you are satisfied with the results of the video processing in Simulink®, run HDL Workflow Advisor on the Pixel-Streaming Algorithm subsystem to generate code and load the design to the FPGA. See Target an FPGA on Zynq Hardware.
