
visionhdl.FrameToPixels

Convert frame-based video to pixel stream

Description

The visionhdl.FrameToPixels System object™ converts color or grayscale frame-based video to a pixel stream and control structure. The control structure indicates the validity of each pixel and its location in the frame. The pixel stream format can include padding pixels around the active frame. You can configure the frame and padding dimensions by selecting a common video format or by specifying custom dimensions. For details about the pixel stream format, see Streaming Pixel Interface.

Use this object to generate input for a function targeted for HDL code generation. This object itself does not support HDL code generation.

If your design converts frames to a pixel stream and later converts the stream back to frames, specify the same video format for the FrameToPixels object and the PixelsToFrame object.

To convert frame-based video to a pixel stream:

  1. Create the visionhdl.FrameToPixels object and set its properties.

  2. Call the object with arguments, as if it were a function.

To learn more about how System objects work, see What Are System Objects?

Creation

Description

F2P = visionhdl.FrameToPixels(Name,Value) returns a System object that serializes input frames into a pixel stream. Set properties using one or more name-value pairs. Enclose each property name in single quotes.

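For example, this call creates a serializer for {R,G,B} input frames using the predefined 1080p format. The property values here are only an illustrative sketch; any format from the VideoFormat table works the same way.

% Serializer for three-component (RGB) video with 1080p frame timing
F2P = visionhdl.FrameToPixels( ...
      'NumComponents',3, ...
      'VideoFormat','1080p');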

Properties


Unless otherwise indicated, properties are nontunable, which means you cannot change their values after calling the object. Objects lock when you call them, and the release function unlocks them.

If a property is tunable, you can change its value at any time.

For more information on changing property values, see System Design in MATLAB Using System Objects.

NumComponents — Number of values used to represent each pixel, specified as 1, 3, or 4.

  • For grayscale video, set this property to 1.

  • For color video, for example, {R,G,B} or {Y,Cb,Cr}, set this property to 3.

  • For color video with an alpha channel for transparency, set this property to 4.

The visionhdl.FrameToPixels object returns a P-by-NumComponents matrix, where P is the total number of pixels in the padded frame.

Dependencies

When NumComponents is greater than 1, you must set the NumPixels property to 1.

NumPixels — Number of pixels transferred on the streaming interface for each cycle, specified as 1, 2, 4, or 8. To enable multipixel streaming and increase throughput for high-resolution or high-frame-rate video, set this property to 2, 4, or 8. The visionhdl.FrameToPixels object returns a P-by-NumPixels matrix, where P is the total number of pixels in the padded frame.

Note

You can simulate System objects with a multipixel streaming interface, but you cannot generate HDL code for System objects that use multipixel streams. To generate HDL code for multipixel algorithms, use the equivalent Simulink® blocks.

Dependencies

When NumPixels is greater than 1, you must set the NumComponents property to 1.

VideoFormat — Dimensions of the active and inactive regions of a video frame. To select a predefined format, specify the VideoFormat property as one of the options in the first column of the table. For a custom format, set VideoFormat to 'Custom', and specify the dimension properties as integers. The frame dimensions are indicated in the diagram.

Dimensions of the active and inactive regions of the video frame, labeled with their corresponding property names

| Video Format | Active Pixels Per Line | Active Video Lines | Total Pixels Per Line | Total Video Lines | Starting Active Line | Ending Active Line | Front Porch | Back Porch |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 240p | 320 | 240 | 402 | 324 | 1 | 240 | 44 | 38 |
| 480p | 640 | 480 | 800 | 525 | 36 | 515 | 16 | 144 |
| 480pH | 720 | 480 | 858 | 525 | 33 | 512 | 16 | 122 |
| 576p | 720 | 576 | 864 | 625 | 47 | 622 | 12 | 132 |
| 720p | 1280 | 720 | 1650 | 750 | 25 | 744 | 110 | 260 |
| 768p | 1024 | 768 | 1344 | 806 | 10 | 777 | 24 | 296 |
| 1024p | 1280 | 1024 | 1688 | 1066 | 42 | 1065 | 48 | 360 |
| 1080p (default) | 1920 | 1080 | 2200 | 1125 | 42 | 1121 | 88 | 192 |
| 1200p | 1600 | 1200 | 2160 | 1250 | 50 | 1249 | 64 | 496 |
| 2KCinema | 2048 | 1080 | 2750 | 1125 | 42 | 1121 | 639 | 63 |
| 4KUHDTV | 3840 | 2160 | 4400 | 2250 | 42 | 2201 | 88 | 472 |
| 8KUHDTV | 7680 | 4320 | 8800 | 4500 | 42 | 4361 | 88 | 1032 |
| Custom | User-defined | User-defined | User-defined | User-defined | User-defined | User-defined | User-defined | User-defined |

Note

When using a custom format, the properties you enter for the active and inactive dimensions of the image must add up to the total frame dimensions.

For the horizontal direction, TotalPixelsPerLine must be greater than or equal to FrontPorch + ActivePixelsPerLine. The object calculates BackPorch = TotalPixelsPerLine − FrontPorch − ActivePixelsPerLine.

For the vertical direction, TotalVideoLines must be greater than or equal to StartingActiveLine + ActiveVideoLines − 1. The object calculates EndingActiveLine = StartingActiveLine + ActiveVideoLines − 1.

If you specify a format that does not conform to these rules, the object reports an error.
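As a check, applying these formulas to the 1080p row of the table reproduces the BackPorch and EndingActiveLine values listed there:

% 1080p: TotalPixelsPerLine = 2200, FrontPorch = 88, ActivePixelsPerLine = 1920
BackPorch = 2200 - 88 - 1920          % = 192, matches the table
% 1080p: StartingActiveLine = 42, ActiveVideoLines = 1080
EndingActiveLine = 42 + 1080 - 1      % = 1121, matches the table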

Note

When using a custom format, ActivePixelsPerLine must be greater than 1. Also, set the horizontal blanking interval, or BackPorch + FrontPorch, according to these guidelines.

  • The total of BackPorch + FrontPorch must be at least 2 times the largest kernel size of the algorithm in the objects following the visionhdl.FrameToPixels object. If the kernel size is less than 4, the total porch must be at least 8 pixels.

  • The BackPorch property value must be at least 6 pixels. This property is the number of inactive pixels before the first valid pixel in a frame.

For more information on blanking intervals, see Configure Blanking Intervals.

Note

When using multipixel streaming (NumPixels > 1), these requirements apply.

  • The video format must have horizontal dimensions divisible by the NumPixels property value. The horizontal dimensions are set by these properties: ActivePixelsPerLine, TotalPixelsPerLine, FrontPorch, and BackPorch. Standard video protocols 480p, 768p, 1024p, 1080p, 1200p, 4k UHD, and 8k UHD support NumPixels equal to 4 or 8.

  • The minimum input frame size for multipixel streaming is 18 rows by 32 columns.

  • Choose your kernel size and ActivePixelsPerLine such that ActivePixelsPerLine/NumPixels is at least the kernel width.
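For example, this sketch configures a serializer that transfers four pixels per cycle of 1080p grayscale video. The 1080p dimensions are divisible by 4, and NumComponents must be 1 when NumPixels is greater than 1. This configuration is for simulation only.

% Multipixel serializer: 4 pixels per cycle, grayscale, 1080p timing
F2P = visionhdl.FrameToPixels( ...
      'NumComponents',1, ...
      'NumPixels',4, ...
      'VideoFormat','1080p');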

Usage

Description

[pixels,ctrlOut] = F2P(frm) converts the input image matrix, frm, to a vector of pixel values, pixels, and an associated vector of control structures, ctrlOut. The control structure indicates the validity of each pixel and its location in the frame. The output pixels include padding around the active image, specified by the VideoFormat property.

For details about the pixel stream format, see Streaming Pixel Interface.
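As a minimal sketch, assuming a grayscale uint8 frame sized for the predefined 1080p format, the call returns one pixel value and one control structure for every location in the padded frame:

F2P = visionhdl.FrameToPixels('NumComponents',1,'VideoFormat','1080p');
frm = zeros(1080,1920,'uint8');   % ActiveVideoLines-by-ActivePixelsPerLine
[pixels,ctrlOut] = F2P(frm);
size(pixels)                      % 2475000-by-1, since 2200*1125 = 2475000
size(ctrlOut)                     % 2475000-by-1 vector of control structures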


Input Arguments


frm — Input image, specified as an ActiveVideoLines-by-ActivePixelsPerLine-by-NumComponents matrix, where:

  • ActiveVideoLines is the height of the active image.

  • ActivePixelsPerLine is the width of the active image.

  • NumComponents is the number of components used to express a single pixel.

Set the size of the active image using the VideoFormat property. If the dimensions of frm do not match the dimensions specified by VideoFormat, the object returns a warning.

Data Types: uint | int | logical | fi | double | single

Output Arguments


pixels — Pixel values, returned as a P-by-NumComponents matrix or P-by-NumPixels matrix, where:

  • P is the total number of pixels in the padded image, which is TotalPixelsPerLine × TotalVideoLines.

  • NumComponents is the number of components used to express a single pixel.

  • NumPixels is the number of pixels transferred on the streaming interface per cycle. When NumPixels is greater than 1, you must set NumComponents to 1.

    Note

    You can simulate System objects with a multipixel streaming interface, but you cannot generate HDL code for System objects that use multipixel streams. To generate HDL code for multipixel algorithms, use the equivalent Simulink blocks.

Set the size of the padded image using the VideoFormat property. The data type of the pixel values is the same as the data type of the input frame, frm.

ctrlOut — Control structures associated with the output pixels, returned as a P-by-1 vector. P is the total number of pixels in the padded image, which is TotalPixelsPerLine × TotalVideoLines. Each structure contains five control signals indicating the validity of the pixel and its location in the frame. For multipixel streaming, the control signals apply to each set of NumPixels values. See Pixel Control Structure.
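For example, continuing the sketch from the Usage section, counting the structures whose valid field is set recovers the number of active pixels in the frame (the remaining fields mark line and frame boundaries; see Pixel Control Structure for their names):

% Number of valid pixels equals ActivePixelsPerLine*ActiveVideoLines
numValid = nnz([ctrlOut.valid])   % 2073600 for 1080p, since 1920*1080 = 2073600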

Object Functions

To use an object function, specify the System object as the first input argument. For example, to release system resources of a System object named obj, use this syntax:

release(obj)


step — Run System object algorithm
release — Release resources and allow changes to System object property values and input characteristics
reset — Reset internal states of System object

Examples


This example converts a custom-size grayscale image to a pixel stream. It uses the visionhdl.LookupTable object to obtain the negative image. Then it converts the pixel stream back to a full-frame image.

Load the source image from a file. Select a portion of the image matching the desired test size.

frmOrig = imread('rice.png');
frmActivePixels = 64;
frmActiveLines = 48;
frmInput = frmOrig(1:frmActiveLines,1:frmActivePixels);
figure
imshow(frmInput,'InitialMagnification',300)
title 'Input Image'

The figure displays the input image.

Create a serializer object and specify the size of the inactive pixel regions.

frm2pix = visionhdl.FrameToPixels( ...
      'NumComponents',1, ...
      'VideoFormat','custom', ...
      'ActivePixelsPerLine',frmActivePixels, ...
      'ActiveVideoLines',frmActiveLines, ...
      'TotalPixelsPerLine',frmActivePixels+10, ...
      'TotalVideoLines',frmActiveLines+10, ...
      'StartingActiveLine',6, ...     
      'FrontPorch',5);

Create a lookup table (LUT) object to generate the negative of the input image.

tabledata = linspace(255,0,256);
inverter = visionhdl.LookupTable(tabledata);

Serialize the test image by calling the serializer object. pixIn is a vector of intensity values. ctrlIn is a vector of control signal structures.

[pixIn,ctrlIn] = frm2pix(frmInput);

Prepare to process pixels by preallocating output vectors.

[~,~,numPixelsPerFrame] = getparamfromfrm2pix(frm2pix);
pixOut = zeros(numPixelsPerFrame,1,'uint8');
ctrlOut = repmat(pixelcontrolstruct,numPixelsPerFrame,1);

For each pixel in the stream, look up the negative of the pixel value.

for p = 1:numPixelsPerFrame  
    [pixOut(p),ctrlOut(p)] = inverter(pixIn(p),ctrlIn(p));
end

Create a deserializer object with a format that matches the format of the serializer. Convert the pixel stream to an image frame by calling the deserializer object. Display the resulting image.

pix2frm = visionhdl.PixelsToFrame( ...
      'NumComponents',1, ...
      'VideoFormat','custom', ...
      'ActivePixelsPerLine',frmActivePixels, ...
      'ActiveVideoLines',frmActiveLines, ...
      'TotalPixelsPerLine',frmActivePixels+10);
[frmOutput,frmValid] = pix2frm(pixOut,ctrlOut);
if frmValid
    figure
    imshow(frmOutput,'InitialMagnification',300)
    title 'Output Image'
end

The figure displays the output image, the negative of the input.

Version History

Introduced in R2015a
