In last week’s blog, we set up the processing system (PS) and the programmable logic (PL) to be able to output live video using the DisplayPort Controller.
In this blog, we are going to create the software necessary to output a PL-generated test pattern on the screen.
The first thing we need to do once the Vivado hardware is available is export the hardware to SDK and open SDK. This will pull in the hardware description and allow us to create a new application and BSP.
Once the BSP is generated, we need to ensure it is correctly configured for the live video application. We do this by opening the MSS file and changing the driver which is used by the DisplayPort Controller.
We need to do this because there are four possible configurations of the DisplayPort Controller and, depending upon which configuration is used, a different driver is needed.
- Memory to Data Path — For this case, use dppsu API.
- PL to Data Path — For this case, use dppsu API.
- PL to PL — For this case, use avbuf API.
- Memory to PL — For this case, use avbuf API.
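For reference, the resulting driver assignment in the MSS file looks something like the fragment below. The driver version shown is illustrative and will depend on your tool release; the instance name `psu_dp` is the PS DisplayPort Controller.

```
BEGIN DRIVER
 PARAMETER DRIVER_NAME = dppsu
 PARAMETER DRIVER_VER = 1.2
 PARAMETER HW_INSTANCE = psu_dp
END
```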
We can select the desired API by re-customizing the BSP settings and selecting the necessary driver. Note that when we use the DPPSU, the BSP will still contain the AVBUF API, as it is required too.
While the AVBUF API defines the configuration and use of the audio / visual pipeline, the DPPSU API defines the configuration of the DisplayPort transmitter. Therefore, when we want to transmit video external to the MPSoC, we need to use it as well as the AVBUF drivers.
Once this is completed, we are free to generate our application code. In this code, we are going to do the following.
- Configure the Test Pattern Generator to create the desired test pattern.
- Configure the Video Timing Controller to generate the desired video timing.
- Set up the Interrupt Controller.
- Define the DisplayPort Controller settings using the DPPSU and AVBUF API. This includes video mode, pixel encoding, lane count and lane rate.
- Configure the live video input and PL clocking options.
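As a rough sketch (not the project's actual source, which is on GitHub), the steps above map onto the standard Xilinx drivers along these lines. The device IDs, the 1080p resolution, and the `XAVBuf_CfgInitialize` arguments are assumptions, and the interrupt and DPPSU link setup are omitted for brevity.

```c
#include "xv_tpg.h"       /* Test Pattern Generator (HLS-generated driver) */
#include "xvtc.h"         /* Video Timing Controller */
#include "xavbuf.h"       /* DisplayPort audio/video buffer manager */
#include "xparameters.h"

/* Sketch only: device IDs and video mode are assumptions,
   and error checking is omitted. */
static XV_tpg Tpg;
static XVtc   Vtc;
static XAVBuf AVBuf;

void configure_pipeline(void)
{
    /* Test Pattern Generator: 1080p checkerboard */
    XV_tpg_Initialize(&Tpg, XPAR_XV_TPG_0_DEVICE_ID);
    XV_tpg_Set_height(&Tpg, 1080);
    XV_tpg_Set_width(&Tpg, 1920);
    XV_tpg_Set_bckgndId(&Tpg, XTPG_BKGND_CHECKER_BOARD);
    XV_tpg_EnableAutoRestart(&Tpg);
    XV_tpg_Start(&Tpg);

    /* Video Timing Controller: generate the output timing */
    XVtc_Config *VtcCfg = XVtc_LookupConfig(XPAR_VTC_0_DEVICE_ID);
    XVtc_CfgInitialize(&Vtc, VtcCfg, VtcCfg->BaseAddress);
    XVtc_EnableGenerator(&Vtc);

    /* DisplayPort Controller: select the PL live stream and clocks */
    XAVBuf_CfgInitialize(&AVBuf, XPAR_PSU_DP_BASEADDR, 0);
    XAVBuf_SetInputLiveVideoFormat(&AVBuf, RGB_8BPC);
    XAVBuf_InputVideoSelect(&AVBuf, XAVBUF_VIDSTREAM1_LIVE,
                            XAVBUF_VIDSTREAM2_NONE);
    XAVBuf_SetAudioVideoClkSrc(&AVBuf, XAVBUF_PL_CLK, XAVBUF_PS_CLK);
}
```

The ordering matters: the TPG and VTC should be producing a valid stream before the DisplayPort Controller is switched over to the live input, otherwise the transmitter has nothing sensible to lock to.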
The complete code is available on my GitHub; however, we can use functions such as:
- XAVBuf_SetInputLiveVideoFormat(AVBufPtr, RGB_8BPC);
- XAVBuf_InputVideoSelect(AVBufPtr, XAVBUF_VIDSTREAM1_LIVE, XAVBUF_VIDSTREAM2_NONE);
- XAVBuf_SetAudioVideoClkSrc(AVBufPtr, XAVBUF_PL_CLK, XAVBUF_PS_CLK);
All of the necessary functions and type definitions are contained within xavbuf.h and xdppsu.h.
Once the code is completed, we can download the application and bitstream to the Ultra96 V2 using the JTAG / UART pod, and if a suitable display is connected, we should see a test pattern.
For this first example, I set the test pattern generator to output a checkerboard design. Running the software showed the following image on my DisplayPort-capable monitor.
Of course, if you are having issues getting the design up and running, there are two ILAs within the design. We can connect to these using the Vivado Hardware Manager and explore the outputs of the test pattern generator and the AXI4-Stream to Video Out block.
Now that we know how we can get the DisplayPort Controller working with the live video input, we will begin to take a look at creating more image processing solutions which make use of it.
I am especially keen to use the high-speed breakout connector on the Ultra96 V2.
The project is on my GitHub.
See My FPGA / SoC Projects: Adam Taylor on Hackster.io
Get the Code: ATaylorCEngFIET (Adam Taylor)
Access the MicroZed Chronicles Archives with over 300 articles on the FPGA / Zynq / Zynq MPSoC updated weekly at MicroZed Chronicles.