Hello, enthusiasts! This is my first post on Hackster.io and I'm excited to share what I've been working on! However, as English is not my first language, please forgive me if my sentences are not perfectly crafted.
My name is Ruben and, at the time of writing, I'm a recent electronics engineering graduate. My interests are digital electronics, signal processing, and embedded systems.
Video streaming is part of our daily lives, from services like Netflix to surveillance systems, conference calls, and surgical imaging. The devices used for this purpose range from servers to PCs and embedded systems.
This project is focused on video streaming using an embedded system as the IP video source. The heart of this implementation is the Avnet MaaXBoard, an embedded Pi HAT-compatible board based on the NXP i.MX 8M Quad Arm v8 processor. We will use the Debian GNU/Linux operating system for this project.
We will use GStreamer, a multimedia framework built around the principle of pipelines, which allow data (video, audio, images) to be modified and transmitted efficiently. The goal is to show how GStreamer pipelines are built for video streaming with a webcam.
Hardware Used
- An Ubuntu 20.04 workstation
- 1920 x 1080 HDMI monitor for the MaaXBoard with a high-quality HDMI cable
- 5 V / 3 A USB Type-C power supply
- NXP's MCU-LINK debug probe
- USB 2.0 cable for the MCU-LINK
- CAT 5e Ethernet cable or better
- Logitech C270 HD webcam
- 32 GB micro SD card
I assume that you have already downloaded and flashed a BSP image onto your MaaXBoard's SD card, and that you have set up your board's Wi-Fi connection. If you haven't, please check this well-written document (Pre-work section):
https://www.hackster.io/flint-weller/avnet-maaxboard-training-2021-april-may-13a51d
Make sure your MCU-LINK is already connected between the host machine and the board. Please ensure that your board is plugged into its power adapter and that it is properly running this BSP (kernel release 4.14.78):
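You can confirm the kernel release from the board's shell with uname (the exact string on your image may include a local build suffix):
# uname -r
>
4.14.78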
You may decide to use other terminal emulators or serial converters, but I am using NXP's MCU-LINK together with tio.
Installing tmux
The sections below require running simultaneous commands inside the MaaXBoard's command shell. Thus, I suggest installing tmux, a tool that lets you run multiple sessions inside a single terminal.
Install tmux on the MaaXBoard:
# apt install tmux
Run tmux:
# tmux
Now open a second session. This will allow you to execute simultaneous commands within tio's terminal session.
# tmux new-session -d -s <session name>
Now jump between sessions using any of these commands:
- Press CTRL + B, then S, for a menu of sessions. Switch to the one you want using the arrow keys and press Enter.
- Press CTRL + B, then ( to switch to the previous session, or CTRL + B, then ) to switch to the next one.
Document's Convention on Commands
You will find instructions meant to be executed on both the workstation and the MaaXBoard. So, I'll follow the convention below to make it easier to identify where each command should be issued. See below:
Workstation:
$ some-command
>
Example output
MaaXBoard:
# some-command
>
...
Example output
...
Where:
$ refers to a command executed on the workstation
# refers to a command executed on the MaaXBoard
> denotes an expected or example output. It can be accompanied by ..., which means that I extracted only the relevant portion of the output.
Pulling Out the Webcam's ID
NOTE: please skip this step if you are using a CSI camera.
We are going to identify our webcam using the MaaXBoard's terminal. Attach your webcam to the board. Then make sure that it has been correctly enumerated by the Linux kernel using the lsusb command.
Identify your webcam's ID in the lsusb output; we will use it in later sections. For example, mine has the ID 046d:0825.
# lsusb
>
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 003 Device 002: ID 046d:0825 Logitech, Inc. Webcam C270
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Retrieving the Webcam's Name and Location Using v4l2-utils
V4L2 is a dedicated framework that allows a Linux system to process video data. In our case, it will handle the video input from our camera and feed it into a GStreamer pipeline. Fortunately, it is preinstalled in our MaaXBoard's BSP.
In addition to V4L2 itself, we will use the v4l2 utilities to retrieve our camera's location within the root file system. This location will be used by V4L2 to fetch the camera's video. Issue the commands below to install the utilities:
# apt update
# apt install v4l-utils
Now that the utilities are installed, let's discover our camera's name and location:
# v4l2-ctl --list-devices
>
i.MX6S_CSI (platform:30a90000.csi1_bridge):
/dev/video0
UVC Camera (046d:0825) (usb-xhci-hcd.4.auto-1):
/dev/video1
From the output above, identify your webcam by the USB ID enclosed in parentheses. Please be aware that your result might differ from the one shown here. Mine is video1 and its path is /dev/video1. We will use the camera's path in future steps.
NOTE: if you are following this lab using a CSI camera, its device name should be video0.
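As an optional check, v4l-utils can also list the pixel formats, resolutions, and frame rates your camera supports. We will obtain the same information with a GStreamer tool later on, so this is just a convenient preview (shown here with my camera's path):
# v4l2-ctl -d /dev/video1 --list-formats-ext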
Installing GStreamer on Both the MaaXBoard and the Workstation
GStreamer uses a pipeline architecture made of blocks, or elements, connected in a chain that starts with a source and ends with a sink. The connection point between elements is called a pad, and its properties determine whether a source is compatible with a sink.
A comprehensive set of documentation is provided by GStreamer developers:
https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html?gi-language=c
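As a minimal sketch of the source-to-sink idea (you can try it once GStreamer is installed in the next step), the following pipeline pushes a handful of dummy buffers from a fakesrc element straight into a fakesink, with no video involved:
$ gst-launch-1.0 fakesrc num-buffers=5 ! fakesink
The pipeline generates five empty buffers, passes them along the chain, and exits.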
As we're going to use GStreamer on both the PC and the board, we should make sure that it's installed and up to date. First on the workstation:
$ sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libgstreamer-plugins-bad1.0-dev gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio
And now on the board (the packages are already installed, but it's worth updating them):
# apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libgstreamer-plugins-bad1.0-dev gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio
Building a videotest Pipeline with GStreamer
We are going to exercise a simple GStreamer pipeline on both machines.
On the workstation:
$ gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
And on the MaaXBoard:
# gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
The snapshot below shows the output of the above pipeline on the PC: a window that will run indefinitely. On the board, the video may fill the whole screen of the attached monitor. Press CTRL + C to end the pipeline on both machines.
Now, let’s analyze what we just did.
We created a pipeline composed of three elements connected with the pipe symbol '!': first a source (videotestsrc), then a colorspace converter (videoconvert), and finally a sink (autovideosink).
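As a quick aside, you can play with the pattern property mentioned in the videotestsrc definition below; for example, the following variation displays a moving ball instead of the default color bars:
# gst-launch-1.0 videotestsrc pattern=ball ! videoconvert ! autovideosink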
Just for your reference, below are the definitions of the above elements (from the FreeDesktop documentation):
videotestsrc
Used to produce test video data in a wide variety of formats. The video test data produced can be controlled with the "pattern" property.
videoconvert
Convert video frames between a great variety of video formats.
autovideosink
Automatically detects an appropriate video sink to use. It does so by scanning the registry for all elements that have "Sink" and "Video" in the class field of their element information, and also have a non-zero autoplugging rank.
The pipeline that we built moments ago is represented as follows:
This section shows how to add the camera to the GStreamer pipeline. The diagram above depicts how we will craft a new test on the MaaXBoard.
To show the camera stream on the HDMI monitor connected to the board, issue the following command, using the path of your camera in the v4l2src element. Remember the Retrieving the Webcam's Name and Location Using v4l2-utils section?
# gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! autovideosink
This is the result.
The GIF above shows the video output on my monitor. As you can see, the image is not displayed properly: only some frames show correct colors, with a periodicity of almost one second. The problem persisted every time I issued the gst-launch command.
You might encounter this issue as well. That's why I would like to show you my workaround, exploring some useful commands along the way in the next sections.
Debugging the First Attempt at a Local Camera Stream
Let's look at the output of the pipeline after it has been running for more than a minute, stopping it with CTRL + C on your workstation's keyboard.
>
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:48.495957963
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Total showed frames (344), playing for (0:00:48.496143480), fps (7.093).
Freeing pipeline ...
The video was displayed at 7.093 frames per second (fps). But you might be used to hearing that video systems can show 60 fps or more. So, our first assumption would be that this operation is saturating the CPU. Is the low frame rate tied to the resolution and the resulting processing load? Let's find out whether we are using the full processor or not.
Install htop for measuring the resource utilization of the board’s CPUs.
# apt install htop
Launch the local stream again and open an htop instance in a new tmux session.
# htop
The image below shows an instance of htop running on the board during the local camera stream. Although the MaaXBoard's main processor has four cores, the first one is handling most of the processing tasks (23.8% utilization for this CPU). We will compare these metrics against a normal idle state of the OS in the next picture.
For comparison, let's review the CPU's performance while the OS is idle. Now, the first core is running at 1.3%.
The following capture shows the processor running with the test video (the first pipeline we built). As expected, CPU resource utilization is negligible.
Since CPU1 is not heavily used, we can assume the system is left with sufficient bandwidth to execute other tasks, so CPU utilization is not causing the green lines on the webcam video. This leads us to a couple of questions: what resolution are we using, and which video format?
It is worth mentioning that the video configuration was automatically set by GStreamer on our first attempt. Hence, we must add the -v parameter to the pipeline to identify it:
# gst-launch-1.0 -v v4l2src device=/dev/video1 ! videoconvert ! autovideosink
>
New clock: GstSystemClock
…
GstPipeline: pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)960, framerate=(fraction)15/2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
…
This input and output syntax is known as caps: these are the settings we need to change in our pipeline to configure the video resolution and format. Use the gst-device-monitor tool to probe the multimedia devices connected to the board. This includes our camera, and it will tell us which video settings the camera supports:
# gst-device-monitor-1.0
>
Probing devices...
Failed to start device monitor
Uh oh! This time, something is preventing the device monitor from starting. However, there is another GStreamer utility we can use to overcome that limitation: debug variables.
Debug variables are useful for getting a log of what happens while a pipeline is built. They offer a set of debug levels, each providing a distinct amount of information, ranging from simple warnings and errors to a complete trace log. For more information, please review the following link:
https://gstreamer.freedesktop.org/documentation/tutorials/basic/debugging-tools.html?gi-language=c
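For reference, GST_DEBUG accepts a global level, per-element overrides, or a comma-separated mix of both; the lines below are illustrative examples only, and unsetting the variable disables the extra logging again:
# export GST_DEBUG="*:2"
# export GST_DEBUG="v4l2src:5,*:2"
# unset GST_DEBUG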
I used debug level 3, known as the FIXME level. It applies to simple debugging cases, as it displays an easy-to-read log. Enable this variable on your board as shown below; you will see that more debug information is now displayed.
# export GST_DEBUG="*:3"
Now, run the device monitor again:
# gst-device-monitor-1.0
>
Probing devices...
0:00:00.059104079 4660 0xaaaad50cd240 WARN default gstdevicemonitor.c:458:gst_device_monitor_start: No filters have been set, will expose all devices found
0:00:00.201592020 4660 0xaaaad50cd240 ERROR pulse pulsedeviceprovider.c:439:gst_pulse_device_provider_start: Failed to connect: Connection refused
Failed to start device monitor!
The new log tells us that PulseAudio refused a connection. PulseAudio is the service that allows the device monitor to show us information about audio devices. We just need to start it and run gst-device-monitor-1.0 again.
# pulseaudio --start
> W: [pulseaudio] main.c: This program is not intended to be run as root (unless --system is specified).
Press CTRL + C and now issue the monitor again.
# gst-device-monitor-1.0
>
…
Device found:
name : UVC Camera (046d:0825)
class : Video/Source
caps : video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/2, 5/1 }; video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/2, 5/1 };
….
video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 15/1, 10/1, 5/1 }; ... video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
...
video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
You will notice that the colorspace format stays the same across all modes (YUY2) and that the camera provides raw video (video/x-raw); the framerate field is the fps. This also means that the previous pipeline (the one we built for local video transmission) used the highest quality that our camera can provide to the board.
Another thing to note is that the fps my camera supports increases as the resolution is lowered.
You guessed it! I will test the pipeline with a lower resolution; that's why I wanted to get the caps from the previous command. In the pipeline model below, the empty quotes are to be replaced with the caps we copied from above.
# gst-launch-1.0 -v v4l2src device=/dev/video1 ! "" ! videoconvert ! autovideosink
Let's try a resolution of 800x600:
# gst-launch-1.0 -v v4l2src device=/dev/video1 ! "video/x-raw, format=(string)YUY2, \ width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, \ framerate=(fraction)5/1" ! videoconvert ! autovideosink
Still the same video, cropped and with green lines. So let's try the lowest possible resolution, 160x120, to make sure the problem is not related to CPU performance:
# gst-launch-1.0 -v v4l2src device=/dev/video1 ! "video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1" ! videoconvert ! autovideosink
It shows the same error. Please see the htop result in the picture below: the usage of the first core now varies between 4.7% and 7%. A lower resolution with the same green lines tells us that CPU usage is probably not the cause of the problem. We will address the issue in the next section.
Fixing the Local Camera Stream
This section pursues a solution related to how the Video4Linux subsystem handles video from the webcam. There may be other solutions besides the one I am providing, so I encourage any experienced reader to propose a different workaround from the one shown here.
The solution presented involves changing the I/O mode of the V4L2 API. From the kernel documentation: "The V4L2 API defines several different methods to read from or write to a device. All drivers exchanging data with applications must support at least one of them. The classic I/O method using the read() and write() function is automatically selected after opening a V4L2 device."
https://www.kernel.org/doc/html/v4.15/media/uapi/v4l/io.html
Let's find out which I/O methods are available on our board. For this, use gst-inspect, which lets us grab information about the plugins and elements available in a GStreamer installation. I use it to verify which pipeline elements are compatible with one another. The following command displays the complete list of elements available on a system:
# gst-inspect-1.0
Now, on the workstation:
$ gst-inspect-1.0
For my workaround, I reviewed the v4l2src element:
# gst-inspect-1.0 v4l2src
> …
io-mode : I/O mode
flags: readable, writable
Enum "GstV4l2IOMode" Default: 0, "auto"
(0): auto - GST_V4L2_IO_AUTO
(1): rw - GST_V4L2_IO_RW
(2): mmap - GST_V4L2_IO_MMAP
(3): userptr - GST_V4L2_IO_USERPTR
(4): dmabuf - GST_V4L2_IO_DMABUF
(5): dmabuf-import - GST_V4L2_IO_DMABUF_IMPORT
…
The V4L2 API documentation (linked above) matches the inspection of v4l2src. You can infer that we were using the auto mode, which selected the classic I/O method. Here's some info about that method from the kernel documentation:
Drivers may need the CPU to copy the data, but they may also support DMA to or from user memory, so this I/O method is not necessarily less efficient than other methods merely exchanging buffer pointers. It is considered inferior though because no meta-information like frame counters or timestamps are passed. This information is necessary to recognize frame dropping and to synchronize with other data streams.
https://www.kernel.org/doc/html/v4.15/media/uapi/v4l/rw.html
That gives us a clue: the auto I/O mode might not be providing useful synchronization information to the pipeline. So another method should be tried. Which one?
According to my tests, the mmap I/O method turned out to work! You can try it right now on your board: just add the io-mode parameter to the previous local camera stream pipeline.
# gst-launch-1.0 v4l2src device=/dev/video1 io-mode=2 ! videoconvert ! autovideosink
Please see the description of the mmap method:
Streaming is an I/O method where only pointers to buffers are exchanged between application and driver, the data itself is not copied. Memory mapping is primarily intended to map buffers in device memory into the application’s address space. Device memory can be for example the video memory on a graphics card with a video capture add-on. However, being the most efficient I/O method available for a long time, many other drivers support streaming as well, allocating buffers in DMA-able main memory.
https://www.kernel.org/doc/html/v4.15/media/uapi/v4l/mmap.html
Here are our performance metrics with htop for this new pipeline. CPU1 utilization is practically the same as in the test without io-mode=2. However, memory usage increased by about 12 megabytes on average compared to the previous method.
Now, let's check the CPU utilization stats:
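If mmap does not clear the artifacts on your particular camera or BSP, the gst-inspect output above lists other streaming modes worth trying; for example, this variation (which I have not verified on my setup) selects the dmabuf mode:
# gst-launch-1.0 v4l2src device=/dev/video1 io-mode=4 ! videoconvert ! autovideosink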
We improved our handling of the camera in the past section. Now let's try a method for streaming video from the MaaXBoard to the workstation. This will be achieved using a CAT 5e Ethernet cable as the link between the two machines.
On your workstation, perform the following steps:
1. Click on the menu at the top right of your screen.
2. Go to Settings → Network and press the configuration icon in the Ethernet settings:
3. Then click on the IPv4 tab. Select the “Shared to other computers” method.
NOTE:
The reason for using an Ethernet cable first is to test streaming at full bandwidth. However, you can get rid of the cable and send video to the workstation over a wireless connection (both machines can connect through Wi-Fi to a local router and send data to each other using their IP addresses).
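If you prefer not to use the "Shared to other computers" method, one manual alternative is to assign static addresses on both ends with iproute2; the interface name and addresses below are just examples that match the ones used later in this document:
$ sudo ip address add 10.42.0.1/24 dev <your Ethernet interface>
# ip address add 10.42.0.160/24 dev eth0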
Testing the Ethernet Connection Between the MaaXBoard and the Workstation
To perform any transmission from the MaaXBoard to the PC, we need to take note of the IP addresses of both. Make sure the Ethernet link is connected between both machines before any testing. For this, issue the ip address show command below.
First on the MaaXBoard:
# ip address show
>
...
2: eth0:
...
inet 10.42.0.160/24 brd 10.42.0.255 scope global dynamic noprefixroute eth0
...
The MaaXBoard's IP address on the eth0 interface is 10.42.0.160. This address will likely be different on your setup.
Now, on the workstation:
$ ip address show
>
...
3: enp0s31f6:
...
inet 10.42.0.1/24 brd 10.42.0.255 scope global noprefixroute enp0s31f6
...
The workstation's Ethernet address is 10.42.0.1, and it may also be different on your setup.
NOTE:
Please be aware that these interface names are specific to my workstation and can differ from yours. That's why I am showing the procedure below, to make sure I am using the correct IP addresses for both devices.
Now ping the devices from each other, first from the MaaXBoard to the PC:
# ping 10.42.0.1
>
PING 10.42.0.1 (10.42.0.1) 56(84) bytes of data.
64 bytes from 10.42.0.1: icmp_seq=1 ttl=64 time=1.23 ms
64 bytes from 10.42.0.1: icmp_seq=2 ttl=64 time=1.56 ms
64 bytes from 10.42.0.1: icmp_seq=3 ttl=64 time=1.23 ms
[10124.947696] fec 30be0000.ethernet eth0: Link is Down
[10134.164833] fec 30be0000.ethernet eth0: Link is Up - 1Gbps/Full - flow control off
[10136.211235] fec 30be0000.ethernet eth0: Link is Down
64 bytes from 10.42.0.1: icmp_seq=22 ttl=64 time=3.15 ms
64 bytes from 10.42.0.1: icmp_seq=23 ttl=64 time=1.17 ms
64 bytes from 10.42.0.1: icmp_seq=24 ttl=64 time=1.29 ms
64 bytes from 10.42.0.1: icmp_seq=25 ttl=64 time=1.20 ms
^C
--- 10.42.0.1 ping statistics ---
25 packets transmitted, 7 received, 72% packet loss, time 464ms
rtt min/avg/max/mdev = 1.167/1.547/3.154/0.668 ms
Halfway through the ping test, I disconnected the Ethernet cable and the link went down; it then recovered once I plugged the cable back in. To cancel the ping command, just press CTRL + C (this also applies on the workstation).
Now let's do the inverse: ping the MaaXBoard from the workstation:
$ ping 10.42.0.160
>
PING 10.42.0.160 (10.42.0.160) 56(84) bytes of data.
64 bytes from 10.42.0.160: icmp_seq=1 ttl=64 time=1.77 ms
64 bytes from 10.42.0.160: icmp_seq=2 ttl=64 time=1.76 ms
64 bytes from 10.42.0.160: icmp_seq=3 ttl=64 time=1.26 ms
64 bytes from 10.42.0.160: icmp_seq=4 ttl=64 time=1.80 ms
From 10.42.0.1 icmp_seq=6 Destination Host Unreachable
From 10.42.0.1 icmp_seq=7 Destination Host Unreachable
From 10.42.0.1 icmp_seq=8 Destination Host Unreachable
64 bytes from 10.42.0.160: icmp_seq=16 ttl=64 time=3.44 ms
64 bytes from 10.42.0.160: icmp_seq=17 ttl=64 time=1.73 ms
64 bytes from 10.42.0.160: icmp_seq=18 ttl=64 time=1.85 ms
^C
--- 10.42.0.160 ping statistics ---
18 packets transmitted, 7 received, +3 errors, 61.1111% packet loss, time 17251ms
rtt min/avg/max/mdev = 1.258/1.942/3.435/0.636 ms, pipe 3
Again, I disconnected and reconnected the cable during the process. As you can see, I simply made sure the link between both machines was OK. I encourage you to do a similar check, as it can help you assess communication errors with more ease.
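As an optional extra (not used in the rest of this document), you could also measure the actual bandwidth of the link with a tool such as iperf3. Install it on both machines, run the server on the board, and point the workstation's client at the board's IP:
# apt install iperf3
# iperf3 -s
$ iperf3 -c 10.42.0.160
Swap the server and client roles to test the opposite direction.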
Streaming Uncompressed Video from the MaaXBoard to the Workstation
We are going to use all the concepts learned so far to transmit video from the board to the PC. We will use the UDP protocol to carry the uncompressed payload from the MaaXBoard to the PC. Under this scheme, the board is known as the source and the PC as the client.
I am using UDP (instead of TCP) because it is connectionless, while TCP is connection-oriented. UDP has significantly lower overhead than TCP, making it more efficient. Moreover, UDP suits streaming services because it tolerates the loss of frames when a connection is unstable.
NOTE:
Before doing any tests involving networking, please make sure that your PC's firewall is correctly configured. If you don't set appropriate rules for receiving information over the network, it is very likely that you won't get any data. You can issue the sudo ufw reset command to reset your firewall settings; do this at your own risk.
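A less drastic option than resetting the firewall is to allow only the UDP port you plan to stream on; for instance, for port 1234, which is used later in this section:
$ sudo ufw allow 1234/udp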
I will start by providing a pipeline model for the source (the MaaXBoard):
# gst-launch-1.0 -v v4l2src device=<YOUR CAMERA> io-mode=2 ! "<CAPS>" ! videoconvert ! queue ! rtpvrawpay ! queue ! udpsink host=<WORKSTATION'S IP> port=<Range: 0 - 65535 Default: 5004>
Here, <CAPS> is meant to be replaced with one of the supported modes we obtained from the gst-device-monitor command.
Implemented source pipeline
The following command starts the uncompressed-video transmission to the workstation.
# gst-launch-1.0 -v v4l2src device=/dev/video1 io-mode=2 ! "video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2" ! videoconvert ! queue ! rtpvrawpay ! queue ! udpsink host=10.42.0.1 port=1234
>
…
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)1280, height=(string)960, colorimetry=(string)BT709-2, payload=(int)96, ssrc=(uint)1810753670, timestamp-offset=(uint)3471799532, seqnum-offset=(uint)30015, a-framerate=(string)7.5
...
Make note of the output from the above pipeline. Look for the caps that start with application/x-rtp and copy them into the <CAPS> of the following sink pipeline. We use the application/x-rtp caps because they give the workstation the means to interpret an RTP (Real-time Transport Protocol) transmission.
Now, take a look at this sink pipeline model, which will run on the workstation:
$ gst-launch-1.0 -v udpsrc port=1234 ! "<CAPS>" ! rtpvrawdepay ! queue ! videoconvert ! autovideosink
Now replace <CAPS> with the string we copied from the source pipeline (it starts with application/x-rtp):
$ gst-launch-1.0 -v udpsrc port=1234 ! "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)1280, height=(string)960, colorimetry=(string)BT709-2, payload=(int)96, ssrc=(uint)2464691286, timestamp-offset=(uint)959957491, seqnum-offset=(uint)32133, a-framerate=(string)7.5" ! rtpvrawdepay ! queue ! videoconvert ! autovideosink
Here is the video output after a minute. I am using the highest resolution that my USB camera can handle.
Also, to measure the fps at the sink, you can use the fpsdisplaysink element instead of autovideosink. Here is an example:
$ gst-launch-1.0 -v udpsrc port=1234 ! "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)1280, height=(string)960, colorimetry=(string)BT709-2, payload=(int)96, ssrc=(uint)2464691286, timestamp-offset=(uint)959957491, seqnum-offset=(uint)32133, a-framerate=(string)7.5" ! rtpvrawdepay ! queue ! videoconvert ! fpsdisplaysink
Just for your reference, here is the CPU utilization reported by htop on the MaaXBoard.
NOTE:
You may notice that at higher resolutions the image can be cropped, with some lines missing, or that some green lines appear. This is fixed by using an encoder such as H.264.
Streaming Compressed Video from the MaaXBoard to the Workstation
The last portion of this document covers encoded video transmission. For this, I will show you the results with the H.264 encoder on my setup.
MaaXBoard:
# gst-launch-1.0 -v v4l2src device=/dev/video1 io-mode=2 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=10.42.0.1 port=1324
Workstation:
$ gst-launch-1.0 udpsrc port=1324 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! fpsdisplaysink
I used the same video resolution in this experiment. Sadly, I achieved no more than 3 fps displayed on the workstation, and this is linked to excessive CPU use (see the performance metrics below):
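Software encoding is what saturates the CPU here. One mitigation I have not benchmarked on this board, but which should lower the encoder's load, is reducing x264enc's effort and target bitrate through its speed-preset and bitrate (kbit/s) properties:
# gst-launch-1.0 -v v4l2src device=/dev/video1 io-mode=2 ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1000 ! rtph264pay ! udpsink host=10.42.0.1 port=1324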
Here are a few improvements that could be made to the project:
- Consider that the webcam used couldn't provide high-resolution video and was limited to 7.5 fps at best.
- Future work with either a better webcam or an OV5640 CSI camera (1080p at 15 fps) would better suit applications that require higher image quality, taking advantage of the i.MX 8MQ's CSI interface.
- As per my research, the Debian-based BSP for the MaaXBoard doesn't include the dedicated colorspace-conversion and video-offload engines for the i.MX 8MQ (such as imxvideoconvert_g2 and vpuenc_h264). These plugins are provided by NXP and would allow better results with GStreamer pipelines for both compressed and uncompressed video transmission.
Stay tuned for a second project with improved pipeline examples, so that better resolution and performance can be achieved.
Acknowledgments
I want to thank my colleague, Flint Weller, for his valuable suggestions throughout the creation of this document. Special thanks also to all the users who shared their experience and knowledge on the GStreamer developer forums.
You can check out Flint's other posts here:
https://www.hackster.io/flint-weller
Suggested Material
There is an awesome talk on GStreamer that everyone interested should watch; it was presented by Jan Schmidt and is available on YouTube.
Free Desktop GStreamer Tutorial:
https://gstreamer.freedesktop.org/documentation/tutorials/index.html?gi-language=c