Emojis are essential to modern-day communication. Emails, text messages, social media platforms, and live-stream chats all integrate emojis so users can convey emotions visually, without having to type long sentences.
We usually see emojis as part of an app or other software, but never at the hardware level. So, I decided to bring emojis down to that hardware level, utilizing Xilinx’s ZYNQ-7000 SoC platform. And, to add a little spice to the project 😊, I decided to animate them!
Let’s start with a high-level overview of the design. For the animation and image generation, I used Video Graphics Array (VGA) technology for its design simplicity when compared to HDMI, DVI, and DisplayPort. The VGA driver is implemented in Verilog and outputs a 480p picture @ 60 fps. For animation control input, the UART on the ZYNQ processing system (PS) is used. Your text input to the serial terminal controls which emoji is displayed on the monitor. The figure below shows the high-level interactions of the system.
If you have no experience with VGA, please refer to this tutorial from Simply Embedded’s YouTube channel. It is a fantastic resource both for new coders and as a refresher. The VGA driver I created outputs a 640 pixel by 480 pixel picture @ 60 fps.
One note is that the VGA driver needs a 25 MHz pixel clock, while the Arty Z7 board provides a 125 MHz clock, so make sure to include the clk_gen module, which is a PLL-based clock divider.
Generating the Graphics
The different emojis I created are labeled in the GitHub repo as happy, sad, mad, and crazy. If you dive into these, the code can be a little confusing, so let's start with basic image generation for VGA.
An easy thing to start with is the base of the emoji’s face, a yellow square.
The black outline represents the edges of the screen. I broke the image down into vertical sections wherever the colors change, noting the vertical pixel count at each section boundary. See the pictures below.
Now we define each pixel, depending on what section it is in. The pixel count starts at the top left of the screen, so section 1 needs to be defined first. I used if-statements for this.
Section 1 runs from pixel 35 (accounting for vertical blanking) to pixel 134, with 135 being the start of section 2, so we can write this if-statement:
if (i_vcounter < 135)
r_color <= white;
i_vcounter is the vertical counter input that keeps track of the vertical pixel count, r_color is the register that holds the color data for the current pixel, and "white" is a localparam constant equal to a 12-bit hex value of FFF (12'hFFF), representing an equal mix of red, green, and blue, in that order.
Section 2 is a little more difficult, because now there are two colors to worry about, white and yellow. White is represented as 12'hFFF, while yellow is 12'hFF0 (equal parts red and green, no blue).
To know when to switch colors, we rely on the i_hcounter input, which is the horizontal pixel counter. Therefore, section 2 can be broken up into different sub-sections by the horizontal counter.
For section 2, I will code it as:
if (i_vcounter >= 135 && i_vcounter < 604) // define the vertical boundaries
begin
if (i_hcounter < 324)
r_color <= white; // left of the square
else if (i_hcounter < 604)
r_color <= yellow; // the square itself
else
r_color <= white; // right of the square
end
Using this coding scheme, different vertical sections can be defined, each with different colors depending on the horizontal counter. This is how I created all my images. For a more in-depth tutorial, check out my YouTube video on this here.
Animation
Animation is done by creating separate pictures, or frames, and stringing them together to make a “moving” image.
In the code, this is done by creating each separate frame first, then tying them together using a case statement. Each frame is displayed for a defined amount of time. Take for example the “happy” emoji code.
// time in clock cycles to hold each frame still
localparam FRAME_1_time = 37500000; // 1.5 sec
localparam FRAME_2_time = 2500000; // 100 ms
localparam FRAME_3_time = 2500000; // 100 ms
localparam FRAME_4_time = 37500000; // 1.5 sec
// total time in clock cycles of animation before looping around
localparam FRAME_total_time = FRAME_1_time + FRAME_2_time + FRAME_3_time + FRAME_4_time;
//////////////////////////////////////////////////////////////////////////////
// frame timing
always @ (posedge clk)
begin
if (clk_counter < FRAME_total_time)
clk_counter <= clk_counter + 1;
else
clk_counter <= 0;
end
always @ (posedge clk)
begin
if (clk_counter < FRAME_1_time)
FRAME <= FRAME_1;
else if (clk_counter < (FRAME_1_time + FRAME_2_time))
FRAME <= FRAME_2;
else if (clk_counter < (FRAME_1_time + FRAME_2_time + FRAME_3_time))
FRAME <= FRAME_3;
else
FRAME <= FRAME_4;
end
// end frame timing
///////////////////////////////////////////////////////////////////////////////
The time is defined in clock cycles, so a little math is needed to determine “real-world” time.
Creating a Block Diagram
The emoji displayed on the screen depends on what you type into the Vitis serial terminal, but how does that physically work in hardware? The C code on the ZYNQ processor encodes what you type as a 2-bit binary value (four possible values for four different animations) and sends it over an AXI GPIO interconnect to the Verilog (RTL) code. This value is used as the select input of a multiplexer, which takes in the pixel color data for every animation but outputs only one, depending on the encoded select value from the PS.
To start, create a new project in Vivado and add all design/constraint sources. For creating the block diagram, you have two options. You can go old-fashioned and wire it up according to the picture in the GitHub repo (bd.jpeg). Just know that the AXI GPIO should be configured as single channel, all outputs, and 2 bits wide. Also, tie a constant “0” to the i_rst port of the clk_gen module. According to the block diagram this port looks active low, but it isn’t, and I am not sure why it is drawn that way.
The second option is to run the TCL script provided in the GitHub repo. If you have never run a TCL script before, don’t worry, I hadn’t before this project either! Just download and extract it to your project directory, go to the Tools menu, and click “Run Tcl Script.” Navigate to where the script is located and run it!
Voilà! You will have an awesome block diagram! Now create an HDL wrapper and generate the bitstream, and we will move on to the software portion.
Creating the Software
Once the bitstream is generated, export your hardware and bitstream, then launch Vitis. I created a “hello world” project from the templates and renamed the “helloworld.c” file to “main.c”. Let's go over the C code.
First, include these libraries:
#include <stdio.h>
#include <string.h>
#include "platform.h"
#include "xgpio.h"
#include "xparameters.h"
#include "xil_printf.h"
The ones in quotes are Xilinx libraries; some are used with the AXI GPIO block. The string.h library is needed for the strcmp() function later on.
In the set up part of main.c, we need to configure the AXI GPIO block we used.
Make an instance of XGpio and initialize it to the correct device ID. We only used one AXI GPIO block in our project, so our device ID is 0.
XGpio mux_select;
XGpio_Initialize(&mux_select,XPAR_AXI_GPIO_0_DEVICE_ID);
Next, set the data direction to output, because the PS outputs data to the PL.
XGpio_SetDataDirection(&mux_select,1,0);
The last value in the parentheses is the direction mask (bits set to 1 are inputs, bits set to 0 are outputs; 0 here makes all bits outputs). The middle value, a "1" in this example, is the GPIO channel, and should only be changed if dual-channel AXI GPIO is enabled (which it is not in this project).
Then, we enter a for-loop that runs 5 times.
for(int i = 0; i < 5; i++){
scanf("%s", my_string);
print("\n");
printf("you entered: %s\n", my_string);
if(strcmp(my_string, "happy") == 0){
mux_sel_data = 0;
}
else if(strcmp(my_string, "sad") == 0){
mux_sel_data = 1;
}
else if(strcmp(my_string, "mad") == 0){
mux_sel_data = 2;
}
else if(strcmp(my_string, "crazy") == 0){
mux_sel_data = 3;
}
} // close the for-loop
This loop compares what you type into the console to a string, and if it matches, changes the "select" value of the multiplexer in the PL (through AXI GPIO).
The last thing to do is make sure this value is written out to the PL using this function:
XGpio_DiscreteWrite(&mux_select,1,mux_sel_data);
Again, the "1" in the middle refers to the AXI GPIO channel, don't change it.
This isn't all the code, just a brief overview. Make sure to view the entire file on the GitHub repo.
Launching the Project
Plug the PmodVGA into the Arty board, connect the USB cable from your computer to the board, and run the VGA cable from the Pmod to the TV. Make sure the VGA port on your TV is selected as the picture source.
Hit Ctrl+S to save and Ctrl+B to build your Vitis project. In the Explorer window, right-click on your system project, hover over "Debug As," and select "Launch Hardware".
This will program the FPGA with the bitstream, and launch the application software.
Once the debug window pops up, find the "Vitis Serial Terminal" (usually in the bottom center of screen), and click the green "+" button.
Select the proper port and use a Baud rate of 115200.
Maximize the terminal window, and hit the "resume" button on the tool bar to run the code.
Follow the directions in the terminal and have fun! Here are some still images of the animations.
Thank you so much for spending your time implementing my project! If you have any questions, please comment them and I will try to answer as many as possible. Play around with the animations, maybe make the faces different colors, or add other features like a nose or lips. I want to see what you create :)