It was about 20 years ago that I started studying C programming in high school. It was the 90s, and everybody but me had a mobile phone. At the time Nokia phones were ubiquitous, affordable and reliable, and included video games, of which the most popular was by far Snake. I learned to play Snake on a friend's phone during some boring class. So when I discovered that the Borland Turbo C compiler included a graphics library and decided to try making a video game, a Snake clone was the natural choice.
I suspect my game was terrible: I had spent a long time on an ASCII-art "Snake!" banner, built a text-mode window manager library for the menus, and learned about double buffering for the graphics, but I had also totally misunderstood memory management, since nobody had ever told me what memory pointers were actually for. As a consequence, a few things made raw use of unallocated memory. But I had a more than obsolete 100 MHz 486 PC with 9 MB of RAM that my father had recovered as a decommissioned device from his office: it ran DOS, so with a single task and relatively little memory in use, everything ran just fine. Moreover, the timing library was totally unreliable on my machine, and while I had built a calibration library that used the RTC to measure the machine's execution speed, the game still had a tendency to run faster on more powerful PCs.
I think I have lost the source code, but I had an executable lying around on some backup device, and when I tried to run it on newer Windows XP machines a few years later, it would crash almost immediately, complaining about an attempt to access a protected memory region.
Still, in retrospect I am quite surprised by what I could accomplish unguided, with only basic programming training and no help beyond the documentation bundled with the Turbo C compiler.
Fast forward to today: after finally learning how to program my NXP Rapid IoT Prototyping Kit with mbed, and before embarking on a moonshot project with it, like running neural networks, porting the Arduino API, or getting BLE to work, I felt it was time to make a more fun use of what I had just learned.
Repositories on Mbed

You can then imagine my nostalgic feeling when I bumped into the Silicon Labs Hungry Gecko repository shared on mbed, and why it became the obvious intermediate-level project candidate.
Having said that, if you fancy going straight to compiling and running the code, you can simply head to the mbed repository; or continue reading if you care about the technical details I had to figure out.
LCD

The main work was adapting the original code to the Rapid IoT LCD driver and its different form factor and resolution.
I had already verified an existing driver on the Rapid IoT kit, which seemingly worked flawlessly.
However, when trying to fit the graphics functions, I had trouble understanding how the .window and .update methods were supposed to work together.
After implementing the right .window() selection call for all of the graphic elements, I saw that the game was still not working properly: even when selecting a reduced x section, whole lines were actually updated, and only the pixels drawn after the latest .update call were kept. As a consequence, every time I tried to draw anything, everything already drawn on the same lines was erased.
Looking at the window method implementation, I noticed that in my case GraphicsDisplay::window was actually never called.
/** Set window region */
void ColorMemLCD::window( int x, int y, int w, int h )
{
#if ( LCD_DEVICE_HEIGHT != LCD_DISP_HEIGHT_MAX_BUF )
    if( ( ( x & 0x01 ) == 0x01 ) ||
        ( ( w & 0x01 ) == 0x01 ) ) {
        /* Parameter Error */
        GraphicsDisplay::window( x, y, w, h );
        return;
    }
Also, I guessed that the frame buffer allocated in RAM could not contain the whole screen. I then figured out that the vertical size of the memory buffer was controlled by the constant LCD_DISP_HEIGHT_MAX_BUF, which I therefore increased to the full height of the screen.
/** @def
* window system define
*/
#define LCD_DISP_WIDTH (176)
#define LCD_DISP_HEIGHT (176)
#define LCD_DISP_HEIGHT_MAX_BUF (176) // (44)
After that, the .window and .update calls started behaving as I expected, but the refresh rate dropped considerably.
My understanding is that in this configuration the .update call refreshes the whole frame buffer at once. I therefore reworked the code to include a single .update() call in the main loop.
Still, I kept the option to revert to the slow implementation by setting to 1 the constant MULTI_UPDATE that I added to the file settings.h.
#define MULTI_UPDATE 0
According to the datasheet, the display requires continuously switching polarity with a minimum frequency of 1 Hz. I implemented this in the pollingUpdate function; however, I find that this process adds an unpleasant flickering to the screen, especially noticeable with a non-blank background color.
#define POLLING_LOOP 3

void pollingUpdate(void)
{
    static uint8_t idx = POLLING_LOOP;
    idx--;
    if( idx == 0 ) {
        display.polling();
        idx = POLLING_LOOP;
        // led_blue = !led_blue;
    }
}
Ticker

The original Hungry Gecko code used a LowPowerTicker object to time the button capture and screen refresh at a fixed rate. This did not work right away on the Rapid IoT kit: in particular, it seems any call to sleep() or __WFI() would put the core to sleep without it ever waking up again.
I then figured out that the default Ticker object works fine instead, and moved to that implementation.
If I ever have the need, I might go back and try to fix low-power mode: maybe it is just a matter of enabling an interrupt source.
Touch

The original code used two buttons for controlling the snake; however, due to the positioning of the buttons on the Rapid IoT kit, I found this solution awkward and counter-intuitive. It seemed much more natural to use the touch sensors, which are conveniently laid out in a cross (up, down, left, right).
I had already quickly tested an SX9500 driver I had found, and verified it would return something; however, when trying to extract the real touch readings, I was not able to get any intelligible data. I then went back to the NXP Rapid IoT SDK, extracted the SX9500 chip settings, and pushed them into the init() method. After doing so, I updated my touch controller example and observed that it finally worked well.
Hopefully this also results in a driver that someone else can reuse in other projects.