This tutorial draws from the getting started guide here.
Kit contents
Besides the MAX78000, the kit contained:
- OVM7692-RYAA - camera module
- Olimex ARM-JTAG-20-10 adapter
- Olimex ARM-USB-OCD-H
- Extra jumpers
I'm using Windows, so I simply downloaded and ran the Maxim Micros SDK installer (if you're on Linux, see the original tutorial here for install instructions).
- At the first screen that appears, click the Settings button in the lower left corner of the application.
- Click on the Repositories tab.
- Add three temporary repositories: click on Temporary repositories, then click the Add button three times and enter the following URLs in the repository field:
http://www.mxim.net/prerelease/msdk/dist/libs
http://www.mxim.net/prerelease/msdk/dist/dev
http://www.mxim.net/prerelease/msdk/toolchain/win32
- Click on the Use temporary repositories only checkbox to enable it.
- Click the OK button and follow the on-screen instructions to complete the installation.
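Once the installer finishes, you can sanity-check the install from a MinGW shell. A quick check (the paths below assume the default C:/MaximSDK install location):

# Confirm the Arm GCC toolchain is available (version output will vary by SDK release)
arm-none-eabi-gcc --version
# Confirm the MAX78000 examples were installed
ls /c/MaximSDK/Examples/MAX78000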
We'll be using the camera module in the kit to recognize dogs and cats (I don't have a dog or cat, so I showed it photos on my phone, which worked ok). Connect the camera to the kit facing outwards, as shown:
1. Make sure the PWR switch is in the "OFF" position.
2. Check that jumper JP1 is installed.
3. Check that jumpers P0_0 and P0_1 are installed on JH1.
4. Connect a USB cable from the PC to the USB/PWR connector (CN1) of the EV kit. This cable powers the board and provides a virtual serial port connection to the MAX78000's UART.
5. Install JP7 and JP14 to enable the external boost controller.
6. Move the power switch to the "ON" position.
If your board is new, it should have the blink app preloaded. Test it out with the following steps (otherwise skip ahead to the next section):
- Open TeraTerm or a similar serial terminal application, and connect with a baud rate of 115200 and 8-N-1 settings.
- Reset the EV kit via SW5. You will see a message from the MAX78000 appear in the terminal, and LED1 (D1) on the board will begin blinking at a steady rate.
Connect the ARM-JTAG-20-10 adapter to your board and use the provided micro-USB cable to plug it into your PC.
The first step is to build the example from the SDK.
Click on Start and then select the shortcut under "Maxim Integrated SDK" to open the MinGW shell.
To build the example, simply change directory into the example folder and run make. We'll be building the example that recognizes cats and dogs:
cd /c/MaximSDK/Examples/MAX78000/CNN/cats-dogs_demo
make
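If you modify the source or want to start over, the example Makefiles in the SDK typically include a clean target (a quick sketch, assuming the standard Makefile layout):

# Remove previous build artifacts, then rebuild
make clean
make
# The resulting ELF lands in the build/ subfolder as max78000.elf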
You'll need to have two MinGW shells open to connect to the board.
Open a second MinGW Shell and navigate to the OpenOCD directory. Launch OpenOCD (open on-chip debugger):
cd /c/MaximSDK/Tools/OpenOCD
openocd -f interface/cmsis-dap.cfg -f target/max78000.cfg -s /c/MaximSDK/Tools/OpenOCD/scripts
Note: I got the error "unable to find CMSIS-DAP device" when I plugged the debug adapter into a USB extender. When I connect the adapter directly to my PC, it works fine.
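If the error persists and you're using the Olimex ARM-USB-OCD-H from the kit rather than a CMSIS-DAP probe, you may need an FTDI interface config instead. A hedged example (stock OpenOCD ships interface/ftdi/olimex-arm-usb-ocd-h.cfg; check the scripts/interface folder of Maxim's bundled OpenOCD for the exact file name):

# Alternative launch for the FTDI-based Olimex ARM-USB-OCD-H (config path is an assumption)
openocd -f interface/ftdi/olimex-arm-usb-ocd-h.cfg -f target/max78000.cfg -s /c/MaximSDK/Tools/OpenOCD/scripts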
From the original MinGW shell, navigate to the build folder where the elf file was saved.
cd /c/MaximSDK/Examples/MAX78000/CNN/cats-dogs_demo/build
Run this command to launch GDB:
arm-none-eabi-gdb max78000.elf
From GDB, connect to OpenOCD:
(gdb) target remote localhost:3333
Reset the MAX78000:
(gdb) monitor reset halt
Use GDB to load the file onto the board:
(gdb) load
Verify the application:
(gdb) compare-sections
Reset the device and run the application:
(gdb) monitor reset halt
(gdb) c
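If you flash the board often, the whole GDB sequence above can be run non-interactively from the shell using -ex options (a minimal sketch; it assumes OpenOCD is still running in the second shell):

# Connect, flash, verify, then reset and run in one command
arm-none-eabi-gdb max78000.elf \
  -ex "target remote localhost:3333" \
  -ex "monitor reset halt" \
  -ex "load" \
  -ex "compare-sections" \
  -ex "monitor reset halt" \
  -ex "continue"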
Congrats! You can now recognize dogs and cats in real time using the camera!
The MAX78000 boasts incredibly low power - but how low? I decided to measure.
Measuring CNN Current
To measure the CNN current, connect a low impedance current meter (<5 mΩ) across JP13. If the meter impedance is greater than 5 mΩ then also remove R14 from the board.
The results? Not very much power is needed - this CNN draws only 7 µA at peak current!
Going Further
You can find more examples for the SDK under the folder C:/MaximSDK/Examples/MAX78000/
The CNN folder has all of the image recognition examples. You can also train your own models by following the ai8x-training repository.
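Training runs on a separate Linux machine with Python (a CUDA GPU helps). Cloning the training repo might look like this (the URL assumes the MaximIntegratedAI GitHub organization - double-check the current location, since Maxim repositories have been migrating under Analog Devices):

# Clone the model training repository (URL is an assumption; --recursive pulls any submodules)
git clone --recursive https://github.com/MaximIntegratedAI/ai8x-training.git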