M5Stack's LLM630 Compute Kit Wants to Host Your On-Device Large Language Models at the Edge

Aixin AX630C system-on-chip delivers enough compute for local LLMs, M5Stack claims, with Llama3.2-1B among those available at launch.

Gareth Halfacree
2 months ago • HW101 / Machine Learning & AI

Embedded and hobbyist hardware specialist M5Stack has announced a new development board targeting those looking to work with on-device large language models (LLMs) at the edge: the M5Stack LLM630 Compute Kit, built around an Aixin AX630C system-on-chip (SoC).

"[The] LLM630 Compute Kit is an AI large language model inference development platform designed for edge computing and smart interactive applications," M5Stack says of its latest development board design. "[The] LLM630 Compute Kit is suitable for security monitoring, smart retail, smart agriculture, smart home control, interactive robotics, and education, providing powerful computing capabilities and flexible expandability for edge AI applications."

M5Stack's latest development board is a compact system-on-module designed for large language models: the LLM630 Compute Kit. (📹: M5Stack)

The LLM630 Compute Kit pairs a system-on-module (SOM) with a compact carrier board barely wider than the physical Ethernet port at one end. The SOM is based around the Aixin AX630C system-on-chip, which pairs two Arm Cortex-A53 general-purpose processor cores running at up to 1.2GHz with a neural coprocessor delivering a claimed 3.2 tera-operations per second (TOPS) of compute at INT8 precision, rising to a claimed 12.8 TOPS at INT4 precision. There's 4GB of RAM, but it is shared: 2GB is given over to the Arm cores for general use, while the remaining 2GB is reserved for the neural coprocessor.
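That 2GB coprocessor budget is why the quantized precisions matter. As a rough back-of-envelope sketch (the parameter count and the decision to ignore KV-cache and runtime overhead are our assumptions, not M5Stack figures), weight storage alone shows why a 1B-class model needs INT8 or INT4 to fit:

```python
# Back-of-envelope check: can a ~1B-parameter model's weights fit in the
# 2GB reserved for the AX630C's neural coprocessor? Illustrative only;
# real deployments also need KV-cache and runtime overhead.

def model_weight_bytes(params: float, bits_per_weight: int) -> float:
    """Approximate weight storage for a model quantized to the given width."""
    return params * bits_per_weight / 8

GiB = 1024 ** 3
params = 1.24e9  # assumed parameter count for a Llama3.2-1B-class model

for label, bits in (("FP16", 16), ("INT8", 8), ("INT4", 4)):
    size = model_weight_bytes(params, bits)
    verdict = "fits" if size < 2 * GiB else "does not fit"
    print(f"{label}: {size / GiB:.2f} GiB of weights -> {verdict} in 2GB")
```

At FP16 the weights alone overflow the 2GB NPU allocation; at INT8 they drop to roughly 1.2GiB, leaving headroom for activations and cache.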

The module includes 32GB of on-board eMMC 5.1 storage, expandable via microSD card, and mates to a carrier board that delivers wired gigabit Ethernet, single-band Wi-Fi 6 through an Espressif ESP32-C6 companion processor, and USB 2.0 host or device connectivity plus a USB-to-UART converter on a pair of USB Type-C connectors. The carrier also provides a two-lane MIPI Display Serial Interface (DSI) supporting up to 1080p30, a four-lane MIPI Camera Serial Interface (CSI) supporting up to 4k30, a battery management system compatible with 3.7V lithium-ion or lithium-polymer batteries, and an integrated Bosch Sensortec BMI270 six-axis inertial measurement unit (IMU).

The SOM at the kit's heart is supported by the StackFlow framework, which delivers a low-code environment for the tweaking and deployment of artificial intelligence (AI) models for everything from computer vision to speech recognition and wake-word detection; it's the Aixin chip's capacity for on-device LLMs, however, that is most likely to appeal. M5Stack says the hardware supports a range of LLMs and other machine learning models including Qwen2.5-0.5B and -1.5B, Llama3.2-1B, and the multi-modal InternVL2.5-1B, while the software will receive ongoing upgrades "to support cutting-edge large models."
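To give a flavor of what low-code deployment of this kind looks like, the sketch below drives a model over newline-delimited JSON on TCP. To be clear, the field names, port number, and reply format here are illustrative assumptions on our part, not M5Stack's documented StackFlow API; consult the StackFlow documentation for the real schema.

```python
# Hypothetical sketch of talking to an on-device LLM over a JSON-over-TCP
# protocol, in the spirit of a low-code framework like StackFlow. Every
# field name, the port, and the framing below are illustrative assumptions.
import json
import socket
from typing import Optional

def build_request(action: str, model: str, prompt: Optional[str] = None) -> bytes:
    """Serialize one request frame; newline-delimited JSON is assumed."""
    frame = {
        "request_id": "demo-001",   # assumed: caller-chosen correlation ID
        "action": action,           # assumed: e.g. "setup" or "inference"
        "model": model,
        "data": prompt,
    }
    return (json.dumps(frame) + "\n").encode("utf-8")

def ask(host: str, prompt: str, port: int = 10001) -> str:
    """Send a prompt and return one raw reply line (port is an assumption)."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(build_request("inference", "llama3.2-1b", prompt))
        return sock.makefile().readline()

# Usage, against a kit on the local network (address is illustrative):
#   print(ask("192.168.1.50", "Summarize edge AI in one sentence."))
```

The point of the structure, rather than the exact schema, is what carries over: the host application stays a thin client, and the heavy lifting happens on the module's neural coprocessor.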

The M5Stack LLM630 Compute Kit has been listed on the company's store at $69.90, though it is not expected to go on sale until after the Chinese New Year.
