NVIDIA has announced availability of the Jetson AGX Orin 32GB system-on-module (SOM), a part it had originally said would not ship until the end of the year, pricing the high-end embedded machine-learning device at $999.
The Jetson AGX Orin 32GB, designed for on-device artificial intelligence and machine learning workloads, offers 200 TOPS of INT8 compute across eight Arm Cortex-A78AE processor cores, an Ampere GPU with 1,792 CUDA cores and 56 Tensor cores, and on-board vision and deep-learning accelerator coprocessors. The model launched this week includes 32GB of LPDDR5 memory and 64GB of eMMC 5.1 storage on-board.
NVIDIA unveiled the Jetson AGX Orin family back in November last year, promising a sixfold performance boost over its previous-generation system-on-module parts, but it wasn't until March this year that anyone could pick up the Jetson AGX Orin Developer Kit. Even then, the modules themselves were delayed: the company revised its original estimate of year-end availability to July, a deadline it then missed by only a matter of days.
The 32GB SOM is priced at roughly half the cost of the Jetson AGX Orin Developer Kit, but offers lower performance. While both include a Jetson AGX Orin SOM with 32GB of RAM, the Developer Kit uses a more powerful variant with 12 Arm Cortex-A78AE cores, 2,048 CUDA cores, and 64 Tensor cores, delivering 275 TOPS of INT8 performance.
Alongside commercial availability of the module, NVIDIA has announced compatible designs from its partner ecosystem, publishing a list of full systems with the module pre-installed and compatible carrier boards to the company's developer site.
The Jetson AGX Orin 32GB module is now available to buy via the NVIDIA Store at $999; the AGX Orin Developer Kit, with its more powerful SOM variant, carrier board, cooling system, and housing, is available at $1,999. NVIDIA has not yet confirmed a launch date for the Jetson AGX Orin 64GB module or for the lower-end parts in the family.