Every day, supermarkets around the world discard thousands of perfectly edible fruits and vegetables. Not because they're unsafe or nutritionally lacking, but simply because they don't meet visual standards. This widespread preference for “picture-perfect” produce has created a hidden but massive sustainability crisis.
In 2024, Germany generated over 18 million tons of food waste, with a substantial portion stemming directly from retail shelves and household rejection of cosmetically flawed goods. According to the German Environment Agency, nearly half of this waste could have been avoided with smarter handling, better incentives, and more accurate assessments of actual food quality. In some discount supermarket chains, the problem is especially stark: only 17 percent of apples on display are ever sold. The rest are discarded, primarily for failing to meet cosmetic expectations—even if they're still perfectly fresh and nutritious.
This is not just a local issue. Globally, the United Nations reports that over 1 billion tons of food are wasted every year. Fruits and vegetables, due to their perishability and fragile appearance, represent the largest share of this total. The consequences go far beyond waste itself. Food loss is responsible for an estimated 8 to 10 percent of global greenhouse gas emissions, putting enormous pressure on the environment. What's particularly tragic is that most of this food is discarded without any reliable method to determine its true condition or remaining shelf life. Retailers lack real-time tools to objectively assess freshness, often defaulting to overcautious practices. Meanwhile, shoppers are rarely rewarded for choosing less visually appealing, but otherwise high-quality, produce, leaving little motivation to make sustainable choices.
The result is a broken feedback loop: retailers dispose of produce prematurely to avoid risk, and consumers reinforce these behaviors by buying only the “best-looking” items. This cycle not only drives waste, but also contributes to inflated prices, reduced access to affordable food, and a growing environmental burden.
The Solution: SmartScale

SmartScale offers a fundamentally different approach to evaluating and selling fresh produce. Instead of relying on appearance alone, it uses radar-based imaging and artificial intelligence to assess the true internal and external condition of fruits and vegetables, right at the point of sale.
The system is designed to operate as part of a smart retail checkout experience. When a customer places a piece of fruit on the built-in radar-equipped scale, SmartScale instantly scans it using high-frequency radar. This allows the device to detect both visible and invisible imperfections, such as soft spots, bruising, or signs of overripeness. At the same time, the system identifies the fruit type and calculates a freshness score using a custom-trained neural network running directly on the device.
Based on the assessed freshness, SmartScale immediately applies a proportional discount and prints a price label that reflects the fruit’s actual value, encouraging customers to make informed, sustainable choices. The customer experiences a seamless, intuitive transaction: savings are visible, the impact is tangible, and the process takes just a few seconds.
By equipping supermarkets with a scalable, AI-powered pricing assistant, SmartScale helps reduce food waste, improve inventory turnover, and increase sell-through of produce that might otherwise be discarded. For consumers, it transforms sustainable shopping into a smart financial decision. And for the broader food ecosystem, it represents a powerful example of how edge AI can solve real-world problems - quietly, efficiently, and at scale.
SmartScale doesn’t just optimize produce pricing. It redefines what “fresh” means in the modern grocery store, and makes sustainability a natural part of everyday shopping behavior.
Technical Execution

The current SmartScale prototype was developed as a focused proof of concept, using bananas as the initial fruit type for training and testing. This choice allowed for tighter control of variables such as ripening stage, shape, and size during data collection, while also targeting one of the most commonly wasted produce items in retail environments. Despite this initial focus, the entire system was designed with scalability in mind, both in hardware performance and model architecture, making it adaptable to a wide variety of fruits and vegetables in future iterations.
At the core of the system is the Infineon PSoC™ 6 AI Evaluation Kit, which integrates a low-power dual-core MCU with a high-frequency radar sensor. Radar was selected as the primary sensing technology for its ability to detect subtle internal and surface-level changes in fruit quality, even when such defects are invisible to the human eye. Unlike optical sensors, radar is unaffected by ambient lighting and can provide consistent results in varied retail environments.
During operation, when a banana is placed on the SmartScale, the radar module captures a time-domain reflection profile that corresponds to the fruit’s internal structure and surface characteristics. This raw signal is processed locally to extract frequency and amplitude features, which are then passed into a lightweight convolutional neural network. The network, trained to recognize patterns associated with ripeness and surface degradation, outputs both a classification (confirming the fruit type) and a freshness score, which is scaled between 0 and 100. This score is then mapped to a dynamic price reduction and displayed on the screen. Further versions could print the discount directly onto a thermal label.
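The exact features used in the production pipeline come from DEEPCRAFT Studio's preprocessing and are not spelled out in this write-up. As a rough illustration of the idea, here is a minimal sketch of amplitude-style features one might pull from a single time-domain radar frame; the struct fields and function name are assumptions, not the project's actual code.

```cpp
#include <cmath>
#include <vector>

// Hypothetical amplitude features extracted from one radar frame.
// The real SmartScale pipeline relies on DEEPCRAFT Studio's signal
// processing tools; this sketch only illustrates the general idea of
// reducing a raw reflection profile to a few numbers.
struct RadarFeatures {
    float meanMagnitude;  // average |sample| across the frame
    float peakMagnitude;  // strongest reflection in the frame
    float energy;         // sum of squared samples
};

RadarFeatures extractFeatures(const std::vector<float>& frame) {
    RadarFeatures f{0.0f, 0.0f, 0.0f};
    if (frame.empty()) return f;
    for (float s : frame) {
        float mag = std::fabs(s);
        f.meanMagnitude += mag;
        if (mag > f.peakMagnitude) f.peakMagnitude = mag;
        f.energy += s * s;
    }
    f.meanMagnitude /= static_cast<float>(frame.size());
    return f;
}
```

In the real system, features like these would be assembled into the input tensor consumed by the convolutional network rather than used directly.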
Model training was conducted using DEEPCRAFT Studio, an embedded AI development platform that enabled rapid prototyping and performance tuning for resource-limited devices. Its signal processing tools were instrumental in converting raw radar reflections into clean, standardized input tensors. DEEPCRAFT also allowed for easy model quantization and pruning, helping compress the neural network to fit within the PSoC 6's modest memory footprint without sacrificing accuracy. The resulting model runs inference in under 250 milliseconds, making it responsive enough for real-time checkout scenarios.
All computation occurs entirely on the device, with no reliance on cloud connectivity. This ensures low latency and maintains customer data privacy, while also making the solution deployable in low-infrastructure retail settings. Though the current model is limited to bananas, the same architecture can be expanded to accommodate a broader range of produce types. With additional labeled radar data, the classification layer can be retrained or extended to include apples, tomatoes, avocados, and other high-waste items, each benefiting from the same freshness scoring and discount mechanism.
Ultimately, SmartScale demonstrates that edge AI can deliver fast, reliable insights into food quality using compact, affordable hardware and non-invasive sensors. By validating the concept with bananas and building a generalizable pipeline, the project lays the groundwork for a more intelligent, sustainable retail future where cosmetic imperfections are no longer a barrier to smart pricing and waste reduction.
System Architecture

SmartScale was designed with modularity and responsiveness in mind, separating critical sensing and inference tasks from user-facing functionality through a dual-MCU architecture. This architecture allows each component to focus on its strengths, balancing real-time AI performance with flexible connectivity and dynamic web-based visualization.
At the core of the system is the Infineon PSoC™ 6 AI Evaluation Kit, which handles all radar sensing and on-device machine learning. When a piece of fruit is placed on the integrated radar sensor, the PSoC captures and processes the raw signal data. After local feature extraction, a quantized neural network infers a ripeness score in real time. This score, scaled from 0 to 100, represents the internal and external freshness of the fruit based on radar signal distortion, softening patterns, and moisture content.
Once the ripeness score is calculated, it is transmitted over UART to a secondary microcontroller: the Adafruit ESP32 Feather V2. The ESP32 serves as the system's connectivity and interface hub. It listens for incoming ripeness scores from the PSoC 6 and uses that data to compute a corresponding discount value, based on a preconfigured pricing model. This logic can be easily tuned depending on store policy or inventory priorities.
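The write-up describes the pricing model only as "preconfigured" and tunable, so the following is a sketch under the assumption of a simple linear rule: a score of 100 earns no discount, and the discount grows as freshness drops, up to a store-configurable cap. The function names and the 50 percent cap are illustrative, not the project's actual policy.

```cpp
// Hypothetical linear pricing model for the ESP32 side.
// A fruit scoring 100 gets no discount; the discount grows linearly
// as the freshness score drops, capped at maxDiscountPct. All numbers
// here are illustrative and would be tuned to store policy.
int discountPercent(int freshnessScore, int maxDiscountPct = 50) {
    if (freshnessScore < 0) freshnessScore = 0;
    if (freshnessScore > 100) freshnessScore = 100;
    return (100 - freshnessScore) * maxDiscountPct / 100;
}

float discountedPrice(float basePrice, int freshnessScore) {
    return basePrice * (100 - discountPercent(freshnessScore)) / 100.0f;
}
```

Keeping this rule in one small function is what makes it "easily tuned": swapping in a stepped or nonlinear curve only touches `discountPercent`.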
The ESP32 also hosts a lightweight web server, which delivers a real-time webpage to a nearby tablet at the point of sale. This page is automatically updated with the fruit’s ripeness score, discount percentage, and final price. By decoupling the inference pipeline from the user interface, the system ensures responsive web updates without affecting sensor throughput or neural network performance.
This distributed architecture offers several advantages. It isolates the radar and AI workload to a dedicated processor, freeing the ESP32 to focus solely on network communication and user interaction. It also provides flexibility for future extensions, such as integrating NFC for checkout, pushing data to the cloud for analytics, or supporting multilingual web interfaces across different regions. The use of UART for inter-device communication ensures reliable, low-latency data transfer with minimal wiring and overhead.
Together, the PSoC 6 and ESP32 create a well-balanced system that showcases the potential of edge AI in real-world retail scenarios. By bridging embedded intelligence with accessible, real-time web feedback, SmartScale makes the technology seamless for both store staff and everyday shoppers.
Data Collection & Model Training

The training process behind SmartScale began with a tightly focused dataset: five bananas observed over a two-week period, scanned repeatedly using the radar sensor on the Infineon PSoC™ 6 AI Evaluation Kit. Each fruit was measured from multiple angles and orientations throughout various ripening stages, capturing radar reflections that encode internal moisture levels, structural changes, and surface softness, key indicators of freshness that radar is uniquely positioned to detect.
I categorized the scans into four ripeness classes: ripe, overripe_1, overripe_2, and overripe_3. Each class was then mapped to a freshness score for consistency:
- ripe = 100
- overripe_1 = 75
- overripe_2 = 50
- overripe_3 = 25
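In code, this label-to-score mapping reduces to a small lookup, shown here as a sketch (the function name is mine, but the class names and scores are exactly those used during labeling):

```cpp
#include <map>
#include <string>

// Freshness score assigned to each hand-labeled ripeness class,
// matching the mapping used during data collection.
// Returns -1 for an unknown label.
int classToScore(const std::string& ripenessClass) {
    static const std::map<std::string, int> scores = {
        {"ripe",       100},
        {"overripe_1",  75},
        {"overripe_2",  50},
        {"overripe_3",  25},
    };
    auto it = scores.find(ripenessClass);
    return it != scores.end() ? it->second : -1;
}
```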
While this small-scale dataset allowed for rapid prototyping, it also introduced certain limitations. Most notably, the ripeness labels were based on human observation: I assessed texture and color by eye and firmness by touch. The boundary between overripe_1 and overripe_2, for instance, was often a matter of judgment rather than precise measurement.
These labels, while consistently applied, were inherently subjective and not derived from scientific or chemical freshness benchmarks. As a result, the model learned to predict ripeness based on a signal-to-score mapping that reflects human intuition more than objective ground truth. This makes the current model suitable for demonstration purposes but not yet ready for commercial deployment or fine-grained accuracy across broader use cases.
Despite these constraints, the collected data did reveal meaningful trends. Bananas early in their shelf life produced cleaner, stronger radar reflections, while aging ones exhibited more noise, signal diffusion, or internal inconsistencies.
DEEPCRAFT Studio simplified the preprocessing pipeline, offering tools to clean and normalize the radar data, visualize feature sets, and rapidly experiment with model architectures. The final model, trained on the limited dataset, was then quantized and pruned for microcontroller deployment. While inference results aligned well with the hand-labeled scores during testing, further evaluation under more varied and controlled conditions will be essential to validate the model’s generalizability.
The current model achieves real-time inference on the PSoC 6 with latencies under 250 milliseconds. However, scaling this system to support a wider range of fruits and to deliver more objective freshness scores will require both expanded data collection and scientifically grounded labeling methods. This could involve integrating firmness testers, ethylene gas sensors, or spectrometers for labeling calibration, alongside human assessment.
The full DEEPCRAFT Studio training pipeline is detailed here for those who want to explore or replicate the process.
Web Server Development

To handle the front-end interaction for SmartScale, I developed a lightweight web server running on an Adafruit ESP32 Feather V2. The ESP32 receives the ripeness score over UART from the PSoC™ 6 AI Kit and uses that value to calculate a real-time discount. It then updates a simple HTML interface that's displayed on a nearby tablet or smartphone via Wi-Fi.
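The wire format between the two boards isn't documented in this write-up, so the parser below assumes a simple line protocol such as `SCORE:87`; the prefix and framing are hypothetical. The point of the sketch is the defensive parsing the ESP32 side needs, since a garbled UART byte shouldn't produce a bogus discount.

```cpp
#include <string>

// Parse a UART line such as "SCORE:87" into an integer score.
// The "SCORE:" framing is an assumption for this sketch; the actual
// protocol between the PSoC 6 and the ESP32 may differ.
// Returns -1 for malformed input or out-of-range scores.
int parseScoreLine(const std::string& line) {
    const std::string prefix = "SCORE:";
    if (line.compare(0, prefix.size(), prefix) != 0) return -1;
    try {
        int score = std::stoi(line.substr(prefix.size()));
        return (score >= 0 && score <= 100) ? score : -1;
    } catch (...) {
        return -1;
    }
}
```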
I chose to build this part of the system using PlatformIO. Although Infineon’s ecosystem is robust, I found PlatformIO to be more familiar and flexible for developing ESP32-based applications, especially with its better dependency management, built-in support for the Arduino framework, and easier debugging workflow.
The server itself is built using the ESPAsyncWebServer library, which allowed me to serve dynamic content efficiently without blocking the main loop; that was crucial for keeping the UART communication responsive. Once the ESP32 receives the ripeness score from the PSoC 6, it applies a simple pricing rule (the riper the fruit, the deeper the discount) and injects the updated values into an HTML template rendered on the fly.
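The template injection boils down to string substitution. ESPAsyncWebServer ships its own `%PLACEHOLDER%`-style template processor; the helper below is a standalone sketch of the same idea, with the placeholder convention assumed rather than taken from the project's actual templates.

```cpp
#include <string>

// Replace every occurrence of `key` in `tmpl` with `value`.
// The %SCORE%-style placeholder convention mirrors the one used by
// ESPAsyncWebServer's template processor; placeholder names here are
// assumptions for this sketch.
std::string fillPlaceholder(std::string tmpl,
                            const std::string& key,
                            const std::string& value) {
    size_t pos = 0;
    while ((pos = tmpl.find(key, pos)) != std::string::npos) {
        tmpl.replace(pos, key.size(), value);
        pos += value.size();  // skip past the inserted value
    }
    return tmpl;
}
```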
Although the interface is minimal in its current form, it provides a clear, real-time snapshot of the analysis performed by the AI model. This UI could easily be expanded to show trend graphs, historical freshness data, or even customer-facing sustainability tips in future versions.
The choice to offload the web server to the ESP32 was driven partly by time constraints. I considered hosting the web server directly on the PSoC 6 AI Kit itself, but didn’t manage to implement and debug the networking stack in time for the prototype. Still, the current architecture, with clear separation of inference and interface, worked reliably and made iteration on both components easier.
Lessons Learned

This project taught me a lot, not just about radar and embedded AI, but also about navigating the real-world challenges of prototyping on unfamiliar hardware. At the beginning, getting everything up and running was tougher than I expected. ModusToolbox didn't work out of the box, which was a frustrating start. But to their credit, the contest team responded incredibly quickly and helped me get back on track. That kind of hands-on support made a huge difference early on.
Once I got into development, I realized that a lot of the sample code was heavily templated and abstracted. While that’s great for quick starts, it made it hard for me to understand what was happening under the hood, especially when working with the radar interface or trying to optimize for real-time performance. I found myself wishing for a simpler, bare-metal example to better understand how everything fit together.
That said, once I had the core system running, I was genuinely impressed by the hardware itself. The radar sensor was more powerful than I expected. It could detect differences in banana ripeness that were almost impossible to see visually. And the PSoC 6’s dual-core setup, low power draw, and built-in AI acceleration made real-time inference feel surprisingly smooth. I also appreciated the Qwiic connectors and accessible I/O, which made integrating the ESP32 over UART fast and hassle-free.
Working with DEEPCRAFT Studio was another highlight. It made training and deploying my model much easier than building a full pipeline from scratch. However, I also realized that labeling ripeness data by hand, based mostly on touch and appearance, was a weak link. While it worked for a proof-of-concept, I’d want more objective ground truth for any future model to be truly reliable.
Next Steps & Future Work

Now that I've proven the idea can work (at least for bananas), the next step is to scale up. I want to collect more radar data from different fruits like apples, tomatoes, and avocados, especially the ones that often get thrown out in stores because they don't look perfect. But to make the model more accurate and generalizable, I'll need to upgrade my labeling process. That might mean bringing in tools like firmness testers or even using gas sensors to track ethylene production as a more scientific measure of ripeness.
On the software side, I plan to build a multi-class classification system that first identifies the type of fruit and then runs the appropriate freshness model for that type. This way, the system could be dropped into any supermarket and adjust in real time to whatever produce is placed on the scale.
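Since the per-fruit models don't exist yet, the following is purely a structural sketch of how such a two-stage dispatch might look: one table keyed by the classified fruit type, each entry holding that fruit's freshness model. The stub models and their thresholds are invented for illustration.

```cpp
#include <functional>
#include <map>
#include <string>

// Hypothetical two-stage pipeline for the planned multi-fruit system:
// stage one identifies the fruit type, stage two runs that fruit's
// freshness model. The lambdas below are stand-in stubs; only the
// dispatch structure is the point of this sketch.
using FreshnessModel = std::function<int(float meanMagnitude)>;

int scoreProduce(const std::string& fruitType, float meanMagnitude) {
    static const std::map<std::string, FreshnessModel> models = {
        // Stub thresholds, purely illustrative.
        {"banana", [](float m) { return m > 0.5f ? 100 : 50; }},
        {"apple",  [](float m) { return m > 0.7f ? 100 : 50; }},
    };
    auto it = models.find(fruitType);
    return it != models.end() ? it->second(meanMagnitude) : -1;
}
```

Adding a new produce type then means adding one table entry and its trained model, without touching the rest of the checkout flow.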
Another improvement I’m exploring is moving the entire web server directly onto the PSoC 6 AI Kit. Right now, the system uses an external ESP32 Feather to handle networking and serve the discount UI to a nearby tablet. In theory, the PSoC 6 has enough horsepower and networking capability to serve a simple web interface on its own, eliminating the need for a second microcontroller. Unfortunately, I ran out of time during the competition to make that integration work, but it’s high on my list of things to revisit.
I also want to expand the user interface, which is currently a lightweight web page served from the ESP32 to a nearby tablet. In the future, I could see that evolving into a full dashboard for store managers, with analytics, alerts for aging stock, and suggestions for dynamic pricing adjustments. And eventually, I’d love to integrate with checkout systems using NFC or barcodes, so customers can get discounts and info without any extra steps.
This project started with just five bananas and a radar sensor, but it's already opened up a lot of exciting possibilities for smarter, more sustainable retail. I'm looking forward to pushing it even further.