Okay. Here are the gist and the long story. Opposites, but both useful: the gist lets you gather the required data quickly, and the long story helps you perform the task properly.
https://docs.beagle.cc/boards/beagley/ai/demos/object-detection-tutorial.html
That link actually works and is up now for your viewing pleasure. The object detection source is on that page and mostly works, though you need to know what you are doing to handle the script and source.
Here is a small photo of the object detection working...
I actually tested it and found out I was half dog. But when I moved my hand out of the running USB camera's view and put my face in the way, I was almost 100% human.
I also picked up a mouse and showed it to the ole camera. The camera reported back, via its output tensors, that the mouse was a controller. This was all done on an HDMI screen, using the documented source and commands.
Now, onto the real deal!
So, Buildroot. Yeppers, it is a piece of technology that I have grown to hate and love simultaneously. Mostly, my frustration with Buildroot comes from my own failure to read beforehand what it is I need to be doing. But! I have a build of sorts, and it is building as I type. OKAY! It just finished.
Here are some commands to get you started:
1. Get Buildroot, e.g. clone the sources (git clone git://git.buildroot.net/buildroot) or grab a release tarball from buildroot.org.
2. Configure and cross-compile from the Buildroot source tree.
3. I used these commands for the BeagleY-AI from beagleboard.org (note that Buildroot takes the target architecture from the defconfig, so the ARCH=arm64 on the command line is not strictly needed):
a. make ARCH=arm64 beagley_ai_defconfig
b. make ARCH=arm64 beagley_ai_defconfig menuconfig
c. make ARCH=arm64
If, partway through building your legitimate BeagleY-AI image from Buildroot, the build stops and errors out, try these specific changes within menuconfig:
1. Only change one option at a time. At first I picked and changed a bunch of things and options and added arguments all at once, and that all seemed to fail for me.
2. Then I went simple and collective on my build: I set the BusyBox-specific configuration back to none, switched the toolchain to external (porting ARM's toolchain instead of Bootlin's), and picked systemd instead of the BusyBox set of init files. I also changed which shell my system would present.
3. I also added one library, libgpiod. I did not port the libgpiod2 source library because I know nothing about it (presumably).
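For anyone who would rather edit the config file directly than click through menuconfig, the choices above roughly correspond to a defconfig fragment like the one below. This is a sketch from memory, not the actual BeagleY-AI defconfig; symbol names can drift between Buildroot releases, so confirm each one inside menuconfig (the / key searches symbols):

```
# External toolchain: ARM's AArch64 toolchain instead of Bootlin's
BR2_TOOLCHAIN_EXTERNAL=y
BR2_TOOLCHAIN_EXTERNAL_ARM_AARCH64=y
# systemd as the init system instead of the BusyBox init files
BR2_INIT_SYSTEMD=y
# GPIO library plus its command-line tools (gpioset, gpioget, ...)
BR2_PACKAGE_LIBGPIOD=y
BR2_PACKAGE_LIBGPIOD_TOOLS=y
```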
Anyway, my build finished, and now when I boot my board I am hoping to handle some GPIO pins in the near future. As this is a work in progress, I will add context and content as I achieve or find avenues to handle the specifics: successfully building Buildroot, getting GPIO access, and maybe (since it is available) much more.
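Once the board is up, libgpiod's command-line tools are the quickest way to poke a pin. Here is a minimal sketch, assuming the v1-style tool syntax and assuming gpiochip0 and line offset 5 are usable on this board (both are assumptions; check gpiodetect and gpioinfo on the real hardware first). The commands are echoed so the script can be sanity-checked off-target; on the board, drop the echo and run them directly:

```shell
#!/bin/sh
# Sketch of toggling one GPIO line with libgpiod's CLI tools (v1 syntax).
# CHIP and LINE are assumptions -- verify them with `gpiodetect`/`gpioinfo`.
CHIP=gpiochip0
LINE=5

# Echo the invocations for a dry run; on the board, drop the `echo`.
echo "gpioset $CHIP $LINE=1"   # drive the line high
echo "gpioset $CHIP $LINE=0"   # drive it low again
echo "gpioget $CHIP $LINE"     # read the line back
```

libgpiod v2 renamed the tool options, so check `gpioset --help` on whatever version your Buildroot ended up building.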
Oh, and if you are new to Buildroot: it has a website, a Git repository (plus a GitHub mirror) to use alongside the build process, and Bootlin does a fine job of covering much of what the am335x offered to people and businesses, in PDF format, all reachable from the Buildroot website.
Getting the BeagleY-AI going has not been as easy as the am335x was with Bootlin's Buildroot PDF, until now. BeagleBoard.org has done all the heavy lifting and made a defconfig.
They also made a set of scripts to handle booting into a Buildroot instance on an embedded device. Okay! Well, the build has finished, and I need to test the micro SD card on the machine in question. Wish me luck.
Seth
P.S. More content is on the way, especially if I get my motors spinning. I am not expecting anything in the 200 MHz range, but with libgpiod-dev and its tools I may be able to move some stepper motors, and a bit quicker than in some of my other videos of moving parts. I always wanted to find a way to handle source in a manner that was both capturing and moving:
So, the camera sees, while my motor moves on command from what the camera is witnessing. Something along those lines, but with little assistance and more MOTORS!
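As a starting point for that motor idea, here is a sketch of one wave-drive (one-coil-on-at-a-time) full-step cycle for a 4-wire stepper, driven through libgpiod's gpioset. The chip name and the four line offsets are hypothetical; map them to your driver board with gpioinfo. It echoes the commands so the sequence can be checked off-target; pipe the output to sh on the board to actually step:

```shell
#!/bin/sh
# Sketch: wave-drive stepping of a 4-wire stepper via libgpiod's gpioset
# (v1 syntax). CHIP and the four line offsets are assumptions -- check
# `gpioinfo` for your actual wiring.
CHIP=gpiochip0
A1=4; B1=5; A2=6; B2=7   # hypothetical line offsets for the coil inputs

# One electrical revolution of the wave-drive sequence, energizing one
# coil at a time. Echo for a dry run; pipe to `sh` to drive real pins.
for pattern in "1 0 0 0" "0 1 0 0" "0 0 1 0" "0 0 0 1"; do
  set -- $pattern
  echo "gpioset $CHIP $A1=$1 $B1=$2 $A2=$3 $B2=$4"
done
# Dry-run output, one gpioset command per step:
#   gpioset gpiochip0 4=1 5=0 6=0 7=0
#   gpioset gpiochip0 4=0 5=1 6=0 7=0
#   gpioset gpiochip0 4=0 5=0 6=1 7=0
#   gpioset gpiochip0 4=0 5=0 6=0 7=1
```

Looping this with a short sleep between steps sets the (modest) step rate; a real driver board may also want a two-phase-on or half-step sequence for more torque.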