Exploring the possibilities with the Petoi Intelligent Camera Module
Part 1: Introduction to the Intelligent Camera Module
We have frequently featured the Petoi Bittle quadruped robot in our studies and discussions here. Our past posts have included an introduction, ways to program it with a mobile app, and a demonstration of the Bittle on an ice skating rink. In this series of tutorials, we will explore how equipping the Bittle with an Intelligent Camera module makes it much more fun and interesting.
Petoi Intelligent Camera Module
One of the little-known accessories available for this robot is the Intelligent Camera Module. It is essentially a 2 inch x 2 inch card with a camera and an ESP32 processor, and it brings some computer vision abilities to your Bittle robot, such as tracking a human or a ball. This unit, although resold by Petoi, is the same as the MU3 Vision Sensor, which you could also buy on Amazon (although I am not sure whether the one on Amazon would come with the correct cables to connect to your Bittle).
While the camera module can bring many interesting abilities to Bittle, buying it makes sense mainly for hobbyists who are familiar with (or want to learn) Arduino and can do a bit of simple C coding. This series of articles is designed to help beginner and intermediate users get the most out of the Intelligent Camera Module.
Getting this to work
Getting the module to work is a very straightforward process, thanks to the detailed instructions for connecting the hardware and to the Petoi Desktop App. Quite simply, you mount the camera module on Bittle’s mouth and snap Bittle’s head onto the module. The module is then wired to the NyBoard via the on-board I2C ports. The video on Petoi’s website is quite impressive and informative. Note that if you own the BiBoard V0 instead of the NyBoard, you need the BiBoard extension hat to connect the camera module.
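If you want to sanity-check the wiring before flashing new firmware, a generic Arduino I2C scanner sketch is an easy test. The sketch below is a minimal example of that idea and is not part of the OpenCat code; the 0x60 address mentioned in the comment is the MU sensor’s commonly documented default and should be treated as an assumption.

#include <Wire.h>

void setup() {
  Wire.begin();
  Serial.begin(115200);
  Serial.println("Scanning the I2C bus...");
  for (byte address = 1; address < 127; address++) {
    Wire.beginTransmission(address);
    if (Wire.endTransmission() == 0) {   // a device acknowledged at this address
      Serial.print("Device found at 0x");
      Serial.println(address, HEX);      // the MU sensor is expected to answer around 0x60 (assumed default)
    }
  }
  Serial.println("Scan complete.");
}

void loop() {}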
With the hardware installed, you need to flash new firmware to get the camera module to work (the default firmware has the camera support disabled). After connecting Bittle to your computer through the USB interface, you just need to choose the correct settings as in the figure below and upgrade the firmware. Avoid resetting any joints or altering the calibration if you are starting from a pre-calibrated unit.
Once you turn on your Bittle, it can now recognize a ball and a human body. The LED lights in front of the camera turn blue when an object is detected and red when no object is detected. Here is a short video demonstrating how Bittle adjusts its body to track a ball being moved in front of it.
Extending the capabilities via Arduino
While the above example is neat, you will soon get bored if you limit yourself to the firmware installed by the Petoi Desktop App. To get the full value out of the camera module, you should program it via Arduino and play with a camera-equipped Bittle running your own version of the firmware. The following example demonstrates some easy things you can try.
OpenCat Open Source Code
The firmware that drives Bittle is open source and available on GitHub. We assume that you can download the firmware, open it in the Arduino IDE, compile it, and upload it using the instructions on the GitHub home page. It is quite trivial once you get past the first hiccups and get the hang of the process. Note that you must download and install the MU camera library in Arduino to get this part of the code working.
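Before diving into OpenCat’s own camera handling, it can help to see the MU library in isolation. The sketch below is a minimal, standalone sketch that only reports ball detections over the serial monitor. It assumes the MU library’s MuVisionSensor class, the default I2C address 0x60, and the VISION_BALL_DETECT / kStatus / kXValue / kWidthValue identifiers that OpenCat’s camera code also relies on; if your library version names things differently, adjust accordingly.

#include <Wire.h>
#include <MuVisionSensor.h>            // the MU camera library installed via the Arduino IDE

MuVisionSensor Mu(0x60);               // 0x60 is the sensor's assumed default I2C address

void setup() {
  Serial.begin(115200);
  Wire.begin();
  if (Mu.begin(&Wire) != MU_OK) {      // initialize the sensor over I2C
    Serial.println("MU sensor not found - check the wiring.");
    while (true) {}
  }
  Mu.VisionBegin(VISION_BALL_DETECT);  // enable the ball-detection algorithm
}

void loop() {
  if (Mu.GetValue(VISION_BALL_DETECT, kStatus)) {   // non-zero when a ball is in view
    Serial.print("ball at x=");
    Serial.print(Mu.GetValue(VISION_BALL_DETECT, kXValue));
    Serial.print(", width=");
    Serial.println(Mu.GetValue(VISION_BALL_DETECT, kWidthValue));
  }
  delay(200);
}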
With respect to the camera module, the relevant code is in src/camera.h, located here. We have made an enhancement available in the open-source code, but it is left disabled by default. This enhancement simply has Bittle walk towards the object (such as a ball or a body) that it recognizes; if the object gets too close, it backs away. Here is a snippet of the code from camera.h:
#ifdef WALK
  if (width > 45 && width != 52)   // 52 may be a noise signal
    widthCounter++;
  else
    widthCounter = 0;
  if (width < 25 && width != 16) { // 16 may be a noise signal
    // the object is far away: walk towards it, turning if it is off-center
    tQueue->addTask('k', currentX < -15 ? "wkR" : (currentX > 15 ? "wkL" : "wkF"), (50 - width) * 50);
    tQueue->addTask('k', "sit");
    tQueue->addTask('i', "");
    currentX = 0;
  } else if (widthCounter > 2) {
    tQueue->addTask('k', "bk", 1000); // the robot will walk backward if you get too close!
    tQueue->addTask('k', "sit");
    tQueue->addTask('i', "");
    widthCounter = 0;
    currentX = 0;
  } else
To get this part of the code working, we simply uncomment the #define WALK line in camera.h so that the code above is activated. The variable width denotes the width of the detected object in the camera frame. Commands such as wkF and bk stand for walk forward and walk backward respectively. We then compile and upload the changed code and power on Bittle. Here is a demonstration of how Bittle now walks towards the object and then backs away when it gets very close.
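For reference, the change amounts to a single line near the top of camera.h (the exact location may differ between OpenCat releases):

#define WALK   // previously commented out; uncommenting it compiles in the walk/back-away behavior shown above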
More things to try
The fun doesn’t stop here. The camera module is rich in features and can recognize a variety of shapes, colors, numbers, and traffic-direction cards. The module comes with a deck of cards which you can use to aid the recognition. It can also recognize hand gestures, which is a very useful feature. We will be documenting a lot more interesting experiments with the camera module. If you are interested, please do subscribe and stay tuned for the next round of experiments.
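As a small taste of what these other modes look like in code, the sketch below swaps to number-card detection using the same MU library pattern as the earlier example. The VISION_NUM_CARD_DETECT and kLabel identifiers are my reading of the MU library’s headers; treat them as assumptions and check the vision-type definitions shipped with your installed library if the names differ.

#include <Wire.h>
#include <MuVisionSensor.h>

MuVisionSensor Mu(0x60);                    // assumed default I2C address

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Mu.begin(&Wire);
  Mu.VisionBegin(VISION_NUM_CARD_DETECT);   // enable number-card recognition (assumed constant name)
}

void loop() {
  if (Mu.GetValue(VISION_NUM_CARD_DETECT, kStatus)) {
    Serial.print("saw number card: ");
    Serial.println(Mu.GetValue(VISION_NUM_CARD_DETECT, kLabel));  // kLabel should hold the digit on the card
  }
  delay(200);
}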
Rongzhong from Petoi reached out to me and pointed out that the head joint in my video was not moving. I made another video after fixing the head joint. Here is the new video:
https://youtu.be/zpKtvxUw564