The power of programmable quadruped robots
See how Bittle is used for Structural Health Monitoring of bridges
Who is Bittle?
In my earlier articles, I have talked about my latest robot, the Petoi Bittle, which is a miniature but fully functional version of Boston Dynamics' Spot robot, at a price point of about $300 that makes it fairly accessible to consumers.
As more research builds up on how Bittle can be programmed and used, we get a feel for the powerful capabilities and industrial use cases of such robots.
Let's first discuss what Bittle is. Bittle is a quadruped robot (quadruped robots mimic animals with four limbs). Bittle has 8 joints on its legs, with a servo motor driving each joint, and is controlled by an Arduino-compatible motion controller. The easiest way to program a Bittle is to attach a Raspberry Pi to it. The Raspberry Pi can then be booted with Ubuntu or ROS, and one can write a Python program to control Bittle and make it perform different actions. Dmitry has a great article on how you can program Bittle with the Raspberry Pi.
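To give a flavor of what that looks like, here is a minimal sketch of a Python script on the Raspberry Pi sending ASCII skill commands to Bittle's motion controller over a serial link. The serial port, baud rate, and command tokens ("kbalance", "kwkF", "ksit", "d") reflect my understanding of the OpenCat-style protocol and should be treated as assumptions to verify against your own setup, not as the definitive interface.

```python
# Minimal sketch: drive Bittle from a Raspberry Pi over serial.
# Port, baud rate, and command strings are assumptions -- check your setup.
import time
import serial

PORT = "/dev/ttyS0"   # assumed UART device exposed to the Pi; may differ
BAUD = 115200         # assumed baud rate of the motion controller

def send(ser, command, settle=2.0):
    """Send one ASCII skill command and give the robot time to execute it."""
    ser.write((command + "\n").encode("ascii"))
    time.sleep(settle)

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    time.sleep(3)                 # let the controller finish booting
    send(ser, "kbalance")         # stand up and balance
    send(ser, "kwkF", settle=5)   # walk forward for a few seconds
    send(ser, "ksit")             # sit down
    send(ser, "d")                # rest / relax the servos
```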
Bittle was first launched through a Kickstarter campaign in 2020, and we can now see its adoption in research. A very interesting piece of research from Kay Smarsly's group at Hamburg University of Technology came out recently, in which researchers used Bittle to monitor structural issues in the Köhlbrand Bridge in Hamburg, Germany. Let us look at this research and what makes it unique in greater detail.
Structural Health Monitoring (SHM)
Structural Health Monitoring (SHM) is a widely employed technique used to detect infrastructure damage and deterioration early, thereby reducing the likelihood of catastrophic structural events such as a bridge collapse. With roads and bridges being used far more heavily than they were designed for decades ago, SHM plays a very important role in helping administrators decide when to retrofit or re-strengthen bridges. The researchers showed how Bittle can be equipped with: (1) an Analog Devices 3-axial ADXL355 digital-output accelerometer to collect acceleration data relevant to SHM of civil infrastructure, (2) an OmniVision OV5647 image sensor for navigation, and (3) synchronization and data processing across multiple nodes to achieve SHM.
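As a rough illustration of the sensing side, the sketch below polls an ADXL355 over I2C from a Raspberry Pi, the kind of acceleration sampling a robot like this would perform. The I2C address, register addresses, and scale factor are my reading of the ADXL355 datasheet and are assumptions to double-check, not code from the paper.

```python
# Sketch: read one X/Y/Z sample from an ADXL355 over I2C.
# Addresses and scaling are assumptions from the datasheet -- verify before use.
from smbus2 import SMBus

ADDR      = 0x1D    # default ADXL355 I2C address (0x53 if ASEL is pulled high)
POWER_CTL = 0x2D    # standby bit lives here; clearing it starts measurement
XDATA3    = 0x08    # X/Y/Z data: 3 bytes per axis, 20-bit left-justified

def to_signed_20bit(b3, b2, b1):
    """Combine three data bytes into a signed 20-bit acceleration sample."""
    raw = (b3 << 12) | (b2 << 4) | (b1 >> 4)
    return raw - (1 << 20) if raw & (1 << 19) else raw

with SMBus(1) as bus:
    bus.write_byte_data(ADDR, POWER_CTL, 0x00)       # leave standby, start measuring
    data = bus.read_i2c_block_data(ADDR, XDATA3, 9)  # X, Y, Z: 3 bytes each
    x = to_signed_20bit(*data[0:3])
    y = to_signed_20bit(*data[3:6])
    z = to_signed_20bit(*data[6:9])
    # roughly 256000 LSB per g in the +/-2 g range, per the datasheet
    print([round(v / 256000.0, 4) for v in (x, y, z)], "g")
```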
Building miDOG
First, let's take a look at how Bittle was configured into a unit that the researchers named miDOG for the experiment. The following figure is a snapshot from the research paper. You can see how the Raspberry Pi is connected to the Arduino board, and how the camera and accelerometer are fitted to Bittle's body. The trick here is to maintain a good balance of weight, so that Bittle can walk and run as desired and not fall over.
To evaluate miDOG, acceleration data from the existing smartBRIDGE Hamburg system, which consists of a dense array of high-precision sensors, was compared with acceleration data collected by the miDOGs. Two miDOGs were deployed for the experiment. The job of each miDOG was to navigate the interior of the bridge's steel box girder, overcoming the stiffener beams and plates that would hinder the motion of wheeled robots, locate the sensing locations, and then collect, synchronize, and analyze the acceleration data.
How does it work?
Each miDOG started at an initial deployment position. They walked along a line drawn on the surface of the bridge interior and stopped when they identified the visual markers. Once both miDOGs reached the sensing locations, they established a wireless local network to share the information that the sensing locations had been reached. Subsequently, the miDOGs took up a measuring posture (notice the crouched posture in the picture) and started to collect acceleration data, analyze it with a fast Fourier transform (FFT), and identify the peaks. A very neat video of how this was done is available here. It's really impressive to see how Bittle walked in a straight line carrying all that baggage over a rugged surface.
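To make that last analysis step concrete, here is a small sketch (not the authors' code) of the general idea: take a window of acceleration samples, compute its FFT, and pick out the dominant spectral peaks, which in SHM relate to the structure's vibration characteristics. The sampling rate and peak threshold are illustrative assumptions.

```python
# Sketch: identify dominant frequency peaks in an acceleration signal via FFT.
# Sampling rate and thresholds are illustrative assumptions, not the paper's values.
import numpy as np
from scipy.signal import find_peaks

FS = 1000.0  # assumed sampling rate in Hz

def dominant_frequencies(accel, fs=FS, n_peaks=3):
    """Return the strongest spectral peaks (Hz) of a 1-D acceleration signal."""
    accel = accel - np.mean(accel)                  # remove the DC offset
    spectrum = np.abs(np.fft.rfft(accel))           # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    peaks, props = find_peaks(spectrum, height=0.1 * spectrum.max())
    strongest = peaks[np.argsort(props["peak_heights"])[::-1][:n_peaks]]
    return sorted(freqs[strongest])

# Synthetic example: a 3 Hz + 11 Hz vibration buried in noise
t = np.arange(0, 10, 1.0 / FS)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 11 * t)
signal += 0.2 * np.random.randn(len(t))
print(dominant_frequencies(signal))   # expect peaks near 3 Hz and 11 Hz
```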
Evaluation
The researchers evaluated the miDOG framework in a laboratory setting and in the real-life bridge setting described above. In the lab setting, Bittle was placed on top of a multi-story shear-frame structure, and the readings from its accelerometer were compared to those of a single high-precision accelerometer attached directly to the frame. In the real-life setting, the results from miDOG were compared with data collected from the smartBRIDGE Hamburg system. While the results were great in the lab setting, they were not as good in the real-life experiment. As an example, one can see from the result below that the FFT outputs in the two cases look very different. The top image is the FFT result from the smartBRIDGE sensors; the bottom image is the FFT result from miDOG.
Success and further work
One success was that the miDOGs were able to reach the visual markers while walking on a rugged surface and crossing some obstacles. Thus, using quadruped robots definitely has a major advantage in terms of reducing the costs of and barriers to SHM. However, the researchers still need to prove that the data can be collected correctly, and they plan to do this by enhancing the sensors used in miDOG as well as the way the sensors are attached to Bittle's frame.