The two-story Open Oceans tank at the Pittsburgh Zoo & PPG Aquarium contains 100,000 gallons of salt water, 30 species of sea life – and one submersible robot, or Reefbot, named CLEO.
Young visitors to the exhibit pilot CLEO remotely from a control station, steering it around the tank and using its high-definition video camera to track fish and snap photos. By comparing CLEO's images with reference photos, visitors can identify each fish's species. In the process, the young explorers are helping researchers at the Robotics Institute develop software that might someday be used by scientists to automatically detect, classify and count fish in natural habitats.
Reefbot is a joint project of the zoo and the Robotics Institute, with funding through Spark, a program of The Sprout Fund. Ashley Kidd, an aquarist at the zoo, developed the idea and Justine Kasznica, a local business consultant for high-tech start-ups, managed the project. David Wettergreen, associate research professor of robotics, oversaw the project at the Robotics Institute, where PhD students Mark Desnoyer, Michael Furlong and Scott Moreland and senior research engineer John Thornton built the robot and developed the software.
Graduate students in the Human-Computer Interaction Methods course taught this fall by Bonnie John, professor in the Human-Computer Interaction Institute, designed the robot’s interface and the project website, http://reefbot.com/.
CLEO (Children Learning through Education and Observation) was built by adding an HD video camera and making other modifications to a commercially available submersible. Moreland said reliability was a major consideration. Most remotely operated vehicles (ROVs) used by oceanographers operate for only a few hours at a time and are routinely serviced between dives; CLEO, by contrast, will need to operate for a couple of weeks before scheduled servicing.
CLEO, about a foot and a half long, moves too slowly to chase fish or to damage itself, the tank or the sea life, Moreland said. Its navigation software includes safeguards to keep the tethered submarine from getting caught in crevices or caves or snagged on obstructions.
Desnoyer, whose doctoral thesis will focus on intelligent camera systems, led development of CLEO’s smart camera technology, which helps detect fish and may eventually be able to classify them automatically. Aquarium visitors who use CLEO to identify fish in the tank are helping to train the system.
Though humans are identifying the fish based on photos, what CLEO is learning in this process is a set of attributes that it can associate with particular species, Wettergreen explained. “It might be the size of a fin and the color of a tail that identifies one species and the pattern of stripes and the body shape that identifies another,” he said. “But in fact some of the attributes CLEO learns might not be things that people would even recognize, such as ratios of properties.”
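What Wettergreen describes amounts to supervised, attribute-based classification: visitor-labeled photos supply the training examples, and the software learns which measurable attributes, such as fin size, tail color, stripe patterns or body-shape ratios, distinguish one species from another. The short Python sketch below illustrates the general idea only; the attribute names, example species, feature values and choice of a scikit-learn random forest are illustrative assumptions, not details of CLEO's actual software.

```python
# Illustrative sketch of attribute-based species classification.
# The attributes, species and classifier are assumptions for illustration,
# not CLEO's actual implementation.
from dataclasses import dataclass
from typing import List
from sklearn.ensemble import RandomForestClassifier

@dataclass
class FishObservation:
    fin_size: float      # relative fin area (hypothetical attribute)
    tail_hue: float      # dominant tail color, 0-360 degrees
    stripe_count: int    # number of detected stripes
    body_aspect: float   # body length / height ratio

    def features(self) -> List[float]:
        # Include a derived ratio: the kind of attribute a learner can
        # exploit even though a person would not think to name it.
        return [self.fin_size, self.tail_hue, self.stripe_count,
                self.body_aspect, self.fin_size / self.body_aspect]

# Visitor-labeled examples: each photo's measured attributes plus the
# species the visitor chose from the reference photos (values invented).
labeled = [
    (FishObservation(0.12, 55.0, 0, 2.8), "yellow tang"),
    (FishObservation(0.20, 210.0, 5, 1.9), "sergeant major"),
    (FishObservation(0.15, 30.0, 3, 2.2), "clownfish"),
    # ... many more gathered as visitors pilot CLEO
]

X = [obs.features() for obs, _ in labeled]
y = [species for _, species in labeled]

classifier = RandomForestClassifier(n_estimators=100, random_state=0)
classifier.fit(X, y)

# A new detection from CLEO's camera can then be classified automatically.
new_fish = FishObservation(0.13, 50.0, 0, 2.7)
print(classifier.predict([new_fish.features()])[0])
```

In a setup like this, each photo a visitor identifies becomes another labeled example, so the classifier's accuracy improves as more people play with the robot.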
Wettergreen said scientists who study deep coral reefs might be particularly interested in the technologies being developed for CLEO. In contrast to corals that flourish in shallow, tropical waters, these deep reefs are difficult for human divers to study in detail, and they are threatened by rising ocean temperatures and changing ocean chemistry. ROVs that could autonomously identify and count the organisms living on deep corals could provide invaluable knowledge. Further funding will be necessary to pursue those goals, he added.