Source: Deep Learning on Medium
Drones, Cameras, and Deep Learning: Exploring Ways to Save the Coral Reefs
Coral reefs are some of the most valuable ecosystems on the planet. Though they cover less than 1% of the Earth’s surface, coral reefs support over 25% of all marine life. Humans also rely heavily on coral reefs: according to the WWF, they provide nearly $30 billion each year in goods and services, and an estimated one billion people depend on them in some capacity. Besides supporting fishing and tourism, coral reefs provide medicinal value through drugs developed from reef animals and plants, as well as coastal protection, since they act as a buffer that shields adjacent shorelines from waves and erosion. Yet despite these benefits, we continue to destroy them.
We’ve already lost or severely damaged over 30% of all coral reefs, and it’s estimated that over 90% will be destroyed by 2050. As of 2018, 50% of the coral on the Great Barrier Reef had been lost. Overfishing, ocean acidification, and bleaching driven by rising temperatures have been the major causes of the reefs’ demise, but there are ways we can monitor, protect, and regrow the reefs.
Both underwater and aerial drones are currently being used to monitor reefs, collect data, and even plant new coral and kill starfish! Commercially available drones are being repurposed with two types of cameras to monitor reefs: a high-resolution digital camera and a hyperspectral camera. Images are captured using the red, green, and blue bands of light and then used to build a 3D model of the reef on a computer. These cameras can image up to 10 feet below the surface, a depth range where many coral reefs reside. Because each component of the reef environment has its own spectral signature (including bleached versus unbleached coral), this imaging helps monitor the condition of the reefs.
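To make the spectral-signature idea concrete, here is a minimal sketch in Python. The reference spectra below are invented for illustration only (real hyperspectral pipelines use many more bands and field-measured signatures); each pixel is simply labeled with its nearest reference signature.

```python
import numpy as np

# Hypothetical RGB reflectance signatures for a few reef components.
# These values are illustrative placeholders, not measured data.
REFERENCE_SIGNATURES = {
    "healthy_coral":  np.array([0.35, 0.45, 0.20]),
    "bleached_coral": np.array([0.80, 0.85, 0.90]),
    "algae":          np.array([0.15, 0.55, 0.25]),
    "sand":           np.array([0.70, 0.65, 0.55]),
}

def classify_pixel(pixel):
    """Label a pixel by its nearest reference signature (Euclidean distance)."""
    return min(REFERENCE_SIGNATURES,
               key=lambda label: np.linalg.norm(pixel - REFERENCE_SIGNATURES[label]))

def classify_image(image):
    """Classify every pixel of an (H, W, 3) reflectance image."""
    return [[classify_pixel(px) for px in row] for row in image]

# A 1x2 toy image: one bright whitish pixel, one greenish pixel.
toy = np.array([[[0.79, 0.84, 0.91], [0.16, 0.52, 0.26]]])
print(classify_image(toy))  # [['bleached_coral', 'algae']]
```

Swapping in more spectral bands per pixel requires no structural change: the distance computation works on vectors of any length.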
As for aquatic drones, there’s the Queensland University of Technology’s RangerBot. According to its developer, Professor Matthew Dunbabin, it’s “the world’s first underwater robotic system designed specifically for coral reef environments, using only robot-vision for real-time navigation, obstacle avoidance, and complex science missions.” The drone is propelled by multiple thrusters and piloted by a person with a tablet-based controller, while computer vision and machine learning handle obstacle avoidance and real-time navigation.
In addition to navigation, the RangerBot’s computer vision is used to identify crown-of-thorns starfish (COTS), one of the largest starfish in the world and a predator of hard coral. RangerBot can identify COTS with 99.4% accuracy. Once a starfish is identified, the robot deploys an arm that injects it with bile salts, triggering necrosis and killing it within 48 hours with a 100% mortality rate. Before this drone, managing COTS on the reefs fell to human divers. RangerBot is far cheaper and far more efficient, as shown by the graphic below.
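The detect-then-inject behavior described above can be sketched as a simple control loop. Everything here is a stand-in: the confidence threshold is assumed, and the detector is stubbed out, since RangerBot’s actual model and decision logic are not public.

```python
CONFIDENCE_THRESHOLD = 0.95  # assumed value; the real threshold is not published

def detect_cots(frame):
    """Stand-in for RangerBot's vision model: return a COTS detection
    confidence in [0, 1]. Stubbed with a precomputed score here; the real
    system runs a trained vision model on each camera frame."""
    return frame.get("cots_score", 0.0)

def patrol(frames):
    """Decide, frame by frame, whether to deploy the injection arm."""
    actions = []
    for frame in frames:
        score = detect_cots(frame)
        actions.append("inject" if score >= CONFIDENCE_THRESHOLD else "continue")
    return actions

frames = [{"cots_score": 0.99}, {"cots_score": 0.40}]
print(patrol(frames))  # ['inject', 'continue']
```

A high threshold reflects the stakes of the task: injecting bile salts into a misidentified animal is far worse than skipping an ambiguous frame and re-detecting on the next pass.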
But RangerBot isn’t only good for monitoring the reefs and killing starfish. Aside from preventative action and data-gathering, the drone’s newest job is helping to promote new reef growth. Scientists are collecting hundreds of millions of coral spawn and developing them into baby corals or larvae. The drone is then used to spread the larvae into depleted regions of coral.
Until recently, only experts could measure reef health and changes over time from the images these drones gather. Now, deep learning models are being used to analyze reef photos automatically, driving down costs and significantly speeding up the process. Researchers from the University of California, Berkeley’s Artificial Intelligence Research Center and the University of Queensland’s Global Change Institute have unveiled a deep learning algorithm that analyzes photos 900 times faster than traditional methods while remaining just as accurate. The algorithm lets computers identify as many as 40 different categories of corals, sponges, algae, and other reef elements across roughly 225,000 reef images.
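A deliberately simplified sketch of the final step such a model performs: mapping a feature vector (produced by a convolutional backbone) to one of 40 categories via a softmax layer. The weights below are random placeholders rather than a trained model, and the feature and class dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES = 40    # corals, sponges, algae, ... (40 categories, per the article)
FEATURE_DIM = 128   # assumed size of the backbone's feature vector

# Stand-ins for a trained network's final classification layer.
# Real systems learn these weights; random values keep the sketch runnable.
W = rng.normal(size=(FEATURE_DIM, NUM_CLASSES))
b = np.zeros(NUM_CLASSES)

def softmax(logits):
    exp = np.exp(logits - logits.max())  # subtract max for numerical stability
    return exp / exp.sum()

def classify_patch(features):
    """Map one image patch's feature vector to a class index and confidence."""
    probs = softmax(features @ W + b)
    return int(probs.argmax()), float(probs.max())

patch_features = rng.normal(size=FEATURE_DIM)  # would come from a CNN backbone
label, confidence = classify_patch(patch_features)
print(label, round(confidence, 3))
```

The speedup the researchers report comes from exactly this property: once trained, a forward pass like the one above labels a patch in milliseconds, where a human annotator needs seconds to minutes per image.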
The more pictures that are taken and then analyzed by this algorithm, the faster scientists can identify struggling reefs and develop protections, regulate coastal activity, and design experiments to help save these reefs. Some of these experiments may involve growing coral in nurseries, and the data gathered can be used to determine where transplants have the best chance of surviving.