If an oil spill were to hit B.C.’s southern coast, threatening the local orca population, the Department of Fisheries and Oceans (DFO) could respond in a way that wasn’t technologically possible just two years ago, says Paul Cottrell.
For years, the marine mammal co-ordinator counted on a network of 18 hydrophones – underwater listening devices lining much of Vancouver Island – to detect calls of the endangered southern resident killer whales and track their movements in the Salish Sea.
But what if artificial intelligence could be harnessed to automatically detect the calls of that one particular subgroup of orcas around the clock? That was the pitch Google’s (Nasdaq:GOOG) artificial-intelligence division made to the DFO at a 2018 workshop in Victoria.
“The opportunity to work with such cutting-edge individuals and technology was amazing,” Cottrell said. “It was basically no cost – just the expertise to develop this tool.”
Using millions of hours of public YouTube videos, Google AI had already developed technology that could understand and identify sounds of everything from breaking glass to crying babies.
But that model had not yet been adapted to recognize animal calls, so the DFO provided Google AI with 1,800 hours of underwater sounds and 68,000 annotations to help it distinguish, for example, between a dolphin and an orca.
“We used this as training data for our machine-learning model; we showed other examples to our existing sound-understanding model and essentially adapted it to detect accurately the presence of an orca,” said Julie Cattiau, a product manager at Google AI. “It was definitely really fun and thrilling.”
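In broad strokes, that kind of adaptation means fine-tuning an existing audio model on the newly annotated hydrophone clips. The sketch below is illustrative only, not DFO or Google code: it assumes pre-cut clips labelled "orca" or "not orca" and stands a small spectrogram-based classifier in for the pretrained sound-understanding backbone the article describes.

```python
# Minimal sketch (assumed setup, not the actual DFO/Google pipeline):
# fine-tune a small audio classifier on annotated hydrophone clips.
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 16_000  # assumed hydrophone sampling rate
mel = torchaudio.transforms.MelSpectrogram(sample_rate=SAMPLE_RATE, n_mels=64)

class OrcaDetector(nn.Module):
    """Tiny CNN over mel spectrograms, standing in for the pretrained
    sound-understanding model mentioned in the article."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # orca call vs. other sound

    def forward(self, waveform):
        spec = mel(waveform).unsqueeze(1)    # (batch, 1, mels, time)
        feats = self.features(spec).flatten(1)
        return self.head(feats)

# One fine-tuning step on a hypothetical labelled batch.
model = OrcaDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

waveforms = torch.randn(8, SAMPLE_RATE * 2)  # 8 two-second clips (placeholder data)
labels = torch.randint(0, 2, (8,))           # 1 = annotated orca call

loss = loss_fn(model(waveforms), labels)
loss.backward()
optimizer.step()
```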
The tech giant allowed Cattiau and her team in Silicon Valley to spend about 20% of their working time on the side project over roughly 18 months as part of the company’s AI for Social Good initiative.
For the new model to be put to practical use, the DFO and Google AI partnered with non-profit group Rainforest Connection, which had developed a platform to detect the sounds of chainsaws revving up in the Amazon and alert authorities to illegal logging.
Now, when the hydrophones detect sounds that the AI technology identifies as the calls of the 73 southern resident killer whales, Cottrell and his team receive live alerts on the whales’ location through an app developed by Rainforest Connection.
The team can then review the sounds picked up by Google’s technology, categorizing each detection as accurate or false, which helps the technology learn.
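That review step is a human-in-the-loop feedback cycle: confirmed and rejected alerts become new labelled examples for the next round of training. The snippet below is a hypothetical illustration of that loop, not Rainforest Connection's actual app or API; the names and the reviewer function are assumptions.

```python
# Minimal sketch (assumed, not Rainforest Connection's API): fold the
# team's accurate/false reviews back into the training set.
from dataclasses import dataclass

@dataclass
class Detection:
    clip_id: str   # reference to the hydrophone audio segment
    score: float   # model confidence that the clip contains an orca call

def review_detections(detections, reviewer):
    """Ask a human reviewer to confirm or reject each alert and return
    newly labelled examples for the next round of fine-tuning."""
    new_labels = []
    for det in detections:
        is_orca = reviewer(det)  # True if the reviewer confirms a real call
        new_labels.append((det.clip_id, 1 if is_orca else 0))
    return new_labels

# Example: two alerts, one confirmed and one marked as a false positive.
alerts = [Detection("hydrophone07_0412", 0.93), Detection("hydrophone12_0415", 0.61)]
labels = review_detections(alerts, reviewer=lambda d: d.score > 0.9)
print(labels)  # [('hydrophone07_0412', 1), ('hydrophone12_0415', 0)]
```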
“That helps the program to continually get better, and it will probably be at one point much better than we are,” Cottrell said.
Now, in the event of an oil spill, he said, the DFO can be confident of the orcas’ location and work to change their direction of travel.
And in dangerous transit areas such as Active Pass, a narrow strait frequented by ferries and other vessels, the DFO can help divert traffic if killer whales are travelling through.
“It just reduces the vessel strike risk, which is huge. BC Ferries doesn’t want to be in Active Pass when killer whales … are there,” Cottrell said.
He said the next goal is to work with Google to see if the technology can distinguish between the three pods that compose the southern resident killer whales.
“There are a few humans that can do it but … [the technology] will just help us more,” Cottrell said.