People surf it. Spiders crawl it. Gophers navigate it. Now, a leading group of cognitive biologists and computer scientists want to make the tools of the Internet accessible to the rest of the animal kingdom.
Dubbed the Interspecies Internet, the project aims to provide intelligent animals such as elephants, dolphins, magpies, and great apes with a means to communicate with one another and with people online. And through artificial intelligence, virtual reality, and other digital technologies, researchers hope to crack the code of all the chirps, yips, growls, and whistles that underpin animal communication.
Oh, and musician Peter Gabriel is involved.
“We can use data analysis and technology tools to give nonhumans a lot more choice and control,” the former Genesis front man said at the inaugural Interspecies Internet Workshop, held in July in Cambridge, Mass. “This will be integral to changing our relationship with the natural world.”
Eighteen years ago, Gabriel visited a primate research center in Atlanta, where he jammed with two bonobos, a male named Kanzi and his half-sister Panbanisha. “It blew me away,” he says.
Gabriel eventually teamed up with Internet protocol co-inventor Vint Cerf, cognitive psychologist Diana Reiss, and Internet of Things pioneer Neil Gershenfeld to propose building an Interspecies Internet. Over the past six years, the architects of this Dolittlesque initiative have pursued two small pilot projects, one for dolphins and one for chimpanzees.
At her Hunter College lab in New York City, Reiss developed what she calls the D-Pad—a touch pad for dolphins. Joining forces with Rockefeller University biophysicist Marcelo Magnasco, she devised an optical-detection system in which images are projected from an underwater viewing gallery onto a glass panel while infrared sensors track where the animals make contact, creating an aquatic touch screen through which dolphins can play specially designed apps, including one dubbed Whack-a-Fish.
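The article doesn't describe the D-Pad's software, but the general recipe behind an infrared touch surface can be sketched: watch the glass with an infrared camera, threshold each frame to isolate bright contact blobs, and map the blob centroids onto the projected display's coordinates. The Python sketch below illustrates that generic approach; the resolution, threshold, and blob-size values are assumptions for illustration, not details of Reiss and Magnasco's system.

```python
# Illustrative sketch only: one generic way an infrared touch surface can turn
# camera frames into screen taps. The values here are assumptions, not D-Pad specs.
import cv2
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080   # assumed resolution of the projected display
MIN_BLOB_AREA = 50                # ignore small specks of sensor noise (assumed)

def detect_touches(ir_frame: np.ndarray, threshold: int = 200):
    """Return (x, y) display coordinates of bright IR blobs in a grayscale frame."""
    _, mask = cv2.threshold(ir_frame, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    h, w = ir_frame.shape
    touches = []
    for contour in contours:
        if cv2.contourArea(contour) < MIN_BLOB_AREA:
            continue
        m = cv2.moments(contour)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]   # blob centroid
        # Scale camera pixels to the projected screen's coordinate system
        touches.append((int(cx * SCREEN_W / w), int(cy * SCREEN_H / h)))
    return touches
```

A real installation would also need a calibration step (typically a camera-to-projector homography) that this sketch omits.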
Meanwhile, in the United Kingdom, Gabriel worked with Alison Cronin, director of the ape rescue center Monkey World, to test the feasibility of using FaceTime with chimpanzees.
The chimps engaged with the technology, Cronin reported at the workshop. However, our hominid cousins proved as adept at video-telephonic discourse as my 3-year-old son is at video chatting with his grandparents—which is to say there were a lot of silly games and not much meaningful conversation. The buggy, rudimentary attempt at interspecies online communication shows that building the Interspecies Internet will not be as simple as giving out Skype-enabled tablets to smart animals.
“There are all sorts of problems with creating a human-centered experience for another animal,” says Gabriel Miller, director of research and development at the San Diego Zoo.
Miller has been working on animal-focused sensory tools such as an “Elephone” (for elephants) and the “Joy Branch” (for birds), but designing effective interactive systems for other creatures is not easy. For the Interspecies Internet to succeed, Miller points out, “that will be super foundational.”
Researchers are making progress on natural-language processing of animal tongues. Through a nonprofit organization called the Earth Species Project, former Firefox designer Aza Raskin and early Twitter engineer Britt Selvitelle are applying deep-learning algorithms developed for unsupervised machine translation of human languages to fashion a Rosetta Stone–like tool capable of interpreting the vocalizations of whales, primates, and other animals.
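The article doesn't detail the project's models, but the core trick of unsupervised machine translation is to embed each “vocabulary” in its own vector space and then learn a mapping that lines the two spaces up, so that nearest neighbors across them become candidate translations. The Python sketch below shows just that alignment step on random toy embeddings, with a hypothetical set of anchor pairs; it illustrates the general technique, not the Earth Species Project's code.

```python
# Toy sketch of embedding-space alignment for unsupervised translation:
# learn an orthogonal map W so one space lines up with the other, then treat
# cross-space nearest neighbors as candidate "translations". All data here is
# random and the anchor pairs are hypothetical.
import numpy as np

def procrustes_rotation(src: np.ndarray, tgt: np.ndarray) -> np.ndarray:
    """Orthogonal W minimizing ||src @ W - tgt|| over paired anchor rows."""
    u, _, vt = np.linalg.svd(src.T @ tgt)
    return u @ vt

def nearest_neighbor(src_emb, tgt_emb, W, query_index):
    """Map one source vector into the target space; return its closest target row."""
    mapped = src_emb[query_index] @ W
    sims = (tgt_emb @ mapped) / (np.linalg.norm(tgt_emb, axis=1) * np.linalg.norm(mapped))
    return int(np.argmax(sims))

rng = np.random.default_rng(0)
animal_emb = rng.normal(size=(1000, 64))   # stand-in for vocalization embeddings
human_emb = rng.normal(size=(1000, 64))    # stand-in for word embeddings
anchors = np.arange(20)                    # hypothetical matched pairs
W = procrustes_rotation(animal_emb[anchors], human_emb[anchors])
print(nearest_neighbor(animal_emb, human_emb, W, query_index=3))
```

Published unsupervised-translation systems bootstrap those anchor pairs without any dictionary (via adversarial training or distribution matching) and then iterate a Procrustes-style refinement like the one above.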
Inspired by the scientists who first documented the complex sonic arrangements of humpback whales—a discovery that ushered in the modern marine conservation movement—Selvitelle hopes that an AI-powered animal translator can have a similar impact on environmentalism today.
“A lot of shifts happen when someone who doesn’t have a voice gains a voice,” he says.
Verification and validation remain a challenge with this sort of AI software. Normally, machine-learning algorithms are benchmarked against a human expert, but who is to say if a cybernetic translation of a sperm whale’s clicks is accurate or not?
According to primatologist Sue Savage-Rumbaugh, expertly trained bonobos could serve as bilingual interpreters, translating the argot of apes into the parlance of people, and vice versa.
Not just any trained ape will do, though. The animal has to grow up in a mixed Pan/Homo environment, as Kanzi and Panbanisha did. Unlike other research primates, the siblings grew up versed in both bonobo and human languages.
Panbanisha died in 2012, but Kanzi, nearly 39, is still going strong, and could in theory be recruited to check the accuracy of any Google Translate–like app for bonobo hoots, barks, grunts, and cries. And if wild bonobos in Central Africa can be coaxed to gather around a computer screen, Savage-Rumbaugh is confident Kanzi could communicate with them that way. “It can all be put together,” she says. “We can have an Interspecies Internet.”
That’s music to the ears of Jeremy Coller, a private equity specialist whose foundation partially funded the Interspecies Internet Workshop. There, the foundation announced the creation of the Coller Dolittle Prize, a US $100,000 award to help fund further related research. A working group also formed to synthesize plans for the emerging field, coordinate future events, and guide testing of shared technology platforms.
Why would a multimillionaire with no background in digital communications or psychology want to back the initiative? For Coller, the motivation boils down to empathy: “If I can have a chat with a cow,” he says, “maybe I can have more compassion for it.” —Elie Dolgin
An extended version of this article appears in our Tech Talk blog.