The rainbow looks different to a human than it does to a honeybee or a zebra finch. That’s because these animals can see colors that we humans simply can’t. Now scientists have developed a new video recording and analysis technique to better understand how the world looks through the eyes of other species. The accurate and relatively inexpensive method, described in a study published on January 23 in PLOS Biology, is already offering biologists surprising discoveries about the lives of different species.
Humans have three types of cone cells in their eyes. This trio of photoreceptors typically detects red, green and blue wavelengths of light, which combine into millions of distinct colors across the spectrum from 380 to 700 nanometers in wavelength—what we call “visible light.” Some animals, though, can see light with even shorter wavelengths (higher frequencies), called ultraviolet, or UV, light. Most birds have this ability, along with honeybees, reptiles and certain bony fish.
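As a toy illustration of the bands described above (not code from the study), the human-visible window and the UV region that birds and bees can also see might be bucketed like this; the exact cutoffs vary by species and are simplified here:

```python
# Illustrative only: rough wavelength bands from the article's description.
# Real animal sensitivities are smooth curves, not hard cutoffs.

def band(wavelength_nm: float) -> str:
    """Classify a wavelength (in nanometers) into a coarse visibility band."""
    if wavelength_nm < 380:
        return "ultraviolet: visible to most birds, bees, some reptiles/fish"
    if wavelength_nm <= 700:
        return "human-visible"
    return "infrared: beyond human vision"

print(band(350))  # falls below 380 nm, so: ultraviolet band
print(band(550))  # mid-spectrum green, so: human-visible
```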
But it’s difficult to document the moving world through these animals’ eyes. To capture such a wide range of light, cameras must sacrifice visual detail. Scientists can combine high-resolution photographs from multiple cameras tuned to different wavelengths or properties of light. They can also use spectrophotometry, a method that relies on specialized lab equipment to take many different measurements of a single object. Both of these methods are time-intensive, however, and work only on still images taken in highly controlled conditions. For biologists who study animal behavior, these still photographs aren’t enough. “A lot of times, the change of color is the important or interesting part of a signal,” says lead study author Vera Vasas, a biologist now at the University of Sussex in England.
To capture animal vision on video, Vasas and her colleagues developed a portable 3-D-printed enclosure containing a beam splitter that separates light into UV and the human-visible spectrum. The two streams are captured by two different cameras. One is a standard camera that detects visible-wavelength light, and the other is a modified camera that is sensitive to UV. On its own, the UV-sensitive camera wouldn’t be able to record detailed information on the rest of the light spectrum in a single shot. But paired together, the two cameras can simultaneously record high-quality video that encompasses a wide range of the light spectrum. Then a set of algorithms aligns the two videos and produces versions of the footage that are representative of different animals’ color views, such as those of birds or bees.
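The final step described above, turning aligned UV and visible footage into a human-viewable rendering of another animal's color world, can be sketched in miniature. This is not the authors' actual pipeline; it assumes the two streams are already aligned and uses one common false-color convention (shift each channel toward longer display wavelengths so UV becomes displayable), with hypothetical channel assignments:

```python
import numpy as np

def bird_false_color(rgb: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Combine an aligned visible frame and UV frame into a false-color view.

    rgb: (H, W, 3) array of red, green, blue intensities in [0, 1]
    uv:  (H, W) array of UV intensities in [0, 1]

    One simple display convention (an assumption, not the study's method):
    show green as red, blue as green, and UV as blue, discarding red,
    so UV information lands inside the human-visible range.
    """
    g, b = rgb[..., 1], rgb[..., 2]
    out = np.stack([g, b, uv], axis=-1)
    return np.clip(out, 0.0, 1.0)

# A patch that is dark to humans but UV-bright (like the caterpillar's
# appendages later in the article) renders as vivid blue in false color.
rgb = np.zeros((2, 2, 3))   # visibly black patch
uv = np.ones((2, 2))        # strongly UV-reflective
frame = bird_false_color(rgb, uv)
print(frame[0, 0])  # [0. 0. 1.] -- pure blue in the false-color view
```

Real systems fit per-pixel responses to measured cone sensitivities of the target species rather than simply remapping channels, but the display problem, squeezing four channels into three, is the same.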
The resulting video and data are useful for scientific research—for example, conservation work to develop bird-safe windows or to minimize the impacts of light pollution on insects. The setup can also build false-color reconstructions of the videos that approximate what it might look like to have this UV vision.
Capturing video in this way “fills a really important gap in our ability to model animal vision,” says Jolyon Troscianko, a visual ecologist at the University of Exeter in England, who wasn’t involved in the new research. He notes that in nature, “a lot of interesting things move,” such as animals that are engaging in mating dances or rapid defense displays. Until now, researchers studying these dynamic behaviors have been stuck with the human perspective.
Calibrated and tested against the gold standard of spectrophotometry, the new strategy offers nearly the same level of accuracy with far less work. “It’s shockingly accurate,” Vasas says. The technique is already revealing unseen phenomena of the natural world, she adds: for example, by recording an iridescent peacock feather rotating under a light, the researchers found shifts in color that are even more vibrant to fellow peafowl than they are to humans. Vasas and her colleagues also captured the brief startle display of a black swallowtail caterpillar and saw for the first time that its hornlike defense appendages are UV-reflective.
“None of these things were hypotheses that we had in advance,” Vasas says. Moving forward, “I think it will reveal a lot of things that I can’t yet imagine.”
Replicating the system from scratch with 3-D-printed materials and readily available commercial parts would cost just a few thousand dollars, estimates senior study author Daniel Hanley, an assistant professor of biology at George Mason University. In addition to enabling video, the team’s system is orders of magnitude cheaper than other cameras meant to capture UV light, he says. Plus, Hanley adds, those existing cameras are lower-resolution than what the new technique provides.
Other researchers are also eager to try out the method. “I can’t wait to get my hands on the video camera,” says Eunice Jingmei Tan, an assistant professor of ecology and evolutionary biology at the National University of Singapore. Tan studies the color displays and signaling behaviors of spiders and insects in Southeast Asian forests. Currently, she’s working to understand “motion masquerade,” or the way that some arthropods conceal themselves from predators by matching the color and movement of their surroundings. The new setup could be a big help, Tan says, if she’s able to replicate it.
There are some limitations: the camera system needs to be focused manually and has a limited frame rate, so it would be tough to follow especially fast-moving critters, Tan notes. Dim lighting conditions also pose a challenge, Hanley adds. And the method in its current state doesn’t capture every aspect of animal vision. Many animals, for instance, see polarized light or light in the infrared spectrum, which the system would need to be adjusted to perceive, he says.
Still, the researchers hope the technique will provide a unique glimpse into the animal world. There’s a rainbow of possibilities out there—and now we can envision it a little more clearly.