How Animals See

New research lets us glimpse the kaleidoscopic visual world of creatures great and small.

Orange-barred sulphur butterfly in UV light, captured in colors that animals can perceive.

Courtesy of Daniel Hanley

Daniel Hanley was four or five when he tried his first cup of plain yogurt with wheat germ. “It was an atypical snack,” he says, but he remembers loving it instantly. Hanley walked into his family’s backyard in upstate New York to eat his yogurt at the forest’s edge, where he was visited by two adult robins.

“They were approaching me, calling to me with small chirps,” he says. So Hanley chirped back. For 10 or 15 minutes, the boy and the birds kept up a spirited exchange. He says he was “really sure that they were trying to communicate with me.”

That was the moment when Hanley first recalls being fascinated by the minds of animals and their perceptual world. “I spent a long time wondering about how they experience such encounters,” he says. “What do they sense, perceive, and feel?”

Not quite four decades later, as a sensory ecologist at George Mason University and a National Geographic Explorer, Hanley remains just as enthralled with that question and the notion that “animals have value and wealth and their own perspectives and their own worldviews,” as he told me. He longs to know how an animal’s perception of its world impacts its judgments, behaviors, and decisions. Early in his career, he decided to focus much of his attention on bird eggs—the evolution of egg color and, more recently, cuckoo eggs in particular. These medium-sized birds lay their eggs in the nests of other species, forcing the host to raise the baby cuckoos, a phenomenon known as brood parasitism. Hanley wondered which features of a cuckoo’s egg so effectively dupe the host birds into thinking it’s one of theirs. The answer, he found, is color mimicry. “Many of the hosts are subjected to a fairly big cuckoo egg,” he says. “The size and the shape and all that stuff doesn’t matter near as much as color. So they’re really tied to this world of color.”

Hanley also chose to study bird eggs because they’re straightforward to work with and experiment on—after all, “they don’t get up and walk away.” But Hanley knew that a bunch of eggs lying around a nest is an overly simplistic reduction of the kind of scenario animals usually navigate. Their visual world is a dynamic and fluid one that’s constantly moving and evolving. An organism has to interact with all that shifting complexity every moment to stay alive. Imagine a tiger slinking through the forest; plants and flowers fluttering in the breeze and attracting pollinators; a bird-of-paradise strutting its feathers in a clearing; or a frilled lizard fanning out the fold of skin encircling its neck to startle a predator. When we use our naked human eye to watch any of these scenes unfold, we’re not seeing the whole picture. To understand what’s at stake and what’s really happening from an animal’s perspective, Hanley says we need to view the world through the organism's visual system.

This kind of picture was far more challenging for Hanley to capture, visualize, and interpret. “It’s really hard to wrap your head around what another animal can see,” he observes. His eggs would only get him so far.

Around this time, he connected with Vera Vasas, a computational biologist now at the University of Sussex who was facing a similar struggle. She had worked on building a computer model to understand how honeybees learn images and patterns. “The first question that I was supposed to put in the model,” she says, “[was:] OK, so what are the bees seeing? Then I went, ‘Oh shit.’ We don't actually know, do we?”


Previously, to understand what a scene might look like to a honeybee, researchers assembled images of flowers “from digital photos taken through a UV, a blue, and a green filter matching the spectral sensitivity of the bees’ photoreceptors.” This approach yielded tantalizing but fairly limited and entirely static glimpses into the insect’s view of its surroundings. “It’s very restricted,” Vasas says. “There’s a lot that you are just suspecting [is] out there,” but she had no way of seeing it for herself. What she wanted was a moving image—to see the world as a honeybee sees it, to know “how another animal [would] experience what I’m experiencing,” she says.

So Hanley and Vasas teamed up with a dozen colleagues to build something that had never been attempted successfully before—a digital video camera that could be pointed at a complex, dynamic scene and that would instantaneously relay back that same view as though seen through the eyes of a mammal, bird, or insect.


This lofty idea soon became mired in the reality of executing it—“a frantic struggle of difficulties,” according to Hanley. “It was quite a hard project. I feel as though [at] every step we had to relearn and we had to redo.”

“The whole camera,” says Vasas, “is more or less built on existing knowledge. But it had to be pulled together and unearthed and understood.” Between the selection of the optics, aligning the images, sorting out the focus, teasing apart what the camera was doing internally, finessing the components they wanted to 3D print, and more, Vasas says it felt like the project required an infinite number of iterations. “Every time, there was another thing [that] made it not work.”

But after years of trying, Vasas, Hanley, and their colleagues finally did it. They made a device to see what animals see.

Here’s how it works. Two consumer Sony cameras are positioned perpendicularly to one another. Light entering the system first hits a special piece of glass called a beam splitter. It routes ultraviolet light to a camera that’s sensitive to UV but allows visible light to travel to the second camera (that has a sensor for red light, another for green light, and a third for blue light). “Once you hit go,” says Hanley, “both cameras start rolling footage. And they’re both seeing the same thing that’s occurring in front of the lens at the same time.” The result is footage of the scene, split into four streams: one based on red light only, one on green light only, one on blue light only, and one on UV light only. If you know an animal’s sensitivity to different wavelengths of light, you can then compute the red, green, blue, and UV contributions to capture what that organism would see. And you can easily recompute the colors to render out the same scene according to the perspective of a different animal. There’s a fair bit of software that’s required to make the whole thing sing. “[Vera] worked her magic behind the scenes with Python,” says Hanley, referring to the computer programming language.
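That last rendering step—recomputing the four streams according to an animal’s spectral sensitivities—can be sketched as a simple linear recombination. The sketch below is illustrative only, not the team’s actual Python pipeline: the function name, the sensitivity numbers, and the assumption that a weighted sum of the streams approximates each photoreceptor’s response are all hypothetical.

```python
import numpy as np

def receptor_view(streams, sensitivities):
    """Combine the camera's four streams (red, green, blue, UV) into
    per-photoreceptor images for one animal.

    streams:       array of shape (H, W, 4), pixel values in [0, 1],
                   channels ordered R, G, B, UV
    sensitivities: array of shape (n_receptors, 4); row i weights each
                   stream's contribution to receptor type i
    Returns an array of shape (H, W, n_receptors).
    """
    # Weighted sum of the four streams at every pixel, clipped to [0, 1].
    view = streams @ sensitivities.T
    return np.clip(view, 0.0, 1.0)

# Hypothetical sensitivities for a UV-sensitive songbird with four
# receptor types -- illustrative numbers, not measured values.
songbird = np.array([
    [0.9, 0.1, 0.0, 0.0],   # long-wave receptor: mostly the red stream
    [0.1, 0.9, 0.1, 0.0],   # medium-wave: mostly green
    [0.0, 0.1, 0.9, 0.1],   # short-wave: mostly blue
    [0.0, 0.0, 0.1, 0.9],   # UV receptor: mostly the UV stream
])

frame = np.random.rand(480, 640, 4)        # stand-in for one video frame
bird_frame = receptor_view(frame, songbird)
print(bird_frame.shape)                    # (480, 640, 4)
```

Rendering the same scene for a different species would then only require swapping in that animal’s sensitivity matrix, which is what makes the approach so flexible.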

The results are stunningly kaleidoscopic and even psychedelic in some cases. For Vasas, it was “a relief being able to see what I have been trying to imagine before.” She says the camera allows her to skip a lot of the mental gymnastics to remember what should be visible. Now, “It’s just there.”


Check out this “bird’s eye view” of a museum specimen of the orange-barred sulphur butterfly, showcasing how brightly it reflects UV light, a wavelength to which most songbirds are sensitive. Here, it’s rendered purple and magenta to make it visible to us.


This next video shows three male orange sulphur butterflies (a different species) as a songbird might see them, with UV iridescence displayed as purple. Males display this iridescence as a key mating signal, one that humans miss altogether.


Here’s a video of two northern mockingbirds interacting in a tree as a songbird would view it. The UV signal has been overlaid as magenta to make it visible to the human eye.


“For all the animals that can perceive UV, the sky isn’t really blue,” says Vasas. “It’s a much brighter UV because of the way the sunlight gets scattered in the atmosphere.” As she and her coauthors state in the paper, “Thus, while the sky may appear blue to our eyes, it would appear UV-blue to many other organisms.” This is something Vasas says she could have looked up. But she found it arresting to watch this video and see the sky through the eyes of a bird—as a “glowing orb of UV,” to borrow Hanley’s words.

Look at this video of a black swallowtail caterpillar being coaxed into an anti-predatory display as a honeybee might see things. The UV, blue, and green light streams are shown as blue, green, and red, respectively.


Note the two-pronged “osmeterium” that emerges from just behind the caterpillar’s head, which it uses to ward off predators (in this case, the researcher’s pair of forceps). Before this camera, an osmeterium “really wasn’t something that was accessible,” says Hanley. “It’s [typically] inside their body, hidden. It’s soft. It’s only used in certain contexts and then it [goes] right back in.” Osmeteria contain chemicals that help defend the caterpillar. This video also shows how the organ reflects UV light (depicted as magenta), possibly serving as a visual warning as well.

Finally, here arrayed in a grid is how four different species would view a rainbow—a mouse (A), a honeybee (B), a bird (C), and a human (D). Not only do other species pick up the UV portion of the rainbow (which is invisible to us), but the mouse’s smaller number of different photoreceptors means it might view the same rainbow as being composed of fewer, broader bands.


“Our art teachers always said that there’s a difference between looking and seeing,” says Hanley. “When you actually see something, it’s different than when you just simply look at it. And rainbows are a good case study of this.”


This invention by Vasas, Hanley, and their colleagues could help researchers understand animal behavior and cognition using motion instead of still images—“the full display,” as Hanley puts it. “Within my discipline,” he says, “there’s all sorts of questions that start with some type of color that’s moving.”

Take camouflage. To appreciate how an organism actually strives to conceal itself, Vasas says you have to know what’s visible and invisible to both the camouflaging animal and the observer it’s hiding from. There are agricultural applications as well. For instance, we rely on bees as pollinators, but they’re not faring well globally. “Understanding what signals they detect and what it means to them,” says Hanley, might give us insights into why their colonies are collapsing.

Then there are the more imaginative applications. Hanley considers installing one of these cameras at a children’s museum where kids could toggle between, say, a snake’s view and a hamster’s view of the same scene. He says that kind of uninhibited exploration can unlock a deep and lasting curiosity about the natural world. Not unlike his early experience communicating with robins over yogurt, it gives young people an intuitive understanding that animals have their own way of perceiving and interacting with their surroundings—each one just as valid as the next. And Hanley, who’s collaborated with filmmakers before, sees this camera as a new, flexible tool that cinematographers can use to represent an animal’s visual perspective authentically in their movies. (These twin aims of rigorous science and artistic expression helped convince the National Geographic Society to fund the project.)

Ultimately, this invention reveals how arbitrary any visual scene really is, and could help end the long-standing habit of evaluating animal intelligence and acuity by how we understand reality, rather than by how other species do. “Whatever you see in the world is very specific to you,” explains Vasas. “Perception is deeper than just what’s happening at the photoreceptor,” adds Hanley.

“Maybe [this tool] can bring about some understanding of animals,” Vasas says, “knowing that what they see at the basic level is so different to us. They experience things differently. They think differently.”

“But,” she says, “that doesn’t necessarily mean that they are less than humans.” ♦
