
Did you know that some animals – including honeybees and some birds – can see ultraviolet (UV) light?

This is because the photoreceptors in their eyes are sensitive to wavelengths far beyond the range of human vision.

Plants and orange sulphur butterflies (Colias eurytheme), as they are seen by birds (in avian RNL false colors), with inset (bottom right) showing how humans see the same scene. Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 

What animals see has been reconstructed before, but never in motion. Traditional methods such as spectrophotometry are often extremely time-consuming and require specific lighting.

But researchers have developed a ground-breaking camera and software system that captures animal-view videos of moving objects under natural lighting conditions.

“Our new approach allows researchers and filmmakers to record animal-view videos that capture the interplay of colours over time,” says Dr Vera Vasas, research fellow in Ecology and Evolution at the University of Sussex in England.

“Now we can fully appreciate how much information we missed when we were photographing immobile objects in the lab. This new-found ability to accurately record animal-specific colours in motion is a crucial step towards our understanding of how animals see the world.”

How does it work?

The camera simultaneously records video in four separate colour channels: blue, green, red and UV. Based on existing knowledge of the different photoreceptors in the eyes of different animals, this data can be processed into “perceptual units” to produce an accurate video representation of a particular animal’s unique vision.
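To give a rough sense of what that transformation involves, here is a minimal Python sketch of turning four linearised camera channels into honeybee "quantum catches" with a weighting matrix. The channel layout and the weights are illustrative assumptions, not values from the published system.

```python
import numpy as np

# A single linearised frame with four camera channels: UV, blue, green, red.
# Random data stands in for real footage here.
height, width = 480, 640
frame = np.random.rand(height, width, 4)

# Hypothetical weights describing how much each camera channel contributes
# to each honeybee photoreceptor (UV, blue, green). A real pipeline would
# derive these from the measured spectral sensitivities of the camera
# sensors and the animal's receptors.
camera_to_bee = np.array([
    [0.90, 0.10, 0.00, 0.00],  # bee UV receptor
    [0.10, 0.80, 0.10, 0.00],  # bee blue receptor
    [0.00, 0.10, 0.70, 0.20],  # bee green receptor
])

# "Quantum catch" per receptor = weighted sum over the camera channels,
# giving a (height, width, 3) image in the bee's perceptual units.
bee_catches = frame @ camera_to_bee.T
```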

“The system works by splitting light between two cameras,” Vera says. “One camera has been modified to be sensitive to UV light, while the other is just a regular stock camera, sensitive to visible light. This separation of UV from visible light is achieved with a piece of optical glass, called a beam splitter. This optical component reflects UV light in a mirror-like fashion, but allows visible light to pass through just the same way as clear glass does. In this way the system can capture light simultaneously from the four distinct wavelength regions. We then use a series of algorithms to standardise our footage and transform it to representations of animal vision.”
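Because the UV and visible footage come from two physically separate sensors, the streams have to be brought into exact register before they can be combined. Purely as an illustration of that alignment step (the published pipeline may do this differently, for instance with a fixed calibrated transform), the sketch below registers a UV frame to the corresponding visible-light frame using OpenCV's ECC algorithm.

```python
import cv2
import numpy as np

def align_uv_to_visible(uv_gray, vis_gray):
    """Warp a UV frame onto the matching visible-light frame.

    Both inputs are single-channel float32 images of the same scene. An
    affine warp is estimated with OpenCV's ECC registration; a calibrated
    rig could instead reuse a fixed, pre-measured transform.
    """
    warp = np.eye(2, 3, dtype=np.float32)  # start from the identity warp
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    _, warp = cv2.findTransformECC(vis_gray, uv_gray, warp,
                                   cv2.MOTION_AFFINE, criteria, None, 5)
    h, w = vis_gray.shape
    return cv2.warpAffine(uv_gray, warp, (w, h),
                          flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
```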

The recording and video processing pipeline: Scenes are (1) projected to an internal beam splitter that reflects UV light and passes visible light to two independent cameras. This design eliminates the need for switching filters and so allows for the rapid collection of multispectral recordings (videos or images). Following data collection, users can use our pipeline to (2) align the recordings automatically. The recordings are (3) linearized and normalized automatically using the custom color card or a set of grayscales of known reflectivity. This step estimates the light captured by the camera sensors (camera catches, CC). Finally, the camera catches are (4) transformed to animal quantum catches (AC, in this case representing honeybee (Apis mellifera) vision), which can subsequently be (5) visualised as false colour images or videos (labeled as “bee”) by colouring the UV, blue, and green quantum catch images as blue, green, and red, respectively. These are compared to the composition of the linear images or videos (labeled as “human”). In this case, we demonstrate the pipeline using a black-eyed Susan (Rudbeckia hirta). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 
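Steps (3) and (5) of that caption can also be sketched in a few lines. The example below assumes the footage has already been linearised, and that the grey standards' measured pixel values and known reflectances are available; it fits a single gain per channel and then renders honeybee quantum catches in the false colours described above. It illustrates the general idea rather than reproducing the authors' code.

```python
import numpy as np

def normalise_channel(channel, grey_pixels, known_reflectance):
    """Roughly step (3): scale a linearised channel so that grey standards
    of known reflectance land on their expected values (single-gain fit)."""
    gain = np.dot(grey_pixels, known_reflectance) / np.dot(known_reflectance,
                                                           known_reflectance)
    return channel / gain

def bee_false_colour(bee_catches):
    """Step (5): display honeybee UV/blue/green quantum catches as a
    false-colour RGB image (UV shown as blue, blue as green, green as red)."""
    img = bee_catches / bee_catches.max()  # scale into [0, 1] for display
    return img[..., ::-1]                  # reorder so R=green, G=blue, B=UV
```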

The results

Here’s just a sample of the incredible videos already created with this new technology:

Plants and orange sulphur butterflies (Colias eurytheme), filmed as they are seen by birds (in avian RNL false colors). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 

Phoebis philea and Anteos sp. butterflies (museum specimen), filmed as they are seen by birds (in avian RNL false colors). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 

A zebra swallowtail butterfly (Protographium marcellus) and flowers, filmed as they are seen by honeybees (Apis mellifera). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 
An unidentified jumping spider, filmed as it is seen by honeybees (Apis mellifera). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 
Bees in their natural environment, filmed as they are seen by honeybees (Apis mellifera). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 
A northern mockingbird (Mimus polyglottos) in a tree, filmed as seen by birds (in avian RNL false colors). Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0 

Just the beginning

The result is footage that is not only fascinating to look at, but also has multiple real-world applications.

“We have a number of ideas that we are planning to address with our camera,” says Vera. “You could image the iridescent mating displays of birds, and how they appear to the intended audience of females. The camera can also be used for fast digitisation of museum specimens.

“We are also excited about some welfare aspects,” she adds. “We can evaluate the visibility of UV-absorbing stickers on windows, for example, which are intended to be visible to birds, reducing the number injured or killed by accidentally striking glass.

“But the most exciting questions will be those we have yet to consider. Only now that we have started taking videos of the natural world are we beginning to see how much information is out there.”


Related: The science behind ant vision