We present a non-destructive method for sampling spatial variation in the direction of light scattered from structurally complex materials. By keeping the material intact, we preserve gross-scale scattering behavior while capturing fine-scale directional contributions with high-resolution imaging. Results are visualized in software at biologically relevant positions and scales.
Light interacts with an organism’s integument on a variety of spatial scales. For example, in an iridescent bird: nano-scale structures produce color; the milli-scale structure of barbs and barbules largely determines the directional pattern of reflected light; and through the macro-scale spatial structure of overlapping, curved feathers, these directional effects create the visual texture. Milli-scale and macro-scale effects determine where on the organism’s body, and from what viewpoints and under what illumination, the iridescent colors are seen. Thus, the highly directional flash of brilliant color from the iridescent throat of a hummingbird is inadequately explained by its nano-scale structure alone, and questions remain. From a given observation point, which milli-scale elements of the feather are oriented to reflect strongly? Do some species produce broader “windows” for observation of iridescence than others? These and similar questions may be asked about any organisms that have evolved a particular surface appearance for signaling, camouflage, or other reasons.
In order to study the directional patterns of light scattering from feathers, and their relationship to the bird’s milli-scale morphology, we developed a protocol for measuring light scattered from biological materials using many high-resolution photographs taken with varying illumination and viewing directions. Since we measure scattered light as a function of direction, we can observe the characteristic features in the directional distribution of light scattered from that particular feather, and because barbs and barbules are resolved in our images, we can clearly attribute the directional features to these different milli-scale structures. Keeping the specimen intact preserves the gross-scale scattering behavior seen in nature. The method described here presents a generalized protocol for analyzing spatially- and directionally-varying light scattering from complex biological materials at multiple structural scales.
The color and pattern of an organism’s integument serve ecologically and socially critical functions in most animal taxa. These phenotypic properties are determined by the interaction of light with the structure of the integument, which can exhibit optical scattering that varies both spatially (across the surface of the integument) and directionally (with change in lighting and viewing direction). In complex biological materials, such as feathers, the direction of light scattering is influenced by the orientation of repeating milli-scale geometry. These milli-scale structures may themselves be embedded with nano-scale structures, such as melanin arrays, which often inherit the milli-scale orientation. From nano- to macro-scales, the structure of the integument has evolved functionally to increase the signaling capability of the organism. To assess the influence of morphology at different scales upon the overall appearance, tools to measure and analyze the color of biological structures need the flexibility to isolate directional light scattering at various scales of magnification.
We developed image-based measurement tools to study how the performance of a feather’s complex and varied milli-scale morphology (barb rami, distal barbules, and proximal barbules) expands the range of expression possible from nano-scale structures alone. In a single image recorded by the camera, we observed that light reflected differently at different locations on the surface of the feather, that is, light reflectance was spatially-varying. When we moved the light and camera direction with respect to the feather, we observed the reflectance changed, that is, light reflectance was directionally-varying1. Following these observations, we designed a protocol to methodically move the light and camera around the subject using a spherical gantry2,3, with which we captured 2 dimensions of surface position (X and Y), 2 dimensions of light direction (latitude and longitude), and 2 dimensions of camera direction (latitude and longitude) (Figure 2). In software we visually explored the 6 dimensions of the scattered light as a function of position, illumination direction and view direction.
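As an illustration, the 4D directional sampling described above amounts to the Cartesian product of camera poses and light poses, with each photograph supplying the remaining two spatial dimensions (pixel X and Y). The function and toy grid below are hypothetical sketches, not the authors' acquisition code:

```python
import itertools

def gantry_poses(camera_dirs, light_dirs):
    """Enumerate every (camera, light) gantry pose in the 4D direction grid.

    camera_dirs and light_dirs are lists of (latitude, longitude) tuples in
    degrees. Each returned pose is one measurement configuration; the image
    captured at a pose adds the two spatial dimensions of the 6D data set.
    """
    return list(itertools.product(camera_dirs, light_dirs))

# Toy example: 3 camera directions x 6 light directions = 18 poses.
cameras = [(0, 0), (30, 0), (60, 0)]
lights = [(lat, lon) for lat in (0, 30, 60) for lon in (0, 180)]
poses = gantry_poses(cameras, lights)
```

In practice the light grid is far denser than the camera grid, since moving the camera requires the extra rectification steps described below.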
Previous research into the reflectance from integuments has too frequently discounted the contribution of directionality — e.g. diffuse vs. specular or isotropic vs. anisotropic reflection — to color expression. Most color measurements have fixed the incident light, object, and viewing geometry to carefully avoid directional effects. For instance, to eliminate specular reflection from color measurements, it is common to place the light normal to the surface and record the reflectance at 45° from the normal. Studies that do link morphology to directionally-varying reflectance typically focus on the nano-scale and its iridescent consequences4-8. Few consider the contribution of micro-, milli-, and macro-scale geometries to the far-field optical signature8-11. It is therefore common to employ a light detector to aggregate reflectance across a single area of interest that may include multiple milli- and/or macro-scale components, such as barb rami, barbules, and even entire feathers6,8,11-17. When the region of interest is either smaller than the resolution limit of the detector or does not conform to the shape of the detector’s field of view, the common protocol specifies specimen dissection to isolate the light scattering from the specific milli-scale element8,10,13,15.
We have developed a more encompassing protocol for measurement acquisition and visualization that encourages exploration of the many variables often ignored in other more focused studies. We measure light scattering over a sphere of directions and across a region of space using a massive set of high-dynamic range, high-resolution photographs taken from a systematic set of light and viewing directions. We employ a high-resolution imaging sensor with its 2D array of fine-scale pixel detectors. Aggregation in hardware occurs at the pixel level, at a scale smaller than the milli-scale elements we are measuring. A second stage aggregates individual pixels in software as the user selects the shape and size of the region of interest. Accordingly, a single measurement set can be repeatedly analyzed in software to explore different aspects of light interaction with the material at multiple biologically relevant positions and scales. By eliminating dissection and measuring the entire feather, our protocol has the advantage of leaving the morphology of the feather vane intact, retaining natural context and function, that is, light’s interactions among constituent milli-scale elements.
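The software-side aggregation stage can be sketched as averaging a user-selected pixel region across the stack of HDR images, one image per light direction. The 25 × 25 pixel region mirrors the region size used in Figure 5, but the function name and array layout below are our own assumptions:

```python
import numpy as np

def roi_reflectance(hdr_images, x, y, half):
    """Average a square region of interest in each HDR image.

    hdr_images: array of shape (n_directions, height, width, 3), one HDR
    image per incident light direction. Returns one RGB reflectance value
    per direction, averaged over a (2*half+1)-pixel-square region centered
    at pixel (x, y).
    """
    region = hdr_images[:, y - half:y + half + 1, x - half:x + half + 1, :]
    return region.mean(axis=(1, 2))

# Toy data: 4 light directions, 32x32-pixel images, constant reflectance.
stack = np.ones((4, 32, 32, 3))
avg = roi_reflectance(stack, x=16, y=16, half=12)  # 25x25 region
```

Because aggregation happens in software, the same measurement set can be re-analyzed with point, line, or rectangular regions of any size without re-measuring the specimen.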
Light scattering from organismal structure is multidimensional and difficult to quantify. Measured 6D light scattering cannot as yet be attributed to specific morphology within a hierarchy of scale by any single instrument, but we have made an important step in this pursuit. We have developed a tool encompassing three complementary methods — sampling reflectance using the gantry, exploring large data volumes in software, and visualizing data subsets graphically — to extend our ability to measure 6D light scattering at any point on a material, down to the milli-scale. As protocols such as ours are employed, we predict biologists will identify a myriad of directionally- and spatially-varying traits and corresponding structural adaptations at multiple scales of development. Using our tools we are engaged in characterizing the signaling potential of the directional and spatial expression of milli-scale structures, and hope to shed light on their adaptive consequences. We address a range of questions, such as: from any given observation point, which fine-scale elements or gross-scale regions of the feather reflect strongly? How does the orientation of the fine-scale elements influence the direction of scattered light? What morphological conditions produce a satiny gloss vs. a sequined sparkle of the iridescent ornament? Do some species produce broader “windows” for observation of iridescence than others? These questions may be asked about birds and their feathers but also about any other organisms that have evolved a particular surface appearance for signaling, camouflage, or other reasons.
When using our methods to measure a sample, the experimenter must decide on a set of camera and light directions, and for each combination of camera and light directions, the camera makes several exposures with different shutter speeds. Moving the camera requires additional processing, because it changes the view of the sample as seen in the image, so we normally use a small number of camera directions and a larger number of light source directions.
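The several exposures taken at each pose can be merged into one high-dynamic-range radiance image by the standard weighted-average scheme sketched below. This is a generic HDR merge assuming linear pixel values in [0, 1], not the authors' exact pipeline:

```python
import numpy as np

def merge_hdr(exposures, times, low=0.05, high=0.95):
    """Merge bracketed exposures into a single radiance image.

    exposures: list of linear images (one per shutter speed), values in
    [0, 1]; times: matching exposure times in seconds. Each well-exposed
    pixel votes with its value divided by its exposure time; near-black
    and near-saturated pixels are excluded by the validity mask.
    """
    num = np.zeros_like(exposures[0], dtype=float)
    den = np.zeros_like(exposures[0], dtype=float)
    for img, t in zip(exposures, times):
        valid = ((img > low) & (img < high)).astype(float)
        num += valid * img / t
        den += valid
    return num / np.maximum(den, 1e-9)

# A scene of constant radiance 2.0 photographed at 1/10 s and 1/5 s
# produces pixel values 0.2 and 0.4; the merge recovers the radiance.
a = np.full((2, 2), 0.2)
b = np.full((2, 2), 0.4)
radiance = merge_hdr([a, b], [0.1, 0.2])
```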
In the detailed protocols below, we first describe how to perform a measurement with many light source directions and a single camera direction, and how to process and visualize the resulting data (Protocol 1). In the primary protocol, which can be used by itself when a single view is sufficient to observe the phenomena being studied, we always keep the camera view perpendicular to the sample (Primary Routine in Figure 1). When multiple camera directions are required, the resulting oblique views of the sample can be warped to undo the effects of moving the camera and thereby to align the images exactly with the canonical perpendicular view. To compute these warps, we perform additional calibration steps that use observations of targets placed around the sample to precisely determine the motion of the camera relative to the sample. Protocol 2 details this calibration procedure and explains how to select parameters and run Protocol 1 multiple times to gather data from multiple views (Secondary Routines in Figure 1). Finally, Protocol 3 details the additional steps that must be inserted into Protocol 1 to rectify the oblique views during data processing.
1. Measure Scattered Light in the Direction of the Surface Normal over the Sphere of Incident Directions (Primary Routine in Figure 1)
2. Measure Scattered Light in Multiple Camera Directions (Secondary Routines in Figure 1)
Multiple camera views and non-uniform directional sampling allow us to study particular features of the directional reflectance. With the addition of calibration Steps 2.A and 2.B, Protocol 1 has been expanded to handle multiple camera views. Two specific examples, graphically illustrated as Secondary Routines II.A and II.B in Figure 1, are set forth in Steps 2.C and 2.D below. In such cases, the camera direction is altered from its canonical direction (normal to the surface), meaning that the object is photographed from a direction inclined from its surface normal. Since all images must be mapped into the same coordinate system, we rectify and warp each photograph to match the canonical orientation by referencing the flash-photographed targets surrounding the sample (Figure 9).
3. Projective Transformation
Projectively transform each HDR image into the canonical view, i.e., the view direction orthogonal to the surface plane. This protocol is accessed by Step 1.E.3.b when a measurement run is part of a multiple-camera-direction set, such as the examples outlined in Protocol 2 and graphically illustrated as Secondary Routines in Figure 1.
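The rectifying warp can be illustrated with the standard direct linear transform (DLT): four or more target correspondences determine the 3 × 3 homography that maps an oblique view onto the canonical one. The sketch below is a generic DLT on point correspondences, not the Bouguet-toolbox calibration used in this protocol:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit the 3x3 projective matrix H mapping src points to dst points.

    src, dst: arrays of shape (n, 2) with n >= 4 correspondences, e.g.
    target positions in the oblique view and in the canonical view. Solved
    as the null space of the standard DLT system via SVD.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def apply_homography(h, pts):
    """Map (n, 2) points through H, including the perspective divide."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ h.T
    return p[:, :2] / p[:, 2:3]

# Example: corners of a unit square mapped to a scaled, translated square.
src_pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
dst_pts = src_pts * 2 + np.array([2.0, 1.0])
H = fit_homography(src_pts, dst_pts)
```

Warping the full image then amounts to resampling each canonical-view pixel through the inverse of H.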
a Dcraw is an open-source computer program developed by David Coffin. It converts a camera’s proprietary RAW-formatted image (i.e. unprocessed CCD data) to a standard image format. See http://www.cybercom.net/~dcoffin/dcraw/.
b Bouguet Toolbox is a camera calibration toolbox for MATLAB developed by Jean-Yves Bouguet. See http://www.vision.caltech.edu/bouguetj/calib_doc.
The primary measurement of our protocol (Routine I in Figure 1) fixed the camera direction normal to the surface and moved only the light. Since light scattering adheres to the principle of reciprocity, the result is the same whether we hold the camera constant while moving the light over the hemisphere or vice versa. When we fix either the camera or the light, the complete 4-dimensional direction set is undersampled. A fuller picture of the scattering behavior is observed when, unlike the primary measurement, both light and camera are moved away from the surface normal and in a multiplicity of directions. Ideally, we would measure light scattering from many camera directions, even as many as the number of incident light directions, to yield a symmetrical data set. In practice, this would require far too many exposures. In our experience, we can obtain sufficient information about different viewing positions by moving the camera a few times, assuming 180° rotational symmetry about the surface normal. During the secondary measurement phase, we acquired measurements from 7 viewing directions distributed over the hemisphere and within 60° of the zenith18,19 (Routine II.A in Figure 1).
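For concreteness, the seven viewing poses listed in Figure 5 can be converted to unit vectors and checked against the 60° zenith bound. This is a small sketch; the convention of z along the surface normal and (elevation from normal, azimuth) angles is our assumption:

```python
import math

def direction_vector(elevation_deg, azimuth_deg):
    """Unit direction for a gantry pose given as (angle from the surface
    normal, azimuth about the normal), both in degrees; z is the normal.
    """
    e = math.radians(elevation_deg)
    a = math.radians(azimuth_deg)
    return (math.sin(e) * math.cos(a),
            math.sin(e) * math.sin(a),
            math.cos(e))

# The seven (elevation, azimuth) viewing poses from Figure 5, all within
# 60 degrees of the zenith.
views = [(0, 0), (30, 0), (30, 90), (60, 0), (60, 45), (60, 90), (60, 135)]
vectors = [direction_vector(*v) for v in views]
```

Restricting azimuths to one half of the hemisphere reflects the assumed 180° rotational symmetry about the surface normal.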
In the figures of this paper, we show representative data measured from a feather of Lamprotornis purpureus (Purple Glossy Starling), the reflectance of which is iridescent, glossy, and anisotropic (Figure 5). In each of the 7 viewing directions, reflected light is gathered from hundreds of incident lighting directions on the hemisphere. The directions form a narrow band oriented orthogonally to the central axis of the feather (see feather image in Figure 4). The iridescence color shift is subtle (bluish-green at normal incidence and greenish-blue at grazing incidence) when the feather is viewed normal to its surface, as seen in the {0°,0°} RGB plot of Figure 5. As the viewing angle approaches grazing, the angles between the viewing direction and the grazing incident directions are maximized, leading to a more striking color shift (bluish-green at 0° and magenta at 240° between incident and viewing directions), as seen in the {60°,0°} RGB plot in Figure 5.
We can afford to step the light and camera at much finer angular resolution when we restrict the movements to 1 dimension. Figure 6 shows the chromaticity of the reflectance of L. purpureus plumage as a function of the angle between the incident and viewing directions, where the incident and viewing directions are in the plane containing the specular band, which is perpendicular to the longitudinal axis of the distal barbule. As the iridescent color arcs through chromaticity space, the hue shifts from bluish-green to purple.
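The chromaticity coordinates used here follow the CIE 1976 uniform chromaticity scale, whose (u′, v′) values derive from tristimulus XYZ by standard colorimetric formulas, sketched below; this is textbook colorimetry, not code from the measurement pipeline:

```python
def uv_prime(X, Y, Z):
    """CIE 1976 UCS chromaticity coordinates (u', v') from tristimulus
    values: u' = 4X / (X + 15Y + 3Z), v' = 9Y / (X + 15Y + 3Z).
    """
    d = X + 15 * Y + 3 * Z
    return 4 * X / d, 9 * Y / d

# Equal-energy white (X = Y = Z) as a sanity check.
u, v = uv_prime(1.0, 1.0, 1.0)
```

Plotting (u′, v′) for each incident/viewing angle pair traces the arc through chromaticity space described above.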
Spatial variation in the directional reflectance is visible where different (X,Y) coordinates of the integument correspond to different milli-scale structures. In the case of L. purpureus, only one structure — the distal barbule — is visible over most of the area. By contrast, in C. cupreus, three milli-scale structures — the rami, distal barbules, and proximal barbules — are clearly distinguished in the data; we can observe that reflectance from the feather is oriented with respect to the longitudinal axis of each structure (Figure 8).
Figure 1. This schematic overview depicts two mounting methods, the spherical gantry coordinate system, types of acquisition sampling and their respective results.
Figure 2. The flattened feather is visible through an aperture in a metal plate surrounded by a ring of targets. A spherical gantry can be posed to measure light scattering from a feather at multiple incident lighting and viewing directions. L=Light arm (latitude). C= Camera arm (latitude). B=Camera Base (longitude). T=Turntable (longitude). F=Feather.
Figure 3. Average directional scattering may be computed from a point, line or rectangular region of feather vane.
Figure 4. Example of directional scattering plotting functions (R*=Reflectance, T*=Transmittance, P*=Top, F*=Front, S*=Side, A*=Arbitrary) and color schemes (*1=Luminance, *2=RGB, *3=Chromaticity).
Figure 5. The luminance (top) and RGB color (bottom) of the hemispherical reflectance in direction cosine space as viewed from the (elevation angle, azimuth angle) coordinate pairs: {0°,0°}, {30°,0°}, {30°,90°}, {60°,0°}, {60°,45°}, {60°,90°}, and {60°,135°}. The reflectance is averaged from a 25×25 pixel rectangular region of the lateral vane of a tertial L. purpureus (Purple Glossy Starling) feather. The red arrows represent camera directions.
Figure 6. Chromaticity of the reflectance as a function of the half-angle between the incident lighting and viewing directions: CIE 1976 Uniform Chromaticity Scales (UCS) with magnified region.
Figure 7. Reflectance as a function of the angle between the incident lighting and viewing directions, in-plane with (red) and perpendicular to (shaded) the longitudinal axis of the distal barbule: (A) Dominant wavelength, (B) Percent chroma, (C) Percent luminance. The color shading in plot A is the RGB color of the reflectance. Negative wavelength values represent colors in the non-spectral purple triangle.
Figure 8. Average directional reflectance of distal barbules and proximal barbules between two adjacent rami of the C. cupreus (African Emerald Cuckoo).
Figure 9. (A) Non-rectified image illuminated by gantry lamp, (B) Non-rectified image illuminated by flash on camera, (C) Filtered target candidates on affine-transformed, flash-illuminated image, (D) Acceptably sharp targets within depth of field, (E) Rectified lamp-illuminated image, (F) Rotated feather tip up, cropped and masked.
Though the performance and function of many pigmentary and structural colorations are well recognized, the morphology of many integuments is so complex that their structural detail and function are poorly understood20. Integuments have developed specializations that vary spatially over the surface of the organism to differentially reflect light directionally toward the viewer. Directionality has received attention primarily in the study of iridescence due to its color shift with change of incident and viewing angle, and research into the iridescence of biological integuments has produced primarily 1D and some 2D measurements8,12,17. But generalized 6D measurements have not been routine in the study of integuments21-23, iridescent or otherwise, and the literature on organismal color phenotypes is constrained by the lack of directional color data of the type our method provides.
The feather is an especially rich integumentary material comprising arrangements of the milli-scale structures of the barb: rami, distal barbules, and proximal barbules. The small scale of the elements and their complex arrangements make it difficult to discern the light-scattering performance of the individual elements. Our protocol successfully isolated milli-scale structure from the influence of macro-scale geometry. By characterizing the functional contribution of the directional expression of milli-scale structures to the far-field signature of the feather, we enabled inquiry into their adaptive consequences.
We faced practical tradeoffs between spectral, spatial and angular resolution. We chose high spatial, medium angular, and low spectral resolution for our studies. Other combinations could be used, but some (e.g. all high) lead to unworkably long measurement times. Attention must be focused where it is important for the particular phenomena being studied. In choosing to employ an RGB camera with a Bayer filter mosaic, we designed our protocol to match the human visual system. The RGB camera could be replaced and our protocol adapted to measure the relative color stimulus of any organism, e.g. sensitivity in the UV spectrum is needed to measure avian tetrachromatic color24,25. A spectral imaging camera would provide the most general solution25.
We demonstrated our protocol with tertial wing feathers since they are colorful and easily flattened against a reference plate. Unfortunately, the aperture of the metal plate revealed only a fraction of the feather surface. If we could simultaneously measure the 3D shape of the feather surface while measuring its reflectance25, we could avoid mechanically flattening the feather and instead measure the entire feather in its natural, unflattened state.
Interactive, specialized, integrated tools for visualizing data provide substantial benefit to scientists exploring and interpreting large data volumes. The greater the integration and interactivity, the more readily connections in the data are observed. In our software, a user can interactively plot average directional scattering as a function of surface position (Figure 4). Further development of our software could integrate other plotting functions (Figures 6, 7) to extend the interactive experience.
The authors have nothing to disclose.
This research was funded by the National Science Foundation (NSF CAREER award CCF-0347303 and NSF grant CCF-0541105). The authors would like to thank Jaroslav Křivánek, Jon Moon, Edgar Velázquez-Armendáriz, Wenzel Jakob, James Harvey, Susan Suarez, Ellis Loew, and John Hermanson for their intellectual contributions. The Cornell Spherical Gantry was built from a design due to Duane Fulk, Marc Levoy, and Szymon Rusinkiewicz.