Why satellite images have different colors 

In Virtually Hawaii, we show you a large number of images taken from aircraft and spacecraft that have unusual colors compared to what we can see with our eyes. Often we are asked, "Why is one part of the image red and the other blue?" This is because we have chosen to display three different wavelengths on the computer screen, and the surface is highly reflective (bright) at some of these wavelengths. These colors are the result of using instruments that study parts of the spectrum different from the part that our eyes can see. While we do not want to give you a complete course in physics, we thought that you might like a bit of background on these remote sensing images.

First, we need to know that a spacecraft sensor (on Landsat, SPOT, or the SIR-C radar) does not "see" in color. Every image is obtained in black and white at a precise wavelength (usually between 0.4 and 12.0 microns). These electronic cameras collect information only in black and white, but they can obtain many images at the same time in different parts of the spectrum.
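The idea above can be sketched in a few lines of code: three separate black-and-white band images are assigned to the red, green, and blue channels of the display to form a color composite. This is a minimal illustration using NumPy; the band names, wavelengths, and pixel values are hypothetical, not from any particular sensor.

```python
import numpy as np

# Hypothetical single-band (grayscale) images, as a satellite sensor
# would record them. Each is a 2-D array of brightness values; the
# wavelengths in the comments are illustrative only.
height, width = 4, 4
red_band   = np.full((height, width), 200, dtype=np.uint8)  # e.g. 0.66 micron
green_band = np.full((height, width), 120, dtype=np.uint8)  # e.g. 0.56 micron
nir_band   = np.full((height, width), 250, dtype=np.uint8)  # e.g. 0.86 micron (invisible to the eye)

# A false-color composite assigns each band to one display channel.
# Here the near-infrared band is shown as red, so any surface that is
# bright at 0.86 micron would appear red on the screen.
composite = np.dstack([nir_band, red_band, green_band])

print(composite.shape)   # (4, 4, 3): a three-channel color image
print(composite[0, 0])   # [250 200 120]
```

Because any band can be sent to any display channel, the same scene can be shown in many different color schemes, which is why one image may look red where another looks blue.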

If we look at the diagram below of the spectrum, we see several broad regions that include the ultraviolet (wavelengths between 0.3 and 0.4 microns), the visible (0.4 to 0.7 microns), the near-infrared (0.7 to 1.2 microns), the solar reflected infrared (1.2 to 3.2 microns), the mid-infrared (3.2 to 15.0 microns), and the far infrared (longer than 15.0 microns).
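The spectral regions listed above can be captured in a small lookup, useful for telling which region a given wavelength falls into. This is an illustrative sketch; the boundaries follow the text, and the function name is our own.

```python
# Spectral regions from the text, as (low, high) bounds in microns.
REGIONS = [
    (0.3, 0.4, "ultraviolet"),
    (0.4, 0.7, "visible"),
    (0.7, 1.2, "near-infrared"),
    (1.2, 3.2, "solar reflected infrared"),
    (3.2, 15.0, "mid-infrared"),
]

def spectral_region(wavelength_microns):
    """Return the name of the spectral region for a wavelength in microns."""
    for low, high, name in REGIONS:
        if low <= wavelength_microns < high:
            return name
    if wavelength_microns >= 15.0:
        return "far infrared"
    return "outside the listed range"

print(spectral_region(0.55))   # visible
print(spectral_region(10.0))   # mid-infrared
```

Note that a Landsat-style sensor obtains its images within a handful of narrow slices of these regions, one black-and-white image per slice.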

