Last update: October 21st, 2019 at 9:00 am
A series of images shows the fog in the valleys of the Rockies and in the state of Washington. The first image, seen when you first open this page, is a “true color” composite. If you place your mouse over the image, it will shift to a “false color” composite; move the mouse off the image and it will revert to the first image.

The building blocks of all images are called picture elements, or pixels. In a true color composite, pixel colors are assigned according to the intensity with which the land or ocean surface reflects sunlight. For example, objects or phenomena that reflect the red portion of the spectrum are assigned the color red; things that reflect red very strongly are given a higher intensity than things that reflect red less strongly. The same is true for blue and green, the other primary colors. Sensors aboard the satellite collect these intensity values and use them to create a picture, much like a hand-held camera.

However, unlike the human eye, satellites also collect information in other parts of the spectrum, such as the near infrared. By assigning these bands that humans cannot see to colors that we can, a false color composite is formed. These images are often useful for highlighting certain features. In this image, longer-wavelength sunlight (short-wave infrared) is assigned to the color red; objects that strongly reflect short-wave infrared (such as bare soil) appear deep red. The near infrared is assigned to the color green; vegetation is highly reflective of near infrared light and thus appears very bright green. The red portion of the spectrum is assigned the color blue. Together, this combination of bands is called a “7, 2, 1” false color composite.
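The band-to-channel mapping described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the actual processing pipeline used to make these images; the tiny 2×2 reflectance arrays are invented placeholders standing in for real sensor band data.

```python
import numpy as np

# Hypothetical 2x2 reflectance grids (values in [0, 1]), one per band.
# In a real workflow these would be read from the satellite band files.
swir = np.array([[0.8, 0.1], [0.3, 0.2]])  # band 7: short-wave infrared
nir  = np.array([[0.1, 0.9], [0.4, 0.3]])  # band 2: near infrared
red  = np.array([[0.1, 0.2], [0.5, 0.1]])  # band 1: visible red

# "7, 2, 1" false color composite: SWIR drives the red channel, NIR the
# green channel, and visible red the blue channel. A pixel with high SWIR
# (e.g. bare soil) renders deep red; high NIR (vegetation) renders green.
false_color = np.dstack([swir, nir, red])

print(false_color.shape)  # (rows, cols, 3) -> an RGB image array
```

The same stacking pattern works for a true color composite; you would simply place the visible red, green, and blue bands into the three channels instead.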
Credit: Jeff Schmaltz