By Sarah Ruiz
Imagine if you had never seen your reflection. Your entire view of your face would be based on where you assumed your various features were positioned. You wouldn’t know what color your eyes were or how big your nose was, and you would be unable to tell whether your face was changing with age or if you got a sunburn that time you fell asleep on the beach.
Reflections are a useful tool for gathering information about objects that are too close to observe from afar. This is the concept behind the technology of remote sensing, which uses reflected energy to learn more about our planet. At a basic level, remote sensing is science’s way of holding a mirror out at arm’s length to get a better look at what is happening on the face of the earth.
Of course, the trick is getting the “mirror”— or in the case of remote sensing, the fine-tuned multispectral imaging equipment — far enough away from the earth to get a proper view. Originally, this meant mounting cameras to the undersides of airplanes or even hot air balloons, but since the 1970s the vehicle of choice has been the satellite. One of the success stories is NASA’s Landsat program, which has been continuously recording data about the earth’s surface for nearly five decades with a series of seven satellites (almost eight, but Landsat 6 failed at launch).
Satellites can carry a range of sensors. Some are active, sending out radio waves (radar) or focused light beams (LIDAR) and measuring the energy that rebounds back up toward them. Because they supply their own energy, active sensors can collect data even when the sun is not shining on the target area, and radio waves have the extra advantage of being able to travel through clouds.
NASA’s Landsat missions carry passive sensors. Passive sensors are more laissez-faire, detecting energy — usually sunlight — that is naturally reflected off the earth’s surface. The sensors are tuned to detect distinct bands along the electromagnetic spectrum — from visible light (e.g. red bands, blue bands) to the longer infrared wavelengths — and the amount of each band that is reflected up to the sensor allows scientists to build a picture of what is happening on the earth’s surface. A black object reflects almost no visible light, while a white one reflects all visible wavelengths roughly equally.
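The idea that a surface’s identity is encoded in how strongly it reflects each band can be sketched with a toy example. The band names and reflectance values below are illustrative only, not actual Landsat specifications:

```python
# Per-band reflectance (fraction of incoming light reflected) for a few
# idealized surfaces. Values are illustrative, not Landsat measurements.
surfaces = {
    "fresh snow": {"blue": 0.90, "green": 0.90, "red": 0.90},  # white: high in every band
    "asphalt":    {"blue": 0.10, "green": 0.10, "red": 0.10},  # dark: low in every band
    "green leaf": {"blue": 0.05, "green": 0.15, "red": 0.05},  # peaks in the green band
}

def brightest_band(reflectance):
    """Return the band in which a surface reflects most strongly."""
    return max(reflectance, key=reflectance.get)

print(brightest_band(surfaces["green leaf"]))  # prints "green"
```

A real classifier compares many more bands at once, but the principle is the same: each kind of surface leaves a characteristic pattern across the spectrum.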
From the resulting images we can glean a lot about our world. From discovering lost civilizations to targeting emergency response after a natural disaster, satellite images have countless applications. For Global Forest Watch (GFW), the ability to detect changes in tree cover is the critical application.
The strong reflection signature of trees and other plants is their green color, and it is what allows remote sensors to see them from orbit — and to know when they’ve been cut down. The chlorophyll in leaves absorbs red wavelengths and reflects those in the green and near-infrared bands. The more green and near-infrared light is reflected, the more vegetation is present in an image.
The data on GFW is created using a model that analyzes the reflectance values in each image to determine the areas and timing of forest loss. Scientists at the University of Maryland trained the model using higher-resolution images and pre-existing maps to classify new Landsat imagery as “tree” or “no tree.” The model incorporates a range of factors, like the Normalized Difference Vegetation Index (NDVI), an index that measures how “alive” a plant is. Living plants have high reflectance in the near-infrared band and low reflectance in the red band, while soils, wood, and dead organic material reflect relatively strongly in the red band. This helps the model avoid mistakenly categorizing the brown leaves and bare branches of winter as forest loss.
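The vegetation index mentioned above has a standard formula: NDVI = (NIR − Red) / (NIR + Red), yielding values near +1 for dense living vegetation and near 0 for bare soil or dead material. A minimal sketch, assuming red and near-infrared reflectance values expressed as fractions of incoming light (the toy numbers below are illustrative, not real Landsat measurements):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Works on scalars or arrays of per-pixel reflectance. Values near +1
    indicate dense living vegetation; values near 0 suggest soil or
    dead organic material.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Toy per-pixel reflectance values:
# a healthy leaf reflects strongly in NIR and absorbs red,
# while dry soil reflects nearly as much red as NIR.
healthy_leaf = ndvi(nir=0.50, red=0.08)  # ~0.72
bare_soil = ndvi(nir=0.30, red=0.25)     # ~0.09
```

Because the formula is a normalized ratio, it is relatively insensitive to how bright the scene is overall, which is part of why indices like this are useful across images taken under different lighting conditions.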
GFW’s ability to take advantage of Landsat imagery to produce a global forest monitoring platform highlights why remote sensing has become such a revolutionary technology. The imagery is of consistently high quality — NASA’s Landsat data is delivered at a resolution of 30 x 30 meter squares, and has been for the past 40 years. Beyond this, it has been made radically accessible: since 2008, anyone has been able to view and download the data from the United States Geological Survey (USGS) website free of charge, which has made satellite imagery a primary tool for forest and land cover monitoring. Without it, GFW would not be possible.
NASA has planned the launch of a ninth Landsat mission for December 2020, with sensors that improve the range, intensity and temporal coverage of the data recorded by previous missions. So long as remote sensing images remain freely available, new applications will continue to evolve, sharpening the earth’s reflection in our mirror and advancing what we know about our planet.