OrbitalHub

The place where space exploration, science, and engineering meet


Archive for the Earth Science category

 

 

On April 12, 2026, at 10:08 p.m. local solar time (12:08 UTC), the GPM Core Observatory passed directly over the center of Typhoon Sinlaku. From orbit, the satellite captured a detailed, three-dimensional snapshot of precipitation inside the storm, resolving structures that are not accessible to conventional surface-based observations. This overpass provided a high-resolution dataset describing rainfall intensity, vertical structure, and storm organization at a critical stage in the typhoon’s evolution.

The Global Precipitation Measurement mission, a joint effort between NASA and JAXA, is designed to quantify precipitation globally with consistent calibration. The Core Observatory serves as the reference standard for a constellation of satellites, ensuring that measurements of rainfall and snowfall across different platforms remain physically comparable. Its orbit, inclined at approximately 65 degrees, allows it to observe precipitation systems across the tropics and mid-latitudes, including regions where tropical cyclones form and intensify.

Typhoon Sinlaku, like other tropical cyclones, is a thermodynamically driven system powered by heat exchange between the ocean and atmosphere. Warm ocean waters supply energy through evaporation, increasing the moisture content of the lower atmosphere. As moist air rises within the storm, it cools and condenses, releasing latent heat. This heat release drives further upward motion, sustaining convection and reinforcing the storm’s circulation. The distribution and intensity of precipitation within the system are directly linked to these processes, making rainfall measurements a key diagnostic of storm strength and structure.

The GPM Core Observatory carries two primary instruments for observing precipitation. The first is the Dual-frequency Precipitation Radar, which operates at both Ku-band and Ka-band frequencies. By transmitting microwave pulses toward Earth and measuring the reflected signal, the radar can determine the location, intensity, and vertical distribution of precipitation. The use of two frequencies allows for improved characterization of hydrometeors, including raindrops, snow, and ice particles, as different wavelengths interact differently with particle sizes.
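As a rough illustration of how reflectivity maps to rainfall, the classic Marshall-Palmer Z-R power law can be inverted. The operational DPR retrieval is far more sophisticated, exploiting both frequencies and full vertical profiles, so this is only a sketch of the underlying idea:

```python
# Illustrative only: the operational DPR algorithm is much more involved, but
# the classic Marshall-Palmer relation Z = 200 * R^1.6 shows how a radar
# reflectivity factor translates into a surface rain rate.

def rain_rate_from_dbz(dbz: float, a: float = 200.0, b: float = 1.6) -> float:
    """Convert radar reflectivity (dBZ) to rain rate (mm/h) via Z = a * R^b."""
    z_linear = 10.0 ** (dbz / 10.0)      # dBZ -> linear units (mm^6 / m^3)
    return (z_linear / a) ** (1.0 / b)   # invert the power law for R

for dbz in (20, 35, 50):   # light rain, moderate rain, convective core
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):.1f} mm/h")
```

The steep power law is why the strong eyewall reflectivities described below imply dramatically higher rain rates than the surrounding bands.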

The second instrument is the GPM Microwave Imager, a passive sensor that measures naturally emitted microwave radiation from Earth’s atmosphere and surface. Microwave signals are affected by the presence of liquid and frozen precipitation, allowing the instrument to infer rainfall rates over wide swaths. While the imager provides broader coverage, the radar delivers detailed vertical profiles. Together, these instruments produce a comprehensive dataset describing both the horizontal and vertical structure of precipitation.

During the overpass of Typhoon Sinlaku, the Dual-frequency Precipitation Radar captured cross-sectional views of the storm, revealing the internal organization of convective bands and the eyewall region. The eyewall, typically associated with the most intense winds and heaviest rainfall, showed strong reflectivity values, indicating high precipitation rates and deep convective towers. Surrounding rainbands displayed varying intensities, reflecting differences in moisture availability, atmospheric stability, and local dynamics.

The vertical structure observed by the radar is particularly important for understanding storm intensity. Strong updrafts within convective cells lift moisture to higher altitudes, where it condenses and forms precipitation. The height and distribution of these updrafts can be inferred from radar reflectivity profiles. In the case of Sinlaku, the radar data indicated well-developed convective cores, suggesting active energy transfer within the storm system.

The Microwave Imager complemented these observations by providing a broader view of precipitation distribution. By measuring brightness temperatures across multiple frequency channels, the instrument identified regions of heavy rainfall and areas dominated by ice-phase precipitation. These measurements help distinguish between stratiform and convective precipitation, which have different implications for storm dynamics and energy balance.

From an engineering perspective, the ability to collect such data depends on precise calibration and system stability. The radar must maintain accurate timing and signal strength to ensure that reflected signals are correctly interpreted. The satellite’s orientation and pointing accuracy are critical, as small deviations can affect measurement geometry. Thermal control systems maintain instrument performance by keeping components within specified temperature ranges, despite the varying thermal environment of low Earth orbit.

Data collected during the overpass are transmitted to ground stations and processed using retrieval algorithms that convert raw measurements into physical quantities such as rainfall rate and hydrometeor distribution. These algorithms incorporate models of electromagnetic scattering, atmospheric absorption, and surface emissivity. The resulting datasets are then assimilated into weather prediction models, improving forecasts of storm track, intensity, and precipitation.

The observations of Typhoon Sinlaku contribute to both operational forecasting and scientific research. Accurate measurements of precipitation help meteorologists assess flood risk and issue warnings. At the same time, detailed structural data improve understanding of how tropical cyclones evolve, including processes such as eyewall replacement cycles, intensity fluctuations, and interactions with environmental conditions.

One of the key advantages of the GPM mission is its ability to provide consistent measurements across different storms and regions. By maintaining a calibrated reference standard, the Core Observatory ensures that data collected over Sinlaku can be compared directly with observations of other storms. This consistency is essential for building long-term datasets used in climate studies, where trends in precipitation and storm behavior are analyzed over decades.

The overpass of Typhoon Sinlaku illustrates the integration of science and engineering required to observe complex atmospheric systems from space. The satellite’s instruments translate electromagnetic signals into quantitative descriptions of precipitation, while the underlying physical models connect those measurements to the dynamics of the storm. The result is a detailed, three-dimensional representation of a system that spans hundreds of kilometers but is resolved at scales relevant to both weather forecasting and scientific analysis.

In practical terms, the data from this event enhance situational awareness for regions affected by the storm and contribute to improving predictive capabilities for future events. In a broader context, they support ongoing efforts to understand the role of precipitation in Earth’s climate system, including how it may change in response to global warming.

The GPM Core Observatory’s observation of Typhoon Sinlaku demonstrates the capability of modern satellite systems to capture detailed information about dynamic weather events. It reflects the continued development of remote sensing technologies and the importance of international collaboration in monitoring Earth’s atmosphere.

Video credit: NASA

 


 

 

Telesat is positioning its Lightspeed low Earth orbit constellation as a critical component of defense communications networks, with a planned laser communications demonstration in 2027 that could validate the system for high-demand applications including missile defense. The Canadian satellite operator announced the strategy during the Satellite 2026 conference in Washington, D.C., highlighting changes to the system design aimed at military compatibility.

The company plans to launch the first two Lightspeed satellites in December 2026, with a laser communications relay demonstration scheduled for 2027 under a $30 million NASA contract awarded in 2022. The test will simulate a data relay scenario in orbit: one satellite will act as a mission spacecraft, the other as a relay node. A subsequent phase will involve a Planet Labs imaging satellite equipped with an optical terminal, which will send data through the Lightspeed system to a ground station.

Chuck Cynamon, president of Telesat Government Solutions, emphasized that the demonstration represents a proof point for the Pentagon’s growing interest in space-based data networks. “There’s a demand for hybrid architectures,” Cynamon stated, pointing to the Space Force’s development of what it calls a “space data network” intended to connect satellites, sensors, and weapons into a unified real-time architecture.

The Golden Dome missile defense initiative would depend on such networks as its core transport layer, routing data between sensors, command systems, and interceptors in near real time. Gen. Michael Guetlein, who leads Golden Dome, has indicated that funding for the space data network is increasing, with Cynamon noting that “there’s probably no limit on how much capability is going to be needed on orbit from a space data network.”

The company has modified its system design to align with military requirements, including adding military Ka-band frequencies aligned with the Pentagon’s existing wideband satcom systems. Each of the planned 198 Lightspeed satellites will carry four optical terminals supplied by Tesat-Spacecom, enabling high-speed links between spacecraft that can move large volumes of information with low latency while reducing exposure to jamming or interception.

The capacity pool model Telesat intends to offer the government would allow access to Lightspeed’s bandwidth and potentially optical connections without owning satellites. “We could also offer a pool of optical connections on a daily, weekly or monthly basis,” Cynamon explained, reflecting a broader shift toward hybrid architectures that blend military and commercial infrastructure.

Telesat expects to begin commercial service in 2028 after deploying the first 156 satellites, with launches contracted to SpaceX in batches of roughly 15 spacecraft. The company enters a competitive field dominated by SpaceX’s Starlink and Starshield, along with emerging systems such as Amazon LEO. Both competitors are pursuing defense business and deploying optical inter-satellite links.

One emerging demand driver is the concept of orbital data centers, which Cynamon noted could further increase pressure on satellite networks to expand capacity and move data more quickly between space and the ground. “I think it’s going to put pressure on the ability to have large pipes and land data quickly on the ground,” he observed.

Optical communications between satellites operate at frequencies far higher than traditional radio-frequency links, typically using near-infrared wavelengths around 1550 nanometers. This frequency choice offers several advantages for space-based communications, including narrower beam divergence that enables higher data rates while reducing interference between neighboring links.

The fundamental principle involves modulating a laser beam with data and directing it precisely at a receiving terminal, which requires extremely precise pointing and tracking systems. The transmitting terminal must aim its beam with an accuracy measured in microradians, roughly comparable to hitting a coin-sized target from several kilometers away.
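A back-of-envelope calculation shows why microradian pointing follows directly from the physics. The aperture size and link distance below are assumed illustrative values, not Telesat specifications:

```python
# Back-of-envelope sketch (aperture and link distance are assumed values):
# a diffraction-limited beam diverges at roughly theta ~ lambda / D radians,
# so a small terminal at 1550 nm produces a beam measured in microradians,
# and a footprint only tens of meters wide after thousands of kilometers.

WAVELENGTH_M = 1550e-9    # near-infrared laser wavelength (from the text)
APERTURE_M = 0.08         # assumed 8 cm optical terminal aperture
LINK_DISTANCE_M = 5000e3  # assumed 5,000 km inter-satellite link

divergence_rad = WAVELENGTH_M / APERTURE_M          # approximate diffraction limit
spot_diameter_m = divergence_rad * LINK_DISTANCE_M  # beam footprint at the receiver

print(f"Divergence: {divergence_rad * 1e6:.1f} microradians")
print(f"Spot size at {LINK_DISTANCE_M / 1e3:.0f} km: {spot_diameter_m:.0f} m")
```

Miss the pointing by even a few microradians and the beam sails past the receiving terminal entirely, which is why the tracking loops matter as much as the lasers.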

Data rates for optical links can reach 10 gigabits per second or higher, compared to typical radio-frequency satellite links measured in megabits per second. This capacity advantage becomes particularly significant for applications involving large data volumes, such as high-resolution imagery or video from Earth observation satellites.

The laser links used in satellite constellations employ coherent detection, where the receiving terminal mixes the incoming optical signal with a locally generated laser to extract the data. This technique provides sensitivity improvements over direct detection methods, enabling links across distances of thousands of kilometers with minimal transmit power.

Atmospheric effects present challenges for optical links that radio frequencies avoid, including scattering by molecules and aerosols, absorption by water vapor, and turbulence that can cause beam wander and scintillation. For inter-satellite links above Earth’s atmosphere, these effects largely disappear, making optical communications most attractive for links between spacecraft rather than from space to ground.

 


 

 

There are satellites that flash briefly across the sky and then fade into history, and there are satellites that quietly build a legacy measured not in months, but in generations. The Landsat program belongs firmly to the latter. Since 1972, when the first Landsat spacecraft began circling Earth, the mission has carried forward a simple but transformative idea: that if we observe our planet consistently, patiently, and scientifically, we can understand how it changes—and why.

Landsat was born during a time when space exploration was dominated by lunar ambitions and planetary probes. Yet a handful of scientists and engineers recognized that one of the most important frontiers lay much closer to home. The Earth itself was changing under the pressure of agriculture, urban expansion, deforestation, water use, and climate variability. The Landsat program was designed to provide something unprecedented: a continuous, calibrated, and publicly available record of the planet’s land surface.

From the beginning, the mission’s goals were ambitious. Landsat satellites were built to measure reflected sunlight and emitted thermal radiation from Earth’s surface across multiple wavelengths. This spectral approach allowed scientists to distinguish forests from croplands, healthy vegetation from drought-stressed fields, snow from clouds, and sediment-rich rivers from clear lakes. By observing the same locations again and again over decades, Landsat turned snapshots into time series, revealing patterns that would otherwise remain invisible.

The engineering behind Landsat is a study in precision. Each spacecraft travels in a near-polar, sun-synchronous orbit at an altitude of roughly 700 kilometers. This orbit ensures that the satellite passes over any given location at approximately the same local solar time, maintaining consistent lighting conditions for imaging. Stability and repeatability are paramount. The sensors must be radiometrically calibrated to detect subtle changes in surface reflectance over time. A difference of just a few percent in measured brightness can signal shifts in vegetation health or soil moisture.
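The near-98-degree inclination of such orbits falls out of a short calculation: Earth's equatorial bulge (the J2 term) precesses the orbit plane, and a sun-synchronous orbit chooses the inclination at which that precession tracks the Sun, one full revolution per year. A sketch for the roughly 700-kilometer altitude quoted above:

```python
import math

# Why sun-synchronous orbits sit near 98 degrees: for a circular orbit, J2
# precesses the orbit plane at dOmega/dt = -1.5 * n * J2 * (Re/a)^2 * cos(i),
# and sun-synchronicity requires that rate to equal ~0.9856 deg/day eastward.

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378.137e3       # Earth's equatorial radius, m
J2 = 1.08263e-3       # Earth's oblateness coefficient

altitude_m = 700e3
a = RE + altitude_m                  # semi-major axis of a circular orbit
n = math.sqrt(MU / a**3)             # mean motion, rad/s

target_rate = 2 * math.pi / (365.2422 * 86400)   # one lap per year, rad/s
cos_i = -target_rate / (1.5 * n * J2 * (RE / a) ** 2)
inclination_deg = math.degrees(math.acos(cos_i))

print(f"Orbital period: {2 * math.pi / n / 60:.1f} min")
print(f"Sun-synchronous inclination: {inclination_deg:.1f} deg")
```

The retrograde tilt (just past 90 degrees) is what lets the bulge nudge the orbit plane eastward at exactly the Sun's apparent rate, preserving the constant local crossing time the paragraph describes.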

Over successive missions, Landsat’s instruments evolved. Early satellites relied on the Multispectral Scanner (MSS), which offered groundbreaking though relatively coarse imagery. Later generations introduced the Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+), expanding spectral coverage and spatial resolution. With Landsat 8, launched in 2013, the program entered a new era of digital precision with two primary instruments: the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). Together, they extended the spectral range, improved signal-to-noise performance, and ensured compatibility with the historical data record.

The continuity of the Landsat archive is not an accident—it is a design philosophy. Every new satellite must be cross-calibrated against its predecessor so that the global dataset remains scientifically consistent. This continuity has allowed researchers to track deforestation in the Amazon, glacier retreat in Greenland, urban expansion in Asia, and agricultural water use in the American West. Landsat’s data policy, which made imagery freely available starting in 2008, transformed global access to Earth observation, catalyzing research, commercial innovation, and environmental monitoring on a planetary scale.

It is within this lineage that Landsat 9 emerged.

Launched on September 27, 2021, from Vandenberg Space Force Base aboard an Atlas V rocket, Landsat 9 was not conceived as a revolution, but as a promise kept. Its mission was to ensure that the Landsat record—now spanning more than half a century—would continue without interruption. Developed by NASA and operated jointly by NASA and the U.S. Geological Survey (USGS), Landsat 9 carries forward the twin-instrument architecture pioneered by Landsat 8, with refined performance and improved reliability.

At the heart of Landsat 9 is the Operational Land Imager 2 (OLI-2), an advanced multispectral sensor that captures reflected sunlight across visible, near-infrared, and shortwave infrared wavelengths. These spectral bands are carefully chosen to reveal the chemical and structural properties of land surfaces. Vegetation reflects strongly in the near-infrared; water absorbs much of it. Soils, minerals, and built environments each leave distinct spectral signatures. By measuring these patterns, OLI-2 allows scientists to compute vegetation indices, monitor crop productivity, detect wildfire scars, and assess coastal health.
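One of the vegetation indices mentioned above, NDVI, is simple enough to compute directly: it contrasts strong near-infrared reflectance from healthy vegetation with low red reflectance. For Landsat's OLI/OLI-2, red is band 4 and NIR is band 5; the reflectance values below are assumed illustrative numbers, not real scene data:

```python
# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1. Dense vegetation
# pushes toward +1; water, with low NIR reflectance, goes slightly negative.

def ndvi(red: float, nir: float) -> float:
    """Normalized Difference Vegetation Index from surface reflectances."""
    return (nir - red) / (nir + red)

# Illustrative surface reflectances (assumed values, not measured data):
samples = {
    "dense forest": (0.04, 0.50),
    "sparse grass": (0.10, 0.30),
    "bare soil":    (0.20, 0.25),
    "open water":   (0.06, 0.03),
}
for name, (red, nir) in samples.items():
    print(f"{name:>12}: NDVI = {ndvi(red, nir):+.2f}")
```

The normalized ratio is deliberately insensitive to overall brightness, which is part of why such indices survive cross-sensor comparisons across the Landsat record.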

Complementing OLI-2 is the Thermal Infrared Sensor 2 (TIRS-2), which measures land surface temperature. Thermal data are essential for understanding evapotranspiration, drought conditions, urban heat islands, and volcanic activity. Land surface temperature is not merely a climate statistic; it is a dynamic variable that shapes ecosystems, agriculture, and human comfort. TIRS-2 improves upon earlier thermal sensors with better stray-light control and enhanced calibration, strengthening confidence in long-term temperature records.

Together, OLI-2 and TIRS-2 produce imagery with a spatial resolution of 30 meters for most bands and 100 meters for thermal measurements, revisiting the same location every 16 days. When combined with Landsat 8, the effective revisit time drops to eight days, increasing temporal coverage and reducing data gaps caused by cloud cover.

The engineering sophistication of Landsat 9 extends beyond its instruments. The spacecraft platform was built by Northrop Grumman and designed for durability and efficiency, with redundant systems and precise attitude control to maintain stable pointing. The satellite continuously transmits data to ground stations, where it is processed, calibrated, and archived by the USGS. Each image enters a public repository that now contains millions of scenes—a living chronicle of Earth’s surface.

Perhaps the most remarkable aspect of Landsat 9 is how unremarkable it strives to be. Its purpose is not spectacle, but continuity. It does not chase novelty; it protects consistency. In an era of rapid technological turnover, Landsat 9 embodies a different ethos: that sustained observation is as important as innovation.

As climate change accelerates, water resources tighten, and urban populations grow, the need for objective, long-term data becomes ever more urgent. Landsat 9 contributes to this global awareness by quietly collecting photons reflected and emitted from Earth’s surface, converting them into calibrated digital records. These records feed into agricultural planning, disaster response, forest management, and climate science.

The Landsat program began as an experiment in seeing our planet from above. Over five decades, it has become a foundational instrument for understanding it. Landsat 9 stands as the latest steward of that legacy—a spacecraft designed not just to observe the Earth, but to ensure that future generations can compare their world to the one we see today.

In that sense, Landsat 9 is more than a satellite. It is a continuation of a conversation between humanity and its home, a steady voice reminding us that change is measurable, and therefore knowable.

Video credit: NASA Goddard

 

December 17, 2025

The Ozone Hole: A Global Atmospheric Story


 

 


High above Earth’s surface, in a region of the atmosphere called the stratosphere, lies a thin layer of ozone (O₃) that acts as a vital shield for life on our planet. This ozone layer absorbs the Sun’s harmful ultraviolet (UV) radiation, especially the most energetic UV-B wavelengths that can damage DNA in living cells and increase risks such as skin cancer, cataracts, and harm to ecosystems. In the 1970s and 1980s, scientists discovered something striking over the South Pole: each Antarctic spring, a dramatic thinning of the ozone layer developed above the continent. This thinning — commonly referred to as the ozone hole — isn’t a literal hole in space, but rather a region where ozone concentrations drop sharply below typical values, leaving a “thin spot” in the stratospheric shield.

The ozone hole forms because of a complex interplay between chemical reactions and Antarctic atmospheric conditions. In the cold, dark winter months over the Southern Hemisphere, temperatures in the polar stratosphere can plummet, enabling the formation of polar stratospheric clouds (PSCs). These ice clouds act as active sites for chemical reactions that convert chlorine and bromine from human-made compounds, such as chlorofluorocarbons (CFCs), halons, and other ozone-depleting substances (ODS), into highly reactive forms. Once the Sun returns to the polar region in late winter and early spring, sunlight unleashes this chlorine activated on PSC surfaces during the dark months, driving rapid catalytic cycles that destroy ozone. The result is a dramatic depletion of ozone concentrations in a broad region over Antarctica each year.

The primary culprits behind ozone depletion are synthetic chemicals that were extensively used in industrial and consumer products throughout the mid-20th century. Chlorofluorocarbons (CFCs), once common in refrigeration, air conditioning, foam blowing agents, and aerosol propellants, are especially potent at destroying ozone once they reach the stratosphere. In the upper atmosphere, UV radiation breaks down these stable molecules, releasing chlorine atoms that catalytically destroy ozone — a single chlorine atom can destroy thousands of ozone molecules before it is removed from the stratosphere. Bromine from other halons and chemicals contributes similarly, although to a lesser extent. These processes are temperature-sensitive, which is why extreme polar conditions amplify ozone loss over Antarctica.
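Written schematically, the catalytic cycle described in the paragraph above is:

```latex
% Simplified chlorine catalytic cycle (Antarctic spring loss also involves
% the ClO dimer cycle, omitted here for clarity):
\begin{aligned}
\mathrm{Cl} + \mathrm{O_3} &\longrightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\longrightarrow \mathrm{Cl} + \mathrm{O_2} \\[4pt]
\text{net:}\qquad \mathrm{O_3} + \mathrm{O} &\longrightarrow 2\,\mathrm{O_2}
\end{aligned}
```

Because the chlorine atom emerges from the cycle unchanged, it can run through these reactions thousands of times before being locked away in a reservoir species, which is exactly why a single atom is so destructive.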

The annual cycle of the ozone hole is tied to these reactions and to atmospheric dynamics. Each Southern Hemisphere spring (roughly August through October), as sunlight returns to polar regions, ozone destruction accelerates and the depleted region expands. At its peak, scientists measure the total area where ozone levels fall below a specified threshold — often 220 Dobson Units — to quantify the “size” or extent of the ozone hole. After the peak, as temperatures warm and atmospheric circulation resumes, ozone-rich air from lower latitudes mixes back in, and the depleted region gradually disappears until the next winter.
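In practice, the hole's extent is computed from gridded total-column ozone maps by summing the area of cells below the 220 Dobson Unit threshold. A minimal sketch, using an entirely synthetic ozone field rather than real satellite data, looks like this:

```python
import numpy as np

# Sketch of the ozone-hole area calculation: flag grid cells below the
# 220 DU threshold and sum their areas. The grid and ozone values here are
# synthetic, for illustration only.

THRESHOLD_DU = 220.0
R_EARTH_KM = 6371.0

# 1-degree grid over the Southern Hemisphere polar cap (50S to 90S).
lats = np.arange(-89.5, -49.5, 1.0)    # cell-center latitudes
lons = np.arange(0.5, 360.5, 1.0)
lat_grid, _ = np.meshgrid(lats, lons, indexing="ij")

# Fake ozone field: deep depletion at the pole, recovering toward 50S.
ozone_du = 150.0 + 3.0 * (lat_grid + 90.0)   # 150 DU at pole, ~270 DU at 50S

# Area of a 1x1 degree cell shrinks with the cosine of latitude.
cell_area_km2 = (np.radians(1.0) * R_EARTH_KM) ** 2 * np.cos(np.radians(lat_grid))

hole_mask = ozone_du < THRESHOLD_DU
hole_area_km2 = cell_area_km2[hole_mask].sum()
print(f"Ozone hole area: {hole_area_km2 / 1e6:.1f} million km^2")
```

Swapping the synthetic field for a real Level-3 total-column product turns the same few lines into the headline "size of the hole" number reported each spring.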

The discovery of the ozone hole prompted a remarkable international environmental response. In 1987, nations around the world adopted the Montreal Protocol on Substances that Deplete the Ozone Layer, a treaty designed to phase out the production and use of ozone-depleting chemicals. Over subsequent years, the Protocol was strengthened through a series of amendments and adjustments, extending controls to additional substances, accelerating phase-outs, and providing financial and technical support to developing countries. The result has been one of the most successful global environmental agreements in history: measured concentrations of many ozone-depleting substances in the stratosphere have declined substantially since their regulatory phase-out began.

Scientific monitoring shows clear signs that the ozone layer is slowly healing. In 2025, the ozone hole over Antarctica reached its annual maximum extent on September 9, spanning about 8.83 million square miles (22.86 million square kilometers), nearly three times the area of the contiguous United States yet significantly smaller than the holes of recent decades. That year's maximum ranks as the fifth smallest ozone hole since 1992, the year that marked the beginning of long-term recovery trends associated with the Montreal Protocol's implementation. According to NASA and NOAA data, the average size of the hole over the height of the 2025 depletion season (Sept. 7 through Oct. 13) was also notably smaller than in many previous years, and the depleted region began breaking up earlier than has been typical over the past decade.

Despite year-to-year variability driven by atmospheric temperatures, winds, and exceptional events like volcanic eruptions, the long-term trend points toward gradual recovery. Scientists estimate that — if current international commitments continue and ozone-depleting substances remain controlled — the Antarctic ozone layer could recover to pre-1980 levels later this century. Continued monitoring and enforcement are essential, however, because fluctuations in climate and emerging risks (such as byproducts from industrial processes or atmospheric effects of increased rocket launches) have the potential to influence ozone chemistry.

The ozone layer’s health matters because it directly affects life on Earth. Ozone absorbs UV-B radiation from the Sun, shielding organisms at the surface and in shallow waters from DNA-damaging rays that can cause skin cancer, cataracts, and immune suppression in humans, and stress in plant and marine ecosystems. Increased UV exposure can reduce crop yields, disrupt phytoplankton populations at the base of marine food webs, and accelerate the degradation of materials such as plastics. The seasonal ozone hole therefore represents a period when vulnerable regions — particularly high southern latitudes — experience elevated UV radiation at the surface, making monitoring and mitigation critically important.

The story of the ozone hole is thus both a cautionary tale and a hopeful one. It reveals how human industrial activity altered the composition of Earth’s atmosphere in ways that had global consequences, but it also demonstrates the power of international cooperation to address environmental challenges. The Montreal Protocol remains a testament to what coordinated global action can achieve: a successful trajectory toward healing a planetary-scale environmental problem that once seemed almost impossible to reverse. Continued vigilance, observation, and commitment will be key to ensuring the ozone layer’s full recovery in the decades ahead — protecting life on Earth from harmful radiation and preserving the delicate balance of our planet’s atmosphere.

Video credit: NASA Goddard

 


 

 


Sentinel-6B represents the next leap in monitoring our planet’s oceans, a critical mission driven by a collaboration between NASA, NOAA, ESA (the European Space Agency), EUMETSAT, and France’s CNES. Slated for launch in November 2025 aboard a SpaceX Falcon 9 from Vandenberg Space Force Base, this satellite continues a decades-long legacy of radar altimetry measurements that trace back to the TOPEX/Poseidon era.

The heart of Sentinel-6B lies in its mission to precisely measure sea surface height across roughly 90% of the world’s oceans. This is not just a climate mission: the data will feed into operational ocean models, improve weather forecasts, and play a critical role in coastal planning — informing everything from flood risk to shipping routes. Moreover, because sea level is one of the most direct indicators of climate-driven change, Sentinel-6B helps maintain the continuity of a vital long-term dataset.

Beyond ocean heights, Sentinel-6B will also monitor the atmosphere. Using a technique called GNSS radio occultation, it will capture vertical profiles of temperature and humidity in Earth's atmosphere, enhancing the accuracy of weather prediction models. These atmospheric data even support the NASA Engineering and Safety Center, helping plan safer reentry paths for future Artemis missions.

The satellite is outfitted with a sophisticated suite of instruments. Its Poseidon-4 altimeter will send radar pulses to the ocean surface and measure their return time to derive sea level measurements. A microwave radiometer (AMR-C) will correct for atmospheric water vapor, which affects radar accuracy. Its GNSS-RO receiver gathers data for the radio occultation measurements, while a DORIS system and a GNSS precise orbit determination package help pin down the satellite’s position with extreme precision. A laser retroreflector array (LRA) further enhances orbit tracking.
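The core altimetry arithmetic is short: the radar measures a pulse's two-way travel time, range follows as c times t over two, and sea surface height is the precisely known orbital altitude minus that corrected range. The echo delay and water-vapor correction below are assumed illustrative values, not actual Poseidon-4 measurements:

```python
# Minimal sketch of radar altimetry (illustrative numbers, not real data):
# range = c * t / 2, then subtract atmospheric path-delay corrections such as
# the radiometer's water-vapor term, then SSH = orbit altitude - range.

C = 299_792_458.0                 # speed of light, m/s

orbit_altitude_m = 1_336_000.0    # Sentinel-6 reference orbit is ~1336 km
two_way_time_s = 8.91267e-3       # assumed measured echo delay
wet_tropo_correction_m = 0.15     # assumed water-vapor path delay

raw_range_m = C * two_way_time_s / 2.0
corrected_range_m = raw_range_m - wet_tropo_correction_m  # vapor inflates the apparent range
ssh_m = orbit_altitude_m - corrected_range_m

print(f"Raw range:          {raw_range_m:,.2f} m")
print(f"Sea surface height: {ssh_m:,.2f} m (relative to the reference surface)")
```

Note the scale mismatch the mission has to master: centimeter-level sea-level trends are extracted from a range measurement more than a thousand kilometers long, which is why the orbit-determination and correction instruments matter as much as the altimeter itself.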

The Sentinel-6B mission carries profound implications for climate science, public safety, and operational forecasting. By extending the sea-level record well into the 2030s, it enables scientists and policymakers to track ocean trends with greater fidelity than ever before. This continuity is vital: without it, we risk losing sight of how fast sea levels are changing and which regions are most vulnerable.

As Sentinel-6B prepares for launch, it promises not only to safeguard critical infrastructure but also to deepen our understanding of Earth’s changing climate system. Through robust international collaboration and cutting-edge technology, this mission underscores how satellites remain our most powerful tools in charting the future of our oceans.

Video credit: NASA

 


 

 


Launched on April 15, 1999, from Vandenberg Air Force Base in California aboard a Delta II rocket, Landsat 7 marked a new chapter in Earth observation. This satellite, a collaborative endeavor between NASA, the U.S. Geological Survey (USGS), and NOAA, was the seventh in the long-running Landsat program that began in 1972. With a sun-synchronous, near-polar orbit at an altitude of approximately 705 kilometers, Landsat 7 was designed to pass over the same part of the Earth every 16 days, capturing high-resolution imagery under consistent lighting conditions at around 10:00 a.m. local solar time.

The spacecraft itself was engineered by Lockheed Martin and featured a three-axis stabilized platform, which allowed precise orientation in space. It drew power from solar arrays supported by nickel-cadmium batteries and used a hydrazine monopropellant system for orbital maintenance. One of its significant upgrades over previous Landsat missions was the inclusion of a solid-state data recorder capable of storing roughly 378 gigabits of data. This feature allowed the satellite to store imagery until it could downlink it to a ground station, enabling more flexible operations and broader global coverage.

At the heart of Landsat 7’s success was its sole scientific instrument: the Enhanced Thematic Mapper Plus (ETM+). This powerful sensor was a “whisk-broom” scanner, capturing data across eight spectral bands. Six of these bands covered the visible, near-infrared, and shortwave infrared portions of the electromagnetic spectrum with a resolution of 30 meters. A thermal infrared band operated at 60 meters resolution, while a high-resolution panchromatic band offered detail at 15 meters. Each scene covered an area of roughly 183 by 170 kilometers.
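A rough back-of-envelope estimate, using the band geometries above and the sensor's 8-bit quantization while ignoring formatting and calibration overhead, shows how a scene translates into data volume and why the onboard recorder mattered:

```python
# Rough lower-bound estimate of one ETM+ scene's data volume, from the band
# layout described in the text (overheads ignored, so real scenes are larger).

SCENE_KM = (183, 170)    # along-track x cross-track scene footprint
BITS_PER_PIXEL = 8       # ETM+ 8-bit quantization

def pixels(resolution_m: int) -> int:
    """Pixel count for one full-scene band at the given ground resolution."""
    along = SCENE_KM[0] * 1000 // resolution_m
    cross = SCENE_KM[1] * 1000 // resolution_m
    return along * cross

scene_bits = (
    6 * pixels(30)     # six visible/NIR/SWIR bands at 30 m
    + 1 * pixels(60)   # thermal infrared band at 60 m
    + 1 * pixels(15)   # panchromatic band at 15 m
) * BITS_PER_PIXEL

print(f"Scene size: ~{scene_bits / 1e9:.1f} gigabits")
print(f"Scenes per 378 Gb recorder: ~{378e9 // scene_bits:.0f}")
```

Roughly three gigabits per scene against a 378-gigabit recorder is what made store-and-forward coverage of regions far from any ground station practical.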

One of ETM+’s distinguishing features was its rigorous calibration. Equipped with a full-aperture solar calibrator and internal lamps, ETM+ maintained its radiometric accuracy to within five percent. This exceptional calibration made it the gold standard for satellite remote sensing, enabling cross-calibration with other Earth-observing missions such as NASA’s Terra and EO-1 satellites.

However, Landsat 7’s mission was not without challenges. On May 31, 2003, the satellite’s scan line corrector (SLC)—a mechanism that compensated for the motion of the satellite to ensure complete image coverage—failed. This hardware malfunction introduced zigzag-shaped data gaps that affected roughly 22 to 30 percent of each image. Despite the setback, Landsat 7 continued to operate, and the data it captured remained valuable. Scientists developed methods to fill in the gaps using data from adjacent passes, allowing continued scientific use and analysis.
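A toy version of that gap-filling idea fits in a few lines. Real SLC-off products use careful radiometric matching between the two acquisitions; this sketch simply substitutes pixels from a stand-in adjacent pass:

```python
import numpy as np

# Toy gap-filling: where the SLC-off scene has missing pixels (NaN),
# substitute values from an adjacent-date pass over the same location.
# Operational gap-filled products also histogram-match the scenes first;
# this sketch does direct substitution only. All values are made up.

slc_off = np.array([
    [0.21, 0.22, np.nan, 0.24],
    [0.20, np.nan, np.nan, 0.25],
    [0.19, 0.21, 0.23, np.nan],
])
fill_pass = np.full_like(slc_off, 0.22)    # stand-in for the adjacent scene

gaps = np.isnan(slc_off)
filled = np.where(gaps, fill_pass, slc_off)

print(f"Gap fraction: {gaps.mean():.0%}")  # share of pixels lost to the SLC failure
print(filled)
```

Because the zigzag gaps land in different places on different dates, two or three passes are usually enough to reconstruct a complete image, which is how SLC-off data stayed scientifically useful for two more decades.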

Originally designed for a five-year mission, Landsat 7 exceeded expectations by remaining active for over two decades. In 2017, the final station-keeping maneuvers were performed to maintain the satellite’s orbital parameters. As fuel levels dropped, the satellite’s orbit began to drift slightly, but its imaging capabilities remained intact. In April 2022, the satellite was placed in a lower orbit to support calibration of other Earth-observing systems, and it continued to acquire data intermittently until January 2024. On June 4, 2025, the mission officially came to an end.

Throughout its operational life, Landsat 7 played a vital role in Earth sciences. It provided consistent, high-resolution imagery that supported a wide range of applications, including environmental monitoring, land use planning, disaster response, water resource management, agriculture, and climate change research. The data collected were used in studies that tracked deforestation in the Amazon, urban sprawl in North America, and agricultural patterns in sub-Saharan Africa, among countless other projects.

One of Landsat 7’s most transformative impacts came in 2008, when USGS made its entire Landsat archive—including Landsat 7 data—available to the public at no cost. This decision revolutionized the field of remote sensing, opening the doors to researchers, educators, governments, and businesses worldwide. The number of Landsat scene downloads skyrocketed, leading to an explosion in published scientific studies and practical applications.

Beyond its imagery, Landsat 7 served as a radiometric benchmark. Its ETM+ sensor was so well-calibrated that it became a reference instrument, helping to ensure consistency and accuracy across other satellite missions. This legacy continued with Landsat 8, launched in 2013, and Landsat 9, which entered service in 2021. Even in its final years, Landsat 7 contributed to efforts to standardize Earth observation through proposed servicing missions and calibration support.

Landsat 7’s mission may have ended, but its legacy endures. For over 20 years, it provided humanity with a clearer picture of our changing planet, setting new standards in satellite imaging and democratizing access to Earth observation data. As scientists and decision-makers confront the challenges of climate change, food security, and sustainable development, the insights first captured by Landsat 7 continue to inform policy and shape our understanding of the world.

Video credit: NASA Goddard

 
