The year 2021 has seen a flurry of extreme events around the globe. Among the many that have captured headlines so far this year:
- Devastating flooding in Australia, Europe, Asia, and the U.S. Northeast.
- California’s massive Dixie Fire, now the state’s second largest on record.
- A crippling U.S. polar vortex event that paralyzed Texas in February with bitter cold temperatures and massive power outages.
From the unique vantage point of space, we’ve been able to observe and monitor these events, no matter where they’ve occurred. Satellite data from NASA and other institutions are critical to understanding how and why extreme events take place.
This year’s events come on the heels of a record-breaking 2020 in the United States. According to the National Oceanic and Atmospheric Administration, there were 22 separate weather and climate-related disasters last year where the overall damages/costs for each reached or exceeded $1 billion. Last year also saw a record number of tropical cyclones form in the Atlantic Basin.
There’s growing evidence that people and the planet are increasingly impacted by extreme events. According to the Fourth National Climate Assessment, published in 2018 by the U.S. Global Change Research Program, “more frequent and intense extreme weather and climate-related events, as well as changes in average climate conditions, are expected to continue to damage infrastructure, ecosystems, and social systems that provide essential benefits to communities.”
As the impacts of extreme events continue to mount, interest has grown in the scientific community to study whether specific extreme events can be partially attributed to human activities. With the help of climate models, scientists have conducted an impressive array of studies, looking for possible links between human activities and extreme events such as heat waves, rainfall and flooding events, droughts, storms, and wildfires.
Increasingly, they’re able to draw robust connections. Scientists have documented reductions in the number of cold waves, increases in the number of heat waves both on land and in the ocean, increases in the intensity of rainfall and drought, and increases in the intensity of wildfires. Despite the complexity and uniqueness of individual events, scientists are finding significant human contributions to many of them.
An interactive map produced by CarbonBrief in 2020, shown below, provides a visual summary of these studies. On it, red dots mark extreme events where scientists have found a substantial contribution from human activities – that is, events that human activities have made more frequent or more intense. Blue dots, many of them associated with rainfall events, mark events where scientists have yet to find a substantial human contribution.
Events with a big thermodynamic component – that is, those where there’s a big impact because of heat – are being made more intense or more frequent because of human activities. In contrast, for extreme events that are more dependent on the dynamics of the atmosphere, the links to human activities are less clear.
Here are a few examples of extreme events where scientists are finding connections to human activities.
The continued increase in global mean temperatures in response to rising levels of greenhouse gases sets the expectation that we’ll see a corresponding increase in global heat extremes. Indeed, this is being borne out by daily temperature data across the globe. Studies of individual heat waves, such as the devastating event that took place in the Pacific Northwest this summer, suggest such events have become tens to hundreds of times more likely because of human-driven climate change.
A global examination of how often heat waves are occurring, as well as their cumulative intensity (how many days heat waves last above a certain temperature level), published last year by Australian scientists from the Climate Change Research Centre and the University of New South Wales Canberra, reveals a clear increase of more than two days per decade in the number of heat wave days since the 1950s.
The intensity of droughts is increasing. It’s not so much that scientists are seeing less rainfall, though that’s certainly happening in some places. Rather, in places where drought conditions exist, soils are becoming drier due to other factors, such as increased soil evaporation and decreased snowpack, which is reducing the amount of river flow during summer and fall. In the American Southwest, scientists estimate human-caused climate change is making droughts 30 to 50 percent more intense. [1]
When it does rain, we’re also seeing trends in how much rain falls. A prime example is Hurricane Harvey, which caused devastating flooding in parts of Texas in 2017. The storm dropped up to 40 inches (102 centimeters) of rain on some areas.
In locations where scientists have data of sufficient quality, observations are showing an increasing intensity of rainfall. This is coincident with the observed overall increase in atmospheric water vapor (about four percent per degree Fahrenheit of warming). The more water vapor that air contains, the more it can rain out during convection or as air masses collide.
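The water-vapor scaling mentioned above can be sketched with some quick arithmetic. This is a rough back-of-envelope illustration, not a climate model: it simply compounds the roughly four-percent-per-degree-Fahrenheit increase in atmospheric water vapor over a few warming amounts.

```python
# Rough illustration of the water-vapor scaling described above: atmospheric
# water vapor rises by roughly 4% per degree Fahrenheit of warming (an
# approximate figure from the Clausius-Clapeyron relation; the values here
# are illustrative, not a precise climate calculation).

def vapor_increase_pct(warming_f, rate_per_f=0.04):
    """Percent increase in atmospheric water vapor for a given warming (deg F)."""
    return (1 + rate_per_f) ** warming_f * 100 - 100

for degrees in (1, 2, 5):
    print(f"{degrees} degF warming -> ~{vapor_increase_pct(degrees):.1f}% more water vapor")
```

Because the increase compounds, a few degrees of warming can raise the atmosphere’s moisture supply – and thus the potential intensity of downpours – noticeably faster than a purely linear estimate would suggest.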
There have been hurricanes and intense storms throughout history, so what’s changed? Model studies confirm that, for instance, about 20 percent of Harvey’s rainfall was attributable to human-produced warming of the climate and waters in the Gulf of Mexico. [2, 3] More generally, climate simulations confirm that this increased intensity is a robust result.
It’s important to note that impacts from extreme events are mainly a question of thresholds – the amount of flooding needed to overtop a levee, or overwhelm storm drains – so every inch (of additional rain) counts. So, while total rainfall may increase only slightly, it’s the extreme precipitation events that disproportionately cause problems.
The Bottom Line
The combination of models and observations, informed by the unique view that space provides, implies that almost all the current multi-decadal trends we’re seeing in climate are the result of human activities. In addition, there’s increasing confidence that human-induced climate change is making extreme events statistically much more likely.
This doesn’t mean every extreme event has a substantial human contribution. But with extreme events such as heat waves, wildfires and intense precipitation, we’re seeing, in event after event, a very clear human fingerprint.
1. Williams, A.P.; Cook, E.R.; Smerdon, J.E.; Cook, B.I.; Abatzoglou, J.T.; Bolles, K.; Baek, S.H.; Badger, A.M.; Livneh, B., 2020: Large contribution from anthropogenic warming to an emerging North American megadrought. Science, 368 (6488), 314-318, doi:10.1126/science.aaz9600.
2. van Oldenborgh, G.J.; van der Wiel, K.; Sebastian, A.; Singh, R.; Arrighi, J.; Otto, F.; Haustein, K.; Li, S.; Vecchi, G.; Cullen, H., 2017: Attribution of extreme rainfall from Hurricane Harvey. Environ. Res. Lett., 12, 124009.
3. Risser, M.D.; Wehner, M.F., 2017: Attributable human-induced changes in the likelihood and magnitude of the observed extreme precipitation during Hurricane Harvey. Geophys. Res. Lett., 44, 12,457–12,464.
How do we work together to create a nation resilient against climate change?
Earlier today, NASA joined forces with FEMA to co-host their Resilient Nation Partnership Network Alliances for Climate Action Virtual Forum Series.
NASA’s researchers, innovators, and pioneers are on the forefront of climate action. NASA’s Earth observation and research supports the Biden administration’s climate agenda, which outlines putting the climate crisis at the center of our country’s foreign policy and national security. President Biden has been clear: The climate crisis requires an all-hands-on-deck, whole-of-government approach.
Of course, we can’t mitigate climate change unless we measure and understand it. That’s NASA’s expertise.
The sad reality is that climate change is already impacting our communities. The cost is enormous. It is loss of life. It is loss of livelihoods. And it is loss of communities. Unless we act, and we act decisively, the poorest among us will suffer disproportionately, and instability will increase – both here at home and abroad.
Over the past year and a half, we have all experienced firsthand the importance of looking ahead and the importance of understanding and planning for potential disasters. The poorest and most vulnerable among us are too often those who pay the highest price for inaction.
The Biden administration has made advancing racial equity and support for underserved communities a top priority. It is such a priority, that President Biden signed an Executive Order to do just that on day one of his presidency.
The Biden administration has also focused on advancing environmental justice, which we know is closely linked with equity. Together, we need to further develop the capacity to monitor and reduce the detrimental impacts of hurricanes and floods. This will have a tremendous impact on America’s underserved communities.
With the clear effects of climate change, the devastation from hurricanes and floods is severe, and growing more severe with each new year.
In the face of disasters, there are significant equity issues when it comes to which communities get support, and when. But we need more. We need continued agency cooperation. We need a mission control center for climate change.
NASA uses a mission control center for every launch and mission. In the case of the International Space Station, it has operated 24/7, 365 for more than two decades. No less effort should be made to reverse the heating of our planet and to restore mother nature’s environmental balance.
NASA is one of the world’s greatest experts in climate science, engaged in a broad range of activities to track and mitigate the effects of climate change. And we are actively focusing on making that data available and useful to U.S. citizens and beyond.
Today I also announced that in addition to our existing Earth Science programs, we are exploring a new concept at NASA: A climate resilience design center that can help state, local, tribal, and territorial governments develop their climate resilience strategies.
This is not something that we can do alone. It is an endeavor that is going to take collaboration with other agencies, like FEMA, and it’s going to take data from commercial companies and from international partners.
But as one of the lead U.S. climate science agencies, NASA will take a leading role in helping our nation, and the world, prepare for the challenges to come.
Our decisions will determine the fate of Earth. Let us protect it. Let us act boldly and with urgency. Let us preserve it for this generation – and generations that follow.
You can watch my full remarks here.
Earth is surrounded by an immense magnetic field, called the magnetosphere. Generated by powerful, dynamic forces at the center of our world, our magnetosphere shields us from erosion of our atmosphere by the solar wind, from particle radiation from coronal mass ejections (eruptions of large clouds of energetic, magnetized plasma from the Sun’s corona into space), and from cosmic rays from deep space. Our magnetosphere plays the role of gatekeeper, repelling these forms of energy that are harmful to life and trapping most of them safely away from Earth’s surface. You can learn more about Earth’s magnetosphere here.
Since the forces that generate our magnetic field are constantly changing, the field itself is also in continual flux, its strength waxing and waning over time. This causes the location of Earth’s magnetic north and south poles to gradually shift, and to even completely flip locations every 300,000 years or so. That might be somewhat important if you use a compass, or for certain animals like birds, fish and sea turtles, whose internal compasses use the magnetic field to navigate.
Some people have claimed that variations in Earth’s magnetic field are contributing to current global warming and can cause catastrophic climate change. However, the science doesn’t support that argument. In this blog, we’ll examine a number of proposed hypotheses regarding the effects of changes in Earth’s magnetic field on climate. We’ll also discuss physics-based reasons why changes in the magnetic field can’t impact climate.
1. Shifts in Magnetic Pole Locations
The position of Earth’s magnetic north pole was first precisely located in 1831. Since then, it’s gradually drifted north-northwest by more than 600 miles (1,100 kilometers), and its forward speed has increased from about 10 miles (16 kilometers) per year to about 34 miles (55 kilometers) per year. This gradual shift impacts navigation and must be regularly accounted for. However, there is little scientific evidence of any significant links between Earth’s drifting magnetic poles and climate.
2. Magnetic Pole Reversals
During a pole reversal, Earth’s magnetic north and south poles swap locations. While that may sound like a big deal, pole reversals are common in Earth’s geologic history. Paleomagnetic records tell us Earth’s magnetic poles have reversed 183 times in the last 83 million years, and at least several hundred times in the past 160 million years. The time intervals between reversals have fluctuated widely, but average about 300,000 years, with the last one taking place about 780,000 years ago.
During a pole reversal, the magnetic field weakens, but it doesn’t completely disappear. The magnetosphere, together with Earth’s atmosphere, continues to protect Earth from cosmic rays and charged solar particles, though a small amount of particle radiation may make it down to Earth’s surface. The magnetic field becomes jumbled, and multiple magnetic poles can emerge in unexpected places.
No one knows exactly when the next pole reversal may occur, but scientists know they don’t happen overnight: they take place over hundreds to thousands of years.
In the past 200 years, Earth’s magnetic field has weakened about nine percent on a global average. Some people cite this as “evidence” a pole reversal is imminent, but scientists have no reason to believe so. In fact, paleomagnetic studies show the field is about as strong as it’s been in the past 100,000 years, and is twice as intense as its million-year average. While some scientists estimate the field’s strength might completely decay in about 1,300 years, the current weakening could stop at any time.
Plant and animal fossils from the period of the last major pole reversal don’t show any big changes. Deep ocean sediment samples indicate glacial activity was stable. In fact, geologic and fossil records from previous reversals show nothing remarkable, such as doomsday events or major extinctions.
3. Geomagnetic Excursions
Recently, there have been questions and discussion about “geomagnetic excursions”: shorter-lived but significant changes in the magnetic field’s intensity that last from a few centuries to a few tens of thousands of years. During the last major excursion, called the Laschamps event, radiocarbon evidence shows that about 41,500 years ago, the magnetic field weakened significantly and the poles reversed, only to flip back again about 500 years later.
While there is some evidence of regional climate changes during the Laschamps event timeframe, ice cores from Antarctica and Greenland don’t show any major changes. Moreover, when viewed within the context of climate variability during the last ice age, any changes in climate observed at Earth’s surface were subtle.
Bottom line: There’s no evidence that Earth’s climate has been significantly impacted by the last three magnetic field excursions, nor by any excursion event within at least the last 2.8 million years.
1. Insufficient Energy in Earth’s Upper Atmosphere
Electromagnetic currents exist within Earth’s upper atmosphere, but the energy they carry is, on global average, a minute fraction of the energy that drives the climate system at Earth’s surface – typically a few milliwatts per square meter or less. To put that into context, the energy budget at Earth’s surface is about 250 to 300 watts per square meter. In other words, the energy governing Earth’s upper atmosphere is roughly 100,000 times smaller than the energy driving the climate system at the surface. There is simply not enough energy aloft to have an influence on climate down where we live.
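The factor-of-100,000 comparison above is simple enough to check directly. This sketch just divides the two energy fluxes quoted in the paragraph (using the upper ends of both ranges as stand-in values):

```python
# Back-of-envelope comparison of the two energy fluxes quoted above:
# electromagnetic currents in the upper atmosphere carry on the order of
# a few milliwatts per square meter, while the surface energy budget is
# roughly 250-300 watts per square meter. Both values are approximate.
upper_atmosphere_w_m2 = 0.003   # ~3 mW/m^2, upper end of "a few milliwatts"
surface_w_m2 = 300.0            # upper end of the ~250-300 W/m^2 range

ratio = surface_w_m2 / upper_atmosphere_w_m2
print(f"Surface energy budget is ~{ratio:,.0f}x larger")
```

Even with generous assumptions for the upper atmosphere, the surface energy budget dwarfs it by about five orders of magnitude.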
2. Air Isn’t Ferrous
Finally, changes and shifts in Earth’s magnetic field polarity don’t impact weather and climate for a fundamental reason: air isn’t ferrous.
Ferrous? Say what?? Bueller? Bueller?
Ferrous means “containing or consisting of iron.” While iron in volcanic ash is transported in the atmosphere, and small quantities of iron and iron compounds generated by human activities are a source of air pollution in some urban areas, iron isn’t a significant component of Earth’s atmosphere. There’s no known physical mechanism capable of connecting weather conditions at Earth’s surface with electromagnetic currents in space.
Solar storms and their electromagnetic interactions only impact Earth’s ionosphere, which extends from the lowest edge of the mesosphere (about 31 miles or 50 kilometers above Earth’s surface) to space, around 600 miles (965 kilometers) above the surface. They have no impact on Earth’s troposphere or lower stratosphere, where Earth’s surface weather, and subsequently its climate, originate.
In short, when it comes to climate, variations in Earth’s magnetic field are nothing to get charged up about.
NASA’s work has generated countless spinoffs that are now on the front lines of the fight against climate change. That shouldn’t be a surprise, since the agency’s missions include studying Earth and improving aircraft efficiency.
But that’s not the only way NASA’s innovations make an impact. Many advances to meet the harsh demands of space travel are also helping to reduce greenhouse gases, improve alternative energy sources, and increase our understanding of the causes and effects of climate change.
Read on for a few examples, and head over to spinoff.nasa.gov/climate-change for a roundup of dozens more.
Trapping Greenhouse Gases
Carbon dioxide, a greenhouse gas, is the most prominent driver of climate change on Earth. On Mars, however, where most of the atmosphere is CO2, the gas could come in handy. Under NASA contracts, one engineer helped develop technology to capture Martian carbon dioxide and break it into carbon and oxygen for other uses, from life support to fuel for a journey home.
Although that early technology never flew, NASA’s Perseverance rover will test out a similar idea, using an experimental system called MOXIE (Mars Oxygen In-Situ Resource Utilization Experiment). Meanwhile, the earlier technology led to a system that now captures natural gas at oil wells, instead of wastefully burning it off and dumping the resulting CO2 into the atmosphere.
And another version of the system helps beer breweries go “greener” by capturing carbon dioxide from the brewing process, rather than venting it, and using it for carbonation instead of buying more.
Conserving energy is a crucial consideration for space travel, and many innovations NASA has come up with in that arena are now widespread in improving energy efficiency on Earth.
For example, NASA helped create a type of reflective insulation to efficiently maintain a comfortable temperature within spacecraft and spacesuits. In the decades since, this insulation has been adapted and used in homes and buildings around the world.
Another material pioneered to insulate cryogenic rocket fuel against the balmy weather around the launch pad at Cape Canaveral, Florida, now saves energy by preserving temperatures at industrial facilities. And a coating invented to protect spacecraft during the extreme heat of atmospheric entry improves the efficiency of incinerators, boilers, and refractories, ovens, and more.
Shrinking Air Travel’s Carbon Footprint
Air travel is a major contributor to human-made greenhouse gases. Designing aircraft to fly more efficiently reduces the amount of fuel they burn, and in turn, their resulting emissions. And many of the improvements that make modern aircraft more efficient come straight from NASA.
In fact, some of the agency’s most significant contributions to aeronautic fuel efficiency can be traced back to the work of a single NASA engineer in the 1960s and ’70s. Richard Whitcomb designed and tested an entirely new wing shape – the supercritical wing – that significantly increased efficiency at high speeds and reduced structural weight.
He then designed upturned wingtips that make use of air vortices that would otherwise create drag. Now incorporated into nearly all commercial planes, these advances combined save billions of dollars’ worth of fuel, along with associated CO2 emissions, every year.
In the decades since, NASA has continued to work with industry partners to improve airplane efficiency, and the agency is now supporting the cutting edge of all-electric flight.
Advancing Renewable Energy
Because there are no fossil fuels on Mars, NASA became interested in wind energy to power future Martian operations. So, the space agency helped a company develop a wind turbine that could operate in a similarly harsh environment – the South Pole. Rugged and designed for easy maintenance and efficiency at extremely low temperatures, more than 800 of the resulting turbines are now generating power on Earth.
Unexpectedly, software NASA supported for improved aircraft design and maintenance has also led to more efficient, long-lasting wind turbines. And several solar panel manufacturers have benefited from the agency’s long reliance on the sun for energy.
Understanding Climate Change
Mountains of data from a fleet of Earth-observing NASA satellites help countless other agencies, researchers, and companies better understand the causes and effects of climate change. The agency has worked with commercial partners to make this data manageable and easier to mine for information. Other companies have benefited from NASA’s support for technology to monitor conditions on the ground and in the oceans and atmosphere, including innovative devices to sense local greenhouse gases and ocean conditions. The resulting data helps to verify and enrich the agency’s models of Earth weather and climate, which span decades, circle the globe, and peer into the future.
NASA has a long history of transferring technology to the private sector. The agency’s Spinoff publication profiles NASA technologies that have transformed into commercial products and services, demonstrating the broader benefits of America’s investment in its space program. Spinoff is a publication of the Technology Transfer program in NASA’s Space Technology Mission Directorate.
For more information on how NASA brings space technology down to Earth, visit:
Ensuring the accuracy of Earth’s long-term global and regional surface temperature records is a challenging, constantly evolving undertaking.
There are lots of reasons for this, including changes in the availability of data, technological advancements in how land and sea surface temperatures are measured, the growth of urban areas, and changes to where and when temperature data are collected, to name just a few. Over time, these changes can lead to measurement inconsistencies that affect temperature data records.
Scientists have been building estimates of Earth’s average global temperature for more than a century, using temperature records from weather stations. But before 1880, there just wasn’t enough data to make accurate calculations, resulting in uncertainties in these older records. Fortunately, consistent temperature estimates made by paleoclimatologists (scientists who study Earth’s past climate using environmental clues like ice cores and tree rings) provide scientists with context for understanding today’s observed warming of Earth’s climate, which has no historic parallel.
Over the past 140 years, we’ve literally gone from making some temperature measurements by hand to using sophisticated satellite technology. Today’s temperature data come from many sources, including more than 32,000 land weather stations, weather balloons, radar, ships and buoys, satellites, and volunteer weather watchers.
To account for all of these changes and ensure a consistent, accurate record of our planet’s temperature variations, scientists use information from many sources to make adjustments before incorporating temperature data into analyses of regional or global surface temperatures. This allows them to make “apples to apples” comparisons.
Let’s look more closely at why these adjustments are made.
To begin with, some temperature data are gathered by humans. As all of us know, humans can make occasional mistakes in recording and transcribing observations. So, a first step in processing temperature data is to perform quality control to identify and eliminate any erroneous data caused by such errors – things like missing a minus sign, misreading an instrument, etc.
Changes to Land Weather Stations
Next are changes to land weather stations. Temperature readings at weather stations can be affected by the physical location of the station, by what’s happening around it, and even by the time of day that readings are made.
For example, if a weather station is located at the bottom of a mountain and a new station is built on the same mountain but at a higher location, the changes in latitude and elevation could affect the station’s readings. If you simply combined the old and new records, the station’s readings would appear to drop when the new station opens – not because the climate changed, but because the station did. Similarly, if a station is moved away from a city center to a less developed location like an airport, cooler readings may result, while if the land around a weather station becomes more developed, readings might get warmer. Such differences are caused by how ground surfaces in different environments absorb and retain heat.
Then there are changes to the way that stations collect temperature data. Old technologies become outdated or instrumentation simply wears out and is replaced. Using new equipment with slightly different characteristics can affect temperature measurements.
Data adjustments may also be required if there are changes to the time of day that observations are made. If, for example, a network of weather stations adopts a uniform observation time, as they did in the United States, stations making such a switch will see their data affected, because temperature is dependent on time of day.
Scientists also make adjustments to account for station temperature data that are significantly higher or lower than that of nearby stations. Such out-of-the-ordinary temperature readings typically have absolutely nothing to do with climate change but are instead due to some human-produced change that causes the station readings to be out of line with neighboring stations. By comparing data with surrounding stations, scientists can identify abnormal station measurements and ensure that they don’t skew overall regional or global temperature estimates.
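The neighbor-comparison idea described above can be sketched in a few lines. This is a deliberately simplified illustration – real homogenization algorithms are far more sophisticated, and the station values and the threshold used here are invented for demonstration:

```python
# Hypothetical sketch of the neighbor-comparison quality check described
# above: flag a station whose reading departs far from the median of
# nearby stations. Operational methods are much more sophisticated; the
# data and the 5-degree threshold here are invented for illustration.
from statistics import median

def flag_outlier(station_temp_c, neighbor_temps_c, threshold_c=5.0):
    """Return True if a reading differs from the neighborhood median by more than threshold_c."""
    return abs(station_temp_c - median(neighbor_temps_c)) > threshold_c

neighbors = [14.2, 13.8, 14.5, 13.9, 14.1]
print(flag_outlier(14.0, neighbors))   # a typical reading, consistent with neighbors
print(flag_outlier(21.5, neighbors))   # a suspiciously warm reading
```

A reading flagged this way isn’t automatically wrong – it’s simply set aside for closer inspection so a single miscalibrated instrument can’t skew a regional average.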
In addition, the number of land weather stations has increased over time, forming denser networks that improve the accuracy of temperature estimates in those regions. Scientists take these improvements into account so that data from areas with dense networks can be appropriately compared with data from areas with sparser networks.
Changes to Sea Surface Temperature Measurements
Much like the trends on land, sea surface temperature measurement practices have also changed significantly.
Before about 1940, the most common method for measuring sea surface temperature was to throw a bucket attached to a rope overboard from a ship, haul it back up, and read the water temperature. The method was far from perfect. Depending on the air temperature, the water temperature could change as the bucket was pulled from the water.
During the 1930s and ‘40s, scientists began measuring the temperature of ocean water piped in to cool ship engines. This method was more accurate. The impact on long-term ocean surface temperature records was to reduce the warming trend in global ocean temperatures that had been observed before that time. That’s because temperature readings from water drawn up in buckets prior to measurement are, on average, a few tenths of a degree Celsius cooler than readings of water obtained at the level of the ocean in a ship’s intake valves.
Then, beginning around 1990, measurements from thousands of floating buoys began replacing ship-based measurements as the commonly accepted standard. Today, such buoys provide about 80% of ocean temperature data. Temperatures recorded by buoys are slightly lower than those obtained from ship engine room water intakes for two reasons. First, buoys sample water that is slightly deeper, and therefore cooler, than water samples obtained from ships. Second, the process of passing water samples through a ship’s inlet can slightly heat the water. To compensate for the addition of cooler water temperature data from buoys to the warmer temperature data obtained from ships, ocean temperatures from buoys in recent years have been adjusted slightly upward to be consistent with ship measurements.
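The ship-versus-buoy reconciliation described above amounts to applying a small calibration offset. In this sketch the offset value is a stand-in, not the operational figure used by any particular data set:

```python
# Illustrative sketch of the ship-vs-buoy adjustment described above.
# Analyses find buoys read on the order of a tenth of a degree Celsius
# cooler than ship engine-intake measurements; the exact offset below is
# an assumed placeholder, not an operational value.
BUOY_OFFSET_C = 0.12  # assumed mean ship-minus-buoy difference (deg C)

def adjust_buoy_temps(buoy_temps_c, offset=BUOY_OFFSET_C):
    """Shift buoy readings upward so they're comparable with ship records."""
    return [t + offset for t in buoy_temps_c]

print(adjust_buoy_temps([18.40, 18.55, 18.31]))
```

Note that the direction of the adjustment is a convention: the goal is a single internally consistent record, so the buoy data could equally well be left alone and the ship data adjusted downward without changing the resulting temperature trend.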
So Many Climate Data Sets, So Little Disagreement
Currently, there are multiple independent climate research organizations around the world that maintain long-term data sets of global land and ocean temperatures. Among the best known are those produced by NASA, the National Oceanic and Atmospheric Administration (NOAA), the U.K. Meteorological Office's Hadley Centre/Climatic Research Unit (CRU) of the University of East Anglia, and Berkeley Earth, a California-based non-profit.
Each organization uses different techniques to make its estimates and adjusts its input data sets to compensate for changes in observing conditions, using data processing methods described in peer-reviewed literature.
Remarkably, despite the differences in methodologies used by these independent researchers, their global temperature estimates are all in close agreement. Moreover, they also match up closely to independent data sets derived from satellites and weather forecast models.
NASA’s GISTEMP Analysis
One of the leading data sets used to conduct global surface temperature analyses is the NASA Goddard Institute for Space Studies (GISS) surface temperature analysis, known as GISTEMP.
GISTEMP uses a statistical method that produces a consistent estimated temperature anomaly series from 1880 to the present. A “temperature anomaly” is a calculation of how much colder or warmer a measured temperature is at a given weather station compared to an average value for that location and time, which is calculated over a 30-year reference period (1951-1980). The current version of GISTEMP includes adjusted average monthly data from the latest version of the NOAA/National Centers for Environmental Information (NCEI) Global Historical Climatology Network analysis and its Extended Reconstructed Sea Surface Temperature data.
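The anomaly concept is simple to show concretely. This minimal sketch compares one invented station reading against an invented reference-period mean; the numbers are illustrative, but the arithmetic is exactly the comparison GISTEMP performs against its 1951-1980 baseline:

```python
# Minimal sketch of the "temperature anomaly" idea: compare a reading to
# the station's average for the same month over a reference period
# (GISTEMP uses 1951-1980). The data values below are invented.

def anomaly(observed_c, baseline_mean_c):
    """Departure of an observation from the station's reference-period mean (deg C)."""
    return observed_c - baseline_mean_c

# Invented example: a July monthly mean of 24.3 C at a station whose
# 1951-1980 July average was 22.8 C.
print(f"Anomaly: {anomaly(24.3, 22.8):+.1f} C")
```

Working in anomalies rather than absolute temperatures is what lets scientists combine stations at different elevations and latitudes: a mountaintop and a valley floor have very different absolute temperatures, but their departures from their own baselines can be averaged meaningfully.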
GISTEMP uses an automated process to flag abnormal records that don’t appear to be accurate. Scientists then perform manual inspections on the suspect data.
GISTEMP also adjusts to account for the effects of urban heat islands, which are differences in temperatures between urban and rural areas.
The procedure used to calculate GISTEMP hasn’t changed significantly since the mid-1980s, except to better account for data from urban areas. While the growing availability of better data has led to adjustments in GISTEMP’s regional temperature averages, the adjustments haven’t impacted GISTEMP’s global averages significantly.
Raw data from an individual station are never altered. Instead, any station showing abnormal data – whether from changes in measurement method, changes in its immediate surroundings, or apparent errors – is compared to reference data from neighboring stations with similar climate conditions in order to identify and remove abnormal values before they are input into the GISTEMP method. While such data adjustments can substantially impact some individual stations and small regions, they barely change global average temperature trends.
In addition, results from global climate models are not used at any stage in the GISTEMP process, so comparisons between GISTEMP and model projections are valid. All data used by GISTEMP are in the public domain, and all code used is available for independent verification.
The Bottom Line
Independent analyses conclude the impact of station temperature data adjustments is not very large. Upward adjustments of global temperature readings before 1950 have, in total, slightly reduced century-scale global temperature trends. Since 1950, however, adjustments to input data have slightly increased the rate of global warming recorded by the temperature record by less than 0.1 degree Celsius (less than 0.2 degrees Fahrenheit).
A final note: while adjustments are applied to station temperature data being used in global analyses, the raw data from these stations never changes unless better archived data become available. When global temperature data are processed, the original records are preserved and are available to anyone who wants them, at no cost, online. For example, the NOAA National Climatic Data Center's U.S. and global records are freely accessible online.
The year 2020 will be remembered for many things, not the least of which were a series of devastating fires around the globe that bear the fingerprints of climate change. From Australia and South America’s Amazon and Pantanal regions, to Siberia and the U.S. West, wildfires set new records and made news year-round.
It was an especially bad year for wildfires on the U.S. West Coast. Five of California’s 10 largest wildfires on record happened in 2020, and the state set a new record for acres burned. According to CAL FIRE, the state’s Department of Forestry and Fire Protection, more than 9,600 wildfires burned nearly 4.2 million acres through mid-December, causing more than 30 fatalities and damaging or destroying nearly 10,500 structures.
The Golden State wasn’t alone. Oregon, Washington, and Colorado were also particularly hard hit. In fact, as of mid-December 2020, the National Interagency Fire Center reported more than 10.6 million acres burned and nearly 17,800 buildings destroyed across its seven geographic area coordination centers in the western half of the contiguous United States.
It was the fire equivalent of a perfect storm. Record drought conditions across the Western United States in late 2019 extended into early 2020, and were followed by the hottest summer on record in the Northern Hemisphere. Add in unusually dry air, strong wind events, and an outbreak of summer thunderstorms in Northern California in August, and conditions were ripe for a dangerous fire season.
Natasha Stavros is an applied science system engineer at NASA’s Jet Propulsion Laboratory in Southern California who studies wildfires. She says that not only is the U.S. West experiencing more frequent wildfires, but they’re also happening at the same time, putting a strain on resources. They’re also bigger, more severe, faster-moving, and more destructive than ever before, with 15 of the 20 most destructive wildfires in California history occurring within the past decade.
Stavros attributes these trends to three primary factors: a changing climate, greater availability of fuel, and the expansion of urban areas, which brings with it more ignitions.
Climate Change: A Powerful Catalyst
“Climate affects how long, how hot and how dry fire seasons are,” she said. “As climate warms, we’re seeing a long-term drying and warming of both air and vegetation.”
In recent decades, the U.S. West has warmed, and the frequency and severity of heat waves and droughts have increased. According to the National Oceanic and Atmospheric Administration (NOAA), temperatures in California have increased approximately 2 degrees Fahrenheit (1.1 degrees Celsius) since the beginning of the 20th century. This has dried out the air. Fire seasons are also starting earlier and ending later each year, while snow packs are shrinking, leading to earlier spring snowmelt and longer, more intense dry seasons.
These warmer and drier conditions are also making U.S. Western wildfires more severe. A recent study led by Sean Parks of the U.S. Forest Service finds the amount of Western U.S. land burned by “high-severity” wildfires (fires that destroy more than 95 percent of trees) has increased 800 percent since 1985.
More Fuel to Burn
Another factor driving changes in U.S. Western wildfires is a greater availability of fuel. Drier air stresses vegetation, making forests more susceptible to severe wildfires, while droughts are creating more dead fuel. But, as Stavros explains, there are limits.
“Fire is both fuel- and flammability-limited,” she said. “Take the state of Washington. You have lots of trees, but it tends to be really wet and cold there, so fires are limited by the flammability of the fuels. In a place like Nevada, however, the amount of fuel is limited, but it tends to be dry. Droughts increase fires in flammability-limited areas, but don’t have an impact in fuel-limited areas. Ironically, you have to have rain to have a fire.”
Fuels in the Western U.S. are also building up due to a century of intentional wildfire suppression. “Prescribed fires are important to reduce fuels, while mitigating the effects of smoke,” she said. “For example, ozone, regulated by the Clean Air Act, is problematic in the summer season when conditions are optimal for ozone formation. Wildfire emissions can increase these concentrations. Altering the timing of smoke emissions through the use of prescribed burning so that emissions occur outside of the ozone season may have a positive effect and reduce health impacts.”
Ignition Sources on the Rise
Yet another factor driving changes in Western U.S. wildfires is a greater number of ignition sources, both natural and human-caused.
Wildfires caused by lightning tend to occur in remote areas that are harder for firefighters to reach. These lightning-triggered wildfires are occurring more frequently. According to the U.S. Forest Service, between 1992 and 2015, 44 percent of Western U.S. wildfires were triggered by lightning. Those fires were responsible for 71 percent of all land burned. Some studies predict climate change will increase the frequency of lightning in the future, but further research is needed.
Human-caused fires are also on the rise, due to increased human development of land at what’s known as the wildland-urban interface – the edge of wildland areas. This significantly increases opportunities for both accidental and intentionally set wildfires. It also tends to make these fires more destructive to lives and property.
How Wildfires Are Impacting Climate
While the impact of climate change on wildfires is well-established, wildfires are also affecting climate, with associated impacts on ecosystems, air and water quality, and human health. These climate impacts may be significant.
Wildfires release carbon emissions that affect climate and drive climate change-related events that contribute to even more wildfires. The specific type of emissions they produce is determined by what they burn and how complete the combustion process is. The largest amounts of carbon emitted are in the form of carbon dioxide - a powerful greenhouse gas - and carbon monoxide. The quantity of each gas depends on whether a fire is flaming or smoldering. Dry fuels combust more easily and are more likely to be flaming.
To put the carbon dioxide emissions from wildfires into perspective, September 2020 data from the Global Fire Emissions Database show that California wildfires in 2020 generated more than 91 million metric tons of carbon dioxide. That’s roughly 30 million metric tons more carbon dioxide emissions than the state emits annually from power production.
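The comparison above can be checked with simple arithmetic. The fire total is the Global Fire Emissions Database figure quoted in the text; the power-sector figure is inferred from the "roughly 30 million metric tons more" statement, not an independently sourced statistic.

```python
# Back-of-envelope check of the comparison in the text. The
# power-sector figure is implied by the article's "30 million metric
# tons more" comparison, not an independent statistic.
fire_co2_mt = 91.0    # million metric tons CO2, CA wildfires, 2020 (GFED)
power_co2_mt = 61.0   # implied annual CA power-production CO2 emissions

excess = fire_co2_mt - power_co2_mt
ratio = fire_co2_mt / power_co2_mt
print(f"Wildfires emitted {excess:.0f} Mt more CO2, "
      f"about {ratio:.1f}x the power sector's annual emissions")
```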
Wildfires also emit aerosols (tiny, floating solid and/or liquid particles of organic and inorganic matter). These aerosols can come in the form of black carbon, brown carbon, or both. When a fire is really hot, it produces more black carbon, commonly known as soot, char, or ash. When fires are less hot and smoldering, they produce more brown carbon, which absorbs certain wavelengths of sunlight, giving it a brown or yellow appearance. Both types of carbon warm Earth’s climate, but black carbon has a stronger warming effect. Scientists currently know more about black carbon and its effects on climate than they do about brown carbon.
Scientists are also working to better understand the amount of ammonia wildfires release. Through chemical reactions in the atmosphere, ammonia produces two secondary aerosols - ammonium sulfate and ammonium nitrate – both of which have a cooling effect on climate. Ammonia also contributes to the formation of brown carbon.
Recently, scientists studying the devastating Australia wildfires of late 2019-early 2020 discovered that an outbreak of a rare type of fire-generated thundercloud had punched into Earth’s stratosphere, the second lowest layer of Earth’s atmosphere. The large quantity of smoke that made it into the stratosphere then circled the globe, reducing the amount of sunlight that reached the ground for several months. The smoke slightly cooled Earth’s surface by an as-yet undetermined amount (likely a small fraction of a degree, similar to the cooling effect of a moderate volcanic eruption). The event illustrates how large future wildfires may, at times, have a slight cooling effect on climate.
Studying the trace gas and aerosol emissions from wildfires and prescribed burns was the objective of a joint 2019 NASA-NOAA field campaign called Fire Influence on Regional to Global Environments Experiment – Air Quality (FIREX-AQ). FIREX-AQ combined aircraft measurements, ground sampling and satellite data to correlate wildfire emissions to fuel and fire conditions on the ground; study wildfire plumes, including how they’re transported in the atmosphere and how they impact air quality downwind; and assess how effective satellites are in estimating fire emissions.
The air quality impacts of the 2020 U.S. Western wildfires were truly extraordinary, at times making day as dark as night and tinging skies in major urban areas a surreal red. Some locations recorded air quality readings higher than 500 on the Air Quality Index scale (anything above 300 is considered hazardous to health). But smoke doesn’t know state or national boundaries - it drifted east thousands of miles across many parts of the United States, north into Canada and even as far as Europe. Researchers at Stanford University in Stanford, California, estimated California wildfire smoke likely led to at least 1,200 and as many as 3,000 excess California deaths between Aug. 1 and Sept. 10, 2020 alone.
Another climate impact of U.S. Western wildfires is their role in converting ecosystems from one type to another. Wildfires are necessary for healthy forest ecosystems. They help clear the forest floor of dead organic material, allow sunlight to reach it, add nutrients to the soil, provide habitat for animals and birds by clearing heavy brush so new plants can grow, and kill disease and insect infestations, among their many benefits. But when their frequency or severity is disturbed, it can throw things dangerously out of whack. In time, this may lead to the loss of some forests, as climate change increases the frequency of fires and makes it harder for ecosystems to reestablish.
“When you have major disturbance events like droughts and fires back-to-back in quick succession, you can change ecosystems,” Stavros said. “We’re starting to see this in some regions as wildfire frequency increases. Southern California’s mountains are covered with chaparral shrubs whose seeds are only triggered to open by the extreme heat of a wildfire, and they’ve adapted to burning every seven to 15 years. If you increase the wildfire frequency, you begin depleting the seed bank and the chaparral may not regrow, because the only seeds available for growth are often from invasive species. In places like Arizona, Colorado, Washington, Oregon, and Idaho, we’re starting to see forests turn into prairies and grasslands. It’s not yet widespread, but it’s happening.”
Of course, the climate impacts of wildfires aren’t limited to the contiguous Western United States. In Alaska, increased wildfire activity is causing fires to burn through dense peatlands, releasing significant quantities of methane and carbon dioxide that exacerbate global warming. Other areas of global concern include Australia; Southeast Asia; the Amazon; Siberia, Canada and other parts of the Arctic; and even the Mediterranean region. The climate impacts of fires in each of these regions vary.
“The worst fires for climate are actually coming from Southeast Asia, the Amazon, and the Arctic, because you have carbon that’s been sitting there for a long time and then put back into the atmosphere when it burns,” Stavros said.
Adapting to a Fierier Future
One thing is clear: fires are likely to become an increasingly consequential fact of life as the U.S. West continues to get warmer and drier. Society will need to adapt.
“The impact of fire is much more than just area burned,” Stavros said. “It’s lives lost, infrastructure damaged, degraded air quality. We can use our scientific understanding to inform systematic approaches to managing how we live in a world with fire: how and where we build, how and where we perform maintenance on power lines, etc.
“Everybody cares when they can see and smell the smoke, but when it’s gone, they stop,” she added. “But the problem isn’t going to go away.”
Recently, an international research team published a comprehensive review in the journal Reviews of Geophysics on our state of understanding of Earth's "climate sensitivity," a key measure of how much our climate will change as greenhouse gas emissions increase. Essentially, by narrowing the range of estimates, the researchers found that climate sensitivity isn’t so low that it should be ignored, but it’s also not so high that there is no hope for the planet’s recovery.
We asked the two NASA authors on the study — Kate Marvel, jointly of Columbia University in New York and NASA’s Goddard Institute for Space Studies (GISS) in New York; and GISS Director Gavin Schmidt — to discuss their roles in the study and its significance for understanding the impacts of our warming world on climate.
Q. What exactly is climate sensitivity and why is it important to know its true value?
Schmidt: “We know from studies of the past that Earth’s climate can change dramatically. The evidence shows that the amount of greenhouse gases in the atmosphere can vary over time and make a big difference to the climate. Scientists try to quantify that by estimating how much the surface air temperature, averaged over the whole globe, would change if we doubled the amount of one typical but specific greenhouse gas – carbon dioxide. That number, called climate sensitivity, has quite a wide uncertainty range, and that has big implications for how serious human-made climate change will be.”
Q. Your team was able to narrow the range of estimates of Earth's climate sensitivity by more than 43 percent, from the previously accepted range of 1.5 to 4.5 Kelvin first established in 1979 (roughly 3 to 9 degrees Fahrenheit), to a narrower range of 2.6 to 3.9 Kelvin (roughly 4.5 to 7 degrees Fahrenheit). Why is it important for scientists to narrow this range of uncertainty? What does it mean in practical terms to be able to reduce uncertainties in measuring climate sensitivity?
Schmidt: “Scientists would like to reduce that uncertainty so that we can have more confidence in how we need to mitigate and adapt to future changes. For instance, how much sea level might rise, or how heat waves will get worse, or rainfall patterns change, are tied to the climate sensitivity combined with our actions in changing the atmosphere. A higher climate sensitivity would mean we would have to do more to avoid big changes, while a lower value would mean we’d have more time to adapt. It’s useful to note that we expect to reach double carbon dioxide levels later this century, and that while a few degrees might not seem like much, it's a big deal for the planet. The difference between forests beyond the Arctic Circle or glaciers extending down to New York City is only a range of about 8 K (about 14 degrees Fahrenheit) in the global average, while it changes sea level by 150 meters (more than 400 feet)!”
Q. How can better estimates of climate sensitivity impact policy decisions?
Marvel: “The most important thing about climate sensitivity is that it's not zero. Increasing atmospheric carbon dioxide definitely makes it warmer and increases the risk of extreme weather like drought, downpours, and heat waves. But better estimates of climate sensitivity are important for motivating action. Our results show that it would be foolish to rely on nature to save us from climate change — we don't think it's likely that sensitivity is low. But conversely, it's unlikely that climate sensitivity is so high as to make action pointless.”
Schmidt: “I’m not sure that our policy decisions are that finely tuned to the science of climate sensitivity other than knowing that climate really is sensitive to increasing greenhouse gases. Many climate policies are robust to those uncertainties, but many adaptation decisions will depend on knowing how bad things will get.”
Q. Why has it been so difficult over the past 40 years to narrow this range? What made this new estimate possible?
Schmidt: “There are three main reasons why this has been difficult. First, knowledge of past climate change has been difficult to quantify in globally coherent ways. Of course, we have known about the ice ages for a century or more, but getting accurate estimates of the global changes in temperature, greenhouse gases, and ice sheets has taken time and has needed many scientists working on many different aspects of the problem to come together. Second, the climate change signal has taken time to come out of the ‘noise’ of normal variability. In the 1980s and 1990s, people were still arguing about whether the warming over the 20th century was significant, but with another 20 years of record-breaking temperatures, that has been very clearly shown. Third, our understanding of the processes in the climate that affect sensitivity — clouds, water vapor, aerosols, etc. — has improved immensely with the development of satellite remote sensing, and every decade we are producing better and more useful information. But as these lines of evidence have matured, the need to come up with new methods to tie them all together coherently has become acute — and that was the impetus for this roughly 4-year effort.”
Marvel: “Yes, and in modeling, clouds are some of the biggest wildcards. See go.ted.com/katemarvel.”
Q. What types of evidence did the team consider in reaching its conclusions? Where do the lines of evidence agree and disagree most substantially?
Schmidt: “There are three main sources of information: changes since the late 19th century that have been measured in real time, our understanding of physical processes (particularly clouds), and new and more complete information from periods in the paleoclimate record (the geological past) where the planet was significantly cooler or warmer than today. All of the lines of evidence are mostly commensurate, but specific issues mean that the recent record isn’t good at constraining the high-end values because of the imprecise role of aerosols, and paleoclimate change is less able to constrain the low end because of the uncertain nature of that data. Together, however, we can mostly rule those tails out.”
Q. What were a few of the most significant findings for each of the three lines of evidence studied (feedback processes, the historical warming record, and paleoclimate records)?
Marvel: “For a long time, many people thought that sensitivity estimates derived from paleoclimate — the far past — were incompatible with estimates derived from more recent observations. But there's a difference between a past climate state in which the planet has reached an equilibrium — a ‘new normal’ — and our current climate, where things are very much in flux and continuing to change. There is some uncertainty in just how different the future will look from what we're experiencing now — it's possible we're moving into a new world for which we don't have a recent analogue. And when we take that uncertainty into account in a rigorous way, we find that the far past and the near future may not be telling us such different things after all.”
Schmidt: “What was interesting was that by starting off with a view of climate sensitivity that was a little more sophisticated than people had used previously, we found that there was more coherence among the different lines of evidence than others had found, and since the information we are using really is very independent, that allowed us to narrow the uncertainty.”
Q. Your team used a "Bayesian approach" to calculate your estimated range of climate sensitivity. In layman's terms, what is that?
Schmidt: “A Bayesian approach is really just a mathematical representation of how we do science in general. We have an initial hypothesis, we get some evidence that may or may not support it, and then we update our understanding based on that evidence. And then we do it again (and again, and again, etc.). Over time, and as more evidence accumulates, we hopefully hone in on the most correct answer. Using Bayesian methods allowed us to pull together disparate threads of evidence in a coherent way — allowing for different degrees of confidence in each of the lines of evidence. What is great is that in the future, as more evidence is discovered, we can continue the process and update our understanding again.”
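The updating process Schmidt describes can be illustrated with a toy calculation. This is an assumption-laden sketch for illustration only: the study's actual priors, likelihoods, and lines of evidence are far more sophisticated, and the Gaussian (mean, sd) pairs below are invented stand-ins.

```python
# Toy illustration of Bayesian updating of climate sensitivity S (in
# kelvins): start from a broad prior, multiply in one "line of
# evidence" at a time, and renormalize. The numbers are hypothetical.
from math import exp, pi, sqrt

def gaussian(x, mean, sd):
    return exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * sqrt(2 * pi))

# Grid of candidate sensitivity values, 0 to 10 K in 0.01 K steps.
grid = [i * 0.01 for i in range(1001)]

# Broad prior: we start out quite uncertain (sd of 2 K).
posterior = [gaussian(s, 3.0, 2.0) for s in grid]

# Each (mean, sd) pair stands in for one independent line of evidence,
# e.g. process understanding, the historical record, paleoclimate.
for mean, sd in [(3.1, 1.0), (3.3, 1.2), (3.2, 0.9)]:
    posterior = [p * gaussian(s, mean, sd) for s, p in zip(grid, posterior)]
    total = sum(posterior)
    posterior = [p / total for p in posterior]

# As evidence accumulates, the posterior narrows around ~3.2 K.
post_mean = sum(s * p for s, p in zip(grid, posterior))
print(round(post_mean, 2))
```

The key design point mirrors the quote: each line of evidence is folded in with its own uncertainty, and new evidence can be incorporated later by simply repeating the update step.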
Q. What role did global climate models play in the team's findings?
Marvel: “Complex climate models are useful tools. But in this paper, we relied largely on observations: satellite and ground-based measurements of recent trends, paleoclimate datasets, and basic physical principles.”
Schmidt: “Climate models help frame the questions we are asking and can be examined to see how climate patterns in space and time connect to things we can directly observe. But we know that climate models have a lot of uncertainty related (for instance) to cloud processes, and so we didn’t use them directly to estimate sensitivity. You could, however, use our results to assess whether a climate model has a sensitivity that is within our independently constrained range.”
Q. Your new estimated range of Earth's climate sensitivity finds the value is around the mid-point of the previous estimate range rather than on the lower or higher end. What does that mean in practical terms for projections of Earth's global temperatures and Earth's climate in this century?
Schmidt: “It means that climate sensitivity is not so low that we can ignore it, nor is it so high that we should despair. Ultimately, it tells us that while human-made climate change is (and will continue to be) a problem, our actions as a society can change that trajectory.”
Q. How likely is it that Earth's climate sensitivity could be higher than 3.9 Kelvin? Lower than 2.6 Kelvin?
Schmidt: “There are subjective elements to the analysis we performed, and other people could decide to weight things a little differently. We explored some of these alternative choices and that broadens the uncertainty a little, but basically, we estimate that there is about a one-in-six chance that it was less than the low end, and one-in-six that it was higher than the high end. That’s not impossible, but, if true, then a lot of our assessments would have to be quite a ways off.”
Q. The concentration of carbon dioxide in Earth's atmosphere is currently around 414 ppm (parts per million). What are the projections for future carbon dioxide increases under the range of current emissions scenarios and how does having a better estimate of climate sensitivity improve our understanding of how our climate may change in the future?
Schmidt: “The future trajectory of carbon dioxide will depend on what we do as a society — if we decide to burn all the fossil fuels we can find, we could reach 900 ppm by the end of the century, but if we aggressively reduce emissions, we could stay below 500 ppm, maybe lower. The climate sensitivity tells us what we can expect in terms of temperature — between another 1 or 2 degrees Celsius (1.8 or 3.6 degrees Fahrenheit) for the low scenario, which would be very serious, to between 4 and 7 degrees Celsius (7.2 and 12.6 degrees Fahrenheit) for the high end scenario, which would be a disaster.”
Q. What about your study did you find most surprising?
Marvel: “How difficult it was to get everyone with all their different expertise working together on a big, joint effort. In the end, I think everyone realized how important it was and how this will be a strong basis for everyone’s future research.”
Schmidt: “How consistent the results were across all three different approaches.”
Q. What was your role in the study?
Marvel: “I was one of the lead scientists on the section looking at historical constraints on sensitivity, making sure that we took into account the differences in how things changed over the 20th century and how things will change going forward, and working to make sure that the uncertainties in historical climate records were properly included.”
Schmidt: “I worked mainly on the paleoclimate section, making sure that we used the most appropriate data from key periods in the planet’s history (like the last ice age or the last time carbon dioxide was as high as it is now — some 3 million years ago).”
Global sea level rise is complex as well. To begin with, it has multiple causes, including the thermal expansion of the ocean as it warms, runoff of meltwater from land-based ice sheets and mountain glaciers, and changes in water that’s stored on land. These factors combine to raise the height of our global ocean about 3.3 millimeters (0.13 inches) every year. That rate is accelerating by another 1 millimeter per year (0.04 inches per year) every decade or so.
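Using only the two numbers above (a current rate of 3.3 mm per year and an acceleration of about 1 mm per year per decade, i.e. 0.1 mm/yr²), one can extrapolate a rough trajectory. This constant-acceleration sketch is for illustration only and is not a substitute for process-based projections.

```python
# Rough constant-acceleration extrapolation from the figures in the
# text: h(t) = r0*t + 0.5*a*t^2. Illustrative only.
RATE_MM_PER_YR = 3.3     # current rate of global sea level rise
ACCEL_MM_PER_YR2 = 0.1   # ~1 mm/yr of extra rate every decade

def projected_rise_mm(years):
    """Cumulative sea level rise (mm) after `years`, assuming the
    current rate and a constant acceleration."""
    return RATE_MM_PER_YR * years + 0.5 * ACCEL_MM_PER_YR2 * years ** 2

# Over the 79 years to 2100, this simple extrapolation gives roughly
# 570 mm, i.e. over half a meter of global-mean rise.
print(round(projected_rise_mm(79)))  # 573
```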
Another factor that makes sea level rise complex is that it’s not uniform around the globe. If you look at a global map of sea level rise, you’ll find it’s happening rapidly in some places and more slowly in others. This means that although sea level rise affects coastal areas all over our ocean planet, some regions feel its effects sooner and more severely than others. This is reflected in future projections of sea level rise, with many cities in Asia expected to be among the hardest hit localities. Here in the United States, cities expected to see the worst impacts include New York, Miami and New Orleans, to name but a few.
Indeed, at any given place and time around our planet, sea level rise varies. But why is that? It turns out that when it comes to sea level rise, it’s all local. And it’s all relative.
Relative Sea Level
“Relative sea level” refers to the height of the ocean relative to land along a coastline. Common causes of relative sea level change include:
- Changes due to heating of the ocean, and changes in ocean circulation
- Changes in the volume of water in the ocean due to the melting of land ice in glaciers, ice caps, and ice sheets, as well as changes in the global water cycle
- Vertical land motion (up or down movements of the land itself at a coastline, such as sinking caused by the compaction of sediments, or the rise and fall of land masses driven by the movement of continental or oceanic tectonic plates)
- Normal, short-term, frequent variations in sea level that have always existed, such as those associated with tides, storm surges, and ocean waves (swell and wind waves). These variations can be on the order of meters or more (discussed in more detail in our previous blog post).
Let’s look at these factors more closely.
When you heat up water, it expands and takes up more space. How much it expands depends on how deep the warming occurs as well as the temperature of the water to begin with. For example, in Earth’s tropics, a 1-degree Celsius (1.8 degrees Fahrenheit) warming in the temperature of the top 100 meters (328 feet) of the ocean raises sea level there by about 3 centimeters (1.2 inches). This thermal expansion of the ocean is responsible for between one-third and one-half of the overall global sea level rise observed over the last two decades. Because Earth’s ocean isn’t warming at the same rate everywhere, it results in regional differences in relative sea level rise, with areas that are warming faster seeing faster sea level rise.
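The worked example above can be verified with one line of arithmetic. The thermal expansion coefficient used here, about 3 × 10⁻⁴ per degree Celsius, is an assumed round number typical of warm surface seawater, not a figure from the article.

```python
# Rough check of the tropics example: warming a 100-meter layer of
# ocean by 1 degree C expands it by roughly 3 cm. The expansion
# coefficient is an assumed typical value for warm surface seawater.
ALPHA_PER_C = 3.0e-4   # thermal expansion coefficient of warm seawater
LAYER_M = 100.0        # depth of the warmed layer, meters
DELTA_T_C = 1.0        # warming, degrees Celsius

rise_m = ALPHA_PER_C * LAYER_M * DELTA_T_C
print(f"{rise_m * 100:.1f} cm")  # 3.0 cm
```

Because the coefficient itself grows with water temperature, the same warming expands warm tropical water more than cold polar water, which is one reason sea level rise is regionally uneven.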
Changes in ocean circulation also contribute to regional sea level differences. For example, in the United States, El Niño, a cyclical, naturally-occurring ocean circulation pattern of warming (in the central and eastern tropical Pacific Ocean) and cooling (in the western tropical Pacific Ocean) can temporarily raise relative sea level along the West Coast by more than a foot for up to a couple of years. Similarly, along the U.S. East Coast, the speedup or slowdown of the major ocean current known as the Gulf Stream can temporarily add or subtract as much as 5 centimeters (2 inches) of sea level height to local coastlines.
Next, there’s melting land ice in the Greenland and Antarctic ice sheets and Earth’s glaciers and ice caps. The largest contribution is from Greenland, which loses hundreds of billions of tons of ice a year and is a major contributor to sea level rise across the globe. As Greenland loses ice, the land beneath its ice sheet rises as the weight of the ice sheet is removed. As a result, Greenland itself doesn’t see any local sea level rise.
But all of its melted ice — currently averaging 281 gigatons a year, as measured by the U.S./German Gravity Recovery and Climate Experiment (GRACE) and GRACE Follow-on (GRACE-FO) satellite missions — has to go somewhere. Gravity causes it to flow into the ocean, causing sea level to rise thousands of miles away. Data from GRACE-FO tell us that melting land ice in glaciers, ice caps, and ice sheets contributed about two-thirds of global sea level rise during the last decade.
As land ice in Greenland, Antarctica and elsewhere melts, it changes Earth’s gravity field and slightly shifts the direction of Earth’s rotation. This causes uneven changes in sea level across the globe. Each melting ice mass around the world creates its own unique pattern of sea level change in the global ocean. For example, when ice melts in Antarctica, the amount of sea level rise it generates in California and Florida is up to 52 percent greater in those locations than if the global ocean just filled up uniformly, like water in a bathtub. Scientists use gravity data from the GRACE-FO mission to calculate patterns of sea level change associated with the loss of ice from glaciers, ice caps and ice sheets, as well as from changes in land water storage.
Then there’s vertical land motion along coastlines. When land sinks (a process known as subsidence), it causes a relative increase in sea levels. When land rises (known as uplift), it results in a relative decrease in sea levels.
A number of factors, both natural and human-produced, cause land to rise or sink, including:
- Adjustments related to the rebound of land during and following the retreat of past ice sheets in North America and Eurasia at the end of the last Ice Age (known as isostatic, or post-glacial, rebound). The retreat of the ice sheets lightened the load of mass on the underlying mantle deep below Earth’s surface, causing Earth’s surface there to slowly rise. Land areas that were once near the edge of these ancient ice sheets, such as along the U.S. eastern seaboard, are today falling, exacerbating sea level rise there.
Plate tectonics. Earth is divided into multiple slowly moving tectonic plates that interact with each other along plate boundaries. At some plate boundaries, the motion of one plate under, over, or past another results in vertical uplift or subsidence of the land surface above.
- Natural or human-produced compaction of sediments, such as that caused by the pumping of groundwater, oil, and gas. Subsidence related to groundwater withdrawal can be especially pronounced in areas with large populations and extensive agriculture. Sediments can also be compacted by human construction activities or by the natural settling of new soils. In the United States, subsidence is a major factor in relative sea level rise along parts of the Gulf and East Coasts.
Oceanographer and climate scientist Josh Willis of NASA’s Jet Propulsion Laboratory in Southern California says that when it comes to relative sea level rise at any particular coastal location, subsidence is the most immediate consideration.
“People in coastal areas need to know what the land is doing right now where they live,” he said. “Is it sinking? If so, how fast? When you combine a sinking coastline with sea level rise caused by other contributing factors, you’re in trouble. Remember, scientists are projecting feet of global-mean sea level rise in this century. But in some places, land can sink by one foot in a decade. We have to understand all of these pieces before we can project future sea level rise at a beach near you.”
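Willis’s point about combining a sinking coastline with ocean rise comes down to simple rate arithmetic: relative sea-level rise is absolute ocean rise minus vertical land motion. The rates in this sketch are illustrative assumptions, not measurements for any real coastline.

```python
# Minimal sketch: relative sea-level rise at a coast is absolute (ocean)
# rise minus vertical land motion, with uplift counted as positive and
# subsidence as negative. All rates here are illustrative assumptions.

def relative_sea_level_rise(absolute_rise_mm_per_yr: float,
                            land_motion_mm_per_yr: float) -> float:
    """Relative rise in mm/yr; sinking land (negative motion) adds to it."""
    return absolute_rise_mm_per_yr - land_motion_mm_per_yr

# Hypothetical coast: 3.3 mm/yr of ocean rise over land subsiding 5 mm/yr
# yields 8.3 mm/yr of rise as experienced at the shore.
rate = relative_sea_level_rise(3.3, -5.0)
```

Note that on a rapidly subsiding coast the land-motion term can dominate the global-warming term, which is why Willis calls subsidence the most immediate consideration.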
Climate scientists will tell you a key challenge in studying climate change is the relative dearth of long-term monitoring sites around the world. The oldest continuously operating station — the Mauna Loa Observatory on Hawaii’s Big Island, which monitors carbon dioxide and other key constituents of our atmosphere that drive climate change — has only been in operation since the late 1950s.
This obstacle is even more profound in the world’s coastal areas. In the global open ocean, the international Argo program’s approximately 4,000 drifting floats have observed currents, temperature, salinity, and other ocean conditions since the early 2000s. But near coastlines, the situation is different. While coastal weather stations are plentiful, their focus is on producing weather forecasts for commercial and recreational ocean users, and those forecasts aren’t necessarily useful for studying climate. The relative lack of long-term records of surface and deep ocean conditions near coastlines has limited our ability to make accurate oceanographic forecasts.
A meteorological and oceanographic coastal station in the small Spanish coastal town of L’Estartit is a notable exception. Located in the Catalan Costa Brava region of the northwest Mediterranean Sea, the L’Estartit station has collected inland data on air temperature, precipitation, atmospheric pressure and humidity since 1969, and has also made oceanographic observations at least weekly since 1973. This makes L’Estartit the longest available uninterrupted oceanographic data time series in the Mediterranean. A new NASA-funded study presents a detailed analysis of the site, revealing climate trends for its Mediterranean coastal environment spanning nearly a half century.
The study, led by Jordi Salat of the Institut de Ciències del Mar (CSIC) in Barcelona, provides estimates of annual and seasonal trends in sea and atmospheric temperature and in sea level. It also compares the site’s data with previous estimates of climate trends in the region. Co-authors include Josep Pascual, also with CSIC; oceanographers Jorge Vazquez and Mike Chin of NASA’s Jet Propulsion Laboratory in Southern California; and Mar Flexas of Caltech, also in Southern California.
The Evolution of Modern Ocean Monitoring
The existence of the L’Estartit station reflects the results of decades of scientific research dating back to the 20th century. This body of work has established the vital role the ocean plays, in conjunction with our atmosphere, in shaping Earth’s global weather and climate. While sea level and sea state have been monitored regularly for some time, other measurements of oceanic conditions haven’t been as well-chronicled. In order to reconstruct the climate history of the ocean, scientists have typically relied on data from coastal tide gauges and stationary mooring stations, along with oceanographic cruises that weren’t generally part of any coordinated monitoring program.
By the 1980s, however, as Earth’s global climate warming trend became evident, scientists began to establish international programs to conduct long-term studies of the ocean. As a result, in recent years, scientists have increasingly acknowledged the value of having the oceanographic equivalent of weather forecasts. Maintaining regular, long-term records of air temperature, water temperatures at the surface and at various depths, winds, sea level, salinity, and other key oceanographic parameters gives scientists valuable information on long-term average values, on how variable our climate is, and on long-term changes and trends. Moreover, such records help scientists better evaluate how humans are contributing to climate change.
Over the past 20 to 30 years, new technologies have given scientists the ability to monitor the ocean all the way from the sea surface to the ocean floor. These include satellites, drifters, gliders, moorings, buoys, Argo profilers and ship data. These data are used as inputs to computer models to estimate the state of the ocean, make ocean forecasts and estimate climate trends.
L’Estartit: Monitoring a Climate Hot Spot
Maintained by voluntary observer Josep Pascual in collaboration with CSIC and the authority of the marine protected area, the L’Estartit station is well positioned to monitor the Mediterranean, a region of our planet that’s significantly impacted by climate change. It lies at the southern end of a relatively narrow offshore continental shelf and along the coastal side of the Northern Current, the main along-slope ocean current in the northwestern Mediterranean.
You can think of the Mediterranean as sort of a miniature ocean, since most of the processes that take place in the global ocean also take place here, albeit at different time scales in some instances. Its relatively small size also makes it more accessible to monitoring than many other regions of the global ocean. Because it’s located in Earth’s mid latitudes, it experiences significant seasonal variations, which affect the way it exchanges heat with the atmosphere.
The L’Estartit site collects a broad array of oceanographic data. In addition to the data mentioned previously, the site began continuous measurements of potential daily evaporation in 1976, and has measured sea state, along with wind speed and direction, since 1988. With the installation of a tide gauge in the harbor in 1990, continuous sea level data have been collected. Also added in the 1990s were conductivity-temperature-depth (CTD) profiles and water samples to analyze the temperature and salinity of the water column.
L’Estartit’s long-term data record makes it possible for scientists to calculate trends for a variety of atmospheric and oceanic climate attributes, including air temperature, sea surface and sub-surface temperature to a depth of 80 meters (262 feet), air pressure, relative humidity, relative cloudiness, wind, salinity, changes in ocean stratification, estimates of favorable conditions for evaporation, sea level and precipitation.
“The long-term data set from L’Estartit is a treasure trove that’s useful for assessing the regional impacts of climate change and how it’s evolved over time,” said Vazquez. “The data can be used as reference for other areas in the Mediterranean. The strong agreement between the site’s measurements of sea surface temperatures and satellite data of sea surface temperatures demonstrates how L’Estartit can serve as both a long-term ground truth site to validate satellite observations and as a regional monitoring site for climate change.”
Vazquez says data from the site have been used in numerous climate research studies and have also been used to document a variety of extreme events, from cold spells and heat waves to storms.
A Half-Century of Climate Trends
The researchers’ analysis of the nearly 50-year data set reveals numerous climate trends. For example, air temperature has increased by an average of 0.05 degrees Celsius (0.09 degrees Fahrenheit) per year during this time. Sea surface temperature has increased by an average of 0.03 degrees Celsius (0.05 degrees Fahrenheit) per year, while the temperature of the ocean at a depth of 80 meters (262 feet) has increased by an average of 0.02 degrees Celsius (0.04 degrees Fahrenheit) per year.
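Trend rates like these are typically obtained by fitting a least-squares line to the station record and reading off the slope. The sketch below demonstrates the method on a synthetic series built with an imposed 0.05 °C-per-year slope to mirror the reported air temperature trend; it is not the actual L’Estartit data, and the start year is arbitrary.

```python
import numpy as np

# Sketch of trend estimation: fit a degree-1 least-squares polynomial to
# annual-mean temperatures and interpret the slope as the warming rate.
# The series below is synthetic (noise-free, imposed 0.05 C/yr trend),
# chosen only to mimic the magnitude reported for L'Estartit air temperature.

years = np.arange(1974, 2020)             # hypothetical 46-year record
temps = 16.0 + 0.05 * (years - years[0])  # imposed trend of 0.05 C per year

slope, intercept = np.polyfit(years, temps, 1)  # slope recovers 0.05 C/yr
```

With real station data the series would carry measurement noise and seasonal variability, so the fitted slope comes with an uncertainty; the least-squares machinery is the same.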
While sea level in the Mediterranean decreased from the 1960s to the 1990s due to changes in the North Atlantic Oscillation (a multi-decadal cyclical fluctuation of atmospheric pressure over the North Atlantic Ocean that strongly influences winter weather in Europe, Greenland, northeastern North America, North Africa and northern Asia), it’s been on the rise since the mid-1990s. The L’Estartit data show that sea level at that site is currently rising at a rate of 3.1 millimeters (0.12 inches) per year.
The researchers found that some of the long-term climate trends they observed were more pronounced during some seasons than in others. For example, trends in air temperature and sea surface temperature were significantly stronger during spring, while the trend for ocean temperature at 80 meters was greatest during autumn. Among their other findings, they noted a small increase in the number of days per year that experience summer-like sea conditions. They also found a drop of almost two days per year in conditions favorable for marine evaporation, which may be related to an observed decrease in springtime coastal precipitation.
Vazquez says the strong statistical agreement between sea surface temperature values and trends from the L’Estartit data set and those from available satellite products is encouraging. “The long-term consistency of the direct measurements with our satellite data gives scientists the opportunity to validate climate trends across multiple decades,” he said. “Data from L’Estartit should serve as a wake-up call to the global climate science community to immediately begin similar initiatives and ensure their continuity over time.”
The L’Estartit data are available to the public free of charge. The digitized data are accessible at http://meteolestartit.cat/. The remote sensing data used in the study may be retrieved through NASA’s Physical Oceanography Distributed Active Archive Center (PO.DAAC) at http://podaac.jpl.nasa.gov.
Lots of forces are at work on the world’s ocean, and NASA studies them all. When it comes to sea level, NASA does much more than just measure it; the agency also seeks to understand it. But for non-scientists, fathoming the forces that determine sea levels around the world can sometimes be a bit daunting, so here’s a little guide to some of the basics.
Let’s dive in.
Waves in the Bathtub
Most of the time, Earth’s ocean looks pretty darn flat to those of us here on the ground, like the water in a bathtub. If you’re on a boat at sea, the only topography you’re going to notice on the ocean is waves. Generated by the friction between wind and water, wind waves range from tiny ripples on a calm sea to storm-generated monsters that can tower more than 100 feet (30 meters) high. Some wind waves are generated locally. Others, called swells, which result from winds that blew somewhere else in the past, travel across the ocean surface.
But even in the absence of waves, it turns out the ocean isn’t really flat at all. It has hills and valleys just like land surfaces do, though they’re relatively small — up to about 2 meters (6.5 feet) high.
These small variations in ocean surface topography are influenced by many factors, including the temperature of the water, how much salt it contains (its salinity), the pressure of the atmosphere above the ocean surface, and ocean currents.
Currents move ocean waters around our planet over long distances, primarily in a horizontal direction, reshaping the ocean’s surface and causing it to tilt. They’re generated by various forces, including winds, breaking waves, ocean temperature, salinity, and a phenomenon known as the Coriolis effect (which causes water and wind to deflect to the right in the Northern Hemisphere and to the left in the Southern Hemisphere). Currents flow around the ocean’s hills and valleys, much like wind blows around areas of high and low pressure in our atmosphere.
Ocean currents happen in the open ocean and generally don’t have a big impact on coastlines, with a few major exceptions, such as the Gulf Stream in the Atlantic Ocean along the U.S. East Coast and a similar Pacific Ocean current off the coast of Japan called the Kuroshio, which transports water northward up Japan’s east coast and then due east. As our planet warms, it affects wind patterns that drive most of these currents, changing them.
While all of these factors are important drivers of ocean surface topography, there’s an even larger force working to shape the ocean: changes in Earth’s geoid. The geoid is the shape that Earth’s ocean surface would take if the only influences acting upon it were gravity and Earth’s rotation. Changes in the solid Earth affect Earth’s gravitational field, causing the height of Earth’s geoid to vary by up to 100 meters (328 feet) around the globe. For example, in places where Earth’s crust is thick and dense, the gravitational pull causes extra water to pile up. In addition, the shape of the geoid is partly determined by geologic features on the floor of the ocean, including seamounts (underwater mountains) and valleys, which pull the water due to the force of gravity.
Topographic features on the open ocean can only be seen from space, by specialized instruments called altimeters that precisely measure the height of the ocean surface.
Since 1992, NASA has partnered with other U.S. and European institutions on multiple satellite missions to map ocean surface topography. They include the joint NASA/Centre National d'Etudes Spatiales (CNES) TOPEX/Poseidon mission, which operated from 1992 to 2005; the NASA/CNES Jason-1, which operated from 2001 to 2013; the joint NASA/CNES/National Oceanic and Atmospheric Administration (NOAA)/European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Jason-2/Ocean Surface Topography Mission (OSTM), which operated from 2008 to 2019; and the current Jason-3, an international partnership led by NOAA with participation by NASA, CNES and EUMETSAT, launched in 2016. This November, Sentinel-6 Michael Freilich will launch to continue this long-term data set. The new mission is jointly developed by the European Space Agency (ESA), EUMETSAT, NASA and NOAA with funding support from the European Commission and support from CNES.
Measuring ocean surface topography allows us to understand ocean circulation (how our ocean stores energy from the Sun and moves it around our planet), accurately track changes in global sea level, and understand how the ocean joins forces with Earth’s atmosphere to create our weather and climate, including phenomena such as El Niño and La Niña and weather patterns such as hurricanes and other storms.
A Flattening Map
A look at a current map of trends in the nearly 30-year satellite record of global ocean surface topography reveals clear regional differences across the globe, with variations of up to 20 centimeters (8 inches) of sea level rise and fall from one place to another. But, says Josh Willis of NASA’s Jet Propulsion Laboratory in Pasadena, California, NASA’s project scientist for the Jason-2, Jason-3 and Sentinel-6 Michael Freilich missions, the map is getting flatter every year.
“Most of these 20-to-30 centimeter (8-to-12 inch) changes in sea level on the open ocean are cyclic, from natural things like El Niño and La Niña, or ocean currents speeding up or slowing down,” he said. “They've always been part of the story and always will be. But what really matters to people at the coast are long-term changes in their relative sea level – that is, the height of the ocean relative to the land. Those are caused by the overall rise due to global warming, and the movement of the land. And both of those are here to stay.”
Up next: why all sea level is "local."