Ensuring the accuracy of Earth’s long-term global and regional surface temperature records is a challenging, constantly evolving undertaking.
There are many reasons for this, including changes in the availability of data, technological advances in how land and sea surface temperatures are measured, the growth of urban areas, and changes to where and when temperature data are collected. Over time, these changes can introduce measurement inconsistencies that affect temperature data records.
Scientists have been building estimates of Earth’s average global temperature for more than a century, using temperature records from weather stations. But before 1880, there just wasn’t enough data to make accurate calculations, resulting in uncertainties in these older records. Fortunately, consistent temperature estimates made by paleoclimatologists (scientists who study Earth’s past climate using environmental clues like ice cores and tree rings) provide scientists with context for understanding today’s observed warming of Earth’s climate, which has no historic parallel.
Over the past 140 years, we’ve literally gone from making some temperature measurements by hand to using sophisticated satellite technology. Today’s temperature data come from many sources, including more than 32,000 land weather stations, weather balloons, radar, ships and buoys, satellites, and volunteer weather watchers.
To account for all of these changes and ensure a consistent, accurate record of our planet’s temperature variations, scientists use information from many sources to make adjustments before incorporating temperature data into analyses of regional or global surface temperatures. This allows them to make “apples to apples” comparisons.
Let’s look more closely at why these adjustments are made.
To begin with, some temperature data are gathered by humans. As all of us know, humans can make occasional mistakes in recording and transcribing observations. So, a first step in processing temperature data is to perform quality control to identify and eliminate any erroneous data caused by such errors – things like missing a minus sign, misreading an instrument, etc.
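This first quality-control pass can be as simple as a range-and-consistency check. Here is a minimal sketch in Python; the `flag_suspect_readings` helper and its thresholds are purely illustrative, not the checks any agency actually runs:

```python
# Minimal quality-control sketch: flag readings outside a physically
# plausible range and implausibly large jumps between consecutive
# observations. Thresholds are illustrative only.

def flag_suspect_readings(temps_c, low=-90.0, high=60.0, max_jump=25.0):
    """Return indices of readings that fail basic plausibility checks."""
    suspect = set()
    for i, t in enumerate(temps_c):
        if t < low or t > high:            # outside Earth-plausible range
            suspect.add(i)
        if i > 0 and abs(t - temps_c[i - 1]) > max_jump:
            suspect.add(i)                 # implausibly large jump
    return sorted(suspect)

readings = [12.4, 13.1, -13.1, 12.9, 75.0]  # -13.1 mimics a sign error
print(flag_suspect_readings(readings))      # the sign error and spike are flagged
```

A real pipeline would add station-specific climatological limits and cross-checks against neighboring stations, but the idea is the same: flag automatically, then inspect.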
Changes to Land Weather Stations
Next are changes to land weather stations. Temperature readings at weather stations can be affected by the physical location of the station, by what’s happening around it, and even by the time of day that readings are made.
For example, if a weather station is located at the bottom of a mountain and a new station is built on the same mountain at a higher location, the change in elevation could affect the station’s readings. If you simply combined the old and new data sets without adjustment, the record would show an artificial drop in temperature beginning when the new station opens. Similarly, if a station is moved away from a city center to a less developed location like an airport, cooler readings may result, while if the land around a weather station becomes more developed, readings might get warmer. Such differences arise from how ground surfaces in different environments absorb and retain heat.
Then there are changes to the way that stations collect temperature data. Old technologies become outdated or instrumentation simply wears out and is replaced. Using new equipment with slightly different characteristics can affect temperature measurements.
Data adjustments may also be required if there are changes to the time of day that observations are made. If, for example, a network of weather stations adopts a uniform observation time, as they did in the United States, stations making such a switch will see their data affected, because temperature is dependent on time of day.
Scientists also make adjustments to account for station temperature data that are significantly higher or lower than those of nearby stations. Such out-of-the-ordinary readings typically have nothing to do with climate change; instead, they usually reflect some human-produced change that puts the station’s readings out of line with its neighbors. By comparing data with surrounding stations, scientists can identify abnormal measurements and ensure that they don’t skew overall regional or global temperature estimates.
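Neighbor comparison of this kind can be sketched in a few lines. The station series, neighbor series, and the 2.0-degree tolerance below are all invented for illustration:

```python
# Illustrative neighbor-based screening: compare a station's monthly
# values to the median of nearby stations and flag months where it
# diverges by more than a chosen tolerance. All data are made up.
from statistics import median

def flag_outlier_months(station, neighbors, tolerance=2.0):
    flags = []
    for month, value in enumerate(station):
        neighborhood = median(s[month] for s in neighbors)
        if abs(value - neighborhood) > tolerance:
            flags.append(month)
    return flags

station   = [0.1, 0.3, 3.5, 0.2]          # a 3.5-degree spike in month 2
neighbors = [[0.0, 0.4, 0.3, 0.1],
             [0.2, 0.2, 0.1, 0.3],
             [0.1, 0.5, 0.2, 0.2]]
print(flag_outlier_months(station, neighbors))  # month 2 stands out
```

Real homogenization methods are far more careful (they weight neighbors by distance and correlation, and detect sustained breakpoints rather than single spikes), but this is the core comparison.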
In addition, the number of land weather stations has increased over time, forming denser networks that improve the accuracy of temperature estimates in those regions. Scientists take these improvements into account so that data from areas with dense networks can be appropriately compared with data from areas with sparser coverage.
Changes to Sea Surface Temperature Measurements
Much like the trends on land, sea surface temperature measurement practices have also changed significantly.
Before about 1940, the most common method for measuring sea surface temperature was to throw a bucket attached to a rope overboard from a ship, haul it back up, and read the water temperature. The method was far from perfect. Depending on the air temperature, the water temperature could change as the bucket was pulled from the water.
During the 1930s and ‘40s, scientists began measuring the temperature of ocean water piped in to cool ship engines. This method was more accurate. The impact on long-term ocean surface temperature records was to reduce the warming trend in global ocean temperatures that had been observed before that time. That’s because temperature readings from water drawn up in buckets prior to measurement are, on average, a few tenths of a degree Celsius cooler than readings of water obtained at the level of the ocean in a ship’s intake valves.
Then, beginning around 1990, measurements from thousands of floating buoys began replacing ship-based measurements as the commonly accepted standard. Today, such buoys provide about 80% of ocean temperature data. Temperatures recorded by buoys are slightly lower than those obtained from ship engine room water intakes for two reasons. First, buoys sample water that is slightly deeper, and therefore cooler, than water samples obtained from ships. Second, the process of passing water samples through a ship’s inlet can slightly heat the water. To compensate for the addition of cooler water temperature data from buoys to the warmer temperature data obtained from ships, ocean temperatures from buoys in recent years have been adjusted slightly upward to be consistent with ship measurements.
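The buoy adjustment described above amounts to adding a fixed offset to one record before blending it with the other. A toy sketch, using an assumed (not official) ship-minus-buoy offset:

```python
# Sketch of the kind of offset correction described above: buoy readings
# are shifted upward so they can be combined with the ship-based record.
# The 0.12 C offset is a round illustrative number, not the value any
# particular data set applies.

SHIP_BUOY_OFFSET_C = 0.12  # assumed mean ship-minus-buoy difference

def adjust_buoy_temps(buoy_temps_c, offset=SHIP_BUOY_OFFSET_C):
    """Shift buoy readings onto the ship reference before blending."""
    return [round(t + offset, 2) for t in buoy_temps_c]

buoys = [18.40, 18.55, 18.31]
print(adjust_buoy_temps(buoys))  # -> [18.52, 18.67, 18.43]
```

Note that shifting one record relative to the other changes absolute values but not the trend within either record; the goal is simply a consistent reference.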
So Many Climate Data Sets, So Little Disagreement
Currently, there are multiple independent climate research organizations around the world that maintain long-term data sets of global land and ocean temperatures. Among the best known are those produced by NASA, the National Oceanic and Atmospheric Administration (NOAA), the U.K. Meteorological Office's Hadley Centre/Climatic Research Unit (CRU) of the University of East Anglia, and Berkeley Earth, a California-based non-profit.
Each organization uses different techniques to make its estimates and adjusts its input data sets to compensate for changes in observing conditions, using data processing methods described in peer-reviewed literature.
Remarkably, despite the differences in methodologies used by these independent researchers, their global temperature estimates are all in close agreement. Moreover, they also match up closely to independent data sets derived from satellites and weather forecast models.
NASA’s GISTEMP Analysis
One of the leading data sets used to conduct global surface temperature analyses is the NASA Goddard Institute for Space Studies (GISS) surface temperature analysis, known as GISTEMP.
GISTEMP uses a statistical method that produces a consistent estimated temperature anomaly series from 1880 to the present. A “temperature anomaly” is a calculation of how much colder or warmer a measured temperature is at a given weather station compared to an average value for that location and time, which is calculated over a 30-year reference period (1951-1980). The current version of GISTEMP includes adjusted average monthly data from the latest version of the NOAA/National Centers for Environmental Information (NCEI) Global Historical Climatology Network analysis and its Extended Reconstructed Sea Surface Temperature data.
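A temperature anomaly, as defined above, can be illustrated in a few lines of Python. The station data here are invented; only the 1951-1980 reference period comes from GISTEMP:

```python
# Sketch of a temperature-anomaly calculation: each reading for a given
# month is compared with that month's average over a fixed reference
# period (GISTEMP uses 1951-1980). Station data are invented.

def monthly_anomalies(series, ref_start, ref_end):
    """series: dict mapping year -> temperature for one station, one month."""
    ref = [t for y, t in series.items() if ref_start <= y <= ref_end]
    baseline = sum(ref) / len(ref)
    return {year: round(t - baseline, 2) for year, t in series.items()}

# Hypothetical July temperatures (deg C) at one station.
july_temps = {1951: 21.0, 1960: 21.4, 1980: 21.2, 2020: 22.6}
print(monthly_anomalies(july_temps, 1951, 1980))
```

Working in anomalies rather than absolute temperatures is what makes stations at different elevations and in different climates comparable: a +1.4-degree anomaly means the same thing on a mountaintop as at sea level.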
GISTEMP uses an automated process to flag abnormal records that don’t appear to be accurate. Scientists then perform manual inspections on the suspect data.
GISTEMP also adjusts to account for the effects of urban heat islands, which are differences in temperatures between urban and rural areas.
The procedure used to calculate GISTEMP hasn’t changed significantly since the mid-1980s, except to better account for data from urban areas. While the growing availability of better data has led to adjustments in GISTEMP’s regional temperature averages, the adjustments haven’t impacted GISTEMP’s global averages significantly.
While raw data from an individual station are never adjusted, any station showing abnormal data (whether from changes in measurement method, changes in its immediate surroundings, or apparent errors) is compared to reference data from neighboring stations with similar climate conditions in order to identify and remove abnormal data before they are input into the GISTEMP method. While such data adjustments can substantially affect some individual stations and small regions, they barely change global average temperature trends.
In addition, results from global climate models are not used at any stage in the GISTEMP process, so comparisons between GISTEMP and model projections are valid. All data used by GISTEMP are in the public domain, and all code used is available for independent verification.
The Bottom Line
Independent analyses conclude that the impact of station temperature data adjustments is small. Upward adjustments of global temperature readings before 1950 have, in total, slightly reduced century-scale global temperature trends. Since 1950, however, adjustments to input data have slightly increased the recorded rate of global warming, by less than 0.1 degree Celsius (less than 0.2 degrees Fahrenheit).
A final note: while adjustments are applied to station temperature data being used in global analyses, the raw data from these stations never change unless better archived data become available. When global temperature data are processed, the original records are preserved and are available online to anyone who wants them, at no cost, including the NOAA National Climatic Data Center’s U.S. and global records.
The year 2020 will be remembered for many things, not the least of which were a series of devastating fires around the globe that bear the fingerprints of climate change. From Australia and South America’s Amazon and Pantanal regions, to Siberia and the U.S. West, wildfires set new records and made news year-round.
It was an especially bad year for wildfires on the U.S. West Coast. Five of California’s 10 largest wildfires on record happened in 2020, and the state set a new record for acres burned. According to CAL FIRE, the state’s Department of Forestry and Fire Protection, more than 9,600 wildfires burned nearly 4.2 million acres through mid-December, causing more than 30 fatalities and damaging or destroying nearly 10,500 structures.
The Golden State wasn’t alone. Oregon, Washington, and Colorado were also particularly hard hit. In fact, as of mid-December 2020, the National Interagency Fire Center reported more than 10.6 million acres burned and nearly 17,800 buildings destroyed across its seven geographic area coordination centers in the western half of the contiguous United States.
It was the fire equivalent of a perfect storm. Record drought conditions across the Western United States in late 2019 extended into early 2020, and were followed by the hottest summer on record in the Northern Hemisphere. Add in unusually dry air, strong wind events, and an outbreak of summer thunderstorms in Northern California in August, and conditions were ripe for a dangerous fire season.
Natasha Stavros is an applied science system engineer at NASA’s Jet Propulsion Laboratory in Southern California who studies wildfires. She says that not only is the U.S. West experiencing more frequent wildfires, but more of them are burning at the same time, putting a strain on firefighting resources. They’re also bigger, more severe, faster-moving, and more destructive than ever before, with 15 of the 20 most destructive wildfires in California history occurring within the past decade.
Stavros attributes these trends to three primary factors: a changing climate, greater availability of fuel, and the expansion of urban areas, which brings with it more ignitions.
Climate Change: A Powerful Catalyst
“Climate affects how long, how hot and how dry fire seasons are,” she said. “As climate warms, we’re seeing a long-term drying and warming of both air and vegetation.”
In recent decades, the U.S. West has warmed, and the frequency and severity of heat waves and droughts have increased. According to the National Oceanic and Atmospheric Administration (NOAA), temperatures in California have increased approximately 2 degrees Fahrenheit (1.1 degrees Celsius) since the beginning of the 20th century. This has dried out the air. Fire seasons are also starting earlier and ending later each year, while snow packs are shrinking, leading to earlier spring snowmelt and longer, more intense dry seasons.
These warmer and drier conditions are also making U.S. Western wildfires more severe. A recent study led by Sean Parks of the U.S. Forest Service finds the amount of Western U.S. land burned by “high-severity” wildfires (fires that destroy more than 95 percent of trees) has increased 800 percent since 1985.
More Fuel to Burn
Another factor driving changes in U.S. Western wildfires is a greater availability of fuel. Drier air stresses vegetation, making forests more susceptible to severe wildfires, while droughts are creating more dead fuel. But, as Stavros explains, there are limits.
“Fire is both fuel- and flammability-limited,” she said. “Take the state of Washington. You have lots of trees, but it tends to be really wet and cold there, so fires are limited by the flammability of the fuels. In a place like Nevada, however, the amount of fuel is limited, but it tends to be dry. Droughts increase fires in flammability-limited areas, but don’t have an impact in fuel-limited areas. Ironically, you have to have rain to have a fire.”
Fuels in the Western U.S. are also building up due to a century of intentional wildfire suppression. “Prescribed fires are important to reduce fuels, while mitigating the effects of smoke,” she said. “For example, ozone, regulated by the Clean Air Act, is problematic in the summer season when conditions are optimal for ozone formation. Wildfire emissions can increase these concentrations. Altering the timing of smoke emissions through the use of prescribed burning so that emissions occur outside of the ozone season may have a positive effect and reduce health impacts.”
Ignition Sources on the Rise
Yet another factor driving changes in Western U.S. wildfires is a greater number of ignition sources, both natural and human-caused.
Wildfires caused by lightning tend to occur in remote areas that are harder for firefighters to reach. These lightning-triggered wildfires are occurring more frequently. According to the U.S. Forest Service, between 1992 and 2015, 44 percent of Western U.S. wildfires were triggered by lightning. Those fires were responsible for 71 percent of all land burned. Some studies predict climate change will increase the frequency of lightning in the future, but further research is needed.
Human-caused fires are also on the rise, due to increased human development of land at what’s known as the wildland-urban interface – the edge of wildland areas. This significantly increases opportunities for both accidental and intentionally set wildfires. It also tends to make these fires more destructive to lives and property.
How Wildfires Are Impacting Climate
While the impact of climate change on wildfires is well-established, wildfires are also affecting climate, with associated impacts on ecosystems, air and water quality, and human health. These climate impacts may be significant.
Wildfires release carbon emissions that affect climate and drive climate change-related events that contribute to even more wildfires. The specific type of emissions they produce is determined by what they burn and how complete the combustion process is. The largest amounts of carbon emitted are in the form of carbon dioxide - a powerful greenhouse gas - and carbon monoxide. The quantity of each gas depends on whether a fire is flaming or smoldering. Dry fuels combust more easily and are more likely to be flaming.
To put the carbon dioxide emissions from wildfires into perspective, September 2020 data from the Global Fire Emissions Database show that California wildfires in 2020 generated more than 91 million metric tons of carbon dioxide. That’s roughly 30 million metric tons more carbon dioxide emissions than the state emits annually from power production.
Wildfires also emit aerosols (tiny, floating solid and/or liquid particles of organic and inorganic matter). These aerosols can come in the form of black carbon, brown carbon, or both. When a fire is really hot, it produces more black carbon, commonly known as soot, char, or ash. When fires are less hot and smoldering, they produce more brown carbon, which reflects light, making it appear brown or yellow. Both types of carbon warm Earth’s climate, but black carbon has a stronger warming effect. Scientists currently know more about black carbon and its effects on climate than they do about brown carbon.
Scientists are also working to better understand the amount of ammonia wildfires release. When mixed with sunlight, ammonia produces two secondary aerosols, ammonium sulfate and ammonium nitrate, both of which have a cooling effect on climate. Ammonia also contributes to the formation of brown carbon.
Recently, scientists studying the devastating Australia wildfires of late 2019-early 2020 discovered that an outbreak of a rare type of fire-generated thundercloud had punched into Earth’s stratosphere, the second lowest layer of Earth’s atmosphere. The large quantity of smoke that made it into the stratosphere then circled the globe, reducing the amount of sunlight that reached the ground for several months. The smoke slightly cooled Earth’s surface by an as-yet undetermined amount (likely a small fraction of a degree, similar to the cooling effect of a moderate volcanic eruption). The event illustrates how large future wildfires may, at times, have a slight cooling effect on climate.
Studying the trace gas and aerosol emissions from wildfires and prescribed burns was the objective of a joint 2019 NASA-NOAA field campaign called Fire Influence on Regional to Global Environments Experiment – Air Quality (FIREX-AQ). FIREX-AQ combined aircraft measurements, ground sampling and satellite data to correlate wildfire emissions to fuel and fire conditions on the ground; study wildfire plumes, including how they’re transported in the atmosphere and how they impact air quality downwind; and assess how effective satellites are in estimating fire emissions.
The air quality impacts of the 2020 U.S. Western wildfires were truly extraordinary, at times making day as dark as night and tinging skies in major urban areas a surreal red. Some locations recorded air quality readings higher than 500 on the Air Quality Index scale (anything above 300 is considered hazardous to health). But smoke doesn’t know state or national boundaries - it drifted east thousands of miles across many parts of the United States, north into Canada and even as far as Europe. Researchers at Stanford University in Stanford, California, estimated California wildfire smoke likely led to at least 1,200 and as many as 3,000 excess California deaths between Aug. 1 and Sept. 10, 2020 alone.
Another climate impact of U.S. Western wildfires is their role in converting ecosystems from one type to another. Wildfires are necessary for healthy forest ecosystems. They help clear the forest floor of dead organic material, allow sunlight to reach it, add nutrients to the soil, provide habitat for animals and birds by clearing heavy brush so new plants can grow, and kill disease and insect infestations, among their many benefits. But when their frequency or severity is disturbed, it can throw things dangerously out of whack. In time, this may lead to the loss of some forests, as climate change increases the frequency of fires and makes it harder for ecosystems to reestablish.
“When you have major disturbance events like droughts and fires back-to-back in quick succession, you can change ecosystems,” Stavros said. “We’re starting to see this in some regions as wildfire frequency increases. Southern California’s mountains are covered with chaparral shrubs whose seedlings are only triggered to open by the extreme heat of a wildfire, and they’ve adapted to burning every seven to 15 years. If you increase the wildfire frequency, you begin depleting the seed bank and the chaparral may not regrow, because the only seedlings available for growth are often invasive species. In places like Arizona, Colorado, Washington, Oregon, and Idaho, we’re starting to see forests turn into prairies and grasslands. It’s not yet widespread, but it’s happening.”
Of course, the climate impacts of wildfires aren’t limited to the contiguous Western United States. In Alaska, increased wildfire activity is causing fires to burn through dense peatlands, releasing significant quantities of methane and carbon dioxide that exacerbate global warming. Other areas of global concern include Australia; Southeast Asia; the Amazon; Siberia, Canada and other parts of the Arctic; and even the Mediterranean region. The climate impacts of fires in each of these regions vary.
“The worst fires for climate are actually coming from Southeast Asia, the Amazon, and the Arctic, because you have carbon that’s been sitting there for a long time and then put back into the atmosphere when it burns,” Stavros said.
Adapting to a Fierier Future
One thing is clear: fires are likely to become an increasingly consequential fact of life as the U.S. West continues to get warmer and drier. Society will need to adapt.
“The impact of fire is much more than just area burned,” Stavros said. “It’s lives lost, infrastructure damaged, degraded air quality. We can use our scientific understanding to inform systematic approaches to managing how we live in a world with fire: how and where we build, how and where we perform maintenance on power lines, etc.
“Everybody cares when they can see and smell the smoke, but when it’s gone, they stop,” she added. “But the problem isn’t going to go away.”
Recently, an international research team published a comprehensive review in the journal Reviews of Geophysics on our state of understanding of Earth's "climate sensitivity," a key measure of how much our climate will change as greenhouse gas emissions increase. Essentially, by narrowing the range of estimates, the researchers found that climate sensitivity isn’t so low that it should be ignored, but it’s also not so high that there is no hope for the planet’s recovery.
We asked the two NASA authors on the study — Kate Marvel, jointly of Columbia University in New York and NASA’s Goddard Institute for Space Studies (GISS) in New York; and GISS Director Gavin Schmidt — to discuss their roles in the study and its significance for understanding the impacts of our warming world on climate.
Q. What exactly is climate sensitivity and why is it important to know its true value?
Schmidt: “We know from studies of the past that Earth’s climate can change dramatically. The evidence shows that the amount of greenhouse gases in the atmosphere can vary over time and make a big difference to the climate. Scientists try to quantify that by estimating how much the surface air temperature, averaged over the whole globe, would change if we doubled the amount of one typical but specific greenhouse gas – carbon dioxide. That number, called climate sensitivity, has quite a wide uncertainty range, and that has big implications for how serious human-made climate change will be.”
Q. Your team was able to narrow the range of estimates of Earth's climate sensitivity by more than 43 percent, from the previously accepted range of 1.5 to 4.5 Kelvin first established in 1979 (roughly 3 to 8 degrees Fahrenheit), to a narrower range of 2.6 to 3.9 Kelvin (roughly 4.5 to 7 degrees Fahrenheit). Why is it important for scientists to narrow this range of uncertainty? What does it mean in practical terms to be able to reduce uncertainties in measuring climate sensitivity?
Schmidt: “Scientists would like to reduce that uncertainty so that we can have more confidence in how we need to mitigate and adapt to future changes. For instance, how much sea level might rise, or how heat waves will get worse, or rainfall patterns change, are tied to the climate sensitivity combined with our actions in changing the atmosphere. A higher climate sensitivity would mean we would have to do more to avoid big changes, while a lower value would mean we’d have more time to adapt. It’s useful to note that we expect to reach double carbon dioxide levels later this century, and that while a few degrees might not seem like much, it's a big deal for the planet. The difference between forests beyond the Arctic Circle and glaciers extending down to New York City is only a range of about 8 K (about 14 degrees Fahrenheit) in the global average, while it changes sea level by 150 meters (more than 400 feet)!”
Q. How can better estimates of climate sensitivity impact policy decisions?
Marvel: “The most important thing about climate sensitivity is that it's not zero. Increasing atmospheric carbon dioxide definitely makes it warmer and increases the risk of extreme weather like drought, downpours, and heat waves. But better estimates of climate sensitivity are important for motivating action. Our results show that it would be foolish to rely on nature to save us from climate change — we don't think it's likely that sensitivity is low. But conversely, it's unlikely that climate sensitivity is so high as to make action pointless.”
Schmidt: “I’m not sure that our policy decisions are that finely tuned to the science of climate sensitivity other than knowing that climate really is sensitive to increasing greenhouse gases. Many climate policies are robust to those uncertainties, but many adaptation decisions will depend on knowing how bad things will get.”
Q. Why has it been so difficult over the past 40 years to narrow this range? What made this new estimate possible?
Schmidt: “There are three main reasons why this has been difficult. First, knowledge of past climate change has been difficult to quantify in globally coherent ways. Of course, we have known about the ice ages for a century or more, but getting accurate estimates of the global changes in temperature, greenhouse gases, and ice sheets has taken time and has needed many scientists working on many different aspects of the problem to come together. Second, the climate change signal has taken time to come out of the ‘noise’ of normal variability. In the 1980s and 1990s, people were still arguing about whether the warming over the 20th century was significant, but with another 20 years of record-breaking temperatures, that has been very clearly shown. Third, our understanding of the processes in the climate that affect sensitivity — clouds, water vapor, aerosols, etc. — has improved immensely with the development of satellite remote sensing, and every decade we are producing better and more useful information. But as these lines of evidence have matured, the need to come up with new methods to tie them all together coherently has become acute — and that was the impetus for this roughly 4-year effort.”
Marvel: “Yes, and in modeling, clouds are some of the biggest wildcards. See go.ted.com/katemarvel.”
Q. What types of evidence did the team consider in reaching its conclusions? Where do the lines of evidence agree and disagree most substantially?
Schmidt: “There are three main sources of information: changes since the late 19th century that have been measured in real time, our understanding of physical processes (particularly clouds), and new and more complete information from periods in the paleoclimate record (the geological past) where the planet was significantly cooler or warmer than today. All of the lines of evidence are mostly commensurate, but specific issues mean that the recent record isn’t good at constraining the high-end values because of the imprecise role of aerosols, and paleoclimate change is less able to constrain the low end because of the uncertain nature of that data. Together, however, we can mostly rule those tails out.”
Q. What were a few of the most significant findings for each of the three lines of evidence studied (feedback processes, the historical warming record, and paleoclimate records)?
Marvel: “For a long time, many people thought that sensitivity estimates derived from paleoclimate — the far past — were incompatible with estimates derived from more recent observations. But there's a difference between a past climate state in which the planet has reached an equilibrium — a ‘new normal’ — and our current climate, where things are very much in flux and continuing to change. There is some uncertainty in just how different the future will look from what we're experiencing now — it's possible we're moving into a new world for which we don't have a recent analogue. And when we take that uncertainty into account in a rigorous way, we find that the far past and the near future may not be telling us such different things after all.”
Schmidt: “What was interesting was that by starting off with a view of climate sensitivity that was a little more sophisticated than people had used previously, we found that there was more coherence among the different lines of evidence than others had found, and since the information we are using really is very independent, that allowed us to narrow the uncertainty.”
Q. Your team used a "Bayesian approach" to calculate your estimated range of climate sensitivity. In layman's terms, what is that?
Schmidt: “A Bayesian approach is really just a mathematical representation of how we do science in general. We have an initial hypothesis, we get some evidence that may or may not support it, and then we update our understanding based on that evidence. And then we do it again (and again, and again, etc.). Over time, and as more evidence accumulates, we hopefully hone in on the most correct answer. Using Bayesian methods allowed us to pull together disparate threads of evidence in a coherent way — allowing for different degrees of confidence in each of the lines of evidence. What is great is that in the future, as more evidence is discovered, we can continue the process and update our understanding again.”
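Schmidt’s description can be illustrated with a toy Bayesian update over three coarse hypotheses about sensitivity. Every number below is invented purely to show the mechanics of combining lines of evidence:

```python
# Toy Bayesian update in the spirit Schmidt describes: a prior over
# candidate hypotheses is multiplied by the likelihood of each line of
# evidence, then renormalized. All numbers are invented.

def bayes_update(prior, likelihood):
    posterior = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(posterior.values())
    return {h: round(p / total, 3) for h, p in posterior.items()}

# Prior: equal belief in "low", "mid", and "high" sensitivity.
belief = {"low": 1 / 3, "mid": 1 / 3, "high": 1 / 3}

# Each line of evidence states how probable the observations are under
# each hypothesis; updating with them sequentially sharpens the belief.
for evidence in ({"low": 0.2, "mid": 0.6, "high": 0.2},   # e.g. historical record
                 {"low": 0.1, "mid": 0.7, "high": 0.2}):  # e.g. process understanding
    belief = bayes_update(belief, evidence)

print(belief)  # -> {'low': 0.042, 'mid': 0.875, 'high': 0.083}
```

Because the lines of evidence are independent, each update multiplies into the same belief, which is why combining them narrows the range more than any single line could.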
Q. What role did global climate models play in the team's findings?
Marvel: “Complex climate models are useful tools (see here for a good overview). But in this paper, we relied largely on observations: satellite and ground-based measurements of recent trends, paleoclimate datasets, and basic physical principles.”
Schmidt: “Climate models help frame the questions we are asking and can be examined to see how climate patterns in space and time connect to things we can directly observe. But we know that climate models have a lot of uncertainty related (for instance) to cloud processes, and so we didn’t use them directly to estimate sensitivity. You could, however, use our results to assess whether a climate model has a sensitivity that is within our independently constrained range.”
Q. Your new estimated range of Earth's climate sensitivity finds the value is around the mid-point of the previous estimate range rather than on the lower or higher end. What does that mean in practical terms for projections of Earth's global temperatures and Earth's climate in this century?
Schmidt: “It means that climate sensitivity is not so low that we can ignore it, nor is it so high that we should despair. Ultimately, it tells us that while human-made climate change is (and will continue to be) a problem, our actions as a society can change that trajectory.”
Q. How likely is it that Earth's climate sensitivity could be higher than 3.9 Kelvin? Lower than 2.6 Kelvin?
Schmidt: “There are subjective elements to the analysis we performed, and other people could decide to weight things a little differently. We explored some of these alternative choices and that broadens the uncertainty a little, but basically, we estimate that there is about a one-in-six chance that it was less than the low end, and one-in-six that it was higher than the high end. That’s not impossible, but, if true, then a lot of our assessments would have to be quite a ways off.”
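For intuition on those one-in-six tails: if the 2.6 to 3.9 Kelvin likely range were modeled as a normal distribution (a simplification; the study's actual posterior is not Gaussian, and the mean and spread below are fitted assumptions), the implied tail probabilities come out to roughly one-in-six on each side:

```python
import math

def phi(x):
    # Standard normal cumulative distribution function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Hypothetical normal fit centered on the 2.6-3.9 K range
mu, sigma = 3.25, 0.67

low_tail = phi((2.6 - mu) / sigma)        # P(sensitivity < 2.6 K)
high_tail = 1 - phi((3.9 - mu) / sigma)   # P(sensitivity > 3.9 K)
print(f"P(S < 2.6 K) ~ {low_tail:.2f}, P(S > 3.9 K) ~ {high_tail:.2f}")
```

Both tails evaluate to about 0.17, i.e. roughly the one-in-six chance Schmidt cites for the true value lying outside the range on either side.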
Q. The concentration of carbon dioxide in Earth's atmosphere is currently around 414 ppm (parts per million). What are the projections for future carbon dioxide increases under the range of current emissions scenarios and how does having a better estimate of climate sensitivity improve our understanding of how our climate may change in the future?
Schmidt: “The future trajectory of carbon dioxide will depend on what we do as a society — if we decide to burn all the fossil fuels we can find, we could reach 900 ppm by the end of the century, but if we aggressively reduce emissions, we could stay below 500 ppm, maybe lower. The climate sensitivity tells us what we can expect in terms of temperature — between another 1 or 2 degrees Celsius (1.8 or 3.6 degrees Fahrenheit) for the low scenario, which would be very serious, to between 4 and 7 degrees Celsius (7.2 and 12.6 degrees Fahrenheit) for the high end scenario, which would be a disaster.”
Q. What about your study did you find most surprising?
Marvel: “How difficult it was to get everyone with all their different expertise working together on a big, joint effort. In the end, I think everyone realized how important it was and how this will be a strong basis for everyone’s future research.”
Schmidt: “How consistent the results were across all three different approaches.”
Q. What was your role in the study?
Marvel: “I was one of the lead scientists on the section looking at historical constraints on sensitivity, making sure that we took into account the differences in how things changed over the 20th century and how things will change going forward, and working to make sure that the uncertainties in historical climate records were properly included.”
Schmidt: “I worked mainly on the paleoclimate section, making sure that we used the most appropriate data from key periods in the planet’s history (like the last ice age or the last time carbon dioxide was as high as it is now — some 3 million years ago).”
Few natural phenomena are as impressive or awesome to behold as glaciers and volcanoes. I’ve seen both with my own eyes. I’ve marveled at the enormous power of flowing ice as I trekked across a glacier on Washington’s Mount Rainier — an active volcano that is currently dormant. And I’ve hiked a rugged lava field on Hawaii’s Big Island alone on a moonless night to witness the surreal majesty of a lava stream from Kilauea volcano spilling into the sea — its orange-red lava meeting the waves in billowing steam — while still more glowing ribbons of lava snaked down the mountain slopes behind me.
There are many places on Earth where fire meets ice. Volcanoes located in high-latitude regions are frequently snow- and ice-covered. In recent years, some have speculated that volcanic activity could be playing a role in the present-day loss of ice mass from Earth’s polar ice sheets in Greenland and Antarctica. But does the science support that idea?
In short, the answer is a definitive “no,” though recent studies have shed important new light on the matter. For example, a 2017 NASA-led study by geophysicists Erik Ivins and Helene Seroussi of NASA’s Jet Propulsion Laboratory added evidence to bolster a longstanding hypothesis that a heat source called a mantle plume lies deep below Antarctica's Marie Byrd Land, explaining some of the melting that creates lakes and rivers under the ice sheet. While the study may help explain why the ice sheet collapsed rapidly in an earlier era of rapid climate change and why it’s so unstable today, the researchers emphasized that the heat source isn't a new or increasing threat to the West Antarctic ice sheet; it has been active over geologic timescales, and therefore represents a background contribution to the melting of the ice sheet.
I checked in with Ivins and Seroussi to get a deeper understanding of this question, which our readers frequently ask about. Here's what I learned…
Greenland Has a Long-Departed “Hot Spot” but Is Now Quiet
Since 2002, the U.S./German Gravity Recovery and Climate Experiment (GRACE) and GRACE Follow-On (GRACE-FO) satellite missions have recorded a rapid loss of ice mass from Greenland — at a rate of approximately 281 gigatonnes per year.
There’s plenty of evidence of volcanism in regions now covered by the Greenland ice sheet and the mountains around it, but this volcanic activity occurred in the distant past. Many of Greenland’s mountains are eroded flood basalts — high-volume lava eruptions that cover broad regions. Flood basalts are the biggest type of lava flows known on Earth.
But volcanic activity isn’t responsible for the current staggering loss of Greenland’s ice sheet, says Ivins. There are no active volcanoes in Greenland, nor are there any known mapped, dormant volcanoes under the Greenland ice sheet that were active during the Pliocene Epoch, which began about 5.3 million years ago (volcanoes are considered active if they’ve erupted within the past 50,000 years). In fact, he says, the history of the Greenland ice sheet is probably more connected to atmospheric and ocean heat than it is to heat from the solid Earth. Ten million years ago, there was very little ice present in Greenland. The whole age of ice sheet waxing and waning in the Northern Hemisphere didn’t really get going until about five million years ago.
While there are no active volcanoes in Greenland, scientists are confident a “hot spot” — an area where heat from Earth’s mantle rises up to the surface as a thermal plume of buoyant rock — existed long ago beneath Greenland because they can see the residual heat in Earth’s crust, Ivins says. While mantle plumes can drive some forms of volcanoes, Ivins says they aren’t a factor in the current melting of the ice sheet. Researchers hypothesize, however, that this residual heat may drive the flow of the Northeast Greenland Ice Stream, which penetrates hundreds of kilometers inland (an ice stream is a faster-flowing current of ice within a larger and more stagnant ice sheet). Recent modeling experiments show that if enough residual heat is present, it can initiate an ice stream. GPS measurements also provide evidence that a hot spot once existed beneath Greenland.
That hot spot subsequently moved, however, and now lies beneath Iceland — home to about 130 volcanoes, of which roughly 30 are active. The hot spot is at least partially responsible for the island’s high volcanic activity. Iceland also lies along the tectonically active Mid-Atlantic Ridge.
Antarctica Has Volcanoes, but There's No Link to its Current Ice Loss
The GRACE missions have also observed a rapid loss of ice mass in Antarctica, at a rate of approximately 146 gigatonnes per year since 2002. Unlike Greenland, however, there’s substantial evidence of volcanoes under the Antarctic Ice Sheet, some of which are currently active or have been in the recent geologic past. While the exact number of volcanoes in Antarctica is unknown, a recent study found 138 volcanoes in West Antarctica alone. Many of the active volcanoes are located in Marie Byrd Land. However, there’s no evidence of a dramatic volcanic eruption in Antarctica in the recent geologic past. Seroussi says details about the volcanism of many parts of Antarctica (particularly in East Antarctica) remain uncertain, both because they’re covered by ice and because their remoteness makes surveying them difficult.
Multiple additional lines of evidence point to Antarctica’s past and present volcanism. For example, topographic maps of the bedrock beneath the Antarctic ice sheet give scientists clues to suspected volcanic locations. Analyses of volcanic rock samples reveal numerous volcanic eruptive events within the last 100,000 years, as do ash layers in ice cores. In their 2017 study of Marie Byrd Land, Seroussi and Ivins estimated the intensity of the heat produced by the hypothesized mantle plume by studying the meltwater produced under the ice sheet, inferring its motion from changes in the elevation of the ice surface.
An intriguing paper by Loose et al. published in Nature Communications in 2018 provides additional evidence. The researchers measured the composition of isotopes of helium detected in glacial meltwater flowing from the Pine Island Glacier Ice Shelf. They found evidence of a source of volcanic heat upstream of the ice shelf. Located on the West Antarctic ice sheet, Pine Island Glacier is the fastest melting glacier in Antarctica, responsible for nearly a quarter of all Antarctic ice loss. By measuring the ratio between helium’s two naturally occurring isotopes, scientists can tell whether the helium taps into Earth’s hot mantle or is a product of crust that is relatively passive tectonically.
The team found the helium originated in Earth’s mantle, pointing to a volcanic heat source that may be triggering melting beneath the glacier and feeding the water network beneath it. However, the researchers concluded that the volcanic heat is not a significant contributor to the glacial melt observed in the ocean in front of Pine Island Glacier Ice Shelf. Rather, they attributed the bulk of the melting to the warm temperature of the deep-water mass Pine Island Glacier flows into, which is melting the glacier from underneath.
Seroussi notes the changes happening now, especially in West Antarctica, are along the coast, which suggests the changes taking place in the ice sheet have nothing to do with volcanism, but are instead originating in the ocean. Ice streams reaching inland begin to flow and accelerate as ice along the coast disappears.
In addition, Seroussi says the tectonic plate that Antarctica rests upon is one of the most immobile on Earth. It’s surrounded by activity, but that activity also tends to keep it locked in position. There’s no reason to believe it would change today to impact the melting of the Antarctic ice sheet.
So, in conclusion, while Antarctica’s known volcanism does cause melting, Ivins and Seroussi agree there’s no connection between the loss of ice mass observed in Antarctica in recent decades and volcanic activity. The Antarctic ice sheet is at least 30 million years old, and volcanism there has been going on for millions of years. It's having no new effect on the current melting of the ice sheet.
In the last few months, a number of questions have come in asking if NASA has attributed Earth’s recent warming to changes in how Earth moves through space around the Sun: a series of orbital motions known as Milankovitch cycles.
What cycles, you ask?
Milankovitch cycles include the shape of Earth’s orbit (its eccentricity), the angle that Earth’s axis is tilted with respect to Earth’s orbital plane (its obliquity), and the direction that Earth’s spin axis is pointed (its precession). These cycles affect the amount of sunlight and therefore, energy, that Earth absorbs from the Sun. They provide a strong framework for understanding long-term changes in Earth’s climate, including the beginning and end of Ice Ages throughout Earth’s history. (You can learn more about Milankovitch cycles and the roles they play in Earth’s climate here).
But Milankovitch cycles can’t explain all climate change that’s occurred over the past 2.5 million years or so. And more importantly, they cannot account for the current period of rapid warming Earth has experienced since the pre-Industrial period (the period between 1850 and 1900), and particularly since the mid-20th Century. Scientists are confident Earth’s recent warming is primarily due to human activities — specifically, the direct input of carbon dioxide into Earth’s atmosphere from burning fossil fuels.
So how do we know Milankovitch cycles aren’t to blame?
First, Milankovitch cycles operate on long time scales, ranging from tens of thousands to hundreds of thousands of years. In contrast, Earth’s current warming has taken place over time scales of decades to centuries. Over the last 150 years, Milankovitch cycles have not changed the amount of solar energy absorbed by Earth very much. In fact, NASA satellite observations show that over the last 40 years, solar radiation has actually decreased somewhat.
Second, Milankovitch cycles are just one factor that may contribute to climate change, both past and present. Even for Ice Age cycles, changes in the extent of ice sheets and atmospheric carbon dioxide have played important roles in driving the degree of temperature fluctuations over the last several million years.
The extent of ice sheets, for example, affects how much of the Sun’s incoming energy is reflected back to space, and in turn, Earth’s temperature.
Then there’s carbon dioxide. During past glacial cycles, the concentration of carbon dioxide in our atmosphere fluctuated from about 180 parts per million (ppm) to 280 ppm as part of Milankovitch cycle-driven changes to Earth’s climate. These fluctuations provided an important feedback to the total change in Earth’s climate that took place during those cycles.
Today, however, it’s the direct input of carbon dioxide into the atmosphere from burning fossil fuels that’s responsible for changing Earth’s atmospheric composition over the last century, rather than climate feedbacks from the ocean or land caused by Milankovitch cycles.
Since the beginning of the Industrial Age, the concentration of carbon dioxide in Earth’s atmosphere has increased 47 percent, from about 280 ppm to 412 ppm. In the past 20 years alone, carbon dioxide is up 11 percent.
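Those percentages follow directly from the concentrations themselves. A quick sanity check (the ppm value assumed for 20 years ago is an approximate round number, not a measured figure):

```python
# Approximate atmospheric CO2 concentrations, in parts per million
pre_industrial = 280.0    # ppm, roughly the pre-Industrial value
current = 412.0           # ppm, at the time of writing
twenty_years_ago = 370.0  # ppm, rough value around 2000 (assumption)

# Percent increase relative to each baseline
increase = (current - pre_industrial) / pre_industrial * 100
recent = (current - twenty_years_ago) / twenty_years_ago * 100

print(f"since pre-industrial: {increase:.0f}%")   # ~47%
print(f"over the past 20 years: {recent:.0f}%")   # ~11%
```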
Scientists know with a high degree of certainty that this increase in carbon dioxide is primarily due to human activities, because carbon produced by burning fossil fuels leaves a distinct “fingerprint” that instruments can measure. Over this same time period, Earth’s global average temperature has increased by about 1 degree Celsius (1.8 degrees Fahrenheit), and is currently increasing at a rate of 0.2 degrees Celsius (0.36 degrees Fahrenheit) per decade. At that rate, Earth is expected to warm another half a degree Celsius (almost a degree Fahrenheit) as soon as 2030 and very likely by 2040.
This relatively rapid warming of our climate due to human activities is happening in addition to the very slow changes to climate caused by Milankovitch cycles. Climate models indicate any forcing of Earth’s climate due to Milankovitch cycles is overwhelmed when human activities cause the concentration of carbon dioxide in Earth’s atmosphere to exceed about 350 ppm.
Scientists know of no natural changes to the equilibrium between the amount of solar radiation absorbed by Earth and the amount of energy radiated back to space that can account for such a rapid period of global warming. The amount of incoming solar radiation has increased only slightly over the past century and is therefore not a driver of Earth’s current climate warming.
Since 1750, the warming driven by greenhouse gases coming from the human burning of fossil fuels is over 50 times greater than the slight extra warming coming from the Sun itself over that same time interval. If Earth’s current warming were due to the Sun, scientists say we should expect temperatures in both the lower atmosphere (troposphere) and the next layer of the atmosphere, the stratosphere, to warm. Instead, observations from balloons and satellites show Earth’s surface and lower atmosphere have warmed while the stratosphere has cooled.
Finally, Earth is currently in an interglacial period (a period of milder climate between Ice Ages). If there were no human influences on climate, scientists say Earth’s current orbital positions within the Milankovitch cycles predict our planet should be cooling, not warming, continuing a long-term cooling trend that began 6,000 years ago.
There’s nothing cool about that.
"Pink elephant in the room" time: There is no impending “ice age” or "mini ice age," even if there's a reduction in the Sun’s energy output in the next several decades.
Through its lifetime, the Sun naturally goes through changes in energy output. Some of these occur over a regular 11-year cycle of peak activity (many sunspots) and low activity (fewer sunspots), which is quite predictable.
But every so often, the Sun becomes quieter for longer periods of time, experiencing far fewer sunspots and giving off less energy. This is called a "Grand Solar Minimum." The last time this happened, a period of extremely low solar activity from approximately AD 1650 to 1715, it coincided with part of the "Little Ice Age," when a combination of cooling from volcanic aerosols and low solar activity produced lower surface temperatures in the Northern Hemisphere.
Anomalous periods like a Grand Solar Minimum show that magnetic activity and energy output from the Sun can vary over decades, although the space-based observations of the last 35 years have seen little change from one cycle to the next in terms of total irradiance. Solar Cycle 24, which began in December 2008 and is likely to end in 2020, was smaller in magnitude than the previous two cycles.
On occasion, researchers have predicted that coming solar cycles may also exhibit extended periods of minimal activity. The models for such predictions, however, are still not as robust as models for our weather and are not considered conclusive.
But if such a Grand Solar Minimum occurred, how big of an effect might it have? In terms of climate forcing – a factor that could push the climate in a particular direction – solar scientists estimate it would be about -0.1 W/m2, the same impact as about three years of current carbon dioxide (CO2) concentration growth.
Thus, a new Grand Solar Minimum would only serve to offset a few years of warming caused by human activities.
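That "three years of CO2 growth" equivalence can be sanity-checked with a widely used simplified expression for CO2 radiative forcing, ΔF = 5.35 ln(C/C₀) (Myhre et al., 1998). The annual growth rate below is an assumed round number:

```python
import math

# Simplified CO2 radiative forcing: dF = 5.35 * ln(C / C0), in W/m^2
c0 = 414.0      # ppm, approximate concentration today
growth = 2.5    # ppm per year, assumed round number for recent growth
years = 3

dF = 5.35 * math.log((c0 + growth * years) / c0)
print(f"forcing from {years} years of CO2 growth: {dF:.2f} W/m^2")  # ~0.10
```

The result, roughly +0.1 W/m², is the same magnitude as the estimated -0.1 W/m² forcing from a Grand Solar Minimum, which is why the two roughly cancel over just a few years.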
What does this mean? The warming caused by the greenhouse gas emissions from the human burning of fossil fuels is six times greater than the possible decades-long cooling from a prolonged Grand Solar Minimum.
Even if a Grand Solar Minimum were to last a century, global temperatures would continue to warm. This is because factors other than variations in the Sun’s output also change global temperatures on Earth; the most dominant of these today is the warming from human-induced greenhouse gas emissions.
The Sun powers life on Earth; it helps keep the planet warm enough for us to survive. It also influences Earth’s climate: We know subtle changes in Earth’s orbit around the Sun are responsible for the comings and goings of the past ice ages. But the warming we’ve seen over the last few decades is too rapid to be linked to changes in Earth’s orbit, and too large to be caused by solar activity.
The Sun doesn’t always shine at the same level of brightness; it brightens and dims slightly, taking approximately 11 years to complete one solar cycle. During each cycle, the Sun undergoes various changes in its activity and appearance. Levels of solar radiation go up or down, as does the amount of material the Sun ejects into space and the size and number of sunspots and solar flares. These changes have a variety of effects in space, in Earth’s atmosphere and on Earth’s surface.
The current solar cycle, Solar Cycle 24, began in December 2008 and is less active than the previous two. It’s expected to end sometime in 2020. Scientists don’t yet know with confidence how strong the next solar cycle may be.
What Effect Do Solar Cycles Have on Earth’s Climate?
According to the United Nations’ Intergovernmental Panel on Climate Change (IPCC), the current scientific consensus is that long and short-term variations in solar activity play only a very small role in Earth’s climate. Warming from increased levels of human-produced greenhouse gases is actually many times stronger than any effects due to recent variations in solar activity.
For more than 40 years, satellites have observed the Sun's energy output, which has gone up or down by less than 0.1 percent during that period. Since 1750, the warming driven by greenhouse gases coming from the human burning of fossil fuels is over 50 times greater than the slight extra warming coming from the Sun itself over that same time interval.
Are We Headed for a ‘Grand Solar Minimum’? (And Will It Slow Down Global Warming?)
As mentioned, the Sun is currently experiencing a lower level of sunspot activity. Some scientists speculate that this may be the beginning of a Grand Solar Minimum — a decades-to-centuries-long period of low solar activity — while others say there is insufficient evidence to support that position. During a grand minimum, solar magnetism diminishes, sunspots appear infrequently and less ultraviolet radiation reaches Earth.
The largest recent event, the “Maunder Minimum” (1645 to 1715), overlapped with the “Little Ice Age” (13th to mid-19th century). While scientists continue to research whether an extended solar minimum could have contributed to cooling the climate, there is little evidence that the Maunder Minimum sparked the Little Ice Age, or at least not entirely by itself (notably, the Little Ice Age began before the Maunder Minimum). Current theories hold that a variety of events could have contributed, with natural fluctuations in ocean circulation, changes in land use by humans and cooling from a less active Sun all playing roles; overall, cooling caused by volcanic aerosols likely played the leading role.
Several studies in recent years have looked at the effects that another Grand Solar Minimum might have on global surface temperatures. These studies have suggested that while a grand minimum might cool the planet as much as 0.3 degrees C, this would, at best, slow down but not reverse human-caused global warming. There would be a small decline of energy reaching Earth; however, just three years of current carbon dioxide concentration growth would make up for it. In addition, the Grand Solar Minimum would be modest and temporary, with global temperatures quickly rebounding once the event concluded.
Moreover, even a prolonged Grand Solar Minimum or Maunder Minimum would only briefly and minimally offset human-caused warming.
Periodically, we receive queries asking if Earth is cooling. Although multiple lines of converging scientific evidence show conclusively that our climate is warming, stories sometimes appear in the media calling that into question. New studies are interpreted as contradicting previous research, or data are seen as conflicting with established scientific thinking.
Last spring, for example, a number of media outlets and websites reported on a story that looked at data acquired from NASA’s Goddard Institute for Space Studies (GISS) Surface Temperature Analysis (GISTEMP), which estimates changes in global surface temperature. The article discussed a short-term cooling period that showed up in the data in 2017 and 2018 and correctly stated that short-term cooling cycles are “statistical noise compared to the long-term trend.”
Afterward, we received some queries from readers who wanted to know if this finding meant a significant period of global cooling either could be or already was under way.
The answer is no. This story is a great example of why focusing on just a short period of time – say, one, two or even several years — doesn’t tell you what’s really going on with the long-term trends. In fact, it’s likely to be misleading.
So, what’s really important to know about studying global temperature trends, anyway?
Well, to begin with, it’s vital to understand that global surface temperatures are a “noisy” signal, meaning they’re always varying to some degree due to constant interactions between the various components of our complex Earth system (e.g., land, ocean, air, ice). The interplay among these components drives our weather and climate.
For example, Earth’s ocean has a much higher capacity to store heat than our atmosphere does. Thus, even relatively small exchanges of heat between the atmosphere and the ocean can result in significant changes in global surface temperatures. In fact, more than 90 percent of the extra heat from global warming is stored in the ocean. Periodically occurring ocean oscillations, such as El Niño and its cold-water counterpart, La Niña, have significant effects on global weather and can affect global temperatures for a year or two as heat is transferred between the ocean and atmosphere.
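The ocean's dominance in heat storage follows from its sheer mass and specific heat. A back-of-the-envelope comparison, using round-number estimates rather than measured values:

```python
# Rough total heat capacity = mass * specific heat (round-number estimates)
atm_mass = 5.1e18     # kg, mass of the atmosphere
atm_cp = 1000.0       # J/(kg K), approximate specific heat of air
ocean_mass = 1.4e21   # kg, mass of the ocean
ocean_cp = 4000.0     # J/(kg K), approximate specific heat of seawater

ratio = (ocean_mass * ocean_cp) / (atm_mass * atm_cp)
print(f"ocean/atmosphere heat capacity ratio: ~{ratio:.0f}")  # ~1000x
```

With roughly a thousand times the atmosphere's heat capacity, even a small transfer of ocean heat to the air, as happens during El Niño, can move global surface temperatures noticeably.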
This means that understanding global temperature trends requires a long-term perspective. An examination of two famous climate records illustrates this point.
You may be familiar with the Keeling Curve (above), a long-term record of global carbon dioxide concentrations. It’s not a straight line: The curve jiggles up and down every year due to the seasonal cycling of carbon dioxide. But the long-term trend is clearly up, especially in recent decades. As countries around the world rapidly develop and gross domestic products increase, human-produced emissions of carbon dioxide are accelerating.
During fall and winter in the Northern Hemisphere, when trees and plants begin to lose their leaves and decay, carbon dioxide is released into the atmosphere, mixing with emissions from human sources. This, combined with fewer trees and plants removing carbon dioxide from the atmosphere, allows concentrations to climb in winter, reaching a peak by early spring. During spring and summer in the Northern Hemisphere, plants absorb a substantial amount of carbon dioxide through photosynthesis.
Similarly, the above graph of long-term independent global temperature records maintained by NASA, NOAA and the UK’s Climatic Research Unit doesn’t show perfectly straight lines, either. There are ups and downs, and depending on when you start and stop, it’s easy to find numerous periods spanning multiple years where no warming occurred or when global temperatures even decreased. But the long-term trend is clearly up. To learn more about the relationship between carbon dioxide and other greenhouse gases and climate change, visit NASA’s Global Climate change website.
Growing Confidence in Earth Temperature Measurements
Scientists continue to grow increasingly confident that measurements of Earth’s long-term temperature rise in recent decades are accurate. For example, an assessment published earlier this year (Lenssen et al., 2019) of NASA’s GISTEMP record of global temperatures found that the agency’s estimate is accurate to within less than one-tenth of a degree Fahrenheit in recent decades. The authors concluded that Earth’s approximately 1 degree Celsius (2 degrees Fahrenheit) global temperature increase since 1880 can’t be explained by any uncertainty or data error. The recent trends were also validated with data from the Atmospheric Infrared Sounder (AIRS) instrument on NASA’s Aqua satellite.
Global Warming Is 'Global'
What’s perhaps most important to remember about global surface temperature fluctuations is that despite short-term ups and downs, the evidence shows that our planet is steadily accumulating heat. Scientists assessing global warming study Earth’s entire heat content, not just what happens in one part of the atmosphere or one component of the Earth system. And what they have found is that the balance of energy in the Earth system is out of whack: Our lower atmosphere is warming, the ocean is accumulating more energy, land surfaces are absorbing energy, and Earth’s ice is melting.
A study by Church et al. (2011) found that since 1970, Earth’s heat content has risen at a rate of 6 × 10²¹ joules a year. That’s the equivalent of taking the energy output of about 190,000 nuclear power plants and dumping it into the ocean every year.
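The nuclear power plant comparison checks out arithmetically, assuming a typical large plant produces about 1 gigawatt of power (an assumed round number):

```python
# Earth's annual heat gain vs. the annual energy output of large power plants
heat_gain = 6e21              # joules per year (Church et al., 2011)
plant_power = 1e9             # watts; a typical large ~1 GW plant (assumption)
seconds_per_year = 3.156e7

plant_energy = plant_power * seconds_per_year  # joules per year, per plant
equivalent = heat_gain / plant_energy
print(f"equivalent number of plants: {equivalent:,.0f}")  # ~190,000
```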
Despite short-term decreases in global temperature, the long-term trend shows that Earth continues to warm.
- Lenssen, N., G. Schmidt, J. Hansen, M. Menne, A. Persin, R. Ruedy, and D. Zyss, 2019: Improvements in the GISTEMP uncertainty model. J. Geophys. Res. Atmos., early view, doi:10.1029/2018JD029522.