NASA’s Global Climate Change website gets a lot of user feedback. Aside from the typical random Internet trolls and students making thinly veiled attempts to get us to write their term papers, one of the most commonly asked questions goes something like this:
“Hey, NASA, are you really sure people are causing climate change? Have you double-checked?” or “Hey, NASA, I have an idea. Maybe climate change is caused by x, y, z and it’s not really caused by humans. You should look into this.”
The short answer to this type of question is “Yes, we’ve double-, triple-, quadruple-checked. It’s science! We check and recheck a gazillion times. We’ve looked into everything you could possibly imagine and more. Before we commit to what we say, we have a strong desire to make sure it’s actually true.”
One example of how careful we have to be is when we’re analyzing the carbon dioxide in Earth’s atmosphere from space. OCO-2 is the NASA mission designed to be sensitive enough to detect a change of a single part per million (ppm) of carbon dioxide in the atmosphere. The way it works is super complicated. And because carbon dioxide is the most important human contribution to climate change (the biggest issue of our time) and expectations for the science results were set very high, we have to be super-duper certain our measurements are correct.
That sensitivity makes the job very challenging.
The instruments on OCO-2 not only measure the absolute amount of carbon dioxide at a location; they also look for very small gradients in its distribution, the difference in carbon dioxide between one location and another as a function of time. For example, “a gradient on and off a city is like 2 parts per million,” explained Mike Gunson, project scientist for the mission. “You see 2 parts per million from any city of modest size on up. You’re looking at the difference between 399.5 and 401.5 parts per million. So you have to be careful. Nobody’s done this over New York City, Mumbai, Beijing or Shanghai, where it could be wildly different.”
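To make Gunson’s numbers concrete, here is a minimal sketch (not mission code; the function name and values are hypothetical, mirroring the example in the quote) of what an urban “enhancement” looks like: the difference between a column-average CO2 reading over a city and the nearby background.

```python
# Illustrative sketch, not the OCO-2 pipeline: computing an urban CO2
# gradient from two hypothetical column-averaged CO2 (XCO2) values, in ppm.

def city_gradient_ppm(xco2_over_city, xco2_background):
    """Difference between XCO2 over a city and the nearby background, in ppm."""
    return xco2_over_city - xco2_background

# Hypothetical values matching the example in the text:
background = 399.5  # ppm, away from the city
over_city = 401.5   # ppm, over the city

gradient = city_gradient_ppm(over_city, background)
print(f"Urban enhancement: {gradient:.1f} ppm")  # about 2 ppm for a modest-size city
```

The point of the sketch is how small the signal is: a roughly 2 ppm difference on top of a roughly 400 ppm background, i.e. about half a percent, which is why the instrument’s sensitivity matters so much.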
Scientists spend their lives working to get reliable data. Science is hard; it’s not a walk in the park. Nothing just lands in your lap. Sometimes it’s a miracle to get any data at all. People don’t often talk about the challenges of doing science, but if you could uncover the history of any project, you would probably find loads of problems, issues and challenges that came up.
After most NASA satellite launches, the instruments typically go through a validation phase, a two- or three-month period when engineers and project managers check, double-check and recheck the data coming in from the satellite to assess its quality and make sure it’s absolutely accurate before it’s released to the scientific community. But with OCO-2, “there is no validation phase,” Gunson told me, “because the measurements have such sensitivity. You’re always validating. Constant validation is an integral part of ensuring the integrity of the dataset.”
For OCO-2 to make an observation, the sky has to be clear, without clouds. Too much wind will move the carbon dioxide, so quiet meteorological conditions are also needed. Then, before we can make an inference, we have to assess the quality of the data, which requires exceptionally large computing capacity. Because there is so much data coming in, all sorts of analysis techniques, including machine learning, are used to assess data quality. OCO-2 launched in July 2014, and since this past September the data have been released for the broader science community to sink its teeth into. This means, Gunson said, “after a year of alligator-wrestling, all of a sudden we can walk it on a leash.”
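The screening steps described above (clear sky, quiet winds, then a quality check) can be sketched roughly like this. This is an illustration only, not the OCO-2 pipeline; the field names and thresholds are made up for the example.

```python
# Illustrative sketch, not the OCO-2 pipeline: keep a sounding only when
# basic observing conditions are met. Field names and thresholds are
# hypothetical, chosen just to show the shape of the screening step.

def passes_screening(sounding, max_cloud_fraction=0.05, max_wind_speed=5.0):
    """Accept a sounding only if the sky is nearly clear and winds are light."""
    return (sounding["cloud_fraction"] <= max_cloud_fraction
            and sounding["wind_speed_m_s"] <= max_wind_speed)

soundings = [
    {"cloud_fraction": 0.01, "wind_speed_m_s": 3.2},  # clear and calm: keep
    {"cloud_fraction": 0.40, "wind_speed_m_s": 2.0},  # too cloudy: reject
    {"cloud_fraction": 0.02, "wind_speed_m_s": 9.5},  # too windy: reject
]
good = [s for s in soundings if passes_screening(s)]
print(f"{len(good)} of {len(soundings)} soundings pass screening")
```

In the real mission, this kind of filtering runs over millions of soundings, which is where the large computing capacity and machine-learning techniques mentioned above come in.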
Learn more about NASA’s efforts to better understand the carbon and climate challenge.
I look forward to your comments.
This blog is moderated to remove spam, trolling and solicitations from this government website. We do our best to approve comments as quickly as possible.