Pseudoscience Be Gone: Climate Change

Welcome back, dear readers, and I hope you’re ready for some inconvenient truths. (Is that reference too dated? I’m still new to this blogging thing.)

This entry is part of a section called Pseudoscience Be Gone, in which I will confront (sadly) common pseudoscientific beliefs with cold, hard facts. In this entry, I won’t be talking about the actual pseudoscientific beliefs much; instead, I will focus on the evidence that exists to support the ideas that 1) global warming is real and 2) humans are having a profound effect on global warming.

I’ll start by discussing what climate change actually is. “Climate” can be described as “the average weather”, and the scientific definition of “climate change” is often different from the political one. The scientific definition is “a change in the statistical properties of the climate system on a large scale over long periods of time”. “Statistical properties” means averages (such as the average temperature or the average concentration of carbon dioxide in the air) as well as how much those properties vary. The “climate system” is the sum of five zones covering the Earth: the atmosphere, the hydrosphere (liquid water), the cryosphere (ice and permafrost), the land surface, and the biosphere (the portion of the Earth populated by living things). As for the political definition, when most people talk about climate change these days, what they really mean is anthropogenic global warming: the increase in the Earth’s surface temperature caused by humans. That “large scale and long periods of time” bit from the scientific definition still applies, though; anthropogenic global warming refers to what’s happening to the entire Earth over hundreds of years. So the next time someone brings up El Niño in a discussion about climate change, you can tell them that it doesn’t count; the change doesn’t last long enough.
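If it helps to see that distinction concretely, here’s a tiny Python sketch (the temperatures are numbers I made up for illustration) showing that “climate” is just the statistics of many years of “weather”:

```python
# Toy illustration: "climate" is the statistics of "weather".
# The temperatures below are invented for demonstration, not real data.
import statistics

# Imagine 30 years of average annual temperatures (degrees C) for one city.
annual_temps = [14.2, 13.8, 14.5, 14.1, 13.9, 14.7, 14.3, 14.0, 14.6, 14.2,
                14.4, 13.7, 14.8, 14.1, 14.3, 14.9, 14.2, 14.5, 14.0, 14.6,
                14.7, 14.3, 15.0, 14.4, 14.8, 14.5, 15.1, 14.6, 14.9, 15.2]

# Weather is any single year's value; climate is the long-run average
# of those values plus how much they vary.
climate_mean = statistics.mean(annual_temps)
climate_spread = statistics.stdev(annual_temps)
print(f"Climate (30-year mean): {climate_mean:.2f} C")
print(f"Year-to-year variability: {climate_spread:.2f} C")
```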

“Global warming is real! Science shows that!” is something I hear a lot from frustrated people who are in touch with the truth. And they’re right, but how do scientists figure out that the Earth is getting warmer, and what evidence do they actually have? That’s what I’d like to focus on for the rest of this entry. As a cancer biologist, I started research for this entry without much knowledge of how environmental scientists actually gather evidence for climate change. There are a number of sources that scientists can turn to when looking at how Earth’s climate has changed over time: ice cores, fossil records, sediment layers, floral and faunal records, and meteorological stations.

I do love fossils, but my favorite climate change information source to research was the ice cores. Ice cores are cylindrical samples of ice taken with a special kind of drill called a core drill, and they provide some of the best records for investigating past climate conditions. Not only does the deepest ice in some cores go back hundreds of thousands of years, the cores can also be used to reconstruct a long list of conditions at a given time:

  • Temperature
  • Strength of air circulation in the atmosphere
  • Precipitation (rain/snowfall)
  • Ocean volume
  • Dust in the atmosphere
  • Volcanic eruptions
  • Solar variability
  • Amount of sea ice
  • Rate at which energy is converted to organic substances by marine life
  • Geographic extent of deserts
  • Forest fires
  • Radioactivity

How is this possible? Well, all of those can be figured out by looking at the levels of materials like dust, ash, and human pollutants that were in the air. Snowfall captures those things, and in some places (such as the poles), that snow doesn’t melt, so each year’s snow is preserved as a layer in the ice. Ice cores also provide valuable information about the ambient temperature, which has been useful in the hunt for evidence of anthropogenic global warming. Certain isotopes, like deuterium (hydrogen with an extra neutron), have a known relationship with temperature, so scientists can determine the temperature in a specific year from the concentration of deuterium at a particular point in the ice core. Known geological events, such as the extremely powerful eruption of Mount Tambora in 1815, can be used as “dating horizons” to figure out approximately what year an ice core layer is from. Increased radioactivity from nuclear bomb testing is also often used as one of these “dating horizons”.
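To make those two tricks concrete, here’s a rough Python sketch. Every number in it (the depths, the calibration slope, the reference values) is a placeholder I invented; real ice-core studies use carefully measured, site-specific calibrations:

```python
# A cartoon of two ice-core tricks from this section. All numbers here are
# invented placeholders, not real measurements.
import numpy as np

# 1) Dating horizons: layers whose dates we know from history anchor the
#    depth-to-year mapping; years in between are interpolated.
horizon_depths = np.array([0.0, 12.0, 40.0])   # metres (made up)
horizon_years = np.array([2000, 1963, 1815])   # surface, bomb tests, Tambora

def depth_to_year(depth_m):
    # np.interp needs increasing x values, so we interpolate over depth.
    return np.interp(depth_m, horizon_depths, horizon_years)

# 2) Deuterium thermometry: the deuterium level (deltaD, in per mil) in a
#    layer tracks the air temperature when that snow fell, roughly linearly.
SLOPE = 9.0             # per mil of deltaD per degree C (placeholder)
REFERENCE_DD = -440.0   # deltaD at a reference temperature (placeholder)
REFERENCE_T = -55.0     # degrees C at that reference point (placeholder)

def deltaD_to_temperature(deltaD):
    return REFERENCE_T + (deltaD - REFERENCE_DD) / SLOPE

print(f"Layer at 25 m dates to about {depth_to_year(25.0):.0f}")
print(f"deltaD of -420 per mil implies about {deltaD_to_temperature(-420):.1f} C")
```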

As I mentioned, I also like fossils. One of the things that the fossil record does for scientists is to divide up geologic time, often by showing extinction events. (Well, okay, the division of Earth’s history into periods is mostly based on visible changes in layers of sedimentary rock, but fossils are involved too.) The fossil record currently shows five major mass extinctions, my favorite of which is the Great Dying, when 96% of all marine species and 70% of all land-dwelling vertebrate species died due to a possible asteroid impact and one of the most tremendous known volcanic eruptions on Earth. (If you ever want nightmares, look up “formation of Siberian Traps”.) That extinction event ended the Permian period and began the Triassic period. Periods are divided into epochs, which are further divided into ages; more on why that is important later. For now, I want to talk about foraminifera.

Foraminifera are tiny protozoans with carbonate shells. While these little critters are alive, their shells are formed from the elements found in the water where they live. Paleontologists can look at the ratios of oxygen isotopes (and other elements) in foraminifera shells and tell how much of the Earth’s water was locked up in ice when the animals were alive. This is important because it provides valuable information on a time when rapid climate change (that thing that’s happening now) happened before. This rapid climate change, which marked the end of the Eocene epoch and the beginning of the Oligocene epoch, was actually a transition from a “greenhouse” climate to a much cooler “icehouse” one. That rapid cooling triggered an extinction event of its own, smaller than the Great Dying but still a terrible time to be alive. So rapid changes in the Earth’s overall temperature? Those don’t bode well for the things living here.
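For the curious: those isotope ratios are usually reported in “delta” notation, the per-mil (parts per thousand) deviation of a sample’s ratio from a reference standard. Here’s a minimal sketch; the standard’s value is real (VSMOW), but the sample ratio is one I invented:

```python
# Delta notation: how isotope ratios are reported. A shell enriched in the
# heavy isotope (higher delta-18-O) generally means more of Earth's water was
# locked up in ice sheets when the shell formed. Sample value is illustrative.

R_STANDARD = 0.0020052  # 18O/16O ratio of the VSMOW reference standard

def delta18O(r_sample, r_standard=R_STANDARD):
    """Return delta-18-O in per mil (parts per thousand)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A hypothetical foraminifera shell slightly enriched in oxygen-18:
print(f"{delta18O(0.0020112):+.2f} per mil")  # about +3 per mil
```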

So we have all this evidence. What does it show, exactly? One of the things we have learned is that starting with the Industrial Revolution (let’s say 1750), ice cores show dramatically increasing concentrations of carbon dioxide and methane. To be specific, there has been a 40% increase in the concentration of carbon dioxide since the beginning of the Industrial Revolution. Carbon dioxide and methane are what are known as “greenhouse gases”: gases that absorb and re-emit heat (infrared radiation). These gases are the main cause of the “greenhouse effect”, which is what happens when a planet’s atmosphere warms its surface to a higher temperature than it would have without the atmosphere. Higher concentrations of greenhouse gases in the atmosphere mean higher surface temperatures.
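That 40% figure is simple arithmetic once the ice cores hand you the numbers. Pre-industrial carbon dioxide sat around 280 parts per million; a quick check in Python:

```python
# Checking the "40% increase" claim with round numbers: pre-industrial CO2
# of roughly 280 ppm versus roughly 395 ppm in recent measurements.
preindustrial_ppm = 280.0
recent_ppm = 395.0

increase = (recent_ppm - preindustrial_ppm) / preindustrial_ppm
print(f"CO2 increase since ~1750: {increase:.0%}")  # about 41%
```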

The Earth has had atmospheric carbon dioxide levels this high before: in an epoch called the Pliocene, which the Intergovernmental Panel on Climate Change has called the “benchmark for modern global warming”. Dr. Aradhna Tripati, an assistant professor at UCLA who studies Earth Sciences and is so smart she started university when she was 12 (really), says “Our data from the early Pliocene, when carbon dioxide levels remained close to modern levels for thousands of years, may indicate how warm the planet will eventually become if carbon dioxide levels are stabilized at the current value of 400 parts per million”. What Dr. Tripati is saying is that the Earth is currently on the bullet train to having a climate like the Pliocene’s, which would mean summertime Arctic temperatures 15 to 18 C warmer than they are today. Meteorological station records show that the Earth’s surface temperature has increased by a little over half a degree Celsius over the past century, and just that is already causing hotter days, heavier rainfall, stronger hurricanes, and more severe droughts. Now imagine what might happen if the Arctic gets 15 C warmer. One thing we can predict is that if the concentration of greenhouse gases in the atmosphere doesn’t change, there could be no Arctic sea ice in 50 to 100 years.
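In case you’re wondering how “half a degree per century” falls out of station records: essentially, you fit a trend line to many years of temperature readings. Here’s a toy version using synthetic anomalies I generated to mimic that trend (real analyses use thousands of stations and a lot of careful corrections):

```python
# Fitting a warming trend to station data, in miniature. The "anomalies"
# below are synthetic numbers invented for this example, not real records.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1915, 2015)
# Synthetic anomalies: 0.6 C of warming per century plus year-to-year noise.
anomalies = 0.006 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# Least-squares slope in degrees C per year, then scaled to per century.
slope_per_year = np.polyfit(years, anomalies, 1)[0]
print(f"Estimated trend: {slope_per_year * 100:.2f} C per century")
```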

Humans have had such a profound impact on the climate that many geologists believe we have entered a new geological epoch: the Anthropocene. While there is not yet a consensus in the scientific community regarding the Anthropocene, it is still extremely telling that humans have changed our environment so much that some experts are claiming we have entered a new unit of geological time. While we haven’t made a noticeable change in the sediment layers, 12% of the Earth’s land surface is now cropland. On the topic of humans cultivating so much land, Dr. Anna Behrensmeyer, a paleoecologist with the National Museum of Natural History, says “the shift from forest to grassland took millions of years, but the pace and rates were nothing like they are now. With the rates now, you don’t really give the animals and plants a chance to evolve”. Dr. Thomas Lovejoy, the Smithsonian assistant secretary for external affairs, cautions that “biological systems can take it, and take it, and take it, then all of a sudden there’s a tremendous reordering”. So it looks like we’re headed for a “tremendous reordering” from all the meddling we’ve been doing with the environment.

So the next time some ill-informed soul tells you climate change is a hoax, tell them about ice cores and foraminifera. And remember: science doesn’t care what you believe.

Credit to my sister-of-the-heart Rocky Blonshine for geology tips and my biological nuclear family for reading.