Since I've read that sunspot data goes back to Galileo's day, I wonder why climate models DON'T incorporate it.
Very few sunspots were seen from about 1645 to 1715, a period referred to as the Maunder Minimum after the English astronomer Edward W. Maunder, who studied this unusual stretch of solar inactivity. This period corresponds to the middle of a series of exceptionally cold winters throughout Europe known as the Little Ice Age. Scientists still debate whether decreased solar activity helped cause the Little Ice Age or whether the cold snap merely coincided with the Maunder Minimum.

Several other, less extreme periods of decreased sunspot activity have been noted: the Oort Minimum (1010 to 1050), the Wolf Minimum (1280 to 1340), the Spörer Minimum (1420 to 1570), named after the German astronomer Gustav Spörer, and the Dalton Minimum (1790 to 1820). Indirect evidence from elemental isotopes suggests that there have been 18 such periods of reduced sunspot activity over the last 8,000 years, and that the Sun may spend as much as a quarter of its time in such minima. The Spörer, Wolf, and Oort Minima were all discovered through isotope analyses, since they occurred before the era of regular, reliable sunspot observations.

In contrast to these minima, sunspot counts have been higher than usual since around 1900, which has led some scientists to call this period the Modern Maximum. Likewise, a period called the Medieval Maximum, which lasted from 1100 to 1250, apparently had higher levels of sunspots and associated solar activity, and intriguingly coincides (at least partially) with a period of warmer climate on Earth known as the Medieval Warm Period.
windows.ucar.edu