All possible influences are considered, and at different times different influences can dominate. But they all work by affecting two basics: the rate at which heat energy arrives from the Sun and the rate at which it escapes. If an asteroid hits the Earth and dust turns day to night for a few years, we don't need to worry much about other influences for some time. But CO2 is far from insignificant. At current concentrations, IR photons of the right frequency have a mean free path of around 25 metres before they are captured by a CO2 molecule and randomly re-radiated, so it is quite a struggle for the radiation to escape. The net result is that CO2 is currently retaining 2.0 W/m² more than in 1750, and satellite measurements show this figure to be increasing, and its rate of increase to be increasing:
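The 2.0 W/m² figure can be sanity-checked against the widely used simplified forcing expression ΔF = 5.35·ln(C/C₀) from Myhre et al. (1998). A minimal sketch, assuming ~280 ppm for 1750 and ~410 ppm for today (both values are assumptions for illustration, not taken from the text above):

```python
import math

def co2_forcing(c_now, c_ref=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998):
    dF = 5.35 * ln(C / C_ref). c_ref defaults to an assumed 1750 level."""
    return 5.35 * math.log(c_now / c_ref)

# Assumed present-day concentration of ~410 ppm for illustration
print(round(co2_forcing(410.0), 2))  # ~2.04 W/m^2, close to the 2.0 quoted above
```

The agreement is rough but reassuring; the exact value depends on which year's concentration you plug in.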
‘That's why the name has been changed to climate change’ I'm tired of hearing this nonsense. The seminal paper, the one that first set the bells ringing, was Plass 1956, entitled "The Carbon Dioxide Theory of Climatic Change", and the CC in IPCC (founded 1988) surely does not stand for Global Warming. |
‘0.01 degree F higher than the last "warmest year on record"’ No, it wasn't. It was 0.12°C higher than the previous warmest year, 2015, and 0.35°C warmer than 1998, the "year warming stopped". For those of you stuck in the 19th century, that's 0.22°F and 0.63°F: sixty-three times higher than your claim and well outside the margin of error. NASA/GISS Global Temperature data. |
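The conversion is simple enough to check. For temperature *differences*, °F = °C × 9/5 (no +32 offset, since that only applies to absolute temperatures):

```python
def c_to_f_delta(dc):
    """Convert a temperature *difference* from Celsius to Fahrenheit.
    No +32 offset: that applies only to absolute temperatures."""
    return dc * 9.0 / 5.0

print(round(c_to_f_delta(0.12), 3))  # 0.216, i.e. ~0.22 F above 2015
print(round(c_to_f_delta(0.35), 2))  # 0.63 F above 1998
# 0.63 F is 63 times the claimed 0.01 F
```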
Funny Times February 2010 |
‘I really believe that if all the months when averaged over a decade had shown a warming trend then we would have been told.’ That's an interesting thought; it hadn't occurred to me to check. I'll run NASA/GISS's data through my software and see what it comes up with. There is so much month-to-month variability that it would be a bit unlikely for ALL 12 months to show the warming trend over all 5 decades of warming. Anyway, here are the monthly averages in °C per decade: [Table: average anomaly per decade for each month, Jan to Dec] As it happens, of the 60 averages, 59 DO show the warming trend. Only the last decade's February is different, falling one hundredth of a °C from the second-last. Every other month shows warming over every decade. So, now you have been told. Here it is graphically, each decade very clearly warmer than the previous one: NASA/GISS Global Temperature Anomaly data |
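The decade-by-month check described above can be sketched as follows. The real GISS file is not embedded here, so this uses a synthetic stand-in with a steady warming trend plus noise; `decadal_monthly_means` and the 1965 start year are illustrative choices, not the author's actual software:

```python
import numpy as np

def decadal_monthly_means(anomalies, start_year, decade_starts):
    """anomalies: 2-D array, one row per year, 12 columns (Jan..Dec).
    Returns {decade_start: array of 12 monthly means} for each decade."""
    out = {}
    for d in decade_starts:
        i = d - start_year
        out[d] = anomalies[i:i + 10].mean(axis=0)
    return out

# Synthetic stand-in for the GISS monthly series: ~0.17 C/decade trend
# plus month-to-month noise (assumed values, for illustration only).
rng = np.random.default_rng(0)
years = np.arange(1965, 2015)
trend = 0.017 * (years - 1965)
data = trend[:, None] + rng.normal(0, 0.02, (50, 12))

means = decadal_monthly_means(data, 1965, [1965, 1975, 1985, 1995, 2005])
# Count how many decade-on-decade monthly averages rose (4 transitions x 12 months)
rises = sum(
    (means[d2][m] > means[d1][m])
    for d1, d2 in zip([1965, 1975, 1985, 1995], [1975, 1985, 1995, 2005])
    for m in range(12)
)
print(rises, "of 48 decade-on-decade monthly averages show warming")
```

With a real trend this strong relative to the noise, essentially every monthly average rises decade on decade, which is the pattern the real data showed.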
‘DIE KALTE SONNE, WARUM DIE KLIMAKATASTROPHE NICHT STATTFINDET’ [The Cold Sun: Why the Climate Catastrophe Is Not Happening] It's early days yet, but I thought you might be interested in how Die Kalte Sonne's projection from 2011 was working out: NASA/GISS Global Temperature Anomaly data; Die Kalte Sonne. It is only a short time since publication, but it is quite remarkable how their prediction could have gone so wrong so quickly. Link: Climate Change, 10th September 2015. In fact, as I said to iemnutz elsewhere (unanswered), the numbers don't add up to a new Little Ice Age. From 1600 to 1650, when the LIA bottomed out, solar heating fell by 0.325 W/m². That's 0.065 W/m² per decade. Currently, heating due to rising CO2 levels is rising at 0.2 W/m² per decade (Phys.org), which would swamp any solar decrease like the one that caused the LIA. |
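The comparison above reduces to simple arithmetic: even if a solar decline the size of the LIA one started today, the CO2 forcing trend would outweigh it. A minimal check using the two figures quoted in the text:

```python
# Decadal forcing trends quoted above (W/m^2 per decade)
solar_drop_lia = -0.325 / 5   # the 1600-1650 solar decline, spread over 5 decades
co2_rise_now = 0.2            # current CO2 forcing trend (Phys.org figure)

net = co2_rise_now + solar_drop_lia
print(round(solar_drop_lia, 3))  # -0.065, as stated
print(round(net, 3))             # 0.135: still firmly positive, no new LIA
```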
‘Carbon dioxide is ALWAYS a lagging indicator of warming because during EVERY LAST WARMING PERIOD’ That is correct. It is not news, and it does not affect the theory and measured fact of CO2's greenhouse effect. Some people cling to the belief that, because warming causes increased CO2, increased CO2 cannot cause warming. That is logically incorrect and factually false. Warming raises the CO2 level not because of increased life, but because it makes CO2 less soluble in water and so drives it out of the oceans into the atmosphere. The CO2 then amplifies the warming that drove it out in the first place: positive feedback. Although the CO2 increase lags temperature at the start of a warming period, the period ends with temperature lagging CO2. |
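The positive feedback described above converges rather than running away as long as each round of amplification is smaller than the last. A minimal sketch of that geometric-series behaviour, where the feedback gain of 0.3 and the initial 1 °C are purely illustrative assumptions, not values from the text:

```python
def amplified_warming(initial_dt, gain):
    """Equilibrium warming from an initial push dT0 with feedback gain g < 1:
    dT = dT0 * (1 + g + g^2 + ...) = dT0 / (1 - g)."""
    if not 0 <= gain < 1:
        raise ValueError("gain must be in [0, 1) for the series to converge")
    return initial_dt / (1.0 - gain)

# Illustrative numbers only: 1 C of initial warming, assumed gain of 0.3
print(round(amplified_warming(1.0, 0.3), 2))  # 1.43 C after feedback
```

The point is that "CO2 lags at the start" and "CO2 amplifies the warming" are perfectly compatible: the lagged CO2 is the feedback term, not the initial push.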
‘... any quantitative data’ Here's a bit more, based on the Berkeley Lab experiment, which measured the rise in heat energy returned by CO2 over 10 years as 0.2±0.06 W/m². 0.2 W/m² over a decade adds up to 0.2×24×365×10/1000 = 17.52 kWh more per m² than 10 years previously. But only 3% goes into the atmosphere, i.e. 0.53 kWh. 1 kWh will raise the temperature of 860 kg of air by 1°C. The troposphere weighs about 7000 kg per m², but heating tails off towards the top, so we can take its effective weight to be half that, i.e. 3500 kg. So 0.53 kWh/m² would warm the troposphere from the bottom up by 0.53×860/3500 = 0.13°C. That is the rise we would expect the decade's CO2 increase to drive. The reality? Temperature data source: NASA/GISS. 1.66°C per century, or 0.166°C per decade, over the 50 years of global warming. We are in the right ballpark. Link: Sassy's Place, 7th September 2015 |
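The back-of-envelope chain above can be reproduced step by step. This sketch simply follows the text's own constants (including its 860 kg-per-kWh figure and the 3% atmospheric share); it is a transcription of the arithmetic, not an independent derivation:

```python
# Reproduce the text's back-of-envelope arithmetic, step by step.
forcing = 0.2                    # W/m^2 rise over the decade (Berkeley Lab)
kwh_extra = forcing * 24 * 365 * 10 / 1000
print(round(kwh_extra, 2))       # 17.52 kWh/m^2 over the decade

atmos_share = kwh_extra * 0.03   # only ~3% goes into the atmosphere
print(round(atmos_share, 2))     # 0.53 kWh/m^2

kg_per_kwh = 860                 # kg warmed by 1 C per kWh (figure used in the text)
effective_mass = 7000 / 2        # troposphere ~7000 kg/m^2, halved for top tail-off
warming = atmos_share * kg_per_kwh / effective_mass
print(round(warming, 2))         # 0.13 C expected per decade
```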
You have in the past suggested that the Dansgaard-Oeschger cycle might well have played a part in the warming that the Earth has recently experienced.
Strictly speaking, D-O events occur only during glacial periods, but Bond events are the equivalent phenomenon during interglacials, so I thought it would be interesting to perform a Fourier analysis of Loehle 2008, the temperature reconstruction which most emphasises the Medieval Warm Period and the Little Ice Age. The analysis highlighted three potential cycles, with periods of 1540, 418, and 239 years: T = 0.365cos(2π(year-844.7)/1540) + 0.117cos(2π(year-139.3)/418) + 0.101cos(2π(year-113.3)/239) Skeptical Science (Loehle); NASA/GISS Global Temperature Anomaly data; Wikipedia (Fourier analysis). The pink curve is a good fit, seldom straying outside the ±0.16°C uncertainty limit given for the Loehle original. At least it doesn't UNTIL we get to the 20th century. At that point the actual temperature zooms up and away from the curve, ending 0.7°C above it, well outside the uncertainty margin. So let's try again, this time allowing for the CO2 greenhouse effect: T = 1.517log(CO2)/log(2) + 0.368cos(2π(year-850.85)/1540) + 0.104cos(2π(year-138.38)/418) + 0.101cos(2π(year-107.8)/239) - 12.31 Skeptical Science (Loehle); NASA/GISS Global Temperature Anomaly data; Wikipedia (Fourier analysis) |
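Once the candidate periods are fixed, fitting the cosine amplitudes and phases is a linear least-squares problem (each cos(w - φ) term expands into a·cos(w) + b·sin(w)). A sketch of that step, using synthetic data generated from the three-cosine formula above; the year range and noise level are assumptions, not the Loehle data itself:

```python
import numpy as np

# The three candidate periods from the Fourier analysis (years)
periods = [1540.0, 418.0, 239.0]

years = np.arange(16, 1936)  # roughly the span of the Loehle reconstruction (assumed)
# Synthetic "temperature" built from the fitted curve above, plus noise
T = (0.365 * np.cos(2 * np.pi * (years - 844.7) / 1540)
     + 0.117 * np.cos(2 * np.pi * (years - 139.3) / 418)
     + 0.101 * np.cos(2 * np.pi * (years - 113.3) / 239))
rng = np.random.default_rng(1)
T_noisy = T + rng.normal(0, 0.05, years.size)

# With periods fixed, the model is linear in a*cos(w) + b*sin(w) per period
cols = []
for p in periods:
    w = 2 * np.pi * years / p
    cols += [np.cos(w), np.sin(w)]
cols.append(np.ones_like(years, dtype=float))  # constant offset term
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, T_noisy, rcond=None)

# Amplitude of each cycle is sqrt(a^2 + b^2)
amps = [np.hypot(coef[2 * i], coef[2 * i + 1]) for i in range(3)]
print([round(a, 3) for a in amps])  # should recover roughly 0.365, 0.117, 0.101
```

Finding the periods themselves is the Fourier-analysis step; this only shows how the coefficients in the quoted formula follow once the periods are chosen.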
‘It is not Co2, but cloud cover which controls temperatures’ You seem to be on a quest for a non-existent Holy Grail: a single factor that controls all temperature. Many things influence global temperature change. Over the last half century the dominant factor has been rising atmospheric carbon dioxide. That has now been clearly demonstrated; all else is illusion. Someone yesterday suggested that the 60-year Atlantic Multidecadal Oscillation was the main driver, so I did a bit of curve fitting on that, and it seemed to contribute 20-25% of the warming, with CO2 contributing the rest. But you could just skip to the last of the three exercises to see what happens when we rule out CO2: Temperature data source: NASA/GISS. Although the 60-year cycle is plain to see, it has an amplitude of only ±0.074°C, i.e. 0.15°C top to bottom, out of 0.7°C of warming over 1975-2014, i.e. 21%. Much of the remaining noise is down to El Niño/La Niña, a few volcanic eruptions, and some solar variation. A curve (red) incorporating these fits the data (blue) quite well: Temperature data source: NASA/GISS. However, if we include everything except CO2, the result is quite lamentable: Temperature data source: NASA/GISS |
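The 21% figure follows directly from the fitted amplitude: a sinusoid of amplitude ±0.074 °C can contribute at most its trough-to-peak swing to any warming, so:

```python
# Maximum share of the 1975-2014 warming a +/-0.074 C 60-year cycle can supply
amplitude = 0.074        # C, fitted amplitude of the AMO-like cycle
swing = 2 * amplitude    # trough-to-peak, ~0.15 C
observed = 0.7           # C of warming 1975-2014
share = swing / observed
print(round(swing, 2))           # 0.15 C top to bottom
print(round(100 * share), "%")   # ~21 %, as stated
```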
There is no question but that the climate is warming.
The adjustments made to pre-WWII temperatures, particularly in the USA, did increase the official rise in land temperatures. But adjustments to above-ocean temperatures had the opposite effect, reducing the apparent warming rate. The net result of ALL homogenisation adjustments globally was to REDUCE the apparent warming: Source
NOAA have been making more corrections recently which only really affect the years just before 1940, as you can see in the upper graph below. See RealClimate.com for details. The lower graph shows the effect of ALL corrections, confirming that the net effect is to decrease the apparent warming. Not what you would expect if they were trying to fake global warming. RealClimate.com. Generalised Climate Model Equation: model = c1(log(CO2+1)/log(2)) + c2E + c3A + c4cos(2π(T+c5)/c6) + c7cos(2π(T+c8)/c9) + c10cos(2π(T+c11)/c12) + c13R + c14S + c15V + c16
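The generalised equation above translates directly into code. A hedged sketch, keeping the unnamed indices E, A, R, S, V abstract exactly as in the equation (the coefficient values in the demo are assumptions chosen only to exercise the CO2 term):

```python
import math

def climate_model(co2, E, A, R, S, V, T, c):
    """Sketch of the generalised model equation: c is a list of the 16
    coefficients c1..c16 (indexed 0..15); E, A, R, S, V are the unnamed
    input indices from the equation; T is the year."""
    return (c[0] * math.log(co2 + 1) / math.log(2)
            + c[1] * E + c[2] * A
            + c[3] * math.cos(2 * math.pi * (T + c[4]) / c[5])
            + c[6] * math.cos(2 * math.pi * (T + c[7]) / c[8])
            + c[9] * math.cos(2 * math.pi * (T + c[10]) / c[11])
            + c[12] * R + c[13] * S + c[14] * V + c[15])

# Demo with assumed coefficients: only the CO2 term active (c1 = 1.5),
# cycle periods set to a dummy 60 years to avoid division by zero.
c = [1.5] + [0] * 4 + [60, 0, 0, 60, 0, 0, 60, 0, 0, 0, 0]
diff = (climate_model(560, 0, 0, 0, 0, 0, 2000, c)
        - climate_model(280, 0, 0, 0, 0, 0, 2000, c))
print(round(diff, 2))  # ~1.5: a doubling of CO2 adds ~c1 to the model
```

The log-base-2 form is what makes each doubling of CO2 contribute the same amount, which is why c1 is effectively a per-doubling sensitivity.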
We are discussing the global warming that has occurred since the 1970s, and what I said was quite correct: there has been NO INCREASE in solar radiation over that period. Certainly it is at a high level, but since the 1970s there has been no increase to correspond to the rise in temperature.