“But the question the media love to debate is not: can we find a warming trend since 1998 which is outside what might be explained by natural variability? The question being debated is: is the warming since 1998 significantly less than the long-term warming trend? Significant again in the sense that the difference might not just be due to chance, to random variability? And the answer is clear: the 0.116 since 1998 is not significantly different from those 0.175 °C per decade since 1979 in this sense. Just look at the confidence intervals. This difference is well within the range expected from the short-term variability found in that time series. (Of course climatologists are also interested in understanding the physical mechanisms behind this short-term variability in global temperature, and a number of studies, including one by Grant Foster and myself, has shown that it is mostly related to El Niño / Southern Oscillation.) There simply has been no statistically significant slowdown, let alone a “pause”. – See more at: http://www.realclimate.org/index.php/archives/2014/12/recent-global-warming-trends-significant-or-paused-or-what/
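The significance test the quote describes can be sketched roughly as follows: fit a trend to the short recent period and to the long period, and ask whether the difference between the two slopes is larger than about two standard errors of that difference. This is a minimal sketch on synthetic data with made-up noise, not the actual HadCRUT4 series, and it ignores autocorrelation, which a real analysis would have to handle.

```python
# Sketch of comparing a short-period trend against a long-period trend.
# Synthetic data, illustrative numbers only (not real temperature records).
import numpy as np

def slope_and_se(x, y):
    """OLS slope and its standard error (ignoring autocorrelation)."""
    n = x.size
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((x - x.mean())**2))
    return slope, se

rng = np.random.default_rng(7)
years = np.arange(1979, 2015).astype(float)
# True trend 0.0175 C/yr (0.175 C/decade, echoing the quoted figure) plus noise
temps = 0.0175 * (years - 1979) + rng.normal(0.0, 0.09, years.size)

long_slope, long_se = slope_and_se(years, temps)
mask = years >= 1998
short_slope, short_se = slope_and_se(years[mask], temps[mask])

# Is the slope difference larger than ~2 standard errors of the difference?
# For short noisy periods, the short-trend uncertainty dominates and the
# difference is usually not significant.
diff = abs(long_slope - short_slope)
threshold = 2.0 * np.sqrt(long_se**2 + short_se**2)
print(diff, threshold)
```

The short period's standard error is much larger than the long period's, which is the quote's point: a 16-year trend estimate is too uncertain to be statistically distinguished from the long-term trend.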
Is not the real question, What is man’s influence on the global warming trend?
C’mon.
WTF do you think climate scientists do?
You somehow missed the “natural variability” thing?
Global dimming is undeniable — may account for slowing of global warming.
Re: TBTF – a friend sent me this Daily Kos piece and I’d like some input from some of you more deeply immersed in this than I am. My first reaction was to say, “No way ‘they’ (whoever your ‘they’ are) would allow depositors’ money to be used this way.” It struck me as a bit too “the sky is falling, the sky is falling!”
But in this up-is-down, down-is-sideways world we live in, who knows? If it’s true (as seems to be) that the biggest of the big money is running the show, do they feel so invulnerable as to rob us blind in broad daylight, as this piece indicates?
Inquiring minds want to know…
Sandi,
A link to the article you are referencing would help. Your description is a bit vague. TBTF is a rather generalized designation for lots of banking’s mysterious ways.
Ooops! Sorry about that –
http://www.dailykos.com/story/2014/12/03/1348957/-Bankers-Want-Your-Savings-As-Part-of-Their-Next-Bail-Out?detail=email
That’s what happens when my fingers outrun my brain.
Sandi
Tamino at “Open Mind” blew the “pause” out of the water for me back in January with this post:
http://tamino.wordpress.com/2014/01/30/global-temperature-the-post-1998-surprise/
He simply calculates the trend and variability range (+/- 2 standard deviations) for 1979-1997 and extrapolates them linearly from 1997 to 2013. On the same plot he draws flat (horizontal/ zero-trend) lines for the trend and range from 1997 to 2013. Then he adds the actual data points (from multiple sources) to the plot, and asks, which scenario (continued trend or pause) agrees best with the data points? The result is quite striking.
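The comparison described above can be sketched in a few lines. This is a minimal reconstruction of the idea on synthetic data (made-up trend and noise values, not the actual datasets Tamino used): fit the trend and residual spread on 1979–1997 only, extrapolate both the continued-trend line and a flat "pause" line past 1997, and count how many later points fall outside the ±2 standard deviation band around each scenario.

```python
# Sketch of the continued-trend vs. pause comparison on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2014).astype(float)          # 1979-2013
# Illustrative series: 0.0175 C/yr trend plus noise (not real observations)
temps = 0.0175 * (years - 1979) + rng.normal(0.0, 0.05, years.size)

# 1. Fit trend and residual spread on 1979-1997 only
early = years <= 1997
slope, intercept = np.polyfit(years[early], temps[early], 1)
resid_sd = np.std(temps[early] - (slope * years[early] + intercept))

# 2. Extrapolate the trend line past 1997...
late = years >= 1997
trend_line = slope * years[late] + intercept
# 3. ...and draw the competing flat "pause" line at the 1997 trend value
flat_line = np.full(int(late.sum()), slope * 1997.0 + intercept)

# 4. Count post-1997 points falling outside each scenario's +/- 2 sd band
miss_trend = int(np.sum(np.abs(temps[late] - trend_line) > 2 * resid_sd))
miss_flat = int(np.sum(np.abs(temps[late] - flat_line) > 2 * resid_sd))
print(miss_trend, miss_flat)
```

When the underlying trend really does continue, the flat-line scenario steadily drifts away from the data and accumulates misses, while the extrapolated trend stays consistent with it.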
EM, the RealClimate article is based upon an unofficial temperature dataset, the HadCRUT “hybrid” proposed by Cowtan and Way. Did you wonder why they chose that dataset? It shows warming.
Do you remember our discussion of several months past about the quality of the surface temperature dataset due to the adjustments? One of the major adjustments is the infilling of missing data. Guess what Cowtan and Way proposed? A new way to infill.
JimV, this pull quote from your referenced article is what the current science is saying: “And it has been shown (as climate scientists knew all along) that greenhouse gases aren’t the only factor influencing temperature, that “since 1998” we’ve seen the most prominent known non-greenhouse factors (el Nino southern oscillation, volcanic aerosols, and solar variations) conspire to lower global temperature.”
I would question the use of “conspire” (“to plot something wrong, evil, or illegal”), but Grant Foster (Tamino), in the beginning of the article, seems to accept and summarize the science.
When you come to Tamino you always have to watch the sleight of hand. He is comparing, in his selected dataset, these two periods: http://woodfortrees.org/plot/hadcrut4gl/from:1979/plot/hadcrut4gl/from:1997/trend/plot/hadcrut4gl/from:1979/trend
But he makes some assumptions about what the data should show in the 1997-forward period, then continues to show the entire period. http://tamino.files.wordpress.com/2014/01/hadcrut4.jpg?w=500&h=270
The other sleight-of-hand approach is to calculate the standard deviations of just the early part of the data (AN ACKNOWLEDGED FAST WARMING PERIOD), then apply them to the whole of his period without recalculating them using the later, slower period. It is a mathematical certainty that, using the more normal recalculation approach, all the data would fall within the standard deviations. So what he showed was the validity of his ASSUMPTION, if blah, blah, blah.
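The recalculation point above can be illustrated numerically: if a series warms fast early and slowly later, the residual spread around a line fit to the early portion alone is smaller than the spread around a single line fit to the whole period. This is a toy sketch with made-up slopes and noise, not real temperature data, and it takes no side on which choice of baseline is the right one.

```python
# Toy illustration: residual spread from an early-period fit vs. a
# full-period fit of a series whose warming rate slows partway through.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 2014).astype(float)
# Illustrative piecewise trend: fast to 1997, slow afterwards (made up)
temps = np.where(years <= 1997,
                 0.020 * (years - 1979),
                 0.020 * (1997 - 1979) + 0.004 * (years - 1997))
temps = temps + rng.normal(0.0, 0.03, years.size)

def resid_sd(x, y):
    """Standard deviation of residuals around an OLS straight-line fit."""
    slope, intercept = np.polyfit(x, y, 1)
    return np.std(y - (slope * x + intercept))

early = years <= 1997
sd_early = resid_sd(years[early], temps[early])   # early period only
sd_full = resid_sd(years, temps)                  # whole 1979-2013 span
print(round(sd_early, 3), round(sd_full, 3))
```

The full-period fit leaves systematic residuals around the kink, so its band is wider, which is why a band recomputed over the whole span covers more of the data points.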
The data did not completely match his assumption: in the several datasets he tested, it exceeded the cherry-picked standard deviations only at the very ends.
His argument falls apart when he shows the satellite data http://tamino.files.wordpress.com/2014/01/rss.jpg?w=500&h=270
and
http://tamino.files.wordpress.com/2014/01/uah.jpg?w=500&h=270
How does he explain this? He doesn’t! He shifts attention from his ASSUMPTION to highlighting “All sixteen years were hotter than expected…” What happened to the standard deviation and statistical significance argument with which he started?
It gets tiresome trying to explain how disinformation is so easily passed off to those who want/need to believe it.
JimV,
Yes, nice to see Tamino back.
Moderators, you are an interesting study in the liberal mind. Even your censorship is hidden.
CoRev:
If I or Dan or others sought to boot your butt, you would be in spam and forever tossed. We are painfully obvious and would tell you. I just approved your post (from yesterday); it was held for some reason other than you being censored. Not sure why it needed approving.
Multiple links are a trigger for spam-moderation at most websites.
CoRev, Tamino explains exactly what he is doing and why. To characterize this as sleight-of-hand can only be bias.
The key point of the post is that extrapolating from what we knew in 1997 about the existing trend and standard deviation is much more consistent with the observed data than assuming temperatures went flat-line for some unknown reason. To argue that including the whole data set would have changed the standard deviation is a non sequitur to this point.
Yes, if the whole data set is analyzed (1979-2013) it would have confidence intervals which included most of the data points, by definition. This is not an argument for either position. You neglect to mention that the whole data set would have a very significant positive trend also.
Or perhaps you want to cherry-pick a smaller data set and remove all of the 1979-1997 data? My old engineering boss Frank Ryan had a principle he called “the theory of Ridiculous Limits”. In the limit of taking the last two data points of any time series which has variation, you can obtain a positive, flat, or even negative trend, regardless of what the overall series shows. Math says that the more variation a series has, the more data points you need to draw valid conclusions.
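The "Ridiculous Limits" point above is easy to demonstrate: in a noisy series with a clear upward trend, the "trend" computed from only the last two points can come out positive, flat, or negative depending on the noise, while the full-series fit recovers the real slope. A toy sketch with made-up numbers:

```python
# Toy illustration of the "Ridiculous Limits" principle: trend estimates
# from tiny subsets of a noisy series are meaningless.
import numpy as np

rng = np.random.default_rng(42)
x = np.arange(30).astype(float)
y = 0.5 * x + rng.normal(0.0, 3.0, x.size)   # true slope 0.5, lots of noise

full_slope = np.polyfit(x, y, 1)[0]          # uses all 30 points
two_point_slope = y[-1] - y[-2]              # "trend" from the last 2 points

print(full_slope, two_point_slope)
```

The full-series slope lands close to the true 0.5; the two-point slope is dominated by noise and could land anywhere. The noisier the series, the more points are needed before the fitted slope means anything.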
The math has been explained over and over by Tamino and many others (especially at RealClimate), but Tamino’s post gives a simple graphical explanation that non-statisticians should be able to understand.
Run, history says otherwise. In the last great conservative cleaning, what you observed, the moderation hold, was exactly what happened to the conservatives remaining to comment. No notice. No comment. Just silence. What’s worse is they show up as comments on the sending machine. That’s what I find sneaky. You may think it a feature. 😉
JimV, I’m sorry, but I’ve been down this road too many times. His point was to disprove the existence of the pause: “A pause or not a pause, that is the question.”
Did he even show a dataset with the pause? No. Why not? Because it exists in nearly every dataset, and even the Wood For Trees average of them. http://www.woodfortrees.org/plot/hadcrut3gl/from:1997.33/trend/plot/gistemp/from:2001.33/trend/plot/rss/from:1996.65/trend/plot/wti/from:2000.9/trend/plot/hadsst2gl/from:1997.1/trend/plot/hadcrut4gl/from:2000.9/trend/plot/uah/from:2004.75/trend/plot/hadcrut3gl/from:1997/plot/gistemp/from:1997/plot/rss/from:1997/plot/wti/from:1997/plot/hadsst2gl/from:1997/plot/hadcrut4gl/from:1997/plot/uah/from:1997
It is becoming an accepted point with many of the climate scientists. We are up to nearly 60 papers and articles, mostly from big-name traditional climate scientists, trying to explain the reasons for the pause. They are past denying it. Tamino, not so much.
I do agree: “To argue that including the whole data set would have changed the standard deviation is a non sequitur to this point.”
Run, thank you. BTW, that was supposed to say “No notice.” in my response to you.
Ah, CoRev. Still with the so-called “pause,” eh?
Still don’t understand the deep sea data, do you?
There is no evidence for a “pause” in global warming.
Joel, take it up with Jerry Critter and Denis Drew. They and you mentioned global warming.
CoRev,
I think the pertinent word in Joel’s comment is “pause”, unless you are denying global warming.
Jerry Critter, Joel was referring to a previous discussion we had. In denying the pause in Global Warming, and referring to ocean currents, he is not talking about the pause in Surface Average Temperatures. That was the subject at hand. It is this kind of confused use of the terms that adds to the confusion.
It is not confusing at all, CoRev. For you it is inconvenient, so you ignore it totally except to comment that suddenly “now climate scientists are putting in more variables cause the old ones don’t work.”
’Course that is nonsense, as has been shown to you many times. Just the same kind of nonsense as your constant claims that scientists ignored natural variables until they needed them.
I think you should ask Tamino why he did not “show a dataset with the pause”. I would love to see his answer to you.
BTW,
Your comments on Cowtan and Way are despicable.
EM, most of your comments are made up of angry BS, but this is just unfathomable: “Your comments on Cowtan and Way are despicable.”
I said: “Do you remember our discussion of several months past about the quality of the surface temperature dataset due to the adjustments? One of the major adjustments is the infilling of missing data. Guess what Cowtan and Way proposed? A new way to infill.”
This is how Tamino explains their approach: “Their goal was to interpolate across unobserved areas in the best way available, by Kriging. They also used satellite data to supplement the interpolation.”
A translation: 1) “interpolate across unobserved areas” = INFILLING
2) “the best way available, by Kriging” = A NEW WAY TO INFILL, kriging with satellite data.
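For readers unfamiliar with the term, the "infilling" being argued about is spatial interpolation: estimating anomalies in unobserved grid cells from nearby observed ones. Kriging weights neighbours by a fitted spatial-covariance model; the sketch below uses the much simpler inverse-distance weighting to show the general shape of the idea. It is NOT the Cowtan and Way method, and all the station positions and values are made up.

```python
# Minimal inverse-distance-weighting infill of an unobserved grid cell.
# (Kriging replaces these ad-hoc weights with covariance-model weights.)
import numpy as np

def idw_infill(coords, values, targets, power=2.0):
    """Estimate values at unobserved points from observed (coords, values)."""
    out = []
    for t in targets:
        d = np.linalg.norm(coords - t, axis=1)      # distances to stations
        w = 1.0 / np.maximum(d, 1e-9) ** power      # closer = heavier weight
        out.append(np.sum(w * values) / np.sum(w))
    return np.array(out)

# Observed anomalies at four hypothetical stations (made-up numbers, deg C)
obs_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
obs_t = np.array([0.2, 0.4, 0.2, 0.4])

# Infill the unobserved centre cell, equidistant from all four stations
centre = idw_infill(obs_xy, obs_t, np.array([[0.5, 0.5]]))[0]
print(round(centre, 2))   # -> 0.3, the equal-weight average
```

The substantive debate in the thread is not whether infilling happens (every gridded product does some form of it) but whether this particular method produces a more accurate global average.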
Yeah, let’s attack a method we know is more accurate while ignoring that Smith uses the other methods.
You are despicable.
EM claims the Cowtan and Way method is “a method we know is more accurate”. We do not know it is more accurate. It still must be vetted by science, beyond just being peer reviewed.
When you say more accurate, more accurate than what? Hint: Cowtan and Way tried to improve just one surface measurement dataset, Hadley CRU’s. And they did this by using the acknowledged better satellite data.