CERN: Climate Models will need to be revised


I'm having a hard time understanding how including contaminated data is referred to as "flipping the sign" or "using the data upside down".
Mann's algorithm attempts to guess the relationship between temperature and the proxy by looking at the correlation with temperatures during the last 100 years. If the proxy is positively correlated then the algorithm assumes that temperatures are higher when the proxy value is higher in the past. If the proxy is negatively correlated then the algorithm assumes that temperatures are lower when the proxy value is higher in the past.

However, in the case of the Tiljander proxies the huge contamination in the last 100 years tricked Mann's algorithm into guessing the wrong sign. That is, Mann assumed that temperatures rose when the proxy value rose, when the physics of the proxy dictates that temperature falls as the proxy value rises. In other words, Mann used the proxy 'upside down'.

That said, the data contamination ensures that the results would be spurious when Mann's algorithm is applied even if there was not a sign error. The 'upside down' aspect of the problem simply provides irrefutable evidence that the data is useless. This is one of the reasons why I picked this particular example. There really is no rational justification for using this data in Mann's paper.
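Here is a toy sketch in Python of what I mean - the numbers are completely made up and this is not Mann's actual code, just an illustration of how a correlation screen can infer the wrong sign when the calibration window is contaminated:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1,000 years of synthetic temperatures; the last 100 years warm.
temp = rng.normal(0.0, 0.5, 1000)
temp[-100:] += np.linspace(0.0, 1.5, 100)

# A proxy whose physics dictate a NEGATIVE relationship:
# higher proxy values mean lower temperatures.
proxy = -temp + rng.normal(0.0, 0.3, 1000)

# Non-climatic contamination (e.g. runoff) swamps the last 100 years.
proxy[-100:] += np.linspace(0.0, 8.0, 100)

# Screening step: infer the sign from the calibration-window correlation.
r_calib = np.corrcoef(proxy[-100:], temp[-100:])[0, 1]
r_clean = np.corrcoef(proxy[:900], temp[:900])[0, 1]

print(r_clean < 0)  # True: the physical relationship is negative
print(r_calib > 0)  # True: contamination makes the screen pick '+'
```

The clean part of the record shows the physically correct negative relationship, but the contaminated calibration window hands the screening step a positive correlation.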

Edited by TimG


Mann's algorithm attempts to guess the relationship between temperature and the proxy by looking at the correlation with temperatures during the last 100 years. If the proxy is positively correlated then the algorithm assumes that temperatures are higher when the proxy value is higher in the past. If the proxy is negatively correlated then the algorithm assumes that temperatures are lower when the proxy value is higher in the past.

However, in the case of the Tiljander proxies the huge contamination in the last 100 years tricked Mann's algorithm into guessing the wrong sign. That is, Mann assumed that temperatures rose when the proxy value rose, when the physics of the proxy dictates that temperature falls as the proxy value rises. In other words, Mann used the proxy 'upside down'.

I read the notes on that. Is this what is referred to as EIV ? "Errors-In-Variables" ?

Mann seems to address this in "Potential Data Quality Problems" here:

http://www.pnas.org/content/suppl/2008/09/02/0805721105.DCSupplemental/0805721105SI.pdf#nameddest=ST1

But then, what does this mean ?

"Where the sign of the correlation could a priori be specified (positive for tree-ring data, ice-core oxygen isotopes, lake sediments, and historical documents, and negative for coral oxygen-isotope records), a one-sided significance criterion was used. "

Which one was used for lake sediments ?

That said, the data contamination ensures that the results would be spurious when Mann's algorithm is applied even if there was not a sign error. The 'upside down' aspect of the problem simply provides irrefutable evidence that the data is useless. This is one of the reasons why I picked this particular example. There really is no rational justification for using this data in Mann's paper.

If it passed a significance test why is it useless ? Is this about two different temperature series that pass significance tests or one series that passes significance tests in two different situations, with a different coefficient ? That seems strange...


Mann seems to address this in "Potential Data Quality Problems" here:
Mann's algorithm does a correlation test. We also know that his algorithm picked the wrong sign for the correlation. Therefore, his algorithm did NOT account for the contamination no matter what he claims. If he had accounted for the contamination he would have picked the sign that matches the physics of the proxy.

This is one of these points where you need to understand how correlation works before you can understand that Mann's claims are necessarily false.

Which one was used for lake sediments ?
The wrong sign was picked therefore it was a two sided test. If a one sided test had been done the data would have been excluded because there is no correlation if you use the physics of the proxy to determine the orientation.
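To make the one-sided vs. two-sided distinction concrete, here is a rough Python sketch using the Fisher z approximation. The correlation value of 0.43 is hypothetical, not a number from Mann's paper:

```python
import math

def fisher_p(r, n, alternative):
    """Approximate p-value for a Pearson correlation r (sample size n)
    using the Fisher z-transform and a normal approximation."""
    z = math.atanh(r) * math.sqrt(n - 3)
    upper = 0.5 * math.erfc(z / math.sqrt(2))    # P(Z > z)
    lower = 0.5 * math.erfc(-z / math.sqrt(2))   # P(Z < z)
    if alternative == 'two-sided':
        return 2 * min(upper, lower)
    return upper if alternative == 'greater' else lower

n, r_obs = 100, 0.43   # hypothetical calibration-window correlation

p_two = fisher_p(r_obs, n, 'two-sided')   # sign-agnostic screen
p_neg = fisher_p(r_obs, n, 'less')        # one-sided, negative expected

print(p_two < 0.05)   # True: passes the two-sided screen
print(p_neg < 0.05)   # False: fails the one-sided screen in the
                      # physically expected (negative) direction
```

A strongly positive observed correlation sails through a two-sided screen, but it fails a one-sided screen that requires the correlation to be negative, which is the whole point about which test was applied to the lake sediments.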
If it passed a significance test why is it useless ?
What you say makes no sense. Passing a significance test means nothing if the data is garbage (GIGO). This data is useless in the last 150 years. Edited by TimG

Mann's algorithm does a correlation test. We also know that his algorithm picked the wrong sign for the correlation. Therefore, his algorithm did NOT account for the contamination no matter what he claims. If he had accounted for the contamination he would have picked the sign that matches the physics of the proxy.

Two questions on this:

1. How do we know which method produced the "right" sign ?

2. Isn't Least Squares multivariate regression a standard process ? Is Mann doing something different than following that process ?

This is one of these points where you need to understand how correlation works before you can understand that Mann's claims are necessarily false.

The wrong sign was picked therefore it was a two sided test. If a one sided test had been done the data would have been excluded because there is no correlation if you use the physics of the proxy to determine the orientation.

It was a two sided test - ok.

What you say makes no sense. Passing a significance test means nothing if the data is garbage (GIGO). This data is useless in the last 150 years.

Well, it's statistically significant. That's the point - even random data will pass the test from time to time right ?
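Something like this quick Monte Carlo sketch shows it (toy numbers; 0.197 is roughly the usual two-sided 5% cutoff on a Pearson correlation with n = 100):

```python
import numpy as np

rng = np.random.default_rng(42)

n_series, n_years = 2000, 100
r_crit = 0.197   # approx. two-sided 5% critical value for n = 100

temp = rng.normal(size=n_years)   # a fixed "temperature" record
passes = 0
for _ in range(n_series):
    noise = rng.normal(size=n_years)          # a pure-noise "proxy"
    r = np.corrcoef(noise, temp)[0, 1]
    if abs(r) > r_crit:                       # passes the screen
        passes += 1

rate = passes / n_series
print(rate)   # close to 0.05: about 1 in 20 noise series passes
```

So yes: by construction, roughly 5% of completely random series pass a 5% significance screen.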


1. How do we know which method produced the "right" sign ?
Read the Tiljander paper. The relationship between temperatures and the proxies is clearly stated. That is the "right" sign. The Tiljander paper also clearly states that the relationship between temps breaks down in the last 100 years because of contamination from agricultural runoff. Therefore we can conclude that it is not possible to select and calibrate this data using correlations with temperatures in the last 100 years. Mann claims he was able to select and calibrate this data using correlation - a claim that is necessarily false if you look at the actual data provided by Tiljander.
Isn't Least Squares multivariate regression a standard process ? Is Mann doing something different than following that process ?
It is standard and we can assume that it has all of the limitations that come with that technique. One big limitation is that it can be fooled by data with non-random contamination.
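A quick sketch of that limitation (made-up series): random measurement error weakens a correlation but leaves its sign intact, while a structured non-random trend can flip it:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 100
temp = rng.normal(0.0, 0.5, n) + np.linspace(0.0, 1.5, n)   # warming period
clean = -temp + rng.normal(0.0, 0.3, n)                     # negative physics

noisy = clean + rng.normal(0.0, 1.0, n)      # random measurement error
dirty = clean + np.linspace(0.0, 8.0, n)     # structured contamination

r_noisy = np.corrcoef(noisy, temp)[0, 1]
r_dirty = np.corrcoef(dirty, temp)[0, 1]
print(r_noisy < 0)   # True: random noise weakens but keeps the sign
print(r_dirty > 0)   # True: a non-random trend flips it
```

That is the difference between the error model least-squares methods assume and the kind of contamination in this data.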
Well, it's statistically significant. That's the point - even random data will pass the test from time to time right ?
I am not sure what your point is. Edited by TimG

Read the Tiljander paper. The relationship between temperatures and the proxies is clearly stated. That is the "right" sign.

Isn't it possible for a different series to create different coefficients in different models, depending on the factors in the model ? It seems to me that it is.

And my question here:

"I read the notes on that. Is this what is referred to as EIV ? "Errors-In-Variables" ?"


Isn't it possible for a different series to create different coefficients in different models, depending on the factors in the model ? It seems to me that it is.
You cannot create a model that violates the physics of the proxies. The fact that Mann included the proxy is irrefutable evidence that his algorithm does not work. If the algorithm had worked the proxy would not have passed screening because the data does not correlate with temperatures.
"I read the notes on that. Is this what is referred to as EIV ? "Errors-In-Variables" ?"
No. EIV was developed to deal with random measurement error - not contamination of the form found in this data. Look at the assumptions that are implicit in EIV - they all assume some variation on random noise: http://en.wikipedia.org/wiki/Errors-in-variables_models Edited by TimG

Is this what is referred to as EIV ? "Errors-In-Variables"?

a part of that 'rathole' pursuit that I described you were entering, reflects upon a holier-than-thou positioning where particular statistical driven denialists hold "traditional statistics", over and above, developed/tailored statistical climate field reconstruction (CFR) applications... in this case EIV - described as a, 'simplified application of the RegEM climate field reconstruction method'. Of course, that RegEM algorithm applied within the EIV regression is derogatorily referred to by the parroting TimG, as the, "Mann meat-grinder algorithm"... notwithstanding RegEM was not developed by Mann, it simply allows denialists another avenue to continue their targeted attack against the Mann!

speaking to the robustness of proxy-based CFR methods, Mann and others published: Robustness of proxy-based climate field reconstruction methods


Read the Tiljander paper. The relationship between temperatures and the proxies is clearly stated. That is the "right" sign. The Tiljander paper also clearly states that the relationship between temps breaks down in the last 100 years because of contamination from agricultural runoff.

From what I could tell in the Tiljander paper, the statement of the relationship between temperature and the several aspects of lake sedimentation he discusses (varve thickness, mineral vs organic layers, magnetic susceptibility, etc) is kind of danced around the whole time. Never in the paper does he explicitly say hotter = thinner varves or vice versa. In fact, he does extensive interpretations of the various aspects of the data to try to explain how it agrees with climate variation and human habitation near the lake throughout historical times, always bringing in additional factors to consider besides just the varve thickness (which I presume is the proxy you are talking about). Furthermore, the noise in the data (in figure 5) is generally of greater amplitude than the changes that he is talking about (signal to noise ratio < 1).

From what I could tell, although he never stated it unambiguously, varve thickness was supposed to be negatively correlated with temperature. However, in the 20th century, varve thickness has increased substantially while temperatures have also increased, and this is explained by increased drainage into the lake due to human activities in the area (logging, etc).

I haven't looked at all at the Mann paper and don't know what his algorithm or his argument is, but based on my analysis of the Tiljander paper and the data he presents, there is no significant correlation between varve thickness and temperature. Maybe other lake sedimentation data from other places around the world is more robust? I don't know.

However, this much is clear from Tiljander's paper: if one was simply to take the last 100 years of varve thickness data in it, see the upward slope in varve thickness and associate that with the upward trend in temperatures, one would be misled.

Edited by Bonam

Is RegEM a type of regression analysis ?
I think you are on the wrong track. It makes no difference what algorithms Mann is using to do his correlation analysis because there is no such thing as an algorithm that can use correlation in the last 100 years to extract useful information from these proxies.

This is a situation like what is often shown on crime dramas where the investigators take a blurry photograph, zoom in 1000x and apply some magical 'image analysis' algorithm which turn a few pixels into a high-def image with a critical clue. It seems real but it is basically nonsense because there is no information in those pixels that would allow any algorithm to do what is shown. Same thing with these proxies and Mann: there is no information in the last 100 years that would allow Mann to do anything useful with them. They should not have been included in his reconstruction.

That is why I say all the information you need to know Mann is wrong is in the Tiljander paper, but it requires that you understand how correlation works and what its limitations are.

Edited by TimG

From what I could tell, although he never stated it unambiguously, varve thickness was supposed to be negatively correlated with temperature. However, in the 20th century, varve thickness has increased substantially while temperatures have also increased, and this is explained by increased drainage into the lake due to human activities in the area (logging, etc).
Exactly. The only qualification: the paper uses actual thickness and 'x-ray' thickness (a.k.a. density), where the x-ray thickness is the proxy that appears to have the most information in the more distant past.

That is why I say all the information you need to know Mann is wrong is in the Tiljander paper, but it requires that you understand how correlation works and what its limitations are.

If it's as simple as "Mann made a mistake by using contaminated data" then I guess Mann is just clearly wrong and refuses to acknowledge that. But why am I thinking there is more to this ?


If it's as simple as "Mann made a mistake by using contaminated data" then I guess Mann is just clearly wrong and refuses to acknowledge that. But why am I thinking there is more to this ?
There is nothing more to it. The main problem here is Mann who refuses to acknowledge even the smallest error. In order to avoid doing so he spreads misinformation which is picked up and amplified by various mouthpieces. If you think there is more to it, it is only because of this.

One piece of misinformation which is endlessly repeated is that 'multivariant regression is insensitive to sign'. It is misinformation because multivariant regression still needs a sign assigned for each proxy and that sign can be wrong. The fact that multivariant regression can handle proxies with either sign does not address the criticism in any way. Can you see the distinction I am making?

Edited by TimG

a part of that 'rathole' pursuit that I described you were entering, reflects upon a holier-than-thou positioning where particular statistical driven denialists hold "traditional statistics", over and above, developed/tailored statistical climate field reconstruction (CFR) applications... in this case EIV - described as a, 'simplified application of the RegEM climate field reconstruction method'. Of course, that RegEM algorithm applied within the EIV regression is derogatorily referred to by the parroting TimG, as the, "Mann meat-grinder algorithm"... notwithstanding RegEM was not developed by Mann, it simply allows denialists another avenue to continue their targeted attack against the Mann!

speaking to the robustness of proxy-based CFR methods, Mann and others published: Robustness of proxy-based climate field reconstruction methods

Is RegEM a type of regression analysis?

no - the regression, proper, is the EIV ("error-in-variables")... RegEM is a series of methodologies for determining coefficients within that EIV regression, most particularly per Mann et al, 1=> a 'truncated total least squares (TTLS)' regularization method in regards to, "handling observational errors" and, 2=> a 'ridge (regression)' regularization method in regards to, "gaps in the available individual annual mean temperature data".

again, as I mentioned, RegEM was not created by Mann (no matter how improperly derisive TimG continues his parroting by referring to it as "Mann's meat-grinder algorithm")... the actual originator - Tapio Schneider per "Analysis of Incomplete Climate Data: Estimation of Mean Values and Covariance Matrices and Imputation of Missing Values"... speaking to both ridge and TTLS within RegEM
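for the flavour of it, here is a toy regularized-EM imputation in Python - a drastic simplification of Schneider's RegEM, with made-up data, just to show the ridge-regression-inside-EM idea of filling gaps from correlated series:

```python
import numpy as np

def regem_toy(X, mask, lam=1.0, iters=20):
    """Toy regularized-EM imputation (NOT Schneider's actual RegEM):
    iteratively refill entries flagged in `mask` using a ridge
    regression of each incomplete column on the other columns."""
    X = X.copy()
    X[mask] = X[~mask].mean()               # crude initial fill
    for _ in range(iters):
        for j in range(X.shape[1]):
            miss = mask[:, j]
            if not miss.any():
                continue
            others = np.delete(X, j, axis=1)
            A, y = others[~miss], X[~miss, j]
            # ridge regression: (A'A + lam*I)^-1 A'y
            beta = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]),
                                   A.T @ y)
            X[miss, j] = others[miss] @ beta
    return X

# Three strongly correlated synthetic series; "lose" the first 50
# values of column 0 and reconstruct them from the other two.
rng = np.random.default_rng(7)
z = rng.normal(size=(200, 1))
truth = z @ np.ones((1, 3)) + 0.1 * rng.normal(size=(200, 3))

mask = np.zeros(truth.shape, dtype=bool)
mask[:50, 0] = True

filled = regem_toy(truth, mask)
err = np.abs(filled[:50, 0] - truth[:50, 0]).mean()
print(err < 0.3)   # True: small reconstruction error
```

the real RegEM uses ridge or TTLS regularization of the estimated covariance matrix within an EM loop, but the gap-filling-from-covarying-series intuition is the same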

If it's as simple as "Mann made a mistake by using contaminated data" then I guess Mann is just clearly wrong and refuses to acknowledge that. But why am I thinking there is more to this?

is this you, uhhh... falling into the TimG meat-grinder? :lol: Let me again play my earlier devil's advocate role... directly with your comment and the related Peer-Review discussions that TimG so casually waves off, declaring the point an "irrelevant distraction". Just how would you expect Dr. Mann to respond to your hypothetical suggestion of, as you propose, "being clearly wrong & refusing to acknowledge it"? How does that actually transpire... the mechanics of it... when the loud denialist howlers TimG parrots can't actually be bothered to formalize something for Dr. Mann to respond to... after 3+ years and counting. Just what forum would you like Dr. Mann to respond in? :lol: Dr. Mann, like any/all scientists, is not infallible... when legitimate errors have been brought forward in the past, he most certainly has issued corrigendum. Another of TimG's false narratives, as you've just now read (again), is to paint Dr. Mann as broadly unwilling to accept and acknowledge, as TimG falsely states, "even the smallest error".


There is nothing more to it. The main problem here is Mann who refuses to acknowledge even the smallest error. In order to avoid doing so he spreads misinformation which is picked up and amplified by various mouthpieces. If you think there is more to it, it is only because of this.

Really.

One piece of misinformation which is endlessly repeated is that 'multivariant regression is insensitive to sign'. It is misinformation because multivariant regression still needs a sign assigned for each proxy and that sign can be wrong. The fact that multivariant regression can handle proxies with either sign does not address the criticism in any way. Can you see the distinction I am making?

I think it's "multivariate"... that's how I always have seen it written.

If the core data series' sign is wrong, then it doesn't matter because you'll get a negative sign of the coefficient and -1 * -1 = 1.

But that's not the problem.

The problem - according to you - is that he used bad data and that is what caused the coefficient to be negative.
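Right - that double flip is easy to demonstrate with toy data: negating the predictor negates the fitted coefficient, and the fitted values come out identical either way:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(0.0, 0.5, 200)

def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    xc = x - x.mean()
    return float(xc @ (y - y.mean()) / (xc @ xc))

b_pos = ols_slope(x, y)      # close to +2
b_neg = ols_slope(-x, y)     # close to -2: the coefficient absorbs the flip

# Fitted values are identical either way: b*x == (-b)*(-x)
fit_pos = b_pos * (x - x.mean()) + y.mean()
fit_neg = b_neg * (-x - (-x).mean()) + y.mean()
print(np.allclose(fit_pos, fit_neg))   # True
```

So the regression step itself really is insensitive to the orientation of a series; the dispute is about the screening step, where a sign is chosen before the regression ever runs.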


is this you, uhhh... falling into the TimG meat-grinder? :lol: Let me again play my earlier devil's advocate role... directly with your comment and the related Peer-Review discussions that TimG so casually waves off, declaring the point an "irrelevant distraction". Just how would you expect Dr. Mann to respond to your hypothetical suggestion of, as you propose, "being clearly wrong & refusing to acknowledge it"?

I would not expect that. I would expect the peer review process to highlight such an obvious error.

Another of TimG's false narratives, as you've just now read (again), is to paint Dr. Mann as broadly unwilling to accept and acknowledge, as TimG falsely states, "even the smallest error".

"The smallest error" might not be of consequence - such as the double-reverse sign issue discussed above. If I try to regress depth below sea level, for example, as a positive number (below sea level) or a negative number (i.e. height) over some series, it will come out the same.


If the core data series' sign is wrong, then it doesn't matter because you'll get a negative sign of the coefficient and -1 * -1 = 1.
Look, Mann admits the sign matters when he said this:
Where the sign of the correlation could a priori be specified (positive for tree-ring data, ice-core oxygen isotopes, lake sediments, and historical documents, and negative for coral oxygen-isotope records), a one-sided significance criterion was used.
He would have never written such a statement if sign did not matter to his algorithm. I am saying he should have applied a "one-sided significance criterion" to this proxy because the "sign of the correlation could a priori be specified". He did not because the proxy would not have passed screening if he had.

This is an error. There is no doubt.

Edited by TimG

Look, Mann admits the sign matters when he said this:

He would have never written such a statement if sign did not matter to his algorithm. I am saying he should have applied a "one-sided significance criterion" to this proxy because the "sign of the correlation could a priori be specified". He did not because the proxy would not have passed screening if he had.

This is an error. There is no doubt.

as much as you insist on 'argument by authority' and your self-declared/assigned authority to Tiljander's assessment, when she co-authored the paper, as I understand, she had relative inexperience and was still pursuing her doctorate. I believe it's now been determined that one of the proxies isn't even direct for temperature but rather an indirect proxy relative to melting snow run-off... and I believe two are actually derived measurements of the others. In any case, the possibly suspect level of Tiljander's experience stands up against 6-7 knowledgeable, well published scientists of standing within the 'Mann et al' grouping. I also recall she held and positioned one of those 'et al authors' as her personal mentor. Perhaps some of this reflects upon the reasoning behind why the Mann et al authors chose a two-sided test... why they chose not to recognize a Tiljander interpretation as an a priori orientation for a one-sided test. All your bull-shyte over upsidedownyness is nothing more than... bull-shyte. Ultimately, it comes down to what climate signal existed (significant, or not), what calibration/weighting was arrived at for the complete timeline, and what sensitivity testing was performed to evaluate the effect/degree of contamination within the particular segment of the overall timeline. Additionally, a simplistic comment was offered earlier suggesting selection should/could have chosen other lake sediments - clearly there weren't and there still aren't many lake sediment proxies to choose from for NH reconstructions.

The problem is it did not and your faith in it is woefully misplaced. This is ultimately why this issue matters.

whaaa! :lol: then for the sake of why the issue so matters, in spite of you calling it "small in the bigger picture", take your miscast and misplaced vitriol and rhetoric and apply it! Apply it. Formalize it and run it on through Peer-Review. Oh, wait... is there a problem in why no single person across the breadth of the denialsphere can seem to formalize a challenge through Peer-Review? You know... your flippant hand-wave and declaration of the issue as nothing more than an, "irrelevant distraction".


I believe it's now been determined that one of the proxies isn't even direct for temperature but rather an indirect proxy relative to melting snow run-off
More misinformation (otherwise known as lies):

In the actual proxy data accompanying the paper there were 2 separate time series:

For Darksum, higher values correspond to warmer and wetter summers with longer growing seasons.

For Lightsum, higher values correspond to cooler and wetter winters with more pronounced spring snowmelt.

Yet they both showed an uptick at the end because of contamination.

Mann used correlation to screen these and reversed the sign for 1 but not the other.

If Mann really believed that Tiljander got it wrong he would have had to reverse the signs of both.

In any case, Mann has no business changing the interpretation of the proxy authors.

His claim now is nothing but an after the fact excuse to cover up his screw up.
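A toy illustration of this point - made-up numbers, just hypothetical stand-ins for the two series, not the actual Tiljander data: give two proxies with opposite physical signs the same contaminated uptick, and a calibration-window correlation screen assigns both a positive sign, which contradicts the physics of exactly one of them:

```python
import numpy as np

rng = np.random.default_rng(5)

n = 100
temp = rng.normal(0.0, 0.3, n) + np.linspace(0.0, 1.5, n)  # warming period

# Hypothetical stand-ins for the two series: opposite physical signs.
darksum  =  temp + rng.normal(0.0, 0.3, n)    # physics: positive
lightsum = -temp + rng.normal(0.0, 0.3, n)    # physics: negative

# Both receive the same spurious uptick from contamination.
contamination = np.linspace(0.0, 6.0, n)
darksum  += contamination
lightsum += contamination

r_dark  = np.corrcoef(darksum,  temp)[0, 1]
r_light = np.corrcoef(lightsum, temp)[0, 1]
print(r_dark > 0, r_light > 0)   # both positive: the screen flips
                                 # the sign of exactly one of them
```

Since the two series cannot both be positively related to temperature, a screen that assigns them the same sign has necessarily gotten one of them wrong.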

Edited by TimG

What about the allegation that contaminated data was used ?
It is not an allegation. It is a fact that is clearly stated in the paper.

I am not sure there is any point to continue if you are not willing to acknowledge that certain points are facts and are not open to interpretation.

Edited by TimG
