Keepitsimple Posted November 30, 2009 Author Report Posted November 30, 2009 worthy of its own thread, hey! C'mon, Simple... this one's been beaten upon so heavily in the last days. But it's always refreshing to hear someone issue a supposed from-authority revelation... coming from a self-described old programmer who, apparently, knows nothing about how real software development does/may occur. This isn't code... this is simply a text file of running commentary used by an individual tasked with coming in after-the-fact and working to update product documentation. Invariably, programmers are loath to create extensive documentation - this is not unique to any industry, to any science. More progressive development methodologies (e.g. expert agile methodology), in fact, view creating detailed static documentation as an impediment, as a detriment, to the eventual success of the design, of the development. This approach has worked well for development shops; however, once deployed to production, Operations Support may face challenges depending on the hand-off between Development and Operations. Invariably, Operations Depts... those left to manage the production-ready code received from Development, have their own standards, inclusive of static documentation. Building that static documentation, after the fact, can be most intensive, most challenging, most frustrating and may reflect upon the skills, knowledge and... personality... of those tasked with writing (updating) the documentation. Some of that skill, knowledge and... personality... of the individual ('Harry') comes through in his self-documenting text file that reflects his after-the-fact efforts to update documentation. ... for completeness, this single individual's documentation update initiative reflects back upon a dated early legacy product, the CRU TS 2.1 product, which has no relationship to the most current product associated with the CRU HadCRUT temperature dataset. As usual Waldo, you're full of crap. 
If you'll read through all the comments in the source txt file, you'll see that this is an environment that is completely lacking in Change Control. That's not surprising - it's a University - not an establishment for commercial, bulletproof code. Why would you expect the CRU to have a successful project that is an extremely complex undertaking......when over 50% of IT projects fail? As with much scientific programming, they are using a lot of Fortran and IDL.....old languages that are great for large datasets/statistics/math - but cumbersome to develop and maintain. Agile and Expert methodologies have their place but really, they are just methods of iterative development - breaking down a project into small chunks and allowing somewhat for on-the-fly modifications to mitigate scope-creep......but if the last 10 or so years of development has shown anything - it's the importance of Change Control. But I'll go back to my original point - there are many references in the txt file that make it clear that this programmer doesn't understand the data that he is working with - he's fudging things. This project attitude (it's probably not HIM - it's the leadership and project management) is just not compatible with a successful IT project. One might say "well, he's only one programmer" but heck....look at the significance of the data he's working with. It's only the tip of the iceberg....and there is surely more to come. Quote Back to Basics
waldo Posted November 30, 2009 Report Posted November 30, 2009 ... for completeness, this single individual's documentation update initiative reflects back upon a dated early legacy product, the CRU TS 2.1 product which has no relationship to the most current product associated with the CRU HadCRUT temperature dataset. Another lie courtesy of RC. The code produces HadCRUTv3 as well as HadCRUTv2. More importantly the code shows that HadCRUTv3 was adjusted by fitting a curve to the HadCRUTv2 results which means any errors in HadCRUTv2 will be in HadCRUTv3. well... we see the 'lie' word again emanating from the mouths of the deniers pony up your source for that, bucky! You know you could avoid embarrassing yourself if you took time to fact check the pronouncements from RC. You will find that honesty is not their strong suit. From the README file for the code that produces the HadCRUT temperature series. I took the twelve 1990 anomaly files from the original 1901-2006 run (that was done with some flavour of anomdtb.f90). They were here: /cru/cruts/version_3_0/primaries/tmp/tmptxt/*1990* Then I modified the update 'latest databases' file to say that tmp.0705101334.dtb was the current database, and made a limited run of the update program for tmp only, killing it once it had produced the anomaly files. The run was #0908181048. So, under /cru/cruts/version_3_0/fixing_tmp_and_pre/custom_anom_comparisons, we have a 'manual' directory and an 'automatic' directory, each with twelve 1990 anomaly files. And how do they compare? NOT AT ALL!!!!!!!!! Example from January: crua6[/cru/cruts/version_3_0/fixing_tmp_and_pre/custom_anom_comparisons] head manual/tmp.1990.01.txt For some reason I don't think they would store v2 data in directories labeled version_3_0 Yours is an utterly pathetic... desperate... response; one that speaks to much of the frothing and gnashing emanating from denierblogworld, where the anal line-by-line parsing of hacked emails continues... 
searching for the holy grail to absolutely prove that global climate warming is a hoax!!! Let's see if we can get you to actually reference from your favourite go-to denying blogger... sure we can! If you really want to speak to embarrassment - yours: - again, this hacked email, the so-called "Harry_Read_Me.txt" hacked email, reflects upon nothing other than a lone individual's tasked work to update documentation relative to the dated legacy product associated with the migration from CRU TS2.1 to CRU TS3.0 datasets. - CRU TS2.1/TS3.0 datasets are different from the CRU CRUTEM3 dataset - CRU TS2.1/TS3.0 datasets are different from the most commonly used and referenced CRU HADCRUT3 dataset - again, the self-documenting txt file explicitly speaks to TS datasets... not CRUTEM3... and, in turn, not HADCRUT3 But please continue... let's see if we can finally get you to reference your source... the one that just might try to draw linkage between all those CRU disparate data gathering mechanisms, disparate metadata, disparate databases, disparate processing systems, etc., etc., etc. I really want to see the one, your source, that can speak to supporting your profound statement about "curve fitting HadCRUTv2/HadCRUTv3" relative to a self-documenting text file used to help an individual upgrade documentation reflective upon the migration from CRU TS2.1 to CRU TS3.0 datasets. And, of course, the one that will further embolden you to parrot how, as you (continue) to say, "RC lies" Quote
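For readers wondering what the README excerpt quoted above is actually doing when it compares the 'manual' and 'automatic' anomaly files, the check amounts to diffing two runs of the same product. A minimal sketch, under the assumption that the files are simply whitespace-separated numeric grids (the real CRU file layout is not shown in the excerpt, so this format is hypothetical):

```python
# Hedged sketch: compare two runs of the same anomaly product.
# The whitespace-separated layout here is an assumption, not
# CRU's actual file format.

def parse_anomalies(text):
    """Parse whitespace-separated anomaly values into floats."""
    return [float(tok) for tok in text.split()]

def max_abs_diff(a, b):
    """Largest absolute difference between two equal-length series."""
    if len(a) != len(b):
        raise ValueError("series lengths differ: %d vs %d" % (len(a), len(b)))
    return max(abs(x - y) for x, y in zip(a, b))

# Synthetic stand-ins for the contents of manual/tmp.1990.01.txt
# and automatic/tmp.1990.01.txt:
manual = parse_anomalies("0.12 -0.30 0.05 1.40")
automatic = parse_anomalies("0.10 -0.28 0.90 1.40")
print(max_abs_diff(manual, automatic))
```

A large maximum difference is the programmatic version of Harry's "NOT AT ALL!!!!!!!!!": the two runs that were supposed to reproduce each other do not.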
waldo Posted November 30, 2009 Report Posted November 30, 2009 worthy of its own thread, hey! C'mon, Simple... this one's been beaten upon so heavily in the last days. But it's always refreshing to hear someone issue a supposed from-authority revelation... coming from a self-described old programmer who, apparently, knows nothing about how real software development does/may occur. This isn't code... this is simply a text file of running commentary used by an individual tasked with coming in after-the-fact and working to update product documentation. Invariably, programmers are loath to create extensive documentation - this is not unique to any industry, to any science. More progressive development methodologies (e.g. expert agile methodology), in fact, view creating detailed static documentation as an impediment, as a detriment, to the eventual success of the design, of the development. This approach has worked well for development shops; however, once deployed to production, Operations Support may face challenges depending on the hand-off between Development and Operations. Invariably, Operations Depts... those left to manage the production-ready code received from Development, have their own standards, inclusive of static documentation. Building that static documentation, after the fact, can be most intensive, most challenging, most frustrating and may reflect upon the skills, knowledge and... personality... of those tasked with writing (updating) the documentation. Some of that skill, knowledge and... personality... of the individual ('Harry') comes through in his self-documenting text file that reflects his after-the-fact efforts to update documentation. ... for completeness, this single individual's documentation update initiative reflects back upon a dated early legacy product, the CRU TS 2.1 product, which has no relationship to the most current product associated with the CRU HadCRUT temperature dataset. As usual Waldo, you're full of crap. 
If you'll read through all the comments in the source txt file, you'll see that this is an environment that is completely lacking in Change Control. That's not surprising - it's a University - not an establishment for commercial, bulletproof code. Why would you expect the CRU to have a successful project that is an extremely complex undertaking......when over 50% of IT projects fail? As with much scientific programming, they are using a lot of Fortran and IDL.....old languages that are great for large datasets/statistics/math - but cumbersome to develop and maintain. Agile and Expert methodologies have their place but really, they are just methods of iterative development - breaking down a project into small chunks and allowing somewhat for on-the-fly modifications to mitigate scope-creep......but if the last 10 or so years of development has shown anything - it's the importance of Change Control. But I'll go back to my original point - there are many references in the txt file that make it clear that this programmer doesn't understand the data that he is working with - he's fudging things. This project attitude (it's probably not HIM - it's the leadership and project management) is just not compatible with a successful IT project. One might say "well, he's only one programmer" but heck....look at the significance of the data he's working with. It's only the tip of the iceberg....and there is surely more to come. Yes, I certainly see scope-creep in your own comments. Your generalizing about distinctions between commercial versus corporate versus academia… is simply that, generalizing… the best and worst can be found in all contexts… where you may or may not find degrees of, for example, dedicated data librarians, deeply entrenched coding standards, rigorous change control, defined QC, etc. 
I certainly won’t spend any cycles bothering to look through that hacked email/txt file… we can certainly leave that anal pursuit to those hell-bent upon finding the smoking gun to prove "the hoax"! … leave that to Riverwind! However, what I have seen highlighted of that text file is nothing that hasn’t been encountered a brazillion times over, particularly around after-the-fact documentation pursuits relative to the integration-migration initiatives involving dated legacy systems. Your generalizing about the success/failure rate of IT projects… is simply that, generalizing. With what relevance to this focused discussion? The reference to agile methodology, as an example, was simply offered in the context that static documentation is not a priority in today’s development methodologies… that, in fact, it’s viewed as an actual impediment to project success. That invariably, today’s real-world Development-to-Operations hand-off involves significant after-the-fact documentation pursuits to allow the actual support shop/personnel to maintain systems. Overall, that’s exactly what we see played out across that hacked email/txt file… again, an after-the-fact pursuit by an individual (who may or may not have had any/degrees of involvement in the actual initial development)… an individual further challenged by not having access to the original developers… an individual tasked with updating/creating documentation to reflect upon the actual migration of legacy product/component. Not the actual legacy system/product itself... the documentation... not "the code", not "the system"... not the "system output"... but the documentation. Nothing out of the norm here… nothing to see here… move along now! Quote
Riverwind Posted November 30, 2009 Report Posted November 30, 2009 CRU TS2.1/TS3.0 datasets are different from the most commonly used and referenced CRU HADCRUT3 dataset Ok. I was wrong. My understanding is CRU TS was an input to CRUTEMP which was further adjusted for UHI issues. However, dataset mixups aside - the software and database problems documented are serious issues and undermine the credibility of CRU as a custodian of such critical datasets. They only reinforce the need to have all of this code released and audited to ensure that it does what CRU claims. Lastly, I do not feel that any of this stuff 'proves AGW is a hoax'. People who say that are being ridiculous. But the issues are serious because they illustrate why we cannot simply take scientists at their word, and the government needs to demand that climate scientists meet the same standards for data management and analysis that it expects from drug companies. Quote To fly a plane, you need both a left wing and a right wing.
waldo Posted November 30, 2009 Report Posted November 30, 2009 Ok. I was wrong. My understanding is CRU TS was an input to CRUTEMP which was further adjusted for UHI issues. However, dataset mixups aside - the software and database problems documented are serious issues and undermine the credibility of CRU as a custodian of such critical datasets. They only reinforce the need to have all of this code released and audited to ensure that it does what CRU claims. Lastly, I do not feel that any of this stuff 'proves AGW is a hoax'. People who say that are being ridiculous. But the issues are serious because they illustrate why we cannot simply take scientists at their word, and the government needs to demand that climate scientists meet the same standards for data management and analysis that it expects from drug companies. Fine - you were wrong - thanks for acknowledging that. I trust there will be less tendency in the future to reach for the "lie from the quiver". As to the rest, I've expressed my thoughts on the relevance of that hacked email/txt file... we won't agree on much. The bottom-line for me, is none of that hacked email/txt file... directly... reflects upon anything other than an initiative to update documentation. Essentially, the single guy's tasked initiative to create a step-by-step, documented procedures statement on how to migrate everything and anything (inclusive of data/metadata/programs/database, etc.) in support of the upgrade of TS datasets. And... the guy is facing challenges in putting the documentation together. Challenges that reflect upon many things... but not, necessarily, the starting points, the end points or the interim components therein. To suggest otherwise, to infer otherwise, one must take a position of authority based upon interpretations of a hacked email/txt file. Quote
Shady Posted November 30, 2009 Report Posted November 30, 2009 Sorry, but it sounds like a cop-out to say that tons of data is being faked, or is inaccurate and nobody has looked, and nobody has time to look. It's even worse than first thought. But I'm sure Michael will continue his strident defense, on behalf of the true believers. Climate change data dumped SCIENTISTS at the University of East Anglia (UEA) have admitted throwing away much of the raw temperature data on which their predictions of global warming are based. It means that other academics are not able to check basic calculations said to show a long-term rise in temperature over the past 150 years. The UEA’s Climatic Research Unit (CRU) was forced to reveal the loss following requests for the data under Freedom of Information legislation Times Quote
Keepitsimple Posted November 30, 2009 Author Report Posted November 30, 2009 I certainly won’t spend any cycles bothering to look through that hacked email/txt file… we can certainly leave that anal pursuit to those hell-bent upon finding the smoking gun to prove "the hoax"! … leave that to Riverwind! You just don't get it. This is not about a smoking gun. It's not about completely invalidating AGW.......because skeptics have only been "skeptical" of the degree to which humans are responsible for Climate Change. If you'd take a few minutes - because that's all it would take - to go through the text file, you'd see that this is more than documentation.....this guy is writing programs left and right, doing his best to get the information to come out the same as it was published - or something to that effect. As Riverwind said previously and I concur, it's about quality control of data and code. If you read the TXT you can clearly see that the environment is fly-by-the-seat-of-your-pants. That's why auditing of all the data and code is so important. We cannot make global decisions with shoddy work. Quote Back to Basics
Wild Bill Posted November 30, 2009 Report Posted November 30, 2009 Fine - you were wrong - thanks for acknowledging that. I trust there will be less tendency in the future to reach for the "lie from the quiver". As to the rest, I've expressed my thoughts on the relevance of that hacked email/txt file... we won't agree on much. The bottom-line for me, is none of that hacked email/txt file... directly... reflects upon anything other than an initiative to update documentation. Essentially, the single guys tasked initiative to create a step-by-step, documented procedures statement on how to migrate everything and anything (inclusive of data/metadata/programs/database,etc.) in support of the upgrade of TS datasets. And... the guy is facing challenges in putting the documentation together. Challenges that reflect upon many things... but not, necessarily, the starting points, the end points or the interim components therein. To suggest otherwise, to infer otherwise, one must take a position of authority based upon interpretations of a hacked email/txt file. Man, you just won't accept ANYTHING that shakes your argument, will you? Your last post seems to really say "Just because the monks made major errors in copying their manuscripts the past 1000 years doesn't mean that the Gospel is still not absolutely, literally TRUE!" Go back through your own posts. RW is quite right! You consistently ignore his premise, which is that the data from the GW side is flagrantly unreliable and should be held to higher standards. You keep shouting that this doesn't prove that GW is not happening. That WASN'T his point and you keep avoiding it! I'm not sure if you actually believe in what you say or if you're just trying to win an argument. To me, that's a pointless endeavour. If you keep sliding off the actual premises the argument just can't be resolved and will go on forever. To me, the only real point to any debate is to arrive at the TRUTH! You can win a debate and still be dead wrong. 
Mother Nature doesn't care. Her universe works by her laws. Quote "A government which robs Peter to pay Paul can always depend on the support of Paul." -- George Bernard Shaw "There is no point in being difficult when, with a little extra effort, you can be completely impossible."
Michael Hardner Posted November 30, 2009 Report Posted November 30, 2009 Their reasons are stated clearly in the leaked emails. They don't want SteveMc to have the data because he would find things wrong with it and they would have to 'waste time' defending themselves. Well, what are they? Have you seen them? Why are you only giving me part of your argument here? Why did they say they weren't giving your blogger the data, and what was the 'real' reason as stated in the emails? The problem is 1) Key datasets are not available. 2) Authors refuse to provide data to sceptics 3) Peer reviewers don't look at the data. We're going around in circles here. I need to understand the answer to the question above. It is not conspiracy or neglect. It is a system with no reliable quality control mechanism that is producing suspect results. This stuff was not being done in the open. The data and code were hidden from people that might criticize them. And when the data and code are made available, the climate scientist establishment spends its time attacking the credibility of the critics instead of actually fixing problems. The fact that the data isn't available, though, is a known fact, since it's not included in the paper itself. It's an easy fix to include the data, and just a little extra work and documentation that needs to happen. The University at the centre of this recognizes that, and they have indicated that they will publish the data. Quote Click to learn why Climate Change is caused by HUMANS Michael Hardner
Shady Posted November 30, 2009 Report Posted November 30, 2009 The University at the centre of this recognizes that, and they have indicated that they will publish the data. You're quite misinformed. Climate change data dumped SCIENTISTS at the University of East Anglia (UEA) have admitted throwing away much of the raw temperature data on which their predictions of global warming are based. It means that other academics are not able to check basic calculations said to show a long-term rise in temperature over the past 150 years. The UEA’s Climatic Research Unit (CRU) was forced to reveal the loss following requests for the data under Freedom of Information legislation Times Quote
Riverwind Posted November 30, 2009 Report Posted November 30, 2009 Why did they say they weren't giving your blogger the data, and what was the 'real' reason as stated in the emails? I gave you the real reason. They did not want to be publicly criticized because they did not want to have to defend themselves publicly. They see such requests as a waste of their time. The University at the centre of this recognizes that, and they have indicated that they will publish the data. So? It took a leak or hacker to force this disclosure. It is not enough to release this set of data and sweep it under the rug. Rules have to be put in place to make sure that no sceptic ever has any problem accessing any data/code required to replicate the science used to justify government policies. Quote To fly a plane, you need both a left wing and a right wing.
Michael Hardner Posted November 30, 2009 Report Posted November 30, 2009 You're quite misinformed. Climate change data dumped SCIENTISTS at the University of East Anglia (UEA) have admitted throwing away much of the raw temperature data on which their predictions of global warming are based. It means that other academics are not able to check basic calculations said to show a long-term rise in temperature over the past 150 years. The UEA’s Climatic Research Unit (CRU) was forced to reveal the loss following requests for the data under Freedom of Information legislation Times I'm not misinformed at all. Check the dates on the news releases - your news release came after mine. Quote Click to learn why Climate Change is caused by HUMANS Michael Hardner
Riverwind Posted November 30, 2009 Report Posted November 30, 2009 I'm not misinformed at all. Check the dates on the news releases - your news release came after mine. The story Shady referenced reports information that was disclosed months ago as part of a FOI refusal from CRU. Climategate has stirred the media's interest in these stories so they are reporting old news as new. Quote To fly a plane, you need both a left wing and a right wing.
Michael Hardner Posted November 30, 2009 Report Posted November 30, 2009 I'm not misinformed at all. Check the dates on the news releases - your news release came after mine. Well, with Shady's post, I have to say that the University and the professors involved have behaved arrogantly, and negligently - thinking that studies with global significance, and that needed to be understood and believed by everyone did not need to be rigorously explained and carefully defended, and shared with skeptics. They blew it, but hopefully this will change academia so that this never happens again. Change is already happening: NY Times Scientists are proposing 'open review' on the web... Quote Click to learn why Climate Change is caused by HUMANS Michael Hardner
Michael Hardner Posted November 30, 2009 Report Posted November 30, 2009 The story Shady referenced reports information that was disclosed months ago as part of a FOI refusal from CRU. Climategate has stirred the media's interest in these stories so they are reporting old news as new. In that case, the article I referenced is indeed the latest news from the CRU in this regard. Quote Click to learn why Climate Change is caused by HUMANS Michael Hardner
Riverwind Posted November 30, 2009 Report Posted November 30, 2009 (edited) In that case, the article I referenced is indeed the latest news from the CRU in this regard. Except CRU either 1) does not have the data to release because it was deleted or 2) lied on their FOI response. Neither says good things about CRU. The Open Review Journal would be an excellent basis for reform of the peer review system but it is going to take more than a proposal by some scientists. Edited November 30, 2009 by Riverwind Quote To fly a plane, you need both a left wing and a right wing.
Michael Hardner Posted November 30, 2009 Report Posted November 30, 2009 Except CRU either 1) does not have the data to release because it was deleted or 2) lied on their FOI response. Neither says good things about CRU. The Open Review Journal would be an excellent basis for reform of the peer review system but it is going to take more than a proposal by some scientists. Indeed, it will be interesting to see how this evolves. If they manage to bring skeptics inside the tent, then the model for adoption of 'open sourcing' could be used in other aspects of life, especially government. Quote Click to learn why Climate Change is caused by HUMANS Michael Hardner
eyeball Posted November 30, 2009 Report Posted November 30, 2009 Indeed, it will be interesting to see how this evolves. If they manage to bring skeptics inside the tent, then the model for adoption of 'open sourcing' could be used in other aspects of life, especially government. And even more especially, economics. I can only imagine how vastly different our world might be if the level of scepticism that has slowed efforts to deal with AGW had similarly put the brakes on some of our economic policies. Quote A government without public oversight is like a nuclear plant without lead shielding.
Michael Hardner Posted November 30, 2009 Report Posted November 30, 2009 And even more especially, economics. I can only imagine how vastly different our world might be if the level of scepticism that has slowed efforts to deal with AGW had similarly put the brakes on some of our economic policies. One could argue that it put the brakes on free trade for an extended period. Quote Click to learn why Climate Change is caused by HUMANS Michael Hardner
eyeball Posted November 30, 2009 Report Posted November 30, 2009 One could argue that it put the brakes on free trade for an extended period. What free trade? Talk about scams... Quote A government without public oversight is like a nuclear plant without lead shielding.
waldo Posted November 30, 2009 Report Posted November 30, 2009 You just don't get it. This is not about a smoking gun. It's not about completely invalidating AGW.......because skeptics have only been 'skeptical' of the degree to which humans are responsible for Climate Change. If you'd take a few minutes - because that's all it would take - to go through the text file, you'd see that this is more than documentation.....this guy is writing programs left and right, doing his best to get the information to come out the same as it was published - or something to that effect. As Riverwind said previously and I concur, it's about quality control of data and code. If you read the TXT you can clearly see that the environment is fly-by-the-seat-of-your-pants. That's why auditing of all the data and code is so important. We cannot make global decisions with shoddy work. You are truly obtuse to the point of boredom... this has absolutely nothing - nothing - to do with as you say, 'making the information come out the same as it was published'. Absolutely nothing... but, somehow, your agenda feels validated if you can obfuscate further. Again, this was nothing more than updating/creating the documentation to reflect a procedures driven step-by-step method to be able to support the migration update of TS datasets - which, again, has absolutely nothing whatsoever to do with the most recognized and referenced HadCRUT data. Call it wanting to create a template, so to speak, to allow someone to easily come in, after-the-fact, run the documented procedures from the template (that this guy is trying to update/create) and... voilà... the migrated TS dataset. This guy, apparently, had to figure a lot of it out... given the lack of preceding documentation and/or access to the original programmers/team. This is nothing unique - an everyday common occurrence. If it takes writing a bit of code to validate steps within that template, that's all it is... 
ensuring the documented template procedures, when run, actually run - duh! But of course, the deniers will trumpet out the 'hoax has been proved' based on such a trivial happenstance... if you think that interpretation is over the top... it's not, given the nonsense permeating through the deniersphere. Look no further than this thread where Riverwind attempted to associate this to Hadcrut... and he took it even further by suggesting nonsense about curve fitting between Hadcrut2/3 based on... his (or his sources) interpretations of that hacked email/txt file. When called on it, he finally relented and admitted he was wrong - unfortunately, that same nonsense is being trumpeted a thousand fold by idgits who actually know jack-shit about anything... they just have a keyboard and a resolve to stir the pot. Quote
waldo Posted November 30, 2009 Report Posted November 30, 2009 Man, you just won't accept ANYTHING that shakes your argument, will you? Your last post seems to really say "Just because the monks made major errors in copying their manuscripts the past 1000 years doesn't mean that the Gospel is still not absolutely, literally TRUE!" Go back through your own posts. RW is quite right! You consistently ignore his premise, which is that the data from the GW side is flagrantly unreliable and should be held to higher standards. You keep shouting that this doesn't prove that GW is not happening. That WASN'T his point and you keep avoiding it! I'm not sure if you actually believe in what you say or if you're just trying to win an argument. To me, that's a pointless endeavour. If you keep sliding off the actual premises the argument just can't be resolved and will go on forever. To me, the only real point to any debate is to arrive at the TRUTH! You can win a debate and still be dead wrong. Mother Nature doesn't care. Her universe works by her laws. What the hell does 'Mother Nature's universe works by her laws' mean/imply? What data (exactly what data) is unreliable... 'flagrantly unreliable', as you state? Attempting to cast doubt upon CRU data (particularly HadCrut data) is easily dispatched by looking at other data sources that confirm the same results the HadCrut data brings forward... look to the NOAA NCDC data... look to the NASA GISS data... look to the JMA data, etc. Again... what data (exactly what data) is unreliable... 'flagrantly unreliable', as you state? Waiting.......... Quote
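The cross-check waldo describes (seeing whether independently maintained anomaly series tell the same story) is, mechanically, a correlation test between time series. A minimal sketch in plain Python, with synthetic stand-in numbers rather than actual HadCrut/GISS/JMA values:

```python
# Hedged sketch of comparing two temperature-anomaly series.
# The values below are synthetic stand-ins, not real dataset values.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

series_a = [0.10, 0.15, 0.08, 0.22, 0.31, 0.28]  # one group's anomalies
series_b = [0.12, 0.14, 0.10, 0.20, 0.33, 0.27]  # another group's, same years
print(pearson(series_a, series_b))  # close to 1.0 when the series track each other
```

A correlation near 1.0 between series built by separate groups is the kind of agreement waldo is pointing at; Riverwind's counter, in the next post, is that the inputs to those series are not truly independent.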
waldo Posted November 30, 2009 Report Posted November 30, 2009 I gave you the real reason. They did not want to be publicly criticized because they did not want to have to defend themselves publicly. They see such requests as a waste of their time. So? It took a leak or hacker to force this disclosure. It is not enough to release this set of data and sweep it under the rug. Rules have to be put in place to make sure that no sceptic ever has any problem accessing any data/code required to replicate the science used to justify government policies. Completely blown out of proportion... to real world events. Quit echoing and perpetuating the McIntyre whine... RealClimate is very quickly dispatching with all this hiding the data nonsense, by showing exactly what data has always been available. That initial comprehensive list RC put up continues to grow daily... daily as others bring forward notice of access links to particular data/procedures/models/etc...... that have always been available. Quote
Riverwind Posted November 30, 2009 Report Posted November 30, 2009 HadCrut data, is easily dispatched by looking at other data sources that confirm the same results the HadCrut data brings forward... look to the NOAA NCDC data... look to the NASA GISS data... look to the JMA data The custodians for these datasets are not independent; they use many of the same raw datasets and check their results against each other. This will lead to tweaking of algorithms to ensure that the long term trends match. Quote To fly a plane, you need both a left wing and a right wing.
Riverwind Posted November 30, 2009 Report Posted November 30, 2009 (edited) RealClimate is very quickly dispatching with all this hiding the data nonsense, by showing exactly what data has always been available. Read the full series of FOI requests and responses between CRU and sceptics: http://camirror.wordpress.com/2009/11/25/willis-eschenbachs-foi-request/ If the data was really available why did they refuse the FOI requests? If the data was really available why did Jones spend so much time talking about how to keep the data away from CA? If the data was really available why did Jones say: Subject: Re: WMO non respondo… Even if WMO agrees, I will still not pass on the data. We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it. … Cheers Phil The RC claim is a falsehood and they know it. Edited November 30, 2009 by Riverwind Quote To fly a plane, you need both a left wing and a right wing.