View Full Version : Climate gate round II - The source code!

tangent4ronpaul
12-05-2009, 01:26 PM
http://www.examiner.com/x-9111-SF-Environmental-Policy-Examiner~y2009m11d24-As-we-wait-for-Round-2-of-climate-gate

As we wait for Round 2 of climate gate...

A number of computer scientists and engineers are analysing computer code contained in the files leaked anonymously to the Internet last week, and it will more than likely produce more controversy than the emails that have been the subject of intense discussion so far.

In fact, if the documentation (notes written by authors and fixers of the computer code) is any indication, what we have seen so far is only prelude.

But before the storm breaks, I think we should summarise what's important in the emails.

First, prominent climate scientists, including a lead author of IPCC report sections, were willing to discuss withholding or deleting information to frustrate legitimate requests made under the Freedom of Information Act in the UK. They apparently chose who could not receive information based on the requester's identity, which may have been unlawful. They threatened to delete data--data which in fact has since disappeared. They advised each other to delete emails.

Second, these same scientists worked closely together to control channels of communication regarding climate science and global warming. They banded together to minimise or eliminate skeptical discussion. While telling the world that only peer-reviewed science should be considered legitimate, they fiercely fought to prevent skeptic writings from being peer-reviewed at all. They wrote openly about replacing an uncooperative journal editor (who was later replaced), and boycotting journals that published skeptical papers. They organised peer review so that they reviewed each others' papers.

Third, they were willing to change data so that their presentations of the state of climate looked worse. At the end of the day, this is most damning--most of the rest, even apparently illegal FOI actions, is just politics and a playground media strategy. But while world governments were imposing taxes, changing energy policies, preparing energy-based conflict policies, planning to deal with warming-based immigration, these people were content to display figures that were wrongly exaggerated to show the warming they had previously predicted but could not find in actual measurements.

I am willing to speculate that further analysis of the computer code will contribute to discussions on why they were unable to show the warming they so desperately needed to find to justify their assertions that the IPCC was too conservative, but time will certainly tell.

In the meantime, while we're waiting for the next release, it's clear that different institutions should take control of several aspects of climate research. In the UK, there are a number of bodies that might be able to sort out what's been going on. Archiving and verification, proper evaluation of previous studies--the UK has a government department called The National Archives that does this for a living, and they have recently undertaken to completely modernise how they go about things. We might ask them for assistance.

Because the way we've done things so far is not getting us to where we need to be. We know there's a problem--global warming is real, and CO2 is a contributor. But we can no longer trust the numbers we have grown accustomed to using, nor the people who generated those numbers. Time for a shake-up.

tangent4ronpaul
12-05-2009, 01:28 PM
http://www.examiner.com/x-9111-SF-Environmental-Policy-Examiner~y2009m12d4-It-will-be-the-code-that-killsClimategate-at-the-Hockey-Stick-factory

It will be the code that kills--Climategate at the Hockey Stick factory

While Michael Mann and Joe Romm stand in front of a podium telling us there's nothing untoward in the leaked emails from CRU, Robert Greiner of Cube Antics is telling us that we need to look at the computer code that adjusted raw temperature data before it was presented as the 'value added' results indicating unprecedented warming. (Hat tip to Watts Up With That for linking to him, and to Jeff Id at The Air Vent for numerous posts on the importance of the code.)

The post clearly shows that a warming bias was built into the adjustments. But before I continue, here are Anthony Watts's caveats on this, in their entirety:

"While there are some interesting points raised here, it is important to note a couple of caveats. First, the adjustment shown above is applied to the tree ring proxy data (proxy for temperature) not the actual instrumental temperature data. Second, we don’t know the use context of this code. It may be a test procedure of some sort, it may be something that was tried and then discarded, or it may be part of final production output. We simply don’t know. This is why a complete disclosure and open accounting is needed, so that the process can be fully traced and debugged. Hopefully, one of the official investigations will bring the complete collection of code out so that this can be fully examined in the complete context."

With that in mind, I want to remind readers that once before we saw a software sausage grinder that turned all data into hockey sticks. This was the analysis tool used by Michael Mann in creating the original Hockey Stick chart, and the way he had it set up, it would turn any string of numbers into a hockey stick shape, a point noted by Greiner in his post.

Remember Anthony's caveats--we don't know enough to do more than ask more questions at this point. But this story has a historical context. When they did the same thing before, they had to issue corrections, and the Hockey Stick had to be hidden behind a curtain--until they found new proxies to use and new techniques to artificially produce the same shape.
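
For the technically inclined, here is a minimal R sketch of that claim (R, since the analysis scripts circulating in this story are written in it). Everything in it is an illustrative assumption rather than Mann's actual setup: the proxy count, the AR(1) noise model, and the 79-year calibration window are invented for the demonstration. The point is only that "short centering", centering each series on the late calibration years instead of on the full record, makes PC1 latch onto whatever happens to drift late, so even trendless red noise tends to come out hockey-stick shaped.

# Illustrative sketch: short-centered PCA pulls hockey sticks out of
# trendless red noise. All parameters are made up, not taken from MBH98.
set.seed(42)
n_years   <- 581                  # stand-in for a 1400-1980 proxy record
n_proxies <- 70
red <- replicate(n_proxies, as.numeric(arima.sim(list(ar = 0.9), n_years)))
calib <- (n_years - 78):n_years   # the last 79 "calibration" years
short <- sweep(red, 2, colMeans(red[calib, ]))  # short centering, the step at issue
pc1 <- prcomp(short, center = FALSE)$x[, 1]
plot(pc1, type = "l")             # tends to show a flat "shaft" and a late "blade" (sign is arbitrary)

Re-centering on the full record (subtracting colMeans(red) instead) makes the effect largely disappear, which is the crux of the critique.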

The emails are interesting and tell us more than we want to know about the characters of The Team and how they played politics. But if there is wrongdoing in the science, it will show up in the code.

amy31416
12-05-2009, 01:32 PM
This aspect is what I'm most interested in. It should be interesting.

tangent4ronpaul
12-05-2009, 01:37 PM
http://cubeantics.com/2009/12/the-proof-behind-the-cru-climategate-debacle-because-computers-do-lie-when-humans-tell-them-to/

The Proof Behind the CRU Climategate Debacle: Because Computers Do Lie When Humans Tell Them To

I’m coming to you today as a scientist and engineer with an agnostic stand on global warming.

If you don’t know anything about “Climategate” (does anyone else hate that name?), go ahead and read up on it before you check out this post. I’ll wait.

Back? Let’s get started.

First, let’s get this out of the way: emails prove nothing. Sure, you can look like an unethical asshole who may have committed a felony using government-funded money; but email is just talk, and talk is cheap.

Now, here is some actual proof that the CRU was deliberately tampering with their data. Unfortunately, for readability’s sake, this code was written in Interactive Data Language (IDL) and is a pain to go through.

NOTE: This is an actual snippet of code from the CRU contained in the source file: briffa_Sep98_d.pro

1. ;
2. ; Apply a VERY ARTIFICAL correction for decline!!
3. ;
4. yrloc=[1400,findgen(19)*5.+1904]
5. valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor
6. if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
7.
8. yearlyadj=interpol(valadj,yrloc,timey)

What does this Mean? A review of the code line-by-line
Starting off Easy

Lines 1-3 are comments
Line 4

yrloc is a 20-element array containing:
1400, followed by the 19 years from 1904 to 1994 in increments of 5…

yrloc = [1400, 1904, 1909, 1914, 1919, 1924, 1929, ... , 1964, 1969, 1974, 1979, 1984, 1989, 1994]

findgen() creates a floating-point array of the specified dimension. Each element of the array is set to the value of its one-dimensional subscript.

F = findgen(6) ; F[0] is 0.0, F[1] is 1.0, ..., F[5] is 5.0

Pretty straightforward, right?
Line 5

valadj, or the “fudge factor” array as some arrogant programmer likes to call it, is the foundation for the manipulated temperature readings. It contains twenty seemingly random values. We’ll get back to this later.
Line 6

Just a check to make sure that yrloc and valadj have the same number of elements. This is important for line 8.
Line 8

This is where the magic happens. Remember that array of anchor years (yrloc)? And remember that array of seemingly random numbers (valadj) from line 5? Well, in line 8, those two arrays are interpolated together.

The interpol() function takes the valadj values defined at the yrloc years and “guesses” at the points in between them, creating a smooth adjustment curve across the whole time axis. This technique is often used when dealing with natural data points, just not quite in this manner.

The main thing to realize here is that the interpol() function produces a yearly adjustment (yearlyadj) that skews the temperature series toward the valadj values.
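
If you don't have IDL handy, here is a rough R equivalent of lines 4-8, with approx() standing in for IDL's interpol(). The snippet doesn't show timey (the series' time axis), so a yearly 1400-1994 axis is assumed here:

# R sketch of the IDL snippet above; approx() plays the role of interpol().
yrloc  <- c(1400, seq(1904, 1994, by = 5))          # line 4: the 20 anchor years
valadj <- c(0, 0, 0, 0, 0, -0.1, -0.25, -0.3, 0, -0.1,
            0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6) * 0.75  # line 5
stopifnot(length(yrloc) == length(valadj))          # line 6: the "Oooops!" check
timey <- 1400:1994                                  # assumption: a yearly time axis
yearlyadj <- approx(yrloc, valadj, xout = timey)$y  # line 8: interpolate to every year
plot(timey, yearlyadj, type = "l")                  # the shape of the applied adjustment

The plot of yearlyadj is flat until the early 1920s, dips slightly through the 1930s, then climbs steeply from mid-century and plateaus near 2, which is exactly the shape discussed next.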
What the heck does all of this mean?

Well, I’m glad you asked. First, let’s plot the values in the valadj array.

Update: the X-axis of this plot is the element index of the valadj array, and the Y-axis is the hard-coded value in valadj. Since yrloc is chronological, it is obvious that as we approach the present day, the readings are skewed upward drastically. Also, it doesn’t matter that this graph flattens out at the end, because it is already up around the 2.5 mark.

http://cubeantics.com/wp-content/uploads/2009/12/graph.jpg

Look familiar? This closely resembles the infamous hockey stick graph that Michael Mann came up with about a decade ago. By the way, did I mention Michael Mann is one of the “scientists” (and I use that word loosely) caught up in this scandal?

Here is Mann’s graph from 1999

http://cubeantics.com/wp-content/uploads/2009/12/mann-hockey-stick-graph-440x330.gif

As you can see, (potentially) valid temperature station readings were taken and skewed to fabricate the results the “scientists” at the CRU wanted to believe, not what actually occurred.
Where do we go from here?

It’s not as cut-and-dried as one might think. First and foremost, this doesn’t necessarily prove anything about global warming as science. It just shows that the data behind most of the environmental legislation created over the last decade was a farce.

This means that all of those billions of dollars we spent as a global community to combat global warming may have been for nothing.

If news anchors and politicians were trained as engineers, they would be able to find real proof instead of just speculating about the meaning of emails that only made it appear as if something illegal had happened.
Conclusion

I tried to write this post in a manner that transcends politics. I really haven’t taken much of an interest in the whole global warming debate and don’t really have a strong opinion on the matter. However, being part of the Science Community (I have a degree in Physics) and having done scientific research myself makes me very worried when arrogant jerks who call themselves “scientists” work outside of ethics and ignore the truth to fit their pre-conceived notions of the world. That is not science, that is religion with math equations.
What do you think?

Now that you have the facts, you can come to your own conclusion!
Be sure to leave me a comment, it gets lonely in here sometimes.

(tons of comments follow)

-t

tangent4ronpaul
12-05-2009, 01:55 PM
http://savecapitalism.wordpress.com/2009/11/29/climate-hacking-1/

Climate hacking #1

Update : Through some nice linking from The Air Vent I got some traffic on this post. Hats off! Also, through comments received over there I found the following, which is a nice outline of problems with the data from the Pacific area: Pacific islands sinking from the top down. Basically, someone forgot to put thermometers anywhere higher than the beaches…

Since there seems to be pretty heavy public interest in this, I will try to keep doing comparisons between adjusted/unadjusted data for a few stations at a time. Next up will be Sweden, posted this evening if I get time to compile it. (I’ve actually done a complete run of “average adjustments” for the entire thing, which rendered some interesting results, but more on that later.)



Okay, so I couldn’t help but do it. I had to try to reproduce the New Zealand “glitch”. And I did. When reading this, please remember that it is NOT scientific evidence; it is a first draft at reproducing the results found by others. I have not interpolated to fill in missing data, and some calculations may lack precision, so do not rely on this data as “evidence” – simply treat it as a reason to look further into the matter yourself. I am not a climatologist.

Recipe :

1. Fetch data from GHCN datasets found here : ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/ (v2.mean and v2.mean_adj)
2. Get all lines from v2.mean_adj where country code is 507 (New Zealand)
3. Remove all lines that have data missing for one or more months (I did NOT interpolate to get missing data, I simply excluded the entire year)
4. Get the annual mean for each line by adding together the months and dividing by 12 (does this muck up the averages, since different months have different numbers of days?)
5. Group data by nearest WMO Station, and use the average of all annual means for that WMO Station as data for that WMO Station / Year
6. Repeat steps 2-5 for all lines in v2.mean that correspond to a line in v2.mean_adj (a rough R sketch of the steps follows)
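
If you want to follow the recipe without writing a parser from scratch, here is that sketch, covering steps 1-5 for the adjusted file (run it again on v2.mean for step 6). The fixed-width layout assumed here (an 11-character station id beginning with the 3-digit country code, a duplicate digit, a 4-digit year, then twelve 5-character monthly means in tenths of a degree, with -9999 for missing) is my reading of the GHCN v2 format; check it against the README in that FTP directory before trusting any output.

# Rough sketch of steps 1-5; the file layout is an assumption (see above).
read_v2 <- function(path)
  read.fwf(path, widths = c(3, 8, 1, 4, rep(5, 12)),
           col.names = c("country", "station", "dup", "year", month.abb))

adj <- read_v2("v2.mean_adj")
nz  <- subset(adj, country == 507)                   # step 2: New Zealand
nz  <- nz[!apply(nz[month.abb] == -9999, 1, any), ]  # step 3: drop incomplete years
nz$annual <- rowMeans(nz[month.abb]) / 10            # step 4: plain 12-month mean, deg C
# step 5: average the annual means per station and year (grouping by station id
# here, rather than by distance to the nearest WMO station as in the post)
station_year <- aggregate(annual ~ station + year, nz, mean)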

The result is rather interesting. I took this data and jammed it into OpenOffice Calc and let it plot the following for me (the thick line is the average of all series, with a linear regression curve):

http://savecapitalism.files.wordpress.com/2009/11/unadjusted.jpg?w=600&h=567

Note how the linear regression curve shows that the average of all series starts slightly above 12 and ends at around 13, which would be very much in line with the “consensus” agreement that temperatures have gone up 0.7 degrees since the 1840s. However – this is not what you get if you take the adjusted data.

http://savecapitalism.files.wordpress.com/2009/11/adjusted.jpg?w=600&h=567

Whoa! A whopping 2-degree increase in temperatures instead of 0.7. How is this done? Well, look how the adjusted values make the average start at around 11-11.5 and end a bit above 13. The linear regression really says it all. I’ve tried to get the same coloring for the different series in both graphs, so the “adjustments” of different series become apparent. The series 93119 tells much of the story.

Now – I’m not saying these weren’t valid adjustments – but it is VERY INTERESTING that the raw data seems to show a COMPLETELY different picture than the adjusted data. If one “adjusts” data, one could reasonably expect (or??) that about the same amount of adjustment would be made up as down. Or at least that the adjustments would be reasonably spread over time, with somewhat similar magnitude (or??). Since I am not a climatologist, maybe I’m completely wrong.

MAYBE THE MOST PROBABLE THING IS FOR THE GRAPH TO LOOK LIKE SOMEONE SAT ON THE BACK END OF IT.

Food for thought, anyways ……

tangent4ronpaul
12-05-2009, 02:16 PM
http://savecapitalism.wordpress.com/2009/12/02/ghcn-database-adjustments/

GHCN Database Adjustments

http://savecapitalism.files.wordpress.com/2009/12/ghcn-adjustments2.jpg?w=600&h=387

This graph shows the average “adjustment” for data in different years, ergo the difference between the temperature as stated in the unadjusted dataset and that stated in the adjusted dataset. The green line represents all lines that do not have missing data for any months. The red line represents all lines where adjustments have been made. Now, one would be inclined to think that when a scientist realizes in retrospect that someone has, during a given period (due to a bad thermometer, poor eyesight, changed altitude or some other cause), been misreading the temperature and the data needs adjustment, this would occur about equally often with too-high and too-low values. What this graph seems to tell us is that:

* On average, temperature measurements are biased upwards, ergo scientists need to adjust their data downwards
* During the period from 1901 to 1920, scientists on average had extra-bad thermometers, or moved them around to make them extra unreliable, or some other factor made the average adjustment -0.25 C
* From 1920 to the mid-1980s, temperature measurements became less and less erroneous, but for some reason they became more erroneous again thereafter

This graph is not meant to make the point that temperatures have been adjusted consistently to make it appear to be warming (although the period from 1920 to the 1980s seems to indicate this, the drop thereafter counteracts that claim). My main point is that this line should, by and large, flutter around 0, should it not? If there is a systematic misreading of temperatures, isn’t it just as likely to be biased downwards as upwards (thus bringing the need for upward adjustment as often as downward adjustment)?
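
To put a number on that intuition, here is a toy R simulation of the null case: unbiased corrections at 2,000 stations, averaged per year. The station count and error spread are made-up numbers; the point is only how tightly the yearly average should hug zero if adjustments had no systematic direction.

# Toy null model: if corrections are unbiased, yearly network averages stay near 0.
set.seed(1)
years   <- 1900:2000
avg_adj <- sapply(years, function(y) mean(rnorm(2000, mean = 0, sd = 0.5)))
plot(years, avg_adj, type = "l", ylim = c(-0.3, 0.3), ylab = "mean adjustment (deg C)")
abline(h = 0, lty = 2)  # the line wanders only a few hundredths of a degree from zero

Nothing like the -0.25 C excursions in the graph above, which is what makes the pattern worth explaining.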

To recreate, follow this recipe (a rough R sketch follows the steps) :

1. Fetch data from GHCN datasets found here : ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/ (v2.mean and v2.mean_adj)
2. Get all lines from v2.mean_adj, and subtract corresponding values from v2.mean
3. Sum together all 12 months for each line, divide by 12
4. Group lines by year and calculate the average value from (3)
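
A rough R sketch of that recipe, under the same fixed-width-layout assumption as the New Zealand sketch earlier (the read_v2 helper is repeated so this block stands alone):

# Average (adjusted minus raw) per year; the v2 file layout is an assumption.
read_v2 <- function(path)
  read.fwf(path, widths = c(3, 8, 1, 4, rep(5, 12)),
           col.names = c("country", "station", "dup", "year", month.abb))

raw <- read_v2("v2.mean"); adj <- read_v2("v2.mean_adj")
key <- c("country", "station", "dup", "year")
m <- merge(adj, raw, by = key, suffixes = c(".adj", ".raw"))   # step 2: match lines
a <- paste0(month.abb, ".adj"); r <- paste0(month.abb, ".raw")
m <- m[!apply(m[a] == -9999 | m[r] == -9999, 1, any), ]        # complete years only
m$delta <- rowMeans(m[a] - m[r]) / 10                          # step 3: mean monthly diff, deg C
avg_by_year <- aggregate(delta ~ year, m, mean)                # step 4
plot(avg_by_year, type = "l"); abline(h = 0, lty = 2)          # compare with the graph above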

Also, ponder the following graph. Why did the number of stations collapse in 2007?

http://savecapitalism.files.wordpress.com/2009/12/active-ghcn-stations.jpg?w=600&h=364

This graph has been published before here but it deserves republishing. Also, since I decided to reproduce any evidence I see, I created the graph all over again.

Update : Can anyone explain why there are a lot of temperature stations available in the raw dataset that are not used in the “adjusted” data? If the stations are clearly wrong, worthless, corrupted, etc., then why not replace them with new, working ones? I do not understand what this graph tells me…

http://savecapitalism.files.wordpress.com/2009/12/ghcn-total-adjusted-stations1.jpg?w=600&h=492

tangent4ronpaul
12-05-2009, 02:22 PM
http://savecapitalism.wordpress.com/2009/11/29/sql-climate-data-setup/

SQL Climate Data Setup

Okay, so shortly after I was made aware that the GHCN world temperature datasets are available in both adjusted and unadjusted form (at least – that’s what it looks like – please correct me if this is wrong), I realized that working with 500,000 lines of text files is ridiculous. I need a bloody SQL server for this. Luckily, I’m a tech guy, so I spent a couple of hours and voila – I now have a small piece of software that takes these humongous temperature data files and slams them into a database.

The first job is to find out which adjustments have been made, and interestingly enough, it seems that most of them are downward adjustments. This doesn’t really tell me anything, however; it’s the distribution that I’m most interested in. Initially I just created tables that were identical to the text files and slammed in the data, which went quite fast – but I realized that I needed to make some adaptations. Missing/incorrect values in the text files are set to “-9999”, which isn’t good when using SQL. I decided to convert these to NULL values instead. As I set the columns to allow NULL values and restarted the data run, I realized that inserting data was now roughly twenty times slower. That meant only 2,500 lines/minute, so it was going to take 3 hours instead of 3 minutes to insert each data file. Luckily – this only has to be done once per month, since that’s the update frequency from NOAA.

If anyone who knows SQL is interested in a similar setup, send me an email at hans(dot)palmstierna(at)hotmail(dot)com and I can send you the software. If you want the code (it’s written in C# .NET and it’s a very small console app, no oddities), I can make it available here. If anyone has tips about what I should start looking at, it’s very much appreciated. The “todo” list currently looks like this:

1. Compile list of the best stations (those with least missing data and least adjustments)
2. Verify the New Zealand mess (is this the right dataset, or do the NZ people have another temperature dataset that isn’t included here?)
3. Start making inventory of odd adjustments (will be published graph-over-graph for each station with adjusted/unadjusted data)
4. Try to import paleo-data into SQL database as well

Since I’m only doing this because I need a good project to sink my teeth into, and from what I’ve read there will be a lot of data published in the near future, if there is any specific data you want me to compile or pull, just send me an email or comment on this forum. My primary skills are programming and moving data around between formats, not statistical analysis, so if there is someone with good skills in statistics who wants to start crunching datapoints, I’ll gladly supply whatever I can, in whichever format is appropriate.

This is becoming sort of a “community effort” to finally get to the bottom of this data manipulation scam. The import tool has now sped up to 90,000 lines/minute, so I guess the sluggishness was only temporary (running SQL Server, a development environment, and a multitude of other things on a regular desktop computer).

amy31416
12-05-2009, 02:25 PM
Acch. I'll take a couple of aspirin and pore through some of this and see what I can make of it, if anything.

Thanks for posting it.

tangent4ronpaul
12-05-2009, 02:28 PM
http://savecapitalism.wordpress.com/2009/11/30/ghcn-adjustments-review-northern-europe/

GHCN Adjustments Review : Northern Europe

http://savecapitalism.files.wordpress.com/2009/11/ghcn-adjustment-northern-europe-data.jpg?w=600&h=452

Also, I would like to comment briefly on the graphs of New Zealand I posted recently. The averages were NOT correct temperature averages, since they spanned different series (“all data available”) at different times; this methodology was only used to show that, with the adjustments, there would be a steeper curve.

* I will make the OpenOffice Calc workbook available as soon as I get my things in order; everything is a bit of a mess right now as I’m just hammering away at different data looking for oddities.

tangent4ronpaul
12-05-2009, 02:46 PM
http://wattsupwiththat.com/2009/11/30/playing-hide-and-seek-behind-the-trees/

Playing hide and seek behind the trees
30 11 2009

Still Hiding the Decline

by Steve McIntyre

Even in their Nov 24, 2009 statement, the University of East Anglia failed to come clean about the amount of decline that was hidden. The graphic in their statement continued to “hide the decline” in the Briffa reconstruction by deleting adverse results in the last part of the 20th century. This is what Gavin Schmidt characterizes as a “good thing to do”.

First here is the Nov 2009 diagram offered up by UEA:

http://camirror.files.wordpress.com/2009/11/uea_nov2009_resized.jpg?w=400

Figure 1. Resized UEA version of Nov 2009, supposedly “showing the decline”. Original here.

Here’s what UEA appears to have done in the above diagram.

While they’ve used the actual Briffa reconstruction after 1960 in making their smooth, even now they deleted values after 1960, so that the full measure of the decline in the Briffa reconstruction is hidden. Deleted values are shown in magenta. Source code is below.

http://camirror.files.wordpress.com/2009/11/uea_nov2009.gif?w=420&h=320&h=320

Figure 2. Emulation of UEA Nov 2009, using all the Briffa reconstruction.



R SOURCE CODE:
##COMPARE ARCHIVED BRIFFA VERSION TO CLIMATEGATE VERSION

#1. LOAD BRIFFA (CLIMATEGATE VERSION)
# archive is truncated in 1960: ftp://ftp.ncdc.noaa.gov/pub/data/paleo/treering/reconstructions/n_hem_temp/briffa2001jgr3.txt

loc="http://www.eastangliaemails.com/emails.php?eid=146&filename=939154709.txt"
working=readLines(loc,n=1994-1401+104)
working=working[105:length(working)]
x=substr(working,1,14)
writeLines(x,"temp.dat")
gate=read.table("temp.dat")
gate=ts(gate[,2],start=gate[1,1])

#2. J98 has reference 1961-1990
#note that there is another version at ftp://ftp.ncdc.noaa.gov/pub/data/paleo/contributions_by_author/jones1998/jonesdata.txt

loc="ftp://ftp.ncdc.noaa.gov/pub/data/paleo/contributions_by_author/jones2001/jones2001_fig2.txt"
test=read.table(loc,skip=17,header=TRUE,fill=TRUE, colClasses="numeric",nrow=1001)
test[test== -9.999]=NA
count= apply(!is.na(test),1,sum)
test=ts(test,start=1000,end=2000)
J2001=test[,"Jones"]

#3. MBH : reference 1902-1980
url<-"ftp://ftp.ncdc.noaa.gov/pub/data/paleo/contributions_by_author/mann1999/recons/nhem-recon.dat"
MBH99<-read.table(url) ;#this goes to 1980
MBH99<-ts(MBH99[,2],start=MBH99[1,1])

#4. CRU instrumental: 1961-1990 reference
# use old version to 1997 in Briffa archive extended
url<-"ftp://ftp.ncdc.noaa.gov/pub/data/paleo/treering/reconstructions/n_hem_temp/briffa2001jgr3.txt"
#readLines(url)[1:50]
Briffa<-read.table(url,skip=24,fill=TRUE)
Briffa[Briffa< -900]=NA
dimnames(Briffa)[[2]]<-c("year","Jones98","MBH99","Briffa01","Briffa00","Overpeck97","Crowley00","CRU99")
Briffa= ts(Briffa,start=1000)
CRU=window(Briffa[,"CRU99"],start=1850)
tsp(CRU) # 1850 1999 #but starts 1871 and ends 1997
delta<-mean(CRU[(1902:1980)-1850])-mean(CRU[(1960:1990)-1850]);
delta # -0.118922
#used to get MBH values with 1961-1990 reference: compare to 0.12 mentioned in Climategate letters

#get updated version of CRU to update 1998 and 1999 values
loc="http://hadobs.metoffice.com/crutem3/diagnostics/hemispheric/northern/annual"
D=read.table(loc) #dim(D) #158 12 #start 1850
names(D)=c("year","anom","u_sample","l_sample","u_coverage","l_coverage","u_bias","l_bias","u_sample_cover","l_sample_cover",
"u_total","l_total")
cru=ts(D[,2],start=1850)
tsp(cru) # 1850 2009

# update 1998-1999 values with 1998 values
CRU[(1998:1999)-1849]= rep(cru[(1998)-1849],2)

#Fig 2.21 Caption
#The horizontal zero line denotes the 1961 to 1990 reference
#period mean temperature. All series were smoothed with a 40-year Hamming-weights lowpass filter, with boundary constraints
# imposed by padding the series with its mean values during the first and last 25 years.
#this is a low-pass filter
source("http://www.climateaudit.org/scripts/utilities.txt") #get filter.combine.pad function
hamming.filter<-function(N) {
i<-0:(N-1)
w<-cos(2*pi*i/(N-1))
hamming.filter<-0.54 - 0.46*w
hamming.filter<-hamming.filter/sum(hamming.filter)
hamming.filter
}
f=function(x) filter.combine.pad(x,a=hamming.filter(40),M=25)[,2]

## WMO Figure at CRU
#http://www.uea.ac.uk/mac/comm/media/press/2009/nov/homepagenews/CRUupdate
#WMO: http://www.uea.ac.uk/polopoly_fs/1.138392!imageManager/1009061939.jpg
#2009: http://www.uea.ac.uk/polopoly_fs/1.138393!imageManager/4052145227.jpg

briffa=ts.union(archive=Briffa[,"Briffa01"],gate) #collate archived and Climategate versions, as in the retrieval script below
X=ts.union(MBH=MBH99+delta,J2001,briffa=briffa[,"gate"],CRU=cru ) #collate
Y=data.frame(X); year=c(time(X))
sapply(Y, function(x) range(year [!is.na(x)]) )
# MBH J2001 briffa CRU
# [1,] 1000 1000 1402 1850
# [2,] 1980 1991 1994 2009

smoothb= ts(apply(Y,2,f),start=1000)

xlim0=c(1000,2000) #xlim0=c(1900,2000)
ylim0=c(-.6,.35)
par(mar=c(2.5,4,2,1))
col.ipcc=c("blue","red","green4","black")

par(bg="beige")
plot( c(time(smoothb)),smoothb[,1],col=col.ipcc,lwd=2,bg="beige",xlim=xlim0,xaxs="i",ylim=ylim0,yaxs="i",type="n",axes=FALSE,xlab="",ylab="deg C (1961-1990)")
temp= c(time(smoothb))>=1960 #post-1960 values to highlight
points( c(time(smoothb))[temp],smoothb[temp,"briffa"],pch=19,cex=.7,col="magenta")

tangent4ronpaul
12-05-2009, 02:58 PM
http://wattsupwiththat.com/2009/11/26/mcintyre-data-from-the-hide-the-decline/

McIntyre: The deleted data from the “Hide the Decline” trick
26 11 2009

By Steve McIntyre from his camirror.wordpress.com site.

For the very first time, the Climategate Letters “archived” the deleted portion of the Briffa MXD reconstruction of “Hide the Decline” fame – see here. Gavin Schmidt claimed that the decline had been “hidden in plain sight” (see here). This isn’t true.

The post-1960 data was deleted from the archived version of this reconstruction at NOAA here and not shown in the corresponding figure in Briffa et al 2001. Nor was the decline shown in the IPCC 2001 graph, one that Mann, Jones, Briffa, Folland and Karl were working on in the two weeks prior to the “trick” email (or, for that matter, in the IPCC 2007 graph, an issue that I’ll return to).

A retrieval script follows.

For now, here is a graphic showing the deleted data in red.

http://www.climateaudit.org/wp-content/uploads/2009/11/briffa_recon.gif

Figure 1. Two versions of Briffa MXD reconstruction, showing archived and climategate versions. The relevant IPCC 2001 graph, shown below, clearly does not show the decline in the Briffa MXD reconstruction.

Contrary to Gavin Schmidt’s claim that the decline is “hidden in plain sight”, the inconvenient data has simply been deleted.

The reason, as explained on Sep 22, 1999 by Michael Mann to coauthors in 938018124.txt, was to avoid giving “fodder to the skeptics”. Reasonable people might well disagree with Gavin Schmidt as to whether this is a “a good way to deal with a problem” or simply a trick.

http://camirror.files.wordpress.com/2009/11/fig2-212.gif?w=519&h=405&h=350

Figure 2. IPCC 2001 Fig 2.21 showing Briffa, Jones and Mann reconstructions together with HadCRU temperature.

Retrieval script:

##COMPARE ARCHIVED BRIFFA VERSION TO CLIMATEGATE VERSION

#1. LOAD ARCHIVED DATA

url<-"ftp://ftp.ncdc.noaa.gov/pub/data/paleo/treering/reconstructions/n_hem_temp/briffa2001jgr3.txt"
#readLines(url)[1:50]
Briffa<-read.table(url,skip=24,fill=TRUE)
Briffa[Briffa< -900]=NA
dimnames(Briffa)[[2]]<-c("year","Jones98","MBH99","Briffa01","Briffa00","Overpeck97","Crowley00","CRU99")
sapply(Briffa, function(x) range( Briffa$year[!is.na(x)]) )
# year Jones98 MBH99 Briffa01 Briffa00 Overpeck97 Crowley00 CRU99
#[1,] 1000 1000 1000 1402 1000 1600 1000 1871
#[2,] 1999 1991 1980 1960 1993 1990 1987 1997
Briffa= ts(Briffa,start=1000)

#2. LOAD CLIMATEGATE VERSION
loc="http://www.eastangliaemails.com/emails.php?eid=146&filename=939154709.txt"
working=readLines(loc,n=1994-1401+104)
working=working[105:length(working)]
x=substr(working,1,14)
writeLines(x,"temp.dat")
gate=read.table("temp.dat")
gate=ts(gate[,2],start=gate[1,1])

#Comparison
briffa=ts.union(archive= Briffa[,"Briffa01"],gate )
briffa=window(briffa,start=1402,end=1994) #
plot.ts(briffa)

X=briffa

par(mar=c(2.5,3,2,1))
col.ipcc=c("blue","red","green4","black") #palette, as in the script above
plot( c(time(X)),X[,1],col=col.ipcc,lwd=2,ylim=c(-1.2,.5),yaxs="i",type="n",axes=FALSE,xlab="",ylab="")
for( i in 2:1) lines( c(time(X)),X[,i],col=i,lwd=1)
axis(side=1,tck=.025)
labels0=seq(-1,1,.1);labels0[is.na(match(seq(-1,1,.1),seq(-1,1,.5)))]=""
axis(side=2,at=seq(-1,1,.1),labels=labels0,tck=.025,las=1)
axis(side=4,at=seq(-1,1,.1),labels=labels0,tck=.025)
box()
abline(h=0)
title("Hide the Decline")
legend("topleft",fill=2:1,legend=c("Deleted","Archived"))

Uncle Emanuel Watkins
12-05-2009, 03:05 PM
http://www.examiner.com/x-9111-SF-Environmental-Policy-Examiner~y2009m11d24-As-we-wait-for-Round-2-of-climate-gate

As we wait for Round 2 of climate gate...

A number of computer scientists and engineers are analysing computer code contained in the files leaked anonymously to the Internet last week, and it will more than likely produce more controversy than the emails that have been the subject of intense discussion so far.

In fact, if the documentation (notes written by authors and fixers of the computer code) is any indication, what we have seen so far is only prelude.

But before the storm breaks, I think we should summarise what's important in the emails.

First, prominent climate scientists, including a lead author of IPCC report sections, were willing to discuss withholding or deleting information to frustrate legitimate requests made under the Freedom of Information Act in the UK. They apparently chose who could not receive information based on the requester's identity, which may have been unlawful. They threatened to delete data--data which in fact has since disappeared. They advised each other to delete emails.

Second, these same scientists worked closely together to control channels of communication regarding climate science and global warming. They banded together to minimise or eliminate skeptical discussion. While telling the world that only peer-reviewed science should be considered legitimate, they fiercely fought to prevent skeptic writings from being peer-reviewed at all. They wrote openly about replacing an uncooperative journal editor (who was later replaced), and boycotting journals that published skeptical papers. They organised peer review so that they reviewed each others' papers.

Third, they were willing to change data so that their presentations of the state of climate looked worse. At the end of the day, this is most damning--most of the rest, even apparently illegal FOI actions, is just politics and a playground media strategy. But while world governments were imposing taxes, changing energy policies, preparing energy-based conflict policies, planning to deal with warming-based immigration, these people were content to display figures that were wrongly exaggerated to show the warming they had previously predicted but could not find in actual measurements.

I am willing to speculate that further analysis of the computer code will contribute to discussions on why they were unable to show the warming they so desperately needed to find to justify their assertions that the IPCC was too conservative, but time will certainly tell.

In the meantime, while we're waiting for the next release, it's clear that different institutions should take control of several aspects of climate research. In the UK, there are a number of bodies that might be able to sort out what's been going on. Archiving and verification, proper evaluation of previous studies--the UK has a government department called The National Archives that does this for a living, and they have recently undertaken to completely modernise how they go about things. We might ask them for assistance.

Because the way we've done things so far is not getting us to where we need to be. We know there's a problem--global warming is real, and CO2 is a contributor. But we can no longer trust the numbers we have grown accustomed to using, nor the people who generated those numbers. Time for a shake-up.

In order to return to that which made us a nation great, we need to maintain our focus. This means avoiding distractions from that which made it great. As we are so painfully learning today, the flesh and blood in this nation is no better than the flesh and blood outside of its borders. We aren't a great nation because we have superior rulers, for these are necessary tyrants put in charge to serve us; likewise, we aren't a great nation because we have superior lawyers, doctors, business executives, educators, and other ruling elites, for these are the "official" part of the necessary tyranny so designated in order to serve the people.
We are a great nation because of a natural law declared by our Christian Founding Fathers and for no other reason. That natural law established for us a Civil Purpose as a self evident and unalienable Truth undeniable to the conscience to the extent that it trumps every past tradition that persecuted us and every future event yet to happen that might also make us suffer.
Why can't anyone else see the brilliance in this political scheme established by our Christian Founding Fathers?

tangent4ronpaul
12-05-2009, 03:06 PM
http://camirror.wordpress.com/2009/11/26/new-the-deleted-data/

Go here if you want the unmodified CRU data - you'll have to click some links...


New!! Data from the Decline
2009 November 26
by stevemcintyre

For the very first time, the Climategate Letters “archived” the deleted portion of the Briffa MXD reconstruction of “Hide the Decline” fame – see here.

Gavin Schmidt claimed that the decline had been “hidden in plain sight” (see here). This isn’t true. The post-1960 data was deleted from the archived version of this reconstruction at NOAA here and not shown in the corresponding figure in Briffa et al 2001, though pre-calibration values were archived in a different NCDC file here. While the decline was shown in Briffa et al 1998 and Briffa 2000, it was not shown in the IPCC 2001 graph, one that Mann, Jones, Briffa, Folland and Karl were working on in the two weeks prior to the “trick” email (or, for that matter, in the IPCC 2007 graph, an issue that I’ll return to). For now, here is a graphic showing the deleted data in red. A retrieval script follows.

[...]

tangent4ronpaul
12-05-2009, 03:12 PM
UK Met Office to release data and code
5 12 2009

While this is encouraging news, releasing a subset will fuel some suspicion. A better choice would be to release the entire set. It may be too little, too late; the die of public opinion has been cast. Had they done this six months ago, they would have appeared visionary rather than reactionary. The most encouraging news is the statement: “We intend that as soon as possible we will also publish the specific computer code…”. I applaud that, and I hope they do a better job than NASA GISS did, whose code is so esoteric that it is difficult to get running. Many have tried, one may have succeeded. – Anthony

From the Met Office Press Release:

Release of global-average temperature data

05 December 2009

The Met Office has announced plans to release, early next week, station temperature records for over one thousand of the stations that make up the global land surface temperature record.

This data is a subset of the full HadCRUT record of global temperatures, which is one of the global temperature records that have underpinned IPCC assessment reports and numerous scientific studies. The data subset will consist of a network of individual stations that has been designated by the World Meteorological Organisation for use in climate monitoring. The subset of stations is evenly distributed across the globe and provides a fair representation of changes in mean temperature on a global scale over land.

This subset is not a new global temperature record and it does not replace the HadCRUT, NASA GISS and NCDC global temperature records, all of which have been fully peer reviewed. We are confident this subset will show that global average land temperatures have risen over the last 150 years.

This subset release will continue the policy of putting as much of the station temperature record as possible into the public domain.

We intend that as soon as possible we will also publish the specific computer code that aggregates the individual station temperatures into the global land temperature record.

As soon as we have all permissions in place we will release the remaining station records – around 5000 in total – that make up the full land temperature record. We are dependent on international approvals to enable this final step and cannot guarantee that we will get permission from all data owners.

UEA fully supports the Met Office in making this data publicly available and is continuing to work with the Met Office to seek the necessary permission from national data owners to publish, as soon as possible, as much of the data as we can gain permission for.

Uncle Emanuel Watkins
12-05-2009, 03:25 PM
http://camirror.wordpress.com/2009/11/26/new-the-deleted-data/

Go here if you want the unmodified CRU data - you'll have to click some links...


New!! Data from the Decline
2009 November 26
by stevemcintyre

For the very first time, the Climategate Letters “archived” the deleted portion of the Briffa MXD reconstruction of “Hide the Decline” fame – see here.

Gavin Schmidt claimed that the decline had been “hidden in plain sight” (see here). This isn’t true. The post-1960 data was deleted from the archived version of this reconstruction at NOAA here and not shown in the corresponding figure in Briffa et al 2001, though pre-calibration values were archived in a different NCDC file here. While the decline was shown in Briffa et al 1998 and Briffa 2000, it was not shown in the IPCC 2001 graph, one that Mann, Jones, Briffa, Folland and Karl were working on in the two weeks prior to the “trick” email (or, for that matter, in the IPCC 2007 graph, an issue that I’ll return to). For now, here is a graphic showing the deleted data in red. A retrieval script follows.

[...]

We don't behave happily because being so makes us responsible, no; rather, we behave responsibly because being so makes us happy. If we are unhappy as human beings, we go to war. It's as simple as that. Don't blame human beings for this wayward behavior, but the Almighty who created them.

tangent4ronpaul
12-05-2009, 03:35 PM
http://wattsupwiththat.com/2009/12/04/uk-met-office-do-over-entire-global-temperature-series-160-years-worth/

UK Met office announces a do-over: entire global temperature series – 160 years worth
4 12 2009

Quite a bit different from their November 24th statement, which you can read here. For those that still think Climategate has no significant impact on climate science, this revelation tells another story.

Met Office to re-examine 160 years of climate data

Ben Webster, Environment Editor, The Times Online

The Met Office plans to re-examine 160 years of temperature data after admitting that public confidence in the science on man-made global warming has been shattered by leaked e-mails.

The new analysis of the data will take three years, meaning that the Met Office will not be able to state with absolute confidence the extent of the warming trend until the end of 2012.

The Met Office database is one of three main sources of temperature data analysis on which the UN’s main climate change science body relies for its assessment that global warming is a serious danger to the world. This assessment is the basis for next week’s climate change talks in Copenhagen aimed at cutting CO2 emissions.

The Government is attempting to stop the Met Office from carrying out the re-examination, arguing that it would be seized upon by climate change sceptics.

The Met Office works closely with the University of East Anglia’s Climatic Research Unit (CRU), which is being investigated after e-mails written by its director, Phil Jones, appeared to show an attempt to manipulate temperature data and block alternative scientific views.

The Met Office’s published data showing a warming trend draws heavily on CRU analysis. CRU supplied all the land temperature data to the Met Office, which added this to its own analysis of sea temperature data.

Since the stolen e-mails were published, the chief executive of the Met Office has written to national meteorological offices in 188 countries asking their permission to release the raw data that they collected from their weather stations.

The Met Office is confident that its analysis will eventually be shown to be correct. However, it says it wants to create a new and fully open method of analysing temperature data.

The development will add to fears that influential sceptics in other countries, including the US and Australia, are using the controversy to put pressure on leaders to resist making ambitious deals for cutting CO2.

The UN’s Intergovernmental Panel on Climate Change admitted yesterday that it needed to consider the full implications of the e-mails and whether they cast doubt on any of the evidence for man-made global warming.

========

“Influential sceptics in other countries” – I wonder who that could be?

I applaud the open process though.

-----------

I think this means we "won"! :)

-t

squarepusher
12-05-2009, 03:44 PM
The rumor was, you could feed white noise into their inputs, and output global warming every time.

Uncle Emanuel Watkins
12-05-2009, 04:17 PM
The rumor was, you could feed white noise into their inputs, and output global warming every time.

Remember the ancient Chinese story involving Confucius, his disciples, a woman living in the country, and a tiger? When Confucius asked the country woman where her husband was, she answered that he had been eaten by a tiger. When he asked her where her children were, she answered that her only son had also been eaten by a tiger. When Confucius then asked her why she hadn't moved back to the city, where she would be safe from the tiger, the woman answered that the city was ruled by an evil government.
Upon hearing her reply, Confucius marvelled at the woman. He then turned to his disciples to declare, "Let it be known that the people would rather be eaten by a tiger than have to suffer under the corruption of an evil tyranny."
Of course, this is in my own words, not being Confucius word for word.
So? Should we stop cleaning up? No, not in the least. I would think that the thought of cleaning up our environment makes most of us "happy." In other words, the worst pollution ever suffered by man is tyranny. As we continue cleaning up the environment, the cleaning up of tyranny should continue being our primary objective.

Anti Federalist
12-05-2009, 04:33 PM
Remember the ancient Chinese story involving Confucius, his disciples, a woman living in the country, and a tiger? When Confucius asked the country woman where her husband was, she answered that he had been eaten by a tiger. When he asked her where her children were, she answered that her only son had also been eaten by a tiger. When Confucius then asked her why she hadn't moved back to the city, where she would be safe from the tiger, the woman answered that the city was ruled by an evil government.
Upon hearing her reply, Confucius marvelled at the woman. He then turned to his disciples to declare, "Let it be known that the people would rather be eaten by a tiger than have to suffer under the corruption of an evil tyranny."
Of course, this is in my own words, not being Confucius word for word.
So? Should we stop cleaning up? No, not in the least. I would think that the thought of cleaning up our environment makes most of us "happy." In other words, the worst pollution ever suffered by man is tyranny. As we continue cleaning up the environment, the cleaning up of tyranny should continue being our primary objective.

The point UEW, is simply this:

Tyranny is being brought to our door in the guise of "cleaning up the environment".

Like the concept of Lebensraum in Nazi Germany: benign enough when discussed, but the end result is the death camp.

Thus will be most of humanity's fate under the banner of fighting AGW, once it becomes clear, through fraudulent data, that "half measures" are not working and a "Final Solution" is needed.

tangent4ronpaul
12-05-2009, 04:34 PM
http://wattsupwiththat.com/2009/12/04/jo-nova-finds-the-medieval-warm-period/

Jo Nova finds the Medieval Warm Period
4 12 2009

From Jo Nova, a look at how the MWP appears when other data is used, not just a few trees in Yamal.

These maps and graphs make it clear just how brazen the fraud of the Hockey Stick is.

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/web-pics/mwp-global-studies-map-i-ppt.gif

Click to enlarge

It’s clear that the world was warmer during medieval times. Marked on the map are study after study (all peer-reviewed) from all around the world comparing medieval temperatures with today’s. These use ice cores, stalagmites, sediments, and isotopes. They agree with 6,144 boreholes around the world, which found that temperatures were about 0.5°C warmer worldwide.

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/web-pics/boreholes-huang-1997.gif

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/web-pics/most-influential-tree-350.jpg

What follows is a sordid tale of a graph that overthrew decades of work, conveniently fitted the climate models, and was lauded triumphantly in glossy publication after publication. But then it was crushed when an unpaid analyst stripped it bare. It had been published in the highest, most prestigious journal, Nature, but no one had checked it, before or after it was spread far and wide. Not Nature, not the IPCC, not any other climate researcher.

In 1995 everyone agreed the world was warmer in medieval times, but CO2 was low then, and that didn’t fit with climate models. In 1998, suddenly, Michael Mann ignored the other studies and produced a graph that scared the world: tree rings showed that the “1990s was the hottest decade for a thousand years”. Now temperatures exactly “fit” the rise in carbon! The IPCC used the graph all over their 2001 report. Government departments copied it. The media told everyone.

But Steve McIntyre was suspicious. He wanted to verify it, yet Mann repeatedly refused to provide his data or methods, normally a basic requirement of any scientific paper. It took legal action to get the information that should have been freely available. Within days McIntyre showed that the statistics were so flawed that you could feed in random data and still get the same hockey stick shape nine times out of ten. Mann had left out some tree rings he said he’d included. If someone did a graph like this in a stock prospectus, they would be jailed.

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/web-pics/synthesis-report-summary-tar-hockey-stick-web.gif

Astonishingly, Nature refused to publish the correction. It was published elsewhere, and backed up by the Wegman Report, the work of an independent panel of statistical experts.

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/web-pics/rcs_chronologies1-ii-web.gif

In 2009 McIntyre did it again with Briffa’s Hockey Stick. After asking and waiting three years for the data, it took just three days to expose it too as baseless. For nine years Briffa had concealed that he only had 12 trees in the sample from 1990 onwards, and that one freakish tree virtually transformed the graph. When McIntyre graphed another 34 trees from the same region of Russia, there was no Hockey Stick.

The sharp upward swing of the graph was due to one single tree in Yamal.

Skeptical scientists have literally hundreds of samples. Unskeptical scientists have one tree in Yamal, and a few flawed bristlecones…

It was an audacious fraud.

Climate models don’t know why it was warmer 800 years ago.

The models are wrong.

The so-called “expert review” is meaningless. The IPCC say 2,500 experts review their reports, but those same “experts” made the baseless Hockey Stick graph their logo in 2001.

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/web-pics/loehle_e-e_2007-5-fig-2-web.gif

Craig Loehle used 18 other proxies. Temperatures were higher 1,000 years ago and cooler 300 years ago. We started warming long before cars and power stations were invented. There’s little correlation with CO2 levels.

Sources: Loehle 2007; Huang and Pollack 1997; see co2science.org for the other peer-reviewed studies that go with every orange dot on the map; McIntyre & McKitrick 2003 and 2005, and update; Mann et al 1998; Briffa 2006; read McIntyre at climateaudit.org; see “ClimateGate”; and Monckton, “What Hockey Stick?” (Science and Public Policy Institute paper)

GunnyFreedom
12-05-2009, 05:43 PM
Good, good stuff -t

Bruno
12-05-2009, 06:20 PM
Thanks for sharing this, tangent.

Brooklyn Red Leg
12-05-2009, 06:44 PM
I'm of the opinion that Mann and the other US 'climate scientists' who are caught up in this whole fraud need to spend time in a Federal Penitentiary and be forced to pay back at least SOME of the money they have robbed from US taxpayers.

mczerone
12-05-2009, 06:52 PM
To quote the Dr.'s second most recent book:

You have been lied to

tangent4ronpaul
12-05-2009, 08:46 PM
Yeah - we've all been lied to - thanks for your support of this thread!

-t

FindLiberty
12-05-2009, 09:53 PM
The source code, as explained in this thread, is very easy to understand. Thanks to all!


Originally Posted by squarepusher
...the rumor was, you could feed white noise into their inputs,
and [the program would] output global warming every time

I think using weighted pink noise instead of white noise would be a more likely test data source for these Watermelon Marxist (green on the outside, red on the inside) bogus scientists! It appears that these corrupt, self-serving programmer hacks are the ones who directly helped create and/or justify Gore's bogus global warming cult.

+++

This makes me think of the electronic voting machines and how difficult it must be to keep the vote-counting code from being willfully "corrupted" to secretly produce a particular election outcome.

Also, can we ever really trust the FED's or the FDIC's accounting any farther than I could throw the GAO?

parocks
12-05-2009, 11:20 PM
Link to 20 page pdf with this info and more.

http://joannenova.com.au/globalwarming/skeptics-handbook-ii/the_skeptics_handbook_II-sml.pdf

Andrew-Austin
12-05-2009, 11:30 PM
Can you summarize what all this says, so as to save people the trouble?

Dieseler
12-05-2009, 11:36 PM
Someone should take all of this data and put it in a time capsule somewhere. Not sure how, but make it to be found or opened in twenty years.
A capstone maybe?
I bet it would open some eyes on that date.
"We are the Priests of the Temples of Syrinx. Our great computers fill the hallowed halls."
Yeah, leave a guitar and a copy of 2112 in there too.
Thanks for the research, by the way, -T

Brian4Liberty
12-06-2009, 12:58 AM
Tangent, you seem to be posting unrelated information in a confusing manner. What is your source for all of your postings? Do you have any input, or are you just providing "raw" internet data for the rest of us to analyze?

GunnyFreedom
12-06-2009, 01:04 AM
Tangent, you seem to be posting unrelated information in a confusing manner. What is your source for all of your postings? Do you have any input, or are you just providing "raw" internet data for the rest of us to analyze?

Doesn't look unrelated to me. In fact, it looks pretty cohesive. Maybe we are reading different threads? Some of the data are from different sources, but the distinctions in context are not nearly enough to consider them unrelated. These posts all center on the source code that produces the charts, and they contrast the manipulated data with the unmanipulated data, demonstrating a clear pattern of misbehavior at the CRU.

dannno
12-06-2009, 01:07 AM
http://www.clubtroppo.com.au/wp-content/uploads/2007/04/threadofdoom.gif

Brian4Liberty
12-06-2009, 01:12 AM
Some of the data are from different sources,

That is what I am referring to. Certainly they are all "related" to the source code, but it is all kind of random. I am not attempting to defend the climate frauds in any way.

GunnyFreedom
12-06-2009, 01:20 AM
That is what I am referring to. Certainly they are all "related" to the source code, but it is all kind of random. I am not attempting to defend the climate frauds in any way.

Well all the data is relevant, not sure how the "randomness" of the sourcing could be alleviated except maybe posting in a different order? I'm loving the hell out of this thread. I'm with dannno on this one -- epic thread!

tangent4ronpaul
12-06-2009, 04:57 AM
Doesn't look unrelated to me. In fact, it looks pretty cohesive. Maybe we are reading different threads? Some of the data are from different sources, but the distinctions in context are not nearly enough to consider them unrelated. These posts all center on the source code that produces the charts, and they contrast the manipulated data with the unmanipulated data, demonstrating a clear pattern of misbehavior at the CRU.

Well it's all based on CRU data, but some of the posts are about NZ climate data being modified. I don't think I posted the one about the NOAA Hawaii "oops"...

Point being that corruption in one lab has propagated, intentionally or otherwise, to corrupt other labs' results.

-t

tangent4ronpaul
12-06-2009, 11:38 AM
blimpers!

tangent4ronpaul
12-06-2009, 07:41 PM
no more comments?

-t

huckans
12-08-2009, 10:32 PM
Very impressive. I am a physicist and just received an email from Lindzen (MIT) asking me to sign a petition asking APS (American Physical Society) to change its policy statement from "Anthropogenic Climate Change is Real" to "The Jury is Still Out." This in light of the East Anglia leaks. Of course I signed it. Lindzen appears in the 2007 documentary "The Great Global Warming Swindle."

Also, there was an excellent interview on RT in the last day or so in which Monckton quoted a recent conversation with Lindzen on the likelihood of temperatures being warmer in 100 years. The answer: we don't know.

hillertexas
12-09-2009, 01:08 PM
Very impressive. I am a physicist and just received an email from Lindzen (MIT) asking me to sign a petition asking APS (American Physical Society) to change its policy statement from "Anthropogenic Climate Change is Real" to "The Jury is Still Out." This in light of the East Anglia leaks. Of course I signed it.

My husband is a geophysicist and just signed the "30,000 scientist" petition. You probably qualify to sign it too if you haven't already. You have to mail it in...no internet signatures allowed.
http://www.petitionproject.org/ http://www.oism.org/pproject/GWPetition.pdf

more info: http://www.oism.org/pproject/s33p36.htm ----->treatise on why AGW is wrong

huckans
12-14-2009, 12:53 AM
Thanks,

I sent that one in too.

Reason
12-14-2009, 10:32 AM
"Climate-gate" emails controversy examined
http://therealnews.com/t2/index.php?option=com_content&task=view&id=31&Itemid=74&jumival=4605

huckans
12-14-2009, 02:38 PM
The leaking of these emails reminds me of the demystification of the FED. I think we are living through an Oz "pulling back of the curtain" moment in these two areas.

idirtify
12-14-2009, 04:27 PM
This source-code stuff will certainly be difficult for us laypeople to interpret regarding the big picture, but to me Climategate looks worse than a smoking gun. In fact, it looks like hard evidence of the crime. Am I wrong? The “experts” originally came out saying there was no smoking gun in the emails. It looks to me like there was plenty of smoke and it led right to the fire.

idirtify
12-15-2009, 10:35 AM
This source-code stuff will certainly be difficult for us laypeople to interpret regarding the big picture, but to me Climategate looks worse than a smoking gun. In fact, it looks like hard evidence of the crime. Am I wrong? The "experts" originally came out saying there was no smoking gun in the emails. It looks to me like there was plenty of smoke and it led right to the fire.

Am I wrong?

idirtify
12-15-2009, 10:48 AM
The leaking of these emails reminds me of the demystification of the FED. I think we are living through an Oz "pulling back of the curtain" moment in these two areas.

Good point. The whole internet is leading to the demystification of all government “royalty”. That’s why governments have historically hated free communication; it always eventually leads to their exposure. I’m sure modern communication technology is very frustrating for them; they can’t fully enjoy their ease of propaganda without it backfiring on them.

huckans
12-22-2009, 10:21 AM
So, what's the final upshot on Copenhagen? It sounds like very little was accomplished--am I wrong?

tangent4ronpaul
12-22-2009, 10:51 AM
So, what's the final upshot on Copenhagen? It sounds like very little was accomplished--am I wrong?

Obama left with a non-binding, unenforceable agreement and pledged to cut US emissions and tax us hundreds of billions to give to developing countries to help them deal with the problem.

Meanwhile, he's about to tell Mexico they are exempt, so the rest of our manufacturing infrastructure will go south of the border as carbon taxes make it too expensive to continue making things in this country.

Oh, and he promised to get together in a few years for another climate conference.

Way to go Obama! - NOT!

-t

huckans
12-23-2009, 09:16 PM
And the next conference will be in...Mexico City?