By Kyle Niemeyer | Published about an hour ago
To improve long-term predictions of global climate change, we need more information about the current and changing environment.
Unfortunately, in the current era of government budget problems, expensive satellite climate studies are being cut. That makes it important to identify which measurements we need most, choosing among things like air temperature, pressure, humidity, radiance at various wavelengths, and radiation transfer to and from the surface.
One possible way of prioritizing is to figure out which of those measurements would help us the most when it comes to projecting future climate change, and focus research funds there. A paper that recently appeared in the Proceedings of the National Academy of Sciences presents a statistical method for doing this, and shows that surface temperature measurements may not be the most useful data for improving surface temperature predictions.
Incorporating more data into climate models can improve the accuracy and precision of long-term predictions. In fact, any additional data could improve the predictions, but with limited resources, emphasis should be placed on obtaining the most useful data.
For the new paper, researchers started with a set of models used in the Intergovernmental Panel on Climate Change's (IPCC's) Fourth Assessment Report, and used a statistical method known as Bayesian inference to determine the improvements in accuracy and precision that resulted from including additional data when modeling one particular emissions scenario, the IPCC's A1B. This scenario gives a best estimate for temperature rise of 2.8°C, but has a likely range of 1.7-4.4°C—it's obvious that additional data to refine this range would be useful.
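To get a feel for how Bayesian inference narrows a projection like this, here is a minimal sketch (not the paper's actual method) of a conjugate Gaussian update: a prior belief about temperature rise is combined with one new, noisy measurement, and the posterior uncertainty shrinks. The specific numbers are hypothetical, loosely inspired by the A1B figures quoted above.

```python
def gaussian_update(prior_mean, prior_var, obs, obs_var):
    """Combine a Gaussian prior with a Gaussian-noise observation.

    Returns the posterior mean and variance; the posterior variance is
    always smaller than both the prior and observation variances,
    which is the sense in which new data 'refines the range'.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Hypothetical prior: best estimate 2.8 degrees C, with a standard
# deviation chosen for illustration only.
prior_mean, prior_var = 2.8, 0.7 ** 2

# A hypothetical new measurement constraining the same quantity.
obs, obs_var = 3.0, 0.5 ** 2

post_mean, post_var = gaussian_update(prior_mean, prior_var, obs, obs_var)
print(post_mean, post_var)
```

The posterior mean lands between the prior estimate and the observation, weighted by their respective precisions, and the posterior variance is smaller than either input variance. The actual study applies this kind of updating to full climate model ensembles rather than a single scalar.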