By Chris Lang
A new report from the Union of Concerned Scientists claims to show “how a substantial number of developing countries … have reduced deforestation.” The report is based on peer-reviewed quantitative data. Good news!
Well, sort of. Unfortunately, as the authors of the report admit, the “success” in reducing deforestation is the result of a very selective analysis: “This is not a meta-analysis or a comprehensive review paper – we did not look for failures…”
Imagine if we took this approach with football. England’s miserable performance in the World Cup, for example. We don’t need to worry about the fact that Wayne Rooney managed to head the ball against the bar from less than a metre out. That was a failure, and we’re not looking for failures. England scored in both of the first two games. Success. Wayne Rooney scored against Uruguay. Success. All we need to do is ignore the fact that Italy and Uruguay scored more goals than England, and we have a success!
Obviously, the UCS report, “Deforestation Success Stories”, isn’t such a blatant fraud. But it’s not far off.
The success stories in the UCS report are limited to tropical countries with quantitative evidence of a reduction in deforestation, backed by “some independent review of the evidence”. The report lists three elements that were not required for inclusion: evidence of additionality, estimates of leakage, and assurance that social and economic benefits were widely and fairly shared.
It makes use of “The Forest Transition Curve”, which looks reasonable enough, but reveals a serious flaw in the UCS report:
At first deforestation is rapid, then it slows, before transitioning to reforestation. But another way of looking at this “forest transition” is as a shift from highly biodiverse natural forests to logged-over, degraded forests and monoculture tree plantations. Nowhere in the report does UCS define what it means by forests. Without a definition of forests that excludes industrial tree plantations, the concept of forest transition becomes meaningless.
The report divides “successes” into three categories. The first is where policies and programmes have led to reduced deforestation (Brazil, Guyana, Madagascar, Kenya, and India). The second is where payments for ecosystem services have been “beneficial for forests” despite not working out as planned (Mexico, Vietnam, Costa Rica, and Tanzania/Mozambique). The third is where “success” is due more to changes in the socioeconomic context than to policy reforms (central Africa, El Salvador). Future posts on REDD-Monitor will look in more detail at UCS’s analysis of deforestation in some of these countries.
The report admits that nearly all the successes are partial – limited by “leakage”, for example, where deforestation is reduced in one area but increases somewhere else. Nevertheless, UCS maintains that “the overall result is that deforestation has been reduced and reforestation increased”. But without defining what a forest is, statements like this are close to meaningless.
The report notes that the rate of deforestation in the 1990s was 16 million hectares per year. In the first decade of the 21st century the figure came down to 13 million hectares per year. But the source for UCS’s figures is the UN Food and Agriculture Organisation (FAO), and the FAO’s figures are notoriously misleading. The online forest monitoring project Global Forest Watch relies on satellite data and paints a somewhat different picture. Between 2000 and 2012, the world lost a total of 229.8 million hectares of tree cover. In 2012 alone, the figure was 20.8 million hectares.
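To put those headline numbers side by side, here is a minimal back-of-envelope sketch in Python. It is an illustration only: the assumption that the 2000–2012 total covers 12 annual intervals is mine, and, as the next paragraph explains, the two datasets use different forest definitions and are not strictly comparable.

```python
# Rough comparison of the headline figures quoted above.
# Assumption: the Global Forest Watch total for 2000-2012 covers 12 annual
# intervals. The FAO and Global Forest Watch datasets use different forest
# definitions, so this is illustrative only, not a like-for-like comparison.

FAO_RATE_1990S = 16.0        # million hectares per year (FAO, 1990s)
FAO_RATE_2000S = 13.0        # million hectares per year (FAO, 2000s)
GFW_TOTAL_2000_2012 = 229.8  # million hectares of tree cover lost (GFW)
GFW_LOSS_2012 = 20.8         # million hectares of tree cover lost in 2012

gfw_average = GFW_TOTAL_2000_2012 / 12  # roughly 19 million hectares per year

print(f"FAO average, 1990s:      {FAO_RATE_1990S:.1f} million ha/year")
print(f"FAO average, 2000s:      {FAO_RATE_2000S:.1f} million ha/year")
print(f"GFW average, 2000-2012:  {gfw_average:.1f} million ha/year")
print(f"GFW loss in 2012 alone:  {GFW_LOSS_2012:.1f} million ha/year")
```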
Global Forest Watch is a huge improvement over the FAO deforestation data. But Global Forest Watch’s information on deforestation only starts in 2000, so UCS couldn’t use it to compare deforestation in the 1990s with deforestation today. And because Global Forest Watch uses a different definition of forests (and more reliable data), its figures cannot be directly compared with the FAO deforestation rates. UCS doesn’t mention the problems with the FAO data, and in its discussion of global deforestation rates it doesn’t mention Global Forest Watch at all.
Of course, UCS’s authors are aware of Global Forest Watch. In fact they use the data to exclude Indonesia from the report. UCS notes that while official data shows a reduction in the rate of deforestation, Global Forest Watch reveals an increase in Indonesia’s deforestation. Since 2010, when the moratorium on forest concessions started, the rate of deforestation in Indonesia has doubled. UCS, though, isn’t looking for failures:
Because of this conflicting information, we felt that at present we could not confidently consider Indonesia a success story in accordance with our criteria, and thus we left it out of the report.
REDD is “primary among the successful approaches”, according to UCS. This perhaps surprising conclusion is helped by using a broad definition of REDD. It includes payments for ecosystem services, law enforcement, governance reforms, combating corruption, land tenure, reinforcement of private commitments, moratoria, and “combining environmental actions with social and economic development efforts”.
And at the end of the report, UCS recommends that policy makers “Implement REDD+ programs” because REDD “has had a major impact”. But like England’s performance in Brazil, REDD can only be declared a success if we don’t look for failures.
This is a report that seems to have started from two conclusions: that REDD will save the world’s forests, and that more money is needed to finance REDD. The authors worked back from these conclusions, cherry-picking information that supported their case. Of course, there is some truth in the report. Since 2004, Brazil has reduced its rate of deforestation, for example. But it is ridiculous to make policy recommendations based on an analysis that explicitly did not look into additionality, leakage, or the sharing of socio-economic benefits, and that did not look for failures.