Why is biodiversity ‘good’? To what extent is this sort of goodness even relevant to ecosystems—as opposed to us humans? I’d like to study this mathematically.
To do this, we’d need to extract some answerable questions out of the morass of subtlety and complexity. For example: what role does biodiversity play in the ability of ecosystems to stay robust under sudden changes of external conditions? This is already plenty hard to study mathematically, since it requires making ‘biodiversity’ and ‘robustness’ precise.
Luckily there has already been a lot of work on the mathematics of biodiversity and its connection to entropy. For example:
• Tom Leinster, Measuring biodiversity, Azimuth, 7 November 2011.
But how does biodiversity help robustness?
There’s been a lot of work on this. This paper has some inspiring passages:
• Robert E. Ulanowicz, Sally J. Goerner, Bernard Lietaer and Rocio Gomez, Quantifying sustainability: Resilience, efficiency and the return of information theory, Ecological Complexity 6 (2009), 27–36.
I’m not sure the math lives up to their claims, but I like these lines:
In other words, (14) says that the capacity for a system to undergo evolutionary change or self-organization consists of two aspects: It must be capable of exercising sufficient directed power (ascendancy) to maintain its integrity over time. Simultaneously, it must possess a reserve of flexible actions that can be used to meet the exigencies of novel disturbances. According to (14) these two aspects are literally complementary.
The two aspects are ‘ascendancy’, which is something like efficiency or being optimized, and ‘reserve capacity’, which is something like random junk that might come in handy if something unexpected comes up.
You know those gadgets you kept in the back of your kitchen drawer and never needed… until you did? If you’re aiming for ‘ascendancy’ you’d clear out those drawers. But if you keep that stuff, you’ve got more ‘reserve capacity’. They both have their good points. Ideally you want to strike a wise balance. You’ve probably sensed this every time you clean out your house: should I keep this thing because I might need it, or should I get rid of it?
I think it would be great to make these concepts precise. The paper at hand attempts this by using a matrix of nonnegative numbers to describe flows in an ecological network. They define a kind of entropy for this matrix, very similar in form to Shannon entropy. Then they write this as a sum of two parts: a part closely analogous to mutual information, and a part closely analogous to conditional entropy. This decomposition is standard in information theory. This is their equation (14).
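Here is a small sketch of that decomposition, assuming we normalize the flow matrix into a joint probability distribution. The function name `decompose` and the variable names are mine, not the paper's; the identity it illustrates is the standard one that joint entropy equals mutual information plus the two conditional entropies, which is (up to scaling by total flow) the split the authors use in their equation (14):

```python
import math

def decompose(T):
    """Split the entropy of a nonnegative flow matrix T (T[i][j] = flow
    from compartment i to compartment j) into a mutual-information part
    ('ascendancy' per unit flow) and a conditional-entropy part
    ('reserve capacity' per unit flow)."""
    total = sum(sum(row) for row in T)
    row_sums = [sum(row) for row in T]
    col_sums = [sum(T[i][j] for i in range(len(T))) for j in range(len(T[0]))]

    H = 0.0  # joint entropy of the normalized flows: the 'capacity'
    I = 0.0  # mutual information between source and target: 'ascendancy'
    for i, row in enumerate(T):
        for j, t in enumerate(row):
            if t > 0:
                p = t / total
                H -= p * math.log2(p)
                I += p * math.log2(t * total / (row_sums[i] * col_sums[j]))
    return H, I, H - I  # capacity, ascendancy, reserve capacity
```

For a perfectly rigid network like `[[1, 0], [0, 1]]`, where each source feeds exactly one target, the reserve capacity is zero: all the entropy is mutual information. For a maximally unconstrained network like `[[1, 1], [1, 1]]`, the mutual information is zero and everything is reserve. Real ecosystems sit in between.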
The new idea of these authors is that in the context of an ecological network, the mutual information can be understood as ‘ascendancy’, while the conditional entropy can be understood as ‘reserve capacity’.
I don’t know if I believe this! But I like the general idea of a balance between ascendancy and reserve capacity.
While the dynamics of this dialectic interaction can be quite subtle and highly complex, one thing is boldly clear—systems with either vanishingly small ascendancy or insignificant reserves are destined to perish before long. A system lacking ascendancy has neither the extent of activity nor the internal organization needed to survive. By contrast, systems that are so tightly constrained and honed to a particular environment appear ‘‘brittle’’ in the sense of Holling (1986) or ‘‘senescent’’ in the sense of Salthe (1993) and are prone to collapse in the face of even minor novel disturbances. Systems that endure—that is, are sustainable—lie somewhere between these extremes. But, where?