At the heart of fisheries management is a delicate balancing act – weighing the needs of fishermen (and their families and communities) against those of fish (and their ecosystems). It is a difficult task, even with the best possible information. Unfortunately, managers often don’t have the quantity or quality of data one would wish for. A new report by scientists at Stanford University and the Environmental Defense Fund breaks down the options available to those attempting to manage such “data-poor” fisheries.
- Extrapolation: This is the worst-case scenario – a fish stock about which almost nothing is known. The only option available is to make educated guesses based on other populations or similar species. The authors of the report urge “extreme caution” and “precautionary management” because it is fairly easy to make a mistake that could allow a population to crash.
- Vulnerability: When scientists know something about the basic life-history of a fish – how long it lives, how fast it grows and reproduces, whether it migrates or forms large schools – this information can be used to get a handle on how vulnerable a population might be to a given level of fishing pressure. The report calls such methods “useful in determining potential conservation measures and management decisions” but cautions they are “only as good as the original data inputs.”
- Trends: By adding some relatively easy-to-collect data about how many and what sizes of fish are being caught, managers can start to get a sense of changes in the fish population. The idea is to use recent fishing history to set sustainable catch limits; if catches are declining, the limits are probably too high (a rough sketch of this idea follows the list). The caveat here is that it’s hard to know what’s causing the changes – fishing, or some natural process – though additional information about environmental conditions can help in that regard. Even then, using such trends to make management decisions assumes that the current trend reflects a lasting change, not just a passing phase. That’s a rather large assumption, and we all know the old adage about what happens when we assume. For example, a stock that was overfished before catch data were collected could present the appearance of sustainability when, in fact, it’s just barely hanging on.
- Decision-making: The final option, requiring the most detailed data, is the use of so-called decision trees. Comparison of fished and unexploited populations – either from another location where fishing is restricted or from the same population before fishing began – can produce estimates of how healthy the stock is and, with some further refinement, define acceptable catch limits.
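To make the “trends” idea a little more concrete, here is a minimal sketch (in Python) of what a catch-trend-based control rule might look like. This is not the procedure from the Stanford/EDF report – the function name, the linear-trend choice, and the 25% maximum cut are all invented for illustration of the general logic: base next season’s limit on recent catch history, and scale it down when catches are falling.

```python
# A minimal sketch of a trend-based catch limit adjustment.
# Not the method from the Stanford/EDF report -- all constants and
# names here are arbitrary, purely to illustrate the idea described
# above: use recent catch history to set the next limit, and cut it
# when catches trend downward.

from statistics import mean


def next_catch_limit(recent_catches, max_cut=0.25):
    """Suggest next season's catch limit from recent annual landings.

    recent_catches: landings for the last few seasons, oldest first.
    max_cut: largest fractional reduction applied when catches are falling.
    """
    n = len(recent_catches)
    baseline = mean(recent_catches)

    # Ordinary least-squares slope of catch against year index,
    # expressed as a fraction of the baseline per year.
    years = range(n)
    y_bar = mean(years)
    slope = sum((y - y_bar) * (c - baseline)
                for y, c in zip(years, recent_catches)) / \
            sum((y - y_bar) ** 2 for y in years)
    relative_trend = slope / baseline

    if relative_trend < 0:
        # Catches are declining: assume the current limit is too high
        # and cut the baseline in proportion to the decline, up to max_cut.
        cut = min(max_cut, abs(relative_trend))
        return baseline * (1 - cut)

    # Flat or rising catches: hold the line at the recent average rather
    # than ratcheting the limit upward (a precautionary choice).
    return baseline


# Example: five years of declining landings (tonnes).
print(round(next_catch_limit([1200, 1150, 1000, 950, 900]), 1))
```

Even a toy rule like this makes the shifting-baseline problem obvious: if the stock was already depleted before the catch series began, a flat trend just locks in the depleted state.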
These methods provide efficient, inexpensive ways to get at the most fundamental information necessary for fishery management – stock health and sustainable catch levels – using limited data and relatively simple math. Since all fisheries were at some point (and many remain) both data- and resource-poor, it only seems logical that such methods would be the first ones developed. And yet, they’re fairly new. I asked Rod Fujita why:
It doesn’t make much sense on the face of it. But I suppose the conventional wisdom was that we should just assess the important stocks and use sophisticated methods. The trouble is, we don’t really know which stocks are important ecologically and there has never been enough resources or talent to get the job done. Then Congress mandated annual catch limits for ALL federally managed stocks, so that lit a fire and created strong incentives for new methods to be developed.
The pressure of a deadline seems to have produced the desired result. NOAA’s retired chief fisheries scientist, Steve Murawski, recently announced that the 2010 fishing season (which ends at various points in 2011) is expected to be the first in which no U.S. stocks are being actively overfished – thanks largely to reduced or newly established fishing limits. But Fujita also points out that simple methods for managing fisheries have been around for millennia – they just weren’t science-based.