by Michael Conathan
When fishermen cast off their lines and leave the dock, they believe their skill, knowledge, and experience will lead them to the fish. They trust that weather and natural forces will not present more of an obstacle than their crafts can handle. And they hope when they return to port, the price they can get for their haul of fish will exceed the investment they have made in gear, crew, fuel, supplies, insurance, and the other countless costs of doing business.
At the heart of these costs and of every economic decision fishermen have to make is one fundamental data point: the annual catch limit. In 2006, when Congress reauthorized the Magnuson-Stevens Fishery Conservation and Management Act (the law that provides the framework for U.S. fishery management), it included a requirement that the National Oceanic and Atmospheric Administration, or NOAA, set a cap on the amount of each species that fishermen would be allowed to catch in a given year. This cap is referred to as an annual catch limit. The mandate was further strengthened by another provision of law stating that managers could not set a limit exceeding the level recommended by scientists.
Imposing annual catch limits certainly makes sense. In order to get the best economic return from our fisheries over time, we must catch what we can today while leaving enough in the water to ensure the resource remains solvent for the future. Still, as Eric Schwaab, then-administrator of NOAA’s National Marine Fisheries Service, announced at the time, doing so was an incredibly “heavy lift.”
Now, not even a month after NOAA’s announcement, there is reason to wonder whether our current approach to funding and executing the science that produces the fishery stock assessments that underpin the annual catch limits can support the weight of that responsibility.
In October 2011 stakeholders in New England’s groundfishery began to catch wind that the latest assessment of the health of one of its iconic species, the Gulf of Maine cod, was likely to show dramatically different results than the previous assessment. Until word of the new study began to leak out, Gulf of Maine cod had been considered a major success story — a harbinger of this historic fishery’s return to greatness. Then seemingly overnight the cod stock went from feast to famine.
The 2008 assessment had estimated the total amount of Gulf of Maine cod in the ecosystem, referred to as the biomass, at 74.9 million pounds. When the updated assessment was released in the fall of 2011, however, the new estimate was just 26.5 million pounds, roughly one-third of the previous figure. Even though fishermen had been acting responsibly and staying within what the scientists and managers told them was a sustainable catch limit, the stock went from healthy on paper (not experiencing overfishing and on track to achieve its rebuilding goals) to so badly overfished that even if all cod fishing stopped immediately, the species would not recover by the end of its required rebuilding period in 2014.
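As a quick sanity check on the scale of that revision, the two biomass estimates quoted above can be compared directly (a minimal arithmetic sketch; the variable names are illustrative, and the figures are simply the ones from the assessments):

```python
# Biomass estimates for Gulf of Maine cod, in millions of pounds
BIOMASS_2008 = 74.9  # 2008 assessment
BIOMASS_2011 = 26.5  # 2011 assessment

# The 2011 estimate expressed as a fraction of the 2008 estimate
fraction_remaining = BIOMASS_2011 / BIOMASS_2008
print(f"{fraction_remaining:.1%}")  # about 35.4%, i.e. roughly one-third
```

In other words, the later assessment did not merely trim the earlier estimate at the margins; it cut it to about a third of its former size, which is why the stock's status flipped so abruptly.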
In a fishery struggling to regain its feet after decades of overfishing, and in which relationships among regulators, fishermen, politicians, scientists, and other stakeholders are testy at best, this news was decidedly unhelpful and unwelcome. Given that the industry is still polarized by and struggling to adapt to a new management system, the timing of this announcement couldn’t have been worse.
So what happened? Superstitious folks might cast a pointed glance in my direction, given that the news of the new assessment began to leak out just weeks after I wrote a piece titled “Optimism for New England’s Groundfishery.”
Fortunately for me, the contents of the assessment itself point to some more likely reasons for the Gulf of Maine cod’s dramatic reversal of fortune. To simplify, the 2008 assessment appears to have given undue weight to a couple of particularly robust survey trawls, construing them to indicate an unusually strong year class of cod born in 2005 (think a one-year baby boom). By contrast, the 2011 assessment found no evidence of a robust 2005 year class. Additionally, the 2011 assessment incorporated new data on recreational catch, new percentages of fish discarded (caught but thrown back), and revised average weights-at-age for fish.
Some fishermen have contested these results, suggesting that the recreational data are flawed and that they have seen no decline in the abundance of cod on the fishing grounds. Assessing these counterarguments is beyond the scope of this piece, but for a more detailed breakdown of why some fishermen haven’t noticed a decline in fish, check out this article from the Gulf of Maine Research Institute.
The bottom line is that stock assessments are not exact. They are estimates, and they contain vast uncertainties that can swing in either direction. In 2010, for example, groundfishermen feared a low catch limit on pollock would cause an early shutdown of the fishery, but a midyear revision of the stock assessment showed a much healthier population and allowed managers to increase the catch limit by 600 percent.
So what happens when predictions prove to be wrong? It’s not the fault of the fishermen that flaws in the 2008 cod assessment have now painted the industry into a very tight corner. Furthermore, it’s a virtual certainty that the next assessment will find flaws in the 2011 version as well.
The problem of uncertain stock assessments is not unique to the groundfishery. NOAA regulates 528 distinct fish stocks, and each one is now required to have an annual catch limit. Yet the agency has sufficient funding to assess only a small fraction of them. And as the Gulf of Maine cod has shown, even when the best science is applied, results are sure to vary.
Fluctuations on such an extreme scale — up or down — are bad for business. Stability — or at least reasonable predictability — is the key to long-term sustainability. It’s clear that if we continue to manage our fisheries on this boom-and-bust cycle, we’re never going to get there. Day-trading thrill-seekers aside, any reputable investment manager will tell you that trying to recalibrate your portfolio to short-term market fluctuations is a losing game.
It seems the one thing all fishery stakeholders agree on at the moment is that we need “better science.” While clearly true, such a simplistic statement leaves two points unaddressed. First, getting it will require a greater infusion of funding, and it’s no secret that the congressional minders of the government’s wallet aren’t exactly “making it rain” these days. Second, even in the best-case scenario, uncertainty will still be part of the scientific mix. Natural ecosystem fluctuations alone would cause populations to rise and fall even in the absence of any human activity.
This does not mean we should abandon science as the foundation for fisheries management. To do so would be beyond foolish. Rather, we must evolve the current management system to become one that recognizes this uncertainty and includes measures to prevent wild up-and-down fluctuations in catch limits.
The Magnuson-Stevens Act will once again be up for reauthorization in 2013. Congress, fishermen, scientists, and regulators should take this Gulf of Maine cod example as a warning and get to work now on revisions to the statute. Those revisions should recognize the inherent fluctuation that will occur in even the best stock assessment science, and they should build in buffers against the kind of annual boom-and-bust cycle that has contributed to financial instability and to the deterioration of trust among fishermen, regulators, and other fishery stakeholders in New England and, increasingly, in other parts of the country.
With all the uncertainties fishermen face every time they leave the dock, the least we can do is help them build a slightly higher degree of certainty into their business plans.