Observational data are imperfect, and predictive models built on those data are simplifications of how some aspect of the world works. In the fisheries advisory process (e.g. development of catch advice using stock assessments, evaluation of management strategies), our analyses are fraught with uncertainty, stemming both from the input data (observations) and from the structure (degree of simplification and validity of assumptions) of the methods and models employed. Input uncertainty is easier to measure using standard statistical procedures and easier to address through improvements to survey designs, sampling schemes, and statistical methods. Structural uncertainties, however, are more intangible, as they often represent “known unknowns”: we know there are limitations to the methods and models, but it is difficult to describe and measure them without comprehensive analyses such as simulation testing or cross-validation. With the development of more advanced analytical frameworks that support machine learning, artificial intelligence, and ensemble modeling, fisheries scientists are invited to this session to present advances in identifying, quantifying, and dealing with structural uncertainties in the fisheries science advisory process. The conveners invite contributions on the following themes:
- Uncertainties throughout the stock assessment and management strategy evaluation process
- Identification and testing of plausible structural hypotheses and structural uncertainties
- Sensitivity analysis
- Model ensembles (within and between models)
- Gaps and needs for developing ensembles
- Evaluating trade-offs between one versus multiple models
- Combining and communicating results across ensemble members and stakeholders
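To make the ensemble theme concrete, the sketch below shows one common way of combining results across ensemble members: weighting each model's point estimate by the inverse of its estimated variance. All model outputs and numbers here are hypothetical illustrations, not output from any real assessment, and inverse-variance weighting is only one of several plausible combination schemes.

```python
def inverse_variance_weights(variances):
    """Weight each ensemble member by the inverse of its estimated variance,
    normalized so the weights sum to one."""
    inv = [1.0 / v for v in variances]
    total = sum(inv)
    return [w / total for w in inv]


def ensemble_mean(estimates, weights):
    """Weighted average of the members' point estimates."""
    return sum(e * w for e, w in zip(estimates, weights))


# Three hypothetical structural variants of an assessment model,
# each giving a biomass estimate (kt) with an estimated variance.
estimates = [105.0, 98.0, 112.0]
variances = [4.0, 9.0, 16.0]

weights = inverse_variance_weights(variances)
print(weights)                            # most weight on the least uncertain model
print(ensemble_mean(estimates, weights))  # combined biomass estimate
```

In practice the choice of weighting scheme (equal weights, skill-based weights, Bayesian model averaging) is itself a structural decision, which is exactly the kind of trade-off the themes above ask contributors to address.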