Description
Over the last decade, the analytical toolset available to ecologists studying species communities has evolved substantially. In response to the widespread understanding that analyzing species jointly, rather than one by one, provides deeper insight into community structure, many statistical approaches for joint modelling have been published, enabling coherent and rigorous testing of various ecological hypotheses. Furthermore, many of these models are accompanied by documented software implementations that greatly facilitate their reuse by other researchers.
However, alongside the increasing expressiveness and complexity of joint species models, the computational cost of fitting them has also risen dramatically. With existing packages, obtaining even a single reliable model fit for a moderate-sized community dataset can take days.
In this work we seek to alleviate this computational burden for Bayesian inference in a flexible class of joint species distribution models (JSDMs), the Hierarchical Model of Species Communities (HMSC), which subsumes several other popular JSDMs. We ground our approach in the latent Gaussian representation of HMSC, combined with several techniques from the probabilistic machine learning literature. First, we analytically exploit the specific structure that HMSC imposes on the marginal Gaussian form and combine it with Expectation Propagation to accommodate non-Gaussian observation models. Next, we consider three strategies for learning the analytically intractable parameters: maximum a posteriori (MAP) estimation, variational inference, and Hamiltonian Monte Carlo. We demonstrate our approach on simulated and real datasets and compare its performance to two JSDM R packages representing opposite ends of the current spectrum of model-fitting techniques, Hmsc and gllvm.
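To illustrate the role of Expectation Propagation mentioned above, the sketch below shows its core moment-matching step for a single probit observation attached to a Gaussian latent variable. This is a minimal, self-contained illustration of the general technique, not the authors' HMSC implementation; the function names and the single-site setting are assumptions made for the example.

```python
# Illustrative sketch (not the HMSC code): EP-style moment matching for one
# probit observation y in {-1, +1} on a Gaussian latent f ~ N(mu, s2).
# This is the building block that lets a latent-Gaussian model accommodate
# a non-Gaussian (Bernoulli-probit) observation model.
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def probit_moments(mu, s2, y):
    """Mean and variance of the tilted distribution ~ N(f | mu, s2) * Phi(y * f).

    These are the standard closed-form moments of a Gaussian times a probit
    factor; EP replaces the non-Gaussian site with a Gaussian matching them.
    """
    z = y * mu / math.sqrt(1.0 + s2)
    ratio = phi(z) / Phi(z)
    mean = mu + y * s2 * ratio / math.sqrt(1.0 + s2)
    var = s2 - s2 * s2 * ratio * (z + ratio) / (1.0 + s2)
    return mean, var

# A positive observation pulls the latent mean upward and shrinks its variance.
m, v = probit_moments(0.0, 1.0, +1)
```

In a full EP loop, each species-by-site likelihood factor would be visited in turn, its Gaussian site approximation updated to match these tilted moments, and the global Gaussian posterior refreshed until convergence.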
|Period||22 June 2020 → 26 June 2020|
|Event title||International Statistical Ecology Conference|