dc.contributor.author
Ryo, Masahiro
dc.contributor.author
Angelov, Boyan
dc.contributor.author
Mammola, Stefano
dc.contributor.author
Kass, Jamie M.
dc.contributor.author
Benito, Blas M.
dc.contributor.author
Hartig, Florian
dc.date.accessioned
2021-02-02T13:03:58Z
dc.date.available
2021-02-02T13:03:58Z
dc.identifier.uri
https://refubium.fu-berlin.de/handle/fub188/29152
dc.identifier.uri
http://dx.doi.org/10.17169/refubium-28901
dc.description.abstract
Species distribution models (SDMs) are widely used in ecology, biogeography and conservation biology to estimate relationships between environmental variables and species occurrence data and make predictions of how their distributions vary in space and time. During the past two decades, the field has increasingly made use of machine learning approaches for constructing and validating SDMs. Model accuracy has steadily increased as a result, but the interpretability of the fitted models, for example the relative importance of predictor variables or their causal effects on focal species, has not always kept pace. Here we draw attention to an emerging subdiscipline of artificial intelligence, explainable AI (xAI), as a toolbox for better interpreting SDMs. xAI aims at deciphering the behavior of complex statistical or machine learning models (e.g. neural networks, random forests, boosted regression trees), and can produce more transparent and understandable SDM predictions. We describe the rationale behind xAI and provide a list of tools that can be used to help ecological modelers better understand complex model behavior at different scales. As an example, we perform a reproducible SDM analysis in R on the African elephant and showcase some xAI tools such as local interpretable model-agnostic explanation (LIME) to help interpret local-scale behavior of the model. We conclude with what we see as the benefits and caveats of these techniques and advocate for their use to improve the interpretability of machine learning SDMs.
en
dc.format.extent
7 pages
dc.rights.uri
https://creativecommons.org/licenses/by/4.0/
dc.subject
ecological modeling
en
dc.subject
explainable artificial intelligence
en
dc.subject
habitat suitability modeling
en
dc.subject
interpretable machine learning
en
dc.subject
species distribution model
en
dc.subject.ddc
500 Natural sciences and mathematics::570 Life sciences; Biology
dc.title
Explainable artificial intelligence enhances the ecological interpretability of black-box species distribution models
dc.type
Scientific article
dcterms.bibliographicCitation.doi
10.1111/ecog.05360
dcterms.bibliographicCitation.journaltitle
Ecography
dcterms.bibliographicCitation.number
2
dcterms.bibliographicCitation.pagestart
199
dcterms.bibliographicCitation.pageend
202
dcterms.bibliographicCitation.volume
44
dcterms.bibliographicCitation.url
https://doi.org/10.1111/ecog.05360
refubium.affiliation
Biologie, Chemie, Pharmazie
refubium.affiliation.other
Institut für Biologie
refubium.funding
DEAL Wiley
refubium.resourceType.isindependentpub
no
dcterms.accessRights.openaire
open access
dcterms.isPartOf.eissn
1600-0587
refubium.resourceType.provider
WoS-Alert