The challenges of integrating explainable artificial intelligence into GeoAI
Authors: Jin Xing, Renee Sieber
Institution: 1. TD Bank Group, Toronto, Ontario, Canada; 2. Department of Geography, Bieler School of Environment, McGill University, Montreal, Quebec, Canada
Abstract: Although explainable artificial intelligence (XAI) promises considerable progress in glassboxing deep learning models, applying XAI to geospatial artificial intelligence (GeoAI), specifically geospatial deep neural networks (DNNs), raises challenges. We summarize these as three major challenges, related to XAI computation, to GeoAI and geographic data handling, and to geosocial issues. XAI computation challenges include the difficulty of selecting reference data/models and the shortcomings of attributing explanatory power to gradients. GeoAI and geographic data handling challenges include the difficulty of accommodating geographic scale, geovisualization, and the underlying geographic data structures. Geosocial challenges encompass the limited knowledge scope (semantics and ontologies) available for explaining GeoAI, as well as the failure to integrate non-technical aspects into XAI, including processes that are not amenable to XAI. We illustrate these issues with a land use classification case study.
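
To make the gradient-attribution point concrete, the following is a minimal sketch of vanilla gradient saliency applied to a land use classifier. The network architecture, band count, and class count are hypothetical placeholders for illustration, not the model used in the paper's case study.

    # Minimal sketch of gradient-based attribution (vanilla saliency),
    # the XAI technique whose shortcomings the abstract discusses.
    import torch
    import torch.nn as nn

    # Hypothetical land use classifier over 4-band (e.g., RGB + NIR) patches.
    model = nn.Sequential(
        nn.Conv2d(4, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(16, 5),  # 5 hypothetical land use classes
    )
    model.eval()

    patch = torch.rand(1, 4, 64, 64, requires_grad=True)  # one 64x64 patch
    scores = model(patch)
    top_class = scores.argmax(dim=1).item()

    # Attribute the top class score to input pixels via its gradient.
    scores[0, top_class].backward()
    saliency = patch.grad.abs().max(dim=1).values  # per-pixel importance map

Such a per-pixel map illustrates two of the challenges named above: the raw gradient carries an implicit, arbitrary reference input, and the resulting map says nothing about geographic scale or the underlying geographic data structures.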