Guidelines to compare semantic segmentation maps at different resolutions

dc.contributor.author: Ayala Lauroba, Christian
dc.contributor.author: Aranda, Carlos
dc.contributor.author: Galar Idoate, Mikel
dc.contributor.department: Estadística, Informática y Matemáticas [es_ES]
dc.contributor.department: Estatistika, Informatika eta Matematika [eu]
dc.contributor.department: Institute of Smart Cities - ISC [en]
dc.contributor.funder: Universidad Pública de Navarra / Nafarroako Unibertsitate Publikoa
dc.date.accessioned: 2024-09-11T18:27:23Z
dc.date.available: 2024-09-11T18:27:23Z
dc.date.issued: 2024
dc.date.updated: 2024-09-11T18:12:15Z
dc.description.abstract: Choosing the proper ground sampling distance (GSD) is a vital decision in remote sensing, which can determine the success or failure of a project. Higher resolutions may be more suitable for accurately detecting objects, but they also come with higher costs and require more computing power. Semantic segmentation is a common task in remote sensing where GSD plays a crucial role. In semantic segmentation, each pixel of an image is classified into a predefined set of classes, resulting in a semantic segmentation map. However, comparing the results of semantic segmentation at different GSDs is not straightforward. Unlike scene classification and object detection tasks, which are evaluated at scene and object level, respectively, semantic segmentation is typically evaluated at pixel level. This makes it difficult to match elements across different GSDs, resulting in a range of methods for computing metrics, some of which may not be adequate. For this reason, the purpose of this work is to set out a clear set of guidelines for fairly comparing semantic segmentation results obtained at various spatial resolutions. Additionally, we propose to complement the commonly used scene-based pixel-wise metrics with region-based pixel-wise metrics, allowing for a more detailed analysis of model performance. The set of guidelines, together with the proposed region-based metrics, is illustrated with building and swimming pool detection problems. The experimental study demonstrates that by following the proposed guidelines and using the proposed region-based pixel-wise metrics, it is possible to fairly compare segmentation maps at different spatial resolutions and gain a better understanding of the model's performance. To promote the usage of these guidelines and ease the computation of the new region-based metrics, we create the seg-eval Python library and make it publicly available at https://github.com/itracasa/seg-eval. [en]
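For orientation, the sketch below illustrates the kind of evaluation the abstract describes: a prediction produced at a coarser GSD is first brought to the ground-truth grid with nearest-neighbour resampling, and pixel-wise IoU is then computed per connected ground-truth region rather than once per scene. This is an illustrative assumption, not the seg-eval API or the exact metric definitions in the paper; the function names resample_to_reference and per_region_iou and the bounding-box formulation are hypothetical.

import numpy as np
from scipy import ndimage


def resample_to_reference(mask, reference_shape):
    # Nearest-neighbour resampling (order=0) so class labels are not blended;
    # rounding in scipy's zoom may leave a one-pixel shape mismatch in practice.
    factors = (reference_shape[0] / mask.shape[0],
               reference_shape[1] / mask.shape[1])
    return ndimage.zoom(mask.astype(np.uint8), factors, order=0).astype(bool)


def per_region_iou(ground_truth, prediction):
    # Label connected components of the ground truth and score the prediction
    # inside each region's bounding box, yielding one IoU value per region
    # (an assumed formulation of a region-based pixel-wise metric).
    labels, _ = ndimage.label(ground_truth)
    scores = []
    for region_slice in ndimage.find_objects(labels):
        gt_patch = ground_truth[region_slice]
        pred_patch = prediction[region_slice]
        intersection = np.logical_and(gt_patch, pred_patch).sum()
        union = np.logical_or(gt_patch, pred_patch).sum()
        scores.append(intersection / union if union else 1.0)
    return scores

For example, per_region_iou(gt, resample_to_reference(pred_low_res, gt.shape)) would yield one score per building or swimming pool instance instead of a single scene-level value, which is the kind of finer-grained analysis the region-based metrics are meant to enable.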
dc.description.sponsorship: The work of Christian Ayala was supported in part by the Government of Navarre through the Industrial Ph.D. Program 2020 under Grant 0011-1408-2020-000008. The work of Mikel Galar was supported in part by the Spanish Ministry of Science and Innovation (MCIN/Agencia Estatal de Investigación (AEI)/10.13039/501100011033) under Project PID2019-108392GB-I00 and Project PID2022-136627NB-I00, and in part by the Public University of Navarre under Project PJUPNA25-2022.
dc.format.mimetype: application/pdf [en]
dc.format.mimetype: application/zip [en]
dc.identifier.citation: Ayala, C., Aranda, C., Galar, M. (2024). Guidelines to compare semantic segmentation maps at different resolutions. IEEE Transactions on Geoscience and Remote Sensing, 62, 1-16. https://doi.org/10.1109/TGRS.2024.3369310
dc.identifier.doi: 10.1109/TGRS.2024.3369310
dc.identifier.issn: 0196-2892
dc.identifier.uri: https://academica-e.unavarra.es/handle/2454/51590
dc.language.iso: eng
dc.publisher: IEEE
dc.relation.ispartof: IEEE Transactions on Geoscience and Remote Sensing 62(4702416), 2024
dc.relation.projectID: info:eu-repo/grantAgreement/Gobierno de Navarra//0011-1408-2020-000008/
dc.relation.projectID: info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2019-108392GB-I00/ES/
dc.relation.projectID: info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2022-136627NB-I00/ES/
dc.relation.publisherversion: https://doi.org/10.1109/TGRS.2024.3369310
dc.rights: © 2024 The Authors. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.subject: Error metrics [en]
dc.subject: Quality assessment [en]
dc.subject: Remote sensing [en]
dc.subject: Semantic segmentation [en]
dc.title: Guidelines to compare semantic segmentation maps at different resolutions [en]
dc.type: info:eu-repo/semantics/article
dc.type.version: info:eu-repo/semantics/publishedVersion
dspace.entity.type: Publication
relation.isAuthorOfPublication: 4c0a0a12-02e3-479d-8562-b5d9a39bab40
relation.isAuthorOfPublication: 44c7a308-9c21-49ef-aa03-b45c2c5a06fd
relation.isAuthorOfPublication.latestForDiscovery: 4c0a0a12-02e3-479d-8562-b5d9a39bab40

Files

Original bundle
Name: Ayala_Guidelines.pdf
Size: 10.87 MB
Format: Adobe Portable Document Format
Name: Ayala_Guidelines_MatCompl.zip
Size: 95.94 KB
Format: ZIP
License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed to upon submission