dc.contributor.author
Possart, Dennis
dc.contributor.author
Mill, Leonid
dc.contributor.author
Vollnhals, Florian
dc.contributor.author
Hildebrand, Tor
dc.contributor.author
Suter, Peter
dc.contributor.author
Hoffmann, Mathis
dc.contributor.author
Utz, Jonas
dc.contributor.author
Augsburger, Daniel
dc.contributor.author
Thies, Mareike
dc.contributor.author
Gu, Mingxuan
dc.contributor.author
Wagner, Fabian
dc.contributor.author
Sarau, George
dc.contributor.author
Christiansen, Silke
dc.contributor.author
Breininger, Katharina
dc.date.accessioned
2025-07-28T08:31:36Z
dc.date.available
2025-07-28T08:31:36Z
dc.identifier.uri
https://refubium.fu-berlin.de/handle/fub188/48402
dc.identifier.uri
http://dx.doi.org/10.17169/refubium-48124
dc.description.abstract
Nanomaterials’ properties, influenced by size, shape, and surface characteristics, are crucial for their technological, biological, and environmental applications. Accurate quantification of these materials is essential for advancing research. Deep learning segmentation networks offer precise, automated analysis, but their effectiveness depends on representative annotated datasets, which are difficult to obtain due to the high cost and manual effort required for imaging and annotation. To address this, we present DiffRenderGAN, a generative model that produces annotated synthetic data by integrating a differentiable renderer into a Generative Adversarial Network (GAN) framework. DiffRenderGAN optimizes rendering parameters to produce realistic, annotated images from non-annotated real microscopy images, reducing manual effort and improving segmentation performance compared to existing methods. Tested on ion and electron microscopy datasets, including titanium dioxide (TiO2), silicon dioxide (SiO2), and silver nanowires (AgNW), DiffRenderGAN bridges the gap between synthetic and real data, advancing the quantification and understanding of complex nanomaterial systems.
en
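The abstract's core mechanism, optimizing the parameters of a differentiable renderer by gradient descent until its output matches real images, can be illustrated with a toy sketch. This is not the authors' implementation: DiffRenderGAN drives the renderer with an adversarial loss from a GAN discriminator, whereas the minimal analogue below substitutes a plain MSE loss, and the one-dimensional "renderer" and all names are purely illustrative.

```python
import numpy as np

# Toy analogue of the idea described in the abstract: a differentiable
# renderer whose parameters are tuned by gradient descent so that its
# output matches a real image. (The actual method uses an adversarial
# GAN loss; a simple MSE loss stands in for it here.)

x = np.linspace(-3.0, 3.0, 200)

def render(sigma):
    # "Renderer": a 1-D Gaussian blob whose width sigma is the
    # differentiable rendering parameter.
    return np.exp(-x**2 / (2.0 * sigma**2))

# Stand-in for a real (non-annotated) microscopy image: a blob of
# unknown width 1.5.
real = render(1.5)

sigma = 0.5  # initial guess for the rendering parameter
lr = 0.2     # learning rate
for _ in range(1000):
    out = render(sigma)
    residual = out - real
    # Analytic gradient of the mean-squared error w.r.t. sigma,
    # using d render / d sigma = render * x^2 / sigma^3:
    grad = np.mean(2.0 * residual * out * x**2 / sigma**3)
    sigma -= lr * grad

print(round(sigma, 2))  # sigma converges toward the true width 1.5
```

Because the renderer is differentiable, the loss gradient flows directly into the rendering parameter; in the full method the same principle lets discriminator feedback shape realistic, automatically annotated synthetic images.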
dc.format.extent
11 pages
dc.rights
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
dc.rights.uri
https://creativecommons.org/licenses/by/4.0/
dc.subject
nanomaterial segmentation networks
en
dc.subject
differentiable rendering
en
dc.subject
generative modeling
en
dc.subject.ddc
500 Natural sciences and mathematics::530 Physics::530 Physics
dc.title
Addressing data scarcity in nanomaterial segmentation networks with differentiable rendering and generative modeling
dc.type
Scientific article
dc.date.updated
2025-07-03T02:49:34Z
dcterms.bibliographicCitation.articlenumber
197
dcterms.bibliographicCitation.doi
10.1038/s41524-025-01702-6
dcterms.bibliographicCitation.journaltitle
npj Computational Materials
dcterms.bibliographicCitation.number
1
dcterms.bibliographicCitation.volume
11
dcterms.bibliographicCitation.url
https://doi.org/10.1038/s41524-025-01702-6
refubium.affiliation
Physics
refubium.affiliation.other
Institut für Experimentalphysik
refubium.resourceType.isindependentpub
no
dcterms.accessRights.openaire
open access
dcterms.isPartOf.eissn
2057-3960
refubium.resourceType.provider
DeepGreen