Deep learning excels at extracting complex patterns from data but suffers from catastrophic forgetting when fine-tuned on new data. This book investigates how class- and domain-incremental learning affect semantic segmentation networks for automated driving, identifying semantic shifts and changes in the learned features as key factors behind forgetting. Tools for quantitatively measuring forgetting are selected and used to show how strategies such as image augmentation, pretraining, and architectural adaptations mitigate catastrophic forgetting.
Extent: XIV, 203 pp.
Price: 43.00 €
Kalb, T. 2024. Principles of Catastrophic Forgetting for Continual Semantic Segmentation in Automated Driving. Karlsruhe: KIT Scientific Publishing. DOI: https://doi.org/10.5445/KSP/1000171902
This book is licensed under Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
This book has been peer reviewed.
Published on 21 October 2024
Language: English
236
Paperback | ISBN 978-3-7315-1373-5