Learning on Multistatic Simulation Data for Radar-Based Automotive Gesture Recognition

Peer-reviewed
First publication: 2022-09-05
Authors
Kern, Nicolai
Aguilar, Julian
Grebner, Timo
Meinecke, Benedikt
Waldschmidt, Christian
Scientific article
Published in
IEEE Transactions on Microwave Theory and Techniques, vol. 70, no. 11 (2022), pp. 5039-5050. ISSN 0018-9480, eISSN 1557-9670
Link to the original publication
https://dx.doi.org/10.1109/TMTT.2022.3200595
Faculties
Fakultät für Ingenieurwissenschaften, Informatik und Psychologie
Institutions
Institut für Mikrowellentechnik
Document version
Accepted version
Abstract
Radar-based gesture recognition can play a vital role in autonomous vehicles' interaction with vulnerable road users (VRUs). However, in automotive scenarios the same gesture produces strongly differing radar responses owing to the wide range of variations such as position, orientation, or ego-motion. Since including all of these variations in a measured dataset is laborious, gesture simulations reduce the measurement effort and increase robustness against edge and corner cases. Hence, this paper presents a flexible geometric human target model that allows a wide range of modifications to be introduced directly while facilitating the handling of shadowing effects and multiradar constellations. Using the proposed simulation model, a dataset recorded with a radar sensor network consisting of three chirp sequence (CS) radars is resimulated based on motion data simultaneously captured with a stereo video system. When the measured training data is completely replaced by simulated data, a convolutional neural network (CNN) classifier still achieves 80.4% cross-validation accuracy on a challenging gesture set, compared to 89.4% for training on measured data. Moreover, when trained on simulated data, the classifier is shown to generalize successfully to new scenarios not observed in measurements.
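As a purely illustrative aid, the sketch below shows how a small CNN classifier of the kind described in the abstract could be trained on simulated radar feature maps (e.g., range-Doppler or micro-Doppler maps from the three radars stacked as channels). This is not the architecture or training setup from the paper; all layer sizes, input shapes, channel counts, and the number of gesture classes are assumptions.

```python
# Minimal sketch (hypothetical, not the authors' architecture): a small CNN
# classifying gestures from simulated multistatic radar feature maps.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, in_channels: int = 3, num_classes: int = 8):
        # in_channels=3 loosely mirrors one feature map per radar; both values
        # are illustrative assumptions.
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# One training step on placeholder "simulated" data, mirroring the idea of
# substituting simulated for measured data during training.
model = GestureCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

simulated_batch = torch.randn(16, 3, 64, 64)   # placeholder simulated maps
labels = torch.randint(0, 8, (16,))            # placeholder gesture labels
loss = criterion(model(simulated_batch), labels)
loss.backward()
optimizer.step()
```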
Project (uulm)
INTUITIVER - INTeraktion zwischen aUtomatIsierTen Fahrzeugen und leicht verletzbaren VerkehrsteilnehmERn (interaction between automated vehicles and vulnerable road users) / MWK Baden-Württemberg
Keywords
[GND]: Gesture recognition | Convolutional Neural Network | Data Augmentation
[LCSH]: Radar | Training | Ellipsometry | Gesture recognition (Computer science)
[Free keywords]: Data models | Computational modeling | Solid modeling | Ellipsoids | Automotive radar | human target simulation | radar sensor networks
[DDC subject group]: DDC 620 / Engineering & allied operations
DOI & citation
Please use this identifier for citations & links: http://dx.doi.org/10.18725/OPARU-46148
Kern, Nicolai et al. (2022): Learning on Multistatic Simulation Data for Radar-Based Automotive Gesture Recognition. Open Access Repositorium der Universität Ulm und Technischen Hochschule Ulm. http://dx.doi.org/10.18725/OPARU-46148