Learning on Multistatic Simulation Data for Radar-Based Automotive Gesture Recognition

2022_TMTT_Kern.pdf (11.80 MB)

peer-reviewed

Date of first publication
2022-09-05
Authors
Kern, Nicolai
Aguilar, Julian
Grebner, Timo
Meinecke, Benedikt
Waldschmidt, Christian
Scientific article


Published in
IEEE Transactions on Microwave Theory and Techniques, vol. 70 (2022), no. 11, pp. 5039-5050. ISSN 0018-9480, eISSN 1557-9670
Link to original publication
https://dx.doi.org/10.1109/TMTT.2022.3200595
Faculties
Fakultät für Ingenieurwissenschaften, Informatik und Psychologie
Institutions
Institut für Mikrowellentechnik
Document version
accepted version
Abstract
Radar-based gesture recognition can play a vital role in autonomous vehicles’ interaction with vulnerable road users (VRUs). However, in automotive scenarios the same gesture produces strongly differing radar responses owing to the wide range of variations such as position, orientation, or ego-motion. Since including all kinds of modifications in a measured dataset is laborious, gesture simulations alleviate the measurement effort and increase the robustness against edge and corner cases. Hence, this paper presents a flexible geometric human target model that allows the direct introduction of a wide range of modifications while facilitating the handling of shadowing effects and multiradar constellations. Using the proposed simulation model, a dataset recorded with a radar sensor network consisting of three chirp sequence (CS) radars is resimulated based on motion data simultaneously captured with a stereo video system. Completely substituting the measured data with simulated data for training, a convolutional neural network (CNN) classifier still achieves 80.4% cross-validation accuracy on a challenging gesture set, compared to 89.4% for training on measured data. Moreover, using simulated data, the classifier is shown to successfully generalize to new scenarios not observed in measurements.
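The kind of multistatic simulation described in the abstract can be illustrated with a heavily simplified sketch. The snippet below is not the authors' model: it replaces the geometric ellipsoid target with a few point scatterers and only computes the bistatic range and Doppler that each node of a hypothetical three-radar chirp-sequence network would observe. The 77 GHz carrier, sensor positions, and the toy "waving hand" trajectory are assumptions for illustration; shadowing and the paper's full ellipsoid scattering model are omitted.

# Minimal sketch (not the authors' code): bistatic range/Doppler of point
# scatterers standing in for the ellipsoid centres of a human target model,
# evaluated for a hypothetical three-node chirp-sequence radar network.
import numpy as np

C0 = 299_792_458.0   # speed of light in m/s
FC = 77e9            # assumed carrier frequency (automotive radar band)

def bistatic_range(p, tx, rx):
    """Bistatic range: distance transmitter -> scatterer -> receiver."""
    return np.linalg.norm(p - tx) + np.linalg.norm(p - rx)

def bistatic_doppler(p, v, tx, rx):
    """Doppler shift (Hz) of a scatterer at p moving with velocity v."""
    u_tx = (p - tx) / np.linalg.norm(p - tx)
    u_rx = (p - rx) / np.linalg.norm(p - rx)
    return -FC / C0 * np.dot(v, u_tx + u_rx)

# Hypothetical sensor network: three monostatic CS radars along a bumper line.
radars = [np.array([x, 0.0, 0.5]) for x in (-0.6, 0.0, 0.6)]

# Toy "gesture": a hand scatterer waving sideways in front of the vehicle;
# a static torso scatterer could be added for clutter or shadowing studies.
hand_pos = np.array([0.2, 3.0, 1.1])
hand_vel = np.array([1.5, 0.0, 0.0])

for i, pos in enumerate(radars):
    r = bistatic_range(hand_pos, pos, pos) / 2.0       # monostatic: one-way range
    fd = bistatic_doppler(hand_pos, hand_vel, pos, pos)
    print(f"radar {i}: range {r:5.2f} m, Doppler {fd/1e3:6.2f} kHz")

Because the three nodes see the same hand motion from different aspect angles, their Doppler responses differ, which is the effect the multistatic dataset in the paper exploits for training the CNN classifier.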
Project uulm
INTUITIVER - INTeraktion zwischen aUtomatIsierTen Fahrzeugen und leicht verletzbaren VerkehrsteilnehmERn / MWK Baden-Württemberg
Subject headings
[GND]: Gestenerkennung | Convolutional Neural Network | Data Augmentation
[LCSH]: Radar | Training | Ellipsometry | Gesture recognition (Computer science)
[Free subject headings]: Data models | Computational modeling | Solid modeling | Ellipsoids | Automotive radar | human target simulation | radar sensor networks
[DDC subject group]: DDC 620 / Engineering & allied operations
License
Lizenz A
https://oparu.uni-ulm.de/xmlui/licenseA_v1


DOI & citation

Please use this identifier to cite or link to this item: http://dx.doi.org/10.18725/OPARU-46148

Kern, Nicolai et al. (2022): Learning on Multistatic Simulation Data for Radar-Based Automotive Gesture Recognition. Open Access Repositorium der Universität Ulm und Technischen Hochschule Ulm. http://dx.doi.org/10.18725/OPARU-46148