Infrastructure-based Perception with Cameras and Radars for Cooperative Driving Scenarios
dc.contributor.author | Tsaregorodtsev, Alexander | |
dc.contributor.author | Buchholz, Michael | |
dc.contributor.author | Belagiannis, Vasileios | |
dc.date.accessioned | 2024-07-24T15:22:22Z | |
dc.date.available | 2024-07-24T15:22:22Z | |
dc.date.issued | 2024-07-15 | |
dc.description.abstract | Roadside infrastructure has been widely adopted for tasks such as traffic surveillance and monitoring, traffic flow control, and the prioritization of public transit and emergency vehicles. As automated driving functions and vehicle communications continue to be researched, cooperative and connected driving scenarios can now be realized. Cooperative driving, however, imposes stringent requirements on environmental perception and modeling. In particular, road users, including pedestrians and cyclists, must be reliably detected and accurately localized. Furthermore, the perception framework must operate with low latency to provide up-to-date information. In this work, we present a refined, camera-based reference point detector design that does not rely on annotated infrastructure datasets and, where radar data are available, incorporates fusion with cost-effective radar sensor data to increase system reliability. The reference point detector design is realized with box and instance segmentation object detector models to extract object ground points. In parallel, objects are extracted from radar target data through a clustering pipeline and fused with the camera object detections. To demonstrate the real-world applicability of our approach for cooperative driving scenarios, we provide an extensive evaluation on data from a real test site. | |
dc.description.version | acceptedVersion | |
dc.identifier.doi | https://doi.org/10.18725/OPARU-53397 | |
dc.identifier.url | https://oparu.uni-ulm.de/handle/123456789/53473 | |
dc.identifier.urn | http://nbn-resolving.de/urn:nbn:de:bsz:289-oparu-53473-2 | |
dc.language.iso | en | |
dc.publisher | Universität Ulm | |
dc.relation1.doi | 10.1109/IV55156.2024.10588604 | |
dc.rights | CC BY 4.0 International | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.subject | Automatisiertes Fahren | |
dc.subject | Roadside Infrastructure | |
dc.subject | Autonomes Fahrzeug | |
dc.subject | Automated Vehicles | |
dc.subject | Cooperative Driving | |
dc.subject | Connected Driving | |
dc.subject | Vernetztes Fahren | |
dc.subject.ddc | DDC 620 / Engineering & allied operations | |
dc.subject.gnd | Autonomes Fahrzeug | |
dc.subject.lcsh | Automated vehicles | |
dc.title | Infrastructure-based Perception with Cameras and Radars for Cooperative Driving Scenarios | |
dc.type | Conference contribution | |
source.fromPage | 1678 | |
source.identifier.eissn | 2642-7214 | |
source.identifier.issn | 1931-0587 | |
source.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
source.title | 2024 IEEE Intelligent Vehicles Symposium (IV) | |
source.toPage | 1685 | |
source.year | 2024 | |
uulm.affiliationGeneral | Fakultät für Ingenieurwissenschaften, Informatik und Psychologie | |
uulm.affiliationSpecific | Institut für Mess-, Regel- und Mikrotechnik | |
uulm.bibliographie | uulm | |
uulm.category | Publications | |
uulm.conferenceEndDate | 2024-06-05 | |
uulm.conferenceName | IEEE Intelligent Vehicles Symposium (IV) | |
uulm.conferencePlace | Jeju Island, Korea | |
uulm.conferenceStartDate | 2024-06-02 | |
uulm.peerReview | yes | |
uulm.projectEU | PoDIUM / PDI connectivity and cooperation enablers building trust and sustainability for CCAM / EC / HE / 101069547 | |
uulm.projectOther | LUKAS / Verbundprojekt: LUKAS - Lokales Umfeldmodell für das kooperative, automatisierte Fahren in komplexen Verkehrssituationen; Teilvorhaben: Infrastrukturseite Datenverarbeitung und kooperative Handlungsplanung / BMWi / 19A20004F | |
uulm.typeDCMI | Text | |
uulm.updateStatusURN | url_update_general |