Infrastructure-based Perception with Cameras and Radars for Cooperative Driving Scenarios

dc.contributor.author: Tsaregorodtsev, Alexander
dc.contributor.author: Buchholz, Michael
dc.contributor.author: Belagiannis, Vasileios
dc.date.accessioned: 2024-07-24T15:22:22Z
dc.date.available: 2024-07-24T15:22:22Z
dc.date.issued: 2024-07-15
dc.description.abstract: Roadside infrastructure has been widely adopted for tasks such as traffic surveillance and monitoring, traffic flow control, and the prioritization of public transit and emergency vehicles. With ongoing research into automated driving functions and vehicle communications, cooperative and connected driving scenarios can now be realized. Cooperative driving, however, places stringent requirements on environmental perception and modeling. In particular, road users, including pedestrians and cyclists, must be detected reliably and localized accurately. Furthermore, the perception framework must operate with low latency to provide up-to-date information. In this work, we present a refined, camera-based reference point detector design that does not rely on annotated infrastructure datasets and, where available, fuses cost-effective radar sensor data to increase system reliability. The reference point detector is realized with box and instance segmentation object detector models that extract object ground points. In parallel, objects are extracted from radar target data through a clustering pipeline and fused with the camera object detections. To demonstrate the real-world applicability of our approaches for cooperative driving scenarios, we provide an extensive evaluation on data from a real test site.
dc.description.version: acceptedVersion
dc.identifier.doi: https://doi.org/10.18725/OPARU-53397
dc.identifier.url: https://oparu.uni-ulm.de/handle/123456789/53473
dc.identifier.urn: http://nbn-resolving.de/urn:nbn:de:bsz:289-oparu-53473-2
dc.language.iso: en
dc.publisher: Universität Ulm
dc.relation1.doi: 10.1109/IV55156.2024.10588604
dc.rights: CC BY 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: Automated Driving
dc.subject: Roadside Infrastructure
dc.subject: Autonomous Vehicle
dc.subject: Automated Vehicles
dc.subject: Cooperative Driving
dc.subject: Connected Driving
dc.subject.ddc: DDC 620 / Engineering & allied operations
dc.subject.gnd: Autonomes Fahrzeug
dc.subject.lcsh: Automated vehicles
dc.title: Infrastructure-based Perception with Cameras and Radars for Cooperative Driving Scenarios
dc.type: Conference contribution
source.fromPage: 1678
source.identifier.eissn: 2642-7214
source.identifier.issn: 1931-0587
source.publisher: Institute of Electrical and Electronics Engineers (IEEE)
source.title: 2024 IEEE Intelligent Vehicles Symposium (IV)
source.toPage: 1685
source.year: 2024
uulm.affiliationGeneral: Fakultät für Ingenieurwissenschaften, Informatik und Psychologie
uulm.affiliationSpecific: Institut für Mess-, Regel- und Mikrotechnik
uulm.bibliographie: uulm
uulm.category: Publications
uulm.conferenceEndDate: 2024-06-05
uulm.conferenceName: IEEE Intelligent Vehicles Symposium (IV)
uulm.conferencePlace: Jeju Island, Korea
uulm.conferenceStartDate: 2024-06-02
uulm.peerReview: yes
uulm.projectEU: PoDIUM / PDI connectivity and cooperation enablers building trust and sustainability for CCAM / EC / HE / 101069547
uulm.projectOther: LUKAS / Joint project: LUKAS - Local environment model for cooperative, automated driving in complex traffic situations; subproject: infrastructure-side data processing and cooperative maneuver planning / BMWi / 19A20004F
uulm.type: DCMIText
uulm.updateStatusURN: url_update_general
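
Illustrative note: the abstract above describes extracting objects from radar target data through a clustering pipeline and fusing them with camera ground-point detections. This record does not name the concrete algorithms, so the following Python sketch is a hypothetical illustration only, assuming DBSCAN clustering of radar targets and greedy nearest-neighbour gating on ground-plane positions; all names and parameters (cluster_radar_targets, fuse_detections, eps_m, gate_m) are invented for this sketch and are not taken from the paper.

import numpy as np
from sklearn.cluster import DBSCAN

def cluster_radar_targets(targets_xy, eps_m=1.5, min_samples=3):
    """Group raw radar targets (N x 2 ground-plane positions, metres)
    into object candidates; returns one centroid per cluster."""
    labels = DBSCAN(eps=eps_m, min_samples=min_samples).fit_predict(targets_xy)
    centroids = [targets_xy[labels == k].mean(axis=0)
                 for k in set(labels) if k != -1]  # label -1 marks noise
    return np.array(centroids)

def fuse_detections(camera_points_xy, radar_centroids_xy, gate_m=2.0):
    """Greedy nearest-neighbour association of camera ground points with
    radar cluster centroids; the fused position is the midpoint when a
    radar match lies within the gating distance, else camera-only."""
    fused = []
    for cam in camera_points_xy:
        if len(radar_centroids_xy):
            d = np.linalg.norm(radar_centroids_xy - cam, axis=1)
            j = int(np.argmin(d))
            if d[j] < gate_m:
                fused.append(0.5 * (cam + radar_centroids_xy[j]))
                continue
        fused.append(cam)  # no radar support for this detection
    return np.array(fused)

# Toy example: two radar clusters, two camera detections (one unsupported).
radar = np.array([[10.0, 2.1], [10.3, 1.9], [9.8, 2.0],
                  [25.0, -4.0], [25.2, -3.8], [24.9, -4.1]])
cams = np.array([[10.1, 2.4], [40.0, 0.0]])
print(fuse_detections(cams, cluster_radar_targets(radar)))

Gating by a fixed Euclidean distance and averaging positions keeps the sketch minimal; a deployed system would more plausibly weight the camera and radar measurements by their respective uncertainties.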

Files

Original bundle
Name: IV2024-OPARU.pdf
Size: 624.3 KB
Format: Adobe Portable Document Format
