
Author: Rößing, Christoph (dc.contributor.author)
Date of accession: 2016-03-15T10:39:54Z (dc.date.accessioned)
Available in OPARU since: 2016-03-15T10:39:54Z (dc.date.available)
Year of creation: 2014 (dc.date.created)
Abstract (dc.description.abstract):
The present thesis proposes novel methods for the perceptually motivated real-time rendering of pictorial depth and motion cues in still images and video sequences. To this end, the human visual system is examined from a psychological and physiological viewpoint, and the proposed rendering techniques address specific neuronal structures to subconsciously convey supplemental scene information to the viewer. The novel monocular depth cue enhancement pipeline amplifies pictorial cues to increase the perceived spatiality of images. It utilizes a stereo algorithm to generate an initial supplemental disparity map, which is refined and subsequently used to render artificial defocus blur (see the sketch after this record), distance-adaptive local contrast enhancement, depth-controlled desaturation, and artificial drop shadows. Merging these visualizations yields an immersive depth sensation. The novel motion depiction pipeline computes optical flow with the novel À-trous Flow method; a subsequent rendering stage approximates natural motion blur, stroboscopic exposure, or comic-like speed lines. As a result, trajectories and dynamics within the scene can be assessed easily. The synergies created by computing scene reconstruction and visualization jointly with real-time capable à-trous wavelets enable video streaming applications. The third pipeline specializes in monocular highway traffic scene reconstruction and the enrichment of rear-view camera video. To support the driver's scene assessment, the gathered information is conveyed via artificial motion or defocus blur and a color-coded risk potential rendering. All suggested methods are discussed from a perceptual viewpoint. Furthermore, two perceptual studies confirm that the proposed renderings guide visual attention towards specific image areas and convey rough velocity information to the viewer.
Language: de (dc.language.iso)
Publisher: Universität Ulm (dc.publisher)
License: Standard (ohne Print-On-Demand) (dc.rights)
Link to license text: https://oparu.uni-ulm.de/xmlui/license_opod_v1 (dc.rights.uri)
Keyword: Depth of field (Psychology) (dc.subject)
Keyword: Motion Blur (dc.subject)
Dewey Decimal Group: DDC 000 / Computer science, information & general works (dc.subject.ddc)
LCSH: Depth perception (dc.subject.lcsh)
MeSH: Motion perception (dc.subject.mesh)
Title: Video and image manipulation for enhanced perception (dc.title)
Resource type: Dissertation (dc.type)
DOI: http://dx.doi.org/10.18725/OPARU-3214 (dc.identifier.doi)
PPN: 81345946X (dc.identifier.ppn)
URN: http://nbn-resolving.de/urn:nbn:de:bsz:289-vts-93195 (dc.identifier.urn)
GND: Bewegungssehen (dc.subject.gnd)
GND: Bewegungsunschärfe (dc.subject.gnd)
Faculty: Fakultät für Ingenieurwissenschaften und Informatik (uulm.affiliationGeneral)
Date of activation: 2014-12-15T11:59:08Z (uulm.freischaltungVTS)
Peer review: no (uulm.peerReview)
Shelfmark print version: W: W-H 13.926 (uulm.shelfmark)
DCMI Type: Text (uulm.typeDCMI)
VTS-ID: 9319 (uulm.vtsID)
Category: Publications (uulm.category)
Ulm series: Schriftenreihe des Instituts für Mess-, Regel- und Mikrotechnik (uulm.dissSeriesUlmName)
Ulm series - number: 12 (uulm.dissSeriesUlmNumber)
University Bibliography: yes (uulm.unibibliographie)
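
The abstract above describes a depth cue enhancement pipeline in which a refined disparity map drives artificial defocus blur. As a rough illustration of that idea only, and not the thesis implementation, the following Python sketch blurs image regions more strongly the farther their disparity lies from a chosen focus plane; the function name blur_from_disparity and the parameters focus_disparity, max_sigma, and n_levels are assumptions made for this example.

```python
# Minimal sketch of disparity-driven defocus blur (illustrative only,
# not the method from the thesis).
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_from_disparity(image, disparity, focus_disparity, max_sigma=6.0, n_levels=5):
    # image: H x W x 3 array; disparity: H x W array, e.g. from a stereo algorithm.
    img = image.astype(np.float32)

    # Normalised defocus amount in [0, 1]: 0 at the focus plane, 1 farthest from it.
    spread = np.abs(disparity.astype(np.float32) - focus_disparity)
    defocus = spread / (spread.max() + 1e-6)

    # Precompute a small stack of increasingly blurred copies of the image.
    sigmas = np.linspace(0.0, max_sigma, n_levels)
    stack = [img] + [gaussian_filter(img, sigma=(s, s, 0)) for s in sigmas[1:]]

    # Per pixel, pick the blur level that matches its defocus amount.
    level = np.clip(np.rint(defocus * (n_levels - 1)).astype(int), 0, n_levels - 1)
    out = np.zeros_like(img)
    for k in range(n_levels):
        mask = (level == k)[..., None]  # H x W x 1, broadcasts over colour channels
        out = np.where(mask, stack[k], out)
    return out.astype(image.dtype)
```

Precomputing a stack of blurred copies keeps the per-pixel selection cheap, which loosely reflects the real-time emphasis of the pipeline; the other cues mentioned in the abstract (contrast enhancement, desaturation, drop shadows) are not reproduced here.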

