Show simple item record

Author: Bayerl, Pierre [dc.contributor.author]
Date of accession: 2016-03-14T13:38:46Z [dc.date.accessioned]
Available in OPARU since: 2016-03-14T13:38:46Z [dc.date.available]
Year of creation: 2005 [dc.date.created]
Abstract: The neural mechanisms underlying the segregation and integration of detected motion remain largely unclear. The motion of an extended boundary can be measured locally by neurons only in the direction orthogonal to its orientation (the aperture problem), whereas this ambiguity does not arise for localized image features. In this thesis, a novel neural model of visual motion processing is developed that involves early stages of the cortical dorsal and ventral pathways of the primate brain to integrate and segregate visual motion and, in particular, to solve the motion aperture problem. Our model makes predictions concerning the time course of cells in areas MT and V1 and serves as a means to link physiological mechanisms with perceptual behavior. We further demonstrate that our model successfully processes natural image sequences. Moreover, we present several extensions of the neural model to investigate the influence of form information, the effects of attention, and the perception of transparent motion. The major computational bottleneck of the presented neural model is the amount of memory necessary for the representation of neural activity. To derive a computational mechanism for large-scale simulations, we propose a sparse coding framework for neural motion activity patterns and suggest a means by which initial activities can be detected efficiently. The presented work combines concepts and findings from computational neuroscience, neurophysiology, psychophysics, and computer science. The outcome of our investigations is a biologically plausible model of motion segmentation together with a fast algorithmic implementation, which explains and predicts perceptual and neural effects of motion perception and allows optic flow to be extracted from given image sequences. [dc.description.abstract]
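The aperture problem named in the abstract can be illustrated with a short, standalone sketch. This is not code from the thesis; it only demonstrates the well-known local ambiguity: under the brightness-constancy constraint Ix*u + Iy*v + It = 0, a purely local detector recovers just the flow component along the image gradient (the normal flow), while the tangential component stays unconstrained.

```python
import numpy as np

def normal_flow(Ix, Iy, It, eps=1e-12):
    """Normal-flow vector (u_n, v_n) from spatial and temporal derivatives.

    Solves the brightness-constancy constraint for the component of
    velocity along the gradient direction: v_n = -It * grad(I) / |grad(I)|^2.
    """
    g2 = Ix ** 2 + Iy ** 2
    scale = -It / (g2 + eps)  # eps guards against a vanishing gradient
    return scale * Ix, scale * Iy

# A 45-degree edge translating horizontally with true velocity (1, 0):
Ix, Iy = 1.0, 1.0                       # spatial gradient of the edge
u_true, v_true = 1.0, 0.0
It = -(Ix * u_true + Iy * v_true)       # temporal derivative via brightness constancy

u_n, v_n = normal_flow(Ix, Iy, It)
# Only (0.5, 0.5), the projection of (1, 0) onto the gradient direction,
# is recoverable locally; the tangential part of the true motion is lost,
# which is why integration across localized features is needed.
print(u_n, v_n)
```

The example shows why a model restricted to local measurements systematically misestimates the motion of extended contours, motivating the integration stages the thesis describes.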
Language: en [dc.language.iso]
Publisher: Universität Ulm [dc.publisher]
License: Standard (Fassung vom 03.05.2003) [dc.rights]
Link to license text: https://oparu.uni-ulm.de/xmlui/license_v1 [dc.rights.uri]
Keyword: Computational models of vision [dc.subject]
Keyword: Feature attention [dc.subject]
Keyword: Motion aperture problem [dc.subject]
Keyword: Motion estimation [dc.subject]
Keyword: Motion transparency [dc.subject]
Keyword: Optic flow [dc.subject]
Keyword: Recurrent information processing [dc.subject]
Keyword: Visual attention [dc.subject]
Dewey Decimal Group: DDC 004 / Data processing & computer science [dc.subject.ddc]
LCSH: Algorithms [dc.subject.lcsh]
LCSH: Computer vision [dc.subject.lcsh]
LCSH: Motion perception. Vision [dc.subject.lcsh]
LCSH: Neural networks (Computer science) [dc.subject.lcsh]
LCSH: Visual perception [dc.subject.lcsh]
Title: A model of visual motion perception [dc.title]
Resource type: Dissertation [dc.type]
DOI: http://dx.doi.org/10.18725/OPARU-361 [dc.identifier.doi]
URN: http://nbn-resolving.de/urn:nbn:de:bsz:289-vts-56293 [dc.identifier.urn]
GND: Bewegungssehen [dc.subject.gnd]
GND: Bewegungswahrnehmung [dc.subject.gnd]
Faculty: Fakultät für Informatik [uulm.affiliationGeneral]
Date of activation: 2006-07-04T11:35:28Z [uulm.freischaltungVTS]
Peer review: no [uulm.peerReview]
Shelfmark print version: Z: J-H 11.167 ; W. W-H 9.285 [uulm.shelfmark]
DCMI Type: Text [uulm.typeDCMI]
VTS ID: 5629 [uulm.vtsID]
Category: Publikationen [uulm.category]
Bibliography: uulm [uulm.bibliographie]

