What, where, and when? Mechanisms of learning biological motion representations

Layher_Dissertation. ... (17.02Mb)
Date of first publication
2019-08-05
Authors
Layher, Georg
Referee
Neumann, Heiko
Minker, Wolfgang
Dissertation


Faculties
Fakultät für Ingenieurwissenschaften, Informatik und Psychologie
Institutions
Institut für Neuroinformatik
Institut für Nachrichtentechnik
Abstract
The understanding and recognition of human actions is one of the major challenges for technical systems aiming at visual behavior analysis. Evidence from psychophysical and neurophysiological studies provides indications of the feature characteristics and neural processing principles involved in the perception of biological motion sequences. Modeling efforts from the field of computational neuroscience complement these empirical findings by proposing potential functional mechanisms and learning schemes that enable the establishment and recognition of biological motion representations, and show how such principles can be transferred to technical domains.

First, results of psychophysical investigations are presented that demonstrate significant increases in human recognition performance for motion (sub-)sequences containing highly articulated poses, which co-occur with local extrema in the motion energy and the extension of the body. Such key poses thus qualify as candidates for establishing biological motion representations.

Second, based on these findings, a neural model for the learning of biological motion representations is presented. The model combines hierarchical feedforward and feedback processing along the ventral (form; what) and dorsal (motion; where) pathways with an unsupervised Hebbian learning mechanism for acquiring prototypical form and motion representations. More specifically, gated learning in the form pathway realizes the selective learning of highly articulated postures, and sequence-selective representations are established using temporal association learning driven by motion and form input. The proposed model shows how the unsupervised learning of key poses can form the basis for the establishment of biological motion representations and offers a potential explanation for empirically observed phenomena such as implied motion perception.

Third, as a transfer to technical application scenarios, a real-time, biologically inspired action recognition system is presented which automatically selects key poses in action sequences and employs a deep convolutional neural network (DCNN) to learn class-specific pose representations. The network is mapped onto a neuromorphic platform, enabling the real-time (~1000 fps) and energy-efficient (~70 mW) assignment of key poses to action classes.

Last, it is shown how an associative learning scheme similar to the one applied in the neural model can be used to learn visual category and subcategory representations. Here, instar learning is used to learn representations of visual categories, while outstar learning is applied to establish representations of the expected input distribution. The category-specific pattern is propagated back to the preceding stage, where a residual signal reflecting the difference from the current input is derived. This difference is emphasized by modulating the input with the residual signal and a subsequent normalization. If the difference is large enough, a new subcategory representation is established.
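
To make two of the mechanisms summarized above more concrete, the following minimal Python/NumPy sketches illustrate them under stated assumptions. They are illustrations only, not the implementations used in the dissertation; all function names, thresholds, and learning rates (motion_energy, min_separation, lr, residual_threshold) are hypothetical choices.

The first sketch selects candidate key poses as local extrema of a simple per-frame motion energy signal, here taken to be the sum of squared intensity differences between consecutive frames.

import numpy as np

def motion_energy(frames):
    # Per-frame motion energy: sum of squared intensity differences
    # between consecutive frames (frame 0 is assigned zero energy).
    frames = np.asarray(frames, dtype=float)
    diffs = np.diff(frames, axis=0)                      # shape (T-1, H, W)
    return np.concatenate(([0.0], (diffs ** 2).sum(axis=(1, 2))))

def key_pose_indices(energy, min_separation=5):
    # Indices of local extrema (minima and maxima) of the energy curve,
    # keeping selected frames at least min_separation frames apart.
    selected = []
    for t in range(1, len(energy) - 1):
        is_max = energy[t] > energy[t - 1] and energy[t] > energy[t + 1]
        is_min = energy[t] < energy[t - 1] and energy[t] < energy[t + 1]
        if (is_max or is_min) and (not selected or t - selected[-1] >= min_separation):
            selected.append(t)
    return selected

# Toy usage with random "frames"; a real input would be a silhouette or
# point-light sequence.
frames = np.random.default_rng(0).random((60, 32, 32))
print(key_pose_indices(motion_energy(frames)))

The second sketch illustrates the instar/outstar scheme for category and subcategory learning: instar weights are driven toward the inputs a winning unit responds to, outstar weights learn the expected input pattern that is fed back to the preceding stage, and a sufficiently large residual between input and expectation recruits a new subcategory unit whose initial weights emphasize the unexplained part of the input.

import numpy as np

def normalize(x, eps=1e-9):
    return x / (np.linalg.norm(x) + eps)

class CategoryLayer:
    # Toy instar/outstar learner with winner-take-all activation.
    # Learning rate and residual threshold are illustrative values.
    def __init__(self, dim, lr=0.1, residual_threshold=0.8):
        self.lr = lr
        self.threshold = residual_threshold
        self.W = np.empty((0, dim))   # instar (bottom-up) weights
        self.V = np.empty((0, dim))   # outstar (top-down) weights

    def step(self, x):
        x = normalize(np.asarray(x, dtype=float))
        if len(self.W) == 0:
            return self._recruit(x)
        j = int(np.argmax(self.W @ x))        # winner-take-all category
        residual = x - self.V[j]              # input minus fed-back expectation
        if np.linalg.norm(residual) > self.threshold:
            # Expectation explains the input poorly: establish a new
            # subcategory, emphasizing the residual part of the input.
            return self._recruit(normalize(x * np.abs(residual)))
        # Instar/outstar updates move both weight sets toward the input.
        self.W[j] += self.lr * (x - self.W[j])
        self.V[j] += self.lr * (x - self.V[j])
        return j

    def _recruit(self, x):
        self.W = np.vstack([self.W, x])
        self.V = np.vstack([self.V, x])
        return len(self.W) - 1

In this sketch the residual-driven recruitment plays the role of establishing new subcategory representations, while repeated presentations of similar inputs refine the instar and outstar weights of an existing unit.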
Date created
2018
DFG Project THU
TRR 62 / Eine Companion-Technologie für kognitive technische Systeme / DFG / 54371073
Cumulative dissertation containing the following articles
• G. Layher and H. Neumann (2018). “Points and Stripes: A Novel Technique for Masking Biological Motion Point-Light Stimuli”. In: Frontiers in Psychology 9, p. 12. ISSN: 1664-1078. DOI: 10.3389/fpsyg.2018.01455
• G. Layher, M. A. Giese, and H. Neumann (2014a). “Learning Representations of Animated Motion Sequences — A Neural Model”. In: Topics in Cognitive Science 6.1, pp. 170–182. ISSN: 1756-8765. DOI: 10.1111/tops.12075
• G. Layher, T. Brosch, and H. Neumann (2017a). “Real-time Biologically Inspired Action Recognition from Key Poses using a Neuromorphic Architecture”. In: Frontiers in Neurorobotics 11.13, p. 19. ISSN: 1662-5218. DOI: 10.3389/fnbot.2017.00013
• G. Layher, F. Schrodt, M. V. Butz, and H. Neumann (2014b). “Adaptive Learning in a Compartmental Model of Visual Cortex - How Feedback Enables Stable Category Learning and Refinement”. In: Frontiers in Psychology 5.1287, p. 19. ISSN: 1664-1078. DOI: 10.3389/fpsyg.2014.01287
Subject headings
[GND]: Psychophysik | Bewegungswahrnehmung | Bewegungsanalyse <Technik> | Motorisches Lernen
[LCSH]: Psychophysics | Neural networks (Computer science) | Machine learning | Motion perception (Vision)
[Free subject headings]: Biological motion | Neural model | Form and motion processing | Artificial neural networks | Deep learning | Action recognition | Unsupervised learning | Implied motion | Point-light stimuli
[DDC subject group]: DDC 004 / Data processing & computer science | DDC 150 / Psychology
License
Standard
https://oparu.uni-ulm.de/xmlui/license_v3


DOI & citation

Please use this identifier to cite or link to this item: http://dx.doi.org/10.18725/OPARU-17461

Layher, Georg (2019): What, where, and when? Mechanisms of learning biological motion representations. Open Access Repositorium der Universität Ulm und Technischen Hochschule Ulm. Dissertation. http://dx.doi.org/10.18725/OPARU-17461