Unsupervised learning of generative and discriminative weights encoding elementary image components in a predictive coding model of cortical function

Research output: Contribution to journal › Article › peer-review

49 Citations (Scopus)
15 Downloads (Pure)

Abstract

A method is presented for learning the reciprocal feedforward and feedback connections required by the predictive coding model of cortical function. When this method is used, feedforward and feedback connections are learned simultaneously and independently in a biologically plausible manner. The performance of the proposed algorithm is evaluated by applying it to learning the elementary components of artificial and natural images. For artificial images, the bars problem is employed, and the proposed algorithm is shown to produce state-of-the-art performance on this task. For natural images, components resembling Gabor functions are learned in the first processing stage, and neurons responsive to corners are learned in the second processing stage. The properties of these learned representations are in good agreement with neurophysiological data from V1 and V2. The proposed algorithm demonstrates for the first time that a single computational theory can explain the formation of cortical receptive fields (RFs) and also the response properties of cortical neurons once those RFs have been learned.
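To illustrate the general idea described in the abstract, the following is a minimal sketch of generic predictive-coding dictionary learning: hidden activities are inferred by iteratively reducing the reconstruction (prediction) error, and the weights are then updated with a Hebbian-like rule driven by that error. This is a simplified Rao-and-Ballard-style illustration under assumed learning rates and non-negativity constraints; it is not the paper's specific algorithm, whose exact update rules are given in the full text.

```python
import numpy as np

rng = np.random.default_rng(0)

def pc_learn(X, n_components, n_infer=50, lr_y=0.1, lr_w=0.02, epochs=30):
    """Generic predictive-coding sketch (illustrative, not the paper's rules).

    X            : (n_samples, n_inputs) array of input patterns
    n_components : number of hidden units (learned image components)
    Returns the learned generative (feedback) weight matrix W,
    shape (n_inputs, n_components), whose columns are the components.
    """
    n_inputs = X.shape[1]
    # Feedback (generative) weights; the feedforward weights are taken as W.T here,
    # whereas the paper learns the two sets of weights separately.
    W = rng.uniform(0.1, 1.0, size=(n_inputs, n_components))
    for _ in range(epochs):
        for x in X:
            y = np.zeros(n_components)
            for _ in range(n_infer):
                e = x - W @ y              # prediction error
                y += lr_y * (W.T @ e)      # error drives hidden activity
                y = np.maximum(y, 0.0)     # non-negative activity
            W += lr_w * np.outer(e, y)     # Hebbian-like update from residual error
            W = np.clip(W, 0.0, None)      # keep weights non-negative
    return W
```

A toy version of the bars problem can be run by presenting patterns that each contain one of a few fixed binary "bars"; with enough components, individual columns of W come to encode individual bars.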
Original language: English
Pages (from-to): 60-103
Number of pages: 44
Journal: Neural Computation
Volume: 24
Issue number: 1
DOIs
Publication status: Published - Jan 2012
