dc.contributor.author
Iamshchinina, Polina
dc.contributor.author
Karapetian, Agnessa
dc.contributor.author
Kaiser, Daniel
dc.contributor.author
Cichy, Radoslaw M.
dc.date.accessioned
2022-08-08T11:11:06Z
dc.date.available
2022-08-08T11:11:06Z
dc.identifier.uri
https://refubium.fu-berlin.de/handle/fub188/35808
dc.identifier.uri
http://dx.doi.org/10.17169/refubium-35523
dc.description.abstract
Humans can effortlessly categorize objects, both when they are conveyed through visual images and spoken words. To resolve the neural correlates of object categorization, studies have so far primarily focused on the visual modality. It is therefore still unclear how the brain extracts categorical information from auditory signals. In the current study, we used EEG (n = 48) and time-resolved multivariate pattern analysis to investigate 1) the time course with which object category information emerges in the auditory modality and 2) how the representational transition from individual object identification to category representation compares between the auditory modality and the visual modality. Our results show that 1) auditory object category representations can be reliably extracted from EEG signals and 2) a similar representational transition occurs in the visual and auditory modalities, where an initial representation at the individual-object level is followed by a subsequent representation of the objects’ category membership. Altogether, our results suggest an analogous hierarchy of information processing across sensory channels. However, there was no convergence toward conceptual modality-independent representations, thus providing no evidence for a shared supramodal code.
NEW & NOTEWORTHY Object categorization operates on inputs from different sensory modalities, such as vision and audition. This process has mainly been studied in vision. Here, we explore auditory object categorization. We show that auditory object category representations can be reliably extracted from EEG signals and that, similar to vision, auditory representations initially carry information about individual objects, which is followed by a subsequent representation of the objects’ category membership.
en
dc.format.extent
7 pages
dc.rights.uri
https://creativecommons.org/licenses/by/4.0/
dc.subject
auditory modality
en
dc.subject
object categorization
en
dc.subject
visual modality
en
dc.subject.ddc
100 Philosophy and psychology::150 Psychology
dc.title
Resolving the time course of visual and auditory object categorization
dc.type
Scientific article
dcterms.bibliographicCitation.doi
10.1152/jn.00515.2021
dcterms.bibliographicCitation.journaltitle
Journal of Neurophysiology
dcterms.bibliographicCitation.number
6
dcterms.bibliographicCitation.pagestart
1622
dcterms.bibliographicCitation.pageend
1628
dcterms.bibliographicCitation.volume
127
dcterms.bibliographicCitation.url
https://doi.org/10.1152/jn.00515.2021
refubium.affiliation
Erziehungswissenschaft und Psychologie
refubium.affiliation.other
Arbeitsbereich Neural Dynamics of Visual Cognition
refubium.resourceType.isindependentpub
no
dcterms.accessRights.openaire
open access
dcterms.isPartOf.eissn
1522-1598
refubium.resourceType.provider
WoS-Alert