publications
Publications by category in reverse chronological order.
2026
- [computer vision] Isolating the Role of Temporal Information in Video Saliency: A Controlled Experimental Analysis. Peter El-Jiz, Matthias Kuemmerer, Matthias Tangemann, Matthias Bethge, Andreas Bartels, and Michael Mario Bannert. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Mar 2026.
2025
- Memory color influences conscious object perception. Vincent Plikat, Pablo R. Grassi, Michael M. Bannert, and Andreas Bartels. May 2025.
Can knowledge influence perception? A central case suggesting it can is evidence showing that knowledge about a color-diagnostic object’s typical color can influence its appearance. For example, a grey banana is allegedly perceived with a tint of yellow. However, methodological and conceptual considerations leave it unclear whether the purported “memory-color” effect actually reflects changes in perception or, instead, changes in judgment and response. Here, we combine memory color with binocular rivalry to test whether top-down influences affect the color an object is perceived in. We showed 24 participants familiar objects in their typical and opponent colors and asked for concurrent reports of the perceived color. Consistent with Bayesian models of rivalry, we observed that conscious perception of identical spectral color pairs was biased towards the typical color of the presented object. Our results suggest that prior knowledge aids the interpretation of ambiguous stimuli and biases conscious perception towards the most plausible interpretation.
- [cortical retinotopy] Large-Scale Color Biases in the Retinotopic Functional Architecture Are Region Specific and Shared across Human Brains. Michael M. Bannert and Andreas Bartels. Journal of Neuroscience, Oct 2025.
Despite the functional specialization in visual cortex, there is growing evidence that the processing of chromatic and spatial visual features is intertwined. While past studies focused on visual field biases in retina and behavior, large-scale dependencies between coding of color and retinotopic space are largely unexplored in the cortex. Using a sample of male and female volunteers, we asked whether spatial color biases are shared across different human observers and whether they are idiosyncratic for distinct areas. We tested this by predicting the color a person was seeing using a linear classifier that has never been trained on chromatic responses from that same brain, solely by taking into account: (1) the chromatic responses in other individuals’ brains and (2) commonalities between the spatial coding in brains used for training and the test brain. We were able to predict the color (and luminance) of stimuli seen by an observer based on other subjects’ activity patterns in areas V1–V3, hV4, and LO1. In addition, we found that different colors elicited systematic, large-scale retinotopic biases that were idiosyncratic for distinct areas and common across brains. The area-specific spatial color codes and their conservation across individuals suggest functional or evolutionary organization pressures that remain to be elucidated.
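The between-subject prediction scheme described above (decode a color from one brain using a classifier trained only on other brains) can be illustrated with a leave-one-subject-out sketch in scikit-learn. This is a toy illustration on synthetic data, not the paper's pipeline: it simply assumes that voxels have already been brought into a common retinotopic space, so that the shared large-scale color bias survives across subjects. All names and array shapes are illustrative.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_subjects, n_voxels, n_colors, n_trials = 5, 150, 3, 20

# A large-scale retinotopic color bias shared across brains (the key premise),
# on top of which each subject adds idiosyncratic trial-by-trial noise.
shared_bias = rng.normal(size=(n_colors, n_voxels))

def subject_data(seed):
    r = np.random.default_rng(seed)
    X = np.vstack([shared_bias[c] + r.normal(scale=1.0, size=n_voxels)
                   for _ in range(n_trials) for c in range(n_colors)])
    y = np.tile(np.arange(n_colors), n_trials)
    return X, y

data = [subject_data(s) for s in range(n_subjects)]

# Leave-one-subject-out: the classifier never sees chromatic responses
# from the brain it is tested on.
accs = []
for test_s in range(n_subjects):
    X_train = np.vstack([data[s][0] for s in range(n_subjects) if s != test_s])
    y_train = np.hstack([data[s][1] for s in range(n_subjects) if s != test_s])
    clf = LinearSVC().fit(X_train, y_train)
    accs.append(clf.score(*data[test_s]))

print(np.mean(accs))  # typically well above the 3-way chance level of 1/3
```

Cross-subject decoding succeeds here only because the color-specific pattern component is shared; replacing `shared_bias` with per-subject random biases would drop accuracy to chance.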
- Decoding Illusory Colours From Human Visual Cortex. Marek Nemecek, Barbora Wolf, Karl R. Gegenfurtner, Philipp Sterzer, Andreas Bartels, Michael M. Bannert, and Matthias Guggenmos. In 28th Meeting of the Association for the Scientific Study of Consciousness (ASSC 2025), Heraklion, Greece, 2025.
2024
- The causal involvement of the visual cortex in visual working memory remains uncertain. Pablo Rodrigo Grassi, Michael M. Bannert, and Andreas Bartels. Royal Society Open Science, Jun 2024.
The role of the early visual cortex in visual working memory (VWM) is a matter of current debate. Neuroimaging studies have consistently shown that visual areas encode the content of working memory, while transcranial magnetic stimulation (TMS) studies have presented incongruent results. Thus, we lack conclusive evidence supporting the causal role of early visual areas in VWM. In a recent registered report, Phylactou et al. (Phylactou P, Shimi A, Konstantinou N 2023 R. Soc. Open Sci. 10, 230321 (doi:10.1098/rsos.230321)) sought to tackle this controversy via two well-powered TMS experiments, designed to correct possible methodological issues of previous attempts identified in a preceding systematic review and meta-analysis (Phylactou P, Traikapi A, Papadatou-Pastou M, Konstantinou N 2022 Psychon. Bull. Rev. 29, 1594–1624 (doi:10.3758/s13423-022-02107-y)). However, a key part of their critique and experimental design was based on a misunderstanding of the visual system. They disregarded two important anatomical facts, namely that early visual areas of each hemisphere represent the contralateral visual hemifield, and that each hemisphere receives equally strong input from each eye—both leading to confounded conditions and artefactual effects in their studies. Here, we explain the correct anatomy, describe why their experiments failed to address current issues in the literature and perform a thorough reanalysis of their TMS data revealing important null results. We conclude that the causal role of the visual cortex in VWM remains uncertain.
2022
- Visual cortex: Big data analysis uncovers food specificity. Michael M. Bannert and Andreas Bartels. Current Biology, Oct 2022.
2019
- Predictive deep learning explains human BOLD responses during natural viewing. Michael M. Bannert, Celia Foster, Michael J. Black, and Andreas Bartels. In ESI Systems Neuroscience Conference (ESI-SyNC 2019): The recurrent cortex: feedback, dynamics, and dimensionality, 2019.
2018
- Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Michael M. Bannert and Andreas Bartels. The Journal of Neuroscience, 2018.
Color is special among basic visual features in that it can form a defining part of objects that are engrained in our memory. Whereas most neuroimaging research on human color vision has focused on responses related to external stimulation, the present study investigated how sensory-driven color vision is linked to subjective color perception induced by object imagery. We recorded fMRI activity in male and female volunteers during viewing of abstract color stimuli that were red, green, or yellow in half of the runs. In the other half we asked them to produce mental images of colored, meaningful objects (such as tomato, grapes, banana) corresponding to the same three color categories. Although physically presented color could be decoded from all retinotopically mapped visual areas, only hV4 allowed predicting colors of imagined objects when classifiers were trained on responses to physical colors. Importantly, only neural signal in hV4 was predictive of behavioral performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of sensory-driven and imagined object color and the behavioral link to neural representations in hV4 identifies area hV4 as a perceptual hub linking externally triggered color vision with color in self-generated object imagery. Significance statement: Humans experience color not only when visually exploring the outside world, but also in the absence of visual input, for example when remembering, dreaming, and during imagery. It is not known where neural codes for sensory-driven and internally generated hue converge. In the current study we evoked matching subjective color percepts, one driven by physically presented color stimuli, the other by internally generated color imagery. This allowed us to identify area hV4 as the only site where neural codes of corresponding subjective color perception converged regardless of its origin. Color codes in hV4 also predicted behavioral performance in an imagery task, suggesting it forms a perceptual hub for color perception.
- Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Michael Bannert and Andreas Bartels. Journal of Vision, Sep 2018.
2017
- Invariance of surface color representations across illuminant changes in the human cortex. Michael M. Bannert and Andreas Bartels. NeuroImage, Sep 2017.
Color is the brain’s estimate of reflectance for a given surface. Reflectance describes how much light a surface reflects at different wavelengths. Since the light reflected from a surface depends on its reflectance and on the spectral power distribution of the incident light, it is impossible to predict surface reflectance directly from the wavelength composition of the reflected light. Despite this computational problem, the human visual system is remarkably accurate at inferring the reflectance – perceived as color – of surfaces across different illuminants. This ability is referred to as color constancy and it is essential for the organism to use color as a cue in object search, recognition, and identification. We devised images of two surfaces presented under three different illuminants using physically realistic rendering methods to study the neural architecture underlying surface color perception. Measuring patterns of fMRI voxel activity elicited by these images, we tested to what extent responses to surface color in various retinotopically mapped areas remained stable across illuminants and which regions encoded illuminant information. We made three important observations: First, patterns of fMRI responses to surface color generalized across illuminants in V1 but not V2, V3, hV4, or VO1. Second, accuracy of illuminant decoding was positively correlated with psychophysically measured color constancy as predicted by the Equivalent Illuminant Model. Third, when fMRI activity was elicited by stimuli that were matched in reflected light but differed in illumination and therefore also differed in perceived surface color, there was a gradient from lower to higher visual areas to distinguish between the two inputs in terms of a difference in surface color rather than illumination. Our results demonstrate that V1 represents chromatic invariances in the stimulus environment (possibly via feedback) whereas downstream visual areas are more biased to link chromatic differences to different surface color percepts.
2016
- The constructive nature of color vision: evidence from human fMRI. Michael M. Bannert and Andreas Bartels. In Seeing Colors: International Symposium on Color Vision, Regensburg, Germany, 2016.
2015
- The invariance of surface color representations across illuminant changes in the human cortex. Michael M. Bannert and Andreas Bartels. In Donders Discussions 2015, Nijmegen, The Netherlands, 2015.
The light reflected from a surface depends on the reflectance of that surface and the spectral power distribution of the incident light, thus making it impossible to predict surface color directly from its wavelength composition. Despite this computational problem, the human visual system is remarkably accurate at inferring the color of surfaces across different illuminants. This ability is referred to as color constancy and it is essential for the organism to use color as a cue in object search, recognition, and identification. We devised images of two surfaces presented under three different illuminants using physically realistic rendering methods to disentangle the influences of wavelength composition, surface reflectance, and illumination. Measuring patterns of fMRI voxel activity elicited by these images, we tested to what extent responses to surface color in various retinotopically mapped visual areas remained stable across illuminants. While surface color could be decoded in all ROIs when the illuminants did not differ between training and test sets, we found generalization across illuminants in V1 only. When viewing the scene in a cue conflict condition that abolished color constancy as measured psychophysically, generalization also broke down in V1. When fMRI activity was elicited by stimuli that were matched in reflected light but differed in illumination and therefore also in perceived surface color, higher visual areas showed an increasing bias towards surface color representation and a decrease in illuminant color representation. Our results demonstrate the differential roles that V1 and V4 areas play in transforming chromatic input into color constant percepts.
2013
- [memory colour] Decoding the yellow of a gray banana. Michael M. Bannert and Andreas Bartels. Current Biology, Nov 2013.
Some everyday objects are associated with a particular color, such as bananas, which are typically yellow. Behavioral studies show that perception of these so-called color-diagnostic objects is influenced by our knowledge of their typical color, referred to as memory color. However, neural representations of memory colors are unknown. Here we investigated whether memory color can be decoded from visual cortex activity when color-diagnostic objects are viewed as grayscale images. We trained linear classifiers to distinguish patterns of fMRI responses to four different hues. We found that activity in V1 allowed predicting the memory color of color-diagnostic objects presented in grayscale in naive participants performing a motion task. The results imply that higher areas feed back memory-color signals to V1. When classifiers were trained on neural responses to some exemplars of color-diagnostic objects and tested on others, areas V4 and LOC also predicted memory colors. Representational similarity analysis showed that memory-color representations in V1 were correlated specifically with patterns in V4 but not LOC. Our findings suggest that prior knowledge is projected from midlevel visual regions onto primary visual cortex, consistent with predictive coding theory.
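The cross-decoding logic described above (train linear classifiers on responses to real hues, then test them on responses to grayscale color-diagnostic objects) can be sketched with scikit-learn on synthetic data. This is a minimal toy version, not the paper's analysis: it simply assumes that the memory color reactivates a weaker copy of the hue-specific voxel pattern. Variable names and array shapes are illustrative.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_voxels = 200
hues = np.arange(4)  # four hue labels, as in the four trained colors

# Synthetic voxel patterns: each hue evokes a distinct mean pattern plus noise.
prototypes = rng.normal(size=(4, n_voxels))

def simulate_runs(n_runs, scale=1.0):
    X = np.vstack([prototypes[h] * scale + rng.normal(scale=0.5, size=n_voxels)
                   for _ in range(n_runs) for h in hues])
    y = np.tile(hues, n_runs)
    return X, y

# Train on responses to abstract chromatic stimuli ...
X_color, y_color = simulate_runs(n_runs=10)
clf = LinearSVC().fit(X_color, y_color)

# ... test on responses to grayscale color-diagnostic objects, assuming the
# memory color evokes a weaker version of the same hue-specific pattern.
X_gray, y_memory = simulate_runs(n_runs=10, scale=0.3)
acc = clf.score(X_gray, y_memory)
print(acc)  # typically well above the 4-way chance level of 0.25
```

The point of the toy is that cross-decoding only works if the grayscale-object responses share pattern structure with the hue responses; setting `scale=0.0` removes that shared component and accuracy falls to chance.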
2012
- Predicting memory color from neural responses to achromatic images of color-diagnostic objects. Michael M. Bannert and Andreas Bartels. In 42nd Annual Meeting of the Society for Neuroscience (Neuroscience 2012), New Orleans, LA, USA, 2012.
Some objects that we deal with on a daily basis are associated with an object-specific color – such as yellow for bananas, red for strawberries, green for lettuce, etc. Such objects are referred to as color-diagnostic and their associated color as their memory color (Hering, 1920). Psychophysical evidence shows that achromatic, i.e. grayscale, images of color-diagnostic objects elicit percepts that are differentially biased towards their memory color (Hansen, Olkkonen, Walter, & Gegenfurtner, 2006; Olkkonen, Hansen, & Gegenfurtner, 2008). This phenomenon suggests some form of learned and automatic association between colors and particular objects. In the present study we tested whether neural responses to color-diagnostic objects convey color-specific information, even when the objects were presented achromatically to subjects who were naïve to the purpose of the study. We first collected fMRI data while participants viewed grayscale images of 8 different color-diagnostic objects (4 colors, 2 per color). We then recorded responses to chromatic stimulation with red, green, blue, and yellow abstract color stimuli that contained no object information. All object and color stimuli were set to equiluminance for each subject individually. To analyze the data, we applied a whole-brain searchlight procedure by training linear support vector machine classifiers to distinguish between local voxel patterns associated with the four colors. They were then tested on patterns elicited by color-diagnostic achromatic objects to predict their correct memory colors. At the group level, we found significant decoding accuracy in a large cluster covering foveal regions of early visual cortex. In some but not all individual subjects, smaller clusters were also evident in the fusiform gyrus. Our results suggest that memory color and color signals evoked by chromatic stimulation share a common neural mechanism in early visual cortex.
Retinotopic mapping in combination with classification techniques will be used to clarify the contribution of individual visual areas to this mechanism.
2011
- Working memory maintenance of grasp-target information in the human posterior parietal cortex. Katja Fiehler, Michael M. Bannert, Matthias Bischoff, Carlo Blecker, Rudolf Stark, Dieter Vaitl, Volker H. Franz, and Frank Rösler. NeuroImage, 2011.
Event-related functional magnetic resonance imaging was applied to identify cortical areas involved in maintaining target information in working memory used for an upcoming grasping action. Participants had to grasp three-dimensional objects of different size and orientation with the thumb and index finger of the dominant right hand. Reaching-to-grasp movements were performed without visual feedback either immediately after object presentation or after a variable delay of 2–12 s. The right inferior parietal cortex demonstrated sustained neural activity throughout the delay, which overlapped with activity observed during encoding of the grasp target. Immediate and delayed grasping activated similar motor-related brain areas and showed no differential activity. The results suggest that the right inferior parietal cortex plays an important functional role in working memory maintenance of grasp-related information. Moreover, our findings confirm the assumption that brain areas engaged in maintaining information are also involved in encoding the same information, and thus extend previous findings on working memory function of the posterior parietal cortex in saccadic behavior to reach-to-grasp movements.
2009
- Gibt es ein Kurzzeitgedächtnis für Greifbewegungen im parietalen Cortex? Michael M. Bannert, Volker H. Franz, Matthias Bischoff, Carlo Blecker, Rudolf Stark, Dieter Vaitl, Frank Rösler, and Katja Fiehler. In A. Eder, K. Rothermund, S. Schweinberger, M. Steffens, & H. Wiese (Eds.), 51. Tagung Experimentell Arbeitender Psychologen (TeaP 2009) (p. 52). Lengerich, Germany: Pabst, 2009.
Visual control of grasping movements requires adapting the grasping hand to the target object on the basis of visual information about its physical properties. Single-cell recordings in monkeys show that the anterior intraparietal sulcus is specialized for the visual control and short-term storage of grasping movements. Functional imaging studies suggest that a comparable region also exists in the human brain. However, the role this area plays in the short-term storage of visuomotor representations remains a matter of debate. In the present fMRI study, participants were instructed to grasp, without visual feedback and after a retention interval of variable duration, an object they had previously encoded visually. In a control condition, participants grasped the object immediately after the encoding phase. We found sustained activation of the anterior intraparietal sulcus during the retention interval. This is consistent with findings from single-cell recordings and with current theories of working memory, according to which regions responsible for the real-time processing of information also contribute to its short-term storage.
2007
- Context in free recall: multi-voxel pattern analysis of fMRI. Greg J. Detre, Sean M. Polyn, Michael M. Bannert, and Kenneth A. Norman. In 37th Annual Meeting of the Society for Neuroscience (Neuroscience 2007), San Diego, CA, USA, 2007.
Several researchers (e.g., Howard & Kahana, 2002) have proposed that recalling an event is bound up with recall of that event’s surrounding context, and that retrieved context information can be used to cue memory for other items from that context. In this study, we sought evidence for this contextual reinstatement process using fMRI. Specifically, we wanted to know whether the task being performed when forming a memory would be recalled along with that memory, and how this would influence subsequent recalls. Subjects studied lists of 24 words, performing either a size, animacy, or pleasantness judgment task on each word. After a series of arithmetic distractors, subjects were asked to recall out loud, and in any order, the words from the most recent list. Since subjects were being scanned during both study and recall phases, we trained a classifier on the study period to distinguish which of the three tasks was being performed. We then tested this classifier during recall to estimate the degree to which each task representation was active in the subject’s mind, moment by moment (Polyn et al., 2005). To analyze the recall data, we labelled each recall with its judgment task from the study period. These were predicted better than chance by the classifier’s estimates of task activity at recall. We broke the data down further, looking at the transitions from one recall to the next. We found that high classifier activity for one kind of task judgment indicated that the next recall would be another item from that task, and that the inter-response latency would be small. In other words, a highly active task representation would facilitate recalls of other items from the same task. These results support the contextual reinstatement theory, suggesting that reinstating the context surrounding an event improves recall of other items that were studied in that context.